Teaching the Meaning of Statistical Techniques with Microcomputer Simulation.
ERIC Educational Resources Information Center
Lee, Motoko Y.; And Others
Students in an introductory statistics course are often preoccupied with learning the computational routines of specific summary statistics and thereby fail to develop an understanding of the meaning of those statistics or their conceptual basis. To help students develop a better understanding of the meaning of three frequently used statistics,…
Stone, M.A.J.; Mann, Larry J.; Kjelstrom, L.C.
1993-01-01
Statistical summaries and graphs of streamflow data were prepared for 13 gaging stations with 5 or more years of continuous record on and near the Idaho National Engineering Laboratory. Statistical summaries of streamflow data for the Big and Little Lost Rivers and Birch Creek were analyzed as a requisite for a comprehensive evaluation of the potential for flooding of facilities at the Idaho National Engineering Laboratory. The type of statistical analyses performed depended on the length of streamflow record for a gaging station. Streamflow statistics generated for stations with 5 to 9 years of record were: (1) magnitudes of monthly and annual flows; (2) duration of daily mean flows; and (3) maximum, median, and minimum daily mean flows. Streamflow statistics generated for stations with 10 or more years of record were: (1) magnitudes of monthly and annual flows; (2) magnitudes and frequencies of daily low, high, instantaneous peak (flood frequency), and annual mean flows; (3) duration of daily mean flows; (4) exceedance probabilities of annual low, high, instantaneous peak, and mean annual flows; (5) maximum, median, and minimum daily mean flows; and (6) annual mean and mean annual flows.
Statistical summaries of selected Iowa streamflow data through September 2013
Eash, David A.; O'Shea, Padraic S.; Weber, Jared R.; Nguyen, Kevin T.; Montgomery, Nicholas L.; Simonson, Adrian J.
2016-01-04
Statistical summaries of streamflow data collected at 184 streamgages in Iowa are presented in this report. All streamgages included for analysis have at least 10 years of continuous record collected before or through September 2013. This report is an update to two previously published reports that presented statistical summaries of selected Iowa streamflow data through September 1988 and September 1996. The statistical summaries include (1) monthly and annual flow durations, (2) annual exceedance probabilities of instantaneous peak discharges (flood frequencies), (3) annual exceedance probabilities of high discharges, and (4) annual nonexceedance probabilities of low discharges and seasonal low discharges. Also presented for each streamgage are graphs of the annual mean discharges, mean annual mean discharges, 50-percent annual flow-duration discharges (median flows), harmonic mean flows, mean daily mean discharges, and flow-duration curves. Two sets of statistical summaries are presented for each streamgage: (1) long-term statistics for the entire period of streamflow record and (2) recent-term statistics for the 30-year period of record from 1984 to 2013. The recent-term statistics are calculated only for streamgages with streamflow records pre-dating the 1984 water year and with at least 10 years of record during 1984–2013. The streamflow statistics in this report are not adjusted for the effects of water use; although some of this water is used consumptively, most of it is returned to the streams.
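The flow-duration statistics these summaries report (the discharge exceeded a given percentage of the time) can be sketched roughly as follows. This is a simplified illustration using the Weibull plotting position, not the exact USGS procedure, and the function name is hypothetical:

```python
def flow_duration(daily_flows, percents=(20, 50, 80, 90, 95)):
    """Flow-duration points from a record of daily mean flows.

    For each percentage p, returns the flow exceeded p percent of the
    time, using the Weibull plotting position rank = p/100 * (n + 1).
    A rough sketch, not the exact agency procedure.
    """
    flows = sorted(daily_flows, reverse=True)  # rank 1 = largest flow
    n = len(flows)
    points = {}
    for p in percents:
        rank = max(1, min(n, round(p / 100 * (n + 1))))
        points[p] = flows[rank - 1]
    return points
```

For a record of 100 distinct daily values, the 95-percent duration flow falls near the low end of the sorted record, as expected for a flow exceeded almost all of the time.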
Yang, Yi; Tokita, Midori; Ishiguchi, Akira
2018-01-01
A number of studies revealed that our visual system can extract different types of summary statistics, such as the mean and variance, from sets of items. Although the extraction of such summary statistics has been studied well in isolation, the relationship between these statistics remains unclear. In this study, we explored this issue using an individual differences approach. Observers viewed illustrations of strawberries and lollypops varying in size or orientation and performed four tasks in a within-subject design, namely mean and variance discrimination tasks with size and orientation domains. We found that the performances in the mean and variance discrimination tasks were not correlated with each other and demonstrated that extractions of the mean and variance are mediated by different representation mechanisms. In addition, we tested the relationship between performances in size and orientation domains for each summary statistic (i.e. mean and variance) and examined whether each summary statistic has distinct processes across perceptual domains. The results illustrated that statistical summary representations of size and orientation may share a common mechanism for representing the mean and possibly for representing variance. Introspections for each observer performing the tasks were also examined and discussed. PMID:29399318
Asquith, William H.; Vrabel, Joseph; Roussel, Meghan C.
2007-01-01
Analysts and managers of surface-water resources might have interest in selected statistics of daily mean streamflow for U.S. Geological Survey (USGS) streamflow-gaging stations in Texas. The selected statistics are the annual mean, maximum, minimum, and L-scale of daily mean streamflow. Annual L-scale of streamflow is a robust measure of the variability of the daily mean streamflow for a given year. The USGS, in cooperation with the Texas Commission on Environmental Quality, initiated in 2006 a data and reporting process to generate annual statistics for 712 USGS streamflow-gaging stations in Texas. A graphical depiction of the history of the annual statistics for most active and inactive, continuous-record gaging stations in Texas provides valuable information by conveying the historical perspective of streamflow for the watershed. Each figure consists of four time-series plots of the annual statistics of daily mean streamflow for each streamflow-gaging station. Each of the four plots is augmented with horizontal lines that depict the mean and median annual values of the corresponding statistic for the period of record. Monotonic trends for each of the four annual statistics also are identified using Kendall's tau. The history of one or more streamflow-gaging stations could be used in a watershed, river basin, or other regional context by analysts and managers of surface-water resources to guide scientific, regulatory, or other inquiries of streamflow conditions in Texas.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Institute of Justice; BJS means the Bureau of Justice Statistics; OJARS means the Office of Justice Assistance, Research and Statistics; OJJDP means Office of Juvenile Justice and Delinquency Prevention. (e...
Standard deviation and standard error of the mean.
Lee, Dong Kyu; In, Junyong; Lee, Sangseok
2015-06-01
In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinctive usage between the SD and SEM in medical literature. Because the process of calculating the SD and SEM includes different statistical inferences, each of them has its own meaning. SD is the dispersion of data in a normal distribution. In other words, SD indicates how accurately the mean represents sample data. However, the meaning of SEM includes statistical inference based on the sampling distribution. SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either SD or SEM can be applied to describe data and statistical results, one should be aware of reasonable methods with which to use SD and SEM. We aim to elucidate the distinctions between SD and SEM and to provide proper usage guidelines for both, which summarize data and describe statistical results.
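The SD/SEM distinction this abstract describes reduces to one line of arithmetic: the SEM is the sample SD divided by the square root of the sample size. A minimal sketch (the function name is hypothetical):

```python
import math

def sd_and_sem(sample):
    """Sample standard deviation (SD) and standard error of the mean (SEM).

    SD describes the spread of the individual observations; SEM = SD / sqrt(n)
    describes the uncertainty of the sample mean as an estimate of the
    population mean.
    """
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))  # n-1: sample SD
    sem = sd / math.sqrt(n)
    return sd, sem
```

Because SEM shrinks with n while SD does not, reporting "mean ± SEM" makes data look less variable than "mean ± SD", which is exactly the confusion the authors warn about.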
Asquith, William H.; Heitmuller, Franklin T.
2008-01-01
Analysts and managers of surface-water resources have interest in annual mean and annual harmonic mean statistics of daily mean streamflow for U.S. Geological Survey (USGS) streamflow-gaging stations in Texas. The mean streamflow represents streamflow volume, whereas the harmonic mean streamflow represents an appropriate statistic for assessing constituent concentrations that might adversely affect human health. In 2008, the USGS, in cooperation with the Texas Commission on Environmental Quality, conducted a large-scale documentation of mean and harmonic mean streamflow for 620 active and inactive, continuous-record, streamflow-gaging stations using period-of-record data through water year 2007. About 99 stations within the Texas USGS streamflow-gaging network are part of the larger national Hydroclimatic Data Network and are identified. The graphical depictions of annual mean and annual harmonic mean statistics in this report provide a historical perspective of streamflow at each station. Each figure consists of three time-series plots, two flow-duration curves, and a statistical summary of the annual mean and annual harmonic mean streamflow statistics for available data for each station. The first time-series plot depicts daily mean streamflow for the period 1900-2007. Flow-duration curves follow and are a graphical depiction of streamflow variability. Next, the remaining two time-series plots depict annual mean and annual harmonic mean streamflow and are augmented with horizontal lines that depict the mean and harmonic mean for the period of record. Monotonic trends for the annual mean streamflow and annual harmonic mean streamflow also are identified using Kendall's tau, and the slope of the trend is depicted using the nonparametric (linear) Theil-Sen line, which is drawn only when the p-value of tau is less than 0.10.
The history of annual mean and annual harmonic mean streamflow of one or more streamflow-gaging stations could be used in a watershed, river basin, or other regional context by analysts and managers of surface-water resources to guide scientific, regulatory, or other inquiries of streamflow conditions in Texas.
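The trend tools named in this report, Kendall's tau for detecting a monotonic trend and the Theil-Sen line for its slope, can be sketched in a few lines. This is a simplified illustration (tau-a without tie correction, no p-value computation; function names are hypothetical), not the report's implementation:

```python
from itertools import combinations
from statistics import median

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant pairs) / total pairs.

    Simplified sketch with no tie correction and no significance test.
    """
    n = len(x)
    score = sum(
        1 if (x[j] - x[i]) * (y[j] - y[i]) > 0 else -1
        for i, j in combinations(range(n), 2)
        if (x[j] - x[i]) * (y[j] - y[i]) != 0
    )
    return score / (n * (n - 1) / 2)

def theil_sen_slope(x, y):
    """Theil-Sen estimator: the median of all pairwise slopes.

    Robust to outlying years, unlike an ordinary least-squares slope.
    """
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2) if x[j] != x[i]]
    return median(slopes)
```

A single extreme year barely moves the Theil-Sen slope, which is why it suits annual streamflow series with occasional floods.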
Mathematics pre-service teachers’ statistical reasoning about meaning
NASA Astrophysics Data System (ADS)
Kristanto, Y. D.
2018-01-01
This article offers a descriptive qualitative analysis of three second-year pre-service teachers’ statistical reasoning about the mean. Twenty-six pre-service teachers were tested using an open-ended problem in which they were expected to analyze a method for finding the mean of a data set. Three of their test results were selected for analysis. The results suggest that the pre-service teachers did not use context to develop their interpretation of the mean. Therefore, this article also offers strategies to promote statistical reasoning about the mean that use various contexts.
Thompson, Ronald E.; Hoffman, Scott A.
2006-01-01
A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations indexed to concurrent daily mean flows at continuous-record stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The prediction methodology for developing the regression equations used to estimate statistics was developed for estimating low-flow frequencies. This study and a companion study found that the methodology also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and the equations used to predict the statistics. Caution is indicated in using the predicted statistics for small drainage area situations.
Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018. The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and Chi-Squared distribution.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Administration DEPARTMENT OF JUSTICE CONFIDENTIALITY OF IDENTIFIABLE RESEARCH AND STATISTICAL INFORMATION § 22.2... capacity. (c) Research or statistical project means any program, project, or component thereof which is... statistical information means any information which is collected during the conduct of a research or...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Administration DEPARTMENT OF JUSTICE CONFIDENTIALITY OF IDENTIFIABLE RESEARCH AND STATISTICAL INFORMATION § 22.2... capacity. (c) Research or statistical project means any program, project, or component thereof which is... statistical information means any information which is collected during the conduct of a research or...
Rasch fit statistics and sample size considerations for polytomous data
Smith, Adam B; Rush, Robert; Fallowfield, Lesley J; Velikova, Galina; Sharpe, Michael
2008-01-01
Background Previous research on educational data has demonstrated that Rasch fit statistics (mean squares and t-statistics) are highly susceptible to sample size variation for dichotomously scored rating data, although little is known about this relationship for polytomous data. These statistics help inform researchers about how well items fit to a unidimensional latent trait, and are an important adjunct to modern psychometrics. Given the increasing use of Rasch models in health research the purpose of this study was therefore to explore the relationship between fit statistics and sample size for polytomous data. Methods Data were collated from a heterogeneous sample of cancer patients (n = 4072) who had completed both the Patient Health Questionnaire – 9 and the Hospital Anxiety and Depression Scale. Ten samples were drawn with replacement for each of eight sample sizes (n = 25 to n = 3200). The Rating and Partial Credit Models were applied and the mean square and t-fit statistics (infit/outfit) derived for each model. Results The results demonstrated that t-statistics were highly sensitive to sample size, whereas mean square statistics remained relatively stable for polytomous data. Conclusion It was concluded that mean square statistics were relatively independent of sample size for polytomous data and that misfit to the model could be identified using published recommended ranges. PMID:18510722
Code of Federal Regulations, 2010 CFR
2010-01-01
... include Government-controlled corporations. Bureau of Labor Statistics (BLS) means the Bureau of Labor Statistics of the Department of Labor. Commonwealth of the Northern Mariana Islands (CNMI) means the....207. Detailed Expenditure Category (DEC) means the lowest level of expenditure shown in tabulated...
Evaluation of a New Mean Scaled and Moment Adjusted Test Statistic for SEM.
Tong, Xiaoxiao; Bentler, Peter M
2013-01-01
Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and two well-known robust test statistics. A modification to the Satorra-Bentler scaled statistic is developed for the condition that sample size is smaller than degrees of freedom. The behavior of the four test statistics is evaluated with a Monte Carlo confirmatory factor analysis study that varies seven sample sizes and three distributional conditions obtained using Headrick's fifth-order transformation to nonnormality. The new statistic performs badly in most conditions except under the normal distribution. The goodness-of-fit χ² test based on maximum-likelihood estimation performed well under normal distributions as well as under a condition of asymptotic robustness. The Satorra-Bentler scaled test statistic performed best overall, while the mean scaled and variance adjusted test statistic outperformed the others at small and moderate sample sizes under certain distributional conditions.
Methods for estimating flow-duration and annual mean-flow statistics for ungaged streams in Oklahoma
Esralew, Rachel A.; Smith, S. Jerrod
2010-01-01
Flow statistics can be used to provide decision makers with surface-water information needed for activities such as water-supply permitting, flow regulation, and other water rights issues. Flow statistics could be needed at any location along a stream. Most often, streamflow statistics are needed at ungaged sites, where no flow data are available to compute the statistics. Methods are presented in this report for estimating flow-duration and annual mean-flow statistics for ungaged streams in Oklahoma. Flow statistics included the (1) annual (period of record), (2) seasonal (summer-autumn and winter-spring), and (3) 12 monthly duration statistics, including the 20th, 50th, 80th, 90th, and 95th percentile flow exceedances, and the annual mean-flow (mean of daily flows for the period of record). Flow statistics were calculated from daily streamflow information collected from 235 streamflow-gaging stations throughout Oklahoma and areas in adjacent states. A drainage-area ratio method is the preferred method for estimating flow statistics at an ungaged location that is on a stream near a gage. The method generally is reliable only if the drainage-area ratio of the two sites is between 0.5 and 1.5. Regression equations that relate flow statistics to drainage-basin characteristics were developed for the purpose of estimating selected flow-duration and annual mean-flow statistics for ungaged streams that are not near gaging stations on the same stream. Regression equations were developed from flow statistics and drainage-basin characteristics for 113 unregulated gaging stations. Separate regression equations were developed by using U.S. Geological Survey streamflow-gaging stations in regions with similar drainage-basin characteristics. These equations can increase the accuracy of regression equations used for estimating flow-duration and annual mean-flow statistics at ungaged stream locations in Oklahoma. 
Streamflow-gaging stations were grouped by selected drainage-basin characteristics by using a k-means cluster analysis. Three regions were identified for Oklahoma on the basis of the clustering of gaging stations and a manual delineation of distinguishable hydrologic and geologic boundaries: Region 1 (western Oklahoma excluding the Oklahoma and Texas Panhandles), Region 2 (north- and south-central Oklahoma), and Region 3 (eastern and central Oklahoma). A total of 228 regression equations (225 flow-duration regressions and three annual mean-flow regressions) were developed using ordinary least-squares and left-censored (Tobit) multiple-regression techniques. These equations can be used to estimate 75 flow-duration statistics and annual mean flow for ungaged streams in the three regions. Drainage-basin characteristics that were statistically significant independent variables in the regression analyses were (1) contributing drainage area; (2) station elevation; (3) mean drainage-basin elevation; (4) channel slope; (5) percentage of forested canopy; (6) mean drainage-basin hillslope; (7) soil permeability; and (8) mean annual, seasonal, and monthly precipitation. The accuracy of flow-duration regression equations generally decreased from high-flow exceedance (low-exceedance probability) to low-flow exceedance (high-exceedance probability). This decrease may have happened because a greater uncertainty exists for low-flow estimates and low flow is largely affected by localized geology that was not quantified by the drainage-basin characteristics selected. The standard errors of estimate of regression equations for Region 1 (western Oklahoma) were substantially larger than those standard errors for other regions, especially for low-flow exceedances. These errors may be a result of greater variability in low flow because of increased irrigation activities in this region.
Regression equations may not be reliable for sites where the drainage-basin characteristics are outside the range of values of the independent variables.
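The drainage-area ratio method preferred by this report scales a flow statistic from a nearby gage on the same stream by the ratio of drainage areas, and the report considers it reliable only when that ratio is between 0.5 and 1.5. A minimal sketch under those assumptions (the function name is hypothetical):

```python
def drainage_area_ratio_estimate(q_gaged, area_gaged, area_ungaged):
    """Estimate a flow statistic at an ungaged site from a gaged site
    on the same stream: Q_ungaged = Q_gaged * (A_ungaged / A_gaged).

    Per the report above, the method is generally reliable only when the
    drainage-area ratio is between 0.5 and 1.5, so other ratios are rejected.
    """
    ratio = area_ungaged / area_gaged
    if not 0.5 <= ratio <= 1.5:
        raise ValueError(f"drainage-area ratio {ratio:.2f} outside 0.5-1.5")
    return q_gaged * ratio
```

Outside the 0.5-1.5 range the report instead recommends the regional regression equations.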
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
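Two of the approximations this review discusses are simple enough to sketch: estimating a missing SD from the range, and estimating a missing mean from the median and quartiles. These are rough illustrations in the spirit of the methods described, not the review's exact formulas, and both assume roughly symmetric data:

```python
def sd_from_range(minimum, maximum):
    """Rough SD approximation from the sample range: (max - min) / 4.

    A practical approximation of the kind surveyed above; it assumes an
    approximately symmetric, moderately sized sample.
    """
    return (maximum - minimum) / 4

def mean_from_quartiles(q1, median, q3):
    """Approximate a missing mean as (q1 + median + q3) / 3.

    A quartile-based estimator in the spirit of the formula the review
    found best preserved the precision of meta-analysis results.
    """
    return (q1 + median + q3) / 3
```

For heavily skewed outcomes, both approximations degrade, which is one reason the review still prefers full reporting of original trial data.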
Robust Statistics: What They Are, and Why They Are So Important
ERIC Educational Resources Information Center
Corlu, Sencer M.
2009-01-01
The problem with "classical" statistics all invoking the mean is that these estimates are notoriously influenced by atypical scores (outliers), partly because the mean itself is differentially influenced by outliers. In theory, "modern" statistics may generate more replicable characterizations of data, because at least in some…
The Utility of Robust Means in Statistics
ERIC Educational Resources Information Center
Goodwyn, Fara
2012-01-01
Location estimates calculated from heuristic data were examined using traditional and robust statistical methods. The current paper demonstrates the impact outliers have on the sample mean and proposes robust methods to control for outliers in sample data. Traditional methods fail because they rely on the statistical assumptions of normality and…
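The outlier sensitivity these two abstracts describe is easy to demonstrate with a trimmed mean, one common robust location estimate (a minimal sketch; the function name is hypothetical):

```python
def trimmed_mean(values, proportion=0.1):
    """Mean after discarding the smallest and largest `proportion` of values.

    A common robust alternative to the ordinary mean: a single extreme
    score cannot drag the estimate, because it is trimmed away.
    """
    data = sorted(values)
    k = int(len(data) * proportion)  # count trimmed from each tail
    trimmed = data[k:len(data) - k] if k else data
    return sum(trimmed) / len(trimmed)
```

For the sample [1, 2, ..., 9, 1000], the ordinary mean is 104.5, while the 10-percent trimmed mean is 5.5, close to the bulk of the data, which is the behavior robust statistics are designed to deliver.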
Code of Federal Regulations, 2013 CFR
2013-04-01
... Community Development Act of 1974 (42 U.S.C. 5302). Eligible Metropolitan Statistical Area (EMSA) means a....C. 12902). Metropolitan statistical area has the meaning given it in section 853(5) of the AIDS... diseases means the disease of acquired immunodeficiency syndrome or any conditions arising from the...
Code of Federal Regulations, 2010 CFR
2010-07-01
... of the Juvenile Justice and Delinquency Prevention Act of 1974, Public Law 93-415, as amended by..., Research, and Statistics. (d) LEAA means the Law Enforcement Assistance Administration. (e) NIJ means the National Institute of Justice. (f) BJS means the Bureau of Justice Statistics. (g) Employment practices...
Quality of reporting statistics in two Indian pharmacology journals
Jaykaran; Yadav, Preeti
2011-01-01
Objective: To evaluate the reporting of the statistical methods in articles published in two Indian pharmacology journals. Materials and Methods: All original articles published since 2002 were downloaded from the journals’ (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) websites. These articles were evaluated on the basis of appropriateness of descriptive statistics and inferential statistics. Descriptive statistics were evaluated on the basis of reporting of method of description and central tendencies. Inferential statistics were evaluated on the basis of fulfillment of the assumptions of statistical methods and appropriateness of statistical tests. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Results: Inappropriate descriptive statistics were observed in 150 (78.1%, 95% CI 71.7–83.3%) articles. The most common reason for inappropriate descriptive statistics was use of mean ± SEM in place of “mean (SD)” or “mean ± SD.” The most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of the assumptions of a statistical test was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6–38.6%) articles. The most common reason for an inappropriate statistical test was the use of a two-group test for three or more groups. Conclusion: Articles published in two Indian pharmacology journals are not devoid of statistical errors. PMID:21772766
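The mean ± SEM versus mean (SD) problem flagged above comes down to the relation SEM = SD/sqrt(n): the SEM describes the precision of the mean, not the spread of the data. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=100.0, scale=15.0, size=25)

sd = sample.std(ddof=1)          # spread of the observations
sem = sd / np.sqrt(len(sample))  # precision of the mean only

# Reporting mean +/- SEM makes the data look far less variable:
# with n = 25, the SEM is exactly one fifth of the SD.
```

This is why reviewers treat mean ± SEM as a misleading substitute for mean (SD) when the goal is to describe variability.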
Parallel auto-correlative statistics with VTK.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
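The auto-correlative statistic itself (not the VTK engine API, which the report illustrates in C++) can be sketched in a few lines; this is a plain serial implementation of the sample autocorrelation for illustration:

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation r_k for lags 0..max_lag: the serial
    analogue of what an auto-correlative statistics engine computes."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(1)
# AR(1)-like series: strong correlation at short lags.
e = rng.normal(size=1000)
y = np.empty_like(e)
y[0] = e[0]
for t in range(1, len(e)):
    y[t] = 0.8 * y[t - 1] + e[t]
r = autocorrelation(y, 5)
```

A parallel engine decomposes the sums above across processes, which is why the statistic scales well.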
Asquith, William H.; Barbie, Dana L.
2014-01-01
Selected summary statistics (L-moments) and estimates of respective sampling variances were computed for the 35 streamgages lacking statistically significant trends. From the L-moments and estimated sampling variances, weighted means or regional values were computed for each L-moment. An example application is included demonstrating how the L-moments could be used to evaluate the magnitude and frequency of annual mean streamflow.
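A minimal sketch of the first two sample L-moments, computed from probability-weighted moments; the flow values below are hypothetical:

```python
import numpy as np

def first_two_l_moments(data):
    """First two sample L-moments via probability-weighted moments:
    l1 = b0 (the mean); l2 = 2*b1 - b0 (a dispersion measure)."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    return b0, 2.0 * b1 - b0

flows = [112.0, 95.0, 130.0, 78.0, 150.0, 101.0, 88.0, 123.0]
l1, l2 = first_two_l_moments(flows)
# l2 / l1 is the L-CV, a robust analogue of the coefficient of variation.
```

Regional values such as those described above are typically weighted means of these per-station L-moments, weighted by the inverse of the estimated sampling variances.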
Lexical Ambiguities in the Vocabulary of Statistics
ERIC Educational Resources Information Center
Whitaker, Douglas
2016-01-01
Lexical ambiguities exist when two different meanings are ascribed to the same word. Such lexical ambiguities can be particularly problematic for learning material with technical words that have everyday meanings that are not the same as the technical meaning. This study reports on lexical ambiguities in six statistical words germane to statistics…
Assaad, Houssein I; Choudhary, Pankaj K
2013-01-01
The L-statistics form an important class of estimators in nonparametric statistics. Their members include trimmed means and sample quantiles and functions thereof. This article is devoted to theory and applications of L-statistics for repeated measurements data, wherein the measurements on the same subject are dependent and the measurements from different subjects are independent. This article has three main goals: (a) Show that the L-statistics are asymptotically normal for repeated measurements data. (b) Present three statistical applications of this result, namely, location estimation using trimmed means, quantile estimation and construction of tolerance intervals. (c) Obtain a Bahadur representation for sample quantiles. These results are generalizations of similar results for independently and identically distributed data. The practical usefulness of these results is illustrated by analyzing a real data set involving measurement of systolic blood pressure. The properties of the proposed point and interval estimators are examined via simulation.
Wagner, Daniel M.; Krieger, Joshua D.; Merriman, Katherine R.
2014-01-01
The U.S. Geological Survey (USGS) and the U.S. Army Corps of Engineers (USACE) conducted a statistical analysis of trends in precipitation, streamflow, reservoir pool elevations, and reservoir releases in Arkansas and selected sites in Louisiana, Missouri, and Oklahoma for the period 1951–2011. The Mann-Kendall test was used to test for trends in annual and seasonal precipitation, annual and seasonal streamflows of 42 continuous-record USGS streamflow-gaging stations, annual pool elevations and releases from 16 USACE reservoirs, and annual releases from 11 dams on the Arkansas River. A statistically significant (p≤0.10) upward trend was observed in annual precipitation for the State, with a Sen slope of approximately 0.10 inch per year. Autumn and winter were the only seasons that had statistically significant trends in precipitation. Five of six physiographic sections and six of seven 4-digit hydrologic unit code (HUC) regions in Arkansas had statistically significant upward trends in autumn precipitation, with Sen slopes of approximately 0.06 to 0.10 inch per year. Sixteen sites had statistically significant upward trends in the annual mean daily streamflow and were located on streams that drained regions with statistically significant upward trends in annual precipitation. Expected annual rates of change corresponding to statistically significant trends in annual mean daily streamflows, which ranged from 0.32 to 0.88 percent, were greater than those corresponding to regions with statistically significant upward trends in annual precipitation, which ranged from 0.19 to 0.28 percent, suggesting that the observed trends in regional annual precipitation do not fully account for the observed trends in annual mean daily streamflows. Trends in annual maximum daily streamflows were similar to trends in the annual mean daily streamflows but were only statistically significant at seven sites. 
There were more statistically significant trends (28 of 42 sites) in the annual minimum daily streamflows than in the annual means or maximums. Statistically significant trends in the annual minimum daily streamflows were upward at 18 sites and downward at 10 sites. Despite autumn being the only season that had statistically significant upward trends in seasonal precipitation, statistically significant upward trends in seasonal mean streamflows occurred in every season but spring. Trends in the annual mean, maximum, and minimum daily pool elevations of USACE reservoirs were consistent between metrics for reservoirs in the White, Arkansas, and Ouachita River watersheds, while trends varied between metrics at DeQueen Lake, Millwood Lake, and Lake Chicot. Most of the statistically significant trends in pool elevation metrics were upward and gradual—Sen slopes were less than 0.37 foot per year—and were likely the result of changes in reservoir regulation plans. Trends in the annual mean and maximum daily releases from USACE reservoirs were generally upward in all HUC regions. There were few statistically significant trends in the annual mean daily releases because the reservoirs are operated to maintain a regulation stage at a downstream site according to guidelines set forth in the regulation plans of the reservoirs. The annual number of low-flow days was both increasing and decreasing for reservoirs in northern Arkansas and southern Missouri and generally increasing for reservoirs in southern Arkansas.
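The Mann-Kendall S statistic and Sen slope used in the analysis above can be sketched as follows; this minimal version omits tie corrections and the significance (p-value) computation:

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S: sum of signs of all pairwise later-minus-earlier
    differences. A positive S suggests an upward trend."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return int(sum(np.sign(x[j] - x[i])
                   for i in range(n - 1) for j in range(i + 1, n)))

def sen_slope(x):
    """Sen slope: median of all pairwise slopes (x_j - x_i) / (j - i),
    a robust estimate of trend magnitude per time step."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i)
              for i in range(n - 1) for j in range(i + 1, n)]
    return float(np.median(slopes))

# Hypothetical annual precipitation series (inches):
annual_precip = [40.1, 41.0, 40.5, 42.2, 42.0, 43.1, 43.5]
s = mann_kendall_s(annual_precip)
slope = sen_slope(annual_precip)
```

For this short upward-trending series, S is strongly positive and the Sen slope is roughly half an inch per year, the same kind of quantity reported in the study above.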
The Interplay between Spoken Language and Informal Definitions of Statistical Concepts
ERIC Educational Resources Information Center
Lavy, Ilana; Mashiach-Eizenberg, Michal
2009-01-01
Various terms are used to describe mathematical concepts, in general, and statistical concepts, in particular. Regarding statistical concepts in the Hebrew language, some of these terms have the same meaning both in their everyday use and in mathematics, such as Mode; some of them have a different meaning, such as Expected value and Life…
Technique for estimation of streamflow statistics in mineral areas of interest in Afghanistan
Olson, Scott A.; Mack, Thomas J.
2011-01-01
A technique for estimating streamflow statistics at ungaged stream sites in areas of mineral interest in Afghanistan using drainage-area-ratio relations of historical streamflow data was developed and is documented in this report. The technique can be used to estimate the following streamflow statistics at ungaged sites: (1) 7-day low flow with a 10-year recurrence interval, (2) 7-day low flow with a 2-year recurrence interval, (3) daily mean streamflow exceeded 90 percent of the time, (4) daily mean streamflow exceeded 80 percent of the time, (5) mean monthly streamflow for each month of the year, (6) mean annual streamflow, and (7) minimum monthly streamflow for each month of the year. Because they are based on limited historical data, the estimates of streamflow statistics at ungaged sites are considered preliminary.
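The drainage-area-ratio technique reduces to scaling a gaged-site statistic by the ratio of drainage areas. A minimal sketch, with an illustrative exponent of 1.0 (regional studies often fit a different value):

```python
def drainage_area_ratio_estimate(q_gaged, a_gaged, a_ungaged, exponent=1.0):
    """Transfer a streamflow statistic from a gaged to an ungaged site
    by scaling with the drainage-area ratio. The exponent of 1.0 is an
    illustrative default, not a value from the report."""
    return q_gaged * (a_ungaged / a_gaged) ** exponent

# e.g. a mean annual flow of 12 m^3/s at a 300 km^2 gaged basin,
# transferred to a nearby 120 km^2 ungaged basin:
q_ungaged = drainage_area_ratio_estimate(12.0, 300.0, 120.0)
```

The method assumes the two basins respond similarly per unit area, which is why estimates built on limited records are labeled preliminary.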
Explorations in statistics: the log transformation.
Curran-Everett, Douglas
2018-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This thirteenth installment of Explorations in Statistics explores the log transformation, an established technique that rescales the actual observations from an experiment so that the assumptions of some statistical analysis are better met. A general assumption in statistics is that the variability of some response Y is homogeneous across groups or across some predictor variable X. If the variability-the standard deviation-varies in rough proportion to the mean value of Y, a log transformation can equalize the standard deviations. Moreover, if the actual observations from an experiment conform to a skewed distribution, then a log transformation can make the theoretical distribution of the sample mean more consistent with a normal distribution. This is important: the results of a one-sample t test are meaningful only if the theoretical distribution of the sample mean is roughly normal. If we log-transform our observations, then we want to confirm the transformation was useful. We can do this if we use the Box-Cox method, if we bootstrap the sample mean and the statistic t itself, and if we assess the residual plots from the statistical model of the actual and transformed sample observations.
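The variance-stabilizing effect described above can be demonstrated with a small simulation, assuming lognormal data whose standard deviation grows in proportion to the mean:

```python
import numpy as np

rng = np.random.default_rng(7)
# Two skewed groups whose SD is proportional to the mean:
low = rng.lognormal(mean=0.0, sigma=0.5, size=5000)
high = rng.lognormal(mean=2.0, sigma=0.5, size=5000)

# On the raw scale the SDs are wildly unequal ...
sd_ratio_raw = high.std(ddof=1) / low.std(ddof=1)
# ... but after a log transformation they are nearly identical.
sd_ratio_log = np.log(high).std(ddof=1) / np.log(low).std(ddof=1)
```

This is exactly the situation the installment describes: when SD varies in rough proportion to the mean, taking logs equalizes the group standard deviations and pulls the distribution toward normality.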
A critique of the usefulness of inferential statistics in applied behavior analysis
Hopkins, B. L.; Cole, Brian L.; Mason, Tina L.
1998-01-01
Researchers continue to recommend that applied behavior analysts use inferential statistics in making decisions about effects of independent variables on dependent variables. In many other approaches to behavioral science, inferential statistics are the primary means for deciding the importance of effects. Several possible uses of inferential statistics are considered. Rather than being an objective means for making decisions about effects, as is often claimed, inferential statistics are shown to be subjective. It is argued that the use of inferential statistics adds nothing to the complex and admittedly subjective nonstatistical methods that are often employed in applied behavior analysis. Attacks on inferential statistics that are being made, perhaps with increasing frequency, by those who are not behavior analysts, are discussed. These attackers are calling for banning the use of inferential statistics in research publications and commonly recommend that behavioral scientists should switch to using statistics aimed at interval estimation or the method of confidence intervals. Interval estimation is shown to be contrary to the fundamental assumption of behavior analysis that only individuals behave. It is recommended that authors who wish to publish the results of inferential statistics be asked to justify them as a means for helping us to identify any ways in which they may be useful. PMID:22478304
ERIC Educational Resources Information Center
Gordon, Sheldon P.; Gordon, Florence S.
2010-01-01
One of the most important applications of the definite integral in a modern calculus course is the mean value of a function. Thus, if a function "f" is defined on an interval ["a", "b"], then the mean, or average value, of "f" is given by [image omitted]. In this note, we will investigate the meaning of other statistics associated with a function…
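The mean value of a function described above, (1/(b - a)) times the definite integral of f over [a, b], can be checked numerically. A minimal sketch using a midpoint Riemann sum:

```python
import numpy as np

def mean_value(f, a, b, n=100_000):
    """Numerical mean value of f on [a, b]:
    (1/(b - a)) * integral of f, via a midpoint Riemann sum."""
    x = np.linspace(a, b, n, endpoint=False) + (b - a) / (2 * n)
    return f(x).mean()

# Mean value of f(x) = x^2 on [0, 3] is (1/3) * (27/3) = 3.
m = mean_value(lambda x: x ** 2, 0.0, 3.0)
```

Averaging f at evenly spaced midpoints is the discrete analogue of the integral formula, which is what makes the connection to ordinary sample means natural in a calculus course.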
Comparison between the Laser-Badal and Vernier Optometers.
1988-09-01
naval aviators (SNAs). We also measured dark vergence in the same sample of SNAs. THE FINDINGS There was no statistically significant difference found...relatively inexperienced operator. 7. The difference between mean scores on the vernier and laser-Badal optometers was statistically significant...thus indicating that test results were reliable within instruments. Table 1. Test and Retest Statistics. Measure Mean SD n t-value Dark vergence
1988-12-09
[Contents excerpt] 5.4 Measurement of Second Order Statistics; 5.6 Measurement of Triple Products; Uncertainty Analysis. ...although the deterministic fluctuations, u′², were 25 times larger than the mean fluctuations, u, there were no significant variations in the mean statistical ...input signals, the three velocity components are calculated, and individual phase ensembles are collected for the appropriate statistic...
Kılıç, D; Göksu, E; Kılıç, T; Buyurgan, C S
2018-05-01
The aim of this randomized cross-over study was to compare one-minute and two-minute continuous chest compressions in terms of compression-only CPR quality metrics on a mannequin model in the ED. Thirty-six emergency medicine residents participated in this study. In the 1-minute group, there was no statistically significant difference in the mean compression rate (p=0.83), mean compression depth (p=0.61), good compressions (p=0.31), the percentage of complete release (p=0.07), adequate compression depth (p=0.11) or the percentage of good rate (p=0.51) over the four-minute time period. Only flow time was statistically significant among the 1-minute intervals (p<0.001). In the 2-minute group, the mean compression depth (p=0.19), good compression (p=0.92), the percentage of complete release (p=0.28), adequate compression depth (p=0.96), and the percentage of good rate (p=0.09) were not statistically significant over time. In this group, the number of compressions (248±31 vs 253±33, p=0.01), mean compression rates (123±15 vs 126±17, p=0.01) and flow time (p=0.001) were statistically significant across the two-minute intervals. There was no statistically significant difference in the mean number of chest compressions per minute, mean chest compression depth, the percentage of good compressions, complete release, adequate chest compression depth, or the percentage of good rate between the 1-minute and 2-minute groups. Overall, there was no statistically significant difference in the quality metrics of chest compressions between the 1-minute and 2-minute compression-only groups. Copyright © 2017 Elsevier Inc. All rights reserved.
Borba, Alexandre Meireles; José da Silva, Everton; Fernandes da Silva, André Luis; Han, Michael D; da Graça Naclério-Homem, Maria; Miloro, Michael
2018-01-12
To verify predicted versus obtained surgical movements in 2-dimensional (2D) and 3-dimensional (3D) measurements and compare the equivalence between these methods. A retrospective observational study of bimaxillary orthognathic surgeries was performed. Postoperative cone-beam computed tomographic (CBCT) scans were superimposed on preoperative scans and a lateral cephalometric radiograph was generated from each CBCT scan. After identification of the sella, nasion, and upper central incisor tip landmarks on 2D and 3D images, actual and planned movements were compared by cephalometric measurements. One-sample t test was used to statistically evaluate results, with expected mean discrepancy values ranging from 0 to 2 mm. Equivalence of 2D and 3D values was compared using paired t test. The final sample of 46 cases showed by 2D cephalometry that differences between actual and planned movements in the horizontal axis were statistically relevant for expected means of 0, 0.5, and 2 mm without relevance for expected means of 1 and 1.5 mm; vertical movements were statistically relevant for expected means of 0 and 0.5 mm without relevance for expected means of 1, 1.5, and 2 mm. For 3D cephalometry in the horizontal axis, there were statistically relevant differences for expected means of 0, 1.5, and 2 mm without relevance for expected means of 0.5 and 1 mm; vertical movements showed statistically relevant differences for expected means of 0, 0.5, 1.5 and 2 mm without relevance for the expected mean of 1 mm. Comparison of 2D and 3D values displayed statistical differences for the horizontal and vertical axes. Comparison of 2D and 3D surgical outcome assessments should be performed with caution because there seems to be a difference in acceptable levels of accuracy between these 2 methods of evaluation. Moreover, 3D accuracy studies should no longer rely on a 2-mm level of discrepancy but on a 1-mm level. Copyright © 2018 American Association of Oral and Maxillofacial Surgeons. 
Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Jacob, Bridgette L.
2013-01-01
The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…
Parameter Estimation in Astronomy with Poisson-Distributed Data. I. The χ²_γ Statistic
NASA Technical Reports Server (NTRS)
Mighell, Kenneth J.
1999-01-01
Applying the standard weighted mean formula, [Σᵢ nᵢσᵢ⁻²]/[Σᵢ σᵢ⁻²], to determine the weighted mean of data, nᵢ, drawn from a Poisson distribution will, on average, underestimate the true mean by about 1 for all true mean values larger than about 3 when the common assumption is made that the error of the ith observation is σᵢ = max(√nᵢ, 1). This small but statistically significant offset explains the long-known observation that chi-square minimization techniques which use the modified Neyman's χ² statistic, χ²_N ≡ Σᵢ (nᵢ − yᵢ)²/max(nᵢ, 1), to compare Poisson-distributed data with model values, yᵢ, will typically predict a total number of counts that underestimates the true total by about 1 count per bin. Based on my finding that the weighted mean of data drawn from a Poisson distribution can be determined using the formula [Σᵢ (nᵢ + min(nᵢ, 1))(nᵢ + 1)⁻¹]/[Σᵢ (nᵢ + 1)⁻¹], I propose that a new χ² statistic, χ²_γ ≡ Σᵢ (nᵢ + min(nᵢ, 1) − yᵢ)²/(nᵢ + 1), should always be used to analyze Poisson-distributed data in preference to the modified Neyman's χ² statistic. I demonstrate the power and usefulness of χ²_γ minimization by using two statistical fitting techniques and five χ² statistics to analyze simulated X-ray power-law 15-channel spectra with large and small numbers of counts per bin. I show that χ²_γ minimization with the Levenberg-Marquardt or Powell's method can produce excellent results (mean slope errors of roughly 3% or less) with spectra having as few as 25 total counts.
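The bias described above can be reproduced in a small simulation, using the conventional weights with sigma_i = max(sqrt(n_i), 1) and the abstract's alternative weighted-mean formula, sum((n_i + min(n_i, 1))/(n_i + 1)) / sum(1/(n_i + 1)):

```python
import numpy as np

rng = np.random.default_rng(3)
mu = 5.0
n = rng.poisson(mu, size=200_000)

# Conventional weighted mean with sigma_i = max(sqrt(n_i), 1),
# i.e. weights 1/sigma_i^2 = 1/max(n_i, 1):
w = 1.0 / np.maximum(n, 1)
conventional = np.sum(w * n) / np.sum(w)

# Alternative formula quoted in the abstract (unbiased for Poisson data):
gamma_est = (np.sum((n + np.minimum(n, 1)) / (n + 1.0))
             / np.sum(1.0 / (n + 1.0)))
```

With a true mean of 5, the conventional estimate comes out low by roughly one count, while the alternative estimator recovers the mean, matching the abstract's claim.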
Endpoint in plasma etch process using new modified w-multivariate charts and windowed regression
NASA Astrophysics Data System (ADS)
Zakour, Sihem Ben; Taleb, Hassen
2017-09-01
Endpoint detection is essential for understanding and verifying that a plasma etching process has run correctly, especially when the etched area is very small (0.1%), and it is a crucial part of delivering repeatable results on every wafer. The endpoint is reached when the film being etched has been completely cleared. To ensure the desired device performance of the produced integrated circuit, an optical emission spectroscopy (OES) sensor is employed. The large number of gathered wavelengths (profiles) is first analyzed and pre-processed using a newly proposed algorithm named spectra peak selection (SPS) to select the important wavelengths; wavelet analysis (WA) is then applied to enhance detection performance by suppressing noise and redundant information. The selected and treated OES wavelengths are then used in modified multivariate control charts (MEWMA and Hotelling) for three statistics (mean, SD and CV) and in windowed polynomial regression for the mean. The use of the three aforementioned statistics is motivated by the need to control mean shift, variance shift, and their ratio (CV) when both the mean and SD are unstable. The control charts demonstrate their performance in detecting the endpoint, with the W-mean Hotelling chart performing best and the CV statistic worst. Because the best endpoint detection is given by the W-Hotelling mean statistic, this statistic is used to construct a windowed wavelet Hotelling polynomial regression, which can identify the window containing the endpoint phenomenon.
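The Hotelling chart above is built on the T² statistic. A minimal sketch of plain Hotelling T² follows; the paper's W-modified and MEWMA variants are not reproduced here, and all data are simulated:

```python
import numpy as np

def hotelling_t2(sample, mean0):
    """Hotelling T^2 for testing whether the sample mean vector equals
    mean0: T^2 = n * (xbar - mean0)' S^-1 (xbar - mean0)."""
    x = np.asarray(sample, dtype=float)
    n = x.shape[0]
    d = x.mean(axis=0) - np.asarray(mean0, dtype=float)
    s_inv = np.linalg.inv(np.cov(x, rowvar=False))
    return float(n * d @ s_inv @ d)

rng = np.random.default_rng(5)
mean0 = np.zeros(3)
in_control = rng.normal(size=(50, 3))        # process on target
shifted = rng.normal(loc=1.0, size=(50, 3))  # mean shift in all channels

t2_ok = hotelling_t2(in_control, mean0)
t2_shift = hotelling_t2(shifted, mean0)
```

A control chart signals when T² exceeds a control limit; a mean shift in the monitored wavelengths inflates T² sharply, which is the mechanism the endpoint charts exploit.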
Simple Statistics - Summarized!
ERIC Educational Resources Information Center
Blai, Boris, Jr.
Statistics are an essential tool for making sound decisions. The field is concerned with probability distribution models, testing of hypotheses, significance tests, and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median, and mode. A second…
Code of Federal Regulations, 2010 CFR
2010-07-01
... agency which has received a research, statistics, discretionary, technical assistance, special emphasis...) Categorical grant applicant means a public or private agency which has applied for a research, statistics... Justice Act means the Juvenile Justice and Delinquency Prevention Act of 1974, 42 U.S.C. 5601, et seq., as...
NASA Astrophysics Data System (ADS)
Langley, Robin S.
2018-03-01
This work is concerned with the statistical properties of the frequency response function of the energy of a random system. Earlier studies have considered the statistical distribution of the function at a single frequency, or alternatively the statistics of a band-average of the function. In contrast the present analysis considers the statistical fluctuations over a frequency band, and results are obtained for the mean rate at which the function crosses a specified level (or equivalently, the average number of times the level is crossed within the band). Results are also obtained for the probability of crossing a specified level at least once, the mean rate of occurrence of peaks, and the mean trough-to-peak height. The analysis is based on the assumption that the natural frequencies and mode shapes of the system have statistical properties that are governed by the Gaussian Orthogonal Ensemble (GOE), and the validity of this assumption is demonstrated by comparison with numerical simulations for a random plate. The work has application to the assessment of the performance of dynamic systems that are sensitive to random imperfections.
Streamflow statistics for selected streams in North Dakota, Minnesota, Manitoba, and Saskatchewan
Williams-Sether, Tara
2012-01-01
Statistical summaries of streamflow data for the periods of record through water year 2009 for selected active and discontinued U.S. Geological Survey streamflow-gaging stations in North Dakota, Minnesota, Manitoba, and Saskatchewan were compiled. The summaries for each streamflow-gaging station include a brief station description, a graph of the annual peak and annual mean discharge for the period of record, statistics of monthly and annual mean discharges, monthly and annual flow durations, probability of occurrence of annual high discharges, annual peak discharge and corresponding gage height for the period of record, and monthly and annual mean discharges for the period of record.
Saleh, Dina K.
2010-01-01
Statistical summaries of streamflow data for all long-term streamflow-gaging stations in the Tigris River and Euphrates River Basins in Iraq are presented in this report. The summaries for each streamflow-gaging station include (1) a station description, (2) a graph showing annual mean discharge for the period of record, (3) a table of extremes and statistics for monthly and annual mean discharge, (4) a graph showing monthly maximum, minimum, and mean discharge, (5) a table of monthly and annual mean discharges for the period of record, (6) a graph showing annual flow duration, (7) a table of monthly and annual flow duration, (8) a table of high-flow frequency data (maximum mean discharge for 3-, 7-, 15-, and 30-day periods for selected exceedance probabilities), and (9) a table of low-flow frequency data (minimum mean discharge for 3-, 7-, 15-, 30-, 60-, 90-, and 183-day periods for selected non-exceedance probabilities).
An Overview of Particle Sampling Bias
NASA Technical Reports Server (NTRS)
Meyers, James F.; Edwards, Robert V.
1984-01-01
The complex relation between particle arrival statistics and the interarrival statistics is explored. It is known that the mean interarrival time given an initial velocity is generally not the inverse of the mean rate corresponding to that velocity. Necessary conditions for the measurement of the conditional rate are given.
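The sampling bias described above can be demonstrated with a small simulation in which the probability of observing a particle is proportional to its velocity, so the sampled mean equals E[v²]/E[v] rather than E[v]. All parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)
# True velocity population of the particles:
v = rng.normal(loc=10.0, scale=3.0, size=200_000)
v = v[v > 0]  # keep forward-moving particles

# Faster particles cross the measurement volume more often, so the
# probability of a particle being sampled is proportional to velocity:
p = v / v.sum()
sampled = rng.choice(v, size=100_000, p=p)

true_mean = v.mean()
sampled_mean = sampled.mean()  # biased high: approx E[v^2]/E[v]
```

Naively inverting the mean sampled rate therefore does not recover the mean interarrival time, which is the caution the abstract raises.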
Statistical physics of hard combinatorial optimization: Vertex cover problem
NASA Astrophysics Data System (ADS)
Zhao, Jin-Hua; Zhou, Hai-Jun
2014-07-01
Typical-case computational complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computational complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem with wide applications, as an example to introduce the statistical physics methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. An unfamiliar reader should be able to understand, to a large extent, the physics behind the mean field approaches and how to adapt the mean field methods to other optimization problems.
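As a concrete, elementary example of vertex cover algorithmics (deliberately not the message-passing methods the paper introduces), the classical maximal-matching 2-approximation can be written in a few lines:

```python
def vertex_cover_2approx(edges):
    """Greedy maximal-matching 2-approximation for minimum vertex
    cover: repeatedly take both endpoints of any uncovered edge.
    The result is at most twice the optimal cover size."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

edges = [(1, 2), (2, 3), (3, 4), (4, 5)]  # a path graph
cover = vertex_cover_2approx(edges)
```

On this path graph the optimum cover is {2, 4} (size 2); the greedy cover has size 4, within the guaranteed factor of two. The statistical physics analysis asks how close such heuristics get on typical random instances.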
Wind speed statistics for Goldstone, California, anemometer sites
NASA Technical Reports Server (NTRS)
Berg, M.; Levy, R.; Mcginness, H.; Strain, D.
1981-01-01
An exploratory wind survey at an antenna complex was summarized statistically for application to future windmill designs. Data were collected at six locations from a total of 10 anemometers. Statistics include means, standard deviations, cubes, pattern factors, correlation coefficients, and exponents for power law profile of wind speed. Curves presented include: mean monthly wind speeds, moving averages, and diurnal variation patterns. It is concluded that three of the locations have sufficiently strong winds to justify consideration for windmill sites.
Mathematical and Statistical Software Index.
1986-08-01
[Index excerpt] ...geometric) mean; HMEAN - harmonic mean; MEDIAN - median; MODE - mode; QUANT - quantiles; OGIVE - distribution curve; IQRNG - interpercentile range; RANGE - range. Keywords: multiphase pivoting algorithm; cross-classification; multiple discriminant analysis; cross-tabulation; multiple-objective model; curve fitting. Entries include *RANGEX (Correct Correlations for Curtailment of Range) and *RUMMAGE II (Analysis...
Selected Streamflow Statistics for Streamgaging Stations in Northeastern Maryland, 2006
Ries, Kernell G.
2006-01-01
Streamflow statistics were calculated for 47 U.S. Geological Survey (USGS) streamgaging stations in northeastern Maryland, in cooperation with (1) the University of Maryland, Baltimore County, Center for Urban Environmental Research and Education; (2) the Baltimore City Department of Public Works; and (3) the Baltimore County Department of Environmental Protection and Resource Management. The statistics include the mean, minimum, maximum, and standard deviation of the daily mean discharges for the periods of record at the stations, as well as flow-duration and low-flow frequency statistics. The flow-duration statistics include the 1-, 2-, 5-, 10-, 15-, 20-, 25-, 30-, 40-, 50-, 60-, 70-, 75-, 80-, 85-, 90-, 95-, 98-, and 99-percent duration discharges. The low-flow frequency statistics include the average discharges for 1, 7, 14, and 30 days that recur, on average, once in 1.01, 2, 5, 10, 20, 50, and 100 years. The statistics were computed only for the 25 stations with periods of record of 10 years or more. The statistics were computed from records available through September 30, 2004 using standard methods and computer software developed by the USGS. A comparison between low-flow frequency statistics computed for this study and for a previous study that used data available through September 30, 1989 was done for seven stations. The comparison indicated that, for the 7-day mean low flow, the newer values were 19.8 and 15.3 percent lower for the 20- and 10-year recurrence intervals, respectively, and 2.1 percent higher for the 2-year recurrence interval, than the older values. For the 14-day mean low flow, the newer 20- and 10-year values were 25.2 and 15.5 percent lower, respectively, and the 2-year value was 2.9 percent higher than the older values. For the 30-day mean low flow, the newer 20-, 10-, and 2-year values were 10.8, 7.9, and 0.8 percent lower, respectively, than the older values. 
The newer values are generally lower than the older ones most likely because two major droughts have occurred since the older study was completed.
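The flow-duration statistics listed above are sample percentiles of the daily mean discharges: the p-percent duration discharge is the flow equalled or exceeded p percent of the time, i.e. the (100 - p)th percentile. A minimal sketch with simulated daily flows:

```python
import numpy as np

def duration_discharge(daily_flows, percent_exceeded):
    """p-percent duration discharge: the flow equalled or exceeded
    p percent of the time, i.e. the (100 - p)th percentile."""
    return np.percentile(daily_flows, 100.0 - percent_exceeded)

rng = np.random.default_rng(9)
# ~10 years of hypothetical daily flows (skewed, like real discharge):
flows = rng.lognormal(mean=3.0, sigma=1.0, size=3650)

d90 = duration_discharge(flows, 90)  # low-flow end of the curve
d50 = duration_discharge(flows, 50)  # median flow
d10 = duration_discharge(flows, 10)  # high-flow end of the curve
```

By construction the duration discharges decrease as the exceedance percentage increases, which is the familiar shape of a flow-duration curve.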
Analysis of statistical misconception in terms of statistical reasoning
NASA Astrophysics Data System (ADS)
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the era of globalization, because every person must be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. This skill can be developed through various levels of education. However, the skill remains low because many people, students included, assume that statistics is merely counting and using formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 (standard deviation 10.6), whereas the mean value of the statistical reasoning skill test was 51.8 (standard deviation 8.5). If a minimum value of 65 is taken as the standard for achieving course competence, the students' mean values fall below that standard. The results of the misconception study indicate which subtopics should be reconsidered. Based on the assessment results, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
Self-consistent mean-field approach to the statistical level density in spherical nuclei
NASA Astrophysics Data System (ADS)
Kolomietz, V. M.; Sanzhur, A. I.; Shlomo, S.
2018-06-01
A self-consistent mean-field approach within the extended Thomas-Fermi approximation with Skyrme forces is applied to calculations of the statistical level density in spherical nuclei. Landau's concept of quasiparticles with the nucleon effective mass and a correct description of the continuum states for finite-depth potentials are taken into consideration. The A dependence and the temperature dependence of the statistical inverse level-density parameter K are obtained in good agreement with experimental data.
On the Spike Train Variability Characterized by Variance-to-Mean Power Relationship.
Koyama, Shinsuke
2015-07-01
We propose a statistical method for modeling the non-Poisson variability of spike trains observed in a wide range of brain regions. Central to our approach is the assumption that the variance and the mean of interspike intervals are related by a power function characterized by two parameters: the scale factor and exponent. It is shown that this single assumption allows the variability of spike trains to have an arbitrary scale and various dependencies on the firing rate in the spike count statistics, as well as in the interval statistics, depending on the two parameters of the power function. We also propose a statistical model for spike trains that exhibits the variance-to-mean power relationship. Based on this, a maximum likelihood method is developed for inferring the parameters from rate-modulated spike trains. The proposed method is illustrated on simulated and experimental spike trains.
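The variance-to-mean power relationship described above can be sketched with a short fit. Assuming, as the abstract states, that interspike-interval variance and mean are related by a power function var = phi * mean**alpha with scale factor phi and exponent alpha, both parameters can be recovered by ordinary least squares on the log-log scale. The numbers below are illustrative, not taken from the paper:

```python
import math

# Illustrative (mean, variance) pairs for interspike intervals, generated
# to satisfy var = phi * mean**alpha with phi = 0.5 and alpha = 1.8.
phi_true, alpha_true = 0.5, 1.8
means = [0.01, 0.02, 0.05, 0.1, 0.2, 0.5]
variances = [phi_true * m ** alpha_true for m in means]

# Fit log(var) = log(phi) + alpha * log(mean) by ordinary least squares.
x = [math.log(m) for m in means]
y = [math.log(v) for v in variances]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
alpha_hat = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
phi_hat = math.exp(my - alpha_hat * mx)
```

With noiseless synthetic data the fit recovers phi and alpha exactly; the paper instead infers the parameters by maximum likelihood from rate-modulated spike trains, which this sketch does not attempt.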
From Statistics to Meaning: Infants’ Acquisition of Lexical Categories
Lany, Jill; Saffran, Jenny R.
2013-01-01
Infants are highly sensitive to statistical patterns in their auditory language input that mark word categories (e.g., noun and verb). However, it is unknown whether experience with these cues facilitates the acquisition of semantic properties of word categories. In a study testing this hypothesis, infants first listened to an artificial language in which word categories were reliably distinguished by statistical cues (experimental group) or in which these properties did not cue category membership (control group). Both groups were then trained on identical pairings between the words and pictures from two categories (animals and vehicles). Only infants in the experimental group learned the trained associations between specific words and pictures. Moreover, these infants generalized the pattern to include novel pairings. These results suggest that experience with statistical cues marking lexical categories sets the stage for learning the meanings of individual words and for generalizing meanings to new category members. PMID:20424058
Fish: A New Computer Program for Friendly Introductory Statistics Help
ERIC Educational Resources Information Center
Brooks, Gordon P.; Raffle, Holly
2005-01-01
All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
Bartoo, G T; Nochlin, D; Chang, D; Kim, Y; Sumi, S M
1997-05-01
Using image analysis techniques to quantify the percentage area covered by the immunopositive marker for amyloid beta-peptide (A beta), we examined subjects with combinations of either early-onset or late-onset Alzheimer disease (AD) and either familial (FAD) or sporadic (SAD) Alzheimer disease. We measured the mean and maximum A beta loads in the hippocampus of each subject. There were no statistically significant differences in the mean A beta load between familial and sporadic AD subjects. Although sample sizes were too small for statistical testing, subjects with the epsilon 4/epsilon 4 allele of the apolipoprotein E (ApoE) gene had higher mean A beta loads than those with the epsilon 3/epsilon 3 or epsilon 3/epsilon 4 alleles. Members of the Volga German families (recently linked to chromosome 1) all had high mean A beta loads; one of the chromosome 14-linked subjects had the highest mean A beta load, while the other had a relatively small load, but the sample was too small for statistical comparison. The duration of dementia and neuropsychological test scores showed a statistically significant correlation with the mean A beta load in the hippocampus, but not with the maximum A beta load. This difference indicates that the mean A beta load may be a more useful measure than the maximum A beta load as an objective neuropathological indicator of cognitive status. This finding may help to improve established methods for quantitative assessment of the neuropathological changes in AD.
Statistics about Hearing, Balance, Ear Infections and Deafness
Regional regression equations for estimation of natural streamflow statistics in Colorado
Capesius, Joseph P.; Stephens, Verlin C.
2009-01-01
The U.S. Geological Survey (USGS), in cooperation with the Colorado Water Conservation Board and the Colorado Department of Transportation, developed regional regression equations for estimation of various streamflow statistics that are representative of natural streamflow conditions at ungaged sites in Colorado. The equations define the statistical relations between streamflow statistics (response variables) and basin and climatic characteristics (predictor variables). The equations were developed using generalized least-squares and weighted least-squares multilinear regression reliant on logarithmic variable transformation. Streamflow statistics were derived from at least 10 years of streamflow data through about 2007 from selected USGS streamflow-gaging stations in the study area that are representative of natural-flow conditions. Basin and climatic characteristics used for equation development are drainage area, mean watershed elevation, mean watershed slope, percentage of drainage area above 7,500 feet of elevation, mean annual precipitation, and 6-hour, 100-year precipitation. For each of five hydrologic regions in Colorado, peak-streamflow equations that are based on peak-streamflow data from selected stations are presented for the 2-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year instantaneous-peak streamflows. For four of the five hydrologic regions, equations based on daily-mean streamflow data from selected stations are presented for 7-day minimum 2-, 10-, and 50-year streamflows and for 7-day maximum 2-, 10-, and 50-year streamflows. Other equations presented for the same four hydrologic regions include those for estimation of annual- and monthly-mean streamflow and streamflow-duration statistics for exceedances of 10, 25, 50, 75, and 90 percent. 
All equations are reported along with salient diagnostic statistics, the ranges of the basin and climatic characteristics on which each equation is based, and commentary on potential bias, identified from interpretation of residual plots, that is not otherwise removed by the log transformation of the equation variables. The predictor-variable ranges can be used to assess equation applicability for ungaged sites in Colorado.
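The logarithmic-transformation approach behind these regression equations can be illustrated with a single-predictor sketch. The data below are synthetic, and the real USGS equations use several basin and climatic predictors fitted by generalized and weighted least squares; the point here is only how a log-log fit turns a power law into a line:

```python
import math

# Synthetic example: drainage area (square miles) and 100-year peak flow
# (cubic feet per second) for five hypothetical gaged basins.
area = [10.0, 25.0, 60.0, 150.0, 400.0]
q100 = [210.0, 430.0, 820.0, 1700.0, 3600.0]

# Log transformation turns the power-law relation Q = c * A**b1 into the
# straight line log10(Q) = b0 + b1 * log10(A), fitted by least squares.
x = [math.log10(a) for a in area]
y = [math.log10(q) for q in q100]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
b0 = my - b1 * mx

def predict_q100(drainage_area):
    """Estimate the 100-year peak flow at an ungaged site (sketch only;
    apply only within the range of the fitted predictor values)."""
    return 10 ** (b0 + b1 * math.log10(drainage_area))
```

As the report notes for the real equations, such a fit is only applicable for ungaged sites whose characteristics fall within the ranges of the calibration data.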
micromap: A Package for Linked Micromaps
The R package micromap is used to create linked micromaps, which display statistical summaries associated with areal units, or polygons. Linked micromaps provide a means to simultaneously summarize and display both statistical and geographic distributions by linking statistical ...
Santanelli di Pompeo, Fabio; Sorotos, Michail; Laporta, Rosaria; Pagnoni, Marco; Longo, Benedetto
2018-02-01
Excellent cosmetic results from skin-sparing mastectomy (SSM) are often impaired by skin flap necrosis (SFN), which occurs in 8%-25% of cases and more often in smokers. This study prospectively investigated the efficacy of the Double-Mirrored Omega Pattern SSM (DMOP-SSM), compared with the Wise Pattern SSM (WP-SSM), for immediate reconstruction in moderate- to large-breasted smokers. From 2008-2010, DMOP-SSM was performed in 51 consecutive immediate breast reconstructions in 41 smokers (mean age = 49.8 years) with moderate/large, ptotic breasts. This active group (AG) was compared with a similar historical control group (CG) of 37 smokers (mean age = 51.1 years) who underwent WP-SSM and immediate breast reconstruction, with a mean follow-up of 37.6 months. Skin ischaemic complications, number of surgical revisions, time to wound healing, and patient satisfaction were analysed. Descriptive statistics were reported, and performance endpoints were compared using Fisher's exact test and the Mann-Whitney U-test; a p-value <.05 was considered significant. Mean age (p = .316) and BMI (p = .215) did not differ statistically between groups. Ischaemic complications occurred in 11.7% of DMOP-SSMs and 32.4% of WP-SSMs (p = .017), and revision rates were 5.8% and 24.3%, respectively (p = .012); both differences were statistically significant. Mean time to wound healing was 16.8 days versus 18.4 days (p = .205). Mean patient satisfaction scores were 18.9 and 21.1, respectively, a statistically significant difference (p = .022). Although tobacco use in moderate- to large-breasted patients can severely impair the outcome of breast reconstruction, DMOP-SSM allows smokers to benefit from SSM with statistically significantly fewer skin flap ischaemic complications and revision operations than WP-SSM, and with better cosmetic outcomes.
Computational methods to extract meaning from text and advance theories of human cognition.
McNamara, Danielle S
2011-01-01
Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. Copyright © 2010 Cognitive Science Society, Inc.
ERIC Educational Resources Information Center
Luh, Wei-Ming; Guo, Jiin-Huarng
2005-01-01
To deal with nonnormal and heterogeneous data for the one-way fixed effect analysis of variance model, the authors adopted a trimmed means method in conjunction with Hall's invertible transformation into a heteroscedastic test statistic (Alexander-Govern test or Welch test). The results of simulation experiments showed that the proposed technique…
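The trimmed means method the authors adopt can be illustrated with a minimal symmetric trimmed mean. This is only the robust location estimate; the authors additionally apply Hall's invertible transformation and a heteroscedastic (Alexander-Govern or Welch) test statistic, which are not reproduced here:

```python
def trimmed_mean(values, proportion=0.2):
    """Symmetric trimmed mean: discard the lowest and highest `proportion`
    of the sorted observations, then average the rest. Trimming makes the
    location estimate robust to outliers and heavy tails."""
    data = sorted(values)
    g = int(len(data) * proportion)  # observations trimmed from each tail
    kept = data[g:len(data) - g]
    return sum(kept) / len(kept)
```

For example, `trimmed_mean([1.0, 2.0, 3.0, 4.0, 100.0])` drops the 1.0 and the outlying 100.0 and returns 3.0, whereas the ordinary mean is 22.0.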
Statistical comparisons of AGDISP prediction with Mission III data
Baozhong Duan; Karl Mierzejewski; William G. Yendol
1991-01-01
Statistical comparisons of AGDISP predictions were made against data obtained during aerial spray field trials ("Mission III") conducted in March 1987 at the APHIS Facility, Moore Air Base, Edinburg, Texas, by the NEFAAT group (Northeast Forest Aerial Application Technology). For seven out of twenty-one runs, observed and predicted means (O and P), mean bias...
ERIC Educational Resources Information Center
Boysen, Guy A.
2015-01-01
Student evaluations of teaching are among the most accepted and important indicators of college teachers' performance. However, faculty and administrators can overinterpret small variations in mean teaching evaluations. The current research examined the effect of including statistical information on the interpretation of teaching evaluations.…
Forrester, Janet E
2015-12-01
Errors in the statistical presentation and analyses of data in the medical literature remain common despite efforts to improve the review process, including the creation of guidelines for authors and the use of statistical reviewers. This article discusses common elementary statistical errors seen in manuscripts recently submitted to Clinical Therapeutics and describes some ways in which authors and reviewers can identify errors and thus correct them before publication. A nonsystematic sample of manuscripts submitted to Clinical Therapeutics over the past year was examined for elementary statistical errors. Clinical Therapeutics has many of the same errors that reportedly exist in other journals. Authors require additional guidance to avoid elementary statistical errors and incentives to use the guidance. Implementation of reporting guidelines for authors and reviewers by journals such as Clinical Therapeutics may be a good approach to reduce the rate of statistical errors. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Calkins, D. S.
1998-01-01
When the dependent (or response) variable in an experiment has direction and magnitude, one approach that has been used for statistical analysis involves splitting magnitude and direction and applying univariate statistical techniques to the components. However, such treatment of quantities with direction and magnitude is not justifiable mathematically and can lead to incorrect conclusions about relationships among variables and, as a result, to flawed interpretations. This note discusses the problem with that practice and recommends mathematically correct procedures for dependent variables that have direction and magnitude for 1) computation of mean values, 2) statistical contrasts of and confidence intervals for means, and 3) correlation methods.
40 CFR Appendix IV to Part 265 - Tests for Significance
Code of Federal Regulations, 2010 CFR
2010-07-01
... introductory statistics texts. ... Student's t-test involves calculation of the value of a t-statistic for each comparison of the mean... parameter with its initial background concentration or value. The calculated value of the t-statistic must...
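The t-statistic the regulation refers to can be sketched as a one-sample comparison of a monitoring parameter's sample mean with its initial background value. This is an illustration of the standard formula t = (x̄ − μ₀) / (s/√n) only, not the regulatory procedure, which prescribes its own comparisons and significance levels:

```python
import math

def t_statistic(sample, background):
    """One-sample Student's t comparing a parameter's sample mean with
    its initial background value: t = (mean - background) / (s / sqrt(n)).
    Illustration only; consult the regulation for the exact procedure."""
    n = len(sample)
    mean = sum(sample) / n
    s2 = sum((v - mean) ** 2 for v in sample) / (n - 1)  # sample variance
    return (mean - background) / math.sqrt(s2 / n)
```

The resulting value would be compared against a critical value from a t-table at the significance level the regulation specifies.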
On Teaching about the Coefficient of Variation in Introductory Statistics Courses
ERIC Educational Resources Information Center
Trafimow, David
2014-01-01
The standard deviation is related to the mean by virtue of the coefficient of variation. Teachers of statistics courses can make use of that fact to make the standard deviation more comprehensible for statistics students.
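The relation the abstract describes is simply CV = s / x̄; a one-line helper makes it concrete for students:

```python
import statistics

def coefficient_of_variation(values):
    """CV = s / mean: the standard deviation expressed as a fraction of
    the mean, so spread is comparable across variables on different scales."""
    return statistics.stdev(values) / statistics.mean(values)
```

For example, the sample [8, 10, 12] has mean 10 and standard deviation 2, so its CV is 0.2: the typical deviation is 20% of the mean.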
22 CFR 92.80 - Obtaining American vital statistics records.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Obtaining American vital statistics records. 92... statistics records. Individuals who inquire as to means of obtaining copies of or extracts from American... Vital Statistics Office at the place where the record is kept, which is usually in the capital city of...
ERIC Educational Resources Information Center
Hilton, Sterling C.; Schau, Candace; Olsen, Joseph A.
2004-01-01
In addition to student learning, positive student attitudes have become an important course outcome for many introductory statistics instructors. To adequately assess changes in mean attitudes across introductory statistics courses, the attitude instruments used should be invariant by administration time. Attitudes toward statistics from 4,910…
22 CFR 92.80 - Obtaining American vital statistics records.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Obtaining American vital statistics records. 92... statistics records. Individuals who inquire as to means of obtaining copies of or extracts from American... Vital Statistics Office at the place where the record is kept, which is usually in the capital city of...
NASA Astrophysics Data System (ADS)
Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu
2018-04-01
The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.
PULPAL BLOOD FLOW CHANGES IN ABUTMENT TEETH OF REMOVABLE PARTIAL DENTURES
Kunt, Göknil Ergün; Kökçü, Deniz; Ceylan, Gözlem; Yılmaz, Nergiz; Güler, Ahmet Umut
2009-01-01
The purpose of this study was to investigate the effect of tooth-supported (TSD) and tooth-tissue-supported (TTSD) removable partial denture wearing on pulpal blood flow (PBF) of the abutment teeth using laser Doppler flowmetry (LDF). Measurements were carried out on 60 teeth of 28 patients (28 teeth in 12 patients in the TTSD group; 32 teeth in 16 patients in the TSD group) who had not worn any type of removable partial denture before, had no systemic problems, and were nonsmokers. PBF values were recorded by LDF before insertion (day 0) and after insertion of the dentures at days 1, 7, and 30. Statistical analysis was performed with Student's t-test and covariance analyses of repeated measurements. In the TTSD group, mean PBF values at day 1 after insertion were statistically significantly lower than before insertion (p<0.01); there was no statistically significant difference among mean PBF values on days 1, 7, and 30. In the TSD group, by contrast, there was no statistically significant difference among mean PBF values before insertion and on days 1, 7, and 30; that is, mean PBF in the TSD group remained statistically unchanged. TTSD wearing may adversely affect the abutment teeth by decreasing basal PBF. PMID:20001995
Watanabe, Hiroshi
2012-01-01
Procedures of statistical analysis are reviewed to provide an overview of applications of statistics for general use. Topics that are dealt with are inference on a population, comparison of two populations with respect to means and probabilities, and multiple comparisons. This study is the second part of series in which we survey medical statistics. Arguments related to statistical associations and regressions will be made in subsequent papers.
Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows
NASA Astrophysics Data System (ADS)
Qi, Di; Majda, Andrew J.
2018-04-01
Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.
Sloto, Ronald A.; Reif, Andrew G.
2017-06-02
An evaluation of trends in hydrologic and water quality conditions and estimation of water budgets through 2013 was done by the U.S. Geological Survey in cooperation with the Chester County Water Resources Authority. Long-term hydrologic, meteorologic, and biologic data collected in Chester County, Pennsylvania, which included streamflow, groundwater levels, surface-water quality, biotic integrity, precipitation, and air temperature were analyzed to determine possible trends or changes in hydrologic conditions. Statistically significant trends were determined by applying the Kendall rank correlation test; the magnitudes of the trends were determined using the Sen slope estimator. Water budgets for eight selected watersheds were updated and a new water budget was developed for the Marsh Creek watershed. An average water budget for Chester County was developed using the eight selected watersheds and the new Marsh Creek water budget.Annual and monthly mean streamflow, base flow, and runoff were analyzed for trends at 10 streamgages. The periods of record at the 10 streamgages ranged from 1961‒2013 to 1988‒2013. The only statistically significant trend for annual mean streamflow was for West Branch Brandywine Creek near Honey Brook, Pa. (01480300) where annual mean streamflow increased 1.6 cubic feet per second (ft3/s) per decade. The greatest increase in monthly mean streamflow was for Brandywine Creek at Chadds Ford, Pa. (01481000) for December; the increase was 47 ft3/s per decade. No statistically significant trends in annual mean base flow or runoff were determined for the 10 streamgages. The greatest increase in monthly mean base flow was for Brandywine Creek at Chadds Ford, Pa. (01481000) for December; the increase was 26 ft3/s per decade.The magnitude of peaks greater than a base streamflow was analyzed for trends for 12 streamgages. The period of record at the 12 stream gages ranged from 1912‒2012 to 2004–11. 
Fifty percent of the streamgages showed a small statistically significant increase in peaks greater than the base streamflow. The greatest increase was for Brandywine Creek at Chadds Ford, Pa. (01481000) during 1962‒2012; the increase was 1.8 ft3/s per decade. There were no statistically significant trends in the number of floods equal to or greater than the 2-year recurrence interval flood flow.Twenty‒one monitoring wells were evaluated for statistically significant trends in annual mean water level, minimum annual water level, maximum annual water level, and annual range in water-level fluctuations. For four wells, a small statistically significant increase in annual mean water level was determined that ranged from 0.16 to 0.7 feet per decade. There was poor or no correlation between annual mean groundwater levels and annual mean streamflow and base flow. No correlation was determined between annual mean groundwater level and annual precipitation. Despite rapid population growth and land-use change since 1950, there appears to have been little or no detrimental effects on groundwater levels in 21 monitoring wells.Long-term precipitation and temperature data were available from the West Chester (1893‒2013) and Phoenixville, Pa. (1915‒2013) National Oceanic and Atmospheric Administration (NOAA) weather stations. No statistically significant trends in annual mean precipitation or annual mean temperature were determined for either station. Both weather stations had a significant decrease in the number of days per year with precipitation greater than or equal to 0.1 inch. Annual mean minimum and maximum temperatures from the NOAA Southeastern Piedmont Climate Division increased 0.2 degrees Fahrenheit (F) per decade between 1896 and 2014. The number of days with a maximum temperature equal to or greater than 90 degrees F increased at West Chester and decreased at Phoenixville. 
No statistically significant trend was determined for annual snowfall amounts.Data from 1974 to 2013 for three stream water-quality monitors in the Brandywine Creek watershed were evaluated. The monitors are on the West Branch Brandywine Creek at Modena, Pa. (01480617), East Branch Brandywine Creek below Downingtown, Pa. (01480870), and Brandywine Creek at Chadds Ford, Pa. (01481000). Statistically significant upward trends were determined for annual mean specific conductance at all three stations, indicating the total dissolved solids load has been increasing. If the current trend continues, the annual mean specific conductance could almost double from 1974 to 2050. The increase in specific conductance likely is due to increases in chloride concentrations, which have been increasing steadily over time at all three stations. No correlation was found between monthly mean specific conductance and monthly mean streamflow or base flow. Statistically significant upward trends in pH were determined for all three stations. Statistically significant upward trends in stream temperature were determined for East Branch Brandywine Creek below Downingtown, Pa. (01480870) and Brandywine Creek at Chadds Ford, Pa. (01481000). The stream water-quality data indicate substantial increases in the minimum daily dissolved oxygen concentrations in the Brandywine Creek over time.The Chester County Index of Biotic Integrity (CC-IBI) determined for 1998‒2013 was evaluated for the five biological sampling sites collocated with streamgages. CC-IBI scores are based on a 0‒100 scale with higher scores indicating better stream quality. Statistically significant upward trends in the CC-IBI were determined for West Branch Brandywine Creek at Modena, Pa. (01480617) and East Branch Brandywine Creek below Downingtown, Pa. (01480870). 
No correlation was found between the CC-IBI and streamflow, precipitation, or stream specific conductance, pH, temperature, or dissolved oxygen concentration. A Chester County average water budget was developed using the nine estimated watershed water budgets. Average precipitation was 48.4 inches, and average streamflow was 21.4 inches. Average runoff and base flow were 8.3 and 13.1 inches, respectively, and average evapotranspiration, including estimation error, was 27.2 inches.
Pre-Service Teachers' Understanding of Measures of Centre: When the Meaning Gets Lost?
ERIC Educational Resources Information Center
Reaburn, Robyn
2013-01-01
Measures of centre (the mean, median and mode) are fundamental to the discipline of statistics. Yet previous research shows that students may not have a thorough conceptual understanding of these measures, even though these statistics are easy to calculate. This study describes the findings of a study of pre-service teachers' ideas of measure of…
Including the Tukey Mean-Difference (Bland-Altman) Plot in a Statistics Course
ERIC Educational Resources Information Center
Kozak, Marcin; Wnuk, Agnieszka
2014-01-01
The Tukey mean-difference plot, also called the Bland-Altman plot, is a recognized graphical tool in the exploration of biometrical data. We show that this technique deserves a place on an introductory statistics course by encouraging students to think about the kind of graph they wish to create, rather than just creating the default graph for the…
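Before any graph is drawn, students can compute the quantities behind the Tukey mean-difference (Bland-Altman) plot themselves. A minimal sketch, with plotting deliberately omitted so the focus stays on what the plot encodes:

```python
import statistics

def bland_altman_summary(method_a, method_b):
    """Numbers behind a Tukey mean-difference (Bland-Altman) plot:
    per-pair means (x axis), differences (y axis), the mean bias, and
    approximate 95% limits of agreement (bias +/- 1.96 * sd of the
    differences). Pass paired measurements from two methods."""
    means = [(a + b) / 2 for a, b in zip(method_a, method_b)]
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return means, diffs, bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Students can then scatter `diffs` against `means` with any graphics package and draw horizontal lines at the bias and the two limits of agreement, rather than accepting a default chart.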
ERIC Educational Resources Information Center
Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy
2006-01-01
We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
Savalei, Victoria
2018-01-01
A new type of nonnormality correction to the RMSEA has recently been developed, which has several advantages over existing corrections. In particular, the new correction adjusts the sample estimate of the RMSEA for the inflation due to nonnormality, while leaving its population value unchanged, so that established cutoff criteria can still be used to judge the degree of approximate fit. A confidence interval (CI) for the new robust RMSEA based on the mean-corrected ("Satorra-Bentler") test statistic has also been proposed. Follow-up work has provided the same type of nonnormality correction for the CFI (Brosseau-Liard & Savalei, 2014). These developments have recently been implemented in lavaan. This note has three goals: a) to show how to compute the new robust RMSEA and CFI from the mean-and-variance corrected test statistic; b) to offer a new CI for the robust RMSEA based on the mean-and-variance corrected test statistic; and c) to caution that the logic of the new nonnormality corrections to RMSEA and CFI is most appropriate for the maximum likelihood (ML) estimator, and cannot easily be generalized to the most commonly used categorical data estimators.
NASA Astrophysics Data System (ADS)
Sikora, Grzegorz; Teuerle, Marek; Wyłomańska, Agnieszka; Grebenkov, Denis
2017-08-01
The most common way of estimating the anomalous scaling exponent from single-particle trajectories consists of a linear fit of the dependence of the time-averaged mean-square displacement on the lag time at the log-log scale. We investigate the statistical properties of this estimator in the case of fractional Brownian motion (FBM). We determine the mean value, the variance, and the distribution of the estimator. Our theoretical results are confirmed by Monte Carlo simulations. In the limit of long trajectories, the estimator is shown to be asymptotically unbiased, consistent, and with vanishing variance. These properties ensure an accurate estimation of the scaling exponent even from a single (long enough) trajectory. As a consequence, we prove that the usual way to estimate the diffusion exponent of FBM is correct from the statistical point of view. Moreover, the knowledge of the estimator distribution is the first step toward new statistical tests of FBM and toward a more reliable interpretation of the experimental histograms of scaling exponents in microbiology.
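The estimator studied above, a linear fit of log TAMSD against log lag, is easy to implement. The sketch below verifies it on a deterministic ballistic trajectory, where the true scaling exponent is exactly 2; an FBM analysis would substitute simulated or measured random trajectories:

```python
import math

def tamsd(traj, lag):
    """Time-averaged mean-square displacement of a 1-D trajectory at `lag`."""
    n = len(traj)
    return sum((traj[i + lag] - traj[i]) ** 2 for i in range(n - lag)) / (n - lag)

def scaling_exponent(traj, lags):
    """Least-squares slope of log(TAMSD) versus log(lag): the standard
    log-log fit used to estimate the anomalous scaling exponent."""
    x = [math.log(l) for l in lags]
    y = [math.log(tamsd(traj, l)) for l in lags]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
```

The paper's contribution is the distribution of this estimator for FBM; the sketch only shows the estimator itself.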
Stable statistical representations facilitate visual search.
Corbett, Jennifer E; Melcher, David
2014-10-01
Observers represent the average properties of object ensembles even when they cannot identify individual elements. To investigate the functional role of ensemble statistics, we examined how modulating statistical stability affects visual search. We varied the mean and/or individual sizes of an array of Gabor patches while observers searched for a tilted target. In "stable" blocks, the mean and/or local sizes of the Gabors were constant over successive displays, whereas in "unstable" baseline blocks they changed from trial to trial. Although there was no relationship between the context and the spatial location of the target, observers found targets faster (as indexed by faster correct responses and fewer saccades) as the global mean size became stable over several displays. Building statistical stability also facilitated scanning the scene, as measured by larger saccadic amplitudes, faster saccadic reaction times, and shorter fixation durations. These findings suggest a central role for peripheral visual information, creating context to free resources for detailed processing of salient targets and maintaining the illusion of visual stability.
*K-means and cluster models for cancer signatures.
Kakushadze, Zura; Yu, Willie
2017-09-01
We present the *K-means clustering algorithm and source code, obtained by expanding the statistical clustering methods applied to quantitative finance in https://ssrn.com/abstract=2802753. *K-means is statistically deterministic without the need to specify initial centers, etc. We apply *K-means to extracting cancer signatures from genome data without using nonnegative matrix factorization (NMF); *K-means' computational cost is a fraction of NMF's. Using 1389 published samples for 14 cancer types, we find that 3 cancers (liver cancer, lung cancer, and renal cell carcinoma) stand out and do not have cluster-like structures. Two clusters have especially high within-cluster correlations among the 11 other cancers, indicating common underlying structures. Our approach opens a novel avenue for studying such structures. *K-means is universal and can be applied in other fields; we discuss some potential applications in quantitative finance.
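*K-means is described as removing the initial-center dependence of classical k-means; the classical Lloyd iteration it builds on can be sketched as follows. This is a minimal illustration on synthetic 2-D data, not the authors' code, and the empty-cluster guard is our own choice:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's k-means; *K-means adds a deterministic layer on top."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        # recompute centers; an empty cluster keeps its previous center
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# two well-separated synthetic clusters
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (100, 2)), rng.normal(10, 0.5, (100, 2))])
labels, centers = kmeans(X, k=2)
```

On well-separated data the recovered centers land near the true cluster means regardless of the random initialization, which is the sensitivity *K-means aims to eliminate in general.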
Barker, C.E.; Pawlewicz, M.J.
1993-01-01
In coal samples, published recommendations based on statistical methods suggest that 100 measurements are needed to estimate the mean random vitrinite reflectance (Rv-r) to within ±2%. Our survey of published thermal maturation studies indicates that those using dispersed organic matter (DOM) mostly set an objective of acquiring 50 reflectance measurements. This smaller target sample size for DOM versus coal poses a statistical contradiction, because the standard deviations of DOM reflectance distributions are typically larger, indicating that a greater sample size is needed to accurately estimate Rv-r in DOM. However, in studies of thermal maturation using DOM, even 50 measurements can be an unrealistic requirement given the small amount of vitrinite often found in such samples. Furthermore, there is generally less need for the precision required in coal applications. Therefore, a key question in thermal maturation studies using DOM is how many measurements of Rv-r are needed to adequately estimate the mean. Our empirical approach to this problem is to compute the reflectance distribution statistics (mean, standard deviation, skewness, and kurtosis) in increments of 10 measurements. This study compares these intermediate computations of Rv-r statistics with a final one computed using all measurements for that sample. Vitrinite reflectance was measured on mudstone and sandstone samples taken from borehole M-25 in the Cerro Prieto, Mexico, geothermal system, which was selected because the rocks have a wide range of thermal maturation and comparable humic DOM with depth. The results of this study suggest that after only 20-30 measurements the mean Rv-r is generally known to within 5%, and always to within 12%, of the mean Rv-r calculated using all of the measured particles.
Thus, even in the worst case, the precision after measuring only 20-30 particles is in good agreement with the general precision of one decimal place recommended for mean Rv-r measurements on DOM. The coefficient of variation (V = standard deviation/mean) is proposed as a statistic to indicate the reliability of mean Rv-r estimates made at n ≥ 20. This preliminary study suggests that a V ≤ 0.2 indicates a reliable mean, whereas a V > 0.2 suggests an unreliable mean in such small samples. © 1993.
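The incremental procedure (statistics recomputed every 10 measurements, with V = standard deviation/mean as a reliability flag) can be sketched on synthetic reflectance readings; the values below are illustrative, not the borehole data:

```python
import numpy as np

def incremental_stats(measurements, step=10):
    """Mean, sample std, and coefficient of variation V after each block of `step` readings."""
    out = []
    for n in range(step, len(measurements) + 1, step):
        chunk = np.asarray(measurements[:n])
        mean, sd = chunk.mean(), chunk.std(ddof=1)
        out.append((n, mean, sd, sd / mean))  # V = sd / mean
    return out

# synthetic reflectance readings scattered around Rv-r = 1.0 %
rng = np.random.default_rng(0)
readings = rng.normal(1.0, 0.1, size=50)

stats = incremental_stats(readings)
final_mean = readings.mean()
# distance of the n = 20 estimate from the full-sample mean, in percent
err20 = abs(stats[1][1] - final_mean) / final_mean * 100
```

With a modest spread, the n = 20 running mean is already close to the all-measurements mean, mirroring the paper's 20-30 measurement finding, and V stays well below the 0.2 reliability threshold.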
ERIC Educational Resources Information Center
Mirick, Rebecca G.; Davis, Ashley
2017-01-01
Although statistics and research are key components of social work education, students are often described as reluctant consumers and users of statistics. Self-efficacy theory has been used to understand students' engagement with the statistical knowledge needed for practice. This quantitative study explores the relationship between self-efficacy,…
ERIC Educational Resources Information Center
Cook, Samuel A.; Fukawa-Connelly, Timothy
2016-01-01
Studies have shown that at the end of an introductory statistics course, students struggle with building block concepts, such as mean and standard deviation, and rely on procedural understandings of the concepts. This study aims to investigate the understandings entering freshman of a department of mathematics and statistics (including mathematics…
Eash, David A.; Barnes, Kimberlee K.
2017-01-01
A statewide study was conducted to develop regression equations for estimating six selected low-flow frequency statistics and harmonic mean flows for ungaged stream sites in Iowa. The estimation equations developed for the six low-flow frequency statistics include: the annual 1-, 7-, and 30-day mean low flows for a recurrence interval of 10 years, the annual 30-day mean low flow for a recurrence interval of 5 years, and the seasonal (October 1 through December 31) 1- and 7-day mean low flows for a recurrence interval of 10 years. Estimation equations also were developed for the harmonic-mean-flow statistic. Estimates of these seven selected statistics are provided for 208 U.S. Geological Survey continuous-record streamgages using data through September 30, 2006. The study area comprises streamgages located within Iowa and 50 miles beyond the State's borders. Because trend analyses indicated statistically significant positive trends when considering the entire period of record for the majority of the streamgages, the longest, most recent period of record without a significant trend was determined for each streamgage for use in the study. The median number of years of record used to compute each of these seven selected statistics was 35. Geographic information system software was used to measure 54 selected basin characteristics for each streamgage. Following the removal of two streamgages from the initial data set, data collected for 206 streamgages were compiled to investigate three approaches for regionalization of the seven selected statistics. Regionalization, a process using statistical regression analysis, provides a relation for efficiently transferring information from a group of streamgages in a region to ungaged sites in the region. The three regionalization approaches tested included statewide, regional, and region-of-influence regressions. 
For the regional regression, the study area was divided into three low-flow regions on the basis of hydrologic characteristics, landform regions, and soil regions. A comparison of root mean square errors and average standard errors of prediction for the statewide, regional, and region-of-influence regressions determined that the regional regression provided the best estimates of the seven selected statistics at ungaged sites in Iowa. Because a significant number of streams in Iowa reach zero flow as their minimum flow during low-flow years, four different types of regression analyses were used: left-censored, logistic, generalized-least-squares, and weighted-least-squares regression. A total of 192 streamgages were included in the development of 27 regression equations for the three low-flow regions. For the northeast and northwest regions, a censoring threshold was used to develop 12 left-censored regression equations to estimate the 6 low-flow frequency statistics for each region. For the southern region a total of 12 regression equations were developed; 6 logistic regression equations were developed to estimate the probability of zero flow for the 6 low-flow frequency statistics and 6 generalized least-squares regression equations were developed to estimate the 6 low-flow frequency statistics, if nonzero flow is estimated first by use of the logistic equations. A weighted-least-squares regression equation was developed for each region to estimate the harmonic-mean-flow statistic. Average standard errors of estimate for the left-censored equations for the northeast region range from 64.7 to 88.1 percent and for the northwest region range from 85.8 to 111.8 percent. Misclassification percentages for the logistic equations for the southern region range from 5.6 to 14.0 percent. 
Average standard errors of prediction for generalized least-squares equations for the southern region range from 71.7 to 98.9 percent and pseudo coefficients of determination for the generalized-least-squares equations range from 87.7 to 91.8 percent. Average standard errors of prediction for weighted-least-squares equations developed for estimating the harmonic-mean-flow statistic for each of the three regions range from 66.4 to 80.4 percent. The regression equations are applicable only to stream sites in Iowa with low flows not significantly affected by regulation, diversion, or urbanization and with basin characteristics within the range of those used to develop the equations. If the equations are used at ungaged sites on regulated streams, or on streams affected by water-supply and agricultural withdrawals, then the estimates will need to be adjusted by the amount of regulation or withdrawal to estimate the actual flow conditions if that is of interest. Caution is advised when applying the equations for basins with characteristics near the applicable limits of the equations and for basins located in karst topography. A test of two drainage-area ratio methods using 31 pairs of streamgages, for the annual 7-day mean low-flow statistic for a recurrence interval of 10 years, indicates a weighted drainage-area ratio method provides better estimates than regional regression equations for an ungaged site on a gaged stream in Iowa when the drainage-area ratio is between 0.5 and 1.4. These regression equations will be implemented within the U.S. Geological Survey StreamStats web-based geographic-information-system tool. StreamStats allows users to click on any ungaged site on a river and compute estimates of the seven selected statistics; in addition, 90-percent prediction intervals and the measured basin characteristics for the ungaged sites also are provided. 
StreamStats also allows users to click on any streamgage in Iowa and estimates computed for these seven selected statistics are provided for the streamgage.
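The drainage-area ratio method mentioned above transfers a flow statistic from a gaged site to an ungaged site on the same stream. A minimal unweighted sketch follows; the exponent b = 1 and the numbers are illustrative assumptions, not values from the report, which compares weighted variants:

```python
def drainage_area_ratio(q_gaged, area_gaged, area_ungaged, b=1.0):
    """Transfer a flow statistic by scaling with the drainage-area ratio:
    Q_ungaged = Q_gaged * (A_ungaged / A_gaged) ** b.
    Typically applied only when the ratio lies in a band such as 0.5-1.4."""
    return q_gaged * (area_ungaged / area_gaged) ** b

# hypothetical example: a gage measures a 7-day, 10-year low flow of
# 12 ft^3/s on a 250 mi^2 basin; the ungaged site drains 200 mi^2
q_est = drainage_area_ratio(12.0, 250.0, 200.0)
```

Here the area ratio is 0.8, inside the 0.5-1.4 band the study found favorable for this method, so the estimate is simply 0.8 times the gaged statistic.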
ERIC Educational Resources Information Center
Tay, Louis; Drasgow, Fritz
2012-01-01
Two Monte Carlo simulation studies investigated the effectiveness of the mean adjusted X[superscript 2]/df statistic proposed by Drasgow and colleagues and, because of problems with the method, a new approach for assessing the goodness of fit of an item response theory model was developed. It has been previously recommended that mean adjusted…
ERIC Educational Resources Information Center
de Vries, John
This paper addresses the issue of measuring the integration of various ethnocultural communities into Canadian society by means of statistical or social indicators. The overall philosophy of the study is based on the following principles: (1) indicators should have a clear meaning with respect to the underlying concept of integration; (2)…
Patients and medical statistics. Interest, confidence, and ability.
Woloshin, Steven; Schwartz, Lisa M; Welch, H Gilbert
2005-11-01
People are increasingly presented with medical statistics. There are no existing measures to assess their level of interest or confidence in using medical statistics. To develop 2 new measures, the STAT-interest and STAT-confidence scales, and assess their reliability and validity. Survey with retest after approximately 2 weeks. Two hundred and twenty-four people were recruited from advertisements in local newspapers, an outpatient clinic waiting area, and a hospital open house. We developed and revised 5 items on interest in medical statistics and 3 on confidence understanding statistics. Study participants were mostly college graduates (52%); 25% had a high school education or less. The mean age was 53 (range 20 to 84) years. Most paid attention to medical statistics (6% paid no attention). The mean (SD) STAT-interest score was 68 (17) and ranged from 15 to 100. Confidence in using statistics was also high: the mean (SD) STAT-confidence score was 65 (19) and ranged from 11 to 100. STAT-interest and STAT-confidence scores were moderately correlated (r=.36, P<.001). Both scales demonstrated good test-retest repeatability (r=.60, .62, respectively), internal consistency reliability (Cronbach's alpha=0.70 and 0.78), and usability (individual item nonresponse ranged from 0% to 1.3%). Scale scores correlated only weakly with scores on a medical data interpretation test (r=.15 and .26, respectively). The STAT-interest and STAT-confidence scales are usable and reliable. Interest and confidence were only weakly related to the ability to actually use data.
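The internal consistency reported for the scales (Cronbach's alpha) can be computed from an item-score matrix; a generic sketch with made-up scores, not the study data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for items: an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# five hypothetical respondents answering three highly consistent items
scores = np.array([[4, 5, 4],
                   [2, 3, 2],
                   [5, 5, 5],
                   [1, 2, 1],
                   [3, 4, 3]])
alpha = cronbach_alpha(scores)
```

Strongly covarying items push alpha toward 1; the study's values of 0.70 and 0.78 indicate acceptable, though weaker, consistency.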
Patients and Medical Statistics
Woloshin, Steven; Schwartz, Lisa M; Welch, H Gilbert
2005-01-01
BACKGROUND People are increasingly presented with medical statistics. There are no existing measures to assess their level of interest or confidence in using medical statistics. OBJECTIVE To develop 2 new measures, the STAT-interest and STAT-confidence scales, and assess their reliability and validity. DESIGN Survey with retest after approximately 2 weeks. SUBJECTS Two hundred and twenty-four people were recruited from advertisements in local newspapers, an outpatient clinic waiting area, and a hospital open house. MEASURES We developed and revised 5 items on interest in medical statistics and 3 on confidence understanding statistics. RESULTS Study participants were mostly college graduates (52%); 25% had a high school education or less. The mean age was 53 (range 20 to 84) years. Most paid attention to medical statistics (6% paid no attention). The mean (SD) STAT-interest score was 68 (17) and ranged from 15 to 100. Confidence in using statistics was also high: the mean (SD) STAT-confidence score was 65 (19) and ranged from 11 to 100. STAT-interest and STAT-confidence scores were moderately correlated (r=.36, P<.001). Both scales demonstrated good test–retest repeatability (r=.60, .62, respectively), internal consistency reliability (Cronbach's α=0.70 and 0.78), and usability (individual item nonresponse ranged from 0% to 1.3%). Scale scores correlated only weakly with scores on a medical data interpretation test (r=.15 and .26, respectively). CONCLUSION The STAT-interest and STAT-confidence scales are usable and reliable. Interest and confidence were only weakly related to the ability to actually use data. PMID:16307623
NASA Astrophysics Data System (ADS)
Pernot, Pascal; Savin, Andreas
2018-06-01
Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
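The two advocated statistics follow directly from the empirical distribution of unsigned errors; a sketch with synthetic, deliberately skewed and non-zero-centered errors (the threshold of 1.0 and the 95% level are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic model errors: skewed and not centered at zero, as the paper warns
errors = rng.gamma(2.0, 0.5, size=10_000) - 0.3

u = np.abs(errors)           # unsigned errors
p_below = np.mean(u < 1.0)   # (1) probability that |error| falls below the threshold
q95 = np.quantile(u, 0.95)   # (2) error amplitude not exceeded at 95 % confidence
```

Unlike the mean signed or unsigned error, these two numbers answer the end-user's question directly: how often is a prediction good enough, and how bad can it plausibly get.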
NASA Astrophysics Data System (ADS)
Siegel, Edward
2008-03-01
Classic statistics of digits: the Newcomb [Am. J. Math. 4, 39 (1881)]-Weyl [Goett. Nachr. (1912)]-Benford [Proc. Am. Phil. Soc. 78, 4, 51 (1938)] ("NeWBe") on-average/mean log-law probability, P(d) = log[1 + 1/d] = log[(d + 1)/d] [google: "Benford's Law"; "FUZZYICS": Siegel, AMS Nat. Mtg. 2002 & 2008; Raimi, Sci. Am. 221, 109 (1969); Hill, Proc. AMS 123(3), 887 (1996)], holds for any log base and any units, i.e., it is scale-invariant. The algebraic inverse d = 1/[e^w - 1] identifies BOSONS (1924) with DIGITS (<1881): energy levels ground (d = 0), first excited (d = 1), ...; no fractions, only digit-integer differences, i.e., quanta!
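The first-digit law at the center of this abstract is easy to verify numerically; a minimal check of its normalization and of the leading-digit-1 probability:

```python
import math

# Benford/NeWBe first-digit probabilities: P(d) = log10(1 + 1/d) = log10((d + 1)/d)
p = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

p_total = sum(p.values())  # the nine probabilities form a full distribution
p_one = p[1]               # leading digit 1 is the most likely, about 30.1 %
```

The base of the logarithm only rescales all terms identically, which is one way to see the scale invariance the abstract emphasizes.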
Cömert, Itır Tarı; Özyeşil, Zümra Atalay; Burcu Özgülük, S
2016-02-01
The aim of the current study was to investigate the contributions of sad childhood experiences, depression, anxiety, stress, existence of a sense of meaning, and pursuit of meaning in explaining the life satisfaction of young adults in Turkey. The sample comprised 400 undergraduate students (M age = 20.2 yr.) selected via random cluster sampling. There were no statistically significant differences between men and women in their scores on depression, existence of meaning, pursuit of meaning, or life satisfaction. However, there were statistically significant differences between men and women in sad childhood experiences, anxiety, and stress. In hierarchical regression analysis, the model as a whole was significant. Depression and existence of meaning in life made unique significant contributions to the variance in life satisfaction. Students with lower depression and with a sense of meaning in life tended to be more satisfied with life.
Koley, Sananda; Chakrabarti, Srabani; Pathak, Swapan; Manna, Asim Kumar; Basu, Siddhartha
2015-12-01
Our study was done to assess the cytological changes due to oncotherapy in breast carcinoma, with particular attention to morphometry and proliferative activity. Cytological aspirates were collected from a total of 32 cases of invasive ductal carcinoma, both before and after oncotherapy. Morphometry was performed on the stained cytological smears to assess different morphological parameters of cell dimension by using an ocular morphometer and the software AutoCAD 2007. Staining was done with Ki-67 and proliferating cell nuclear antigen (PCNA) as proliferative markers. Morphological parameters before and after oncotherapy were compared by the unpaired Student's t test, with significance assessed from p values. Statistically significant differences were found in morphometric parameters (mean nuclear diameter, mean nuclear area, mean cell diameter, and mean cell area) and in the expression of the proliferative markers Ki-67 and PCNA. Thus, there are statistically significant differences in the morphological parameters of breast carcinoma cells before and after oncotherapy.
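The unpaired Student's t test used for the pre-/post-therapy comparison can be sketched with scipy on hypothetical nuclear-diameter values; the means and spreads below are invented for illustration, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical mean nuclear diameters (micrometres), before and after therapy,
# for 32 cases in each arm
pre = rng.normal(12.0, 1.5, size=32)
post = rng.normal(9.0, 1.5, size=32)

t_stat, p_value = stats.ttest_ind(pre, post)  # unpaired two-sample t test
significant = p_value < 0.05
```

A positive t statistic with a small p value corresponds to the paper's finding of significantly reduced morphometric parameters after oncotherapy.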
Kosor, Begüm Yerci; Artunç, Celal; Şahan, Heval
2015-07-01
A key factor of an implant-retained facial prosthesis is the success of the bonding between the substructure and the silicone elastomer. Little has been reported on the bonding of fiber-reinforced composite (FRC) to silicone elastomers. Experimental FRC could be a solution for facial prostheses supported by light-activated aliphatic urethane acrylate, orthodontic acrylic resin, or commercially available FRCs. The purpose of this study was to evaluate the bonding of the experimental FRC, orthodontic acrylic resin, and light-activated aliphatic urethane acrylate to a commercially available high-temperature vulcanizing silicone elastomer. Shear and 180-degree peel bond strengths of 3 different substructures (experimental FRC, orthodontic acrylic resin, light-activated aliphatic urethane acrylate) (n=15) to a high-temperature vulcanizing maxillofacial silicone elastomer (M511) with a primer (G611) were assessed after 200 hours of accelerated artificial light-aging. The specimens were tested in a universal testing machine at a cross-head speed of 10 mm/min. Data were collected and statistically analyzed by 1-way ANOVA, followed by the Bonferroni correction and the Dunnett post hoc test (α=.05). Modes of failure were visually determined, categorized as adhesive, cohesive, or mixed, and statistically analyzed with the chi-squared goodness-of-fit test (α=.05). When the mean shear bond strength values were evaluated statistically, no difference was found among the experimental FRC, aliphatic urethane acrylate, and orthodontic acrylic resin subgroups (P>.05). The mean peel bond strengths of the experimental FRC and aliphatic urethane acrylate were not statistically different (P>.05). The mean peel bond strength of the orthodontic acrylic resin subgroup was statistically lower (P<.05).
Shear test failure types were found to be statistically different (P<.05), whereas 180-degree peel test failure types were not found to be statistically significant (P>.05). Shear forces predominantly exhibited cohesive failure (64.4%), whereas peel forces predominantly exhibited adhesive failure (93.3%). The mean shear bond strengths of the experimental FRC and aliphatic urethane acrylate groups were not found to be statistically different (P>.05). The mean value of the 180-degree peel strength of the orthodontic acrylic resin group was found to be lower (P<.05). Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Savard, Annie; Manuel, Dominic
2015-01-01
Statistics is a domain that is taught in Mathematics in all school levels. We suggest a potential in using an interdisciplinary approach with this concept. Thus the development of the understanding of a situation might mean to use both mathematical and statistical reasoning. In this paper, we present two case studies where two middle school…
ERIC Educational Resources Information Center
Neumann, David L.; Neumann, Michelle M.; Hood, Michelle
2011-01-01
The discipline of statistics seems well suited to the integration of technology in a lecture as a means to enhance student learning and engagement. Technology can be used to simulate statistical concepts, create interactive learning exercises, and illustrate real world applications of statistics. The present study aimed to better understand the…
Statistics Using Just One Formula
ERIC Educational Resources Information Center
Rosenthal, Jeffrey S.
2018-01-01
This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc) from that one formula. It is argued that this approach will…
Critiquing Statistics in Student and Professional Worlds
ERIC Educational Resources Information Center
Jones, Ryan Seth; Lehrer, Richard; Kim, Min-Joung
2017-01-01
This article compares students' critiques within a class discussion about an invented statistic to STEM professionals' critiques from interviews to better understand how the situated meanings of a statistic are similar and different across student and professional worlds. We discuss similarities and differences in how participants constructed…
NASA Astrophysics Data System (ADS)
Mihardja, H.; Srilestari, A.; Budianto, S. A.
2017-08-01
Laserpuncture is an acupuncture method for pain management. The goal of this study was to determine the effect of laserpuncture at the LI4 Hegu point on the plasma levels of β-endorphin in healthy subjects. A randomized, double-blind, placebo-controlled trial was conducted on 29 healthy subjects, allocated into a laserpuncture group (n = 15) and a placebo laserpuncture group (n = 14). Plasma β-endorphin levels, assessed both before and after treatment, served as the outcome measure. There was a statistically significant difference in the mean plasma level of β-endorphin before and after treatment in the laserpuncture group: the mean value changed from 0.22±0.06 ng/ml to 0.29±0.07 ng/ml, with p = 0.005 (p < 0.05). There was no statistically significant difference in the mean plasma level of β-endorphin before and after treatment in the placebo group: the mean value changed from 0.22±0.06 ng/ml to 0.26±0.09 ng/ml, with p = 0.195 (p > 0.05). Between groups, there was no statistically significant difference in the baseline mean plasma level of β-endorphin (p = 0.183, p > 0.05). The conclusion of this study is that laserpuncture can affect plasma β-endorphin levels in healthy subjects, even though the difference in mean levels between groups was not statistically significant.
Hitting Is Contagious in Baseball: Evidence from Long Hitting Streaks
Bock, Joel R.; Maewal, Akhilesh; Gough, David A.
2012-01-01
Data analysis is used to test the hypothesis that "hitting is contagious". A statistical model is described to study the effect of a hot hitter upon his teammates' batting during a consecutive-game hitting streak. Box score data for entire seasons comprising long hitting streaks were compiled. Treatment and control sample groups were constructed from core lineups of players on the streaking batter's team. The percentile-method bootstrap was used to calculate confidence intervals for statistics representing differences in the mean distributions of two batting statistics between groups. Batters in the treatment group (hot streak active) showed statistically significant improvements in hitting performance as compared against the control: the mean of the first batting statistic was higher during hot streaks, and the batting heat index introduced here was also observed to increase. For each performance statistic, the null hypothesis was rejected at the significance level. We conclude that the evidence suggests the potential existence of a "statistical contagion effect". Psychological mechanisms essential to the empirical results are suggested, as several studies from the scientific literature lend credence to contagious phenomena in sports. Causal inference from these results is difficult, but we suggest and discuss several latent variables that may contribute to the observed results, and offer possible directions for future research. PMID:23251507
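The percentile-method bootstrap for a difference in mean batting statistics between treatment and control groups can be sketched as follows; the batting averages are synthetic, not the paper's box-score data:

```python
import numpy as np

rng = np.random.default_rng(0)

def percentile_ci_diff(a, b, n_boot=2000, alpha=0.05):
    """Percentile-method bootstrap CI for mean(a) - mean(b)."""
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        # resample each group with replacement and record the mean difference
        diffs[i] = (rng.choice(a, size=len(a)).mean()
                    - rng.choice(b, size=len(b)).mean())
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

# hypothetical batting averages: treatment (streak active) versus control
treatment = rng.normal(0.290, 0.030, size=200)
control = rng.normal(0.265, 0.030, size=200)

lo, hi = percentile_ci_diff(treatment, control)
```

A confidence interval whose lower bound stays above zero corresponds to rejecting the null hypothesis of no improvement, as in the paper.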
On the Helicity in 3D-Periodic Navier-Stokes Equations II: The Statistical Case
NASA Astrophysics Data System (ADS)
Foias, Ciprian; Hoang, Luan; Nicolaenko, Basil
2009-09-01
We study the asymptotic behavior of the statistical solutions to the Navier-Stokes equations using the normalization map [9]. This is then applied to the study of the mean energy, mean dissipation rate of energy, and mean helicity of spatially periodic flows driven by potential body forces. The statistical distribution of the asymptotic Beltrami flows is also investigated. We connect our mathematical analysis with the empirical theory of decaying turbulence. With appropriate mathematically defined ensemble averages, the Kolmogorov universal features are shown to be transient in time, and we provide an estimate for the time interval in which those features may still be present. Our collaborator and friend Basil Nicolaenko passed away in September of 2007, after this work was completed. Honoring his contribution and friendship, we dedicate this article to him.
An operational definition of a statistically meaningful trend.
Bryhn, Andreas C; Dimberg, Peter H
2011-04-28
Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data points is large, a trend may be statistically significant even if the data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends, referred to as statistical meaningfulness, which is a stricter quality criterion than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r² and p values are calculated from regressions of the interval mean values on time. If r² ≥ 0.65 at p ≤ 0.05 in any of these regressions, the trend is regarded as statistically meaningful. Of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
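The criterion translates directly into code; a minimal sketch (the choice of five intervals is an illustrative assumption, as the abstract does not fix one):

```python
import numpy as np
from scipy.stats import linregress

def statistically_meaningful(t, y, n_intervals=5, r2_min=0.65, p_max=0.05):
    """Split the series into intervals, regress interval means on interval
    mean times, and apply the r^2 >= 0.65 at p <= 0.05 criterion."""
    t_means = [c.mean() for c in np.array_split(np.asarray(t, float), n_intervals)]
    y_means = [c.mean() for c in np.array_split(np.asarray(y, float), n_intervals)]
    fit = linregress(t_means, y_means)
    return fit.rvalue ** 2 >= r2_min and fit.pvalue <= p_max

# a clear linear trend buried in moderate noise passes the test
rng = np.random.default_rng(0)
t = np.arange(100)
meaningful = statistically_meaningful(t, 0.5 * t + rng.normal(0, 5, 100))
```

Averaging within intervals suppresses the scatter, so the test passes only when the trend dominates the noise, which is the intended stricter behavior.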
Interpreting “statistical hypothesis testing” results in clinical research
Sarmukaddam, Sanjeev B.
2012-01-01
The difference between clinical significance and statistical significance should be kept in mind while interpreting "statistical hypothesis testing" results in clinical research. This fact is already known to many, but it is pointed out here again because the philosophy of statistical hypothesis testing is sometimes unnecessarily criticized, mainly owing to a failure to consider this distinction. Randomized controlled trials are wrongly criticized in a similar way. That a scientific method may not be applicable in some particular situation does not mean that the method is useless. Also remember that statistical hypothesis testing is not a tool for decision making; for that purpose, the field of decision analysis is very much an integral part of the science of statistics. It is not correct to say that "confidence intervals have nothing to do with confidence" unless one understands the meaning of the word "confidence" as used in the context of a confidence interval. Interpretation of the results of every study should always consider all possible alternative explanations, such as chance, bias, and confounding. Statistical tests in inferential statistics are, in general, designed to answer the question "How likely is it that the difference found in the random sample(s) is due to chance?"; relying only on statistical significance in making clinical decisions should therefore be avoided. PMID:22707861
McLaughlin, Eamon J; Cunningham, Michael J; Kazahaya, Ken; Hsing, Julianna; Kawai, Kosuke; Adil, Eelam A
2016-06-01
To evaluate the feasibility of radiofrequency surgical instrumentation for endoscopic resection of juvenile nasopharyngeal angiofibroma (JNA) and to test the hypothesis that endoscopic radiofrequency ablation-assisted (RFA) resection will have superior intraoperative and/or postoperative outcomes as compared with traditional endoscopic (TE) resection techniques. Case series with chart review. Two tertiary care pediatric hospitals. Twenty-nine pediatric patients who underwent endoscopic transnasal resection of JNA from January 2000 to December 2014. Twenty-nine patients underwent RFA (n = 13) or TE (n = 16) JNA resection over the 15-year study period. Mean patient age was not statistically different between the 2 groups (P = .41); neither was their University of Pittsburgh Medical Center classification stage (P = .79). All patients underwent preoperative embolization. Mean operative times were not statistically different (P = .29). Mean intraoperative blood loss and the need for a transfusion were also not statistically different (P = .27 and .47, respectively). Length of hospital stay was not statistically different (P = .46). Recurrence rates did not differ between groups (P = .99) over a mean follow-up period of 2.3 years. There were no significant differences between RFA and TE resection in intraoperative or postoperative outcome parameters. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.
Population sexual behavior and HIV prevalence in Sub-Saharan Africa: missing links?
Omori, Ryosuke; Abu-Raddad, Laith J
2016-03-01
Patterns of sexual partnering should shape HIV transmission in human populations. The objective of this study was to assess empirical associations between population casual sex behavior and HIV prevalence, and between different measures of casual sex behavior. An ecological study design was applied to nationally representative data, those of the Demographic and Health Surveys, in 25 countries of Sub-Saharan Africa. Spearman rank correlation was used to assess the different correlations, for males and females, and their statistical significance. Correlations between HIV prevalence and the means and variances of the number of casual sex partners were positive, but small and statistically insignificant. The majority of correlations across the means and variances of the number of casual sex partners were positive, large, and statistically significant. However, all correlations of the means, as well as the variances, with the variance for unmarried females were weak and statistically insignificant. Population sexual behavior was not predictive of HIV prevalence across these countries. Nevertheless, the strong correlations across means and variances of sexual behavior suggest that self-reported sexual data are self-consistent and convey valid information content. Unmarried female behavior seemed puzzling but could be playing an influential role in HIV transmission patterns. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
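Spearman rank correlation, as used in this ecological analysis, can be sketched with scipy; the country-level values below are invented for illustration, not Demographic and Health Surveys data:

```python
import numpy as np
from scipy.stats import spearmanr

# hypothetical country-level summaries: mean number of casual sex partners
# and HIV prevalence (%) for seven countries
mean_partners = np.array([0.8, 1.1, 0.5, 1.4, 0.9, 1.2, 0.7])
hiv_prevalence = np.array([2.1, 5.4, 1.0, 4.8, 6.2, 3.3, 1.9])

# rank-based correlation is robust to monotone transformations of either axis
rho, p = spearmanr(mean_partners, hiv_prevalence)
```

Because Spearman's rho depends only on ranks, it suits ecological comparisons where the behavioral and prevalence scales are not directly comparable across countries.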
Occupation times and ergodicity breaking in biased continuous time random walks
NASA Astrophysics Data System (ADS)
Bel, Golan; Barkai, Eli
2005-12-01
Continuous time random walk (CTRW) models are widely used to model diffusion in condensed matter. There are two classes of such models, distinguished by the convergence or divergence of the mean waiting time. Systems with finite average sojourn time are ergodic and thus Boltzmann-Gibbs statistics can be applied. We investigate the statistical properties of CTRW models with infinite average sojourn time; in particular, the occupation time probability density function is obtained. It is shown that in the non-ergodic phase the distribution of the occupation time of the particle on a given lattice point exhibits bimodal U or trimodal W shape, related to the arcsine law. The key points are as follows. (a) In a CTRW with finite or infinite mean waiting time, the distribution of the number of visits on a lattice point is determined by the probability that a member of an ensemble of particles in equilibrium occupies the lattice point. (b) The asymmetry parameter of the probability distribution function of occupation times is related to the Boltzmann probability and to the partition function. (c) The ensemble average is given by Boltzmann-Gibbs statistics for either finite or infinite mean sojourn time, when detailed balance conditions hold. (d) A non-ergodic generalization of the Boltzmann-Gibbs statistical mechanics for systems with infinite mean sojourn time is found.
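The U-shaped occupation-time statistics described above can be illustrated numerically. The sketch below is an illustrative simulation, not the paper's analytics: a particle hops between two lattice sites with Pareto-distributed waiting times (via `random.paretovariate`), whose mean diverges for tail exponent alpha <= 1, so the fraction of time spent on one site remains random at long times and piles up near 0 and 1:

```python
# Illustrative two-site CTRW with heavy-tailed sojourn times.
# For alpha < 1 the mean waiting time is infinite (non-ergodic phase),
# and the occupation-time fraction concentrates near 0 and 1
# (the U shape related to the arcsine law).
import random

def occupation_fraction(alpha, t_max, rng):
    t, site, time_on_0 = 0.0, 0, 0.0
    while t < t_max:
        wait = rng.paretovariate(alpha)   # heavy-tailed sojourn time
        stay = min(wait, t_max - t)
        if site == 0:
            time_on_0 += stay
        t += stay
        site = 1 - site                   # unbiased hop to the other site
    return time_on_0 / t_max

rng = random.Random(1)
fracs = [occupation_fraction(0.5, 1e4, rng) for _ in range(500)]
# Share of runs that spend >90% or <10% of the time on one site:
extreme = sum(1 for f in fracs if f < 0.1 or f > 0.9)
print(extreme / len(fracs))  # a large share of runs sit near 0 or 1
```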
78 FR 255 - Resumption of the Population Estimates Challenge Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-03
... governmental unit. In those instances where a non-functioning county-level government or statistical equivalent...) A non-functioning county or statistical equivalent means a sub- state entity that does not function... represents a non-functioning county or statistical equivalent, the governor will serve as the chief executive...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-06
... recognized statistical rating organization'' (``NRSRO'') as part of the Commission's amendments to its broker... rating agency'' and ``nationally recognized statistical rating organization'' in Exchange Act Sections 3... ``nationally recognized statistical rating organization'' means a credit rating agency that: (A) issues credit...
Statistical Power in Meta-Analysis
ERIC Educational Resources Information Center
Liu, Jin
2015-01-01
Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
Comparative Gender Performance in Business Statistics.
ERIC Educational Resources Information Center
Mogull, Robert G.
1989-01-01
Comparative performance of male and female students in introductory and intermediate statistics classes was examined over a 16-year period at a state university. Gender means from 97 classes, comprising 1,609 males and 1,085 females, revealed a probabilistic, although statistically insignificant, superior performance by female students that appeared to…
An Examination of Statistical Power in Multigroup Dynamic Structural Equation Models
ERIC Educational Resources Information Center
Prindle, John J.; McArdle, John J.
2012-01-01
This study used statistical simulation to calculate differential statistical power in dynamic structural equation models with groups (as in McArdle & Prindle, 2008). Patterns of between-group differences were simulated to provide insight into how model parameters influence power approximations. Chi-square and root mean square error of…
Miró, G; Doménech, A; Escolar, E; Collado, V M; Tejerizo, G; De Las Heras, A; Gómez-Lucía, E
2007-05-01
The electrophoretograms of 89 cats, including cats infected by feline immunodeficiency virus (FIV+), cats infected by feline leukaemia virus (FeLV+), and non-infected cats, showed statistically significant differences in several of the fractions. FIV+ cats had very high total protein values (mean, 8.10 g/dl), mostly because of hypergammaglobulinemia (mean, 2.81 g/dl), as compared with non-infected and FeLV+ animals. In addition, in these FIV+ animals the albumin/globulin ratio (A/G) was very low (mean, 0.72). Statistically significant differences in A/G and the alpha2-globulin fraction were observed in the FeLV+ group (A/G mean, 0.88 +/- 0.08; alpha2-globulin mean, 0.84 +/- 0.07 g/dl) compared with the non-infected group (A/G mean, 1.06 +/- 0.08; alpha2-globulin mean, 0.68 +/- 0.04 g/dl). The alpha1-globulin fraction was higher in doubly infected animals (FIV and FeLV positive, F-F) (3.55 g/dl) than in FeLV+ or FIV+ cats (3.10 and 3.07 g/dl, respectively), but no statistical conclusions may be drawn from this because of the low number of F-F animals. This technique may help to assess the initial clinical status of retrovirus-infected cats and the clinical course of these chronic diseases, specifically during and after suitable therapy.
Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.
Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo
2015-11-01
The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. The Visual Analogue Scale assessed the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using the Likert Scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but no statistically significant differences were found (P > 0.05). Patient satisfaction showed no statistically significant differences. Mean times to perform the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception but did not show statistically significant differences when contrasted with the conventional technique.
Estimation of the geochemical threshold and its statistical significance
Miesch, A.T.
1981-01-01
A statistic is proposed for estimating the geochemical threshold and its statistical significance, or it may be used to identify a group of extreme values that can be tested for significance by other means. The statistic is the maximum gap between adjacent values in an ordered array after each gap has been adjusted for the expected frequency. The values in the ordered array are geochemical values transformed by either ln(?? - ??) or ln(?? - ??) and then standardized so that the mean is zero and the variance is unity. The expected frequency is taken from a fitted normal curve with unit area. The midpoint of an adjusted gap that exceeds the corresponding critical value may be taken as an estimate of the geochemical threshold, and the associated probability indicates the likelihood that the threshold separates two geochemical populations. The adjusted gap test may fail to identify threshold values if the variation tends to be continuous from background values to the higher values that reflect mineralized ground. However, the test will serve to identify other anomalies that may be too subtle to have been noted by other means. © 1981.
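The adjusted-gap idea lends itself to a short sketch. This is a simplified illustration of the approach, not Miesch's exact statistic or its critical values, and the data are invented: standardize the ordered values, weight each gap between adjacent values by the expected normal density at its midpoint, and take the midpoint of the largest adjusted gap as the candidate threshold.

```python
# Simplified adjusted-gap threshold sketch (illustrative data).
import math

def normal_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def adjusted_gap_threshold(values):
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    z = sorted((v - mean) / sd for v in values)
    best, best_mid = -1.0, None
    for a, b in zip(z, z[1:]):
        mid = 0.5 * (a + b)
        adj = (b - a) * normal_pdf(mid)   # gap weighted by expected frequency
        if adj > best:
            best, best_mid = adj, mid
    return mean + sd * best_mid           # threshold on the original scale

background = [10, 11, 12, 12, 13, 13, 14, 15]
anomalous = [40, 42, 45]
print(adjusted_gap_threshold(background + anomalous))  # -> 27.5
```

Here the largest adjusted gap falls between the background values and the anomalous group, so the estimated threshold lands midway between 15 and 40.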
Statistical moments of the Strehl ratio
NASA Astrophysics Data System (ADS)
Yaitskova, Natalia; Esselborn, Michael; Gladysz, Szymon
2012-07-01
Knowledge of the statistical characteristics of the Strehl ratio is essential for the performance assessment of existing and future adaptive optics systems. For a full assessment, not only the mean value of the Strehl ratio but also its higher statistical moments are important. Variance is related to the stability of an image, and skewness reflects the chance of having, in a set of short-exposure images, more or fewer images whose quality exceeds the mean. Skewness is a central parameter in the domain of lucky imaging. We present a rigorous theory for the calculation of the mean value, the variance, and the skewness of the Strehl ratio. In our approach we represent the residual wavefront as being formed by independent cells. The level of the adaptive optics correction defines the number of cells and the variance of the cells, which are the two main parameters of our theory. The deliverables are the values of the three moments as functions of the correction level. We make no further assumptions except for the statistical independence of the cells.
A product Pearson-type VII density distribution
NASA Astrophysics Data System (ADS)
Nadarajah, Saralees; Kotz, Samuel
2008-01-01
The Pearson-type VII distributions (containing the Student's t distributions) are becoming increasingly prominent and are being considered as competitors to the normal distribution. Motivated by real examples in decision sciences, Bayesian statistics, probability theory, and physics, a new Pearson-type VII distribution is introduced by taking the product of two Pearson-type VII pdfs. Various structural properties of this distribution are derived, including its cdf, moments, mean deviation about the mean, mean deviation about the median, entropy, asymptotic distribution of the extreme order statistics, maximum likelihood estimates, and the Fisher information matrix. Finally, an application to a Bayesian testing problem is illustrated.
TSP Symposium 2012 Proceedings
2012-11-01
and Statistical Model 78 7.3 Analysis and Results 79 7.4 Threats to Validity and Limitations 85 7.5 Conclusions 86 7.6 Acknowledgments 87 7.7...Table 12: Overall Statistics of the Experiment 32 Table 13: Results of Pairwise ANOVA Analysis, Highlighting Statistically Significant Differences...we calculated the percentage of defects injected. The distribution statistics are shown in Table 2. Table 2: Mean Lower, Upper Confidence Interval
Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco
NASA Astrophysics Data System (ADS)
Bounoua, Z.; Mechaqrane, A.
2018-05-01
An accurate knowledge of the solar energy reaching the ground is necessary for sizing and optimizing the performance of solar installations. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of hourly GHI measurements. We then eliminated the erroneous values, which are generally due to measurement or cosine-effect errors. Statistical analysis shows that the annual mean daily value of GHI is approximately 5 kWh/m²/day. Daily and monthly mean values and other parameters are also calculated.
Statistical summaries of water-quality data for two coal areas of Jackson County, Colorado
Kuhn, Gerhard
1982-01-01
Statistical summaries of water-quality data are compiled for eight streams in two separate coal areas of Jackson County, Colo. The quality-of-water data were collected from October 1976 to September 1980. For inorganic constituents, the maximum, minimum, and mean concentrations, as well as other statistics are presented; for minor elements, only the maximum, minimum, and mean values are included. Least-squares equations (regressions) are also given relating specific conductance of the streams to the concentration of the major ions. The observed range of specific conductance was 85 to 1,150 micromhos per centimeter for the eight sites. (USGS)
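The "least-squares equations (regressions)" relating specific conductance to major-ion concentration are ordinary linear fits; a minimal sketch with invented (not USGS) data pairs:

```python
# Ordinary least-squares fit of concentration vs. specific conductance.
# Data values are illustrative only.

def least_squares(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx          # slope, intercept

cond = [100, 300, 500, 700, 900]           # specific conductance (illustrative)
conc = [70, 190, 310, 430, 550]            # mg/L, exactly linear here
slope, intercept = least_squares(cond, conc)
print(slope, intercept)  # -> 0.6 10.0
```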
Code of Federal Regulations, 2010 CFR
2010-07-01
..., maps, graphs, pamphlets, notes, charts, tabulations, analyses, statistical or informational... Office. Official business means the authorized business of the Office. General Counsel means the General...
A full year evaluation of the CALIOPE-EU air quality modeling system over Europe for 2004
NASA Astrophysics Data System (ADS)
Pay, M. T.; Piot, M.; Jorba, O.; Gassó, S.; Gonçalves, M.; Basart, S.; Dabdub, D.; Jiménez-Guerrero, P.; Baldasano, J. M.
The CALIOPE-EU high-resolution air quality modeling system, namely WRF-ARW/HERMES-EMEP/CMAQ/BSC-DREAM8b, is developed and applied to Europe (12 km × 12 km, 1 h). The model performance is tested in terms of air quality levels and dynamics reproducibility on a yearly basis. The present work describes a quantitative evaluation of gas-phase species (O3, NO2, and SO2) and particulate matter (PM2.5 and PM10) against ground-based measurements from the EMEP (European Monitoring and Evaluation Programme) network for the year 2004. The evaluation is based on statistics. Simulated O3 achieves satisfactory performance for both daily mean and daily maximum concentrations, especially in summer, with annual mean correlations of 0.66 and 0.69, respectively. Mean normalized errors fall within the recommendations proposed by the United States Environmental Protection Agency (US-EPA). The general trends and daily variations of primary pollutants (NO2 and SO2) are satisfactory. Daily mean concentrations of NO2 correlate well with observations (annual correlation r = 0.67) but tend to be underestimated. For SO2, mean concentrations are well simulated (mean bias = 0.5 μg m-3) with relatively high annual mean correlation (r = 0.60), although peaks are generally overestimated. The dynamics of PM2.5 and PM10 are well reproduced (0.49 < r < 0.62), but mean concentrations remain systematically underestimated. Deficiencies in particulate matter source characterization are discussed. Also, the spatially distributed statistics and the general patterns for each pollutant over Europe are examined. The model performance is compared with other European studies. While O3 statistics generally remain lower than those obtained by the other considered studies, statistics for NO2, SO2, PM2.5, and PM10 present higher scores than most models.
R package MVR for Joint Adaptive Mean-Variance Regularization and Variance Stabilization
Dazard, Jean-Eudes; Xu, Hua; Rao, J. Sunil
2015-01-01
We present an implementation in the R language for statistical computing of our recent non-parametric joint adaptive mean-variance regularization and variance stabilization procedure. The method is specifically suited for handling difficult problems posed by high-dimensional multivariate datasets (p ≫ n paradigm), such as in ‘omics’-type data, among which are that the variance is often a function of the mean, variable-specific estimators of variances are not reliable, and test statistics have low power due to a lack of degrees of freedom. The implementation offers a complete set of features including: (i) a normalization and/or variance stabilization function, (ii) computation of mean-variance-regularized t and F statistics, (iii) generation of diverse diagnostic plots, (iv) synthetic and real ‘omics’ test datasets, (v) a computationally efficient implementation, using C interfacing, and an option for parallel computing, (vi) a manual and documentation on how to set up a cluster. To make each feature as user-friendly as possible, only one subroutine per functionality is to be handled by the end-user. It is available as an R package, called MVR (‘Mean-Variance Regularization’), downloadable from the CRAN. PMID:26819572
An Automated Energy Detection Algorithm Based on Consecutive Mean Excision
2018-01-01
present in the RF spectrum. 15. SUBJECT TERMS RF spectrum, detection threshold algorithm, consecutive mean excision, rank order filter , statistical...Median 4 3.1.9 Rank Order Filter (ROF) 4 3.1.10 Crest Factor (CF) 5 3.2 Statistical Summary 6 4. Algorithm 7 5. Conclusion 8 6. References 9...energy detection algorithm based on morphological filter processing with a semi- disk structure. Adelphi (MD): Army Research Laboratory (US); 2018 Jan
NASA Technical Reports Server (NTRS)
Labovitz, M. L.; Masuoka, E. J. (Principal Investigator)
1981-01-01
The presence of positive serial correlation (autocorrelation) in remotely sensed data results in an underestimate of the variance-covariance matrix when calculated using contiguous pixels. This underestimate produces an inflation in F statistics. For a set of Thematic Mapper Simulator data (TMS), used to test the ability to discriminate a known geobotanical anomaly from its background, the inflation in F statistics related to serial correlation is between 7 and 70 times. This means that significance tests of means of the spectral bands initially appear to suggest that the anomalous site is very different in spectral reflectance and emittance from its background sites. However, this difference often disappears and is always dramatically reduced when compared to frequency distributions of test statistics produced by the comparison of simulated training sets possessing equal means, but which are composed of autocorrelated observations.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., or corporations. A business firm which is identified by the name of one or more persons is not an.... (l) Statistical record means a record maintained for statistical research or reporting purposes only...
Analysis of Statistical Methods Currently used in Toxicology Journals
Na, Jihye; Yang, Hyeri
2014-01-01
Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by published studies are used consistently and conducted on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described the methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes less than 10, with the median and mode being 6 and 3 & 6, respectively. The mean (105/113, 93%) was dominantly used to measure central tendency, and the standard error of the mean (64/113, 57%) and standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why these methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being the most popular (52/93, 56%), yet few studies conducted either a normality or an equal-variance test. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health. PMID:25343012
ERIC Educational Resources Information Center
Beeman, Jennifer Leigh Sloan
2013-01-01
Research has found that students successfully complete an introductory course in statistics without fully comprehending the underlying theory or being able to exhibit statistical reasoning. This is particularly true for the understanding about the sampling distribution of the mean, a crucial concept for statistical inference. This study…
The Importance of Statistical Modeling in Data Analysis and Inference
ERIC Educational Resources Information Center
Rollins, Derrick, Sr.
2017-01-01
Statistical inference simply means to draw a conclusion based on information that comes from data. Error bars are the most commonly used tool for data analysis and inference in chemical engineering data studies. This work demonstrates, using common types of data collection studies, the importance of specifying the statistical model for sound…
Strategies Used by Students to Compare Two Data Sets
ERIC Educational Resources Information Center
Reaburn, Robyn
2012-01-01
One of the common tasks of inferential statistics is to compare two data sets. Long before formal statistical procedures, however, students can be encouraged to make comparisons between data sets and therefore build up intuitive statistical reasoning. Such tasks also give meaning to the data collection students may do. This study describes the…
Applying Statistical Process Quality Control Methodology to Educational Settings.
ERIC Educational Resources Information Center
Blumberg, Carol Joyce
A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (Range), X (individual observations), MR (moving…
Statistics and Title VII Proof: Prima Facie Case and Rebuttal.
ERIC Educational Resources Information Center
Whitten, David
1978-01-01
The method and means by which statistics can raise a prima facie case of Title VII violation are analyzed. A standard is identified that can be applied to determine whether a statistical disparity is sufficient to shift the burden to the employer to rebut a prima facie case of discrimination. (LBH)
The Lure of Statistics in Data Mining
ERIC Educational Resources Information Center
Grover, Lovleen Kumar; Mehra, Rajni
2008-01-01
The field of Data Mining like Statistics concerns itself with "learning from data" or "turning data into information". For statisticians the term "Data mining" has a pejorative meaning. Instead of finding useful patterns in large volumes of data as in the case of Statistics, data mining has the connotation of searching for data to fit preconceived…
Improving Statistics Education through Simulations: The Case of the Sampling Distribution.
ERIC Educational Resources Information Center
Earley, Mark A.
This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…
Effect of Different Phases of Menstrual Cycle on Heart Rate Variability (HRV).
Brar, Tejinder Kaur; Singh, K D; Kumar, Avnish
2015-10-01
Heart Rate Variability (HRV), which is a measure of the cardiac autonomic tone, displays physiological changes throughout the menstrual cycle. The functions of the ANS in various phases of the menstrual cycle were examined in some studies. The aim of our study was to observe the effect of menstrual cycle on cardiac autonomic function parameters in healthy females. A cross-sectional (observational) study was conducted on 50 healthy females, in the age group of 18-25 years. Heart Rate Variability (HRV) was recorded by Physio Pac (PC-2004). The data consisted of Time Domain Analysis and Frequency Domain Analysis in the menstrual, proliferative, and secretory phases of the menstrual cycle. The data collected were analysed statistically using Student's paired t-test. The difference in mean heart rate, LF power%, LFnu and HFnu in menstrual and proliferative phase was found to be statistically significant. The difference in mean RR, Mean HR, RMSSD (the square root of the mean of the squares of the successive differences between adjacent NNs), NN50 (the number of pairs of successive NNs that differ by more than 50 ms), pNN50 (the proportion of NN50 divided by total number of NNs), VLF (very low frequency) power, LF (low frequency) power, LF power%, HF power%, LF/HF ratio, LFnu and HFnu was found to be statistically significant in proliferative and secretory phase. The difference in Mean RR, Mean HR, LFnu and HFnu was found to be statistically significant in secretory and menstrual phases. From the study it can be concluded that sympathetic nervous activity in the secretory phase is greater than in the proliferative phase, whereas parasympathetic nervous activity is predominant in the proliferative phase.
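The time-domain measures named in the abstract (mean RR, mean HR, RMSSD, NN50, pNN50) have simple closed-form definitions; a sketch computing them from an illustrative RR-interval series in milliseconds (conventions differ on the pNN50 denominator; the successive-difference count is used here):

```python
# Time-domain HRV measures from RR intervals (illustrative data, ms).
import math

def hrv_time_domain(rr_ms):
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    nn50 = sum(1 for d in diffs if abs(d) > 50)      # successive diffs > 50 ms
    pnn50 = 100.0 * nn50 / len(diffs)                # percent of diffs
    mean_rr = sum(rr_ms) / len(rr_ms)
    mean_hr = 60000.0 / mean_rr                      # beats per minute
    return {"mean_RR": mean_rr, "mean_HR": mean_hr,
            "RMSSD": rmssd, "NN50": nn50, "pNN50": pnn50}

rr = [800, 810, 790, 860, 805, 800, 900, 795]
print(hrv_time_domain(rr))
```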
Asquith, William H.; Roussel, Meghan C.; Cleveland, Theodore G.; Fang, Xing; Thompson, David B.
2006-01-01
The design of small runoff-control structures, from simple floodwater-detention basins to sophisticated best-management practices, requires the statistical characterization of rainfall as a basis for cost-effective, risk-mitigated, hydrologic engineering design. The U.S. Geological Survey, in cooperation with the Texas Department of Transportation, has developed a framework to estimate storm statistics including storm interevent times, distributions of storm depths, and distributions of storm durations for eastern New Mexico, Oklahoma, and Texas. The analysis is based on hourly rainfall recorded by the National Weather Service. The database contains more than 155 million hourly values from 774 stations in the study area. Seven sets of maps depicting ranges of mean storm interevent time, mean storm depth, and mean storm duration, by county, as well as tables listing each of those statistics, by county, were developed. The mean storm interevent time is used in probabilistic models to assess the frequency distribution of storms. The Poisson distribution is suggested to model the distribution of storm occurrence, and the exponential distribution is suggested to model the distribution of storm interevent times. The four-parameter kappa distribution is judged as an appropriate distribution for modeling the distribution of both storm depth and storm duration. Preference for the kappa distribution is based on interpretation of L-moment diagrams. Parameter estimates for the kappa distributions are provided. Separate dimensionless frequency curves for storm depth and duration are defined for eastern New Mexico, Oklahoma, and Texas. Dimension is restored by multiplying curve ordinates by the mean storm depth or mean storm duration to produce quantile functions of storm depth and duration. Minimum interevent time and location have slight influence on the scale and shape of the dimensionless frequency curves. 
Ten example problems, with solutions, are provided to illustrate possible applications.
What to use to express the variability of data: Standard deviation or standard error of mean?
Barde, Mohini P; Barde, Prajakt J
2012-07-01
Statistics plays a vital role in biomedical research. It helps present data precisely and draw meaningful conclusions. While presenting data, one should be aware of using adequate statistical measures. In biomedical journals, the Standard Error of the Mean (SEM) and Standard Deviation (SD) are used interchangeably to express variability, though they measure different parameters. SEM quantifies uncertainty in the estimate of the mean, whereas SD indicates dispersion of the data from the mean. As readers are generally interested in knowing the variability within the sample, descriptive data should be summarized with SD. Use of SEM should be limited to computing CIs, which measure the precision of the population estimate. Journals can avoid such errors by requiring authors to adhere to their guidelines.
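The distinction reduces to one line of algebra, SEM = SD / sqrt(n); a sketch with illustrative values (1.96 is the usual normal approximation for a 95% CI):

```python
# SD describes spread of the data; SEM describes precision of the mean.
import math

def sd_sem_ci(xs):
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))  # sample SD
    sem = sd / math.sqrt(n)                                      # SEM = SD/sqrt(n)
    ci = (mean - 1.96 * sem, mean + 1.96 * sem)                  # approx. 95% CI
    return mean, sd, sem, ci

mean, sd, sem, ci = sd_sem_ci([4.1, 5.0, 4.6, 5.2, 4.8, 4.3])
print(f"mean={mean:.2f} SD={sd:.2f} SEM={sem:.2f} CI=({ci[0]:.2f}, {ci[1]:.2f})")
```

Because SEM shrinks with sample size while SD does not, reporting SEM always makes data look less variable than reporting SD.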
75 FR 78063 - Passenger Weight and Inspected Vessel Stability Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-14
... Health Statistics NEPA--National Environmental Policy Act of 1969 NHANES--National Health and Nutrition..., Advance Data From Vital Health Statistics Mean Body Weight, Height, and Body Mass Index, United States...
Humans make efficient use of natural image statistics when performing spatial interpolation.
D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S
2013-12-16
Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors, and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system efficiently exploits the statistical structure of natural images.
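The "local mean" heuristic against which human observers were compared can be sketched directly: estimate the missing pixel as the average of its 8 neighbors (toy grayscale values, not natural-image data):

```python
# Local-mean interpolation of a missing pixel (illustrative toy patch).

def local_mean_estimate(img, r, c):
    """Mean of the 8-neighborhood, skipping out-of-bounds positions."""
    vals = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(img) and 0 <= cc < len(img[0]):
                vals.append(img[rr][cc])
    return sum(vals) / len(vals)

patch = [[100, 102, 101],
         [ 98,   0, 103],   # center pixel "missing" (set to 0)
         [ 99, 101, 100]]
print(local_mean_estimate(patch, 1, 1))  # -> 100.5
```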
A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.
Lin, Johnny; Bentler, Peter M
2012-01-01
Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra and Bentler's mean-scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application for these methods in the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at very small sample sizes offers superior Type I error rates under a properly specified model. Data from Mardia, Kent, and Bibby's study of students tested for their ability in five content areas, under either open- or closed-book examinations, were used to illustrate the real-world performance of this statistic.
Statistical wind analysis for near-space applications
NASA Astrophysics Data System (ADS)
Roney, Jason A.
2007-09-01
Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60,000 and 100,000 ft (18–30 km) above ground level (AGL) at two locations: Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data, such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or “knee” in the wind between 65,000 and 72,000 ft AGL (20–22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were made by fitting a Weibull model to the 2004 data and comparing the model with the 2003 and 2005 data; the 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and its cumulative distribution function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
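The 50%, 95%, and 99% winds follow directly from the Weibull quantile function (the inverse of its cumulative distribution function); a minimal sketch, with hypothetical shape and scale parameters rather than the paper's fitted values:

```python
import math

def weibull_quantile(p, shape, scale):
    """Inverse CDF of the Weibull distribution: v_p = scale * (-ln(1 - p))**(1/shape)."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

# Hypothetical shape/scale values for illustration only (not the paper's fits).
shape, scale = 1.8, 12.0  # scale in m/s
for p in (0.50, 0.95, 0.99):
    print(f"{int(p * 100)}% wind: {weibull_quantile(p, shape, scale):.1f} m/s")
```

Because the quantile grows faster than linearly in the tail, sizing an airship's power budget from the mean plus a standard deviation can fall well short of the 99% wind, which is the underestimation the abstract warns about.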
Tree-space statistics and approximations for large-scale analysis of anatomical trees.
Feragen, Aasa; Owen, Megan; Petersen, Jens; Wille, Mathilde M W; Thomsen, Laura H; Dirksen, Asger; de Bruijne, Marleen
2013-01-01
Statistical analysis of anatomical trees is hard to perform due to differences in the topological structure of the trees. In this paper we define statistical properties of leaf-labeled anatomical trees with geometric edge attributes by considering the anatomical trees as points in the geometric space of leaf-labeled trees. This tree-space is a geodesic metric space where any two trees are connected by a unique shortest path, which corresponds to a tree deformation. However, tree-space is not a manifold, and the usual strategy of performing statistical analysis in a tangent space and projecting onto tree-space is not available. Using tree-space and its shortest paths, a variety of statistical properties, such as mean, principal component, hypothesis testing and linear discriminant analysis can be defined. For some of these properties it is still an open problem how to compute them; others (like the mean) can be computed, but efficient alternatives are helpful in speeding up algorithms that use means iteratively, like hypothesis testing. In this paper, we take advantage of a very large dataset (N = 8016) to obtain computable approximations, under the assumption that the data trees parametrize the relevant parts of tree-space well. Using the developed approximate statistics, we illustrate how the structure and geometry of airway trees vary across a population and show that airway trees with Chronic Obstructive Pulmonary Disease come from a different distribution in tree-space than healthy ones. Software is available from http://image.diku.dk/aasa/software.php.
ERIC Educational Resources Information Center
Watier, Nicholas N.; Lamontagne, Claude; Chartier, Sylvain
2011-01-01
The arithmetic mean is a fundamental statistical concept. Unfortunately, social science students rarely develop an intuitive understanding of the mean and rely on the formula to describe or define it. According to constructivist pedagogy, educators that have access to a variety of conceptualizations of a particular concept are better equipped to…
Ladar range image denoising by a nonlocal probability statistics algorithm
NASA Astrophysics Data System (ADS)
Xia, Zhi-Wei; Li, Qi; Xiong, Zhi-Peng; Wang, Qi
2013-01-01
Based on the characteristics of range images from coherent ladar and on nonlocal means (NLM), a nonlocal probability statistics (NLPS) algorithm is proposed in this paper. The difference is that NLM performs denoising using the mean of the conditional probability distribution function (PDF), while NLPS uses the maximum of the marginal PDF. In the algorithm, similar blocks are found by block matching and form a group. Pixels in the group are analyzed by probability statistics, and the gray value with maximum probability is used as the estimated value of the current pixel. Simulated range images of coherent ladar with different carrier-to-noise ratios and a real range image of coherent ladar with 8 gray scales are denoised by this algorithm, and the results are compared with those of the median filter, the multi-template order mean filter, NLM, the median nonlocal mean filter and its incorporation of anatomical side information, and the unsupervised information-theoretic adaptive filter. The range-anomaly noise and Gaussian noise in range images of coherent ladar are effectively suppressed by NLPS.
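The core contrast described above, estimating with the mean of the PDF versus its maximum, can be sketched on a toy group of matched-block gray values; the block-matching step and the actual PDF estimation are omitted, and the values are hypothetical:

```python
from collections import Counter

def nlm_like_estimate(group):
    """NLM-style estimate: the (here unweighted) mean of the gray values
    in the group of similar blocks."""
    return sum(group) / len(group)

def nlps_like_estimate(group):
    """NLPS-style estimate: the gray value with maximum probability,
    i.e. the mode of the group."""
    return Counter(group).most_common(1)[0][0]

# Hypothetical center-pixel gray values from matched blocks; 250 plays the
# role of a range-anomaly outlier.
group = [120, 121, 120, 119, 120, 250, 120, 121]
print(nlm_like_estimate(group))   # 136.375, pulled upward by the outlier
print(nlps_like_estimate(group))  # 120, the most probable gray value
```

The sketch shows why a maximum-probability estimate resists range-anomaly noise that a mean-based estimate averages in.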
Water quality management using statistical analysis and time-series prediction model
NASA Astrophysics Data System (ADS)
Parmar, Kulwinder Singh; Bhardwaj, Rashmi
2014-12-01
This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness, and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values, and confidence limits. Using an autoregressive integrated moving average model, future values of water quality parameters have been estimated. It is observed that the predictive model is useful at 95% confidence limits, and that the curve is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen, and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. It is also observed that the predicted series is close to the original series, indicating a very good fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural, or industrial use.
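The validation metrics named above (root mean square error, mean absolute error, mean absolute percentage error) can be sketched directly; the data here are hypothetical illustrations, not the Yamuna River series:

```python
import math

def rmse(obs, pred):
    """Root mean square error between observed and predicted series."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean absolute error."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def mape(obs, pred):
    """Mean absolute percentage error; observations must be nonzero."""
    return 100.0 * sum(abs((o - p) / o) for o, p in zip(obs, pred)) / len(obs)

# Hypothetical monthly pH observations vs. model predictions.
obs  = [7.2, 7.4, 7.1, 7.3]
pred = [7.0, 7.5, 7.2, 7.3]
print(rmse(obs, pred), mae(obs, pred), mape(obs, pred))
```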
A Divergence Statistics Extension to VTK for Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille
This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order, and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel, and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
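As a generic illustration of the concept (not the VTK engine's API, and in Python rather than the report's C++ snippets), one standard divergence between an empirical distribution and a theoretical "ideal" one is the Kullback-Leibler divergence:

```python
import math

def kl_divergence(empirical, theoretical):
    """Kullback-Leibler divergence D(P || Q) between two discrete
    distributions over the same bins. Zero only when the distributions
    match; like a distance, but asymmetric."""
    return sum(p * math.log(p / q)
               for p, q in zip(empirical, theoretical) if p > 0)

observed = [0.5, 0.3, 0.2]        # empirical bin probabilities (hypothetical)
ideal    = [1/3, 1/3, 1/3]        # theoretical uniform distribution
print(kl_divergence(observed, ideal))
```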
Sparse approximation of currents for statistics on curves and surfaces.
Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas
2008-01-01
Computing, processing, and visualizing statistics on shapes such as curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids feature-based approaches as well as point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, because complexity increases as the size of the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.
Interventions for unilateral refractive amblyopia.
Shotton, Kate; Powell, Christine; Voros, Gerasimos; Hatt, Sarah R
2008-10-08
Unilateral refractive amblyopia is a common cause of reduced visual acuity in childhood, but optimal treatment is not well defined. This review examined the treatment effect from spectacles and conventional occlusion, evaluating the evidence of the effectiveness of spectacles and/or occlusion in the treatment of unilateral refractive amblyopia. We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE and LILACS. Relevant conference proceedings were manually searched. There were no date or language restrictions. The searches were last run on 7 July 2008. Randomised controlled trials of treatment for unilateral refractive amblyopia by spectacles, with or without occlusion, were eligible. We included studies with participants of any age. Two authors independently assessed abstracts identified by the searches. We obtained full text copies and contacted study authors where necessary. Eight trials were eligible for inclusion. Data were extracted from seven. No meta-analysis was performed. For all studies, mean acuity (standard deviation (SD)) in the amblyopic eye post treatment is reported.
Comparison: spectacles only versus no treatment (Clarke 2003). Mean (SD) visual acuity: spectacles group 0.31 (0.17); no treatment group 0.42 (0.19). Mean difference (MD) between groups -0.11 (borderline statistical significance: 95% confidence interval (CI) -0.22 to 0.00).
Comparison: spectacles plus occlusion versus no treatment (Clarke 2003). Mean (SD) visual acuity: full treatment 0.22 (0.13); no treatment 0.42 (0.19). Mean difference between the groups -0.20 (statistically significant: 95% CI -0.30 to -0.10).
Comparison: spectacles plus occlusion versus spectacles only. Clarke 2003: MD -0.09 (borderline statistical significance: 95% CI -0.18 to 0.00); PEDIG 2005b: MD -0.15 (not statistically significant: 95% CI -0.32 to 0.02); PEDIG 2006a: MD 0.01 (not statistically significant: 95% CI -0.08 to 0.10).
Comparison: occlusion regimens. PEDIG 2003a, 2 hours versus 6 hours for moderate amblyopia: MD 0.01 (not statistically significant: 95% CI -0.06 to 0.08); PEDIG 2003b, 6 hours versus full-time for severe amblyopia: MD 0.03 (not statistically significant: 95% CI -0.08 to 0.14); Stewart 2007a, 6 hours versus full-time occlusion: MD -0.12 (not statistically significant: 95% CI -0.27 to 0.03).
In some cases of unilateral refractive amblyopia it appears that there is a treatment benefit from refractive correction alone. Where amblyopia persists, there is some evidence that adding occlusion further improves vision. It remains unclear which treatment regimens are optimal for individual patients. The nature of any dose/response effect from occlusion still needs to be clarified.
Image correlation and sampling study
NASA Technical Reports Server (NTRS)
Popp, D. J.; Mccormack, D. S.; Sedwick, J. L.
1972-01-01
The development of analytical approaches for solving image correlation and image sampling of multispectral data is discussed. Relevant multispectral image statistics which are applicable to image correlation and sampling are identified. The general image statistics include intensity mean, variance, amplitude histogram, power spectral density function, and autocorrelation function. The translation problem associated with digital image registration and the analytical means for comparing commonly used correlation techniques are considered. General expressions for determining the reconstruction error for specific image sampling strategies are developed.
Static Scene Statistical Non-Uniformity Correction
2015-03-01
... Error; NUC, Non-Uniformity Correction; RMSE, Root Mean Squared Error; RSD, Relative Standard Deviation; S3NUC, Static Scene Statistical Non-Uniformity Correction. ... the Relative Standard Deviation (RSD), which normalizes the standard deviation, σ, to the mean estimated value, µ, using the equation RSD = (σ/µ) × 100. The RSD plot of the gain estimates is shown in Figure 4.1(b). The RSD plot shows that after a sample size of approximately 10, the different photocount values and the inclusion ...
Accelerated aging effects on surface hardness and roughness of lingual retainer adhesives.
Ramoglu, Sabri Ilhan; Usumez, Serdar; Buyukyilmaz, Tamer
2008-01-01
To test the null hypothesis that accelerated aging has no effect on the surface microhardness and roughness of two light-cured lingual retainer adhesives. Ten samples of light-cured materials, Transbond Lingual Retainer (3M Unitek) and Light Cure Retainer (Reliance), were cured with a halogen light for 40 seconds. Vickers hardness and surface roughness were measured before and after accelerated aging of 300 hours in a weathering tester. Differences between mean values were analyzed for statistical significance using a t-test. The level of statistical significance was set at P < .05. The mean Vickers hardness of Transbond Lingual Retainer was 62.8 +/- 3.5 and 79.6 +/- 4.9 before and after aging, respectively. The mean Vickers hardness of Light Cure Retainer was 40.3 +/- 2.6 and 58.3 +/- 4.3 before and after aging, respectively. Differences in both groups were statistically significant (P < .001). Following aging, mean surface roughness changed from 0.039 µm to 0.121 µm and from 0.021 µm to 0.031 µm for Transbond Lingual Retainer and Light Cure Retainer, respectively. The roughening of Transbond Lingual Retainer with aging was statistically significant (P < .05), while the change in the surface roughness of Light Cure Retainer was not (P > .05). Accelerated aging significantly increased the surface microhardness of both light-cured retainer adhesives tested. It also significantly increased the surface roughness of the Transbond Lingual Retainer.
NASA Technical Reports Server (NTRS)
Edwards, B. F.; Waligora, J. M.; Horrigan, D. J., Jr.
1985-01-01
This analysis was done to determine whether various decompression response groups could be characterized by the pooled nitrogen (N2) washout profiles of the group members. Pooling individual washout profiles provided a smooth, time-dependent function of means representative of each decompression response group. No statistically significant differences were detected. The statistical comparisons of the profiles were performed by means of univariate weighted t-tests at each 5-minute profile point, with levels of significance of 5 and 10 percent. The estimated powers of the tests (i.e., the probabilities of detecting the observed differences in the pooled profiles) were on the order of 8 to 30 percent.
Streamflow characteristics at streamgages in northern Afghanistan and selected locations
Olson, Scott A.; Williams-Sether, Tara
2010-01-01
Statistical summaries of streamflow data for 79 historical streamgages in Northern Afghanistan and other selected historical streamgages are presented in this report. The summaries for each streamgage include (1) station description, (2) graph of the annual mean discharge for the period of record, (3) statistics of monthly and annual mean discharges, (4) monthly and annual flow duration, (5) probability of occurrence of annual high discharges, (6) probability of occurrence of annual low discharges, (7) probability of occurrence of seasonal low discharges, (8) annual peak discharges for the period of record, and (9) monthly and annual mean discharges for the period of record.
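Probability-of-occurrence summaries like items (5)–(7) above are conventionally computed from ranked annual values; a minimal sketch using the Weibull plotting position, with hypothetical peak discharges rather than the Afghan streamgage data:

```python
def exceedance_probabilities(annual_values):
    """Empirical exceedance probability of each annual value using the
    Weibull plotting position p = m / (n + 1), where m is the rank of the
    value in descending order. A textbook sketch, not the USGS software."""
    n = len(annual_values)
    ranked = sorted(annual_values, reverse=True)
    return [(v, (m + 1) / (n + 1)) for m, v in enumerate(ranked)]

# Hypothetical annual peak discharges (m^3/s)
peaks = [120, 85, 200, 95, 150]
for value, p in exceedance_probabilities(peaks):
    print(f"{value:6.1f}  P(exceed) = {p:.2f}")
```

For annual low flows, the same ranking is done in ascending order to give non-exceedance probabilities.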
NASA Technical Reports Server (NTRS)
Tolson, R. H.
1981-01-01
A technique is described for evaluating the influence of spatial sampling on the determination of global mean total columnar ozone. First- and second-order statistics are derived for each term in a spherical harmonic expansion which represents the ozone field, and the statistics are used to estimate systematic and random errors in the estimates of total ozone. A finite number of coefficients in the expansion are determined, and the truncated part of the expansion is shown to contribute an error to the estimate which depends strongly on the spatial sampling and is relatively insensitive to data noise.
An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process
NASA Technical Reports Server (NTRS)
Carter, M. C.; Madison, M. W.
1973-01-01
The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary processes involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to simulate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and estimates of the mean and variance of the process, a frequency distribution for overshoots can be estimated.
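A minimal sketch of the simulate-and-count step described above, using an AR(1) process as a stand-in for the autocorrelated Gaussian processes in the study; an overshoot is counted here simply as an upcrossing of a fixed level:

```python
import random

def count_upcrossings(series, level):
    """Count upcrossings of `level`: points where the series passes from
    strictly below to at-or-above the level."""
    return sum(1 for a, b in zip(series, series[1:]) if a < level <= b)

def ar1_series(n, phi=0.9, seed=1):
    """Simulate a stationary Gaussian AR(1) process x_t = phi * x_{t-1} + e_t,
    a simple example of a process with a chosen autocorrelation function."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

series = ar1_series(10_000)
print(count_upcrossings(series, level=1.0))
```

Repeating the count at several levels and normalizing yields the empirical frequency distribution of overshoots that the analysis relates to the process mean and variance.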
Joint Adaptive Mean-Variance Regularization and Variance Stabilization of High Dimensional Data.
Dazard, Jean-Eudes; Rao, J Sunil
2012-07-01
The paper addresses a common problem in the analysis of high-dimensional high-throughput "omics" data: parameter estimation across multiple variables in a set of data where the number of variables is much larger than the sample size. Among the problems posed by this type of data are that variable-specific estimators of variances are not reliable and variable-wise test statistics have low power, both due to a lack of degrees of freedom. In addition, it has been observed in this type of data that the variance increases as a function of the mean. We introduce a non-parametric adaptive regularization procedure that is innovative in that: (i) it employs a novel "similarity statistic"-based clustering technique to generate local-pooled or regularized shrinkage estimators of population parameters; (ii) the regularization is done jointly on population moments, benefiting from C. Stein's result on inadmissibility, which implies that the usual sample variance estimator is improved by a shrinkage estimator using information contained in the sample mean. From these joint regularized shrinkage estimators, we derive regularized t-like statistics and show in simulation studies that they offer more statistical power in hypothesis testing than their standard sample counterparts, or regular common-value shrinkage estimators, or when the information contained in the sample mean is simply ignored. Finally, we show that these estimators feature interesting properties of variance stabilization and normalization that can be used for preprocessing high-dimensional multivariate data. The method is available as an R package, called 'MVR' ('Mean-Variance Regularization'), downloadable from the CRAN website.
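A generic common-value shrinkage of variable-wise variances, one of the baselines the regularized estimators above are compared against, can be sketched as follows; this is not the MVR package's joint mean-variance procedure, and the weight and data are hypothetical:

```python
def shrink_variances(sample_vars, weight):
    """Shrink each variable-wise sample variance toward the across-variable
    mean variance: v_i' = (1 - weight) * v_i + weight * mean(v).
    With few degrees of freedom per variable, borrowing strength this way
    stabilizes otherwise unreliable estimates."""
    target = sum(sample_vars) / len(sample_vars)
    return [(1 - weight) * v + weight * target for v in sample_vars]

vars_ = [0.2, 5.0, 1.1, 0.7]          # unstable variable-wise estimates
print(shrink_variances(vars_, 0.5))   # each value pulled toward the pooled 1.75
```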
Zheng, Xiliang; Wang, Jin
2015-01-01
We uncovered the universal statistical laws for the biomolecular recognition/binding process. We quantified the statistical energy landscapes for binding, from which we can characterize the distributions of the binding free energy (affinity), the equilibrium constants, the kinetics and the specificity by exploring the different ligands binding with a particular receptor. The results of the analytical studies are confirmed by the microscopic flexible docking simulations. The distribution of binding affinity is Gaussian around the mean and becomes exponential near the tail. The equilibrium constants of the binding follow a log-normal distribution around the mean and a power law distribution in the tail. The intrinsic specificity for biomolecular recognition measures the degree of discrimination of native versus non-native binding and the optimization of which becomes the maximization of the ratio of the free energy gap between the native state and the average of non-native states versus the roughness measured by the variance of the free energy landscape around its mean. The intrinsic specificity obeys a Gaussian distribution near the mean and an exponential distribution near the tail. Furthermore, the kinetics of binding follows a log-normal distribution near the mean and a power law distribution at the tail. Our study provides new insights into the statistical nature of thermodynamics, kinetics and function from different ligands binding with a specific receptor or equivalently specific ligand binding with different receptors. The elucidation of distributions of the kinetics and free energy has guiding roles in studying biomolecular recognition and function through small-molecule evolution and chemical genetics. PMID:25885453
Linhart, S. Mike; Nania, Jon F.; Christiansen, Daniel E.; Hutchinson, Kasey J.; Sanders, Curtis L.; Archfield, Stacey A.
2013-01-01
A variety of individuals, from water resource managers to recreational users, need streamflow information for planning and decisionmaking at locations where there are no streamgages. To address this problem, two statistically based methods, the Flow Duration Curve Transfer method and the Flow Anywhere method, were developed for statewide application, whereas two physically based models, the Precipitation-Runoff Modeling System and the Soil and Water Assessment Tool, were developed only for the Cedar River Basin. Observed and estimated streamflows from the two methods and two models were compared for goodness of fit at 13 streamgages modeled in the Cedar River Basin by using Nash-Sutcliffe and percent-bias efficiency values. Based on median and mean Nash-Sutcliffe values for the 13 streamgages, the Precipitation-Runoff Modeling System and Soil and Water Assessment Tool models appear to have performed similarly and better than the Flow Duration Curve Transfer and Flow Anywhere methods. Based on median and mean percent-bias values, the Soil and Water Assessment Tool model appears to have generally overestimated daily mean streamflows, whereas the Precipitation-Runoff Modeling System model and the statistical methods appear to have underestimated daily mean streamflows. The Flow Duration Curve Transfer method produced the lowest median and mean percent-bias values and appears to perform better than the other models.
ERIC Educational Resources Information Center
Mount, Robert E.; Schumacker, Randall E.
1998-01-01
A Monte Carlo study was conducted using simulated dichotomous data to determine the effects of guessing on Rasch item fit statistics and the Logit Residual Index. Results indicate that no significant differences were found between the mean Rasch item fit statistics for each distribution type as the probability of guessing the correct answer…
Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings
ERIC Educational Resources Information Center
Omar, M. Hafidz
2010-01-01
Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…
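A Shewhart mean chart of the kind introduced above reduces to computing 3-sigma control limits for subgroup means; a minimal sketch with hypothetical rating data, not the article's:

```python
import math
import statistics

def shewhart_limits(subgroup_means, subgroup_size, sigma):
    """3-sigma Shewhart control limits for the subgroup mean:
    center line +/- 3 * sigma / sqrt(n). Points outside the limits flag
    a loss of temporal consistency in the rating process."""
    center = statistics.mean(subgroup_means)
    half_width = 3.0 * sigma / math.sqrt(subgroup_size)
    return center - half_width, center, center + half_width

# Hypothetical mean ratings from successive scoring sessions of 4 raters.
means = [3.1, 2.9, 3.0, 3.2, 2.8]
lcl, center, ucl = shewhart_limits(means, subgroup_size=4, sigma=0.4)
print(lcl, center, ucl)
```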
García-Aparicio, Luis; Blazquez-Gomez, Eva; Martin, Oriol; Manzanares, Alejandro; García-Smith, Natalie; Bejarano, Miguel; Rodo, Joan; Ribó, Josep M
2014-08-01
The aim of our study is to compare the outcomes of open and laparoscopic pyeloplasty in children less than 12 months of age. We reviewed all medical charts of patients less than 12 months old who underwent pyeloplasty from January 2007 to February 2013. We divided them into two groups: open pyeloplasty (OP) and laparoscopic pyeloplasty (LP). The following data were analyzed: age, sex, weight, ultrasonography (US) measurements, operative time, hospital stay, complications, and success rate. Quantitative data were analyzed with the Student t test or Mann-Whitney U test, and qualitative data with the chi-square test or Fisher test. Fifty-eight patients (46 boys and 12 girls) with a mean age of 4.66 months (±3.05) were included. Mean age was 4.25 months and 5.15 months in the OP and LP groups, respectively. Mean weight was 6.78 kg and 7.02 kg in the OP and LP groups. There were no statistical differences in age, weight, or sex between the OP and LP groups, nor in preoperative ultrasonography measurements. Mean posterior-anterior (PA) pelvis diameter was 28.57 mm and 23.94 mm in the OP and LP groups, respectively. Mean calyceal diameter was 10.86 mm and 10.96 mm in the OP and LP groups, respectively. Mean operative time was 129.53 minutes in the OP group and 151.92 minutes in the LP group, a statistically significant difference (P=0.018). Mean hospital stay was 6.34 days in the OP group and 3.46 days in the LP group, a statistically significant difference (P<0.05). No intraoperative or postoperative complications were found in either group. Hydronephrosis improved in all patients, and no patient needed a repeated pyeloplasty. The laparoscopic approach to Anderson-Hynes pyeloplasty in patients less than 12 months old is a safe procedure with the same outcomes as the open approach.
Adverse Climatic Conditions and Impact on Construction Scheduling and Cost
1988-01-01
ABBREVIATIONS: ABS MAX MAX TEMP, absolute maximum maximum temperature; ABS MIN MIN TEMP, absolute minimum minimum temperature; BTU, ...; °F, degrees Fahrenheit; MEAN MAX TEMP, mean maximum temperature; MEAN MIN TEMP, mean minimum temperature. ... temperatures available, a determination had to be made as to whether forecasts were based on absolute, mean, or statistically derived temperatures
Code of Federal Regulations, 2010 CFR
2010-07-01
... photograph. Related definitions include: (1) System of records means a group of any records under the control... means records used for personnel management programs or processes such as staffing, employee development, retirement, and grievances and appeals. (4) Statistical records means records in a system of records...
NASA Astrophysics Data System (ADS)
Pradhan, Prabhakar; John Park, Daniel; Capoglu, Ilker; Subramanian, Hariharan; Damania, Dhwanil; Cherkezyan, Lusik; Taflove, Allen; Backman, Vadim
2017-06-01
Statistical properties of light waves reflected from a one-dimensional (1D) disordered optical medium [refractive index n(x) = n0 + dn(x)] ...
Waldinger, Marcel D; Zwinderman, Aeilko H; Olivier, Berend; Schweitzer, Dave H
2008-02-01
The intravaginal ejaculation latency time (IELT) behaves in a skewed manner and needs appropriate statistics for correct interpretation of treatment results. The aim is to explain the rightful use of geometric mean IELT values and the fold increase of the geometric mean IELT, given the positively skewed IELT distribution, by linking theoretical arguments to the outcomes of several selective serotonin reuptake inhibitor and modern antidepressant studies. The main outcome measures are the geometric mean IELT and the fold increase of the geometric mean IELT. Log-transforming each separate IELT measurement of each individual man is the basis for the calculation of the geometric mean IELT. A drug-induced positively skewed IELT distribution necessitates the calculation of the geometric mean IELTs at baseline and during drug treatment. In a positively skewed IELT distribution, use of the "arithmetic" mean IELT risks an overestimation of the drug-induced ejaculation delay, as the mean IELT is always higher than the geometric mean IELT. Strong ejaculation-delaying drugs give rise to a strongly positively skewed IELT distribution, whereas weak ejaculation-delaying drugs give rise to (much) less skewed IELT distributions. Ejaculation delay is expressed as the fold increase of the geometric mean IELT. Drug-induced ejaculatory performance discloses a positively skewed IELT distribution, requiring the use of the geometric mean IELT and the fold increase of the geometric mean IELT.
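The log-transform calculation of the geometric mean, and the fold increase it yields, can be sketched with hypothetical IELT measurements (not data from the cited studies):

```python
import math

def geometric_mean(values):
    """Geometric mean via the log transform: exp(mean(ln(x)))."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical IELT measurements (seconds) for one man, baseline vs. on drug.
baseline = [30, 60, 45, 90, 40]
on_drug  = [120, 300, 150, 600, 200]

gm_base, gm_drug = geometric_mean(baseline), geometric_mean(on_drug)
print(round(gm_drug / gm_base, 2))            # fold increase of the geometric mean
print(sum(on_drug) / len(on_drug) > gm_drug)  # True: arithmetic mean exceeds
                                              # geometric mean for skewed data
```

The last line illustrates the abstract's point: on positively skewed data the arithmetic mean always exceeds the geometric mean, so quoting it overstates the drug-induced delay.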
Driscoll, Daniel G.; Zogorski, John S.
1990-01-01
The report presents a summary of basin characteristics affecting streamflow, a history of the U.S. Geological Survey's stream-gaging program, and a compilation of discharge records and statistical summaries for selected sites within the Rapid Creek basin. It is the first in a series that will investigate surface-water/groundwater relations along Rapid Creek. The summary of basin characteristics includes descriptions of the geology and hydrogeology, physiography and climate, land use and vegetation, reservoirs, and water use within the basin. A recounting of the U.S. Geological Survey's stream-gaging program and a tabulation of historic stream-gaging stations within the basin are furnished. A compilation of monthly and annual mean discharge values for nine currently operated, long-term, continuous-record streamflow-gaging stations on Rapid Creek is presented. The statistical summary for each site includes summary statistics on monthly and annual mean values, a correlation matrix for monthly values, serial correlation at a 1-year lag for monthly values, percentile rankings for monthly and annual mean values, low- and high-value tables, duration curves, and peak-discharge tables. Records of month-end contents for two reservoirs within the basin also are presented. (USGS)
Son, Ji Y; Ramos, Priscilla; DeWolf, Melissa; Loftus, William; Stigler, James W
2018-01-01
In this article, we begin to lay out a framework and approach for studying how students come to understand complex concepts in rich domains. Grounded in theories of embodied cognition, we advance the view that understanding of complex concepts requires students to practice, over time, the coordination of multiple concepts, and the connection of this system of concepts to situations in the world. Specifically, we explore the role that a teacher's gesture might play in supporting students' coordination of two concepts central to understanding in the domain of statistics: mean and standard deviation. In Study 1 we show that university students who have just taken a statistics course nevertheless have difficulty taking both mean and standard deviation into account when thinking about a statistical scenario. In Study 2 we show that presenting the same scenario with an accompanying gesture to represent variation significantly impacts students' interpretation of the scenario. Finally, in Study 3 we present evidence that instructional videos on the internet fail to leverage gesture as a means of facilitating understanding of complex concepts. Taken together, these studies illustrate an approach to translating current theories of cognition into principles that can guide instructional design.
On the statistical properties and tail risk of violent conflicts
NASA Astrophysics Data System (ADS)
Cirillo, Pasquale; Taleb, Nassim Nicholas
2016-06-01
We examine statistical pictures of violent conflicts over the last 2000 years, providing techniques for dealing with the unreliability of historical data. We make use of a novel approach to deal with fat-tailed random variables with a remote but nonetheless finite upper bound, by defining a corresponding unbounded dual distribution (given that potential war casualties are bounded by the world population). This approach can also be applied to other fields of science where power laws play a role in modeling, like geology, hydrology, statistical physics and finance. We apply methods from extreme value theory on the dual distribution and derive its tail properties. The dual method allows us to calculate the real tail mean of war casualties, which proves to be considerably larger than the corresponding sample mean for large thresholds, meaning severe underestimation of the tail risks of conflicts from naive observation. We analyze the robustness of our results to errors in historical reports. We study inter-arrival times between tail events and find that no particular trend can be asserted. All the statistical pictures obtained are at variance with the prevailing claims about the "long peace", namely that violence has been declining over time.
Arroyo-Hernández, M; Mellado-Romero, M A; Páramo-Díaz, P; Martín-López, C M; Cano-Egea, J M; Vilá Y Rico, J
2015-01-01
The purpose of this study was to analyze whether there is any difference between arthroscopic repair of full-thickness supraspinatus tears with a single-row technique versus a suture-bridge technique. We conducted a retrospective study of 123 patients with full-thickness supraspinatus tears treated between January 2009 and January 2013 in our hospital. There were 60 single-row repairs and 63 suture-bridge repairs. The mean age was 62.9 years in the single-row group and 63.3 years in the suture-bridge group. There were more women than men in both groups (67%). All patients were evaluated using the Constant test. The mean Constant score was 76.7 in the suture-bridge group and 72.4 in the single-row group. We also performed a statistical analysis of each Constant item. Strength was higher in the suture-bridge group, with a statistically significant difference (p = 0.04). The range of movement was also greater in the suture-bridge group, but the difference was not statistically significant. The suture-bridge technique has better clinical results than single-row repair, but the difference is not statistically significant (p = 0.298).
The capacity limitations of orientation summary statistics
Attarha, Mouna; Moore, Cathleen M.
2015-01-01
The simultaneous–sequential method was used to test the processing capacity of establishing mean orientation summaries. Four clusters of oriented Gabor patches were presented in the peripheral visual field. One of the clusters had a mean orientation that was tilted either left or right while the mean orientations of the other three clusters were roughly vertical. All four clusters were presented at the same time in the simultaneous condition whereas the clusters appeared in temporal subsets of two in the sequential condition. Performance was lower when the means of all four clusters had to be processed concurrently than when only two had to be processed in the same amount of time. The advantage for establishing fewer summaries at a given time indicates that the processing of mean orientation engages limited-capacity processes (Experiment 1). This limitation cannot be attributed to crowding, low target-distractor discriminability, or a limited-capacity comparison process (Experiments 2 and 3). In contrast to the limitations of establishing multiple summary representations, establishing a single summary representation unfolds without interference (Experiment 4). When interpreted in the context of recent work on the capacity of summary statistics, these findings encourage reevaluation of the view that early visual perception consists of summary statistic representations that unfold independently across multiple areas of the visual field. PMID:25810160
NASA Astrophysics Data System (ADS)
Adamová, D.; Agakichiev, G.; Appelshäuser, H.; Belaga, V.; Braun-Munzinger, P.; Campagnolo, R.; Castillo, A.; Cherlin, A.; Damjanović, S.; Dietel, T.; Dietrich, L.; Drees, A.; Esumi, S.; Filimonov, K.; Fomenko, K.; Fraenkel, Z.; Garabatos, C.; Glässel, P.; Hering, G.; Holeczek, J.; Kushpil, V.; Lenkeit, B.; Ludolphs, W.; Maas, A.; MaríN, A.; Milošević, J.; Milov, A.; Miśkowiec, D.; Musa, L.; Panebrattsev, Yu.; Petchenova, O.; Petráček, V.; Pfeiffer, A.; Rak, J.; Ravinovich, I.; Rehak, P.; Richter, M.; Sako, H.; Schmitz, W.; Schukraft, J.; Sedykh, S.; Seipp, W.; Sharma, A.; Shimansky, S.; SlíVová, J.; Specht, H. J.; Stachel, J.; Šumbera, M.; Tilsner, H.; Tserruya, I.; Wessels, J. P.; Wienold, T.; Windelband, B.; Wurm, J. P.; Xie, W.; Yurevich, S.; Yurevich, V.; Ceres Collaboration
2003-11-01
Measurements of event-by-event fluctuations of the mean transverse momentum in PbAu collisions at 40, 80, and 158 A GeV/c are presented. A significant excess of mean pT fluctuations at mid-rapidity is observed over the expectation from statistically independent particle emission. The results are somewhat smaller than recent measurements at RHIC. A possible non-monotonic behavior of the mean pT fluctuations as a function of collision energy, which may have indicated that the system has passed the critical point of the QCD phase diagram in the range of μB under investigation, has not been observed. The centrality dependence of mean pT fluctuations in PbAu is consistent with an extrapolation from pp collisions assuming that the non-statistical fluctuations scale with multiplicity. The results are compared to calculations by the RQMD and URQMD event generators.
Color stability comparison of silicone facial prostheses following disinfection.
Goiato, Marcelo Coelho; Pesqueira, Aldiéris Alves; dos Santos, Daniela Micheline; Zavanelli, Adriana Cristina; Ribeiro, Paula do Prado
2009-04-01
The purpose of this study was to evaluate the color stability of two silicones for use in facial prostheses under the influence of chemical disinfection and storage time. Twenty-eight specimens were obtained: half were made from Silastic MDX 4-4210 silicone and the other half from Silastic 732 RTV silicone. The specimens were divided into four groups: Silastic 732 RTV and MDX 4-4210 disinfected three times a week with Efferdent, and Silastic 732 RTV and MDX 4-4210 disinfected with neutral soap. Color stability was analyzed by spectrophotometry immediately and 2 months after making the specimens. After obtaining the results, ANOVA and the Tukey test at the 1% significance level were used for statistical analysis. Statistical differences between mean color values were observed. Disinfection with Efferdent did not statistically influence the mean color values. The factors of storage time and disinfection statistically influenced color stability; disinfection acts as a bleaching agent in silicone materials.
Laser Velocimeter Measurements and Analysis in Turbulent Flows with Combustion. Part 2.
1983-07-01
...sampling error for this sample size. Mean velocities and turbulence intensities were found to be statistically accurate to ±1% and ±13%, respectively... Although the statistical error was found to be rather small (±1% for mean velocities and ±13% for turbulence intensities), there can be additional... "Computational and Experimental Study of a Captive Annular Eddy," Journal of Fluid Mechanics, Vol. 28, pt. 1, pp. 43-63, 12 April 1967.
Dimensionally regularized Tsallis' statistical mechanics and two-body Newton's gravitation
NASA Astrophysics Data System (ADS)
Zamora, J. D.; Rocca, M. C.; Plastino, A.; Ferri, G. L.
2018-05-01
Typical quantifiers of Tsallis' statistical mechanics, such as the partition function Z and the mean energy 〈U〉, exhibit poles. The poles appear at distinctive values of Tsallis' characteristic real parameter q, on a numerable set of rational numbers of the q-line. These poles are dealt with using dimensional-regularization techniques. The physical effects of these poles on the specific heats are studied here for the two-body classical gravitation potential.
Sexual network drivers of HIV and herpes simplex virus type 2 transmission
Omori, Ryosuke; Abu-Raddad, Laith J.
2017-01-01
Objectives: HIV and herpes simplex virus type 2 (HSV-2) infections are sexually transmitted and propagate in sexual networks. Using mathematical modeling, we aimed to quantify effects of key network statistics on infection transmission, and extent to which HSV-2 prevalence can be a proxy of HIV prevalence. Design/methods: An individual-based simulation model was constructed to describe sex partnering and infection transmission, and was parameterized with representative natural history, transmission, and sexual behavior data. Correlations were assessed on model outcomes (HIV/HSV-2 prevalences) and multiple linear regressions were conducted to estimate adjusted associations and effect sizes. Results: HIV prevalence was one-third or less of HSV-2 prevalence. HIV and HSV-2 prevalences were associated with a Spearman's rank correlation coefficient of 0.64 (95% confidence interval: 0.58–0.69). Collinearities among network statistics were detected, most notably between concurrency versus mean and variance of number of partners. Controlling for confounding, unmarried mean/variance of number of partners (or alternatively concurrency) were the strongest predictors of HIV prevalence. Meanwhile, unmarried/married mean/variance of number of partners (or alternatively concurrency), and clustering coefficient were the strongest predictors of HSV-2 prevalence. HSV-2 prevalence was a strong predictor of HIV prevalence by proxying effects of network statistics. Conclusion: Network statistics produced similar and differential effects on HIV/HSV-2 transmission, and explained most of the variation in HIV and HSV-2 prevalences. HIV prevalence reflected primarily mean and variance of number of partners, but HSV-2 prevalence was affected by a range of network statistics. HSV-2 prevalence (as a proxy) can forecast a population's HIV epidemic potential, thereby informing interventions. PMID:28514276
NASA Astrophysics Data System (ADS)
von Storch, Hans; Zorita, Eduardo; Cubasch, Ulrich
1993-06-01
A statistical strategy to deduce regional-scale features from climate general circulation model (GCM) simulations has been designed and tested. The main idea is to interrelate the characteristic patterns of observed simultaneous variations of regional climate parameters and of large-scale atmospheric flow using the canonical correlation technique. The large-scale North Atlantic sea level pressure (SLP) is related to the regional variable, the winter (DJF) mean Iberian Peninsula rainfall. The skill of the resulting statistical model is shown by reproducing, to a good approximation, the winter mean Iberian rainfall from 1900 to the present from the observed North Atlantic mean SLP distributions. It is shown that the observed relationship between these two variables is not well reproduced in the output of a general circulation model (GCM). The implications for Iberian rainfall changes as the response to increasing atmospheric greenhouse-gas concentrations simulated by two GCM experiments are examined with the proposed statistical model. In an instantaneous '2 × CO2' doubling experiment, using the simulated change of the mean North Atlantic SLP field to predict Iberian rainfall yields an insignificant increase of area-averaged rainfall of 1 mm/month, with maximum values of 4 mm/month in the northwest of the peninsula. In contrast, for the four GCM grid points representing the Iberian Peninsula, the directly simulated change is a decrease of 10 mm/month, reaching 19 mm/month in the southwest. In the second experiment, with the IPCC scenario A ("business as usual") increase of CO2, the statistical-model results partially differ from the directly simulated rainfall changes: over the experimental range of 100 years, the area-averaged rainfall decreases by 7 mm/month (statistical model) and by 9 mm/month (GCM); at the same time, the amplitude of the interdecadal variability is quite different.
Code of Federal Regulations, 2010 CFR
2010-10-01
... care facility or facility means an organization involved in the delivery of health care services for... the delivery of health care services that is typical for a specified group. Norms means numerical or statistical measures of average observed performance in the delivery of health care services. Outliers means...
A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis
Lin, Johnny; Bentler, Peter M.
2012-01-01
Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean-scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic. PMID:23144511
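The mean-and-variance scaling idea underlying such statistics can be sketched by moment matching: rescale the statistic and its degrees of freedom so the first two moments of the null distribution agree with a chi-square. A toy illustration, not the paper's implementation (the simulated "inflated" null draws and the inflation factor are assumptions):

```python
import random
import statistics

# Toy moment-matching sketch (not the paper's code): rescale a test statistic
# and its degrees of freedom so the first two moments of its null distribution
# match a chi-square distribution.

def mean_variance_adjust(null_draws, T_obs):
    m = statistics.fmean(null_draws)
    v = statistics.variance(null_draws)
    # A chi-square with df d has mean d and variance 2d; match both moments:
    d_adj = 2.0 * m * m / v      # adjusted degrees of freedom
    scale = d_adj / m            # rescaling factor applied to the statistic
    return scale * T_obs, d_adj

random.seed(1)
# Null draws behaving like 1.5 x chi-square(5), built as scaled sums of squares:
draws = [1.5 * sum(random.gauss(0.0, 1.0) ** 2 for _ in range(5))
         for _ in range(20000)]
T_adj, df_adj = mean_variance_adjust(draws, T_obs=12.0)
print(round(df_adj, 2), round(T_adj, 2))  # adjusted df should land near 5
```

The paper's extension additionally matches the third moment (skewness); the two-moment version above is only the Satterthwaite-type baseline it builds on.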
ERIC Educational Resources Information Center
Santos-Delgado, M. J.; Larrea-Tarruella, L.
2004-01-01
The back-titration methods are compared statistically for determining glycine in a nonaqueous medium of acetic acid. Important variations in the mean values of glycine are observed; the interaction effects are examined using the analysis of variance (ANOVA) technique and a statistical study with computer software.
ERIC Educational Resources Information Center
Yuan, Ke-Hai
2008-01-01
In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…
ERIC Educational Resources Information Center
Hood, Michelle; Creed, Peter A.; Neumann, David L.
2012-01-01
We tested a model of the relationship between attitudes toward statistics and achievement based on Eccles' Expectancy Value Model (1983). Participants (n = 149; 83% female) were second-year Australian university students in a psychology statistics course (mean age = 23.36 years, SD = 7.94 years). We obtained demographic details, past performance,…
Schaid, Daniel J
2010-01-01
Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
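The positive-semidefiniteness requirement can be illustrated with the simplest kernel, the inner product of feature vectors, whose Gram matrix satisfies v'Kv = ||X'v||^2 >= 0 for every v. A small sketch with made-up subject data (not from the paper):

```python
import random

# Minimal sketch (not the paper's code): an inner-product kernel yields a Gram
# matrix K with K[i][j] = <x_i, x_j>. K is positive semidefinite because
# v'Kv = ||X'v||^2 >= 0 for every vector v.

def linear_kernel_matrix(X):
    return [[sum(a * b for a, b in zip(xi, xj)) for xj in X] for xi in X]

def quadratic_form(K, v):
    n = len(v)
    return sum(v[i] * K[i][j] * v[j] for i in range(n) for j in range(n))

random.seed(0)
subjects = [[random.gauss(0.0, 1.0) for _ in range(4)] for _ in range(5)]
K = linear_kernel_matrix(subjects)  # larger entries mean more similar subjects

# Positive semidefiniteness holds for any test vector (up to float rounding):
assert all(quadratic_form(K, [random.gauss(0.0, 1.0) for _ in range(5)]) > -1e-9
           for _ in range(100))
```

The same PSD property is what lets such a kernel serve as a covariance structure in the mixed models and score statistics the review discusses.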
Identifying the Source of Misfit in Item Response Theory Models.
Liu, Yang; Maydeu-Olivares, Alberto
2014-01-01
When an item response theory model fails to fit adequately, the items for which the model provides a good fit and those for which it does not must be determined. To this end, we compare the performance of several fit statistics for item pairs with known asymptotic distributions under maximum likelihood estimation of the item parameters: (a) a mean and variance adjustment to bivariate Pearson's X², (b) a bivariate subtable analog to Reiser's (1996) overall goodness-of-fit test, (c) a z statistic for the bivariate residual cross product, and (d) Maydeu-Olivares and Joe's (2006) M2 statistic applied to bivariate subtables. The unadjusted Pearson's X² with heuristically determined degrees of freedom is also included in the comparison. For binary and ordinal data, our simulation results suggest that the z statistic has the best Type I error and power behavior among all the statistics under investigation when the observed information matrix is used in its computation. However, if one has to use the cross-product information, the mean and variance adjusted X² is recommended. We illustrate the use of pairwise fit statistics in 2 real-data examples and discuss possible extensions of the current research in various directions.
Correcting Too Much or Too Little? The Performance of Three Chi-Square Corrections.
Foldnes, Njål; Olsson, Ulf Henning
2015-01-01
This simulation study investigates the performance of three test statistics, T1, T2, and T3, used to evaluate structural equation model fit under non-normal data conditions. T1 is the well-known mean-adjusted statistic of Satorra and Bentler. T2 is the mean-and-variance adjusted statistic of Satterthwaite type, in which the degrees of freedom are manipulated. T3 is a recently proposed version of T2 that does not manipulate the degrees of freedom. Discrepancies between these statistics and their nominal chi-square distribution in terms of Type I and Type II errors are investigated. All statistics are shown to be sensitive to increasing kurtosis in the data, with Type I error rates often far off the nominal level. Under excess kurtosis, true models are generally over-rejected by T1 and under-rejected by T2 and T3, which have similar performance in all conditions. Under misspecification there is a loss of power with increasing kurtosis, especially for T2 and T3. The coefficient of variation of the nonzero eigenvalues of a certain matrix is shown to be a reliable indicator of the adequacy of these statistics.
Perry, Charles A.; Wolock, David M.; Artman, Joshua C.
2004-01-01
Streamflow statistics of flow duration and peak-discharge frequency were estimated for 4,771 individual locations on streams listed on the 1999 Kansas Surface Water Register. These statistics included the flow-duration values of 90, 75, 50, 25, and 10 percent, as well as the mean flow value. Peak-discharge frequency values were estimated for the 2-, 5-, 10-, 25-, 50-, and 100-year floods. Least-squares multiple regression techniques were used, along with Tobit analyses, to develop equations for estimating flow-duration values of 90, 75, 50, 25, and 10 percent and the mean flow for uncontrolled flow stream locations. The contributing-drainage areas of 149 U.S. Geological Survey streamflow-gaging stations in Kansas and parts of surrounding States that had flow uncontrolled by Federal reservoirs and used in the regression analyses ranged from 2.06 to 12,004 square miles. Logarithmic transformations of climatic and basin data were performed to yield the best linear relation for developing equations to compute flow durations and mean flow. In the regression analyses, the significant climatic and basin characteristics, in order of importance, were contributing-drainage area, mean annual precipitation, mean basin permeability, and mean basin slope. The analyses yielded a model standard error of prediction range of 0.43 logarithmic units for the 90-percent duration analysis to 0.15 logarithmic units for the 10-percent duration analysis. The model standard error of prediction was 0.14 logarithmic units for the mean flow. Regression equations used to estimate peak-discharge frequency values were obtained from a previous report, and estimates for the 2-, 5-, 10-, 25-, 50-, and 100-year floods were determined for this report. The regression equations and an interpolation procedure were used to compute flow durations, mean flow, and estimates of peak-discharge frequency for locations along uncontrolled flow streams on the 1999 Kansas Surface Water Register. 
Flow durations, mean flow, and peak-discharge frequency values determined at available gaging stations were used to interpolate the regression-estimated flows for the stream locations where available. Streamflow statistics for locations that had uncontrolled flow were interpolated using data from gaging stations weighted according to the drainage area and the bias between the regression-estimated and gaged flow information. On controlled reaches of Kansas streams, the streamflow statistics were interpolated between gaging stations using only gaged data weighted by drainage area.
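The report's regression approach, fitting log-transformed flow statistics to log-transformed basin characteristics, can be sketched with a one-variable least-squares fit. The drainage areas and flows below are invented, not values from the Kansas analysis:

```python
import math

# One-variable sketch of the log-log regression idea (invented data, not the
# Kansas gaging-station values): fit log10(mean flow) on log10(drainage area).

def fit_loglog(areas, flows):
    x = [math.log10(a) for a in areas]
    y = [math.log10(q) for q in flows]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    intercept = ybar - slope * xbar
    return slope, intercept

areas = [10.0, 50.0, 200.0, 1000.0, 5000.0]  # drainage areas (mi^2), invented
flows = [4.0, 18.0, 60.0, 260.0, 1100.0]     # mean flows (ft^3/s), invented
slope, intercept = fit_loglog(areas, flows)
estimate = 10 ** (intercept + slope * math.log10(500.0))  # flow for 500 mi^2
```

The actual equations also carry precipitation, permeability, and slope terms; the single-predictor form is only for illustration.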
Two Computer Programs for the Statistical Evaluation of a Weighted Linear Composite.
ERIC Educational Resources Information Center
Sands, William A.
1978-01-01
Two computer programs (one batch, one interactive) are designed to provide statistics for a weighted linear combination of several component variables. Both programs provide mean, variance, standard deviation, and a validity coefficient. (Author/JKS)
PERFORMANCE OF TRICKLING FILTER PLANTS: RELIABILITY, STABILITY, VARIABILITY
Effluent quality variability from trickling filters was examined in this study by statistically analyzing daily effluent BOD5 and suspended solids data from 11 treatment plants. Summary statistics (mean, standard deviation, etc.) were examined to determine the general characteris...
Code of Federal Regulations, 2014 CFR
2014-01-01
... non-functioning county or statistical equivalent means a sub-state entity that does not function as an... program, an eligible governmental unit also includes the District of Columbia and non-functioning counties or statistical equivalents represented by a FSCPE member agency. ...
Linear regression models and k-means clustering for statistical analysis of fNIRS data.
Bonomini, Viola; Zucchelli, Lucia; Re, Rebecca; Ieva, Francesca; Spinelli, Lorenzo; Contini, Davide; Paganoni, Anna; Torricelli, Alessandro
2015-02-01
We propose a new algorithm, based on a linear regression model, to statistically estimate the hemodynamic activations in fNIRS data sets. The main concern guiding the algorithm development was the minimization of assumptions and approximations made on the data set for the application of statistical tests. Further, we propose a K-means method to cluster fNIRS data (i.e. channels) as activated or not activated. The methods were validated both on simulated and in vivo fNIRS data. A time domain (TD) fNIRS technique was preferred because of its high performances in discriminating cortical activation and superficial physiological changes. However, the proposed method is also applicable to continuous wave or frequency domain fNIRS data sets.
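The clustering step can be illustrated with a toy one-dimensional k-means (k = 2) over per-channel activation estimates. The channel values and details below are assumptions for illustration, not the authors' code:

```python
import random
import statistics

# Toy sketch of the clustering step (assumed details, not the authors' code):
# a one-dimensional k-means with k = 2 splits per-channel activation estimates
# (e.g. regression betas) into "activated" and "not activated" groups.

def kmeans_1d(values, k=2, iters=50, seed=3):
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda j: (v - centers[j]) ** 2)
            groups[nearest].append(v)
        centers = [statistics.fmean(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

betas = [0.02, 0.05, -0.01, 0.41, 0.38, 0.03, 0.44, 0.00]  # hypothetical channels
centers, groups = kmeans_1d(betas)
activated = max(groups, key=statistics.fmean)  # cluster with the larger mean beta
```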
NASA Technical Reports Server (NTRS)
Jansen, Mark J.; Jones, William R., Jr.; Wheeler, Donald R.; Keller, Dennis J.
2000-01-01
Because CFC 113, an ozone-depleting chemical (ODC), can no longer be produced, alternative bearing cleaning methods must be studied. The objective of this work was to study the effect of the new cleaning methods on lubricant lifetime using a vacuum bearing simulator (spiral orbit rolling contact tribometer). Four alternative cleaning methods were studied: ultraviolet (UV) ozone, aqueous levigated alumina slurry (ALAS), supercritical fluid (SCF) CO2, and aqueous Brulin 815GD. Baseline tests were done using CFC 113. Test conditions were the following: a vacuum of at least 1.3 × 10^-6 Pa, 440C steel components, a rotational speed of 10 RPM, a lubricant charge of 60-75 micrograms, a perfluoropolyalkylether lubricant (Z-25), and a load of 200 N (44.6 lbs, a mean Hertzian stress of 1.5 GPa). Normalized lubricant lifetime was determined by dividing the total number of ball orbits by the amount of lubricant. The failure condition was a friction coefficient of 0.38. Post-test XPS analysis was also performed, showing slight variations in post-cleaning surface chemistry. Statistical analysis of the resultant data was conducted, and it was determined that the data sets were most directly comparable when subjected to a natural log transformation. The natural log life (NL-Life) data for each cleaning method were reasonably normally (statistically) distributed and yielded standard deviations that were not significantly different among the five cleaning methods investigated. This made comparison of their NL-Life means very straightforward using a Bonferroni multiple comparison of means procedure. This procedure showed that the ALAS, UV-ozone, and CFC 113 methods were not statistically significantly different from one another with respect to mean NL-Life. It also found that the SCF CO2 method yielded a significantly higher mean NL-Life than the mean NL-Lives of the ALAS, UV-ozone, and CFC 113 methods.
It also determined that the aqueous Brulin 815GD method yielded a mean NL-Life that was statistically significantly higher than the mean NL-Lives of each of the other four methods. Baseline tests using CFC 113-cleaned parts yielded a mean NL-Life of 3.62 orbits/micro-g. ALAS and UV-ozone yielded similar mean NL-Lives (3.31 and 3.33 orbits/micro-g, respectively). SCF CO2 gave a mean NL-Life of 4.08 orbits/micro-g, and aqueous Brulin 815GD yielded the longest mean NL-Life (4.66 orbits/micro-g).
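The analysis style described, a natural log transformation followed by a Bonferroni multiple comparison of means, can be sketched as follows; the lifetimes in the dictionary are hypothetical, not the measured data:

```python
import math
import statistics

# Sketch of the described analysis style (hypothetical numbers, not the
# measured lifetimes): natural-log-transform lifetimes, compare group means,
# and use a Bonferroni-adjusted per-comparison significance level.

lives = {                       # invented orbits/micro-g lifetimes per method
    "CFC113": [30.0, 45.0, 38.0, 52.0],
    "Brulin": [95.0, 110.0, 88.0, 130.0],
}
nl_life = {k: [math.log(v) for v in vals] for k, vals in lives.items()}
means = {k: statistics.fmean(v) for k, v in nl_life.items()}

n_comparisons = 10                        # 5 methods pairwise: C(5, 2) = 10
alpha_bonferroni = 0.05 / n_comparisons   # per-comparison level 0.005
```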
Frank, C; Bray, D; Rademaker, A; Chrusch, C; Sabiston, P; Bodie, D; Rangayyan, R
1989-01-01
To establish a normal baseline for comparison, thirty-one thousand collagen fibril diameters were measured in calibrated transmission electron microscope (TEM) photomicrographs of normal rabbit medial collateral ligaments (MCLs). A new automated method of quantitation was used to statistically compare fibril minimum-diameter distributions at one midsubstance location in both MCLs from six animals at 3 months of age (immature) and three animals at 10 months of age (mature). Pooled results demonstrate that rabbit MCLs have statistically different (p less than 0.001) mean minimum diameters at these two ages. Interanimal differences in mean fibril minimum diameters were also significant (p less than 0.001) and varied by 20% to 25% in both mature and immature animals. Finally, there were significant differences (p less than 0.001) in mean diameters and distributions from side to side in all animals. These mean left-to-right differences were less than 10% in all mature animals but as much as 62% in some immature animals. Statistical analysis of these data demonstrates that animal-to-animal comparisons using these protocols require a large number of animals, with appropriate numbers of fibrils being measured, to detect small intergroup differences. With experiments that compare left to right ligaments, far fewer animals are required to detect similarly small differences. These results demonstrate the necessity for rigorous control of sampling, an extensive normal baseline, and statistically confirmed experimental designs in any TEM comparisons of collagen fibril diameters.
NASA Technical Reports Server (NTRS)
Merceret, Francis J.; Crawford, Winifred C.
2010-01-01
Knowledge of peak wind speeds is important to the safety of personnel and flight hardware at Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS), but peak speeds are more difficult to forecast than mean wind speeds. Development of a reliable model for the gust factor (GF) relating the peak to the mean wind speed motivated a previous study of GF in tropical storms (TS). The same motivation inspired a climatological study of non-TS peak wind speed statistics without the use of GF. Both studies presented their respective statistics as functions of mean wind speed and height. The few comparisons of TS and non-TS GF in the literature suggest that the non-TS GF at a given height and mean wind speed is smaller than the corresponding TS GF. The investigation reported here converted the non-TS peak wind statistics mentioned above to the equivalent GF statistics and compared the results with the previous TS GF results. The advantage of this effort over all previously reported studies of its kind is that the TS and non-TS data are taken from the same towers in the same locations. That eliminates differing surface attributes, including roughness length and thermal properties, as a major source of variance in the comparison. The results are consistent with the literature but include much more detailed, quantitative information on the nature of the relationship between TS and non-TS GF as a function of height and mean wind speed. In addition, the data suggest the possibility of providing an operational model for non-TS GF as a function of height and wind speed in a manner similar to the one previously developed for TS GF.
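The gust factor itself is simply the ratio of the peak to the mean wind speed over an averaging period. A minimal sketch with made-up anemometer readings (not the study's model):

```python
import statistics

# Simple illustration (not the study's model): the gust factor is the ratio of
# the peak to the mean wind speed over an averaging period. Readings invented.

def gust_factor(speeds):
    return max(speeds) / statistics.fmean(speeds)

one_minute = [8.2, 9.1, 7.8, 10.4, 8.9, 9.6]  # hypothetical readings (m/s)
gf = gust_factor(one_minute)  # peak 10.4 over mean 9.0 -> GF of about 1.16
```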
Kwon, Deukwoo; Reis, Isildinha M
2015-08-12
When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well. However, the Wan et al. method is best for estimating standard deviation under normal distribution. In the estimation of the mean, our ABC method is best regardless of assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95% credible interval when Bayesian analysis has been employed.
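The ABC idea in the abstract can be sketched as rejection sampling: draw candidate (mean, sd) pairs, simulate samples, and keep candidates whose simulated summaries fall near the reported median, minimum, and maximum. Everything below, the flat priors, the tolerance rule, and the normal sampling model, is an assumption for illustration, not the authors' implementation:

```python
import random
import statistics

# Rejection-sampling ABC sketch (not the authors' implementation): the flat
# priors, the tolerance rule, and the normal sampling model are assumptions.

def abc_mean_sd(median_obs, min_obs, max_obs, n, draws=20000, tol=None, seed=7):
    rng = random.Random(seed)
    if tol is None:
        tol = 0.1 * (max_obs - min_obs)
    accepted = []
    for _ in range(draws):
        mu = rng.uniform(min_obs, max_obs)         # assumed flat prior for the mean
        sd = rng.uniform(1e-6, max_obs - min_obs)  # assumed flat prior for the sd
        sample = sorted(rng.gauss(mu, sd) for _ in range(n))
        med, lo, hi = sample[n // 2], sample[0], sample[-1]
        if (abs(med - median_obs) < tol and abs(lo - min_obs) < tol
                and abs(hi - max_obs) < tol):
            accepted.append((mu, sd))
    if not accepted:
        return None
    mus, sds = zip(*accepted)
    return statistics.fmean(mus), statistics.fmean(sds)

# Hypothetical reported summaries: median 10, minimum 4, maximum 16, n = 25.
est = abc_mean_sd(median_obs=10.0, min_obs=4.0, max_obs=16.0, n=25)
if est is not None:
    mu_hat, sd_hat = est  # should land near 10 and 3 for these inputs
```

The posterior means of the accepted pairs serve as the study-level estimates fed into the meta-analysis.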
Self-esteem and its associated factors among secondary school students in Klang District, Selangor.
Sherina, M S; Rampal, L; Loh, J W; Chan, C L; Teh, P C; Tan, P O
2008-03-01
Self-esteem is an important determinant of psychological well-being and is particularly vulnerable during the adolescent life stage. Low self-esteem is correlated with other social problems among today's adolescents. This study was conducted to determine the mean self-esteem score and the association between self-esteem and age, sex, race, religion, number of siblings, ranking among siblings, family function, parental marital status, and smoking among adolescents aged 12 to 20 years. A cross-sectional study design with a random cluster sampling method was used. Four of the 35 secondary schools in Klang District, Selangor were selected; respondents were students in selected classes from the four selected schools. Data were collected using a self-administered, structured, pre-tested questionnaire and analyzed using SPSS version 12.0. Of 1089 respondents, 793 completed the questionnaire (response rate 73.82%). The overall mean self-esteem score was 27.65. The mean self-esteem score for males (27.99) was slightly higher than for females (27.31). The differences in the mean scores by race were statistically significant. There was a statistically significant relationship between mean self-esteem scores and sex, age, race, religion, number of siblings, smoking, and family function, but no statistically significant difference by parental marital status or ranking among siblings. Self-esteem was thus associated with sex, age, race, religion, number of siblings, smoking, and family function.
ERIC Educational Resources Information Center
Zorn, Klaus
1973-01-01
Discussion of statistical apparatus employed in L. Doncheva-Mareva's article on the widespread usage of the present and future tense forms with future meaning in German letters, Deutsch als Fremdsprache, n1 1971. (RS)
Code of Federal Regulations, 2010 CFR
2010-01-01
... means a natural person, corporation, or other business entity. (m) Relevant metropolitan statistical... median family income for the metropolitan statistical area (MSA), if a depository organization is located... exclusively to the business of retail merchandising or manufacturing; (ii) A person whose management functions...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicholson, W L; Harris, J L
1976-03-01
The First ERDA Statistical Symposium was organized to provide a means for communication among ERDA statisticians, and the sixteen papers presented at the meeting are given. Topics include techniques of numerical analysis used for accelerators, nuclear reactors, skewness and kurtosis statistics, radiochemical spectral analysis, quality control, and other statistics problems. Nine of the papers were previously announced in Nuclear Science Abstracts (NSA), while the remaining seven were abstracted for ERDA Energy Research Abstracts (ERA) and INIS Atomindex. (PMA)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kogalovskii, M.R.
This paper reviews problems related to statistical database systems, which are widespread in various fields of activity. Statistical databases (SDBs) are databases that store data used for statistical analysis. Topics under consideration include SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored-data compression techniques, and means of representing statistical data. Whether present database management systems (DBMSs) satisfy SDB requirements is also examined, and some current research directions in SDB systems are considered.
Characterizations of linear sufficient statistics
NASA Technical Reports Server (NTRS)
Peters, B. C., Jr.; Reoner, R.; Decell, H. P., Jr.
1977-01-01
Conditions are established under which a surjective bounded linear operator T from a Banach space X to a Banach space Y is a sufficient statistic for a dominated family of probability measures defined on the Borel sets of X. These results are applied to characterize linear sufficient statistics for families of the exponential type, including the Wishart and multivariate normal distributions as special cases. The latter result is used to establish precisely which procedures for sampling from a normal population have the property that the sample mean is a sufficient statistic.
Low power and type II errors in recent ophthalmology research.
Khan, Zainab; Milko, Jordan; Iqbal, Munir; Masri, Moness; Almeida, David R P
2016-10-01
To investigate the power of unpaired t tests in prospective, randomized controlled trials when these tests failed to detect a statistically significant difference and to determine the frequency of type II errors. Systematic review and meta-analysis. We examined all prospective, randomized controlled trials published between 2010 and 2012 in 4 major ophthalmology journals (Archives of Ophthalmology, British Journal of Ophthalmology, Ophthalmology, and American Journal of Ophthalmology). Studies that used unpaired t tests were included. Power was calculated using the number of subjects in each group, standard deviations, and α = 0.05. The difference between control and experimental means was set to be (1) 20% and (2) 50% of the absolute value of the control's initial conditions. Power and Precision version 4.0 software was used to carry out calculations. Finally, the proportion of articles with type II errors was calculated. β = 0.3 was set as the largest acceptable value for the probability of type II errors. In total, 280 articles were screened. Final analysis included 50 prospective, randomized controlled trials using unpaired t tests. The median power of tests to detect a 50% difference between means was 0.9 and was the same for all 4 journals regardless of the statistical significance of the test. The median power of tests to detect a 20% difference between means ranged from 0.26 to 0.9 for the 4 journals. The median power of these tests to detect a 50% and 20% difference between means was 0.9 and 0.5 for tests that did not achieve statistical significance. A total of 14% and 57% of articles with negative unpaired t tests contained results with β > 0.3 when power was calculated for differences between means of 50% and 20%, respectively. A large portion of studies demonstrate high probabilities of type II errors when detecting small differences between means. The power to detect small difference between means varies across journals. 
It is, therefore, worthwhile for authors to state the minimum clinically important difference for individual studies, and journals can consider publishing statistical guidelines for authors. Day-to-day clinical decisions rely heavily on the evidence base formed by the plethora of studies available to clinicians. Prospective, randomized controlled clinical trials are highly regarded as a robust study design and are used to make important clinical decisions that directly affect patient care. The quality of study designs and statistical methods in major clinical journals is improving over time [1], and researchers and journals are paying more attention to the statistical methodologies incorporated in studies. The results of well-designed ophthalmic studies with robust methodologies, therefore, have the ability to modify the ways in which diseases are managed. Copyright © 2016 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
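The post-hoc power calculation this review describes can be approximated in a few lines. The sketch below uses a normal approximation to the power of a two-sided unpaired t test; the study itself used Power and Precision 4.0, so this is only an illustrative stand-in with hypothetical group means, SDs, and sizes:

```python
from math import sqrt
from statistics import NormalDist

def approx_power(delta, sd1, n1, sd2, n2, alpha=0.05):
    """Normal approximation to the power of a two-sided unpaired t test
    to detect a true mean difference `delta`, given group SDs and sizes."""
    se = sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    z = abs(delta) / se
    nd = NormalDist()
    # two-sided rejection region: either tail can produce significance
    return nd.cdf(z - z_crit) + nd.cdf(-z - z_crit)

# hypothetical trial: control mean 30 units, SD 8, 40 subjects per arm;
# the review set delta to 20% and 50% of the control mean
power_20 = approx_power(delta=0.20 * 30, sd1=8, n1=40, sd2=8, n2=40)
power_50 = approx_power(delta=0.50 * 30, sd1=8, n1=40, sd2=8, n2=40)
```

With these assumed numbers even the 20% difference is well powered; as the SDs grow relative to the means, power for small differences collapses, which is the pattern the review reports across journals.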
NASA Astrophysics Data System (ADS)
Kim, E.; Newton, A. P.
2012-04-01
One major problem in dynamo theory is the multi-scale nature of MHD turbulence, which requires a statistical theory in terms of probability distribution functions. In this contribution, we present the statistical theory of magnetic fields in a simplified mean-field α-Ω dynamo model by varying the statistical properties of alpha, including marginal stability and intermittency, and then utilize observational data of solar activity to fine-tune the mean-field dynamo model. Specifically, we first present a comprehensive investigation into the effect of the stochastic parameters in a simplified α-Ω dynamo model. By considering the manifold of marginal stability (the region of parameter space where the mean growth rate is zero), we show that stochastic fluctuations are conducive to dynamo action. Furthermore, by considering the cases of fluctuating alpha that are periodic and Gaussian coloured random noise with identical characteristic time-scales and fluctuating amplitudes, we show that the transition to dynamo is significantly facilitated for stochastic alpha with random noise. We also show that probability density functions (PDFs) of the growth rate, magnetic field, and magnetic energy can provide a wealth of useful information regarding dynamo behaviour and intermittency. Finally, the precise statistical properties of the dynamo, such as temporal correlation and fluctuating amplitude, are found to depend on the distribution of the fluctuations of the stochastic parameters. We then use observations of solar activity to constrain parameters relating to the α effect in stochastic α-Ω nonlinear dynamo models. This is achieved by performing a comprehensive statistical comparison, computing PDFs of solar activity from observations and from our simulation of the mean-field dynamo model.
The observational data used are the time history of solar activity inferred from 14C data over the past 11000 years on a long time scale and direct observations of sunspot numbers during 1795-1995 on a short time scale. Monte Carlo simulations are performed on these data to obtain PDFs of the solar activity on both long and short time scales. These PDFs are then compared with PDFs predicted by numerical simulation of our α-Ω dynamo model, where α is assumed to have both a mean part α0 and a fluctuating part α'. By varying the correlation time of the fluctuating α', the ratio of the amplitude of the fluctuating to the mean alpha <α'2>/α02 (where angular brackets <> denote an ensemble average), and the ratio of poloidal to toroidal magnetic fields, we show that the results from our stochastic dynamo model can match the PDFs of solar activity on both long and short time scales. In particular, good agreement is obtained when the fluctuation in alpha is roughly equal to the mean part, with a correlation time shorter than the solar period.
Interpreting Bivariate Regression Coefficients: Going beyond the Average
ERIC Educational Resources Information Center
Halcoussis, Dennis; Phillips, G. Michael
2010-01-01
Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
A Unifying Probability Example.
ERIC Educational Resources Information Center
Maruszewski, Richard F., Jr.
2002-01-01
Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…
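A minimal Python version of the kind of unifying example this article describes (the article itself uses Excel) checks the binomial mean and variance formulas by simulation; the sample sizes and parameters here are arbitrary illustrations:

```python
import random
import statistics

# The sum of n independent Bernoulli(p) trials is Binomial(n, p), with
# mean n*p and variance n*p*(1-p); the simulation verifies both, and by
# the central limit theorem the distribution of the sums is near-normal.
random.seed(42)
n, p, reps = 30, 0.4, 20000
sums = [sum(1 for _ in range(n) if random.random() < p) for _ in range(reps)]
emp_mean = statistics.fmean(sums)     # theory: n*p = 12
emp_var = statistics.pvariance(sums)  # theory: n*p*(1-p) = 7.2
```

The same simulation can feed a histogram of the standardized sums to visualize the central limit theorem, which is how the article ties the topics together.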
ERIC Educational Resources Information Center
Schumm, Walter R.; Webb, Farrell J.; Castelo, Carlos S.; Akagi, Cynthia G.; Jensen, Erick J.; Ditto, Rose M.; Spencer Carver, Elaine; Brown, Beverlyn F.
2002-01-01
Discusses the use of historical events as examples for teaching college level statistics courses. Focuses on examples of the space shuttle Challenger, Pearl Harbor (Hawaii), and the RMS Titanic. Finds real life examples can bridge a link to short term experiential learning and provide a means for long term understanding of statistics. (KDR)
Sub-poissonian photon statistics in the coherent state Jaynes-Cummings model in non-resonance
NASA Astrophysics Data System (ADS)
Zhang, Jia-tai; Fan, An-fu
1992-03-01
We study a model of a two-level atom (TLA) interacting non-resonantly with a single-mode quantized cavity field (QCF). The photon number probability function, the mean photon number and Mandel's fluctuation parameter are calculated. Sub-Poissonian distributions of the photon statistics are obtained in the non-resonant interaction. These statistical properties are strongly dependent on the detuning parameters.
Wiley, Jeffrey B.; Curran, Janet H.
2003-01-01
Methods for estimating daily mean flow-duration statistics for seven regions in Alaska and low-flow frequencies for one region, southeastern Alaska, were developed from daily mean discharges for streamflow-gaging stations in Alaska and conterminous basins in Canada. The 15-, 10-, 9-, 8-, 7-, 6-, 5-, 4-, 3-, 2-, and 1-percent duration flows were computed for the October-through-September water year for 222 stations in Alaska and conterminous basins in Canada. The 98-, 95-, 90-, 85-, 80-, 70-, 60-, and 50-percent duration flows were computed for the individual months of July, August, and September for 226 stations in Alaska and conterminous basins in Canada. The 98-, 95-, 90-, 85-, 80-, 70-, 60-, and 50-percent duration flows were computed for the season July-through-September for 65 stations in southeastern Alaska. The 7-day, 10-year and 7-day, 2-year low-flow frequencies for the season July-through-September were computed for 65 stations for most of southeastern Alaska. Low-flow analyses were limited to particular months or seasons in order to omit winter low flows, when ice effects reduce the quality of the records and validity of statistical assumptions. Regression equations for estimating the selected high-flow and low-flow statistics for the selected months and seasons for ungaged sites were developed from an ordinary-least-squares regression model using basin characteristics as independent variables. Drainage area and precipitation were significant explanatory variables for high flows, and drainage area, precipitation, mean basin elevation, and area of glaciers were significant explanatory variables for low flows. The estimating equations can be used at ungaged sites in Alaska and conterminous basins in Canada where streamflow regulation, streamflow diversion, urbanization, and natural damming and releasing of water do not affect the streamflow data for the given month or season. 
Standard errors of estimate ranged from 15 to 56 percent for high-duration flow statistics, 25 to greater than 500 percent for monthly low-duration flow statistics, 32 to 66 percent for seasonal low-duration flow statistics, and 53 to 64 percent for low-flow frequency statistics.
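The regression approach this report describes can be illustrated with a single-variable sketch: a least-squares fit in log space relating a flow statistic to drainage area, then transfer to an ungaged site. The data below are hypothetical, and the report's actual equations use several basin characteristics, not one:

```python
import math

def fit_power_law(areas, flows):
    """Least-squares fit of flow = a * area^b in log10 space (one
    explanatory variable; the report's equations use several)."""
    xs = [math.log10(v) for v in areas]
    ys = [math.log10(v) for v in flows]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = 10 ** (my - b * mx)
    return a, b

# hypothetical gaged basins: drainage area (km2) vs. a low-flow statistic (m3/s)
areas = [50, 120, 300, 800, 2000]
flows = [0.4, 1.1, 3.0, 9.0, 25.0]
a, b = fit_power_law(areas, flows)
est = a * 500 ** b  # transfer the fitted relation to an ungaged 500-km2 site
```

Fitting in log space is what makes the standard errors of estimate naturally expressible in percent, as quoted above.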
Attributing Meanings to Representations of Data: The Case of Statistical Process Control
ERIC Educational Resources Information Center
Hoyles, Celia; Bakker, Arthur; Kent, Phillip; Noss, Richard
2007-01-01
This article is concerned with the meanings that employees in industry attribute to representations of data and the contingencies of these meanings in context. Our primary concern is to more precisely characterize how the context of the industrial process is constitutive of the meaning of graphs of data derived from this process. We draw on data…
NASA Technical Reports Server (NTRS)
Garneau, S.; Plaut, J. J.
2000-01-01
The surface roughness of the Vastitas Borealis Formation on Mars was analyzed with fractal statistics. Root mean square slopes and fractal dimensions were calculated for 74 topographic profiles. Results have implications for radar scattering models.
The statistics of Pearce element diagrams and the Chayes closure problem
NASA Astrophysics Data System (ADS)
Nicholls, J.
1988-05-01
Pearce element ratios are defined as having a constituent in their denominator that is conserved in a system undergoing change. The presence of a conserved element in the denominator simplifies the statistics of such ratios and renders them subject to statistical tests, especially tests of significance of the correlation coefficient between Pearce element ratios. Pearce element ratio diagrams provide unambiguous tests of petrologic hypotheses because they are based on the stoichiometry of rock-forming minerals. There are three ways to recognize a conserved element: 1. The petrologic behavior of the element can be used to select conserved ones. They are usually the incompatible elements. 2. The ratio of two conserved elements will be constant in a comagmatic suite. 3. An element ratio diagram that is not constructed with a conserved element in the denominator will have a trend with a near zero intercept. The last two criteria can be tested statistically. The significance of the slope, intercept and correlation coefficient can be tested by estimating the probability of obtaining the observed values from a random population of arrays. This population of arrays must satisfy two criteria: 1. The population must contain at least one array that has the means and variances of the array of analytical data for the rock suite. 2. Arrays with the means and variances of the data must not be so abundant in the population that nearly every array selected at random has the properties of the data. The population of random closed arrays can be obtained from a population of open arrays whose elements are randomly selected from probability distributions. The means and variances of these probability distributions are themselves selected from probability distributions which have means and variances equal to a hypothetical open array that would give the means and variances of the data on closure. This hypothetical open array is called the Chayes array.
Alternatively, the population of random closed arrays can be drawn from the compositional space available to rock-forming processes. The minerals comprising the available space can be described with one additive component per mineral phase and a small number of exchange components. This space is called Thompson space. Statistics based on either space lead to the conclusion that Pearce element ratios are statistically valid and that Pearce element diagrams depict the processes that create chemical inhomogeneities in igneous rock suites.
Large-volume reduction mammaplasty: the effect of body mass index on postoperative complications.
Gamboa-Bobadilla, G Mabel; Killingsworth, Christopher
2007-03-01
Eighty-six women underwent modified inferior pedicled reduction mammaplasty. All were grouped according to body mass index (BMI): 14 in the overweight group, 51 in the obese group, and 21 in the morbidly obese group. The mean ages were 34, 35, and 36 years, respectively, for the 3 groups and were not statistically different. The mean resection weight was 929 g in the overweight group, 1316 g in the obese group, and 1760 g in the morbidly obese group. Wound-healing complications increased with BMI; the overweight, obese, and morbidly obese groups had complication rates of 21%, 43%, and 71%, respectively, although these differences were not statistically significant. The rate of repeat operations increased proportionally with BMI to 7%, 8%, and 19%, respectively. Postoperative BMI was measured in 30 patients. Fifty percent of this group had limited preoperative activity secondary to breast enlargement. The mean postoperative follow-up period was 43 months. Forty-seven percent of this group continued to have limited activity after breast reduction, with a mean BMI of 37.8 kg/m2. The mean BMI of all women was 37.41 kg/m2, with a total BMI change of -0.4 kg/m2, suggesting that most women do not lose a significant amount of weight after breast reduction. There was no statistically significant difference in long-term BMI.
Liver segmentation from CT images using a sparse priori statistical shape model (SP-SSM).
Wang, Xuehu; Zheng, Yongchang; Gan, Lan; Wang, Xuan; Sang, Xinting; Kong, Xiangfeng; Zhao, Jie
2017-01-01
This study proposes a new liver segmentation method based on a sparse a priori statistical shape model (SP-SSM). First, mark points are selected in the liver a priori model and the original image. Then, the a priori shape and its mark points are used to obtain a dictionary for the liver boundary information. Second, the sparse coefficient is calculated based on the correspondence between mark points in the original image and those in the a priori model, and then the sparse statistical model is established by combining the sparse coefficients and the dictionary. Finally, the intensity energy and boundary energy models are built based on the intensity information and the specific boundary information of the original image. Then, the sparse matching constraint model is established based on the sparse coding theory. These models jointly drive the iterative deformation of the sparse statistical model to approximate and accurately extract the liver boundaries. This method can solve the problems of deformation model initialization and a priori method accuracy using the sparse dictionary. The SP-SSM can achieve a mean overlap error of 4.8% and a mean volume difference of 1.8%, whereas the average symmetric surface distance and the root mean square symmetric surface distance can reach 0.8 mm and 1.4 mm, respectively.
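For reference, the volumetric overlap error quoted above is conventionally defined as 1 - |A∩B|/|A∪B| over the segmented and reference voxel sets; a minimal sketch on synthetic voxel sets:

```python
def overlap_error_pct(seg, ref):
    """Volumetric overlap error, 100 * (1 - |A ∩ B| / |A ∪ B|),
    computed on voxel coordinate sets."""
    return 100.0 * (1.0 - len(seg & ref) / len(seg | ref))

# synthetic 2-D "volumes": the reference is a 10x10 block and the
# segmentation misses one 10-voxel column
ref = {(x, y) for x in range(10) for y in range(10)}
seg = {(x, y) for x in range(9) for y in range(10)}
err = overlap_error_pct(seg, ref)
```

The surface-distance figures quoted (average and root mean square symmetric surface distance) are computed analogously over boundary voxels rather than full volumes.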
Kaplanoglu, Mustafa; Yuce, Tuncay; Bulbul, Mehmet
2015-01-01
The aim was to evaluate the place of mean platelet volume (MPV) in predicting spontaneous miscarriage and to identify any differences in its values following miscarriage after biochemical and clinical pregnancy. We retrospectively evaluated the data of 305 spontaneous miscarriages and 168 control subjects. The miscarriage subjects were evaluated in two groups: miscarriage after biochemical pregnancy (n=79) (BA group) and miscarriage after clinical pregnancy (n=226) (CA group). Demographic and laboratory data of all subjects were statistically compared. No statistically significant difference was found between the miscarriage and control subjects in terms of demographic data and Hb, Htc, WBC, and Plt values. The MPV value in the miscarriage group (8.99±1.47 fl) was statistically significantly lower than in the control group (9.66±1.64 fl) (P<0.001). A statistically significant difference was present between the BA, CA and control groups, with the lowest MPV value in the BA group (8.64±1.34 fl, 9.11±1.49 fl, and 9.66±1.64 fl, respectively) (P<0.001). MPV was significantly lower in patients with miscarriage than in the control group, and this was correlated with the gestational stage at which the miscarriage occurred.
Ward, P. J.
1990-01-01
Recent developments have related quantitative trait expression to metabolic flux. The present paper investigates some implications of this for statistical aspects of polygenic inheritance. Expressions are derived for the within-sibship genetic mean and genetic variance of metabolic flux given a pair of parental, diploid, n-locus genotypes. These are exact and hold for arbitrary numbers of gene loci, arbitrary allelic values at each locus, and for arbitrary recombination fractions between adjacent gene loci. The within-sibship genetic variance is seen to be simply a measure of parental heterozygosity plus a measure of the degree of linkage coupling within the parental genotypes. Approximations are given for the within-sibship phenotypic mean and variance of metabolic flux. These results are applied to the problem of attaining adequate statistical power in a test of association between allozymic variation and inter-individual variation in metabolic flux. Simulations indicate that statistical power can be greatly increased by augmenting the data with predictions and observations on progeny statistics in relation to parental allozyme genotypes. Adequate power may thus be attainable at small sample sizes, and when allozymic variation is scored at only a small fraction of the total set of loci whose catalytic products determine the flux. PMID:2379825
Spectral statistics of the uni-modular ensemble
NASA Astrophysics Data System (ADS)
Joyner, Christopher H.; Smilansky, Uzy; Weidenmüller, Hans A.
2017-09-01
We investigate the spectral statistics of Hermitian matrices in which the elements are chosen uniformly from U(1), called the uni-modular ensemble (UME), in the limit of large matrix size. Using three complementary methods (a supersymmetric integration method, a combinatorial graph-theoretical analysis and a Brownian motion approach), we are able to derive expressions for the 1/N corrections to the mean spectral moments and also analyse the fluctuations about this mean. By addressing the same ensemble from three different points of view, we can critically compare their relative advantages and derive some new results.
The Sport Students’ Ability of Literacy and Statistical Reasoning
NASA Astrophysics Data System (ADS)
Hidayah, N.
2017-03-01
The ability of literacy and statistical reasoning is very important for students of sport education colleges, because material for statistical learning can be taken from many of their activities, such as sport competitions, test and measurement results, predicting achievement based on training, and finding connections among variables. This research describes the literacy and statistical reasoning ability of sport education college students with respect to identification of data types, probability, table interpretation, description and explanation using bar or pie graphics, explanation of variability, and the interpretation, calculation, and explanation of the mean, median, and mode, as measured by an instrument. The instrument was administered to 50 college students majoring in sport; only 26% of the students scored above 30%, while the rest scored below 30%. Across all subjects, 56% of students could identify data types, 49% could read, display, and interpret tables through graphics, 27% could handle probability, 33% could describe variability, and 16.32% could read, calculate, and describe the mean, median, and mode. These results show that the sport students' literacy and statistical reasoning is not yet adequate and has not reached the level of conceptual understanding, so it is critical to improve sport students' ability of literacy and statistical reasoning.
A Statistical Analysis of IrisCode and Its Security Implications.
Kong, Adams Wai-Kin
2015-03-01
IrisCode has been used to gather iris data for 430 million people. Because of the huge impact of IrisCode, it is vital that it is completely understood. This paper first studies the relationship between bit probabilities and a mean of iris images (The mean of iris images is defined as the average of independent iris images.) and then uses the Chi-square statistic, the correlation coefficient and a resampling algorithm to detect statistical dependence between bits. The results show that the statistical dependence forms a graph with a sparse and structural adjacency matrix. A comparison of this graph with a graph whose edges are defined by the inner product of the Gabor filters that produce IrisCodes shows that partial statistical dependence is induced by the filters and propagates through the graph. Using this statistical information, the security risk associated with two patented template protection schemes that have been deployed in commercial systems for producing application-specific IrisCodes is analyzed. To retain high identification speed, they use the same key to lock all IrisCodes in a database. The belief has been that if the key is not compromised, the IrisCodes are secure. This study shows that even without the key, application-specific IrisCodes can be unlocked and that the key can be obtained through the statistical dependence detected.
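The bit-dependence test this abstract describes can be illustrated with a chi-square statistic on a 2x2 contingency table; the bits below are synthetic, not IrisCode data, and the paper also uses correlation coefficients and a resampling algorithm:

```python
import random

def chi_square_2x2(bits_a, bits_b):
    """Chi-square statistic (1 degree of freedom) on the 2x2 contingency
    table of two bit positions observed across many codes."""
    counts = [[0, 0], [0, 0]]
    for a, b in zip(bits_a, bits_b):
        counts[a][b] += 1
    n = len(bits_a)
    chi2 = 0.0
    for i in (0, 1):
        for j in (0, 1):
            expected = ((counts[i][0] + counts[i][1])
                        * (counts[0][j] + counts[1][j]) / n)
            chi2 += (counts[i][j] - expected) ** 2 / expected
    return chi2

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
dep = [b if random.random() < 0.8 else 1 - b for b in x]  # 80% agreement with x
ind = [random.randint(0, 1) for _ in range(5000)]
chi_dep = chi_square_2x2(x, dep)  # large value: dependence detected
chi_ind = chi_square_2x2(x, ind)  # small value: consistent with independence
```

Repeating such a test over all bit pairs yields the sparse, structured adjacency matrix of statistical dependence that the paper analyzes.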
NASA Astrophysics Data System (ADS)
Lomakina, N. Ya.
2017-11-01
This work presents the results of an applied climatic division of the Siberian region into districts, based on an objective classification of atmospheric boundary layer climates by the "temperature-moisture-wind" complex, realized using the method of principal components and special similarity criteria for average profiles and the eigenvalues of correlation matrices. On the territory of Siberia, 14 homogeneous regions were identified for the winter season and 10 for summer. Local statistical models were constructed for each region; these include vertical profiles of mean values, mean square deviations, and matrices of interlevel correlation of temperature, specific humidity, and zonal and meridional wind velocity. The advantage of the obtained local statistical models over the regional models is shown.
Martin, Gary R.; Fowler, Kathleen K.; Arihood, Leslie D.
2016-09-06
Information on low-flow characteristics of streams is essential for the management of water resources. This report provides equations for estimating the 1-, 7-, and 30-day mean low flows for a recurrence interval of 10 years and the harmonic-mean flow at ungaged, unregulated stream sites in Indiana. These equations were developed using the low-flow statistics and basin characteristics for 108 continuous-record streamgages in Indiana with at least 10 years of daily mean streamflow data through the 2011 climate year (April 1 through March 31). The equations were developed in cooperation with the Indiana Department of Environmental Management. Regression techniques were used to develop the equations for estimating low-flow frequency statistics and the harmonic-mean flows on the basis of drainage-basin characteristics. A geographic information system was used to measure basin characteristics for selected streamgages. A final set of 25 basin characteristics measured at all the streamgages were evaluated to choose the best predictors of the low-flow statistics. Logistic-regression equations applicable statewide are presented for estimating the probability that selected low-flow frequency statistics equal zero. These equations use the explanatory variables total drainage area, average transmissivity of the full thickness of the unconsolidated deposits within 1,000 feet of the stream network, and latitude of the basin outlet. The percentage of the streamgage low-flow statistics correctly classified as zero or nonzero using the logistic-regression equations ranged from 86.1 to 88.9 percent. Generalized-least-squares regression equations applicable statewide for estimating nonzero low-flow frequency statistics use total drainage area, the average hydraulic conductivity of the top 70 feet of unconsolidated deposits, the slope of the basin, and the index of permeability and thickness of the Quaternary surficial sediments as explanatory variables.
The average standard error of prediction of these regression equations ranges from 55.7 to 61.5 percent. Regional weighted-least-squares regression equations were developed for estimating the harmonic-mean flows by dividing the State into three low-flow regions. The Northern region uses total drainage area and the average transmissivity of the entire thickness of unconsolidated deposits as explanatory variables. The Central region uses total drainage area, the average hydraulic conductivity of the entire thickness of unconsolidated deposits, and the index of permeability and thickness of the Quaternary surficial sediments. The Southern region uses total drainage area and the percent of the basin covered by forest. The average standard error of prediction for these equations ranges from 39.3 to 66.7 percent. The regional regression equations are applicable only to stream sites with low flows unaffected by regulation and to stream sites with drainage basin characteristic values within specified limits. Caution is advised when applying the equations for basins with characteristics near the applicable limits, for basins with karst drainage features, and for urbanized basins. Extrapolations near and beyond the applicable basin characteristic limits will have unknown errors that may be large. Equations are presented for use in estimating the 90-percent prediction interval of the low-flow statistics estimated by use of the regression equations at a given stream site. The regression equations are to be incorporated into the U.S. Geological Survey StreamStats Web-based application for Indiana. StreamStats allows users to select a stream site on a map and automatically measure the needed basin characteristics and compute the estimated low-flow statistics and associated prediction intervals.
NASA Technical Reports Server (NTRS)
Geller, M. A.; Wu, M.-F.; Gelman, M. E.
1984-01-01
Individual monthly mean general circulation statistics for the Northern Hemisphere winters of 1978-79, 1979-80, 1980-81, and 1981-82 are examined for the altitude region from the Earth's surface to 55 km. Substantial interannual variability is found in the mean zonal geostrophic wind; planetary waves with zonal wavenumbers one and two; the heat and momentum fluxes; and the divergence of the Eliassen-Palm flux. These results are compared with previous studies by other workers. This variability in the monthly means is examined further by looking at both time-latitude sections at constant pressure levels and time-height sections at constant latitudes. The implications of this interannual variability for verifying models and interpreting observations are discussed.
Regional statistics in confined two-dimensional decaying turbulence.
Házi, Gábor; Tóth, Gábor
2011-06-28
Two-dimensional decaying turbulence in a square container has been simulated using the lattice Boltzmann method. The probability density function (PDF) of the vorticity and the particle distribution functions have been determined in various regions of the domain. It is shown that, after the initial stage of decay, the regional area-averaged enstrophy fluctuates strongly in time around a mean value. The ratio of the regional mean enstrophy to the overall enstrophy increases monotonically with increasing distance from the wall. This function has a shape similar to the axial mean velocity profile of turbulent channel flows. When statistics are taken over the whole domain, the PDF of the vorticity peaks at zero and is nearly symmetric. Approaching the wall, the PDFs become skewed owing to the boundary layer.
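The regional area-averaged enstrophy and the vorticity PDF described above can be illustrated on a synthetic field; a minimal sketch (the random array stands in for simulated vorticity data and carries no physics):

```python
import numpy as np

def regional_enstrophy(omega, i0, i1, j0, j1):
    """Area-averaged enstrophy 0.5*<omega^2> over a rectangular
    subregion of a 2D vorticity field."""
    sub = omega[i0:i1, j0:j1]
    return 0.5 * np.mean(sub ** 2)

rng = np.random.default_rng(0)
omega = rng.standard_normal((64, 64))            # stand-in vorticity field
wall = regional_enstrophy(omega, 0, 8, 0, 64)    # strip near a wall
core = regional_enstrophy(omega, 24, 40, 0, 64)  # interior strip
pdf, edges = np.histogram(omega, bins=51, density=True)  # vorticity PDF
```

Comparing `wall` and `core` for simulated data is the kind of regional statistic the abstract reports; the density-normalized histogram approximates the vorticity PDF.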
Streamflow Characteristics of Streams in the Helmand Basin, Afghanistan
Williams-Sether, Tara
2008-01-01
Statistical summaries of streamflow data for all historical streamflow-gaging stations for the Helmand Basin upstream from the Sistan Wetlands are presented in this report. The summaries for each streamflow-gaging station include (1) manuscript (station description), (2) graph of the annual mean discharge for the period of record, (3) statistics of monthly and annual mean discharges, (4) graph of the annual flow duration, (5) monthly and annual flow duration, (6) probability of occurrence of annual high discharges, (7) probability of occurrence of annual low discharges, (8) probability of occurrence of seasonal low discharges, (9) annual peak discharge and corresponding gage height for the period of record, and (10) monthly and annual mean discharges for the period of record.
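The probability-of-occurrence statistics listed above are commonly estimated from ranked annual values with a plotting-position formula; a minimal sketch using the Weibull formula P = m/(n+1) (the peak discharges are hypothetical, and the report does not state which formula it used):

```python
import numpy as np

def exceedance_probability(annual_values):
    """Weibull plotting-position estimate P = m/(n+1) of the
    probability that an annual discharge is equaled or exceeded,
    where m is the rank from largest (1) to smallest (n)."""
    x = np.sort(np.asarray(annual_values, dtype=float))[::-1]
    n = len(x)
    ranks = np.arange(1, n + 1)
    return x, ranks / (n + 1)

peaks = [820.0, 455.0, 1210.0, 640.0, 980.0]  # hypothetical annual peaks, m3/s
sorted_peaks, probs = exceedance_probability(peaks)
```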
Crowson, Cynthia S; Gabriel, Sherine E; Semb, Anne Grete; van Riel, Piet L C M; Karpouzas, George; Dessein, Patrick H; Hitchon, Carol; Pascual-Ramos, Virginia; Kitas, George D
2017-07-01
Cardiovascular disease (CVD) risk calculators developed for the general population do not accurately predict CVD events in patients with RA. We sought to externally validate risk calculators recommended for use in patients with RA, including the EULAR 1.5 multiplier, the Expanded Cardiovascular Risk Prediction Score for RA (ERS-RA) and QRISK2. Seven RA cohorts from the UK, Norway, the Netherlands, the USA, South Africa, Canada and Mexico were combined. Data on baseline CVD risk factors, RA characteristics and CVD outcomes (including myocardial infarction, ischaemic stroke and cardiovascular death) were collected using standardized definitions. Performance of QRISK2, the EULAR multiplier and ERS-RA was compared with other risk calculators [American College of Cardiology/American Heart Association (ACC/AHA), Framingham risk score-Adult Treatment Panel III (FRS-ATP) and Reynolds Risk Score] using c-statistics and the net reclassification index. Among 1796 RA patients without prior CVD [mean (s.d.) age: 54.0 (14.0) years, 74% female], 100 developed CVD events during a mean follow-up of 6.9 years (12,430 person-years). Estimated CVD risk by ERS-RA [mean (s.d.) 8.8% (9.8%)] was comparable to FRS-ATP [mean (s.d.) 9.1% (8.3%)] and Reynolds [mean (s.d.) 9.2% (12.2%)], but lower than ACC/AHA [mean (s.d.) 9.8% (12.1%)]. QRISK2 substantially overestimated risk [mean (s.d.) 15.5% (13.9%)]. Discrimination was not improved for ERS-RA (c-statistic = 0.69), QRISK2 or the EULAR multiplier applied to ACC/AHA compared with ACC/AHA (c-statistic = 0.72 for all) or with FRS-ATP (c-statistic = 0.75). The net reclassification index for ERS-RA was low (-0.8% vs ACC/AHA and 2.3% vs FRS-ATP). The QRISK2, EULAR multiplier and ERS-RA algorithms did not predict CVD risk more accurately in patients with RA than CVD risk calculators developed for the general population. © The Author 2017.
Published by Oxford University Press on behalf of the British Society for Rheumatology.
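The c-statistic used above to compare risk calculators is the probability that a randomly chosen patient who had an event received a higher predicted risk than one who did not; a minimal sketch (the risk values and events are hypothetical):

```python
def c_statistic(risk_scores, events):
    """Concordance (c-) statistic: among all pairs of one subject
    with an event and one without, the fraction in which the event
    subject received the higher predicted risk (ties count 0.5).
    Equivalent to the area under the ROC curve."""
    pos = [r for r, e in zip(risk_scores, events) if e]
    neg = [r for r, e in zip(risk_scores, events) if not e]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# hypothetical predicted 10-year CVD risks and observed events
risks  = [0.02, 0.10, 0.12, 0.25, 0.05]
events = [False, True, False, True, False]
auc = c_statistic(risks, events)
```

A c-statistic of 0.5 is chance-level discrimination and 1.0 is perfect ranking, which is why values around 0.69-0.75, as reported above, are considered moderate.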
Wiley, Jeffrey B.
2012-01-01
Base flows were compared with published streamflow statistics to assess climate variability and to determine which published statistics can be substituted for annual and seasonal base flows of unregulated streams in West Virginia. The comparison study was done by the U.S. Geological Survey, in cooperation with the West Virginia Department of Environmental Protection, Division of Water and Waste Management. The seasons were defined as winter (January 1-March 31), spring (April 1-June 30), summer (July 1-September 30), and fall (October 1-December 31). Differences in mean annual base flows for five record sub-periods (1930-42, 1943-62, 1963-69, 1970-79, and 1980-2002) range from -14.9 to 14.6 percent when compared to the values for the period 1930-2002. Differences between mean seasonal base flows and values for the period 1930-2002 are less variable for winter and spring, -11.2 to 11.0 percent, than for summer and fall, -47.0 to 43.6 percent. Mean summer base flows (July-September) and mean monthly base flows for July, August, September, and October are approximately equal, within 7.4 percentage points of mean annual base flow. The means of the annual, spring, summer, fall, and winter base flows are approximately equal to the annual 50-percent (standard error of 10.3 percent), 45-percent (error of 14.6 percent), 75-percent (error of 11.8 percent), 55-percent (error of 11.2 percent), and 35-percent duration flows (error of 11.1 percent), respectively. The mean seasonal base flows for spring, summer, fall, and winter are approximately equal to the spring 50- to 55-percent (standard error of 6.8 percent), summer 45- to 50-percent (error of 6.7 percent), fall 45-percent (error of 15.2 percent), and winter 60-percent duration flows (error of 8.5 percent), respectively.
Annual and seasonal base flows representative of the period 1930-2002 at unregulated streamflow-gaging stations and ungaged locations in West Virginia can be estimated using previously published values of statistics and procedures.
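A p-percent duration flow, as used throughout the abstract above, is the flow equaled or exceeded p percent of the time; a minimal sketch (the daily flows are hypothetical):

```python
import numpy as np

def duration_flow(daily_flows, percent):
    """Flow equaled or exceeded `percent` of the time, i.e. the
    (100 - percent)th percentile of the daily-flow record."""
    return float(np.percentile(daily_flows, 100.0 - percent))

flows = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
q50 = duration_flow(flows, 50)  # median daily flow
q75 = duration_flow(flows, 75)  # toward the low-flow end of the curve
```

The study's finding that mean annual base flow matches the annual 50-percent duration flow amounts to saying base flow tracks the median of the daily-flow record.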
Feature Statistics Modulate the Activation of Meaning during Spoken Word Processing
ERIC Educational Resources Information Center
Devereux, Barry J.; Taylor, Kirsten I.; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K.
2016-01-01
Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in ("distinctiveness/sharedness") and likelihood of co-occurrence ("correlational…
ERIC Educational Resources Information Center
Haans, Antal
2018-01-01
Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient manner in many statistical software packages. This…
Code of Federal Regulations, 2010 CFR
2010-10-01
... group of any records under the control of the Department or a bureau thereof from which information is... management programs or processes such as staffing, employee development, retirement, and grievances and appeals. (i) Statistical records. As used in this subpart, “statistical records” means records in a system...
Research Design and Statistics for Applied Linguistics.
ERIC Educational Resources Information Center
Hatch, Evelyn; Farhady, Hossein
An introduction to the conventions of research design and statistical analysis is presented for graduate students of applied linguistics. The chapters cover such concepts as the definition of research, variables, research designs, research report formats, sorting and displaying data, probability and hypothesis testing, comparing means,…
Code of Federal Regulations, 2011 CFR
2011-01-01
.... The participant's Social Security number will remain the identifier for the submission of data and... individual or other identifying particular assigned to the individual; (l) Statistical record means a record in a system of records maintained for statistical research or reporting purposes only and not used in...
Effects of the water level on the flow topology over the Bolund island
NASA Astrophysics Data System (ADS)
Cuerva-Tejero, A.; Yeow, T. S.; Gallego-Castillo, C.; Lopez-Garcia, O.
2014-06-01
We have analyzed the influence of the actual height of Bolund island above water level on different full-scale statistics of the velocity field over the peninsula. Our analysis is focused on the database of 10-minute statistics provided by Risø-DTU for the Bolund Blind Experiment. We have considered 10-minute periods with near-neutral atmospheric conditions, mean wind speed values in the interval [5,20] m/s, and westerly wind directions. As expected, statistics such as speed-up, normalized increase of turbulent kinetic energy, and probability of recirculating flow show a large dependence on the emerged height of the island for the locations close to the escarpment. For the published ensemble mean values of speed-up and normalized increase of turbulent kinetic energy at these locations, we propose that some amount of uncertainty could be explained as a deterministic dependence of the flow field statistics upon the actual height of the Bolund island above sea level.
Sanov and central limit theorems for output statistics of quantum Markov chains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horssen, Merlijn van, E-mail: merlijn.vanhorssen@nottingham.ac.uk; Guţă, Mădălin, E-mail: madalin.guta@nottingham.ac.uk
2015-02-15
In this paper, we consider the statistics of repeated measurements on the output of a quantum Markov chain. We establish a large deviations result analogous to Sanov's theorem for the multi-site empirical measure associated to finite sequences of consecutive outcomes of a classical stochastic process. Our result relies on the construction of an extended quantum transition operator (which keeps track of previous outcomes) in terms of which we compute moment generating functions, and whose spectral radius is related to the large deviations rate function. As a corollary to this, we obtain a central limit theorem for the empirical measure. Such higher level statistics may be used to uncover critical behaviour such as dynamical phase transitions, which are not captured by lower level statistics such as the sample mean. As a step in this direction, we give an example of a finite system whose level-1 (empirical mean) rate function is independent of a model parameter while the level-2 (empirical measure) rate is not.
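The distinction drawn above between level-1 (empirical mean) and level-2 (empirical measure) statistics can be illustrated on a classical two-state Markov chain; a minimal sketch (the transition probabilities are illustrative and not from the paper):

```python
import random
from collections import Counter

def simulate_chain(P, n, seed=0):
    """Sample a trajectory of a 2-state Markov chain with row-stochastic
    transition matrix P, starting in state 0."""
    rng = random.Random(seed)
    state, path = 0, []
    for _ in range(n):
        path.append(state)
        state = 0 if rng.random() < P[state][0] else 1
    return path

P = [[0.9, 0.1], [0.2, 0.8]]          # illustrative transition probabilities
path = simulate_chain(P, 10000)
level1 = sum(path) / len(path)        # empirical mean (level-1 statistic)
pairs = Counter(zip(path, path[1:]))  # 2-site empirical measure (level-2)
```

The pair counts in `pairs` retain information about temporal correlations that the scalar `level1` discards, which is the sense in which level-2 statistics are strictly more informative.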
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 1, presents basic information about data including a classification system that describes the four major types of variables: continuous quantitative variable, discrete quantitative variable, ordinal categorical variable (including the binomial variable), and nominal categorical variable. A histogram is a graph that displays the frequency distribution for a continuous variable. The article also demonstrates how to calculate the mean, median, standard deviation, and variance for a continuous variable.
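The descriptive statistics named above for a continuous variable can be computed directly with Python's standard library; a minimal sketch (the sample values are hypothetical, e.g. patient wait times in minutes):

```python
import statistics

# hypothetical sample of a continuous quantitative variable
data = [12.0, 15.5, 9.0, 21.0, 13.5, 17.0, 11.0]

mean = statistics.mean(data)
median = statistics.median(data)
stdev = statistics.stdev(data)        # sample standard deviation (n - 1)
variance = statistics.variance(data)  # sample variance = stdev ** 2
```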
Highway runoff quality models for the protection of environmentally sensitive areas
NASA Astrophysics Data System (ADS)
Trenouth, William R.; Gharabaghi, Bahram
2016-11-01
This paper presents novel highway runoff quality models using artificial neural networks (ANN) that take into account site-specific highway traffic and seasonal storm-event meteorological factors to predict the event mean concentration (EMC) statistics and mean daily unit area load (MDUAL) statistics of common highway pollutants, for the design of roadside ditch treatment systems (RDTS) that protect sensitive receiving environs. A dataset of 940 monitored highway runoff events from fourteen sites located in five countries (Canada, USA, Australia, New Zealand, and China) was compiled and used to develop ANN models for the prediction of highway runoff total suspended solids (TSS) seasonal EMC statistical distribution parameters, as well as the MDUAL statistics for four heavy metal species (Cu, Zn, Cr and Pb). TSS EMCs are needed to estimate the minimum removal efficiency the RDTS must achieve for highway runoff to meet applicable quality standards, and MDUALs are needed to calculate the minimum required capacity of the RDTS to ensure performance longevity.
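An event mean concentration, as modeled above, is the flow-weighted average concentration over a single runoff event; a minimal sketch (the TSS and discharge values are hypothetical):

```python
def event_mean_concentration(concentrations, flows):
    """Flow-weighted event mean concentration:
    EMC = sum(C_i * Q_i) / sum(Q_i) over samples taken during one
    runoff event, so high-flow samples dominate the average."""
    load = sum(c * q for c, q in zip(concentrations, flows))
    return load / sum(flows)

tss = [180.0, 120.0, 60.0, 40.0]  # hypothetical TSS samples, mg/L
q = [0.5, 1.2, 0.8, 0.3]          # discharge at each sample, m3/s
emc = event_mean_concentration(tss, q)
```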
Vortex dynamics and Lagrangian statistics in a model for active turbulence.
James, Martin; Wilczek, Michael
2018-02-14
Cellular suspensions such as dense bacterial flows exhibit a turbulence-like phase under certain conditions. We study this phenomenon of "active turbulence" statistically by using numerical tools. Following Wensink et al. (Proc. Natl. Acad. Sci. U.S.A. 109, 14308 (2012)), we model active turbulence by means of a generalized Navier-Stokes equation. Two-point velocity statistics of active turbulence, in both the Eulerian and the Lagrangian frame, are explored. We characterize the scale-dependent features of two-point statistics in this system. Furthermore, we extend this statistical study with measurements of vortex dynamics in this system. Our observations suggest that the large-scale statistics of active turbulence are close to Gaussian with sub-Gaussian tails.
2010-02-25
Means with an asterisk (*) indicate that the group gave significantly higher emphasis ratings (i.e., a statistically significant difference between SOF operators and SOF leaders). Responses were made on the following scale: 1 = No emphasis, 2...
Human Deception Detection from Whole Body Motion Analysis
2015-12-01
9.3.2. Prediction Probability The output reports from SPSS detail the stepwise procedures for each series of analyses using Wald statistic values for...statistical significance in determining replication, but instead used a combination of significance and direction of means to determine partial or...and the independents need not be unbound. All data were analyzed utilizing the Statistical Package for the Social Sciences (SPSS, v.19.0, Chicago, IL
The Shock and Vibration Digest, Volume 14, Number 2, February 1982
1982-02-01
figurations. DUCTS 82-424 (Also see No. 346) Coupling Loss Factors for Statistical Energy Analysis of Sound Transmission at Rectangular...waves, Sound waves, Wave propagation loss factors for the structure-borne sound...structures by means of statistical energy analysis (SEA) coupling...loss factors of multilayered panels are discussed. Statistical energy analysis (SEA) has proved to be a promising...Experimental results of stiffened panels, damping tape
Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics
Dowding, Irene; Haufe, Stefan
2018-01-01
Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
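One common form of the sufficient-summary-statistic idea described above is to weight each subject's mean effect by the inverse of its estimated variance; a minimal sketch of that general idea, not the paper's exact procedure (all numbers are hypothetical):

```python
import math

def inverse_variance_z(subject_means, subject_vars, subject_ns):
    """Weight each subject's mean effect by the inverse of the
    estimated variance of that mean (within-subject variance / n),
    then test the weighted grand mean against zero with a
    z-statistic. A sketch of the general idea only."""
    weights = [n / v for v, n in zip(subject_vars, subject_ns)]
    grand = sum(w * m for w, m in zip(weights, subject_means)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return grand, grand / se

means = [0.8, 1.1, 0.5]      # hypothetical per-subject mean effects
variances = [4.0, 2.0, 8.0]  # within-subject variances
ns = [100, 80, 60]           # trials per subject
grand, z = inverse_variance_z(means, variances, ns)
```

Unlike the naive group-level t-test on `means` alone, this weighting downweights subjects whose effects are noisily estimated, which is the source of the power gain the abstract describes.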
Meaning of Parental Involvement among Korean Immigrant Parents: A Mixed-Methods Approach
ERIC Educational Resources Information Center
Kim, Yanghee Anna; An, Sohyun; Kim, Hyun Chu Leah; Kim, Jihye
2018-01-01
The authors' goal was to identify ways in which Korean immigrant parents define the concept of parental involvement and to examine the statistical significances of interrelationships among these meanings. Seventy-seven parents responded to an open-ended question that asked them to define the meaning of parental involvement; 141 responses were…
The mean time-limited crash rate of stock price
NASA Astrophysics Data System (ADS)
Li, Yun-Xian; Li, Jiang-Cheng; Yang, Ai-Jun; Tang, Nian-Sheng
2017-05-01
In this article we investigate the occurrence of stock market crashes over an economic cycle. A Bayesian approach, the Heston model, and a statistical-physics method are considered. Specifically, the Heston model and an effective potential are employed to describe the dynamic changes of the stock price. The Bayesian approach is utilized to estimate the Heston model's unknown parameters. The statistical-physics method is used to investigate the occurrence of stock market crashes by calculating the mean time-limited crash rate. Real financial data from the Shanghai Composite Index are analyzed with the proposed methods. The mean time-limited crash rate of the stock price describes the occurrence of stock market crashes over an economic cycle. Both monotonic and nonmonotonic behavior is observed in the mean time-limited crash rate versus the volatility of the stock for various cross-correlation coefficients between volatility and price. A minimum occurrence of stock market crashes, matching an optimal volatility, is also discovered.
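A crash statistic of this kind can be approximated by Monte Carlo simulation of the Heston model; a minimal sketch estimating the probability that the price falls below a crash threshold within a time window (an Euler scheme with illustrative parameter values, not the authors' statistical-physics calculation):

```python
import math, random

def heston_crash_probability(s0=100.0, v0=0.04, mu=0.05, kappa=2.0,
                             theta=0.04, xi=0.3, rho=-0.5,
                             crash_level=70.0, horizon=1.0,
                             n_paths=2000, n_steps=250, seed=1):
    """Monte Carlo estimate of the probability that a Heston price
    path falls below `crash_level` within `horizon` years. Uses an
    Euler scheme with truncation of the variance at zero; all
    parameter values are illustrative only."""
    rng = random.Random(seed)
    dt = horizon / n_steps
    crashes = 0
    for _ in range(n_paths):
        s, v = s0, v0
        for _ in range(n_steps):
            z1 = rng.gauss(0.0, 1.0)
            z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0.0, 1.0)
            vp = max(v, 0.0)                       # truncate negative variance
            s *= math.exp((mu - 0.5 * vp) * dt + math.sqrt(vp * dt) * z1)
            v += kappa * (theta - vp) * dt + xi * math.sqrt(vp * dt) * z2
            if s < crash_level:
                crashes += 1
                break
    return crashes / n_paths

p_crash = heston_crash_probability()
```

Sweeping the volatility parameters and recording `p_crash` per unit time is the spirit of a "crash rate versus volatility" curve like the one described above.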
A global estimate of the Earth's magnetic crustal thickness
NASA Astrophysics Data System (ADS)
Vervelidou, Foteini; Thébault, Erwan
2014-05-01
The Earth's lithosphere is considered to be magnetic only down to the Curie isotherm. Therefore, the Curie isotherm can, in principle, be estimated by analysis of magnetic data. Here, we propose such an analysis in the spectral domain by means of a newly introduced regional spatial power spectrum. This spectrum is based on the Revised Spherical Cap Harmonic Analysis (R-SCHA) formalism (Thébault et al., 2006). We briefly discuss its properties and its relationship with the Spherical Harmonic spatial power spectrum. This relationship allows us to adapt any theoretical expression of the lithospheric field power spectrum expressed in Spherical Harmonic degrees to the regional formulation. We compared previously published statistical expressions (Jackson, 1994; Voorhies et al., 2002) to recent lithospheric field models derived from CHAMP and airborne measurements, and we finally developed a new statistical form for the power spectrum of the Earth's magnetic lithosphere that we think provides more consistent results. This expression depends on the mean magnetization, the mean crustal thickness, and a power-law value that describes the amount of spatial correlation of the sources. In this study, we make combined use of the R-SCHA surface power spectrum and this statistical form. We conduct a series of regional spectral analyses for the entire Earth. For each region, we estimate the R-SCHA surface power spectrum of the NGDC-720 Spherical Harmonic model (Maus, 2010). We then fit each of these observational spectra to the statistical expression of the power spectrum of the Earth's lithosphere. By doing so, we estimate, on a global scale, the long wavelengths of the magnetic crustal thickness that are not accessible directly from the magnetic measurements because they are masked by the core field.
We then discuss these results and compare them to the results we obtained by conducting a similar spectral analysis, but this time in Cartesian coordinates, by means of a published statistical expression (Maus et al., 1997). We also compare our results to crustal thickness global maps derived by means of additional geophysical data (Purucker et al., 2002).
ERIC Educational Resources Information Center
Grossman, Murray; Troiani, Vanessa; Koenig, Phyllis; Work, Melissa; Moore, Peachie
2007-01-01
This study contrasted two approaches to word meaning: the statistically determined role of high-contribution features like "striped" in the meaning of complex nouns like "tiger" typically used in studies of semantic memory, and the contribution of diagnostic features like "parent's brother" that play a critical role in the meaning of nominal kinds…
A statistical spatial power spectrum of the Earth's lithospheric magnetic field
NASA Astrophysics Data System (ADS)
Thébault, E.; Vervelidou, F.
2015-05-01
The magnetic field of the Earth's lithosphere arises from rock magnetization contrasts that were shaped over geological times. The field can be described mathematically in spherical harmonics or with distributions of magnetization. We exploit this dual representation and assume that the lithospheric field is induced by spatially varying susceptibility values within a shell of constant thickness. By introducing a statistical assumption about the power spectrum of the susceptibility, we then derive a statistical expression for the spatial power spectrum of the crustal magnetic field for spatial scales ranging from 60 to 2500 km. This expression depends on the mean induced magnetization, the thickness of the shell, and a power-law exponent for the power spectrum of the susceptibility. We test the relevance of this form with a misfit analysis against the observational NGDC-720 lithospheric magnetic field model power spectrum. This allows us to estimate, at 95 per cent confidence, a mean global apparent induced magnetization value between 0.3 and 0.6 A m-1, a mean magnetic crustal thickness value between 23 and 30 km, and a root mean square field value between 190 and 205 nT. These estimates are in good agreement with independent models of the crustal magnetization and of the seismic crustal thickness. We carry out the same analysis in the continental and oceanic domains separately. We complement the misfit analyses with a Kolmogorov-Smirnov goodness-of-fit test and conclude that the observed power spectrum can, in each case, be regarded as a sample of the statistical one.
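A misfit analysis of this kind fits a parametric spectral form to an observed power spectrum; a minimal sketch using a plain power law as a stand-in for the paper's statistical expression (the data are synthetic and the fit is linear least squares in log space, illustrative only):

```python
import numpy as np

# Illustrative only: fit P(n) = A * n**(-beta) to a synthetic
# "observed" spectrum. The paper's actual statistical form also
# involves the mean magnetization and shell thickness; this sketch
# shows just the misfit-minimization step.
degrees = np.arange(16, 720)                  # spherical-harmonic degrees
true_A, true_beta = 1.0e4, 2.0
observed = true_A * degrees ** (-true_beta)
observed *= np.exp(0.1 * np.random.default_rng(3).standard_normal(degrees.size))

# log P = log A - beta * log n  ->  ordinary least squares
X = np.column_stack([np.ones_like(degrees, dtype=float), np.log(degrees)])
coef, *_ = np.linalg.lstsq(X, np.log(observed), rcond=None)
A_hat, beta_hat = np.exp(coef[0]), -coef[1]
```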
Firoozei, Gholamreza; Shahnaseri, Shirin; Momeni, Hasan; Soltani, Parisa
2017-08-01
The purpose of orthognathic surgery is to correct facial deformity and dental malocclusion and to obtain normal orofacial function. However, there is controversy over whether orthognathic surgery might have any negative influence on the temporomandibular (TM) joint. The purpose of this study was to evaluate the influence of orthognathic surgery on articular disc position and temporomandibular joint symptoms of skeletal Class II patients by means of magnetic resonance imaging. For this purpose, fifteen patients with skeletal Class II malocclusion, aged 19-32 years (mean 23 years), 10 women and 5 men, from the Isfahan Department of Oral and Maxillofacial Surgery were studied. All received LeFort I and bilateral sagittal split osteotomy (BSSO) osteotomies, and all patients received pre- and post-surgical orthodontic treatment. Magnetic resonance imaging was performed 1 day preoperatively and 3 months postoperatively. Descriptive statistics and Wilcoxon and McNemar tests were used for statistical analysis. P<0.05 was considered significant. Disc position ranged between 4.25 and 8.09 prior to surgery (mean=5.74±1.21). After surgery the disc position range was 4.36 to 7.40 (mean=5.65±1.06). Statistical analysis showed that although the TM disc tended to move anteriorly after BSSO surgery, this difference was not statistically significant (p>0.05). The findings of the present study revealed that orthognathic surgery does not alter the disc and condyle relationship. Therefore, it has minimal effects on an intact and functional TM joint. Key words: Orthognathic surgery, skeletal class 2, magnetic resonance imaging, temporomandibular disc.
NASA Astrophysics Data System (ADS)
Meseguer, S.; Sanfeliu, T.; Jordán, M. M.
2009-02-01
The Oliete basin (Early Cretaceous, NE Teruel, Spain) is one of the most important areas for the supply of mine spoils used as ball clays for the production of white and red stoneware in the Spanish ceramic industry of wall and floor tiles. This study is the second part of the paper published recently by Meseguer et al. (Environ Geol 2008) about the use of mine spoils from the Teruel coal mining district. The present study presents a statistical analysis of chemical data (major, minor and trace elements). The statistical analysis of the chemical data included descriptive statistics and cluster analysis (with ANOVA and Scheffé methods). The cluster analysis of the chemical data yielded three main groups: C3, with the highest mean SiO2 content (66%) and lowest mean Al2O3 content (20%); C2, with lower SiO2 content (48%) and higher mean Al2O3 content (28%); and C1, with intermediate mean SiO2 and Al2O3 contents. The main applications of these materials are refractories, white and red ceramics, stoneware, heavy ceramics (including red earthenware, bricks and roof tiles), and components of white Portland cement and aluminous cement. Clays from group C2 are used in refractories (with higher kaolinite content, and restrictions on CaO + MgO and K2O + Na2O contents). All materials can be used in fine ceramics (white or red, according to the Fe2O3 + TiO2 content).
Min, Sang-Ki; Shin, Ji-Ho; Chang, Mun-Young; Min, Hyun-Jin; Kim, Kyung-Soo; Lee, Sei-Young; Yang, Hoon-Shik; Hong, Young-Ho; Mun, Seog-Kyun
2017-03-01
The objective of this study is to investigate the impact of control of blood glucose level during treatment of sudden deafness. A retrospective study was performed involving 197 patients from January 2011 to September 2015. All patients were administered prednisolone (Pharmaprednisolone tab®, 5 mg/T; KoreaPharma) p.o. under the following regimen: 60 mg/day for 4 days, 40 mg/day for 2 days, 30 mg/day for 1 day, 20 mg/day for 1 day, and 10 mg/day for 2 days. During treatment, pure tone audiometry and blood glucose level were investigated for each patient, and the results were statistically analyzed. Mean hearing improvement was 19.2 dB for the non-diabetes group and 24.8 dB for the diabetes group. The greater improvement for diabetics was not statistically significant (p = 0.146). Hearing improvement was 25.1 dB for subjects with mean blood glucose <200 mg/dl and 24.6 dB for subjects with mean blood glucose >200 mg/dl; the difference was not statistically significant (p = 0.267). Mean blood glucose level was 200.8 mg/dl for subjects with hearing improvement >20 dB and 181.8 mg/dl for subjects with hearing improvement <20 dB; the difference was not statistically significant (p = 0.286). Control of blood glucose level during treatment of sudden deafness does not have a direct effect on prognosis.
SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS
NASA Technical Reports Server (NTRS)
Brownlow, J. D.
1994-01-01
The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components.
Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
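The two SPA analysis modes can be illustrated in a few lines of NumPy: time-domain statistics of a sampled signal, and a windowed periodogram for periodicity detection; a minimal sketch (the noisy 5 Hz sinusoid is hypothetical test data, not SPA output):

```python
import numpy as np

# Synthetic signal: 5 Hz sinusoid plus Gaussian noise, sampled at 100 Hz.
rng = np.random.default_rng(42)
dt = 0.01                                # sample interval, seconds
t = np.arange(0, 10, dt)
x = np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.standard_normal(t.size)

# Time domain: mean, variance, standard deviation, RMS, extremes.
stats = {"mean": x.mean(), "var": x.var(), "std": x.std(),
         "rms": np.sqrt(np.mean(x ** 2)),
         "max": x.max(), "min": x.min(), "n": x.size}

# Frequency domain: periodogram with a Hamming window to reduce leakage.
w = np.hamming(x.size)
spec = np.abs(np.fft.rfft(x * w)) ** 2
freqs = np.fft.rfftfreq(x.size, d=dt)
peak_freq = freqs[np.argmax(spec[1:]) + 1]  # skip the zero-frequency bin
```

The Hamming window here plays the smoothing role that SPA assigns to its Hamming option; swapping in `np.bartlett` or `np.hanning` changes the leakage/resolution trade-off.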
Urkmez, Ahmet; Yuksel, Ozgur Haki; Uruc, Fatih; Akan, Serkan; Yildirim, Caglar; Sahin, Aytac; Verit, Ayhan
2016-05-01
Prostatitis affects 10-14% of men of all ages and ethnicities. More than 50% of men experience episodes of prostatitis at some time in their lives. Patients with CP typically have long-lasting genitourinary/pelvic pain and obstructive and/or irritative voiding symptoms. Sexual dysfunction and psychological symptoms are frequently added to these symptoms. We investigated the relationship between sexual functions, lower urinary system symptoms, and asymptomatic histological prostatitis detected on transrectal ultrasound-guided (TRUS) biopsy performed with the indication of high PSA levels. Sixty cases compliant with the study criteria, among patients who underwent prostate biopsies between September 2014 and June 2015 with the indication of higher PSA levels, were included in the study. All patients were requested to complete IIEF-5 and IPSS forms one day previously. Based on histological analysis of biopsy materials, the patients were allocated into groups of BPH (simple BPH without histological prostatitis) (n:30) and histological chronic prostatitis (combination of BPH and histological prostatitis) (n:30). Mean age of the cases was 65.73±5.01 years (range, 56-75 years). PSA levels ranged between 4-15 ng/ml. A statistically significant intergroup difference was not found regarding mean age, BMIs, PSA levels, or incidence rates of hypertension and coronary artery disease (p>0.05). Prostate volumes of the HCP group were higher than those of the BPH group, with a statistically significant difference (p:0.001; p<0.01). Questionnaire forms of the patients included in the study were statistically evaluated, and the mean IPSS score of the HCP group was found to be higher when compared with that of the BPH group, with a statistically significant difference (p:0.016; p<0.05). However, the mean IIEF score of the BPH group was higher than that of the HCP group, with a statistically significant difference (p:0.039; p<0.05).
These findings suggested the presence of a correlation between chronic inflammation and lower urinary tract symptoms (LUTS). In addition, statistically significant lower IIEF values in patients with histological chronic prostatitis relative to those without suggested negative effects of even asymptomatic inflammation on sexual functions and mechanism of erection.
Soliman, Mahmoud M; Macky, Tamer A; Samir, M Khaled
2004-08-01
To assess the efficacy of lidocaine gel, bupivacaine drops, and benoxinate drops as topical anesthetic agents in cataract surgery. Kasr El-Aini Hospital, Cairo University, Cairo, Egypt. This prospective randomized study comprised 90 patients scheduled for routine cataract extraction. Patients were randomized into 3 groups of 30 each based on which anesthetic agent they received: lidocaine 2% gel, bupivacaine 0.5% drops, or benoxinate 0.4% drops. Subjective pain at application of the agent and intraoperatively was quantified by the patients using a verbal pain score (VPS) scale from 0 to 10. The duration of discomfort at application, duration of surgery, rate of supplemental sub-Tenon's anesthesia, and complications were recorded. The mean VPS at application was 2.97, 1.53, and 1.03 in the lidocaine, bupivacaine, and benoxinate groups, respectively; the VPS in the lidocaine group was statistically significantly higher than in the other 2 groups (P<.001). The mean duration of pain at application was 25 seconds, 14 seconds, and 6 seconds in the lidocaine, bupivacaine, and benoxinate groups, respectively, and was statistically significantly higher in the lidocaine group (P<.001). The mean VPS during surgery was 1.6, 4.1, and 7.1 in the lidocaine, bupivacaine, and benoxinate groups; the lidocaine group had a statistically significantly lower mean VPS than the other 2 groups (P<.001). The incidence of supplemental sub-Tenon's injection was 3.3%, 10.0%, and 73.3%, respectively, and was statistically significantly lower in the lidocaine and bupivacaine groups than in the benoxinate group (P<.001). The patients' overall satisfaction was statistically significantly higher in the lidocaine and bupivacaine groups than in the benoxinate group (93.3%, 83.3%, and 33.3%, respectively) (P<.001). Three patients in the lidocaine group had corneal haze at the time of surgery, which was not statistically significant (P>.1). 
Lidocaine gel was a better topical anesthetic agent than bupivacaine and benoxinate drops. Bupivacaine drops were effective in providing deep topical anesthesia.
New Statistics for Testing Differential Expression of Pathways from Microarray Data
NASA Astrophysics Data System (ADS)
Siu, Hoicheong; Dong, Hua; Jin, Li; Xiong, Momiao
Extracting biological meaning from microarray data is very important but remains a great challenge. Here, we developed three new statistics (the linear combination test, the quadratic test, and the de-correlation test) to identify differentially expressed pathways from gene expression profiles. We applied these statistics to two rheumatoid arthritis datasets and, notably, found three significant pathways and 275 genes in common between the two datasets. The pathways identified are meaningful for uncovering the disease mechanisms of rheumatoid arthritis, which implies that our statistics are a powerful tool for functional analysis of gene expression data.
Performing Inferential Statistics Prior to Data Collection
ERIC Educational Resources Information Center
Trafimow, David; MacDonald, Justin A.
2017-01-01
Typically, in education and psychology research, the investigator collects data and subsequently performs descriptive and inferential statistics. For example, a researcher might compute group means and use the null hypothesis significance testing procedure to draw conclusions about the populations from which the groups were drawn. We propose an…
Confidence Interval Coverage for Cohen's Effect Size Statistic
ERIC Educational Resources Information Center
Algina, James; Keselman, H. J.; Penfield, Randall D.
2006-01-01
Kelley compared three methods for setting a confidence interval (CI) around Cohen's standardized mean difference statistic: the noncentral-"t"-based, percentile (PERC) bootstrap, and bias-corrected and accelerated (BCA) bootstrap methods under three conditions of nonnormality, eight cases of sample size, and six cases of population…
Performance metrics for the assessment of satellite data products: an ocean color case study
Performance assessment of ocean color satellite data has generally relied on statistical metrics chosen for their common usage and the rationale for selecting certain metrics is infrequently explained. Commonly reported statistics based on mean squared errors, such as the coeffic...
Code of Federal Regulations, 2011 CFR
2011-10-01
... use of such record for a purpose compatible with the purpose for which it was collected, including... purposes ordered by a court referral to potential employers, and for security clearance; Statistical record means a record in a system of records maintained for statistical research or reporting purposes only and...
Code of Federal Regulations, 2010 CFR
2010-10-01
... control of the NTSB pursuant to Federal law or in connection with the transaction of public business... purposes ordered by a court referral to potential employers, and for security clearance; Statistical record means a record in a system of records maintained for statistical research or reporting purposes only and...
Code of Federal Regulations, 2010 CFR
2010-01-01
... individual or other identifying particular assigned to the individual; (l) Statistical record means a record in a system of records maintained for statistical research or reporting purposes only and not used in... when the Board is open for the conduct of Government business and does not include Saturdays, Sundays...
NASA Astrophysics Data System (ADS)
Ishizaki, N. N.; Dairaku, K.; Ueno, G.
2016-12-01
We have developed a statistical downscaling method for estimating probabilistic climate projections using multiple CMIP5 general circulation models (GCMs). A regression model was established so that the combination of GCM weights reflects the characteristics of the variation of observations at each grid point. Cross-validations were conducted to select GCMs and to evaluate the regression model while avoiding multicollinearity. Using a spatially high-resolution observation system, we conducted statistically downscaled probabilistic climate projections with 20-km horizontal grid spacing. Root mean squared errors for monthly mean surface air temperature and precipitation estimated by the regression method were the smallest compared with the results derived from a simple ensemble mean of GCMs and from a cumulative-distribution-function-based bias correction method. Projected changes in mean temperature and precipitation were basically similar to those of the simple ensemble mean of GCMs. Mean precipitation was generally projected to increase, associated with increased temperature and the consequent increased moisture content of the air. Weakening of the winter monsoon may contribute to precipitation decreases in some areas. A temperature increase in excess of 4 K is expected in most areas of Japan by the end of the 21st century under the RCP8.5 scenario. The estimated probability of monthly precipitation exceeding 300 mm would increase on the Pacific side during summer and on the Japan Sea side during winter. This probabilistic climate projection based on the statistical method can be expected to provide useful information for impact studies and risk assessments.
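The regression step described above can be illustrated with a minimal sketch: given an observed series and several GCM series at one grid point, least-squares weights are chosen so that the weighted combination best matches the observations. The data below are synthetic placeholders, not the study's 20-km dataset, and the single-grid-point setup is a simplification of the paper's method.

```python
import numpy as np

# Hypothetical example (not the study's data): monthly temperature series
# at one grid point from three GCMs, plus the observed series.
rng = np.random.default_rng(0)
obs = 15.0 + 8.0 * np.sin(np.linspace(0, 4 * np.pi, 48))           # "observations"
gcms = np.column_stack([obs + rng.normal(0, 1.0, 48),              # GCM 1: small noise
                        0.8 * obs + rng.normal(0, 2.0, 48),        # GCM 2: damped
                        obs + 3.0 + rng.normal(0, 1.5, 48)])       # GCM 3: warm bias

# Least-squares weights (with intercept) so the weighted GCM combination
# best matches the observed variation -- the core of the regression step.
A = np.column_stack([np.ones(len(obs)), gcms])
w, *_ = np.linalg.lstsq(A, obs, rcond=None)
pred = A @ w

rmse = np.sqrt(np.mean((pred - obs) ** 2))
ens_mean_rmse = np.sqrt(np.mean((gcms.mean(axis=1) - obs) ** 2))
print(f"regression RMSE = {rmse:.2f}, simple ensemble-mean RMSE = {ens_mean_rmse:.2f}")
```

Because the simple ensemble mean is itself one linear combination of the GCM columns, the least-squares fit can never do worse than it on the training data; cross-validation (as in the paper) is what guards against overfitting the weights.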
Techniques for estimating selected streamflow characteristics of rural unregulated streams in Ohio
Koltun, G.F.; Whitehead, Matthew T.
2002-01-01
This report provides equations for estimating mean annual streamflow, mean monthly streamflows, harmonic mean streamflow, and streamflow quartiles (the 25th-, 50th-, and 75th-percentile streamflows) as a function of selected basin characteristics for rural, unregulated streams in Ohio. The equations were developed from streamflow statistics and basin-characteristics data for as many as 219 active or discontinued streamflow-gaging stations on rural, unregulated streams in Ohio with 10 or more years of homogeneous daily streamflow record. Streamflow statistics and basin-characteristics data for the 219 stations are presented in this report. Simple equations (based on drainage area only) and best-fit equations (based on drainage area and at least two other basin characteristics) were developed by means of ordinary least-squares regression techniques. Application of the best-fit equations generally involves quantification of basin characteristics that require or are facilitated by use of a geographic information system. In contrast, the simple equations can be used with information that can be obtained without use of a geographic information system; however, the simple equations have larger prediction errors than the best-fit equations and exhibit geographic biases for most streamflow statistics. The best-fit equations should be used instead of the simple equations whenever possible.
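Streamflow-regionalization equations of the "simple" kind are typically power-law regressions fit by ordinary least squares in log space. The sketch below uses invented basin data, not values from this report, to show the fitting and prediction steps.

```python
import numpy as np

# Hypothetical gaged-basin data (NOT from the report): drainage areas in
# square miles and mean annual streamflows in cubic feet per second.
area = np.array([12.0, 45.0, 88.0, 150.0, 310.0, 620.0])
flow = np.array([14.0, 52.0, 95.0, 170.0, 330.0, 700.0])

# Fit log(Q) = c0 + b*log(A) by ordinary least squares, which corresponds
# to the power-law model Q = a * A^b commonly used for such equations.
X = np.column_stack([np.ones_like(area), np.log(area)])
coef, *_ = np.linalg.lstsq(X, np.log(flow), rcond=None)
a, b = np.exp(coef[0]), coef[1]

def predict_flow(drainage_area):
    """Estimate mean annual streamflow for an ungaged basin."""
    return a * drainage_area ** b

print(f"Q = {a:.2f} * A^{b:.2f}")
print(f"Estimated flow for a 200 mi^2 basin: {predict_flow(200.0):.0f} ft^3/s")
```

The best-fit equations in the report add further basin characteristics as extra regression columns; the fitting step is otherwise the same.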
Jeevanandan, Ganesh; Thomas, Eapen
2018-01-01
This study analyzed the volumetric change in the root canal space and the instrumentation time among hand files, hand files in reciprocating motion, and three rotary file systems in primary molars. One hundred primary mandibular molars were randomly allotted to one of five groups. Instrumentation was done using: Group I, nickel-titanium (Ni-Ti) hand files; Group II, Ni-Ti hand files in reciprocating motion; Group III, Race rotary files; Group IV, prodesign pediatric rotary files; and Group V, ProTaper rotary files. Mean volumetric changes were assessed using pre- and postoperative spiral computed tomography scans, and instrumentation time was recorded. Statistical analysis to assess intergroup comparisons of mean canal volume and instrumentation time was done using the Bonferroni-adjusted Mann-Whitney test and the Mann-Whitney test, respectively. Intergroup comparison of mean canal volume showed statistically significant differences between Groups II and IV, Groups III and V, and Groups IV and V. Intergroup comparison of mean instrumentation time showed statistically significant differences among all the groups except Groups IV and V. Among the various instrumentation techniques available, rotary instrumentation is considered the better technique for canal preparation in primary teeth.
Dirickson, Amanda; Stutzman, Sonja E; Alberts, Mark J; Novakovic, Roberta L; Stowe, Ann M; Beal, Claudia C; Goldberg, Mark P; Olson, DaiWai M
2017-12-01
Recent studies reveal deficiencies in stroke awareness and knowledge of risk factors among women. Existing stroke education interventions may not address common and sex-specific risk factors in the population with the highest stroke-related rate of mortality. This pilot study assessed the efficacy of a technology-enhanced, sex-specific educational program ("SISTERS") for women's knowledge of stroke. This was an experimental pretest-posttest design. The sample consisted of 150 women (mean age, 55 years) with at least 1 stroke risk factor. Participants were randomized to either the intervention (n = 75) or control (n = 75) group. Data were collected at baseline and at a 2-week posttest. There was no statistically significant difference in mean knowledge score (P = .67), mean confidence score (P = .77), or mean accuracy score (P = .75) between the intervention and control groups at posttest. Regression analysis revealed that older age was associated with lower knowledge scores (P < .001) and lower confidence scores (P < .001). After controlling for age, the SISTERS program was associated with a statistically significant difference in knowledge (P < .001) and confidence (P < .001). Although no change occurred overall, after controlling for age, there was a statistically significant benefit. Older women may have less comfort with technology and require consideration for cognitive differences.
Corneal endothelial cell density and morphology in Phramongkutklao Hospital
Sopapornamorn, Narumon; Lekskul, Manapon; Panichkul, Suthee
2008-01-01
Objective To describe the corneal endothelial density and morphology in patients of Phramongkutklao Hospital and the relationship between endothelial cell parameters and other factors. Methods Four hundred and four eyes of 202 volunteers were included. Noncontact specular microscopy was performed after taking a history and testing visual acuity, measuring intraocular pressure, performing Schirmer’s test, and a routine eye examination with a slit lamp microscope. The studied parameters included mean endothelial cell density (MCD), coefficient of variation (CV), and percentage of hexagonality. Results The mean age of the volunteers was 45.73 years (range, 20 to 80 years). Their MCD (SD), mean percentage of CV (SD), and mean (SD) percentage of hexagonality were 2623.49 (325) cells/mm2, 39.43 (8.23)%, and 51.50 (10.99)%, respectively. MCD decreased significantly with age (p < 0.01). There was a significant difference in the percentage of CV between genders. No other statistically significant associations between the parameters and other factors were found. Conclusion The normative data for the corneal endothelium of Thai eyes indicated that MCD decreases significantly with age. Previous studies have reported no differences in MCD, percentage of CV, or percentage of hexagonality between genders; nevertheless, a significantly different percentage of CV between genders was found in this study. PMID:19668398
NASA Astrophysics Data System (ADS)
Natividad, Gina May R.; Cawiding, Olive R.; Addawe, Rizavel C.
2017-11-01
The increase in the merchandise exports of the country offers information about the Philippines' trading role within the global economy. Merchandise export statistics are used to monitor the country's overall production that is consumed overseas. This paper compares two models obtained by (a) clustering the commodity groups into two based on their proportional contribution to total exports, and (b) treating only the total exports. Different seasonal autoregressive integrated moving average (SARIMA) models were then developed for the clustered commodities and for the total exports based on the monthly merchandise exports of the Philippines from 2011 to 2016. The data set used in this study was retrieved from the Philippine Statistics Authority (PSA), the central statistical authority in the country responsible for primary data collection. A test for the significance of the difference between means at the 0.05 level was then performed on the forecasts produced. The result indicates that there is a significant difference between the means of the forecasts of the two models. Moreover, a comparison of the root mean square error (RMSE) and mean absolute error (MAE) of the models found that the models used for the clustered groups outperform the model for the total exports.
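RMSE and MAE, the two accuracy measures used in the comparison, can be computed directly from a forecast series and the corresponding actuals. The values below are illustrative placeholders, not the Philippine export figures.

```python
import numpy as np

# Hypothetical forecast comparison (illustrative values only): actual
# monthly exports and forecasts from two competing SARIMA-style models.
actual = np.array([5.1, 4.8, 5.6, 5.9, 5.4, 6.2])
model_clustered = np.array([5.0, 4.9, 5.5, 6.0, 5.3, 6.1])   # sum of per-cluster forecasts
model_total = np.array([4.7, 5.2, 5.1, 6.3, 5.8, 5.7])       # single total-exports model

def rmse(y, yhat):
    """Root mean square error: penalizes large misses more heavily."""
    return np.sqrt(np.mean((y - yhat) ** 2))

def mae(y, yhat):
    """Mean absolute error: average miss in the data's own units."""
    return np.mean(np.abs(y - yhat))

for name, f in [("clustered", model_clustered), ("total", model_total)]:
    print(f"{name}: RMSE = {rmse(actual, f):.3f}, MAE = {mae(actual, f):.3f}")
```

A lower RMSE and MAE for the clustered model, as in this toy setup, is the pattern the paper reports for its clustered commodity groups.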
NASA Astrophysics Data System (ADS)
Graham, Wendy; Destouni, Georgia; Demmy, George; Foussereau, Xavier
1998-07-01
The methodology developed in Destouni and Graham [Destouni, G., Graham, W.D., 1997. The influence of observation method on local concentration statistics in the subsurface. Water Resour. Res. 33 (4) 663-676.] for predicting locally measured concentration statistics for solute transport in heterogeneous porous media under saturated flow conditions is applied to the prediction of conservative nonreactive solute transport in the vadose zone where observations are obtained by soil coring. Exact analytical solutions are developed for both the mean and variance of solute concentrations measured in discrete soil cores using a simplified physical model for vadose-zone flow and solute transport. Theoretical results show that while the ensemble mean concentration is relatively insensitive to the length-scale of the measurement, predictions of the concentration variance are significantly impacted by the sampling interval. Results also show that accounting for vertical heterogeneity in the soil profile results in significantly less spreading in the mean and variance of the measured solute breakthrough curves, indicating that it is important to account for vertical heterogeneity even for relatively small travel distances. Model predictions for both the mean and variance of locally measured solute concentration, based on independently estimated model parameters, agree well with data from a field tracer test conducted in Manatee County, Florida.
Calculating stage duration statistics in multistage diseases.
Komarova, Natalia L; Thalhauser, Craig J
2011-01-01
Many human diseases are characterized by multiple stages of progression. While the typical sequence of disease progression can be identified, there may be large individual variations among patients. Identifying mean stage durations and their variations is critical for the statistical hypothesis testing needed to determine whether a treatment is having a significant effect on progression, or whether a new therapy is delaying progression through a multistage disease. In this paper we focus on two methods for extracting stage duration statistics from longitudinal datasets: an extension of the linear regression technique, and a counting algorithm. Both are non-iterative, non-parametric and computationally cheap methods, which makes them invaluable tools for studying the epidemiology of diseases, with a goal of identifying different patterns of progression by using bioinformatics methodologies. Here we show that the regression method performs well for calculating the mean stage durations under a wide variety of assumptions; however, its generalization to variance calculations fails under realistic assumptions about the data collection procedure. The counting method, on the other hand, yields reliable estimates of both the means and variances of stage durations. Applications to Alzheimer disease progression are discussed.
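A counting-style estimator can be sketched as follows, under the simplifying assumptions that visits are equally spaced and that a stage is tallied only once the patient has left it. This is an illustration of the general idea with invented data, not the authors' exact algorithm.

```python
import statistics

# Hypothetical longitudinal data (illustrative only): each patient's
# disease stage recorded at yearly visits; stages are 1, 2, 3, ...
visits = {
    "patient_a": [1, 1, 1, 2, 2, 3, 3, 3],
    "patient_b": [1, 1, 2, 2, 2, 2, 3],
    "patient_c": [1, 1, 1, 1, 2, 2, 3, 3],
}
interval_years = 1.0

# Counting-style estimate: time spent in a stage is approximated by the
# number of visits recorded in that stage times the visit interval.
# Only stages the patient has already left are counted (fully observed).
durations = {}
for stages in visits.values():
    last = stages[-1]
    for s in set(stages):
        if s != last:  # completed stages only
            durations.setdefault(s, []).append(stages.count(s) * interval_years)

for stage in sorted(durations):
    d = durations[stage]
    print(f"stage {stage}: mean = {statistics.mean(d):.2f} y, "
          f"variance = {statistics.pvariance(d):.2f} y^2")
```

The per-stage duration samples collected this way feed directly into the mean and variance estimates the paper uses for hypothesis testing.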
Long-term changes (1980-2003) in total ozone time series over Northern Hemisphere midlatitudes
NASA Astrophysics Data System (ADS)
Białek, Małgorzata
2006-03-01
Long-term changes in total ozone time series for the Arosa, Belsk, Boulder and Sapporo stations are examined. For each station we analyze time series of the following statistical characteristics of the distribution of daily ozone data: the seasonal mean, standard deviation, maximum and minimum of total daily ozone values for all seasons. An iterative statistical model is proposed to estimate trends and long-term changes in the statistical distribution of the daily total ozone data. The trends are calculated for the period 1980-2003. We observe a lessening of the negative trends in the seasonal means compared with those calculated by the WMO for 1980-2000. We discuss the possibility of a change in the distribution shape of daily ozone data using the Kolmogorov-Smirnov test and by comparing trend values in the seasonal mean, standard deviation, maximum and minimum time series for the selected stations and seasons. A shift of the distribution toward lower values without a change in its shape is suggested, with the following exceptions: a spreading of the distribution toward lower values for Belsk during winter, and no decisive result for Sapporo and Boulder in summer.
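The two-sample Kolmogorov-Smirnov test used here to probe distribution changes can be sketched as follows. The implementation below uses the standard empirical-CDF statistic with the leading-term asymptotic p-value, and the ozone samples are synthetic placeholders, not the station records.

```python
import numpy as np

def ks_2samp(x, y):
    """Two-sample KS statistic and asymptotic p-value (leading-term approx)."""
    x, y = np.sort(x), np.sort(y)
    grid = np.concatenate([x, y])                 # the sup is attained at data points
    cdf_x = np.searchsorted(x, grid, side="right") / len(x)
    cdf_y = np.searchsorted(y, grid, side="right") / len(y)
    d = np.max(np.abs(cdf_x - cdf_y))             # max ECDF discrepancy
    n_eff = len(x) * len(y) / (len(x) + len(y))
    p = min(1.0, 2.0 * np.exp(-2.0 * n_eff * d * d))
    return d, p

# Hypothetical daily total-ozone samples (Dobson units) for two periods
# at one station: the later period is shifted toward lower values.
rng = np.random.default_rng(42)
period_a = rng.normal(340.0, 25.0, 500)
period_b = rng.normal(330.0, 25.0, 500)

d, p = ks_2samp(period_a, period_b)
print(f"KS statistic = {d:.3f}, p = {p:.4f}")
```

A significant KS result flags any difference between the two distributions, including a pure shift, which is why the paper also compares trends in the mean, standard deviation, maximum and minimum to distinguish a shift from a change of shape.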
The Ultrachopper tip: a wound temperature study.
Barlow, William R; Pettey, Jeff; Olson, Randall J
2013-12-01
To determine the thermal characteristics of the Ultrachopper tip in various viscosurgical substances. Experimental study. Not applicable. The Ultrachopper (Alcon, Inc) tip with the Infiniti (Alcon, Inc) handpiece was attached to a thermistor and placed in a test chamber filled with either an ophthalmic viscosurgical device (OVD) or balanced salt solution (BSS). The thermistor allowed continuous monitoring of the temperature from baseline and of the change that occurred over 60 seconds of continuous run time. The mean maximum temperature in each OVD exceeded 50°C over the first 25 seconds of continuous run time. The mean maximum temperature was statistically significantly higher with all OVDs (p < 0.0001) when compared with BSS. A small but statistically significant difference in mean maximum temperature was shown between Healon 5 (AMO, Inc) and Viscoat (Alcon, Inc) (p < 0.05). The linear increase in temperature was statistically significantly different with all OVDs compared with BSS (p < 0.0001). The thermal properties of the Ultrachopper tip demonstrate a heat-generating capacity that reaches published thresholds for wound-burn risk. Copyright © 2013 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
Design of off-statistics axial-flow fans by means of vortex law optimization
NASA Astrophysics Data System (ADS)
Lazari, Andrea; Cattanei, Andrea
2014-12-01
Off-statistics input data sets are common in axial-flow fan design and may easily result in some violation of the requirements of a good aerodynamic blade design. In order to circumvent this problem, in the present paper, a solution to the radial equilibrium equation is found which minimizes the outlet kinetic energy and fulfills the aerodynamic constraints, thus ensuring that the resulting blade has acceptable aerodynamic performance. The presented method is based on the optimization of a three-parameter vortex law and of the meridional channel size. The aerodynamic quantities to be employed as constraints are identified and suitable ranges of variation for them are proposed. The method is validated by means of a design with critical input data values and a CFD analysis. Then, by means of systematic computations with different input data sets, correlations and charts are obtained which are analogous to classic correlations based on statistical investigations of existing machines. These new correlations help size a fan of given characteristics as well as study the feasibility of a given design.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Planning Organization means that organization required by the Department of Transportation, and designated... planning provisions in a Standard Metropolitan Statistical Area. Model Energy Code, 1993, including Errata, means the model building code published by the Council of American Building Officials, which is...
Statistical analogues of thermodynamic extremum principles
NASA Astrophysics Data System (ADS)
Ramshaw, John D.
2018-05-01
As shown by Jaynes, the canonical and grand canonical probability distributions of equilibrium statistical mechanics can be simply derived from the principle of maximum entropy, in which the statistical entropy S = -k_B ∑_i p_i log p_i is maximised subject to constraints on the mean values of the energy E and/or number of particles N in a system of fixed volume V. The Lagrange multipliers associated with those constraints are then found to be simply related to the temperature T and chemical potential μ. Here we show that the constrained maximisation of S is equivalent to, and can therefore be replaced by, the essentially unconstrained minimisation of the obvious statistical analogues of the Helmholtz free energy F = E - TS and the grand potential J = F - μN. Those minimisations are more easily performed than the maximisation of S because they formally eliminate the constraints on the mean values of E and N and their associated Lagrange multipliers. This procedure significantly simplifies the derivation of the canonical and grand canonical probability distributions, and shows that the well-known extremum principles for the various thermodynamic potentials possess natural statistical analogues which are equivalent to the constrained maximisation of S.
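The constrained maximisation described here follows the standard Lagrange-multiplier route; a textbook-style sketch of the canonical case (not taken verbatim from the paper) is:

```latex
% Maximise S subject to normalisation and a mean-energy constraint,
% introducing Lagrange multipliers \lambda_0 and \beta:
\mathcal{L} = -k_{\mathrm{B}} \sum_i p_i \ln p_i
  - \lambda_0 \Big( \sum_i p_i - 1 \Big)
  - k_{\mathrm{B}} \beta \Big( \sum_i p_i E_i - E \Big)
% Stationarity \partial \mathcal{L} / \partial p_i = 0 gives
% \ln p_i = -1 - \lambda_0 / k_{\mathrm{B}} - \beta E_i, i.e. the
% canonical distribution, with \beta = 1 / (k_{\mathrm{B}} T):
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}
% The paper's point: the same p_i minimises the statistical analogue of
% F = E - TS with no explicit constraint on the mean energy.
```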
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simonen, F.A.; Khaleel, M.A.
This paper describes a statistical evaluation of the through-thickness copper variation for welds in reactor pressure vessels, and reviews the historical basis for the static and arrest fracture toughness (K_Ic and K_Ia) equations used in the VISA-II code. Copper variability in welds is due to fabrication procedures, with copper contents being randomly distributed and variable from one location to another through the thickness of the vessel. The VISA-II procedure of sampling the copper content from a statistical distribution for every 6.35- to 12.7-mm (1/4- to 1/2-in.) layer through the thickness was found to be consistent with the statistical observations. However, the parameters of the VISA-II distribution and statistical limits required further investigation. Copper contents at a few locations through the thickness were found to exceed the 0.4% upper limit of the VISA-II code. The data also suggest that the mean copper content varies systematically through the thickness. While the assumption of normality is not clearly supported by the available data, a statistical evaluation based on all the available data yields means and standard deviations within the VISA-II code limits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
von Storch, H.; Zorita, E.; Cubasch, U.
A statistical strategy to deduce regional-scale features from climate general circulation model (GCM) simulations has been designed and tested. The main idea is to interrelate the characteristic patterns of observed simultaneous variations of regional climate parameters and of large-scale atmospheric flow using the canonical correlation technique. The large-scale North Atlantic sea level pressure (SLP) is related to the regional, variable, winter (DJF) mean Iberian Peninsula rainfall. The skill of the resulting statistical model is shown by reproducing, to a good approximation, the winter mean Iberian rainfall from 1900 to the present from the observed North Atlantic mean SLP distributions. It is shown that this observed relationship between the two variables is not well reproduced in the output of a general circulation model (GCM). The implications for Iberian rainfall changes as the response to increasing atmospheric greenhouse-gas concentrations simulated by two GCM experiments are examined with the proposed statistical model. In an instantaneous "2×CO2" doubling experiment, using the simulated change of the mean North Atlantic SLP field to predict Iberian rainfall yields an insignificant increase of area-averaged rainfall of 1 mm/month, with maximum values of 4 mm/month in the northwest of the peninsula. In contrast, for the four GCM grid points representing the Iberian Peninsula, the change is -10 mm/month, with a minimum of -19 mm/month in the southwest. In the second experiment, with the IPCC scenario A ("business as usual") increase of CO2, the statistical-model results partially differ from the directly simulated rainfall changes: over the experimental range of 100 years, the area-averaged rainfall decreases by 7 mm/month (statistical model) and by 9 mm/month (GCM); at the same time the amplitude of the interdecadal variability is quite different. 17 refs., 10 figs.
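The canonical correlation technique at the heart of this strategy can be sketched numerically: each data set is reduced to an orthonormal basis (here via SVD), and the canonical correlations are the singular values of the cross-product of the two bases. The "SLP" and "rainfall" matrices below are synthetic stand-ins sharing one common mode, not the study's data.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between data matrices X and Y
    (rows = time samples, columns = variables), via SVD bases."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # Orthonormal bases for the column spaces (equivalent to whitening
    # each dataset, assuming full column rank).
    Ux = np.linalg.svd(X, full_matrices=False)[0]
    Uy = np.linalg.svd(Y, full_matrices=False)[0]
    # Canonical correlations are the singular values of Ux^T Uy.
    return np.linalg.svd(Ux.T @ Uy, compute_uv=False)

# Hypothetical data (illustrative only): 60 "winters" of a 4-variable
# large-scale index and a 3-variable regional-rainfall vector that
# share one common mode of variability.
rng = np.random.default_rng(1)
common = rng.normal(size=60)
X = np.outer(common, [1.0, -0.5, 0.3, 0.8]) + rng.normal(0, 0.3, (60, 4))
Y = np.outer(common, [0.7, 1.2, -0.4]) + rng.normal(0, 0.3, (60, 3))

corrs = canonical_correlations(X, Y)
print("canonical correlations:", np.round(corrs, 3))
```

The leading canonical pair plays the role of the shared SLP-rainfall pattern the paper exploits for downscaling; trailing correlations near zero correspond to noise.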
Yung, Emmanuel; Wong, Michael; Williams, Haddie; Mache, Kyle
2014-08-01
Randomized clinical trial. Objectives: To compare the blood pressure (BP) and heart rate (HR) response of healthy volunteers to posteriorly directed (anterior-to-posterior [AP]) pressure applied to the cervical spine versus placebo. Manual therapists employ cervical spine AP mobilizations for various cervical-shoulder pain conditions; however, there is a paucity of literature describing the procedure, its cardiovascular response, and its safety profile. Thirty-nine (25 female) healthy participants (mean ± SD age, 24.7 ± 1.9 years) were randomly assigned to 1 of 2 groups. Group 1 received a placebo, consisting of light touch applied to the right C6 costal process. Group 2 received AP pressure at the same location. Blood pressure and HR were measured prior to, during, and after the application of AP pressure. One-way analysis of variance and paired-difference statistics were used for data analysis. There was no statistically significant difference between groups for mean systolic BP, mean diastolic BP, or mean HR (P>.05) at any time point. Within-group comparisons indicated statistically significant differences between baseline and post-AP pressure HR (-2.8 bpm; 95% confidence interval: -4.6, -1.1) and between baseline and post-AP pressure systolic BP (-2.4 mmHg; 95% confidence interval: -3.7, -1.0) in the AP group, and between baseline and postplacebo systolic BP (-2.6 mmHg; 95% confidence interval: -4.2, -1.0) in the placebo group. No participants reported any adverse reactions or side effects within 24 hours of testing. AP pressure caused a statistically significant physiologic response that resulted in a minor drop in HR (without causing asystole or vasodepression) after the procedure, whereas this cardiovascular change did not occur in the placebo group. Within both groups, there was a small but statistically significant reduction in systolic BP following the procedure.
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
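Welch's heteroscedastic one-way statistic, which the robust variant applies to trimmed means and Winsorized variances instead of ordinary means and variances, can be sketched as follows. The group data are invented for illustration, and for brevity the sketch computes only the statistic and its degrees of freedom; the p-value would come from the F(df1, df2) distribution.

```python
import numpy as np

def welch_anova(groups):
    """Welch's one-way test statistic and degrees of freedom for groups
    with unequal variances. The robust variant in the paper would swap
    in trimmed means and Winsorized variances with adjusted counts."""
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                              # precision weights
    mw = np.sum(w * m) / np.sum(w)         # weighted grand mean
    lam = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    a = np.sum(w * (m - mw) ** 2) / (k - 1)
    f = a / (1 + 2 * (k - 2) / (k ** 2 - 1) * lam)
    df2 = (k ** 2 - 1) / (3 * lam)
    return f, k - 1, df2

# Hypothetical groups with clearly unequal variances (illustrative data).
g1 = np.array([4.1, 5.0, 4.6, 4.9, 5.2, 4.4])
g2 = np.array([6.3, 7.9, 5.1, 8.4, 6.0, 7.2])
g3 = np.array([5.0, 5.3, 4.8, 5.1, 5.2, 4.9])

f, df1, df2 = welch_anova([g1, g2, g3])
print(f"Welch F = {f:.2f} on ({df1}, {df2:.1f}) df")
```

The parametric bootstrap alternative studied in the paper instead resamples group means and variances from normal and chi-square distributions under the null and compares the observed statistic to that simulated reference distribution.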
Impact of Major Pulmonary Resections on Right Ventricular Function: Early Postoperative Changes.
Elrakhawy, Hany M; Alassal, Mohamed A; Shaalan, Ayman M; Awad, Ahmed A; Sayed, Sameh; Saffan, Mohammad M
2018-01-15
Right ventricular (RV) dysfunction after pulmonary resection in the early postoperative period is documented by reduced RV ejection fraction and increased RV end-diastolic volume index. Supraventricular arrhythmia, particularly atrial fibrillation, is common after pulmonary resection. RV assessment can be done by non-invasive methods and/or invasive approaches such as right cardiac catheterization. Incorporation of a rapid-response thermistor into a pulmonary artery catheter permits continuous measurement of cardiac output, right ventricular ejection fraction, and right ventricular end-diastolic volume. It can also be used for right atrial and right ventricular pacing, and for measuring right-sided pressures, including pulmonary capillary wedge pressure. This study included 178 patients who underwent major pulmonary resections: 36 who underwent pneumonectomy, assigned to group (I), and 142 who underwent lobectomy, assigned to group (II). The study was conducted at the cardiothoracic surgery department of Benha University hospital in Egypt; the patients enrolled were operated on from February 2012 to February 2016. A rapid-response thermistor pulmonary artery catheter was inserted via the right internal jugular vein. Preoperatively the following were recorded: central venous pressure, mean pulmonary artery pressure, pulmonary capillary wedge pressure, cardiac output, and right ventricular ejection fraction and volumes. The same parameters were collected at fixed time intervals of 3, 6, 12, 24, and 48 hours postoperatively. For group (I): there were no statistically significant changes between the preoperative and postoperative records in central venous pressure and mean arterial pressure, and no statistically significant changes between the preoperative and the 12-, 24-, and 48-hour postoperative records for cardiac index, whereas the 3- and 6-hour postoperative records showed significant changes.
There were statistically significant changes between the preoperative and all postoperative records for heart rate, mean pulmonary artery pressure, pulmonary capillary wedge pressure, pulmonary vascular resistance, right ventricular ejection fraction, and right ventricular end-diastolic volume index. For group II, there were no statistically significant changes between the preoperative and any postoperative records for central venous pressure, mean arterial pressure, or cardiac index, but there were statistically significant changes in all postoperative records for heart rate, mean pulmonary artery pressure, pulmonary capillary wedge pressure, pulmonary vascular resistance, right ventricular ejection fraction, and right ventricular end-diastolic volume index. Between the two groups, all postoperative records differed significantly for heart rate, mean pulmonary artery pressure, pulmonary capillary wedge pressure, pulmonary vascular resistance, right ventricular ejection fraction, and right ventricular end-diastolic volume index. Right ventricular dysfunction occurs early after major pulmonary resection and is caused by increased right ventricular afterload; it is more pronounced after pneumonectomy than after lobectomy. Heart rate, mean pulmonary artery pressure, pulmonary capillary wedge pressure, pulmonary vascular resistance, right ventricular ejection fraction, and right ventricular end-diastolic volume index are significantly affected by pulmonary resection.
Balancing Treatment and Control Groups in Quasi-Experiments: An Introduction to Propensity Scoring
ERIC Educational Resources Information Center
Connelly, Brian S.; Sackett, Paul R.; Waters, Shonna D.
2013-01-01
Organizational and applied sciences have long struggled with improving causal inference in quasi-experiments. We introduce organizational researchers to propensity scoring, a statistical technique that has become popular in other applied sciences as a means for improving internal validity. Propensity scoring statistically models how individuals in…
Student Understanding of Taylor Series Expansions in Statistical Mechanics
ERIC Educational Resources Information Center
Smith, Trevor I.; Thompson, John R.; Mountcastle, Donald B.
2013-01-01
One goal of physics instruction is to have students learn to make physical meaning of specific mathematical expressions, concepts, and procedures in different physical settings. As part of research investigating student learning in statistical physics, we are developing curriculum materials that guide students through a derivation of the Boltzmann…
Most analyses of daily time series epidemiology data relate mortality or morbidity counts to PM and other air pollutants by means of single-outcome regression models using multiple predictors, without taking into account the complex statistical structure of the predictor variable...
ERIC Educational Resources Information Center
Tryon, Warren W.; Lewis, Charles
2009-01-01
Tryon presented a graphic inferential confidence interval (ICI) approach to analyzing two independent and dependent means for statistical difference, equivalence, replication, indeterminacy, and trivial difference. Tryon and Lewis corrected the reduction factor used to adjust descriptive confidence intervals (DCIs) to create ICIs and introduced…
Code of Federal Regulations, 2010 CFR
2010-10-01
... Officer refers to the individual designated to process requests and handle various other matters relating... finger or voice print or a photograph. Statistical Record means a record in a system of records maintained for statistical research or reporting purposes only and not used in whole or in part in making any...
Analysis of Variance with Summary Statistics in Microsoft® Excel®
ERIC Educational Resources Information Center
Larson, David A.; Hsu, Ko-Cheng
2010-01-01
Students regularly are asked to solve Single Factor Analysis of Variance problems given only the sample summary statistics (number of observations per category, category means, and corresponding category standard deviations). Most undergraduate students today use Excel for data analysis of this type. However, Excel, like all other statistical…
Measuring Skewness: A Forgotten Statistic?
ERIC Educational Resources Information Center
Doane, David P.; Seward, Lori E.
2011-01-01
This paper discusses common approaches to presenting the topic of skewness in the classroom, and explains why students need to know how to measure it. Two skewness statistics are examined: the Fisher-Pearson standardized third moment coefficient, and the Pearson 2 coefficient that compares the mean and median. The former is reported in statistical…
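The two skewness statistics the paper contrasts can be computed directly from their textbook definitions; a minimal Python sketch, with an invented right-skewed sample for illustration:

```python
import statistics

def fisher_pearson_skewness(xs):
    """Standardized third moment: g1 = m3 / m2**1.5, with central moments m2, m3."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def pearson_2_skewness(xs):
    """Pearson's second coefficient: 3 * (mean - median) / s."""
    return 3 * (statistics.fmean(xs) - statistics.median(xs)) / statistics.stdev(xs)

data = [1, 2, 2, 3, 3, 3, 4, 10]  # invented sample with a long right tail
```

Both measures come out positive for this sample, signaling right skew, though they are on different scales and need not agree in magnitude.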
The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques
ERIC Educational Resources Information Center
Menil, Violeta C.
2005-01-01
In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…
Pang, Jingxiang; Fu, Jialei; Yang, Meina; Zhao, Xiaolei; van Wijk, Eduard; Wang, Mei; Fan, Hua; Han, Jinxiang
2016-03-01
In the practice and principle of Chinese medicine, herbal materials are classified according to their therapeutic properties. 'Cold' and 'heat' are the most important classes of Chinese medicinal herbs according to the theory of traditional Chinese medicine (TCM). In this work, delayed luminescence (DL) was measured for different samples of Chinese medicinal herbs using a sensitive photomultiplier detection system. A comparison of DL parameters, including mean intensity and statistical entropy, was undertaken to discriminate between the 'cold' and 'heat' properties of Chinese medicinal herbs. The results suggest that there are significant differences in mean intensity and statistical entropy, and that this method, combined with statistical analysis, may provide novel parameters for the characterization of Chinese medicinal herbs in relation to their energetic properties. Copyright © 2015 John Wiley & Sons, Ltd.
12 CFR 900.3 - Terms relating to other entities and concepts used throughout 12 CFR chapter IX.
Code of Federal Regulations, 2010 CFR
2010-01-01
... a credit rating organization regarded as a Nationally Recognized Statistical Rating Organization by... means the Office of Thrift Supervision. SBIC means a small business investment company formed pursuant to section 301 of the Small Business Investment Act (15 U.S.C. 681). SEC means the United States...
Gambling as a teaching aid in the introductory physics laboratory
NASA Astrophysics Data System (ADS)
Horodynski-Matsushigue, L. B.; Pascholati, P. R.; Vanin, V. R.; Dias, J. F.; Yoneama, M.-L.; Siqueira, P. T. D.; Amaku, M.; Duarte, J. L. M.
1998-07-01
Dice throwing is used to illustrate relevant concepts of the statistical theory of uncertainties, in particular the meaning of a limiting distribution, the standard deviation, and the standard deviation of the mean. It is an important part of a sequence of specially designed laboratory activities, developed for freshmen, at the Institute of Physics of the University of São Paulo. It is shown how this activity is employed within a constructive teaching approach, which aims at a growing understanding of measuring processes and of the fundamentals of correct statistical handling of experimental data.
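The same concepts the dice activity targets can be demonstrated in a quick simulation; a sketch, with throw counts and group sizes chosen arbitrarily for illustration:

```python
import random
import statistics

random.seed(7)

# One fair die has limiting mean 3.5 and standard deviation sqrt(35/12) ~ 1.71.
throws = [random.randint(1, 6) for _ in range(10_000)]
mean = statistics.fmean(throws)
sd = statistics.pstdev(throws)

# The standard deviation of the mean shrinks like sigma / sqrt(n):
# here, 200 groups of 50 throws each give sample means far less spread out
# than individual throws.
group_means = [statistics.fmean(random.choices(range(1, 7), k=50))
               for _ in range(200)]
sd_of_mean = statistics.pstdev(group_means)
```

With 10,000 throws the sample mean and standard deviation sit close to their limiting values, while the spread of the 50-throw group means is smaller by roughly a factor of sqrt(50).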
Laspas, Fotios; Tsantioti, Dimitra; Roussakis, Arkadios; Kritikos, Nikolaos; Efthimiadou, Roxani; Kehagias, Dimitrios; Andreou, John
2011-04-01
Computed tomography coronary angiography (CTCA) has been widely used since the introduction of 64-slice scanners and dual-source CT technology, but the relatively high radiation dose remains a major concern. The aim was to evaluate the relationship between radiation exposure and heart rate (HR) in dual-source CTCA. Data from 218 CTCA examinations, performed with a dual-source 64-slice scanner, were statistically evaluated. Effective radiation dose, expressed in mSv, was calculated as the product of the dose-length product (DLP) and a conversion coefficient for the chest (mSv = DLP × 0.017). The HR range and mean HR of each individual during CTCA, expressed in beats per minute (bpm), were also provided by the system. Statistical analysis of effective dose and heart rate data was performed using the Pearson correlation coefficient and two-sample t-test. Mean HR and effective dose were found to have a borderline positive relationship. Individuals with a mean HR >65 bpm were observed to receive a statistically significantly higher effective dose than those with a mean HR ≤65 bpm. Moreover, a strong correlation between effective dose and HR variability of more than 20 bpm was observed. Dual-source CT scanners are considered capable of providing diagnostic examinations even with high HR and arrhythmias. However, it is desirable to keep the mean heart rate below 65 bpm and heart rate fluctuation below 20 bpm in order to reduce radiation exposure.
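The dose conversion described in the abstract is a single multiplication; a minimal sketch, where the DLP value is invented purely for illustration:

```python
def effective_dose_msv(dlp_mgy_cm, k_chest=0.017):
    """Effective dose (mSv) = dose-length product (mGy*cm) times the chest
    conversion coefficient, as in the study's formula mSv = DLP x 0.017."""
    return dlp_mgy_cm * k_chest

# A hypothetical examination with DLP = 800 mGy*cm:
dose = effective_dose_msv(800)
```

For that hypothetical DLP the formula yields 13.6 mSv; in practice the coefficient depends on the anatomical region scanned.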
Munsawaengsub, Chokchai; Yimklib, Somkid; Nanthamongkolchai, Sutham; Apinanthavech, Suporn
2009-12-01
To study the effect of a self-esteem-promoting participatory learning program on emotional intelligence among early adolescents, a quasi-experimental study was conducted in grade 9 students from two schools in Bangbuathong district, Nonthaburi province. The experimental and comparative groups each consisted of the 34 students with the lowest emotional intelligence scores. The instruments were questionnaires, the Program to Develop Emotional Intelligence, and the Handbook of Emotional Intelligence Development. The experimental group attended 8 participatory learning activities over 4 weeks to develop emotional intelligence, while the comparative group received the handbook for self-study. The effectiveness of the program was assessed by a pre-test and by post-tests of emotional intelligence given immediately after the program and 4 weeks later. Implementation and evaluation took place during May 24-August 12, 2005. Data were analyzed by frequency, percentage, mean, standard deviation, Chi-square, independent-sample t-test, and paired-sample t-test. Before program implementation, the two groups showed no statistical difference in mean emotional intelligence score. After the intervention, the experimental group had a higher mean emotional intelligence score both immediately and 4 weeks later, with statistical significance (p = 0.001 and p < 0.001). At 4 weeks after the experiment, the mean score in the experimental group was higher than the score immediately after the experiment, with statistical significance (p < 0.001). The program to promote self-esteem through a participatory learning process could enhance emotional intelligence in early adolescents, and could be modified and implemented for early adolescents in the community.
Weaver, J. Curtis
2015-03-12
In 2013, the U.S. Geological Survey, in cooperation with the North Carolina Division of Water Resources, compiled updated low-flow characteristics and flow-duration statistics for selected continuous-record streamgages in North Carolina. The compilation of updated streamflow statistics provides regulators and planners with relevant hydrologic information reflective of the recent droughts, which can be used to better manage the quantity and quality of streams in North Carolina. Streamflow records available through the 2012 water year1 were used to determine the annual (based on climatic year2) and winter 7-day, 10-year (7Q10, W7Q10) low-flow discharges, the 30-day, 2-year (30Q2) low-flow discharge, and the 7-day, 2-year (7Q2) low-flow discharge. Consequently, streamflow records available through March 31, 2012 (or the 2011 climatic year) were used to determine the updated low-flow characteristics. Low-flow characteristics were published for 177 unregulated sites, 56 regulated sites, and 33 sites known or considered to be affected by varying degrees of minor regulation and (or) diversions upstream from the streamgages (266 sites total). The updated 7Q10 discharges were compared for 63 streamgages across North Carolina where (1) long-term streamflow record consisted of 30 or more climatic years of data available as of the 1998 climatic year, and (2) streamflows were not known to be regulated. The 7Q10 discharges did not change for 3 sites, whereas increases and decreases were noted at 5 and 55 sites, respectively. Positive changes (increases) ranged from 4.3 percent (site 362) to 34.1 percent (site 112) with a median of 13.2 percent. Negative percentage changes (decreases) ranged from –3.3 percent (site 514) to –80.0 percent (site 308) with a median of –22.2 percent. The median percentage change for all 63 streamgages was –18.4 percent. 
Streamflow statistics determined as a part of this compilation included minimum, mean, maximum, and flow-duration statistics of daily mean discharges for categorical periods. Flow-duration statistics based on the daily mean discharge records were compiled in this study for the 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles. Flow-duration statistics were determined for each complete water year of record at a streamgage as well as the available period of record (or selected periods if flows were regulated) and selected seasonal, monthly, and calendar day periods. In addition to the streamflow statistics compiled for each of the water years, the number of days the daily mean discharge was at or below the 10th percentile was summed for each water year as well as the number of events during the water year when streamflow was consistently at or below the 10th percentile. All low-flow characteristics for the streamgages were added into the StreamStatsDB, which is a database accessible to users through the recently released USGS StreamStats application for North Carolina. The minimum, mean, maximum, and flow-duration statistics of daily mean discharges based on the available (or selected if regulated flows) period of record were updated in the North Carolina StreamStatsDB. However, for the selected seasonal, monthly, calendar day, and annual water year periods, tab-delimited American Standard Code for Information Interchange (ASCII) tables of the streamflow statistics are available online to users from a link provided in the StreamStats application.
1The annual period from October 1 through September 30, designated by the year in which the period ends.
2The annual period from April 1 through March 31, designated by the year in which the period begins.
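The flow-duration percentiles and the count of low-flow days and events described above can be sketched from a daily record. This is a simplified illustration on synthetic data, using a nearest-rank percentile convention; it is not the USGS's exact procedure:

```python
def flow_duration_percentiles(daily_q, pcts=(5, 10, 25, 50, 75, 90, 95)):
    """Discharge at selected percentiles of the sorted daily mean flows
    (nearest-rank convention; agencies use various interpolation rules)."""
    qs = sorted(daily_q)
    n = len(qs)
    return {p: qs[min(n - 1, int(p / 100 * n))] for p in pcts}

def low_flow_days_and_events(daily_q, threshold):
    """Number of days at or below a threshold, and the number of separate
    runs (events) of consecutive such days."""
    days = sum(q <= threshold for q in daily_q)
    events = sum(1 for i, q in enumerate(daily_q)
                 if q <= threshold and (i == 0 or daily_q[i - 1] > threshold))
    return days, events
```

For example, the series [5, 1, 1, 6, 1, 7] with a threshold of 1 contains three low-flow days grouped into two events.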
Mann, Michael P.; Rizzardo, Jule; Satkowski, Richard
2004-01-01
Accurate streamflow statistics are essential to water resource agencies involved in both science and decision-making. When long-term streamflow data are lacking at a site, estimation techniques are often employed to generate streamflow statistics. However, procedures for accurately estimating streamflow statistics often are lacking. When estimation procedures are developed, they often are not evaluated properly before being applied. Use of unevaluated or underevaluated flow-statistic estimation techniques can result in improper water-resources decision-making. The California State Water Resources Control Board (SWRCB) uses two key techniques, a modified rational equation and drainage basin area-ratio transfer, to estimate streamflow statistics at ungaged locations. These techniques have been implemented to varying degrees, but have not been formally evaluated. For estimating peak flows at the 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals, the SWRCB uses the U.S. Geological Survey's (USGS) regional peak-flow equations. In this study, done cooperatively by the USGS and SWRCB, the SWRCB estimated several flow statistics at 40 USGS streamflow gaging stations in the north coast region of California. The SWRCB estimates were made without reference to USGS flow data. The USGS used the streamflow data provided by the 40 stations to generate flow statistics that could be compared with SWRCB estimates for accuracy. While some SWRCB estimates compared favorably with USGS statistics, results were subject to varying degrees of error over the region. Flow-based estimation techniques generally performed better than rain-based methods, especially for estimation of December 15 to March 31 mean daily flows. The USGS peak-flow equations also performed well, but tended to underestimate peak flows. The USGS equations performed within reported error bounds, but will require updating in the future as peak-flow data sets grow larger.
Little correlation was discovered between estimation errors and geographic locations or various basin characteristics. However, for 25-percentile year mean-daily-flow estimates for December 15 to March 31, the greatest estimation errors were at east San Francisco Bay area stations with mean annual precipitation less than or equal to 30 inches, and estimated 2-year/24-hour rainfall intensity less than 3 inches.
Fernández, A P; Jaramillo, J; Jaramillo, M
2000-01-01
We compared the efficacy, predictability, and safety of photorefractive keratectomy (PRK) and laser in situ keratomileusis (LASIK) for the surgical correction of low and moderate myopia. A retrospective study was performed to evaluate uncorrected and spectacle-corrected visual acuity, and manifest refraction 1 year after PRK or LASIK. All procedures were done using an automatic microkeratome (Chiron Ophthalmic) and the Nidek EC-5000 excimer laser. PRK was performed in 75 eyes of 45 patients and LASIK in 133 eyes of 77 patients. Mean age was 32.8 years for PRK patients (range, 18 to 52 yr) and 29.6 years for LASIK patients (range, 18 to 49 yr). Mean preoperative spherical equivalent refraction was -3.28 D for PRK patients (range, -1.00 to -6.00 D) and -3.86 D for LASIK patients (range, -1.00 to -6.00 D). One year after surgery, mean spherical equivalent refraction for Group 1 (baseline, -1.00 to -3.00 D) PRK eyes was -0.18 +/- 0.61 D (range, -1.50 to +0.75 D) and for LASIK eyes, -0.08 +/- 0.61 D (range, -1.50 to +1.62 D), with no statistically significant difference. For Group 2 eyes (baseline, -3.25 to -6.00 D), mean spherical equivalent refraction for PRK eyes was -0.44 +/- 0.87 D (range, -2.00 to +2.12 D) and for LASIK eyes, -0.09 +/- 0.83 D (range, -1.50 to +1.75 D), with no statistically significant difference. The antilogarithm of the mean UCVA (antilogUCVA) in Group 1 for PRK was 0.79 +/- 0.21 (20/25) and for LASIK was 0.87 +/- 0.19 (20/23), with no statistically significant difference. The antilogUCVA in Group 2 for PRK eyes was 0.70 +/- 0.24 (20/28) and for LASIK eyes was 0.83 +/- 0.18 (20/24), with a statistically significant difference (0.7 vs. 0.83, P < .005). The percentage of eyes with a postoperative UCVA >20/40 in Group 1 for PRK was 91.5% (38 eyes) and for LASIK was 95% (50 eyes) (no statistically significant difference), and in Group 2 it was 82% (27 eyes) for PRK and 97.5% (78 eyes) for LASIK (statistically significant difference, P < .05).
PRK and LASIK with the Nidek EC-5000 excimer laser are effective and safe for correcting low to moderate myopia, but LASIK eyes showed better results for moderate myopia in terms of uncorrected visual acuity.
Basic statistics with Microsoft Excel: a review.
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-06-01
The scientific world is enriched daily with new knowledge, thanks to new technologies and continuous discoveries. Spreadsheet operations rest on mathematical functions that embody statistical concepts, particularly the mean, median, and mode, along with frequency and frequency distributions represented by histograms and other graphs. The aim of this study is to highlight the mathematical basis of the statistical models that govern spreadsheet operations in Microsoft Excel.
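The central-tendency and frequency functions the review covers have direct analogues outside Excel; a minimal Python sketch with invented data (the Excel function names in the comments are the usual counterparts):

```python
from collections import Counter
import statistics

data = [2, 3, 3, 5, 7, 7, 7, 9]        # invented sample
mean = statistics.mean(data)           # Excel: AVERAGE
median = statistics.median(data)       # Excel: MEDIAN
mode = statistics.mode(data)           # Excel: MODE.SNGL
freq = Counter(data)                   # Excel: FREQUENCY / histogram counts
```

Here the mean (5.375) exceeds the median (6 is the midpoint of the two middle values), and the mode is the most frequent value, 7.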
NASA Technical Reports Server (NTRS)
Silsby, Norman S
1955-01-01
Statistical measurements of contact conditions have been obtained, by means of a special photographic technique, for 478 landings of present-day transport airplanes made during routine daylight operations in clear air at the Washington National Airport. From the measurements, sinking speeds, rolling velocities, bank angles, and horizontal speeds at the instant before contact have been evaluated, and a limited statistical analysis of the results is reported.
Testing statistical self-similarity in the topology of river networks
Troutman, Brent M.; Mantilla, Ricardo; Gupta, Vijay K.
2010-01-01
Recent work has demonstrated that the topological properties of real river networks deviate significantly from predictions of Shreve's random model. At the same time the property of mean self-similarity postulated by Tokunaga's model is well supported by data. Recently, a new class of network model called random self-similar networks (RSN) that combines self-similarity and randomness has been introduced to replicate important topological features observed in real river networks. We investigate if the hypothesis of statistical self-similarity in the RSN model is supported by data on a set of 30 basins located across the continental United States that encompass a wide range of hydroclimatic variability. We demonstrate that the generators of the RSN model obey a geometric distribution, and self-similarity holds in a statistical sense in 26 of these 30 basins. The parameters describing the distribution of interior and exterior generators are tested to be statistically different and the difference is shown to produce the well-known Hack's law. The inter-basin variability of RSN parameters is found to be statistically significant. We also test generator dependence on two climatic indices, mean annual precipitation and radiative index of dryness. Some indication of climatic influence on the generators is detected, but this influence is not statistically significant with the sample size available. Finally, two key applications of the RSN model to hydrology and geomorphology are briefly discussed.
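The claim that RSN generators obey a geometric distribution can be checked with a maximum-likelihood fit; a sketch with invented generator counts (the support convention {0, 1, 2, ...} and the data are assumptions, not the authors' basins):

```python
import statistics

def geometric_mle_p(samples):
    """MLE of p for a geometric distribution on {0, 1, 2, ...}: p = 1 / (1 + mean)."""
    return 1 / (1 + statistics.fmean(samples))

def geometric_pmf(k, p):
    """P(X = k) = p * (1 - p)**k for k = 0, 1, 2, ..."""
    return p * (1 - p) ** k

gens = [0, 1, 0, 2, 1, 0, 0, 3, 1, 0]  # invented generator counts for one basin
p_hat = geometric_mle_p(gens)
expected = [geometric_pmf(k, p_hat) * len(gens) for k in range(4)]
```

The expected counts could then be compared against the observed frequencies with a goodness-of-fit test, which is in the spirit of the statistical testing the abstract describes.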
Statistical Estimation of Heterogeneities: A New Frontier in Well Testing
NASA Astrophysics Data System (ADS)
Neuman, S. P.; Guadagnini, A.; Illman, W. A.; Riva, M.; Vesselinov, V. V.
2001-12-01
Well-testing methods have traditionally relied on analytical solutions of groundwater flow equations in relatively simple domains, consisting of one or at most a few units having uniform hydraulic properties. Recently, attention has been shifting toward methods and solutions that would allow one to characterize subsurface heterogeneities in greater detail. On one hand, geostatistical inverse methods are being used to assess the spatial variability of parameters, such as permeability and porosity, on the basis of multiple cross-hole pressure interference tests. On the other hand, analytical solutions are being developed to describe the mean and variance (first and second statistical moments) of flow to a well in a randomly heterogeneous medium. Geostatistical inverse interpretation of cross-hole tests yields a smoothed but detailed "tomographic" image of how parameters actually vary in three-dimensional space, together with corresponding measures of estimation uncertainty. Moment solutions may soon allow one to interpret well tests in terms of statistical parameters such as the mean and variance of log permeability, its spatial autocorrelation and statistical anisotropy. The idea of geostatistical cross-hole tomography is illustrated through pneumatic injection tests conducted in unsaturated fractured tuff at the Apache Leap Research Site near Superior, Arizona. The idea of using moment equations to interpret well-tests statistically is illustrated through a recently developed three-dimensional solution for steady state flow to a well in a bounded, randomly heterogeneous, statistically anisotropic aquifer.
Outcome of temporal lobe epilepsy surgery predicted by statistical parametric PET imaging.
Wong, C Y; Geller, E B; Chen, E Q; MacIntyre, W J; Morris, H H; Raja, S; Saha, G B; Lüders, H O; Cook, S A; Go, R T
1996-07-01
PET is useful in the presurgical evaluation of temporal lobe epilepsy. The purpose of this retrospective study was to assess the clinical use of statistical parametric imaging in predicting surgical outcome. Interictal 18FDG-PET scans in 17 patients with surgically treated temporal lobe epilepsy (group A: 13 seizure-free; group B: 4 not seizure-free at 6 months) were transformed into statistical parametric images, with each pixel representing a z-score value computed from the mean and s.d. of the count distribution in each individual patient, for both visual and quantitative analysis. Mean z-scores were significantly more negative in anterolateral (AL) and mesial (M) regions on the operated side than the nonoperated side in group A (AL: p < 0.00005, M: p = 0.0097), but not in group B (AL: p = 0.46, M: p = 0.08). Statistical parametric imaging correctly lateralized 16 of 17 patients. Only the AL region, however, was significant in predicting surgical outcome (F = 29.03, p < 0.00005). Using a cut-off z-score value of -1.5, statistical parametric imaging correctly classified 92% of temporal lobes from group A and 88% of those from group B. The preliminary results indicate that statistical parametric imaging provides both clinically useful information for lateralization in temporal lobe epilepsy and a reliable predictive indicator of clinical outcome following surgical treatment.
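The per-pixel transformation the study describes is a z-score against the patient's own count distribution; a toy sketch with an invented count vector (real statistical parametric imaging works on volumetric data with smoothing, which is omitted here):

```python
import statistics

def z_score_image(pixels):
    """Transform pixel counts to z-scores using the image's own mean and s.d."""
    mu = statistics.fmean(pixels)
    sd = statistics.pstdev(pixels)
    return [(p - mu) / sd for p in pixels]

counts = [100, 95, 98, 102, 60, 97]   # invented counts; index 4 is hypometabolic
z = z_score_image(counts)
flagged = [i for i, v in enumerate(z) if v < -1.5]   # the study's -1.5 cut-off
```

With these toy counts only the clearly hypometabolic pixel falls below the -1.5 cut-off, mirroring how the cut-off isolates regions of reduced uptake.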
Seabed mapping and characterization of sediment variability using the usSEABED data base
Goff, J.A.; Jenkins, C.J.; Jeffress, Williams S.
2008-01-01
We present a methodology for statistical analysis of randomly located marine sediment point data, and apply it to the US continental shelf portions of usSEABED mean grain size records. The usSEABED database, like many modern, large environmental datasets, is heterogeneous and interdisciplinary. We statistically test the database as a source of mean grain size data, and from it provide a first examination of regional seafloor sediment variability across the entire US continental shelf. Data derived from laboratory analyses ("extracted") and from word-based descriptions ("parsed") are treated separately, and they are compared statistically and deterministically. Data records are selected for spatial analysis by their location within sample regions: polygonal areas defined in ArcGIS chosen by geography, water depth, and data sufficiency. We derive isotropic, binned semivariograms from the data, and invert these for estimates of noise variance, field variance, and decorrelation distance. The highly erratic nature of the semivariograms is a result both of the random locations of the data and of the high level of data uncertainty (noise). This decorrelates the data covariance matrix for the inversion, and largely prevents robust estimation of the fractal dimension. Our comparison of the extracted and parsed mean grain size data demonstrates important differences between the two. In particular, extracted measurements generally produce finer mean grain sizes, lower noise variance, and lower field variance than parsed values. Such relationships can be used to derive a regionally dependent conversion factor between the two. Our analysis of sample regions on the US continental shelf revealed considerable geographic variability in the estimated statistical parameters of field variance and decorrelation distance. Some regional relationships are evident, and overall there is a tendency for field variance to be higher where the average mean grain size is finer grained. 
Surprisingly, parsed and extracted noise magnitudes correlate with each other, which may indicate that some portion of the data variability that we identify as "noise" is caused by real grain size variability at very short scales. Our analyses demonstrate that by applying a bias-correction proxy, usSEABED data can be used to generate reliable interpolated maps of regional mean grain size and sediment character.
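The isotropic binned semivariogram at the heart of the analysis can be sketched as follows. The toy points are invented, and this is the standard method-of-moments estimator rather than the authors' full processing (which also inverts for noise variance, field variance, and decorrelation distance):

```python
import math

def binned_semivariogram(points, bin_width, n_bins):
    """Isotropic empirical semivariogram: for each distance bin, the mean of
    0.5 * (z_i - z_j)**2 over point pairs whose separation falls in that bin.
    points is a list of (x, y, z) tuples at random locations."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for i, (x1, y1, z1) in enumerate(points):
        for x2, y2, z2 in points[i + 1:]:
            b = int(math.hypot(x2 - x1, y2 - y1) // bin_width)
            if b < n_bins:
                sums[b] += 0.5 * (z1 - z2) ** 2
                counts[b] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]
```

With sparse, noisy point data the binned values are erratic, which is exactly the difficulty the abstract notes when inverting semivariograms for statistical parameters.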
Low-flow statistics of selected streams in Chester County, Pennsylvania
Schreffler, Curtis L.
1998-01-01
Low-flow statistics for many streams in Chester County, Pa., were determined on the basis of data from 14 continuous-record streamflow stations in Chester County and data from 1 station in Maryland and 1 station in Delaware. The stations in Maryland and Delaware are on streams that drain large areas within Chester County. Streamflow data through the 1994 water year were used in the analyses. The low-flow statistics summarized are the 1Q10, 7Q10, 30Q10, and harmonic mean. Low-flow statistics were estimated at 34 partial-record stream sites throughout Chester County.
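The building blocks of the nQ10-style statistics named above are an n-day moving-average minimum and a harmonic mean; a sketch on a short invented series (obtaining the actual 1Q10/7Q10/30Q10 would additionally require fitting a frequency distribution to the annual minima to find the 10-year recurrence value):

```python
import statistics

def annual_n_day_minimum(daily_q, n):
    """Lowest n-day moving average of daily mean flows; the annual series of
    these minima underlies statistics such as 7Q10 (n = 7)."""
    return min(statistics.fmean(daily_q[i:i + n])
               for i in range(len(daily_q) - n + 1))

def harmonic_mean_flow(daily_q):
    """Harmonic mean of nonzero daily flows, a common low-flow statistic."""
    return statistics.harmonic_mean([q for q in daily_q if q > 0])
```

The harmonic mean weights low flows heavily, which is why it appears alongside the nQ10 statistics in low-flow summaries.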
Yu, Xiaojin; Liu, Pei; Min, Jie; Chen, Qiguang
2009-01-01
To explore the application of regression on order statistics (ROS) in estimating nondetects for food exposure assessment, ROS was applied to a cadmium residual data set from global food contaminant monitoring; the mean residual was estimated using SAS programming and compared with results from substitution methods. The results show that the ROS method clearly outperforms substitution methods, being more robust and more convenient for subsequent analysis. Regression on order statistics is worth adopting, but more effort should be devoted to the details of its application.
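The ROS idea can be sketched in a few lines for the simplest case of a single detection limit below all detected values: regress the logs of the detected values on normal quantiles of their plotting positions, impute the censored observations from the fitted line, and average. This is an illustrative simplification of robust ROS, not the authors' SAS implementation; the sample values are invented:

```python
import math
import statistics

def simple_ros_mean(detected, n_censored):
    """Robust ROS sketch for one detection limit lying below all detects:
    regress log(detected values) on normal quantiles of plotting positions,
    impute the censored values from the fitted line, then average everything."""
    n = len(detected) + n_censored
    pp = [(i + 1) / (n + 1) for i in range(n)]        # Weibull plotting positions
    q = statistics.NormalDist().inv_cdf               # standard normal quantile
    det_sorted = sorted(detected)
    xs = [q(p) for p in pp[n_censored:]]              # detects hold the top ranks
    ys = [math.log(v) for v in det_sorted]
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx                                   # least-squares line y = a + b*x
    imputed = [math.exp(a + b * q(p)) for p in pp[:n_censored]]
    return statistics.fmean(imputed + detected)
```

Unlike substitution (e.g. replacing nondetects with half the detection limit), the imputed values here follow the distribution implied by the detected data, which is the robustness advantage the abstract reports.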
A Streamflow Statistics (StreamStats) Web Application for Ohio
Koltun, G.F.; Kula, Stephanie P.; Puskas, Barry M.
2006-01-01
A StreamStats Web application was developed for Ohio that implements equations for estimating a variety of streamflow statistics including the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year peak streamflows, mean annual streamflow, mean monthly streamflows, harmonic mean streamflow, and 25th-, 50th-, and 75th-percentile streamflows. StreamStats is a Web-based geographic information system application designed to facilitate the estimation of streamflow statistics at ungaged locations on streams. StreamStats can also serve precomputed streamflow statistics determined from streamflow-gaging station data. The basic structure, use, and limitations of StreamStats are described in this report. To facilitate the level of automation required for Ohio's StreamStats application, the technique used by Koltun (2003) for computing main-channel slope was replaced with a new computationally robust technique. The new channel-slope characteristic, referred to as SL10-85, differed from the National Hydrography Dataset-based channel slope values (SL) reported by Koltun (2003) by an average of -28.3 percent, with the median change being -13.2 percent. In spite of the differences, the two slope measures are strongly correlated. The change in channel slope values resulting from the change in computational method necessitated revision of the full-model equations for flood-peak discharges originally presented by Koltun (2003). Average standard errors of prediction for the revised full-model equations presented in this report increased by a small amount over those reported by Koltun (2003), with increases ranging from 0.7 to 0.9 percent. Mean percentage changes in the revised regression and weighted flood-frequency estimates relative to regression and weighted estimates reported by Koltun (2003) were small, ranging from -0.72 to -0.25 percent and -0.22 to 0.07 percent, respectively.
López, Lydia M; Guerra, María Elena
2015-03-01
The aim of this study was to determine the caries rate and periodontal status in a sample of pregnant women with HIV+ infections from Puerto Rico. A pilot study was conducted on a cross-sectional convenience sample of 25 pregnant women with HIV+ infections from Puerto Rico who visit the CEMI clinic (Centro de Estudios Materno Infantil) at the University of Puerto Rico. The subjects were evaluated for caries, DMFT (D: decayed tooth; M: missing tooth due to caries; F: filled tooth) index, oral lesions associated with HIV+/AIDS, and periodontal disease parameters, with a Florida probe by a calibrated dentist on periodontal indexes such as bleeding on probing, CEJ (cemento-enamel junction), and pocket depth. Periodontal disease was classified as having 4 sites with pocket depth greater than 4 mm, and caries were identified following the Radike criteria. Data were statistically analyzed using SPSS (Statistical Package for the Social Sciences), and descriptive statistics were calculated. Mean DT (decayed teeth), MT (missing teeth due to caries), FT (filled teeth), and DMFT (decayed, missing, and filled teeth) were 4.8, 1.86, 5.3, and 12, respectively; mean sites of bleeding on probing = 12.06; mean sites with pocket depth > 4 mm = 6.95; and mean sites with loss of attachment greater than 4 mm = 7.66. Almost 50% of the patients had generalized chronic periodontitis. A 72% prevalence of periodontal disease was found. No oral lesions related to HIV+/AIDS were reported. CD4 count and viral load were statistically associated with bleeding on probing and severe signs of periodontal disease. High levels of dental disease were found in pregnant women with HIV+/AIDS infections from Puerto Rico, and these women were in need of substantial dental services.
Analysis of S-box in Image Encryption Using Root Mean Square Error Method
NASA Astrophysics Data System (ADS)
Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan
2012-07-01
The use of substitution boxes (S-boxes) in encryption applications has proven to be an effective nonlinear component in creating confusion and randomness. The S-box is evolving and many variants appear in the literature, which include the advanced encryption standard (AES) S-box, affine power affine (APA) S-box, Skipjack S-box, Gray S-box, Lui J S-box, residue prime number S-box, Xyi S-box, and S8 S-box. These S-boxes have algebraic and statistical properties which distinguish them from each other in terms of encryption strength. In some circumstances, the parameters from algebraic and statistical analysis yield results which do not provide clear evidence for distinguishing one S-box from another for application to a particular set of data. In image encryption applications, the use of S-boxes needs special care because the visual analysis and perception of a viewer can sometimes identify artifacts embedded in the image. In addition to the algebraic and statistical analyses already used for image encryption applications, we propose an application of the root mean square error technique, which further elaborates the results and enables the analyst to vividly distinguish between the performances of various S-boxes. While the use of root mean square error analysis in statistics has proven effective in determining the difference between original and processed data, its use in image encryption has shown promising results in estimating the strength of the encryption method. In this paper, we show the application of root mean square error analysis to S-box image encryption. The parameters from this analysis are used in determining the strength of S-boxes.
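The root mean square error between a plain image and its encrypted counterpart is a standard, easily computed quantity; a minimal sketch (illustrative only, using random arrays in place of the paper's S-box-encrypted images):

```python
import numpy as np

def rmse(original, processed):
    """Root mean square error between two equally sized images."""
    a = original.astype(np.float64)
    b = processed.astype(np.float64)
    return float(np.sqrt(np.mean((a - b) ** 2)))

rng = np.random.default_rng(0)
plain = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in plain image
cipher = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in encrypted image
print(rmse(plain, cipher))
```

In this context a larger RMSE between plaintext and ciphertext indicates that the S-box has pushed pixel values further from their originals, which is the sense in which the statistic ranks encryption strength.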
Yazawa, Hiroyuki; Takiguchi, Kaoru; Imaizumi, Karin; Wada, Marina; Ito, Fumihiro
2018-04-17
Three-dimensional (3D) laparoscopic surgical systems have been developed to account for the lack of depth perception, a known disadvantage of conventional 2-dimensional (2D) laparoscopy. In this study, we retrospectively compared the outcomes of total laparoscopic hysterectomy (TLH) with 3D versus conventional 2D laparoscopy. From November 2014, when we began using a 3D laparoscopic system at our hospital, to December 2015, 47 TLH procedures were performed using a 3D laparoscopic system (3D-TLH). The outcomes of 3D-TLH were compared with the outcomes of TLH using the conventional 2D laparoscopic system (2D-TLH) performed just before the introduction of the 3D system. The 3D-TLH group had a statistically significantly shorter mean operative time than the 2D-TLH group (119±20 vs. 137±20 min), whereas the mean weight of the resected uterus and mean intraoperative blood loss were not statistically different. When we compared the outcomes for 20 cases in each group, using the same energy sealing device in a short period of time, only mean operative time was statistically different between the 3D-TLH and 2D-TLH groups (113±19 vs. 133±21 min). During the observation period, there was one occurrence of postoperative peritonitis in the 2D-TLH group and one occurrence of vaginal cuff dehiscence in each group, which was not statistically different. The surgeon and assistant surgeons did not report any symptoms attributable to the 3D imaging system such as dizziness, eyestrain, nausea, and headache. Therefore, we conclude that the 3D laparoscopic system could be used safely and efficiently for TLH.
Mahabeer, S; Naidoo, C; Joubert, S M
1990-06-01
Plasma glucose, immunoreactive insulin (IRI) and C-peptide responses during oral glucose tolerance testing (OGTT) were evaluated in 10 non-obese women with polycystic ovarian disease (NOB-PCOD) and 10 obese women with polycystic ovarian disease (OB-PCOD). Mean plasma glucose response at 120 minutes in OB-PCOD showed impaired glucose tolerance. Also in this group, 1 patient had frank diabetes mellitus, whilst 3 other patients had impaired glucose tolerance; 1 NOB-PCOD patient had impaired glucose tolerance. Mean plasma glucose levels and mean incremental glucose areas were higher in the OB-PCOD group at all time intervals and reached statistical significance at 60 and 90 minutes. Mean plasma IRI levels were also higher in OB-PCOD at all time intervals, and reached statistically significantly higher levels at 0, 60 and 90 minutes. Mean serum C-peptide values were also higher at all time intervals in OB-PCOD. The relationship between acanthosis nigricans, obesity and PCOD was also analysed. It is evident from this study that obesity has a significant negative impact on the overall carbohydrate status in women with PCOD.
NASA Astrophysics Data System (ADS)
Wu, Qing; Luu, Quang-Hung; Tkalich, Pavel; Chen, Ge
2018-04-01
Having great impacts on human lives, global warming and the associated sea level rise are believed to be strongly linked to anthropogenic causes. A statistical approach offers a simple and yet conceptually verifiable combination of remotely connected climate variables and indices, including sea level and surface temperature. We propose an improved statistical reconstruction model based on the empirical dynamic control system, taking into account climate variability and deriving parameters from Monte Carlo cross-validation random experiments. For the historic data from 1880 to 2001, our model yielded higher correlations than other dynamic empirical models. The averaged root mean square errors are reduced in both reconstructed fields, namely, the global mean surface temperature (by 24-37%) and the global mean sea level (by 5-25%). Our model is also more robust, as it notably diminished the instability associated with varying initial values. Such results suggest that the model not only significantly enhances the global mean reconstructions of temperature and sea level but also may have potential to improve future projections.
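Monte Carlo cross-validation of the kind used to derive the model parameters means scoring candidate models on many random train/test splits rather than on a single holdout. The sketch below illustrates the general procedure on a hypothetical 1880-2001 temperature series with simple polynomial candidates; it is not the paper's dynamic control model:

```python
import numpy as np

def mc_cv_rmse(x, y, degree, n_splits=100, test_frac=0.3, seed=0):
    """Average test RMSE of a polynomial fit over random train/test splits."""
    rng = np.random.default_rng(seed)
    n = len(x)
    n_test = int(test_frac * n)
    errs = []
    for _ in range(n_splits):
        idx = rng.permutation(n)                 # fresh random split each round
        test, train = idx[:n_test], idx[n_test:]
        coef = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coef, x[test])
        errs.append(np.sqrt(np.mean((pred - y[test]) ** 2)))
    return float(np.mean(errs))

# Hypothetical series: linear warming trend plus observational noise
rng = np.random.default_rng(1)
years = np.arange(1880, 2002, dtype=float)
t = years - years.mean()                         # centre to keep polyfit well conditioned
temp = 0.006 * t + rng.normal(0, 0.1, len(t))
print({d: round(mc_cv_rmse(t, temp, d), 4) for d in (1, 2, 3)})
```

Averaging the error over many random splits gives a parameter choice that is far less sensitive to any one split (or to initial values) than a single holdout would be, which is the robustness property the abstract emphasizes.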
Determination of the refractive index of dehydrated cells by means of digital holographic microscopy
NASA Astrophysics Data System (ADS)
Belashov, A. V.; Zhikhoreva, A. A.; Bespalov, V. G.; Vasyutinskii, O. S.; Zhilinskaya, N. T.; Novik, V. I.; Semenova, I. V.
2017-10-01
Spatial distributions of the integral refractive index in dehydrated cells of human oral cavity epithelium are obtained by means of digital holographic microscopy, and mean refractive index of the cells is determined. The statistical analysis of the data obtained is carried out, and absolute errors of the method are estimated for different experimental conditions.
A Hands-On Exercise Improves Understanding of the Standard Error of the Mean
ERIC Educational Resources Information Center
Ryan, Robert S.
2006-01-01
One of the most difficult concepts for statistics students is the standard error of the mean. To improve understanding of this concept, 1 group of students used a hands-on procedure to sample from small populations representing either a true or false null hypothesis. The distribution of 120 sample means (n = 3) from each population had standard…
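The classroom exercise described above is easy to reproduce numerically. This sketch (a hypothetical illustration with an arbitrary 1-10 population, not the authors' materials) draws 120 sample means of size n = 3 and compares their spread with the theoretical standard error of the mean:

```python
import numpy as np

rng = np.random.default_rng(42)
population = np.arange(1, 11)   # a small population whose sigma we know exactly
n, draws = 3, 120               # 120 sample means of size 3, as in the exercise

means = [rng.choice(population, size=n, replace=True).mean() for _ in range(draws)]

# The SD of the sample means should approximate sigma / sqrt(n)
theoretical_sem = population.std() / np.sqrt(n)
print(round(float(np.std(means)), 3), round(float(theoretical_sem), 3))
```

Seeing the distribution of means narrow as n grows, rather than just computing σ/√n, is exactly the intuition the hands-on procedure targets.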
NASA Astrophysics Data System (ADS)
Bian, Tao; Ren, Guoyu
2017-11-01
Based on a homogenized data set of monthly mean temperature, minimum temperature, and maximum temperature at Shijiazhuang City Meteorological Station (Shijiazhuang station) and four rural meteorological stations selected by applying a more sophisticated methodology, we reanalyzed the urbanization effects on annual, seasonal, and monthly mean surface air temperature (SAT) trends for the updated time period 1960-2012 at this typical urban station in North China. The results showed that (1) urbanization effects on the long-term trends of annual mean SAT, minimum SAT, and diurnal temperature range (DTR) in the last 53 years reached 0.25, 0.47, and -0.50 °C/decade, respectively, all statistically significant at the 0.001 confidence level, with the contributions from urbanization effects to the overall long-term trends reaching 67.8, 78.6, and 100%, respectively; (2) the urbanization effects on the trends of seasonal mean SAT, minimum SAT, and DTR were also large and statistically highly significant, and except for November and December, the urbanization effects on monthly mean SAT, minimum SAT, and DTR were likewise all statistically significant at the 0.05 confidence level; and (3) the annual, seasonal, and monthly mean maximum SAT series at the urban station registered a generally weaker and non-significant urbanization effect. The updated analysis showed that our previous work for this same urban station had underestimated the urbanization effect and its contribution to the overall changes in the SAT series. Many similar urban stations are included in current national and regional SAT data sets, and the results of this paper further indicate the importance and urgency of paying more attention to urbanization bias in the monitoring and detection of global and regional SAT change based on these data sets.
Comparison of Accuracy Between a Conventional and Two Digital Intraoral Impression Techniques.
Malik, Junaid; Rodriguez, Jose; Weisbloom, Michael; Petridis, Haralampos
To compare the accuracy (ie, precision and trueness) of full-arch impressions fabricated using either a conventional polyvinyl siloxane (PVS) material or one of two intraoral optical scanners. Full-arch impressions of a reference model were obtained using addition silicone impression material (Aquasil Ultra; Dentsply Caulk) and two optical scanners (Trios, 3Shape, and CEREC Omnicam, Sirona). Surface matching software (Geomagic Control, 3D Systems) was used to superimpose the scans within groups to determine the mean deviations in precision and trueness (μm) between the scans, which were calculated for each group and compared statistically using one-way analysis of variance with post hoc Bonferroni (trueness) and Games-Howell (precision) tests (IBM SPSS ver 24, IBM UK). Qualitative analysis was also carried out from three-dimensional maps of differences between scans. Means and standard deviations (SD) of deviations in precision for conventional, Trios, and Omnicam groups were 21.7 (± 5.4), 49.9 (± 18.3), and 36.5 (± 11.12) μm, respectively. Means and SDs for deviations in trueness were 24.3 (± 5.7), 87.1 (± 7.9), and 80.3 (± 12.1) μm, respectively. The conventional impression showed statistically significantly improved mean precision (P < .006) and mean trueness (P < .001) compared to both digital impression procedures. There were no statistically significant differences in precision (P = .153) or trueness (P = .757) between the digital impressions. The qualitative analysis revealed local deviations along the palatal surfaces of the molars and incisal edges of the anterior teeth of < 100 μm. Conventional full-arch PVS impressions exhibited improved mean accuracy compared to two direct optical scanners. No significant differences were found between the two digital impression methods.
Paige, John T; Garbee, Deborah D; Kozmenko, Valeriy; Yu, Qingzhao; Kozmenko, Lyubov; Yang, Tong; Bonanno, Laura; Swartz, William
2014-01-01
Effective teamwork in the operating room (OR) is often undermined by the "silo mentality" of the differing professions. Such thinking is formed early in one's professional experience and is fostered by undergraduate medical and nursing curricula lacking interprofessional education. We investigated the immediate impact of conducting interprofessional student OR team training using high-fidelity simulation (HFS) on students' team-related attitudes and behaviors. Ten HFS OR interprofessional student team training sessions were conducted involving 2 standardized HFS scenarios, each of which was followed by a structured debriefing that targeted team-based competencies. Pre- and post-session mean scores were calculated and analyzed for 15 Likert-type items measuring self-efficacy in teamwork competencies using the t-test. Additionally, mean scores of observer ratings of team performance after each scenario and participant ratings after the second scenario for an 11-item Likert-type teamwork scale were calculated and analyzed using one-way ANOVA and t-test. Eighteen nursing students, 20 nurse anesthetist students, and 28 medical students participated in the training. Statistically significant gains from mean pre- to post-training scores occurred on 11 of the 15 self-efficacy items. Statistically significant gains in mean observer performance scores were present on all 3 subscales of the teamwork scale from the first scenario to the second. A statistically significant difference was found in comparisons of mean observer scores with mean participant scores for the team-based behaviors subscale. High-fidelity simulation OR interprofessional student team training improves students' team-based attitudes and behaviors. Students tend to overestimate their team-based behaviors. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Duran, Ridvan; Vatansever, Ulfet; Acunaş, Betül; Süt, Necdet
2009-01-01
Preterm infants are prone to temperature maintenance problems due to an immature thermoregulatory mechanism and a relatively large body surface area. The objective of the present study was to evaluate the performance of a new non-invasive infrared thermometer applied to the mid-forehead and temporal artery in comparison with axillary temperature recordings by mercury-in-glass thermometer, and to determine the discomfort caused by these procedures in preterm infants on incubator care. The present comparative prospective study was composed of 34 preterm infants <1500 g of birthweight nursed in an incubator. Temperatures from the mid-forehead, temporal artery and axilla were recorded six times a day for 7 days, starting at the end of the first week of life. For pain assessment, the premature infant pain profile (PIPP) was used. The mean mid-forehead, temporal artery and axillary temperatures were 36.72 +/- 0.08, 36.81 +/- 0.09 and 36.71 +/- 0.07 degrees C, respectively. No statistically significant difference was noted between the means of the mid-forehead and axillary temperatures. The mean temporal artery temperature was statistically higher than the means of the mid-forehead and axillary temperatures. The PIPP scores of the mid-forehead, temporal artery and axillary temperature measurements were 5.07 +/- 0.36, 5.18 +/- 0.43 and 7.59 +/- 0.84, respectively. The mean PIPP score of axillary temperature measurements was statistically higher than the means of the mid-forehead and temporal artery measurements. The infrared skin thermometer applied to the mid-forehead is a useful and valid device for easy and less painful measurement of skin temperature in preterm infants <1500 g of birthweight.
12 CFR 741.6 - Financial and statistical and other reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... greater, but may reflect regulatory accounting principles other than GAAP if the credit union has total.... GAAP means generally accepted accounting principles, as defined in § 715.2(e) of this chapter. GAAP is...
Comparative Costs and Staffing Report for College and University Facilities, 1993-94.
ERIC Educational Resources Information Center
Silberman, Gil, Ed.; Glazner, Steve, Ed.
This report presents comparative data on facility management costs and staffing based on responses from 516 U.S. postsecondary educational facilities during 1993-94. It lists statistics from both private and public institutions, beginning with statistical reductions presenting the survey response tally, institutional profiles, and mean costs per…
Statistical and Cooperative Learning in Reading: An Artificial Orthography Learning Study
ERIC Educational Resources Information Center
Zhao, Jingjing; Li, Tong; Elliott, Mark A.; Rueckl, Jay G.
2018-01-01
This article reports two experiments in which the artificial orthography paradigm was used to investigate the mechanisms underlying learning to read. In each experiment, participants were taught the meanings and pronunciations of words written in an unfamiliar orthography, and the statistical structure of the mapping between written and spoken…
Finding Balance at the Elusive Mean
ERIC Educational Resources Information Center
Hudson, Rick A.
2012-01-01
Data analysis plays an important role in people's lives. Citizens need to be able to conduct critical analyses of statistical information in the work place, in their personal lives, and when portrayed by the media. However, becoming a conscientious consumer of statistics is a gradual process. The experiences that students have with data in the…
Fine-Grained Sensitivity to Statistical Information in Adult Word Learning
ERIC Educational Resources Information Center
Vouloumanos, Athena
2008-01-01
A language learner trying to acquire a new word must often sift through many potential relations between particular words and their possible meanings. In principle, statistical information about the distribution of those mappings could serve as one important source of data, but little is known about whether learners can in fact track multiple…
Statistical properties of alternative national forest inventory area estimators
Francis Roesch; John Coulston; Andrew D. Hill
2012-01-01
The statistical properties of potential estimators of forest area for the USDA Forest Service's Forest Inventory and Analysis (FIA) program are presented and discussed. The current FIA area estimator is compared and contrasted with a weighted mean estimator and an estimator based on the Polya posterior, in the presence of nonresponse. Estimator optimality is...
Seed Dispersal Near and Far: Patterns Across Temperate and Tropical Forests
James S. Clark; Miles Silman; Ruth Kern; Eric Macklin; Janneke HilleRisLambers
1999-01-01
Dispersal affects community dynamics and vegetation response to global change. Understanding these effects requires descriptions of dispersal at local and regional scales and statistical models that permit estimation. Classical models of dispersal describe local or long-distance dispersal, but not both. The lack of statistical methods means that models have rarely been...
DOT National Transportation Integrated Search
1981-10-01
Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...
Conference Report on Youth Unemployment: Its Measurements and Meaning.
ERIC Educational Resources Information Center
Employment and Training Administration (DOL), Washington, DC.
Thirteen papers presented at a conference on employment statistics and youth are contained in this report. Reviewed are the problems of gathering, interpreting, and applying employment and unemployment data relating to youth. The titles of the papers are as follow: "Counting Youth: A Comparison of Youth Labor Force Statistics in the Current…
What Does Average Really Mean? Making Sense of Statistics
ERIC Educational Resources Information Center
DeAngelis, Karen J.; Ayers, Steven
2009-01-01
The recent shift toward greater accountability has put many educational leaders in a position where they are expected to collect and use increasing amounts of data to inform their decision making. Yet, because many programs that prepare administrators, including school business officials, do not require a statistics course or a course that is more…
International Content as Hidden Curriculum in Business Statistics: An Overlooked Opportunity
ERIC Educational Resources Information Center
Sebastianelli, Rose; Trussler, Susan
2006-01-01
We revisit the issue of internationalizing the required course in business statistics as a means for introducing international subject matter earlier in the undergraduate business curriculum. A survey of sophomore business students indicates that their level of international knowledge is poor. The results are strikingly similar to a decade ago.…
The Effect of Anchor Test Construction on Scale Drift
ERIC Educational Resources Information Center
Antal, Judit; Proctor, Thomas P.; Melican, Gerald J.
2014-01-01
In common-item equating the anchor block is generally built to represent a miniature form of the total test in terms of content and statistical specifications. The statistical properties frequently reflect equal mean and spread of item difficulty. Sinharay and Holland (2007) suggested that the requirement for equal spread of difficulty may be too…
NASA Technical Reports Server (NTRS)
Wilson, Robert M.
2007-01-01
Statistical aspects of the North Atlantic basin tropical cyclones for the interval 1945-2005 are examined, including the variation of the yearly frequency of occurrence for various subgroups of storms (all tropical cyclones, hurricanes, major hurricanes, U.S. landfalling hurricanes, and category 4/5 hurricanes); the yearly variation of the mean latitude and longitude (genesis location) of all tropical cyclones and hurricanes; and the yearly variation of the mean peak wind speeds, lowest pressures, and durations for all tropical cyclones, hurricanes, and major hurricanes. Also examined is the relationship between inferred trends found in North Atlantic basin tropical cyclonic activity and natural variability and global warming, the latter described using surface air temperatures from the Armagh Observatory, Armagh, Northern Ireland. Lastly, a simple statistical technique is employed to ascertain the expected level of North Atlantic basin tropical cyclonic activity for the upcoming 2007 season.
Collective behaviours: from biochemical kinetics to electronic circuits.
Agliari, Elena; Barra, Adriano; Burioni, Raffaella; Di Biasio, Aldo; Uguzzoni, Guido
2013-12-10
In this work we aim to highlight a close analogy between cooperative behaviors in chemical kinetics and cybernetics; this is realized by using a common language for their description, that is mean-field statistical mechanics. First, we perform a one-to-one mapping between paradigmatic behaviors in chemical kinetics (i.e., non-cooperative, cooperative, ultra-sensitive, anti-cooperative) and in mean-field statistical mechanics (i.e., paramagnetic, high and low temperature ferromagnetic, anti-ferromagnetic). Interestingly, the statistical mechanics approach allows a unified, broad theory for all scenarios and, in particular, Michaelis-Menten, Hill and Adair equations are consistently recovered. This framework is then tested against experimental biological data with an overall excellent agreement. One step forward, we consistently read the whole mapping from a cybernetic perspective, highlighting deep structural analogies between the above-mentioned kinetics and fundamental bricks in electronics (i.e. operational amplifiers, flashes, flip-flops), so to build a clear bridge linking biochemical kinetics and cybernetics.
Parent and Friend Social Support and Adolescent Hope.
Mahon, Noreen E; Yarcheski, Adela
2017-04-01
The purpose of this study was to conduct two meta-analyses. The first examined social support from parents in relation to adolescent hope, and the second examined social support from friends in relation to adolescent hope. Using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for the literature reviewed, nine published studies or doctoral dissertations completed between 1990 and 2014 met the inclusion criteria. Using meta-analytic techniques and the mean weighted r statistic, the results indicated that social support from friends had a stronger mean effect size (ES = .31) than social support from parents (ES = .21); there was a statistically significant difference between the two ESs. Two of the four moderators for the parent social support-adolescent hope relationship were statistically significant. They were quality score and health status. Implications for school nurses and nurses in all settings are addressed, and conclusions are drawn based on the findings.
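The mean weighted r statistic in meta-analyses of this kind is commonly computed by converting each study's correlation to Fisher's z, averaging with inverse-variance weights (n - 3), and back-transforming. The sketch below illustrates that standard procedure with hypothetical study values; the paper's exact weighting scheme is not specified here:

```python
import numpy as np

def mean_weighted_r(rs, ns):
    """Sample-size-weighted mean correlation via Fisher's r-to-z transform."""
    rs, ns = np.asarray(rs, float), np.asarray(ns, float)
    zs = np.arctanh(rs)                              # Fisher z for each study
    z_bar = np.sum((ns - 3) * zs) / np.sum(ns - 3)   # inverse-variance weights
    return float(np.tanh(z_bar))                     # back-transform to r

# Hypothetical study correlations and sample sizes
print(round(mean_weighted_r([0.25, 0.35, 0.30], [100, 250, 150]), 3))
```

Working in z-space keeps the weighted average unbiased near the boundaries of r, and larger studies pull the pooled effect size toward their estimates, which is why an ES of .31 versus .21 can be tested for a significant difference.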
Things we still haven't learned (so far).
Ivarsson, Andreas; Andersen, Mark B; Stenling, Andreas; Johnson, Urban; Lindwall, Magnus
2015-08-01
Null hypothesis significance testing (NHST) is like an immortal horse that some researchers have been trying to beat to death for over 50 years but without any success. In this article we discuss the flaws in NHST, the historical background in relation to both Fisher's and Neyman and Pearson's statistical ideas, the common misunderstandings of what p < .05 actually means, and the 2010 APA publication manual's clear, but most often ignored, instructions to report effect sizes and to interpret what they all mean in the real world. In addition, we discuss how Bayesian statistics can be used to overcome some of the problems with NHST. We then analyze quantitative articles published over the past three years (2012-2014) in two top-rated sport and exercise psychology journals to determine whether we have learned what we should have learned decades ago about our use and meaningful interpretations of statistics.
Adaptation to stimulus statistics in the perception and neural representation of auditory space.
Dahmen, Johannes C; Keating, Peter; Nodal, Fernando R; Schulz, Andreas L; King, Andrew J
2010-06-24
Sensory systems are known to adapt their coding strategies to the statistics of their environment, but little is still known about the perceptual implications of such adjustments. We investigated how auditory spatial processing adapts to stimulus statistics by presenting human listeners and anesthetized ferrets with noise sequences in which interaural level differences (ILD) rapidly fluctuated according to a Gaussian distribution. The mean of the distribution biased the perceived laterality of a subsequent stimulus, whereas the distribution's variance changed the listeners' spatial sensitivity. The responses of neurons in the inferior colliculus changed in line with these perceptual phenomena. Their ILD preference adjusted to match the stimulus distribution mean, resulting in large shifts in rate-ILD functions, while their gain adapted to the stimulus variance, producing pronounced changes in neural sensitivity. Our findings suggest that processing of auditory space is geared toward emphasizing relative spatial differences rather than the accurate representation of absolute position.
Spatiotemporal Analysis of the Ebola Hemorrhagic Fever in West Africa in 2014
NASA Astrophysics Data System (ADS)
Xu, M.; Cao, C. X.; Guo, H. F.
2017-09-01
Ebola hemorrhagic fever (EHF) is an acute hemorrhagic disease caused by the Ebola virus, which is highly contagious. This paper aimed to explore the possible gathering areas of EHF cases in West Africa in 2014, and to identify endemic areas and their tendency by means of time-space analysis. We mapped the distribution of EHF incidences and explored statistically significant space, time and space-time disease clusters. We utilized hotspot analysis to find the spatial clustering pattern on the basis of the actual outbreak cases. Spatial-temporal cluster analysis was used to analyze the spatial or temporal distribution of the disease and to examine whether its distribution is statistically significant. Local clusters were investigated using Kulldorff's scan statistic approach. The result reveals that the epidemic mainly gathered in the western part of Africa near the North Atlantic, with obvious regional distribution. For the current epidemic, we found areas with a high incidence of EVD by means of spatial cluster analysis.
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall
2004-01-01
The study employs a 108-year precipitation data record to identify statistically significant anomalies in rainfall downwind of the Phoenix urban region. The analysis reveals that during the monsoon season, locations in the northeastern suburbs and exurbs of the Phoenix metropolitan area have experienced statistically significant increases in mean precipitation of 12 to 14 percent from a pre-urban (1895-1949) to a post-urban (1950-2003) period. Mean and median post-urban precipitation totals in the anomaly region are significantly greater, in the statistical sense, than in regions west of the city and in nearby mountainous regions of similar or greater topography. Further analysis of satellite-based rainfall totals for the summer of 2003 also reveals the existence of the anomaly region during a severe drought period. The anomaly cannot simply be attributed to maximum topographic relief and is hypothesized to be related to urban-topographic interactions.
Asthma disease management: regression to the mean or better?
Tinkelman, David; Wilson, Steve
2004-12-01
To assess the effectiveness of disease management as an adjunct to treatment for chronic illnesses, such as asthma, and to evaluate whether the statistical phenomenon of regression to the mean is responsible for many of the benefits commonly attributed to disease management. This study evaluated an asthma disease management intervention in a Colorado population covered by Medicaid. The outcomes are presented with the intervention group serving as its own control (baseline and postintervention measurements) and are compared with a matched control group during the same periods. In the intervention group, 388 asthmatics entered and 258 completed the 6-month program; 446 subjects participated in the control group. Facilities charges were compared for both groups during the baseline and program periods. Both groups were well matched demographically and for costs at baseline. Using the intervention group as its own control revealed a 49.1% savings. The control group savings were 30.7%. Therefore, the net savings were 18.4% (P < .001) for the intervention group vs controls. Although the demonstrated savings were less using a control group to correct for regression to the mean, they were statistically significant and clinically relevant. When using a control group to control for the statistical effects of regression to the mean, a disease management intervention for asthma in a population covered by Medicaid is effective in reducing healthcare costs.
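The size of the correction the control group provides can be illustrated with a small simulation of regression to the mean (all numbers invented): patients enrolled because of high baseline costs show an apparent "savings" at follow-up even with no intervention at all.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
stable = rng.normal(1000.0, 200.0, n)           # each person's stable cost level
baseline = stable + rng.normal(0.0, 400.0, n)   # observed year-1 charges
follow_up = stable + rng.normal(0.0, 400.0, n)  # observed year-2 charges, untreated

# "Enrol" the highest-cost quartile, as a disease management program might.
high = baseline > np.percentile(baseline, 75)
apparent_savings = 1.0 - follow_up[high].mean() / baseline[high].mean()
print(f"apparent savings with no intervention: {apparent_savings:.1%}")
```

With these invented parameters the artefact alone produces a drop on the order of 25-30%, which is why the intervention group's own pre/post change (49.1%) overstates the effect relative to the control-corrected 18.4%.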
The efficacy of tamsulosin in lower ureteral calculi
Griwan, M.S.; Singh, Santosh Kumar; Paul, Himanshu; Pawar, Devendra Singh; Verma, Manish
2010-01-01
Context: There has been a paradigm shift in the management of ureteral calculi in the last decade with the introduction of new, less invasive methods such as ureterorenoscopy and extracorporeal shock wave lithotripsy (ESWL). Aims: Recent studies have reported excellent results with medical expulsive therapy (MET) for distal ureteral calculi, both in terms of stone expulsion and control of ureteral colic pain. Settings and Design: We conducted a comparative study between watchful waiting and MET with tamsulosin. Materials and Methods: We conducted a comparative study between watchful waiting (Group I) and MET with tamsulosin (Group II) in 60 patients, with a follow-up of 28 days. Statistical Analysis: Independent t-test and chi-square test. Results: Group II showed a statistically significant advantage in terms of the stone expulsion rate. The mean number of episodes of pain, mean days to stone expulsion, and mean amount of analgesic used were statistically significantly lower in Group II (P = 0.007, 0.01, and 0.007, respectively) than in Group I. Conclusions: MET should be considered for uncomplicated distal ureteral calculi before ureteroscopy or extracorporeal lithotripsy. Tamsulosin has been found to increase and hasten stone expulsion, decrease acute attacks by acting as a spasmolytic, reduce mean days to stone expulsion, and decrease analgesic dose usage. PMID:20882156
Classical Statistics and Statistical Learning in Imaging Neuroscience
Bzdok, Danilo
2017-01-01
Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-tests and ANOVA. In recent years, statistical learning methods have enjoyed increasing popularity, especially for applications to rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It retraces how classical statistics and statistical learning originated in different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques. PMID:29056896
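The contrast drawn here can be made concrete with synthetic data (all values invented): a t-test asks whether a group difference is unlikely under the null hypothesis, while a cross-validated classifier asks how well that difference predicts unseen samples.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic "activation" values for two conditions (invented effect size).
a = rng.normal(0.0, 1.0, 500)
b = rng.normal(0.5, 1.0, 500)

# Classical statistics: null-hypothesis test on the group difference.
t_stat, p_value = stats.ttest_ind(a, b)

# Statistical learning: out-of-sample prediction with a deliberately
# simple nearest-mean classifier and a holdout split.
train_a, test_a = a[:250], a[250:]
train_b, test_b = b[:250], b[250:]
mu_a, mu_b = train_a.mean(), train_b.mean()
correct_a = np.abs(test_a - mu_a) < np.abs(test_a - mu_b)
correct_b = np.abs(test_b - mu_b) < np.abs(test_b - mu_a)
accuracy = np.concatenate([correct_a, correct_b]).mean()
# The two numbers answer different questions: "is the mean difference
# unlikely under the null?" vs. "how well does the pattern generalize?"
```

A tiny p-value can coexist with mediocre out-of-sample accuracy, which is exactly the distinction the paper draws between inference and prediction.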
AnthropMMD: An R package with a graphical user interface for the mean measure of divergence.
Santos, Frédéric
2018-01-01
The mean measure of divergence is a dissimilarity measure between groups of individuals described by dichotomous variables. It is well suited to datasets with many missing values, and it is generally used to compute distance matrices and represent phenograms. Although often used in biological anthropology and archaeozoology, this method suffers from a lack of implementation in common statistical software. A package for the R statistical software, AnthropMMD, is presented here. Offering a dynamic graphical user interface, it is the first one dedicated to Smith's mean measure of divergence. The package also provides facilities for graphical representations and the crucial step of trait selection, so that the entire analysis can be performed through the graphical user interface. Its use is demonstrated using an artificial dataset, and the impact of trait selection is discussed. Finally, AnthropMMD is compared to three other free tools available for calculating the mean measure of divergence, and is proven to be consistent with them. © 2017 Wiley Periodicals, Inc.
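For readers without R, the core computation can be sketched in a few lines. This follows one common formulation of Smith's MMD with the Anscombe angular transformation; AnthropMMD's exact defaults and corrections may differ.

```python
import numpy as np

def anscombe_theta(k, n):
    """Anscombe angular transformation of a trait frequency k/n."""
    return np.arcsin(1.0 - 2.0 * (k + 3.0/8.0) / (n + 3.0/4.0))

def mmd(k1, n1, k2, n2):
    """Mean measure of divergence between two groups over r traits.
    k*: counts of individuals showing each trait; n*: numbers of
    observable individuals per trait (missing data shrink n)."""
    k1, n1, k2, n2 = map(np.asarray, (k1, n1, k2, n2))
    t1, t2 = anscombe_theta(k1, n1), anscombe_theta(k2, n2)
    correction = 1.0 / (n1 + 0.5) + 1.0 / (n2 + 0.5)
    return np.mean((t1 - t2)**2 - correction)

# Identical trait frequencies in two equally sized groups: the MMD is
# slightly negative because of the sample-size correction term.
d = mmd([5, 10, 20], [40, 40, 40], [5, 10, 20], [40, 40, 40])
```

Negative values are conventionally read as "no detectable divergence"; pairwise MMDs over several groups form the distance matrix used for phenograms.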
Hydrostatic weighing without head submersion in morbidly obese females.
Evans, P E; Israel, R G; Flickinger, E G; O'Brien, K F; Donnelly, J E
1989-08-01
This study tests the validity of hydrostatic weighing without head submersion (HWNS) for determining the body density (Db) of morbidly obese (MO) females. Eighty MO females who were able to perform traditional hydrostatic weighing at residual volume (HW) underwent four counterbalanced trials for each procedure (HW and HWNS) to determine Db. Residual volume was determined by oxygen dilution. Twenty subjects were randomly excluded from the experimental group (EG) and assigned to a cross-validation group (CV). Simple linear regression was performed on EG data (n = 60, mean age = 36.8 y, mean % fat = 50.1) to predict Db from HWNS (Db = 0.569563 [Db HWNS] + 0.408621, SEE = 0.0066). Comparison of the predicted and actual Db for the CV group yielded r = 0.69, SEE = 0.0066, E statistic = 0.0067, and a mean difference of 0.0013 kg/L. The SEE and E statistic for body fat were 3.31 and 3.39, respectively. The mean difference for percent fat was 0.66%. The results indicate that HWNS is a valid technique for assessing body composition in MO females.
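The reported regression and a standard two-compartment conversion can be written directly (the Siri equation is a conventional choice for converting density to percent fat; the abstract does not state which body-fat equation the authors used):

```python
def db_from_hwns(db_hwns):
    """Predict body density (kg/L) from hydrostatic weighing without
    head submersion, using the regression reported in the abstract."""
    return 0.569563 * db_hwns + 0.408621

def percent_fat_siri(db):
    """Siri two-compartment conversion from body density to percent
    body fat. Whether this study used Siri's or another equation
    (e.g., Brozek's) is an assumption here, not stated in the abstract."""
    return 495.0 / db - 450.0
```

For example, a density near 0.99 kg/L corresponds to roughly 50% fat under Siri's equation, consistent with the sample's reported mean of 50.1%.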
Farwell, Lawrence A.; Richardson, Drew C.; Richardson, Graham M.; Furedy, John J.
2014-01-01
A classification concealed information test (CIT) used the “brain fingerprinting” method of applying P300 event-related potential (ERP) in detecting information that is (1) acquired in real life and (2) unique to US Navy experts in military medicine. Military medicine experts and non-experts were asked to push buttons in response to three types of text stimuli. Targets contain known information relevant to military medicine, are identified to subjects as relevant, and require pushing one button. Subjects are told to push another button to all other stimuli. Probes contain concealed information relevant to military medicine, and are not identified to subjects. Irrelevants contain equally plausible, but incorrect/irrelevant information. Error rate was 0%. Median and mean statistical confidences for individual determinations were 99.9% with no indeterminates (results lacking sufficiently high statistical confidence to be classified). We compared error rate and statistical confidence for determinations of both information present and information absent produced by classification CIT (Is a probe ERP more similar to a target or to an irrelevant ERP?) vs. comparison CIT (Does a probe produce a larger ERP than an irrelevant?) using P300 plus the late negative component (LNP; together, P300-MERMER). Comparison CIT produced a significantly higher error rate (20%) and lower statistical confidences: mean 67%; information-absent mean was 28.9%, less than chance (50%). We compared analysis using P300 alone with the P300 + LNP. P300 alone produced the same 0% error rate but significantly lower statistical confidences. 
These findings add to the evidence that the brain fingerprinting methods as described here provide sufficient conditions to produce less than 1% error rate and greater than 95% median statistical confidence in a CIT on information obtained in the course of real life that is characteristic of individuals with specific training, expertise, or organizational affiliation. PMID:25565941
Wynes, Jacob; Lamm, Bradley M; Andrade, Bijan J; Malay, D Scot
2016-01-01
We used preoperative radiographic and intraoperative anatomic measurements to predict and achieve, respectively, the precise amount of capital fragment lateral translation required to restore anatomic balance to the first metatarsophalangeal joint. Correlation was used to relate the amount of capital fragment translation to the operative reduction of the first intermetatarsal angle (IMA), hallux abductus angle (HAA), tibial sesamoid position (TSP), metatarsus adductus angle, and first metatarsal length. The mean capital fragment lateral translation was 5.54 ± 1.64 mm, and the mean radiographic reductions included a first IMA of 5.04° ± 2.85°, an HAA of 9.39° ± 8.38°, and a TSP of 1.38 ± 0.9. These changes were statistically (p < .001) and clinically (≥32.55%) significant. The mean reduction of the metatarsus adductus angle was 0.66° ± 4.44° and that of the first metatarsal length was 0.33 ± 7.27 mm, and neither of these was statistically (p = .5876 and p = .1247, respectively) or clinically (≤3.5%) significant. Pairwise correlations between the amount of lateral translation of the capital fragment and the first IMA, HAA, and TSP values were moderately positive and statistically significant (r = 0.4412, p = .0166; r = 0.5391, p = .0025; and r = 0.3729, p = .0463, respectively). In contrast, the correlations with metatarsus adductus and first metatarsal shortening were weak and not statistically significant (r = 0.2296, p = .2308 and r = -0.2394, p = .2109, respectively). The results of our study indicate that predicted preoperative and executed intraoperative lateral translation of the capital fragment correlates with statistically and clinically significant reductions in the first IMA, HAA, and TSP. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
Reaction Event Counting Statistics of Biopolymer Reaction Systems with Dynamic Heterogeneity.
Lim, Yu Rim; Park, Seong Jun; Park, Bo Jung; Cao, Jianshu; Silbey, Robert J; Sung, Jaeyoung
2012-04-10
We investigate the reaction event counting statistics (RECS) of an elementary biopolymer reaction in which the rate coefficient depends on the states of the biopolymer and the surrounding environment, and discover a universal kinetic phase transition in the RECS of the reaction system with dynamic heterogeneity. From an exact analysis of a general model of elementary biopolymer reactions, we find that the variance in the number of reaction events depends on the square of the mean number of reaction events when the measurement time is short compared with the relaxation time of the rate coefficient fluctuations, which does not conform to renewal statistics. On the other hand, when the measurement time interval is much longer than the relaxation time of the rate coefficient fluctuations, the variance becomes linearly proportional to the mean reaction number, in accordance with renewal statistics. Gillespie's stochastic simulation method is generalized to reaction systems with a fluctuating rate coefficient. The simulation results confirm the correctness of the analytic results for the time-dependent mean and variance of the reaction event number distribution. On the basis of the obtained results, we propose a method of quantitative analysis for the reaction event counting statistics of reaction systems with rate coefficient fluctuations, which enables one to extract information about the magnitude and the relaxation times of the fluctuating reaction rate coefficient, without the bias that can be introduced by assuming a particular kinetic model of conformational dynamics and conformation-dependent reactivity. An exact relationship is established between a higher moment of the reaction event number distribution and the multitime correlation of the reaction rate for reaction systems with a nonequilibrium initial state distribution as well as for systems with the equilibrium initial state distribution.
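The crossover described above can be reproduced with a minimal doubly stochastic (two-state) version of Gillespie's algorithm; all rate values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-state fluctuating rate coefficient (random telegraph process):
# k switches between k_a and k_b with switching rate gamma in each
# direction, so the relaxation time of the fluctuation is 1/(2*gamma).
k_a, k_b, gamma = 1.0, 9.0, 0.1

def count_events(T, n_trials):
    """Gillespie-type simulation: count reaction events in a window of
    length T, with the rate state drawn from its stationary (50/50)
    distribution at the start of each trial."""
    counts = np.empty(n_trials)
    for i in range(n_trials):
        t, n = 0.0, 0
        k = k_a if rng.random() < 0.5 else k_b
        while True:
            dt = rng.exponential(1.0 / (k + gamma))  # time to next event of any kind
            if t + dt > T:
                break
            t += dt
            if rng.random() < k / (k + gamma):       # a reaction occurred
                n += 1
            else:                                    # the rate state switched
                k = k_b if k == k_a else k_a
        counts[i] = n
    return counts

short = count_events(0.5, 20000)    # window << relaxation time (= 5 here)
long_ = count_events(50.0, 2000)    # window >> relaxation time
ratio_short = short.var() / short.mean()**2   # variance ~ mean^2 regime
ratio_long = long_.var() / long_.mean()**2    # variance ~ mean regime
```

For windows much shorter than 1/(2*gamma) the rate is effectively frozen, so the count variance carries a term proportional to the squared mean; for much longer windows the variance grows linearly with the mean, as in renewal statistics.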
Ries, Kernell G.
1999-01-01
A network of 148 low-flow partial-record stations was operated on streams in Massachusetts during the summers of 1989 through 1996. Streamflow measurements (including historical measurements), measured basin characteristics, and estimated streamflow statistics are provided in the report for each low-flow partial-record station. Also included for each station are location information, streamflow-gaging stations for which flows were correlated to those at the low-flow partial-record station, years of operation, and remarks indicating human influences on streamflows at the station. Three or four streamflow measurements were made each year for three years during times of low flow to obtain nine or ten measurements for each station. Measured flows at the low-flow partial-record stations were correlated with same-day mean flows at a nearby gaging station to estimate streamflow statistics for the low-flow partial-record stations. The estimated streamflow statistics include the 99-, 98-, 97-, 95-, 93-, 90-, 85-, 80-, 75-, 70-, 65-, 60-, 55-, and 50-percent duration flows; the 7-day, 10-year and 7-day, 2-year low flows; and the August median flow. Characteristics of the drainage basins for the stations that theoretically relate to the response of the station to climatic variations were measured from digital map data by use of an automated geographic information system procedure. Basin characteristics measured include drainage area; total stream length; mean basin slope; area of surficial stratified drift; area of wetlands; area of water bodies; and mean, maximum, and minimum basin elevation. Station descriptions and calculated streamflow statistics are also included in the report for the 50 continuous gaging stations used in correlations with the low-flow partial-record stations.
Direct Statistical Simulation of Astrophysical and Geophysical Flows
NASA Astrophysics Data System (ADS)
Marston, B.; Tobias, S.
2011-12-01
Astrophysical and geophysical flows are amenable to direct statistical simulation (DSS), the calculation of statistical properties that does not rely upon accumulation by direct numerical simulation (DNS) (Tobias and Marston, 2011). Anisotropic and inhomogeneous flows, such as those found in the atmospheres of planets, in rotating stars, and in disks, provide the starting point for an expansion in fluctuations about the mean flow, leading to a hierarchy of equations of motion for the equal-time cumulants. The method is described for a general set of evolution equations, and then illustrated for two specific cases: (i) a barotropic jet on a rotating sphere (Marston, Conover, and Schneider, 2008); and (ii) a model of a stellar tachocline driven by relaxation to an underlying flow with shear (Cally 2001), for which a joint instability arises from the combination of shearing forces and magnetic stress. The reliability of DSS is assessed by comparing statistics so obtained against those accumulated from DNS, the traditional approach. The simplest non-trivial closure, CE2, sets the third and higher cumulants to zero yet yields qualitatively accurate low-order statistics for both systems. Physically, CE2 retains only the eddy-mean flow interaction and drops the eddy-eddy interaction. Quantitatively accurate zonal means are found for the barotropic jet for long and short (but not intermediate) relaxation times, and for the Cally problem in the case of strong shearing and large magnetic fields. Deficiencies in CE2 can be repaired at the CE3 level, that is, by retaining the third cumulant (Marston 2011). We conclude by discussing possible extensions of the method both in terms of computational methods and the range of astrophysical and geophysical problems that are of interest.
Lower incisor inclination regarding different reference planes.
Zataráin, Brenda; Avila, Josué; Moyaho, Angeles; Carrasco, Rosendo; Velasco, Carmen
2016-09-01
The purpose of this study was to assess the degree of lower incisor inclination with respect to different reference planes. It was an observational, analytical, longitudinal, prospective study conducted on 100 lateral cephalograms, which were corrected according to the photograph in natural head position in order to draw the true vertical plane (TVP). The incisor mandibular plane angle (IMPA) was compensated to eliminate the variation due to mandibular plane growth type, using the formula IMPACOM = IMPA + (FMA - 25), where IMPACOM is the compensated IMPA. As the data followed a normal distribution, determined by the Kolmogorov-Smirnov test, parametric tests were used for the statistical analysis: t-test, ANOVA, and the Pearson correlation test. Statistical significance was set at p < 0.05. There was correlation between TVP and the NB line (NB) (0.8614), the Frankfort mandibular incisor angle (FMIA) (0.8894), IMPA (0.6351), the APo line (APo) (0.609), IMPACOM (0.8895), and the McHorris angle (MH) (0.7769). ANOVA showed statistically significant differences between the means of the 7 variables at the 95% confidence level, P = 0.0001. The multiple range test showed no significant difference between the following pairs of means: APo-NB (0.88), IMPA-MH (0.36), IMPA-NB (0.65), FMIA-IMPACOM (0.01), FMIA-TVP (0.18), TVP-IMPACOM (0.17). There was correlation among all reference planes. There were statistically significant differences among the means of the planes measured, except for IMPACOM, FMIA, and TVP. The IMPA differed significantly from the IMPACOM. The compensated IMPA and the FMIA did not differ significantly from the TVP. The true horizontal plane was mismatched with the Frankfort plane in 84% of the sample, with a range of 19°. The true vertical plane is adequate for measuring lower incisor inclination. Sociedad Argentina de Investigación Odontológica.
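Read literally, the compensation formula normalizes each patient's IMPA to a reference FMA of 25 degrees. A sketch of that reading (an interpretation of the garbled quoted formula, not the authors' exact notation):

```python
def compensated_impa(fma_deg, impa_deg):
    """Compensated IMPA (IMPACOM): the measured incisor mandibular
    plane angle adjusted to a reference FMA of 25 degrees, removing
    the effect of mandibular-plane growth type.
    IMPACOM = IMPA + (FMA - 25)."""
    return impa_deg + (fma_deg - 25.0)
```

A patient whose FMA is already 25 degrees is unchanged; a high-angle patient (FMA above 25) has the excess added back onto the IMPA.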
Correlation between normal glucose-6-phosphate dehydrogenase level and haematological parameters.
Ajlaan, S K; al-Naama, L M; al-Naama, M M
2000-01-01
The study involved 143 individuals and aimed to correlate normal glucose-6-phosphate dehydrogenase (G6PD) level with haematological parameters. A statistically significant negative correlation was found between G6PD level and haemoglobin, packed cell volume, red blood cell count, mean corpuscular haemoglobin and mean corpuscular volume. A statistically significant positive correlation was found between G6PD level and white blood cell count and reticulocyte count, but no significant correlation was found between G6PD level and mean corpuscular haemoglobin concentration. The negative correlation between G6PD level and haemoglobin suggests that anaemic people have higher G6PD levels than normal individuals. The positive correlation between G6PD level and white blood cell count indicates that white blood cells may play an important role in contributing to G6PD level.
Collum, L M; Logan, P; McAuliffe-Curtin, D; Hung, S O; Patterson, A; Rees, P J
1985-11-01
Fifty-one patients were treated in a dual-centre, double-blind comparison of acyclovir and adenine arabinoside in herpetic amoeboid (geographic) corneal ulceration. Twenty-four of the 25 patients receiving acyclovir healed in a mean time of 12.2 days, while 24 of the 26 patients treated with adenine arabinoside healed in a mean time of 11.0 days. There was no statistically significant difference between the two groups in terms of healing. A second analysis, excluding any patients who had received antiviral treatment immediately prior to entry into the study, showed that 18 of the 19 who received acyclovir healed in an average of 11.7 days and 18 of the 19 recipients of adenine arabinoside healed in a mean time of 11.2 days. Again the difference was not statistically significant.
Pope, Larry M.; Diaz, A.M.
1982-01-01
Quality-of-water data, collected October 21-23, 1980, and a statistical summary are presented for 42 coal-mined strip pits in Crawford and Cherokee Counties, southeastern Kansas. The statistical summary includes minimum and maximum observed values, mean, and standard deviation. Simple linear regression equations relating specific conductance, dissolved solids, and acidity to concentrations of dissolved solids, sulfate, calcium, magnesium, potassium, aluminum, and iron are also presented. (USGS)
No information or horizon paradoxes for Th. Smiths
NASA Astrophysics Data System (ADS)
Maes, Christian
2015-10-01
The statistical mechanician in the street (our Th. Smiths) must be surprised upon hearing popular versions of some of today's most discussed paradoxes in astronomy and cosmology. In fact, rather standard reminders of the meaning of thermal probabilities in statistical mechanics appear to answer the horizon problem (one of the major motivations for inflation theory) and the information paradox (related to black hole physics), at least as they are usually presented. Still, the paradoxes point to interesting gaps in our statistical understanding of (quantum) gravitational effects.
Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar
NASA Astrophysics Data System (ADS)
Lottman, Brian Todd
1998-09-01
This work proposes advanced techniques for measuring spatial wind field statistics near and inside clouds using a vertically pointing, solid-state coherent Doppler lidar on a fixed ground-based platform. The coherent Doppler lidar is an ideal instrument for high spatial and temporal resolution velocity estimates. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate of the mean velocity over a sensing volume is produced by estimating the mean spectrum. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of ('novel') estimators is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. Performance of the traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer-simulated data. Wind field statistics are produced from actual data for a cloud deck and for multi-layer clouds. Unique results include detection of possible spectral signatures for rain, estimates of the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates of simple wind field statistics between cloud layers.
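A common baseline for the spectral estimators discussed here is the first spectral moment: the power-weighted mean Doppler frequency, converted to radial velocity via v = wavelength * f_D / 2. A minimal sketch (the wavelength and sample rate are illustrative, not the instrument's actual parameters):

```python
import numpy as np

WAVELENGTH = 2.0e-6   # m, typical of a solid-state coherent lidar (assumed)
FS = 50e6             # Hz, complex (I/Q) sampling rate (assumed)

def mean_velocity(iq, fs=FS, wavelength=WAVELENGTH):
    """Estimate the mean radial velocity in a range gate from the
    first moment of the Doppler power spectrum."""
    spec = np.abs(np.fft.fft(iq))**2
    freqs = np.fft.fftfreq(iq.size, d=1.0 / fs)
    f_mean = np.sum(freqs * spec) / np.sum(spec)
    return 0.5 * wavelength * f_mean   # v = wavelength * f_D / 2

# A pure 5 MHz tone maps to v = 0.5 * 2e-6 * 5e6 = 5 m/s.
t = np.arange(1000) / FS
v = mean_velocity(np.exp(2j * np.pi * 5e6 * t))
```

Under strong, range-varying backscatter (the cloud conditions the thesis targets) this simple moment estimator degrades, which is the motivation for the novel estimator class.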
Statistical fluctuations of an ocean surface inferred from shoes and ships
NASA Astrophysics Data System (ADS)
Lerche, Ian; Maubeuge, Frédéric
1995-12-01
This paper shows that it is possible to roughly estimate some ocean properties using simple time-dependent statistical models of ocean fluctuations. Based on a real incident, the loss overboard of a container of Nike shoes in the North Pacific Ocean, a statistical model was tested on data sets consisting of the Nike shoes found by beachcombers a few months later. This statistical treatment of the shoes' motion allows one to infer velocity trends of the Pacific Ocean, together with their fluctuation strengths. The idea is to suppose that there is a mean bulk flow speed that can depend on location on the ocean surface and time. The fluctuations of the surface flow speed are then treated as statistically random. The distribution of shoes is described in space and time using Markov probability processes related to the mean and fluctuating ocean properties. The aim of the exercise is to provide some of the properties of the Pacific Ocean that are otherwise calculated using a sophisticated numerical model, OSCURS, which requires numerous input data. Relevant quantities are sharply estimated, which can be useful to (1) constrain output results from OSCURS computations, and (2) elucidate the behavior patterns of ocean flow characteristics on long time scales.
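The inference the paper describes can be caricatured as a biased random walk: each drifting object's daily displacement is a mean current plus a random fluctuation, and the mean flow and fluctuation strength are recovered from where the objects end up. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
u_true, sigma_true = 0.2, 1.5     # mean drift and fluctuation strength per day (invented)
days, n_objects = 180, 5000

# Daily displacement of each object: mean current + random fluctuation.
daily = u_true + sigma_true * rng.standard_normal((n_objects, days))
final = daily.sum(axis=1)          # positions after `days` days

u_hat = final.mean() / days                     # inferred mean flow
sigma_hat = final.std(ddof=1) / np.sqrt(days)   # inferred fluctuation strength
```

The spread of the final positions grows like sigma * sqrt(days), which is why recovery locations alone constrain both the drift and its fluctuations.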
Statistical Data Editing in Scientific Articles.
Habibzadeh, Farrokh
2017-07-01
Scientific journals are important scholarly forums for sharing research findings. Editors have important roles in safeguarding standards of scientific publication and should be familiar with correct presentation of results, among other core competencies. Editors do not have access to the raw data and should thus rely on clues in the submitted manuscripts. To identify probable errors, they should look for inconsistencies in presented results. Common statistical problems that can be picked up by a knowledgeable manuscript editor are discussed in this article. Manuscripts should contain a detailed section on statistical analyses of the data. Numbers should be reported with appropriate precision. The standard error of the mean (SEM) should not be reported as an index of data dispersion. Mean (standard deviation [SD]) and median (interquartile range [IQR]) should be used to describe normally and non-normally distributed data, respectively. If possible, it is better to report the 95% confidence interval (CI) for statistics, at least for the main outcome variables. P values should be presented, and interpreted with caution, if there is a hypothesis. To advance the knowledge and skills of their members, associations of journal editors should develop training courses on basic statistics and research methodology for non-experts. This would in turn improve research reporting and safeguard the body of scientific evidence. © 2017 The Korean Academy of Medical Sciences.
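The reporting conventions recommended here can be collected in a small helper (a sketch, not the article's own code):

```python
import numpy as np

def summary_report(x):
    """Summaries a manuscript editor would look for: mean (SD) for
    roughly normal data, median (IQR) for skewed data, and the SEM,
    which measures the precision of the mean estimate and should not
    be reported as an index of data dispersion."""
    x = np.asarray(x, dtype=float)
    sd = x.std(ddof=1)
    q1, q3 = np.percentile(x, [25, 75])
    return {
        "mean_sd": (x.mean(), sd),
        "median_iqr": (np.median(x), (q1, q3)),
        "sem": sd / np.sqrt(x.size),   # always smaller than SD for n > 1
    }

rep = summary_report([2, 4, 4, 4, 5, 5, 7, 9])
```

Because SEM = SD / sqrt(n), reporting SEM in place of SD makes data look less variable than they are, which is exactly the error the article warns editors to catch.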
Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte; Astrup, Thomas Fruergaard
2017-11-01
Data for fractional solid waste composition provide relative magnitudes of individual waste fractions, the percentages of which always sum to 100, thereby connecting them intrinsically. Due to this sum constraint, waste composition data represent closed data, and their interpretation and analysis require statistical methods other than classical statistics, which are suitable only for non-constrained data such as absolute values. However, the closed characteristics of waste composition data are often ignored when analysed. The results of this study showed, for example, that unavoidable animal-derived food waste amounted to 2.21±3.12% with a confidence interval of (-4.03; 8.45), which highlights the problem of biased negative proportions. A Pearson's correlation test, applied to waste fraction generation (kg mass), indicated a positive correlation between avoidable vegetable food waste and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing the closed characteristics of these data, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing the mean, standard deviation, or correlation coefficients. Copyright © 2017 Elsevier Ltd. All rights reserved.
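The standard remedy from compositional data analysis (Aitchison's log-ratio approach) is to transform the percentages off the constrained simplex before computing means or correlations. A minimal sketch of the centred log-ratio (clr) transform, with invented waste-fraction percentages:

```python
import numpy as np

def clr(parts):
    """Centred log-ratio (clr) transform: maps strictly positive
    compositions (rows) out of the constrained simplex so that
    classical statistics can be applied without closure artefacts."""
    parts = np.asarray(parts, dtype=float)
    logp = np.log(parts)
    return logp - logp.mean(axis=1, keepdims=True)

# Percentages of three waste fractions in four samples (invented data).
comp = np.array([
    [60.0, 30.0, 10.0],
    [55.0, 35.0, 10.0],
    [50.0, 30.0, 20.0],
    [45.0, 40.0, 15.0],
])
z = clr(comp)   # each row now sums to zero and is free of the sum constraint
```

Two useful properties: clr rows sum to zero, and the transform is invariant to the closure constant, so percentages and proportions give identical results.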
Streamflow characteristics of streams in southeastern Afghanistan
Vining, Kevin C.
2010-01-01
Statistical summaries of streamflow data for all historical streamgaging stations that have available data in the southeastern Afghanistan provinces of Ghazni, Khost, Logar, Paktya, and Wardak, and a portion of Kabul Province, are presented in this report. The summaries for each streamgaging station include a station description, a table of statistics of monthly and annual mean discharges, a table of monthly and annual flow duration, a table of probability of occurrence of annual high discharges, a table of probability of occurrence of annual low discharges, a table of annual peak discharge and corresponding gage height for the period of record, and a table of monthly and annual mean discharges for the period of record.
NASA Astrophysics Data System (ADS)
Mormann, Florian; Lehnertz, Klaus; David, Peter; E. Elger, Christian
2000-10-01
We apply the concept of phase synchronization of chaotic and/or noisy systems and the statistical distribution of the relative instantaneous phases to electroencephalograms (EEGs) recorded from patients with temporal lobe epilepsy. Using the mean phase coherence as a statistical measure for phase synchronization, we observe characteristic spatial and temporal shifts in synchronization that appear to be strongly related to pathological activity. In particular, we observe distinct differences in the degree of synchronization between recordings from seizure-free intervals and those before an impending seizure, indicating an altered state of brain dynamics prior to seizure activity.
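The mean phase coherence used here has a standard closed form, R = |⟨exp(i(φ₁ − φ₂))⟩|, with instantaneous phases usually taken from the analytic (Hilbert) signal. A self-contained sketch on synthetic signals (not the authors' EEG pipeline):

```python
import numpy as np
from scipy.signal import hilbert

def mean_phase_coherence(x, y):
    """Mean phase coherence R = |<exp(i*(phi_x - phi_y))>|, with the
    instantaneous phases taken from the analytic (Hilbert) signal.
    R = 1 means perfect phase locking; R near 0 means no locking."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

t = np.linspace(0.0, 10.0, 2000, endpoint=False)
locked = mean_phase_coherence(np.sin(2 * np.pi * 5 * t),
                              np.sin(2 * np.pi * 5 * t + 0.7))  # constant lag
rng = np.random.default_rng(0)
unlocked = mean_phase_coherence(rng.standard_normal(2000),
                                rng.standard_normal(2000))
```

Two signals with a constant phase lag score near 1 regardless of the lag's size, which is why the measure tracks synchronization rather than waveform similarity.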
Assessment of oral health parameters among students attending special schools of Mangalore city.
Peter, Tom; Cherian, Deepthi Anna; Peter, Tim
2017-01-01
The aim of the study was to assess the oral health status and treatment needs, and the correlation between dental caries susceptibility and salivary pH, buffering capacity, and total antioxidant capacity, among students attending special schools of Mangalore city. In this study 361 subjects in the age range of 12-18 years were divided into normal (n = 84), physically challenged (n = 68), and mentally challenged (n = 209) groups. Their oral health status and treatment needs were recorded using the modified WHO oral health assessment proforma. Saliva was collected to estimate the salivary parameters. Statistical analysis was done using the Statistical Package for the Social Sciences (SPSS), version 17 (Chicago, IL). On examining the dentition status of the study subjects, the mean number of decayed teeth was 1.57 for the normal, 2.54 for the physically challenged, and 4.41 for the mentally challenged study subjects. These results were highly statistically significant (P < 0.001). The treatment needs of the study subjects revealed that the mean numbers of teeth requiring pulp care and restoration were 1 for the normal, 0.12 for the physically challenged, and 1.21 for the mentally challenged study subjects. These results were highly statistically significant (P < 0.001). The mean salivary pH and buffering capacity were found to be lowest among the mentally challenged subjects. The physically challenged group had the lowest mean total antioxidant capacity among the study subjects. Among the study subjects, normal students had the highest mean salivary pH, buffering capacity, and total antioxidant capacity. These results were highly statistically significant (P < 0.001). The better dentition status of the normal compared to the physically and mentally challenged study subjects could be due to their improved quality of oral health practices.
The difference in the treatment needs could be due to the higher prevalence of untreated dental caries and also due to the neglected oral health care among the mentally challenged study subjects. The salivary pH and buffering capacity were comparatively lower among the physically and mentally challenged study subjects, which could contribute to their increased caries experience compared to the normal study subjects. However, further studies are needed to establish a more conclusive result on the total antioxidant capacity of the saliva and dental caries.
Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques
NASA Astrophysics Data System (ADS)
Lauzon, N.; Lence, B. J.
2002-12-01
This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. 
Indeed, for the same database, different sets of clusters can be established with these calibration procedures. A simple method for analyzing uncertainties associated with the Kohonen neural network and fuzzy c-means is developed here. The method combines the results from several sets of clusters, either from the Kohonen neural network or fuzzy c-means, so as to provide an overall diagnosis as to the identification of outliers, shifts and trends. The results indicate an improvement in the performance for identifying anomalies when the method of combining cluster sets is used, compared with when only one cluster set is used.
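The fuzzy c-means technique the abstract relies on can be sketched in a few lines. The synthetic two-regime series, the two-cluster setup, and the membership threshold below are illustrative assumptions, not the authors' calibration; flagging points with a low maximum membership is one simple outlier diagnostic consistent with the clustering idea described above.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100):
    """Minimal fuzzy c-means for 1-D data: returns centers and memberships."""
    centers = np.quantile(X, np.linspace(0.25, 0.75, c))   # spread initial centers
    for _ in range(n_iter):
        d = np.abs(X[:, None] - centers[None, :]) + 1e-12  # point-to-center distances
        inv = d ** (-2.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)           # standard FCM membership update
        w = U ** m
        centers = (w.T @ X) / w.sum(axis=0)                # fuzzy-weighted centers
    return centers, U

# Synthetic "hydrometric" record: two flow regimes plus one gross outlier
X = np.concatenate([10.0 + np.linspace(-0.5, 0.5, 20),
                    np.full(20, 50.0),
                    [200.0]])
centers, U = fuzzy_c_means(X)

# A value far from every cluster center has diffuse membership;
# flag observations whose maximum membership stays low
outliers = np.flatnonzero(U.max(axis=1) < 0.9)
```

Shifts and trends would need pattern vectors (e.g. windowed subsequences) rather than single values, but the membership-based diagnostic is the same.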
Joint Adaptive Mean-Variance Regularization and Variance Stabilization of High Dimensional Data
Dazard, Jean-Eudes; Rao, J. Sunil
2012-01-01
The paper addresses a common problem in the analysis of high-dimensional high-throughput “omics” data: parameter estimation across multiple variables in a dataset where the number of variables is much larger than the sample size. Among the problems posed by this type of data are that variable-specific estimators of variances are not reliable and variable-wise test statistics have low power, both due to a lack of degrees of freedom. In addition, it has been observed in this type of data that the variance increases as a function of the mean. We introduce a non-parametric adaptive regularization procedure that is innovative in that: (i) it employs a novel “similarity statistic”-based clustering technique to generate local-pooled or regularized shrinkage estimators of population parameters; (ii) the regularization is done jointly on population moments, benefiting from C. Stein's result on inadmissibility, which implies that the usual sample variance estimator is improved by a shrinkage estimator using information contained in the sample mean. From these joint regularized shrinkage estimators, we derive regularized t-like statistics and show in simulation studies that they offer more statistical power in hypothesis testing than their standard sample counterparts, regular common value-shrinkage estimators, or estimators in which the information contained in the sample mean is simply ignored. Finally, we show that these estimators feature interesting properties of variance stabilization and normalization that can be used for preprocessing high-dimensional multivariate data. The method is available as an R package, called ‘MVR’ (‘Mean-Variance Regularization’), downloadable from the CRAN website. PMID:22711950
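The benefit of pooling information across variables can be illustrated with a deliberately simplified common-value variance shrinkage: a fixed weight toward the pooled variance, not the adaptive, clustering-based MVR procedure itself. The dimensions and the weight below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 1000, 5                                 # many variables, few samples (p >> n)
data = rng.normal(0.0, 1.0, size=(p, n))

s2 = data.var(axis=1, ddof=1)                  # per-variable variances: noisy at 4 df
s2_pooled = s2.mean()                          # common value pooled across variables

lam = 0.5                                      # fixed illustrative weight (MVR chooses
s2_shrunk = lam * s2_pooled + (1 - lam) * s2   # its regularization adaptively)

# Regularized t-like statistic for testing each variable's mean against zero
t_reg = data.mean(axis=1) / np.sqrt(s2_shrunk / n)
```

The shrunken variances have the same average but much smaller spread than the raw ones, which stabilizes the denominator of the t-like statistic when degrees of freedom are scarce.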
NASA Astrophysics Data System (ADS)
Rynders, Maurice; Lidkea, Bruce; Chisholm, William; Thibos, Larry N.
1995-10-01
Subjective transverse chromatic aberration (sTCA) manifest at the fovea was determined for a population of 85 young adults (19-38 years old) by means of a two-dimensional, two-color, vernier alignment technique. The statistical distribution of sTCA was well fitted by a bivariate Gaussian function with mean values that were not significantly different from zero in either the horizontal or the vertical direction. We conclude from this result that a hypothetical, average eye representing the population mean of human eyes with medium-sized pupils is free of foveal sTCA. However, the absolute magnitude of sTCA for any given individual was often significantly greater than zero and ranged from 0.05 to 2.67 arcmin for the red and the blue lights of a computer monitor (mean wavelengths, 605 and 497 nm, respectively). The statistical distribution of the absolute magnitude of sTCA was well described by a Rayleigh probability distribution with a mean of 0.8 arcmin. A simple device useful for population screening in a clinical setting was also tested and gave concordant results. Assuming that sTCA at the fovea is due to decentering of the pupil with respect to the visual axis, we infer from these results that the pupil is, on average, well centered in human eyes. The average magnitude of pupil decentration in individual eyes is less than 0.5 mm, which corresponds to psi = 3 deg for the angle between the achromatic and the visual axes of the eye.
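The link between the bivariate Gaussian description of the components and the Rayleigh distribution of the magnitude can be checked numerically. The sketch below assumes isotropic, zero-mean components, with sigma chosen so the Rayleigh mean matches the reported 0.8 arcmin; the sample size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
# Zero-mean, isotropic bivariate Gaussian components of sTCA (arcmin);
# sigma is chosen so that the Rayleigh mean equals the reported 0.8 arcmin
sigma = 0.8 / np.sqrt(np.pi / 2)
h = rng.normal(0.0, sigma, 100_000)   # horizontal component
v = rng.normal(0.0, sigma, 100_000)   # vertical component

magnitude = np.hypot(h, v)            # absolute magnitude is Rayleigh-distributed
mean_mag = magnitude.mean()           # close to sigma * sqrt(pi/2) = 0.8
```

This is why a population with zero mean sTCA can still show individual magnitudes well above zero: the mean of the magnitude is positive even when the mean of the components vanishes.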
NASA Technical Reports Server (NTRS)
Wilson, Robert M.; Hathaway, David H.
2009-01-01
Examined are single- and bi-variate geomagnetic precursors for predicting the maximum amplitude (RM) of a sunspot cycle several years in advance. The best single-variate fit is one based on the average of the ap index 36 mo prior to cycle minimum occurrence (E(Rm)), having a coefficient of correlation (r) equal to 0.97 and a standard error of estimate (se) equal to 9.3. Presuming cycle 24 not to be a statistical outlier and its minimum in March 2008, the fit suggests cycle 24's RM to be about 69 +/- 20 (the 90% prediction interval). The weighted mean prediction of 11 statistically important single-variate fits is 116 +/- 34. The best bi-variate fit is one based on the maximum and minimum values of the 12-mma of the ap index; i.e., APM# and APm*, where # means the value post-E(RM) for the preceding cycle and * means the value in the vicinity of cycle minimum, having r = 0.98 and se = 8.2. It predicts cycle 24's RM to be about 92 +/- 27. The weighted mean prediction of 22 statistically important bi-variate fits is 112 +/- 32. Thus, cycle 24's RM is expected to lie somewhere within the range of about 82 to 144. Also examined are the late-cycle 23 behaviors of geomagnetic indices and solar wind velocity in comparison to the mean behaviors of cycles 20-23 and the geomagnetic indices of cycle 14 (RM = 64.2), the weakest sunspot cycle of the modern era.
Statistical aspects of solar flares
NASA Technical Reports Server (NTRS)
Wilson, Robert M.
1987-01-01
A survey of the statistical properties of 850 H alpha solar flares during 1975 is presented. Comparison of the results found here with those reported elsewhere for different epochs is accomplished. Distributions of rise time, decay time, and duration are given, as are the mean, mode, median, and 90th percentile values. Proportions by selected groupings are also determined. For flares in general, mean values for rise time and duration are 5.2 + or - 0.4 min and 18.1 + or - 1.1 min, respectively. Subflares, accounting for nearly 90 percent of the flares, had mean values lower than those found for flares of H alpha importance greater than 1, and the differences are statistically significant. Likewise, flares of bright and normal relative brightness have mean values of decay time and duration that are significantly longer than those computed for faint flares, and mass-motion related flares are significantly longer than non-mass-motion related flares. Seventy-three percent of the mass-motion related flares are categorized as being a two-ribbon flare and/or being accompanied by a high-speed dark filament. Slow rise time flares (rise time greater than 5 min) have a mean value for duration that is significantly longer than that computed for fast rise time flares, and long-lived duration flares (duration greater than 18 min) have a mean value for rise time that is significantly longer than that computed for short-lived duration flares, suggesting a positive linear relationship between rise time and duration for flares. Monthly occurrence rates for flares in general and by group are found to be linearly related in a positive sense to monthly sunspot number. Statistical testing reveals the association between sunspot number and numbers of flares to be significant at the 95 percent level of confidence, and the t statistic for slope is significant at greater than the 99 percent level of confidence.
Dependent upon the specific fit, between 58 percent and 94 percent of the variation can be accounted for with the linear fits. A statistically significant Northern Hemisphere flare excess (P less than 1 percent) was found, as was a Western Hemisphere excess (P approx 3 percent). Subflares were more prolific within 45 deg of central meridian (P less than 1 percent), while flares of H alpha importance greater than or equal to 1 were more prolific near the limbs (greater than 45 deg from central meridian; P approx 2 percent). Two-ribbon flares were more frequent within 45 deg of central meridian (P less than 1 percent). Slow rise time flares occurred more frequently in the western hemisphere (P approx 2 percent), as did short-lived duration flares (P approx 9 percent), but fast rise time flares were not preferentially distributed (in terms of east-west or limb-disk). Long-lived duration flares occurred more often within 45 deg of central meridian (P approx 7 percent). Mean durations for subflares and flares of H alpha importance greater than or equal to 1, found within 45 deg of central meridian, are 14 percent and 70 percent, respectively, longer than those found for flares closer to the limb. As compared to flares occurring near cycle maximum, the flares of 1975 (near solar minimum) have mean values of rise time, decay time, and duration that are significantly shorter. A flare near solar maximum, on average, is about 1.6 times longer than one occurring near solar minimum.
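The sunspot-number versus flare-count association described above rests on an ordinary least-squares fit and a t test on the slope; both are one line in standard software. The monthly pairs below are invented for illustration, not the 1975 survey values.

```python
import numpy as np
from scipy import stats

# Hypothetical monthly pairs of sunspot number and flare count -- invented
# illustrative values, not the 1975 survey data
sunspots = np.array([10, 15, 12, 20, 25, 18, 30, 28, 22, 35, 40, 33], float)
flares = np.array([40, 55, 50, 75, 90, 70, 110, 100, 85, 130, 150, 120], float)

res = stats.linregress(sunspots, flares)   # OLS fit; res.pvalue is the two-sided
                                           # t test of the slope against zero
explained = res.rvalue ** 2                # fraction of variation explained
```

`explained` corresponds to the quantity quoted as 58 to 94 percent for the various fits in the abstract.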
Gotvald, Anthony J.
2017-01-13
The U.S. Geological Survey, in cooperation with the Georgia Department of Natural Resources, Environmental Protection Division, developed regional regression equations for estimating selected low-flow frequency and mean annual flow statistics for ungaged streams in north Georgia that are not substantially affected by regulation, diversions, or urbanization. Selected low-flow frequency statistics and basin characteristics for 56 streamgage locations within north Georgia and 75 miles beyond the State’s borders in Alabama, Tennessee, North Carolina, and South Carolina were combined to form the final dataset used in the regional regression analysis. Because some of the streamgages in the study recorded zero flow, the final regression equations were developed using weighted left-censored regression analysis to analyze the flow data in an unbiased manner, with weights based on the number of years of record. The set of equations includes the annual minimum 1- and 7-day average streamflow with the 10-year recurrence interval (referred to as 1Q10 and 7Q10), monthly 7Q10, and mean annual flow. The final regional regression equations are functions of drainage area, mean annual precipitation, and relief ratio for the selected low-flow frequency statistics and drainage area and mean annual precipitation for mean annual flow. The average standard error of estimate was 13.7 percent for the mean annual flow regression equation and ranged from 26.1 to 91.6 percent for the selected low-flow frequency equations. The equations, which are based on data from streams with little to no flow alterations, can be used to provide estimates of the natural flows for selected ungaged stream locations in the area of Georgia north of the Fall Line. The regression equations are not to be used to estimate flows for streams that have been altered by the effects of major dams, surface-water withdrawals, groundwater withdrawals (pumping wells), diversions, or wastewater discharges.
The regression equations should be used only for ungaged sites with drainage areas between 1.67 and 576 square miles, mean annual precipitation between 47.6 and 81.6 inches, and relief ratios between 0.146 and 0.607; these are the ranges of the explanatory variables used to develop the equations. An attempt was made to develop regional regression equations for the area of Georgia south of the Fall Line by using the same approach used during this study for north Georgia; however, the equations resulted in high average standard errors of estimate and poorly predicted flows below 0.5 cubic foot per second, which may be attributed to the karst topography common in that area. The final regression equations developed from this study are planned to be incorporated into the U.S. Geological Survey StreamStats program. StreamStats is a Web-based geographic information system that provides users with access to an assortment of analytical tools useful for water-resources planning and management, and for engineering design applications, such as the design of bridges. The StreamStats program provides streamflow statistics and basin characteristics for U.S. Geological Survey streamgage locations and ungaged sites of interest. StreamStats also can compute basin characteristics and provide estimates of streamflow statistics for ungaged sites when users select the location of a site along any stream in Georgia.
Ocular changes in primary hypothyroidism.
Ozturk, Banu T; Kerimoglu, Hurkan; Dikbas, Oguz; Pekel, Hamiyet; Gonen, Mustafa S
2009-12-29
To determine the ocular changes related to hypothyroidism in newly diagnosed patients without orbitopathy. Thirty-three patients diagnosed with primary overt hypothyroidism were enrolled in the study. All subjects underwent central corneal thickness (CCT), anterior chamber volume, depth, and angle measurements with the Scheimpflug camera (Pentacam, Oculus), and cup-to-disc ratio (C/D), mean retinal thickness, and mean retinal nerve fiber layer (RNFL) thickness measurements with optical coherence tomography (OCT), in addition to ophthalmological examination preceding the replacement therapy and at the 1st, 3rd, and 6th months of treatment. The mean age of the patients included in the study was 40.58 +/- 1.32 years. The thyroid hormone levels returned to normal in all patients during the follow-up period; however, the mean intraocular pressure (IOP) revealed no significant change. The mean CCT was 538.05 +/- 3.85 μm initially and demonstrated no statistically significant change, as was the case for the anterior chamber volume, depth, and angle measurements. The mean C/D ratio was 0.29 +/- 0.03 and the mean retinal thickness was 255.83 +/- 19.49 μm initially, and the treatment did not give rise to any significant change. The mean RNFL thickness was also stable during the control visits, with no statistically significant change. Neither hypothyroidism nor its replacement therapy gave rise to any change in IOP, CCT, anterior chamber parameters, RNFL, retinal thickness, or C/D ratio.
Ogle, K.M.; Lee, R.W.
1994-01-01
Radon-222 activity was measured for 27 water samples from streams, an alluvial aquifer, bedrock aquifers, and a geothermal system, in and near the 510-square-mile area of Owl Creek Basin, north-central Wyoming. Summary statistics of the radon-222 activities are compiled. For 16 stream-water samples, the arithmetic mean radon-222 activity was 20 pCi/L (picocuries per liter), the geometric mean activity was 7 pCi/L, the harmonic mean activity was 2 pCi/L, and the median activity was 8 pCi/L. The standard deviation of the arithmetic mean is 29 pCi/L. The activities in the stream-water samples ranged from 0.4 to 97 pCi/L. The histogram of stream-water samples is right-skewed when compared to a normal distribution, as indicated by the arithmetic mean exceeding the median. For 11 ground-water samples, the arithmetic mean radon-222 activity was 486 pCi/L, the geometric mean activity was 280 pCi/L, the harmonic mean activity was 130 pCi/L, and the median activity was 373 pCi/L. The standard deviation of the arithmetic mean is 500 pCi/L. The activity in the ground-water samples ranged from 25 to 1,704 pCi/L. The histogram of ground-water samples is right-skewed when compared to a normal distribution. (USGS)
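The ordering of the three reported means (harmonic below geometric below arithmetic) follows from the AM-GM-HM inequality for positive data; a small sketch with invented activities, not the Owl Creek samples, shows all three computations.

```python
import numpy as np

# Invented radon-222 activities (pCi/L) -- not the Owl Creek Basin samples
activity = np.array([2.0, 5.0, 8.0, 8.0, 20.0, 97.0])

arithmetic = activity.mean()
geometric = np.exp(np.log(activity).mean())        # exp of the mean log activity
harmonic = len(activity) / (1.0 / activity).sum()  # reciprocal of the mean reciprocal
```

The wide gap between the three means (here roughly 6, 10, and 23) is itself a sign of strong positive skew, matching the pattern in both the stream-water and ground-water summaries.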
Fisher statistics for analysis of diffusion tensor directional information.
Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P
2012-04-30
A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups, however in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
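The core descriptive quantities of the Fisher framework, the mean direction vector and a concentration parameter, reduce to simple vector arithmetic. The direction vectors below are invented stand-ins for per-sample ROI directions (e.g. principal eigenvectors of the diffusion tensor), and the kappa formula is the standard large-concentration approximation, not the full inferential machinery of the paper.

```python
import numpy as np

# Hypothetical unit direction vectors for one ROI (one per brain sample) --
# invented illustrative values, not the rat-brain data
v = np.array([[0.90, 0.30, 0.30],
              [0.80, 0.40, 0.40],
              [0.85, 0.35, 0.40],
              [0.90, 0.25, 0.35]])
v /= np.linalg.norm(v, axis=1, keepdims=True)   # normalize to unit length

n = len(v)
R_vec = v.sum(axis=0)
R = np.linalg.norm(R_vec)             # resultant length (close to n when concentrated)
mean_direction = R_vec / R            # Fisher mean direction vector
kappa = (n - 1) / (n - R)             # approximate concentration parameter
```

Tightly clustered directions give a resultant length near n and hence a large kappa; dispersed directions give a short resultant and a kappa near 1, which is the basis for the hypothesis tests between groups.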
Analysis of the dependence of extreme rainfalls
NASA Astrophysics Data System (ADS)
Padoan, Simone; Ancey, Christophe; Parlange, Marc
2010-05-01
The aim of spatial analysis is to quantitatively describe the behavior of environmental phenomena such as precipitation levels, wind speed, or daily temperatures. A number of generic approaches to spatial modeling have been developed [1], but these are not necessarily ideal for handling extremal aspects given their focus on mean process levels. The areal modelling of the extremes of a natural process observed at points in space is important in environmental statistics; for example, understanding extremal spatial rainfall is crucial in flood protection. In light of recent concerns over climate change, the use of robust mathematical and statistical methods for such analyses has grown in importance. Multivariate extreme value models and the class of max-stable processes [2] have a similar asymptotic motivation to the univariate Generalized Extreme Value (GEV) distribution, but provide a general approach to modeling extreme processes that incorporates temporal or spatial dependence. Statistical methods for max-stable processes and data analyses of practical problems are discussed by [3] and [4]. This work illustrates methods for the statistical modelling of spatial extremes and gives examples of their use by means of a real extremal data analysis of Swiss precipitation levels. [1] Cressie, N. A. C. (1993). Statistics for Spatial Data. Wiley, New York. [2] de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction. Springer, USA. [3] Padoan, S. A., Ribatet, M. and Sisson, S. A. (2009). Likelihood-Based Inference for Max-Stable Processes. Journal of the American Statistical Association, Theory & Methods. In press. [4] Davison, A. C. and Gholamrezaee, M. (2009). Geostatistics of extremes. Journal of the Royal Statistical Society, Series B. To appear.
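The univariate GEV building block mentioned above can be fitted by maximum likelihood in a few lines. The annual maxima below are synthetic, not the Swiss data, and note that scipy's `genextreme` shape parameter `c` has the opposite sign to the usual GEV shape convention.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic annual rainfall maxima (mm) -- illustrative, not the Swiss records
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=60, scale=15, size=200,
                                     random_state=rng)

# Maximum-likelihood fit of the three GEV parameters
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# 100-year return level: the 0.99 quantile of the fitted distribution
rl_100 = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
```

Max-stable process models extend exactly this marginal analysis by adding a dependence structure across sites, which the marginal fit alone cannot capture.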
Teshima, Tara Lynn; Patel, Vaibhav; Mainprize, James G; Edwards, Glenn; Antonyshyn, Oleh M
2015-07-01
The utilization of three-dimensional modeling technology in craniomaxillofacial surgery has grown exponentially during the last decade. Future development, however, is hindered by the lack of a normative three-dimensional anatomic dataset and a statistical mean three-dimensional virtual model. The purpose of this study is to develop and validate a protocol to generate a statistical three-dimensional virtual model based on a normative dataset of adult skulls. Two hundred adult skull CT images were reviewed. The average three-dimensional skull was computed by processing each CT image in the series using thin-plate spline geometric morphometric protocol. Our statistical average three-dimensional skull was validated by reconstructing patient-specific topography in cranial defects. The experiment was repeated 4 times. In each case, computer-generated cranioplasties were compared directly to the original intact skull. The errors describing the difference between the prediction and the original were calculated. A normative database of 33 adult human skulls was collected. Using 21 anthropometric landmark points, a protocol for three-dimensional skull landmarking and data reduction was developed and a statistical average three-dimensional skull was generated. Our results show the root mean square error (RMSE) for restoration of a known defect using the native best match skull, our statistical average skull, and worst match skull was 0.58, 0.74, and 4.4 mm, respectively. The ability to statistically average craniofacial surface topography will be a valuable instrument for deriving missing anatomy in complex craniofacial defects and deficiencies as well as in evaluating morphologic results of surgery.
Sapsis, Themistoklis P; Majda, Andrew J
2013-08-20
A framework for low-order predictive statistical modeling and uncertainty quantification in turbulent dynamical systems is developed here. These reduced-order, modified quasilinear Gaussian (ROMQG) algorithms apply to turbulent dynamical systems in which there is significant linear instability or linear nonnormal dynamics in the unperturbed system and energy-conserving nonlinear interactions that transfer energy from the unstable modes to the stable modes where dissipation occurs, resulting in a statistical steady state; such turbulent dynamical systems are ubiquitous in geophysical and engineering turbulence. The ROMQG method involves constructing a low-order, nonlinear, dynamical system for the mean and covariance statistics in the reduced subspace that has the unperturbed statistics as a stable fixed point and optimally incorporates the indirect effect of non-Gaussian third-order statistics for the unperturbed system in a systematic calibration stage. This calibration procedure is achieved through information involving only the mean and covariance statistics for the unperturbed equilibrium. The performance of the ROMQG algorithm is assessed on two stringent test cases: the 40-mode Lorenz 96 model mimicking midlatitude atmospheric turbulence and two-layer baroclinic models for high-latitude ocean turbulence with over 125,000 degrees of freedom. In the Lorenz 96 model, the ROMQG algorithm with just a single mode captures the transient response to random or deterministic forcing. For the baroclinic ocean turbulence models, the inexpensive ROMQG algorithm with 252 modes, less than 0.2% of the total, captures the nonlinear response of the energy, the heat flux, and even the one-dimensional energy and heat flux spectra.
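The 40-mode Lorenz 96 test model referred to above is easy to reproduce. The sketch below is the standard model with forcing F = 8 and a fourth-order Runge-Kutta integrator, used only to generate the kind of mean and covariance statistics a reduced-order algorithm would target; the step size, spin-up length, and averaging window are illustrative choices.

```python
import numpy as np

def lorenz96_rhs(x, F=8.0):
    """dx_k/dt = (x_{k+1} - x_{k-2}) * x_{k-1} - x_k + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def step_rk4(x, dt=0.01):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz96_rhs(x)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2)
    k4 = lorenz96_rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(0)
x = 8.0 + 0.01 * rng.standard_normal(40)   # perturb the unstable fixed point x_k = F

for _ in range(2000):                      # discard the transient
    x = step_rk4(x)

samples = np.empty((5000, 40))             # accumulate a statistical steady state
for i in range(5000):
    x = step_rk4(x)
    samples[i] = x

mean_state = samples.mean()                # climatological mean (roughly 2.3 at F = 8)
var_state = samples.var()                  # climatological variance
```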
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), chi-squared test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data.
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
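The most common tests tallied above are one-liners in standard statistical software. The two-group burn-area values and the 2x2 table below are invented for illustration, not data from the surveyed articles.

```python
import numpy as np
from scipy import stats

# Invented burn surface area (%) in two hypothetical treatment groups
group_a = np.array([35, 28, 40, 32, 30, 38, 27, 33, 36, 31], float)
group_b = np.array([22, 18, 25, 20, 24, 19, 23, 21, 26, 17], float)

t_res = stats.ttest_ind(group_a, group_b)              # Student's t-test
u_res = stats.mannwhitneyu(group_a, group_b)           # Mann-Whitney U test
chi2, p_chi, dof, expected = stats.chi2_contingency(
    [[18, 7], [10, 15]])                               # chi-squared test, 2x2 table
```

Reporting the exact p-values from such calls, rather than just "P < 0.05", addresses one of the reporting gaps the survey identified.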
NASA Astrophysics Data System (ADS)
El Sharif, H.; Teegavarapu, R. S.
2012-12-01
Spatial interpolation methods used for estimation of missing precipitation data at a site seldom check for their ability to preserve site and regional statistics. Such statistics are primarily defined by spatial correlations and other site-to-site statistics in a region. Preservation of site and regional statistics represents a means of assessing the validity of missing precipitation estimates at a site. This study evaluates the efficacy of a fuzzy-logic methodology for infilling missing historical daily precipitation data in preserving site and regional statistics. Rain gauge sites in the state of Kentucky, USA, are used as a case study for evaluation of this newly proposed method in comparison to traditional data infilling techniques. Several error and performance measures will be used to evaluate the methods and trade-offs in accuracy of estimation and preservation of site and regional statistics.
Robust Mean and Covariance Structure Analysis through Iteratively Reweighted Least Squares.
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Bentler, Peter M.
2000-01-01
Adapts robust schemes to mean and covariance structures, providing an iteratively reweighted least squares approach to robust structural equation modeling. Each case is weighted according to its distance, based on first and second order moments. Test statistics and standard error estimators are given. (SLD)
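The case-reweighting idea can be illustrated with Huber-type iteratively reweighted least squares for a simple regression. This is a generic IRLS sketch, not Yuan and Bentler's exact weighting scheme for mean and covariance structures; the data, tuning constant, and iteration count are illustrative.

```python
import numpy as np

def irls(X, y, n_iter=50, c=1.345):
    """Huber-weighted iteratively reweighted least squares for regression."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # start from the OLS solution
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12    # robust scale estimate (MAD)
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)             # downweight large residuals
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta, w

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(0, 0.5, 50)
y[5] += 40.0                                         # one gross outlier
X = np.column_stack([np.ones_like(x), x])

beta_robust, w = irls(X, y)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
```

The outlying case ends up with a near-zero weight, so the robust slope stays near the true value of 3 while the unweighted OLS slope is pulled away, the same protection the abstract's distance-based case weights provide for structural equation models.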
Using Computer Graphics in Statistics.
ERIC Educational Resources Information Center
Kerley, Lyndell M.
1990-01-01
Described is software which allows a student to use simulation to produce analytical output as well as graphical results. The results include a frequency histogram of a selected population distribution, a frequency histogram of the distribution of the sample means, and tests of the normality of the distribution of the sample means. (KR)
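The simulation such software performs can be reproduced in a few lines; the exponential population, sample size, and number of repetitions below are illustrative classroom-style choices.

```python
import numpy as np

rng = np.random.default_rng(0)
# A skewed population, as a classroom simulation might select (exponential, mean 2)
population = rng.exponential(scale=2.0, size=100_000)

# Draw many samples of size 30 and record each sample's mean
sample_means = np.array([rng.choice(population, size=30).mean()
                         for _ in range(2000)])
```

Despite the skewed population, a histogram of `sample_means` is near-normal, centered near 2 with spread close to 2/sqrt(30), which is exactly what the software's normality checks on the sample means demonstrate.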
ERIC Educational Resources Information Center
Steuerle, Eugene; McClung, Nelson
This technical study is concerned with both the statistical and policy effects of alternative definitions of poverty which result when the definition of means is altered by varying the time period (accounting period) over which income is measured or by including in the measure of means not only realized income, but also unrealized income and…
R.A. Souter; J. Michael Bowker
1996-01-01
It is a generally known statistical fact that the mean of a nonlinear function of a set of random variables is not equivalent to the function evaluated at the means of the variables. However, in dichotomous choice contingent valuation studies, a common practice is to calculate an overall mean (or median) by integrating over offer space (numerically or analytically) an...
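The fact the passage opens with (Jensen's inequality in action) is easy to verify numerically; here with the illustrative choice g(x) = exp(x) and a standard normal X, where the two quantities differ by the known lognormal factor exp(1/2).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 100_000)   # draws of a random variable X ~ N(0, 1)

g_of_mean = np.exp(x.mean())        # g evaluated at the mean: about exp(0) = 1
mean_of_g = np.exp(x).mean()        # mean of g(X): about exp(1/2), by the lognormal mean
```

For a convex g the mean of g(X) exceeds g at the mean, which is why evaluating a nonlinear willingness-to-pay function at mean covariates biases the welfare estimate.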
NASA Astrophysics Data System (ADS)
Langousis, Andreas; Mamalakis, Antonis; Deidda, Roberto; Marrocu, Marino
2015-04-01
To improve the skill of Global Climate Models (GCMs) and Regional Climate Models (RCMs) in reproducing the statistics of rainfall at a basin level and at hydrologically relevant temporal scales (e.g. daily), two types of statistical approaches have been suggested. One is the statistical correction of climate model rainfall outputs using historical series of precipitation. The other is the use of stochastic models of rainfall to conditionally simulate precipitation series, based on large-scale atmospheric predictors produced by climate models (e.g. geopotential height, relative vorticity, divergence, mean sea level pressure). The latter approach, usually referred to as statistical rainfall downscaling, aims at reproducing the statistical character of rainfall, while accounting for the effects of large-scale atmospheric circulation (and, therefore, climate forcing) on rainfall statistics. While promising, statistical rainfall downscaling has not attracted much attention in recent years, since the suggested approaches involved complex (i.e. subjective or computationally intense) identification procedures of the local weather, in addition to demonstrating limited success in reproducing several statistical features of rainfall, such as seasonal variations, the distributions of dry and wet spell lengths, the distribution of the mean rainfall intensity inside wet periods, and the distribution of rainfall extremes. In an effort to remedy those shortcomings, Langousis and Kaleris (2014) developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which accurately reproduces the statistical character of rainfall at multiple time-scales.
Here, we study the relative performance of: a) quantile-quantile (Q-Q) correction of climate model rainfall products, and b) the statistical downscaling scheme of Langousis and Kaleris (2014), in reproducing the statistical structure of rainfall, as well as rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, i.e. the Flumendosa catchment, using climate model rainfall and atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com). In doing so, we split the historical rainfall record of mean areal precipitation (MAP) in 15-year calibration and 45-year validation periods, and compare the historical rainfall statistics to those obtained from: a) Q-Q corrected climate model rainfall products, and b) synthetic rainfall series generated by the suggested downscaling scheme. To our knowledge, this is the first time that climate model rainfall and statistically downscaled precipitation are compared to catchment-averaged MAP at a daily resolution. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the climate model used and the length of the calibration period. This is particularly the case for the yearly rainfall maxima, where direct statistical correction of climate model rainfall outputs shows increased sensitivity to the length of the calibration period and the climate model used. The robustness of the suggested downscaling scheme in modeling rainfall extremes at a daily resolution, is a notable feature that can effectively be used to assess hydrologic risk at a regional level under changing climatic conditions. 
Acknowledgments The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State. CRS4 gratefully acknowledges the contribution of the Sardinian regional authorities.
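The quantile-quantile (Q-Q) correction evaluated above maps each model rainfall value onto the observed value at the same empirical quantile of the calibration period. A minimal empirical quantile-mapping sketch (the gamma-distributed series and all parameter values are illustrative assumptions, not the study's data):

```python
import numpy as np

def qq_correct(model_series, obs_calib, model_calib, n_quantiles=99):
    """Empirical quantile mapping: replace each model value with the
    observed value at the same empirical quantile of the calibration period."""
    probs = np.linspace(0.01, 0.99, n_quantiles)
    model_q = np.quantile(model_calib, probs)  # model quantiles (calibration)
    obs_q = np.quantile(obs_calib, probs)      # observed quantiles (calibration)
    return np.interp(model_series, model_q, obs_q)

rng = np.random.default_rng(0)
obs = rng.gamma(shape=0.8, scale=10.0, size=5000)    # "observed" daily rainfall
model = rng.gamma(shape=0.8, scale=14.0, size=5000)  # positively biased model rainfall
corrected = qq_correct(model, obs, model)
print(obs.mean(), model.mean(), corrected.mean())  # corrected mean moves toward observed
```

Note that a purely empirical mapping such as this is fit on the calibration quantiles only, which is one reason the correction of yearly maxima can be sensitive to the calibration period length, as the abstract reports.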
Comparison of direct numerical simulation databases of turbulent channel flow at Reτ = 180
NASA Astrophysics Data System (ADS)
Vreman, A. W.; Kuerten, J. G. M.
2014-01-01
Direct numerical simulation (DNS) databases are compared to assess the accuracy and reproducibility of standard and non-standard turbulence statistics of incompressible plane channel flow at Reτ = 180. Two fundamentally different DNS codes are shown to produce maximum relative deviations below 0.2% for the mean flow, below 1% for the root-mean-square velocity and pressure fluctuations, and below 2% for the three components of the turbulent dissipation. Relatively fine grids and long statistical averaging times are required. An analysis of dissipation spectra demonstrates that the enhanced resolution is necessary for an accurate representation of the smallest physical scales in the turbulent dissipation. The results are related to the physics of turbulent channel flow in several ways. First, the reproducibility supports the hitherto unproven theoretical hypothesis that the statistically stationary state of turbulent channel flow is unique. Second, the peaks of dissipation spectra provide information on length scales of the small-scale turbulence. Third, the computed means and fluctuations of the convective, pressure, and viscous terms in the momentum equation show the importance of the different forces in the momentum equation relative to each other. The Galilean transformation that leads to minimum peak fluctuation of the convective term is determined. Fourth, an analysis of higher-order statistics is performed. The skewness of the longitudinal derivative of the streamwise velocity is stronger than expected (-1.5 at y+ = 30). This skewness and also the strong near-wall intermittency of the normal velocity are related to coherent structures.
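Higher-order statistics such as the reported skewness of the longitudinal velocity derivative are simple moment ratios; a minimal sketch of the sample skewness (the test distributions below are illustrative stand-ins, not DNS data):

```python
import numpy as np

def skewness(x):
    """Sample skewness: third central moment normalised by the 3/2 power
    of the second central moment."""
    d = np.asarray(x, dtype=float) - np.mean(x)
    return np.mean(d**3) / np.mean(d**2) ** 1.5

rng = np.random.default_rng(1)
sym = rng.normal(size=200000)        # symmetric sample: skewness near 0
asym = rng.exponential(size=200000)  # asymmetric sample: skewness near 2
print(round(skewness(sym), 2), round(skewness(asym), 2))
```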
Assertiveness and problem solving in midwives.
Yurtsal, Zeliha Burcu; Özdemir, Levent
2015-01-01
The midwifery profession is required to bring solutions to problems, and a midwife is expected to be an assertive person and to develop midwifery care. This study was planned to examine the relationship between the assertiveness and problem-solving skills of midwives. This cross-sectional study was conducted with 201 midwives between July 2008 and February 2009 in the city center of Sivas. The Rathus Assertiveness Schedule (RAS) and Problem Solving Inventory (PSI) were used to determine the levels of assertiveness and problem-solving skills of midwives. Mean, standard deviation and percentage were used as descriptive statistics, and Student's t, ANOVA with Tukey HSD, Kruskal-Wallis, Fisher's exact, Pearson correlation and chi-square tests were applied, with P < 0.05 accepted as significant. The RAS mean scores and the PSI mean scores showed statistically significant differences in terms of a midwife's considering herself a member of the health team, expressing herself within the health care team, being able to say "no" when necessary, cooperating with her colleagues, and taking part in problem-solving skills training. A statistically significant negative correlation was found between the RAS and PSI scores: the RAS scores decreased while the problem-solving scores increased (r = -0.451, P < 0.01). There were statistically significant differences between the assertiveness levels and problem-solving skills of midwives, and midwives who were assertive solved their problems better than others did. Assertiveness and problem-solving skills training will contribute to the success of the midwifery profession. Midwives able to solve problems and display assertive behaviors will contribute to the development of the midwifery profession.
Risley, John; Moradkhani, Hamid; Hay, Lauren E.; Markstrom, Steve
2011-01-01
In an earlier global climate-change study, air temperature and precipitation data for the entire twenty-first century simulated from five general circulation models were used as input to precalibrated watershed models for 14 selected basins across the United States. Simulated daily streamflow and energy output from the watershed models were used to compute a range of statistics. With a side-by-side comparison of the statistical analyses for the 14 basins, regional climatic and hydrologic trends over the twenty-first century could be qualitatively identified. Low-flow statistics (95% exceedance, 7-day mean annual minimum, and summer mean monthly streamflow) decreased for almost all basins. Annual maximum daily streamflow also decreased in all the basins, except for all four basins in California and the Pacific Northwest. An analysis of the supply of available energy and water for the basins indicated that ratios of evaporation to precipitation and potential evapotranspiration to precipitation for most of the basins will increase. Probability density functions (PDFs) were developed to assess the uncertainty and multimodality in the impact of climate change on mean annual streamflow variability. Kolmogorov–Smirnov tests showed significant differences between the beginning and ending twenty-first-century PDFs for most of the basins, with the exception of four basins that are located in the western United States. Almost none of the basin PDFs were normally distributed, and two basins in the upper Midwest had PDFs that were extremely dispersed and skewed.
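A two-sample Kolmogorov-Smirnov comparison of beginning- and end-of-century PDFs, as used in the study above, can be sketched as follows (the synthetic "annual mean streamflow" samples and their parameters are invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Invented annual mean streamflows for early- and late-century windows
early = rng.normal(loc=100.0, scale=15.0, size=50)
late = rng.normal(loc=70.0, scale=20.0, size=50)

# Two-sample KS test: maximum distance between the empirical CDFs
ks_stat, p_value = stats.ks_2samp(early, late)
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.2g}")
# A small p-value indicates the two distributions differ significantly
```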
Data from the Television Game Show "Friend or Foe?"
ERIC Educational Resources Information Center
Kalist, David E.
2004-01-01
The data discussed in this paper are from the television game show "Friend or Foe", and can be used to examine whether age, gender, race, and the amount of prize money affect contestants' strategies. The data are suitable for a variety of statistical analyses, such as descriptive statistics, testing for differences in means or proportions, and…
ERIC Educational Resources Information Center
Sidorov, Oleg V.; Kozub, Lyubov' V.; Goferberg, Alexander V.; Osintseva, Natalya V.
2018-01-01
The article discusses the methodological approach to the technology of the educational experiment performance, the ways of the research data processing by means of research methods and methods of mathematical statistics. The article shows the integrated use of some effective approaches to the training of the students majoring in…
Disability Statistics in the Developing World: A Reflection on the Meanings in Our Numbers
ERIC Educational Resources Information Center
Fujiura, Glenn T.; Park, Hye J.; Rutkowski-Kmitta, Violet
2005-01-01
Background: The imbalance between the sheer size of the developing world and what little is known about the lives and life circumstances of persons with disabilities living there should command our attention. Method: International development initiatives routinely give great priority to the collection of statistical indicators yet even the most…
ERIC Educational Resources Information Center
Mackenzie, Helen; Tolley, Harry; Croft, Tony; Grove, Michael; Lawson, Duncan
2016-01-01
This article explores the perspectives of three senior managers in higher education institutions in England regarding their mathematics and statistics support provision. It does so by means of a qualitative case study that draws upon the writing of Ronald Barnett about the identity of an "ecological" university, along with metaphors…
The Adequacy of Different Robust Statistical Tests in Comparing Two Independent Groups
ERIC Educational Resources Information Center
Pero-Cebollero, Maribel; Guardia-Olmos, Joan
2013-01-01
In the current study, we evaluated various robust statistical methods for comparing two independent groups. Two scenarios for simulation were generated: one of equality and another of population mean differences. In each of the scenarios, 33 experimental conditions were used as a function of sample size, standard deviation and asymmetry. For each…
Competitive agents in a market: Statistical physics of the minority game
NASA Astrophysics Data System (ADS)
Sherrington, David
2007-10-01
A brief review is presented of the minority game, a simple frustrated many-body system stimulated by considerations of a market of competitive speculative agents. Its cooperative behaviour exhibits phase transitions and both ergodic and non-ergodic regimes. It provides novel challenges to statistical physics, reminiscent of those of mean-field spin glasses.
When Do Students' Attitudes Change? Investigating Student Attitudes at Midterm
ERIC Educational Resources Information Center
Kerby, April T.; Wroughton, Jacqueline R.
2017-01-01
Statistics educators have been investigating how students' attitudes change in the introductory statistics course for many years. Typically, an overall decrease in mean attitudes over the course has been noted. However, when and how do students' attitudes change during the term? Do they steadily decrease or is there a point when students'…
People Patterns: Statistics. Environmental Module for Use in a Mathematics Laboratory Setting.
ERIC Educational Resources Information Center
Zastrocky, Michael; Trojan, Arthur
This module on statistics consists of 18 worksheets that cover such topics as sample spaces, mean, median, mode, taking samples, posting results, analyzing data, and graphing. The last four worksheets require the students to work with samples and use these to compare people's responses. A computer dating service is one result of this work.…
Gajski, Goran; Gerić, Marko; Oreščanin, Višnja; Garaj-Vrhovac, Vera
2013-01-20
In the present study the alkaline comet assay and the cytokinesis-block micronucleus cytome (CBMN Cyt) assay were used to evaluate the baseline frequency of cytogenetic damage in peripheral blood lymphocytes (PBLs) of 50 healthy children from the general population in Croatia (age, 11.62±1.81 years). Mean values of tail length, tail intensity and tail moment, as comet assay parameters, were 12.92±0.10, 0.73±0.06 and 0.08±0.01, respectively. The mean frequency of micronuclei (MN) for all subjects was 2.32±0.28 per 1000 bi-nucleated cells, while the mean frequency of nucleoplasmic bridges (NPBs) was 1.72±0.24 and of nuclear buds (NBUDs) 1.44±0.19. The mean nuclear division index (NDI) was 1.70±0.05. When comet-assay parameters were considered, higher mean values for all three were found for the female population. According to the Mann-Whitney U test applied on the results of the comet assay, the only statistically significant difference between the male and female populations was found for tail length. Similar to the results obtained by the comet assay, girls showed higher mean values of all three measured parameters of the CBMN Cyt assay. This difference was statistically significant for total number of NPBs only. In the case of the NDI, a higher mean value was also obtained in girls, but this difference was not statistically significant. The results obtained present background data that could be considered as normal values for healthy children living in urban areas, and can later on serve as baseline values for further toxicological monitoring. Additionally, the usefulness of both techniques in measuring cytogenetic damage during bio-monitoring of children is confirmed. Copyright © 2012 Elsevier B.V. All rights reserved.
Calculation of streamflow statistics for Ontario and the Great Lakes states
Piggott, Andrew R.; Neff, Brian P.
2005-01-01
Basic, flow-duration, and n-day frequency statistics were calculated for 779 current and historical streamflow gages in Ontario and 3,157 streamflow gages in the Great Lakes states with length-of-record daily mean streamflow data ending on December 31, 2000 and September 30, 2001, respectively. The statistics were determined using the U.S. Geological Survey’s SWSTAT and IOWDM, ANNIE, and LIBANNE software and Linux shell and PERL programming that enabled the mass processing of the data and calculation of the statistics. Verification exercises were performed to assess the accuracy of the processing and calculations. The statistics and descriptions, longitudes and latitudes, and drainage areas for each of the streamflow gages are summarized in ASCII text files and ESRI shapefiles.
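Flow-duration statistics of the kind computed above express the flow that is equalled or exceeded a given percentage of the time; a minimal sketch (the lognormal daily flows are an illustrative assumption, not USGS gage data):

```python
import numpy as np

def flow_duration(daily_flows, percents=(10, 50, 90)):
    """Flow exceeded p% of the time = the (100 - p)th percentile
    of the daily mean flows."""
    q = np.asarray(daily_flows, dtype=float)
    return {p: float(np.percentile(q, 100 - p)) for p in percents}

rng = np.random.default_rng(7)
flows = rng.lognormal(mean=2.0, sigma=0.8, size=3650)  # ten years of daily flows
fd = flow_duration(flows)
print(fd)  # Q10 (high flow) > Q50 (median flow) > Q90 (low flow)
```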
NASA Technical Reports Server (NTRS)
Goldhirsh, J.
1982-01-01
The first absolute rain fade distribution method described establishes absolute fade statistics at a given site by means of a sampled radar data base. The second method extrapolates absolute fade statistics from one location to another, given simultaneously measured fade and rain rate statistics at the former. Both methods employ similar conditional fade statistic concepts and long term rain rate distributions. Probability deviations in the 2-19% range, with an 11% average, were obtained upon comparison of measured and predicted levels at given attenuations. The extrapolation of fade distributions to other locations at 28 GHz showed very good agreement with measured data at three sites located in the continental temperate region.
Common Scientific and Statistical Errors in Obesity Research
George, Brandon J.; Beasley, T. Mark; Brown, Andrew W.; Dawson, John; Dimova, Rositsa; Divers, Jasmin; Goldsby, TaShauna U.; Heo, Moonseong; Kaiser, Kathryn A.; Keith, Scott; Kim, Mimi Y.; Li, Peng; Mehta, Tapan; Oakes, J. Michael; Skinner, Asheley; Stuart, Elizabeth; Allison, David B.
2015-01-01
We identify 10 common errors and problems in the statistical analysis, design, interpretation, and reporting of obesity research and discuss how they can be avoided. The 10 topics are: 1) misinterpretation of statistical significance, 2) inappropriate testing against baseline values, 3) excessive and undisclosed multiple testing and “p-value hacking,” 4) mishandling of clustering in cluster randomized trials, 5) misconceptions about nonparametric tests, 6) mishandling of missing data, 7) miscalculation of effect sizes, 8) ignoring regression to the mean, 9) ignoring confirmation bias, and 10) insufficient statistical reporting. We hope that discussion of these errors can improve the quality of obesity research by helping researchers to implement proper statistical practice and to know when to seek the help of a statistician. PMID:27028280
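Error 3 (excessive and undisclosed multiple testing) can be made concrete: with m independent tests at level alpha, the family-wise false-positive probability is 1 - (1 - alpha)^m, which a Bonferroni correction controls. A small sketch with made-up p-values (not from any study discussed in the paper):

```python
import numpy as np

def family_wise_error(alpha, m):
    """P(at least one false positive) among m independent tests at level alpha."""
    return 1.0 - (1.0 - alpha) ** m

def bonferroni(p_values, alpha=0.05):
    """Bonferroni correction: reject only where p < alpha / m."""
    p = np.asarray(p_values, dtype=float)
    return p < alpha / len(p)

# With 20 undisclosed tests, a spurious "significant" finding is more likely than not
print(round(family_wise_error(0.05, 20), 3))  # 0.642
# Corrected threshold is 0.05 / 3 ~= 0.0167, so only the smallest p-value survives
print(bonferroni([0.001, 0.02, 0.04]))
```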
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
An order statistics approach to the halo model for galaxies
NASA Astrophysics Data System (ADS)
Paul, Niladri; Paranjape, Aseem; Sheth, Ravi K.
2017-04-01
We use the halo model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models - one in which this luminosity function p(L) is universal - naturally produces a number of features associated with previous analyses based on the 'central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the lognormal distribution around this mean and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts no luminosity dependence of large-scale clustering. We then show that an extended version of this model, based on the order statistics of a halo mass dependent luminosity function p(L|m), is in much better agreement with the clustering data as well as satellite luminosities, but systematically underpredicts central luminosities. This brings into focus the idea that central galaxies constitute a distinct population that is affected by different physical processes than are the satellites. We model this physical difference as a statistical brightening of the central luminosities, over and above the order statistics prediction. The magnitude gap between the brightest and second brightest group galaxy is predicted as a by-product, and is also in good agreement with observations. We propose that this order statistics framework provides a useful language in which to compare the halo model for galaxies with more physically motivated galaxy formation models.
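The basic order-statistics idea, that the brightest of n draws from a universal p(L) grows systematically with n, can be sketched numerically (the unit exponential p(L) is a toy choice for illustration, not the paper's luminosity function):

```python
import numpy as np

def brightest_of_n(n_draws, n_groups, rng):
    """Draw n_draws 'luminosities' per group from a common p(L)
    (a unit exponential here) and keep the brightest in each group."""
    L = rng.exponential(scale=1.0, size=(n_groups, n_draws))
    return L.max(axis=1)

# For a unit exponential, E[max of n draws] is the harmonic number H_n, so the
# expected brightest-galaxy luminosity rises with group richness by order
# statistics alone, mimicking a central-luminosity vs halo-mass relation
rng = np.random.default_rng(5)
for n in (1, 5, 25):
    print(n, round(brightest_of_n(n, 100000, rng).mean(), 2))
```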
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
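The between-groups standardised mean difference referred to here is the mean difference divided by the pooled standard deviation; a minimal sketch with invented group scores (this is the usual between-groups d, not the authors' single-case SPSS macros):

```python
import numpy as np

def cohen_d(group1, group2):
    """Between-groups standardised mean difference:
    d = (mean1 - mean2) / pooled standard deviation."""
    x, y = np.asarray(group1, float), np.asarray(group2, float)
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

treatment = [8, 9, 7, 10, 9, 8]  # invented outcome scores
control = [5, 6, 7, 5, 6, 4]
print(round(cohen_d(treatment, control), 2))  # 2.86
```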
Detector noise statistics in the non-linear regime
NASA Technical Reports Server (NTRS)
Shopbell, P. L.; Bland-Hawthorn, J.
1992-01-01
The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
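The variance reduction from clipping described above can be demonstrated with a simulated Poisson photon stream (the count rate and saturation level are illustrative assumptions, not a real detector's parameters):

```python
import numpy as np

rng = np.random.default_rng(3)
mean_counts = 1000.0
photons = rng.poisson(mean_counts, size=200000).astype(float)

saturation = 1030.0  # hypothetical saturation (full-well) level
clipped = np.minimum(photons, saturation)

# Clipping removes part of the upper tail of the noise distribution, so the
# variance falls below the Poisson expectation (variance = mean) that holds
# in the linear regime, and the distribution becomes negatively skewed
print(photons.var() / mean_counts)  # close to 1.0 in the linear regime
print(clipped.var() / mean_counts)  # noticeably below 1.0 near saturation
```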
Code of Federal Regulations, 2011 CFR
2011-07-01
....g., “youth,” “juvenile,” “adult,” “older persons,” but not “student”). Department means the... National Institute of Justice, the Bureau of Justice Statistics, and the Office of Juvenile Justice and Delinquency Prevention; OJP includes the Office for Victims of Crime. Program or activity means all of the...
The Implicit Learning of Mappings between Forms and Contextually Derived Meanings
ERIC Educational Resources Information Center
Leung, Janny H. C.; Williams, John N.
2011-01-01
The traditional implicit learning literature has focused primarily on the abstraction of statistical regularities in form-form connections. More attention has been recently directed toward the implicit learning of form-meaning connections, which might be crucial in the acquisition of natural languages. The current article reports evidence for…
Sensitivity of Fit Indices to Misspecification in Growth Curve Models
ERIC Educational Resources Information Center
Wu, Wei; West, Stephen G.
2010-01-01
This study investigated the sensitivity of fit indices to model misspecification in within-individual covariance structure, between-individual covariance structure, and marginal mean structure in growth curve models. Five commonly used fit indices were examined, including the likelihood ratio test statistic, root mean square error of…
Child Sustained Attention in Preschool-Age Children
ERIC Educational Resources Information Center
DiCarlo, Cynthia F.; Baumgartner, Jennifer J.; Ota, Carrie; Geary, Kelly
2016-01-01
This study examined the mean duration of child attention across three teaching conditions (child choice, adult choice, or adult presentation) of 63 preschool-age children. A repeated-measures ANOVA was used to compare the means across the three teaching conditions, indicating a statistically significant difference between the teaching conditions.…
Code of Federal Regulations, 2012 CFR
2012-07-01
....g., “youth,” “juvenile,” “adult,” “older persons,” but not “student”). Department means the... National Institute of Justice, the Bureau of Justice Statistics, and the Office of Juvenile Justice and Delinquency Prevention; OJP includes the Office for Victims of Crime. Program or activity means all of the...
Code of Federal Regulations, 2013 CFR
2013-07-01
....g., “youth,” “juvenile,” “adult,” “older persons,” but not “student”). Department means the... National Institute of Justice, the Bureau of Justice Statistics, and the Office of Juvenile Justice and Delinquency Prevention; OJP includes the Office for Victims of Crime. Program or activity means all of the...
29 CFR 1910.1200 - Hazard communication.
Code of Federal Regulations, 2010 CFR
2010-07-01
... statistically significant evidence based on at least one study conducted in accordance with established... less than one percent (or in the case of carcinogens, less than 0.1 percent) could be released in....) Flammable means a chemical that falls into one of the following categories: (i) Aerosol, flammable means an...
Cognitive Development of Severely and Profoundly Mentally Retarded Individuals.
ERIC Educational Resources Information Center
Silverstein, A. B.; And Others
1982-01-01
H. Corman and S. Escalona's scales for object permanence and spatial relationships were readministered to 71 severely and profoundly mentally retarded individuals (mean age 19 years) five years after the last previous administration of the scales. Gains in mean scores were small but statistically significant for both scales. (Author)
Vital Statistics for Ohio Appalachian School Districts, Fiscal Year 1999.
ERIC Educational Resources Information Center
Ohio Univ., Athens. Coalition of Rural and Appalachian Schools.
This document compiles school district data on 18 factors for the 29 southeastern Ohio counties designated as "Appalachian." Data tables present state means, Appalachian means and ranges, and individual district data for fall enrollment; percentage of minority students; percentage of Aid to Dependent Children; average income; property…
Engler-Hamm, Daniel; Cheung, Wai S; Yen, Alec; Stark, Paul C; Griffin, Terrence
2011-03-01
The aim of this single-masked, randomized controlled clinical trial is to compare hard and soft tissue changes after ridge preservation performed with (control, RPc) and without (test, RPe) primary soft tissue closure in a split-mouth design. Eleven patients completed this 6-month trial. Extraction and ridge preservation were performed using a composite bone graft of inorganic bovine-derived hydroxyapatite matrix and cell binding peptide P-15 (ABM/P-15), demineralized freeze-dried bone allograft, and a copolymer bioabsorbable membrane. Primary wound closure was achieved on the control sites (RPc), whereas test sites (RPe) left the membrane exposed. Pocket probing depth on adjacent teeth, repositioning of the mucogingival junction, bone width, bone fill, and postoperative discomfort were assessed. Bone cores were obtained for histological examination. Intragroup analyses for both groups demonstrated statistically significant mean reductions in probing depth (RPc: 0.42 mm, P = 0.012; RPe: 0.25 mm, P = 0.012) and bone width (RPc: 3 mm, P = 0.002; RPe: 3.42 mm, P <0.001). However, intergroup analysis did not find these parameters to be statistically different at 6 months. The test group showed statistically significant mean change in bone fill (7.21 mm; P <0.001). Compared to the control group, the test group showed statistically significant lower mean postoperative discomfort (RPc 4 versus RPe 2; P = 0.002). Histomorphometric analysis showed presence of 0% to 40% of ABM/P-15 and 5% to 20% of new bone formation in both groups. Comparison of clinical variables between the two groups at 6 months revealed that the mucogingival junction was statistically significantly more coronally displaced in the control group than in the test group, with a mean of 3.83 mm versus 1.21 mm (P = 0.002). Ridge preservation without flap advancement preserves more keratinized tissue and has less postoperative discomfort and swelling. 
With either ridge preservation method, however, approximately 27% to 30% of bone width is lost.
Comparative efficacy of two battery-powered toothbrushes on dental plaque removal.
Ruhlman, C Douglas; Bartizek, Robert D; Biesbrock, Aaron R
2002-01-01
A number of clinical studies have consistently demonstrated that power toothbrushes deliver superior plaque removal compared to manual toothbrushes. Recently, a new power toothbrush (Crest SpinBrush) has been marketed with a design that fundamentally differs from other marketed power toothbrushes. Other power toothbrushes feature a small, round head designed to oscillate for enhanced cleaning between the teeth and below the gumline. The new power toothbrush incorporates a similar round oscillating head in conjunction with fixed bristles, which allows the user to brush with optimal manual brushing technique. The objective of this randomized, examiner-blind, parallel design study was to compare the plaque removal efficacy of a positive control power toothbrush (Colgate Actibrush) to an experimental toothbrush (Crest SpinBrush) following a single use among 59 subjects. Baseline plaque scores were 1.64 and 1.40 for the experimental toothbrush and control toothbrush treatment groups, respectively. With regard to all surfaces examined, the experimental toothbrush delivered an adjusted (via analysis of covariance) mean difference between baseline and post-brushing plaque scores of 0.47, while the control toothbrush delivered an adjusted mean difference of 0.33. On average, the difference between toothbrushes was statistically significant (p = 0.013). Because the covariate slope for the experimental group was statistically significantly greater (p = 0.001) than the slope for the control group, a separate slope model was used. Further analysis demonstrated that the experimental group had statistically significantly greater plaque removal than the control group for baseline plaque scores above 1.43. With respect to buccal surfaces, using a separate slope analysis of covariance, the experimental toothbrush delivered an adjusted mean difference between baseline and post-brushing plaque scores of 0.61, while the control toothbrush delivered an adjusted mean difference of 0.39. 
This difference between toothbrushes was also statistically significant (p = 0.002). On average, the results on lingual surfaces demonstrated similar directional scores favoring the experimental toothbrush; however, these results did not achieve statistical significance. In conclusion, the experimental Crest SpinBrush, with its novel fixed and oscillating bristle design, was found to be more effective than the positive control Colgate Actibrush, which is designed with a small round oscillating cluster of bristles.
Rickmann, Annekatrin; Opitz, Natalia; Szurman, Peter; Boden, Karl Thomas; Jung, Sascha; Wahl, Silke; Haus, Arno; Damm, Lara-Jil; Januschowski, Kai
2018-01-01
Descemet membrane endothelial keratoplasty (DMEK) has been improved over the last decade. The aim of this study was to compare the clinical outcome of the recently introduced liquid bubble method with that of standard manual preparation. This retrospective study evaluated the outcome of 200 patients after DMEK surgery using two different graft preparation techniques. Ninety-six DMEK grafts were prepared by manual dissection and 104 by the novel liquid bubble technique. The mean follow-up time was 13.7 months (SD ± 8, range 6-36 months). Best corrected mean visual acuity (BCVA) improved statistically significantly for all patients, from 0.85 logMAR (SD ± 0.5) at baseline to 0.26 logMAR (SD ± 0.27) at the final follow-up (Wilcoxon, p = 0.001). Subgroup analyses of BCVA at the final follow-up between manual dissection and liquid bubble preparation showed no statistically significant difference (Mann-Whitney U test, p = 0.64). The mean central corneal thickness was not statistically different between the two groups (manual dissection: 539 µm, SD ± 68 µm; liquid bubble technique: 534 µm, SD ± 52 µm; Mann-Whitney U test, p = 0.64). At the final follow-up, the mean endothelial cell count of donor grafts was not statistically significantly different: 1761 cells/mm² (-30.7%, SD ± 352) for manual dissection compared with 1749 cells/mm² (-29.9%, SD ± 501) for the liquid bubble technique (Mann-Whitney U test, p = 0.73). The re-DMEK rate was comparable: 8 cases (8.3%) for manual dissection and 7 cases (6.7%) for liquid bubble dissection (p = 0.69, chi-square test). Regarding the clinical outcome, we did not find a statistically significant difference between manual dissection and liquid bubble graft preparation. Both preparation techniques lead to an equivalent clinical outcome after DMEK surgery.
Paddock, Michael T; Bailitz, John; Horowitz, Russ; Khishfe, Basem; Cosby, Karen; Sergel, Michelle J
2015-03-01
Pre-hospital focused assessment with sonography in trauma (FAST) has been effectively used to improve patient care in multiple mass casualty events throughout the world. Although requisite FAST knowledge may now be learned remotely by disaster response team members, traditional live instructor and model hands-on FAST skills training remains logistically challenging. The objective of this pilot study was to compare the effectiveness of a novel portable ultrasound (US) simulator with traditional FAST skills training for a deployed mixed provider disaster response team. We randomized participants into one of three training groups stratified by provider role: Group A. Traditional Skills Training, Group B. US Simulator Skills Training, and Group C. Traditional Skills Training Plus US Simulator Skills Training. After skills training, we measured participants' FAST image acquisition and interpretation skills using a standardized direct observation tool (SDOT) with healthy models and review of FAST patient images. Pre- and post-course US and FAST knowledge were also assessed using a previously validated multiple-choice evaluation. We used the ANOVA procedure to determine the statistical significance of differences between the means of each group's skills scores. Paired sample t-tests were used to determine the statistical significance of pre- and post-course mean knowledge scores within groups. We enrolled 36 participants, 12 randomized to each training group. Randomization resulted in similar distribution of participants between training groups with respect to provider role, age, sex, and prior US training. For the FAST SDOT image acquisition and interpretation mean skills scores, there was no statistically significant difference between training groups. 
For US and FAST mean knowledge scores, there was a statistically significant improvement between pre- and post-course scores within each group, but again there was not a statistically significant difference between training groups. This pilot study of a deployed mixed-provider disaster response team suggests that a novel portable US simulator may provide equivalent skills training in comparison to traditional live instructor and model training. Further studies with a larger sample size and other measures of short- and long-term clinical performance are warranted.
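The statistical workflow this abstract describes (a between-group one-way ANOVA on skills scores plus paired t-tests on pre/post knowledge scores) can be sketched as follows; all scores below are simulated for illustration and are not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated skills scores for the three training groups (n = 12 each);
# the groups share the same true mean, mirroring the "no difference" finding.
group_a = rng.normal(80, 5, 12)  # traditional training
group_b = rng.normal(80, 5, 12)  # US simulator training
group_c = rng.normal(80, 5, 12)  # traditional + simulator

# One-way ANOVA for between-group differences in mean skills scores
f_stat, p_between = stats.f_oneway(group_a, group_b, group_c)

# Paired t-test for within-group pre- vs post-course knowledge improvement
pre = rng.normal(60, 8, 12)
post = pre + rng.normal(10, 4, 12)  # simulated gain after training
t_stat, p_within = stats.ttest_rel(pre, post)

print(f"between groups: F = {f_stat:.2f}, p = {p_between:.3f}")
print(f"pre vs post:    t = {t_stat:.2f}, p = {p_within:.4f}")
```

With a genuine pre-to-post gain and identical group means, the paired test detects improvement while the ANOVA does not, matching the pattern of results reported above.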
A critical evaluation and a search for the ideal colonoscopic preparation.
Arora, Manish; Senadhi, Viplove; Arora, Deepika; Weinstock, Joyce; Dubin, Ethan; Okolo, Patrick I; Dutta, Sudhir K
2013-04-01
The aim of this study was to evaluate the efficacy of various bowel preparations in accomplishing colonic cleansing for optimal mucosal visualization during colonoscopy. The study included a cohort of 980 patients who underwent colonoscopy at our endoscopy center within the last 3 years. All of the study patients were subdivided into four groups. Each group included 245 patients, all receiving a different type of bowel preparation. The bowel preparations used in this study were: magnesium citrate (Group I), a combination of oral sodium phosphate (fleets) and powder PEG-3350 (Group II), powder polyethylene glycol-3350 (PEG-3350, Group III), and oral sodium phosphate (fleets, Group IV). A Colon Prep Score (CPS) was devised to compare the quality of the different bowel preparations used. The colonoscopy results from all of these patients were tabulated, analyzed statistically, and expressed as mean ± 1 standard deviation. Statistical analysis was performed using a one-way ANOVA with the Holm-Sidak method for intergroup analysis. Group I patients received magnesium citrate and had a mean CPS ± 1 SD of 3.11 ± 0.91. Group II patients (fleets and powder PEG-3350 combination) achieved a CPS of 3.37 ± 1.16. The patients in Group III (powder PEG-3350) showed the highest mean CPS of 3.44 ± 1.12. Group IV patients who used oral sodium phosphate alone reached a mean CPS of 3.23 ± 1.01. Group III patients (powder PEG-3350 only) demonstrated a statistically higher CPS (P<0.0006) in colon cleansing as compared to Group I patients (magnesium citrate). Similarly, Group II patients (oral sodium phosphate and powder PEG-3350 combination) also showed statistically improved colon cleansing (P<0.006) as compared to Group I patients (magnesium citrate). Overall, all four colon preparations achieved an average CPS greater than 3.0, indicating clinically adequate colonic cleansing.
However, powder PEG-3350 alone and in combination with oral sodium phosphate was observed to be statistically superior to magnesium citrate, when used for colon preparation for colonoscopy. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
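The Holm-Sidak step-down procedure used above for intergroup comparisons is straightforward to implement directly: sort the raw p-values and apply the Sidak correction for the number of hypotheses still in play at each rank. The sketch below is generic, and the p-values in it are invented, not the study's.

```python
import numpy as np

def holm_sidak(pvals):
    """Holm-Sidak step-down adjustment of a set of raw p-values."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    adj = np.empty(m)
    running_max = 0.0
    for rank, idx in enumerate(order):
        # Sidak correction for the m - rank hypotheses not yet rejected
        a = 1.0 - (1.0 - p[idx]) ** (m - rank)
        running_max = max(running_max, a)  # enforce monotonicity
        adj[idx] = min(running_max, 1.0)
    return adj

# Illustrative raw p-values for four pairwise group comparisons
raw = [0.0006, 0.006, 0.20, 0.45]
print(holm_sidak(raw))
```

An adjusted value below 0.05 rejects the corresponding null hypothesis while controlling the family-wise error rate across all comparisons.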
el Galta, Rachid; Uitte de Willige, Shirley; de Visser, Marieke C H; Helmer, Quinta; Hsu, Li; Houwing-Duistermaat, Jeanine J
2007-09-24
In this paper, we propose a one degree of freedom test for association between a candidate gene and a binary trait. This method is a generalization of Terwilliger's likelihood ratio statistic and is especially powerful for the situation of one associated haplotype. As an alternative to the likelihood ratio statistic, we derive a score statistic, which has a tractable expression. For haplotype analysis, we assume that phase is known. By means of a simulation study, we compare the performance of the score statistic to Pearson's chi-square statistic and the likelihood ratio statistic proposed by Terwilliger. We illustrate the method on three candidate genes studied in the Leiden Thrombophilia Study. We conclude that the statistic follows a chi-square distribution under the null hypothesis and that the score statistic is more powerful than Terwilliger's likelihood ratio statistic when the associated haplotype has a frequency between 0.1 and 0.4 and has a small impact on the studied disorder. Compared with Pearson's chi-square statistic, the score statistic has more power when the associated haplotype has a frequency above 0.2 and the number of variants is above five.
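As a point of reference for comparisons like the one above, Pearson's chi-square statistic on a case-control table of haplotype counts carries h − 1 degrees of freedom for h haplotypes, unlike the paper's 1-df score statistic. A minimal sketch with invented counts (not the Leiden Thrombophilia data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical haplotype counts: rows are case/control, columns are haplotypes
counts = np.array([[90, 60, 50],    # cases
                   [70, 80, 50]])   # controls

chi2, p, dof, expected = chi2_contingency(counts)
print(f"Pearson chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```

With three haplotypes the test spends 2 degrees of freedom; a test that targets a single associated haplotype, like the score statistic described above, spends only one, which is the source of its power advantage in that scenario.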
Robustness of S1 statistic with Hodges-Lehmann for skewed distributions
NASA Astrophysics Data System (ADS)
Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping
2016-10-01
Analysis of variance (ANOVA) is a commonly used parametric method to test differences in means for more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When the assumptions are violated, researchers look for alternatives such as the nonparametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the median with the Hodges-Lehmann estimator and the default scale estimator with the variance of Hodges-Lehmann and MADn, producing two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA, and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
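The Hodges-Lehmann estimator substituted into the S1 statistic is the median of all pairwise Walsh averages, and the percentile bootstrap is one standard way to proceed when a sampling distribution is unknown. A minimal sketch on a simulated skewed sample (illustrative only; this is not the authors' simulation design):

```python
import numpy as np

def hodges_lehmann(x):
    """One-sample Hodges-Lehmann estimator: median of the Walsh averages."""
    x = np.asarray(x, dtype=float)
    i, j = np.triu_indices(len(x))          # all pairs with i <= j
    return np.median((x[i] + x[j]) / 2.0)

rng = np.random.default_rng(1)
sample = rng.chisquare(df=3, size=50)       # a right-skewed sample

# Percentile-bootstrap distribution of the HL estimator
boot = np.array([hodges_lehmann(rng.choice(sample, size=len(sample), replace=True))
                 for _ in range(1000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"HL estimate = {hodges_lehmann(sample):.3f}, "
      f"95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```

For skewed data the Hodges-Lehmann estimate sits between the median and the mean, which is why it is an attractive location estimator in the robust procedures described above.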
A statistical evaluation and comparison of VISSR Atmospheric Sounder (VAS) data
NASA Technical Reports Server (NTRS)
Jedlovec, G. J.
1984-01-01
In order to account for the temporal and spatial discrepancies between the VAS and rawinsonde soundings, the rawinsonde data were adjusted to a common hour of release where the new observation time corresponded to the satellite scan time. Both the satellite and rawinsonde observations of the basic atmospheric parameters (T, Td, and Z) were objectively analyzed to a uniform grid maintaining the same mesoscale structure in each data set. The performance of each retrieval algorithm in producing accurate and representative soundings was evaluated using statistical parameters such as the mean, standard deviation, and root mean square of the difference fields for each parameter and grid level. Horizontal structure was also qualitatively evaluated by examining atmospheric features on constant pressure surfaces. An analysis of the vertical structure of the atmosphere was also performed by looking at colocated and grid mean vertical profiles of both the satellite and rawinsonde data sets. Highlights of these results are presented.
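The evaluation statistics named above (mean, standard deviation, and root mean square of the satellite-minus-rawinsonde difference fields) can be sketched on synthetic grids; the fields below are simulated, not VAS data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical gridded temperature fields (K): rawinsonde analysis and a
# satellite retrieval with a small systematic bias plus random error
rawinsonde = rng.normal(280, 5, (20, 20))
satellite = rawinsonde + rng.normal(0.5, 1.5, (20, 20))

diff = satellite - rawinsonde
bias = diff.mean()                    # systematic (mean) difference
std = diff.std(ddof=1)                # random scatter of the differences
rms = np.sqrt((diff ** 2).mean())     # total difference magnitude
print(f"bias = {bias:.2f} K, std = {std:.2f} K, rms = {rms:.2f} K")
```

Note the identity rms² = bias² + σ² (with the population standard deviation): the rms difference bundles systematic and random error together, which is why the three statistics are reported side by side.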
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brizzee, K.R.; Ordy, J.M.; Kaack, M.B.
1980-09-01
Five squirrel monkeys were exposed to 200 rads whole-body ionizing irradiation (⁶⁰Co) at 0.4 rads per second on approximately the seventy-fifth day of gestation, and six squirrel monkeys were sham-irradiated. The mean cortical depth and the mean number of neurons per mm³ in the visual cortex were less in irradiated animals than in controls, but the differences were not statistically significant. The mean number of glial cells in this cortical region was significantly lower in the irradiated animals. In the hippocampus, the depth of the stratum oriens and the combined depth of the strata radiatum, lacunosum, and moleculare were significantly less in irradiated than in control animals. Canonical correlations provided statistical evidence for greater radiation vulnerability of the hippocampus compared to motor and visual areas of the cerebral cortex.
Impact of Damping Uncertainty on SEA Model Response Variance
NASA Technical Reports Server (NTRS)
Schiller, Noah; Cabell, Randolph; Grosveld, Ferdinand
2010-01-01
Statistical Energy Analysis (SEA) is commonly used to predict high-frequency vibroacoustic levels. This statistical approach provides the mean response over an ensemble of random subsystems that share the same gross system properties such as density, size, and damping. Recently, techniques have been developed to predict the ensemble variance as well as the mean response. However, these techniques do not account for uncertainties in the system properties. In the present paper, uncertainty in the damping loss factor is propagated through SEA to obtain more realistic prediction bounds that account for both ensemble and damping variance. The analysis is performed on a floor-equipped cylindrical test article that resembles an aircraft fuselage. Realistic bounds on the damping loss factor are determined from measurements acquired on the sidewall of the test article. The analysis demonstrates that uncertainties in damping have the potential to significantly impact the mean and variance of the predicted response.
Asquith, William H.; Vrabel, Joseph; Roussel, Meghan C.
2007-01-01
The U.S. Geological Survey (USGS), in cooperation with numerous Federal, State, municipal, and local agencies, currently (2007) collects data for more than 120 lakes and reservoirs in Texas through a real-time data-collection network. The National Water Information System that processes and archives water-resources data for the Nation provides a central source for retrieval of real-time as well as historical data. This report provides a brief description of the real-time data-collection network and graphically summarizes the period-of-record daily mean water-surface elevations for 116 active and discontinued USGS lake and reservoir stations in Texas. The report also graphically depicts selected statistics (minimum, maximum, and mean) of daily mean water-surface-elevation data. The data for water year 2006 are compared to the selected statistics.
Cape Canaveral, Florida range reference atmosphere 0-70 km altitude
NASA Technical Reports Server (NTRS)
Tingle, A. (Editor)
1983-01-01
The RRA contains tabulations for monthly and annual means, standard deviations, and skewness coefficients for wind speed, pressure, temperature, density, water vapor pressure, virtual temperature, and dew-point temperature, and the means and standard deviations for the zonal and meridional wind components and the linear (product moment) correlation coefficient between the wind components. These statistical parameters are tabulated at the station elevation and at 1 km intervals from sea level to 30 km and at 2 km intervals from 30 to 90 km altitude. The wind statistics are given at approximately 10 m above the station elevations and at altitudes with respect to mean sea level thereafter. For those range sites without rocketsonde measurements, the RRAs terminate at 30 km altitude or they are extended, if required, when rocketsonde data from a nearby launch site are available. There are four sets of tables for each of the 12 monthly reference periods and the annual reference period.
There’s plenty of light at the bottom: statistics of photon penetration depth in random media
Martelli, Fabrizio; Binzoni, Tiziano; Pifferi, Antonio; Spinelli, Lorenzo; Farina, Andrea; Torricelli, Alessandro
2016-01-01
We propose a comprehensive statistical approach describing the penetration depth of light in random media. The presented theory exploits the concept of probability density function f(z|ρ, t) for the maximum depth reached by the photons that are eventually re-emitted from the surface of the medium at distance ρ and time t. Analytical formulas for f, for the mean maximum depth 〈zmax〉 and for the mean average depth reached by the detected photons at the surface of a diffusive slab are derived within the framework of the diffusion approximation to the radiative transfer equation, both in the time domain and the continuous wave domain. Validation of the theory by means of comparisons with Monte Carlo simulations is also presented. The results are of interest for many research fields such as biomedical optics, advanced microscopy and disordered photonics. PMID:27256988
Barbie, Dana L.; Wehmeyer, Loren L.
2012-01-01
Trends in selected streamflow statistics during 1922-2009 were evaluated at 19 long-term streamflow-gaging stations considered indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico. The U.S. Geological Survey, in cooperation with the Texas Water Development Board, evaluated streamflow data from streamflow-gaging stations with more than 50 years of record that were active as of 2009. The outflows into Arkansas and Louisiana were represented by 3 streamflow-gaging stations, and outflows into the Gulf of Mexico, including Galveston Bay, were represented by 16 streamflow-gaging stations. Monotonic trend analyses were done using the following three streamflow statistics generated from daily mean values of streamflow: (1) annual mean daily discharge, (2) annual maximum daily discharge, and (3) annual minimum daily discharge. The trend analyses were based on the nonparametric Kendall's Tau test, which is useful for the detection of monotonic upward or downward trends with time. A total of 69 trend analyses by Kendall's Tau were computed: 19 periods of streamflow multiplied by the 3 streamflow statistics, plus 12 additional trend analyses because the periods of record for 2 streamflow-gaging stations were divided into periods representing pre- and post-reservoir impoundment. Unless otherwise described, each trend analysis used the entire period of record for each streamflow-gaging station. The monotonic trend analysis detected 11 statistically significant downward trends, 37 instances of no trend, and 21 statistically significant upward trends. One general region studied, which seemingly has relatively more upward trends for many of the streamflow statistics analyzed, includes the rivers and associated creeks and bayous to Galveston Bay in the Houston metropolitan area.
Lastly, the most western river basins considered (the Nueces and Rio Grande) had statistically significant downward trends for many of the streamflow statistics analyzed.
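The monotonic trend test used throughout this analysis, Kendall's Tau of a streamflow statistic against time, can be sketched as follows; the discharge series below is simulated with an imposed downward trend and is not gaged data.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(2)
years = np.arange(1922, 2010)

# Hypothetical annual mean daily discharge: downward trend plus noise
flow = 500.0 - 2.0 * (years - years[0]) + rng.normal(0, 40, len(years))

tau, p = kendalltau(years, flow)
trend = "downward trend" if (tau < 0 and p < 0.05) else "no significant trend"
print(f"Kendall's tau = {tau:.2f}, p = {p:.4f} -> {trend}")
```

Because the test uses only the ranks of the observations, it detects monotonic trends without assuming linearity or normally distributed residuals, which suits skewed hydrologic records.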
Aging Affects Adaptation to Sound-Level Statistics in Human Auditory Cortex.
Herrmann, Björn; Maess, Burkhard; Johnsrude, Ingrid S
2018-02-21
Optimal perception requires efficient and adaptive neural processing of sensory input. Neurons in nonhuman mammals adapt to the statistical properties of acoustic feature distributions such that they become sensitive to sounds that are most likely to occur in the environment. However, whether human auditory responses adapt to stimulus statistical distributions and how aging affects adaptation to stimulus statistics is unknown. We used MEG to study how exposure to different distributions of sound levels affects adaptation in auditory cortex of younger (mean: 25 years; n = 19) and older (mean: 64 years; n = 20) adults (male and female). Participants passively listened to two sound-level distributions with different modes (either 15 or 45 dB sensation level). In a control block with long interstimulus intervals, allowing neural populations to recover from adaptation, neural response magnitudes were similar between younger and older adults. Critically, both age groups demonstrated adaptation to sound-level stimulus statistics, but adaptation was altered for older compared with younger people: in the older group, neural responses continued to be sensitive to sound level under conditions in which responses were fully adapted in the younger group. The lack of full adaptation to the statistics of the sensory environment may be a physiological mechanism underlying the known difficulty that older adults have with filtering out irrelevant sensory information. SIGNIFICANCE STATEMENT Behavior requires efficient processing of acoustic stimulation. Animal work suggests that neurons accomplish efficient processing by adjusting their response sensitivity depending on statistical properties of the acoustic environment. Little is known about the extent to which this adaptation to stimulus statistics generalizes to humans, particularly to older humans. We used MEG to investigate how aging influences adaptation to sound-level statistics. 
Listeners were presented with sounds drawn from sound-level distributions with different modes (15 vs 45 dB). Auditory cortex neurons adapted to sound-level statistics in younger and older adults, but adaptation was incomplete in older people. The data suggest that the aging auditory system does not fully capitalize on the statistics available in sound environments to tune the perceptual system dynamically. Copyright © 2018 the authors 0270-6474/18/381989-11$15.00/0.
Tang, An; Chen, Joshua; Le, Thuy-Anh; Changchien, Christopher; Hamilton, Gavin; Middleton, Michael S.; Loomba, Rohit; Sirlin, Claude B.
2014-01-01
Purpose: To explore the cross-sectional and longitudinal relationships between fractional liver fat content, liver volume, and total liver fat burden. Methods: In 43 adults with non-alcoholic steatohepatitis participating in a clinical trial, liver volume was estimated by segmentation of magnitude-based low-flip-angle multiecho GRE images. The liver mean proton density fat fraction (PDFF) was calculated. The total liver fat index (TLFI) was estimated as the product of liver mean PDFF and liver volume. Linear regression analyses were performed. Results: Cross-sectional analyses revealed statistically significant relationships between TLFI and liver mean PDFF (R² = 0.740 baseline/0.791 follow-up, P < 0.001 baseline/P < 0.001 follow-up), and between TLFI and liver volume (R² = 0.352/0.452, P < 0.001/< 0.001). Longitudinal analyses revealed statistically significant relationships between liver volume change and liver mean PDFF change (R² = 0.556, P < 0.001), between TLFI change and liver mean PDFF change (R² = 0.920, P < 0.001), and between TLFI change and liver volume change (R² = 0.735, P < 0.001). Conclusion: Liver segmentation in combination with MRI-based PDFF estimation may be used to monitor liver volume, liver mean PDFF, and TLFI in a clinical trial. PMID:25015398
Romero, Daniela C; Sauris, Aileen; Rodriguez, Fátima; Delgado, Daniela; Reddy, Ankita; Foody, JoAnne M
2016-03-01
Hispanic women suffer from high rates of cardiometabolic risk factors and an increasingly disproportionate burden of cardiovascular disease (CVD). In particular, Hispanic women with limited English proficiency suffer from low levels of CVD knowledge, which are associated with adverse CVD health outcomes. Thirty-two predominantly Spanish-speaking Hispanic women completed Vivir Con un Corazón Saludable (VCUCS), a culturally tailored, Spanish language-based, 6-week intensive community program targeting CVD health knowledge through weekly interactive health sessions. A 30-question CVD knowledge questionnaire was used to assess mean changes in CVD knowledge at baseline and postintervention across five major knowledge domains: CVD epidemiology, dietary knowledge, medical information, risk factors, and heart attack symptoms. Completion of the program was associated with a statistically significant (p < 0.001) increase in total mean CVD knowledge scores from 39% (mean 11.7/30.0) to 66% (mean 19.8/30.0) postintervention, consistent with a 68% increase in overall mean CVD scores. There was a statistically significant (p < 0.001) increase in mean knowledge scores across all five CVD domains. A culturally tailored, Spanish language-based health program is effective in increasing CVD awareness among high-CVD-risk Hispanic women with low English proficiency and low baseline CVD knowledge.
Drinking water fluoride and blood pressure? An environmental study.
Amini, Hassan; Taghavi Shahri, Seyed Mahmood; Amini, Mohamad; Ramezani Mehrian, Majid; Mokhayeri, Yaser; Yunesian, Masud
2011-12-01
The relationship between intake of fluoride (F) from drinking water and blood pressure has not yet been reported. We examined the relationship of F in ground water resources (GWRs) of Iran with the blood pressure of the Iranian population in an ecologic study. The mean F data of the GWRs (as a surrogate for F levels in drinking water) were derived from a previously conducted study. The hypertension prevalence and the mean systolic and diastolic blood pressures (SBP and DBP) of the Iranian population by province and gender were derived from the provincial report of non-communicable disease risk factor surveillance of Iran. Statistically significant positive correlations were found between the mean concentrations of F in the GWRs and the hypertension prevalence of males (r = 0.48, p = 0.007), females (r = 0.36, p = 0.048), and overall (r = 0.495, p = 0.005). Statistically significant positive correlations were also found between the mean concentrations of F in the GWRs and the mean SBP of males (r = 0.431, p = 0.018), with a borderline correlation for females (r = 0.352, p = 0.057). In conclusion, hypertension prevalence and mean SBP increased with increasing F levels in the GWRs of the Iranian population.
NASA Astrophysics Data System (ADS)
Lucarini, Valerio; Russell, Gary L.
2002-08-01
Results are presented for two greenhouse gas experiments of the Goddard Institute for Space Studies atmosphere-ocean model (AOM). The computed trends of surface pressure; surface temperature; 850, 500, and 200 mbar geopotential heights; and related temperatures of the model for the time frame 1960-2000 are compared with those obtained from the National Centers for Environmental Prediction (NCEP) observations. The domain of interest is the Northern Hemisphere because of the higher reliability of both the model results and the observations. A spatial correlation analysis and a mean value comparison are performed, showing good agreement in terms of statistical significance for most of the variables considered in the winter and annual means. However, the 850 mbar temperature trends do not show significant positive correlation, and the confidence intervals of the surface pressure and 850 mbar geopotential height mean trends do not overlap. A brief general discussion about the statistics of trend detection is presented. The accuracy that this AOM has in describing the regional and NH mean climate trends inferred from NCEP through the atmosphere suggests that it may be reliable in forecasting future climate changes.
Insulin-like growth factor I: a biologic maturation indicator.
Ishaq, Ramy Abdul Rahman; Soliman, Sanaa Abou Zeid; Foda, Manal Yehya; Fayed, Mona Mohamed Salah
2012-11-01
Determination of the maturation level and the subsequent evaluation of growth potential during preadolescence and adolescence are important for optimal orthodontic treatment planning and timing. This study was undertaken to evaluate the applicability of insulin-like growth factor I (IGF-I) blood level as a maturation indicator by correlating it to the cervical vertebral maturation index. The study was conducted with 120 subjects, equally divided into 60 males (ages, 10-18 years) and 60 females (ages, 8-16 years). A lateral cephalometric radiograph and a blood sample were taken from each subject. For each subject, cervical vertebral maturation and IGF-I serum level were assessed. Mean values of IGF-I in each stage of cervical vertebral maturation were calculated, and the means in each stage were statistically compared with those of the other stages. The IGF-I mean value at each cervical vertebral maturation stage was statistically different from the mean values at the other stages. The highest mean values were observed in stage 4, followed by stage 5 in males and stage 3 in females. IGF-I serum level is a reliable maturation indicator that could be applied in orthodontic diagnosis. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Srinivasan, A; Galbán, C J; Johnson, T D; Chenevert, T L; Ross, B D; Mukherji, S K
2010-04-01
Does the K-means algorithm do a better job of differentiating benign and malignant neck pathologies than mean ADC alone? The objective of our study was to analyze the differences between ADC partitions to evaluate whether the K-means technique can be of additional benefit to whole-lesion mean ADC alone in distinguishing benign and malignant neck pathologies. MR imaging studies of 10 benign and 10 malignant proven neck pathologies were postprocessed on a PC by using in-house software developed in Matlab. Two neuroradiologists manually contoured the lesions, with the ADC values within each lesion clustered into 2 partitions (low, ADC(L); high, ADC(H)) and 3 partitions (ADC(L); intermediate, ADC(I); ADC(H)) by using the K-means clustering algorithm. An unpaired 2-tailed Student t test was performed for all metrics to determine statistical differences in the means of the benign and malignant pathologies. A statistically significant difference between the mean ADC(L) clusters in benign and malignant pathologies was seen in the 3-cluster models of both readers (P = .03 and .022, respectively) and the 2-cluster model of reader 2 (P = .04), with the other metrics (ADC(H), ADC(I), and whole-lesion mean ADC) not revealing any significant differences. ROC curves demonstrated the quantitative differences in mean ADC(H) and ADC(L) in both the 2- and 3-cluster models to be predictive of malignancy (2 clusters: P = .008, area under curve = 0.850; 3 clusters: P = .01, area under curve = 0.825). The K-means clustering algorithm, which generates partitions of large datasets, may provide a better characterization of neck pathologies and may be of additional benefit in distinguishing benign and malignant neck pathologies compared with whole-lesion mean ADC alone.
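A minimal version of the partitioning idea, K-means on the one-dimensional ADC values within each lesion followed by a t-test on the low-ADC cluster means, can be sketched as below; the lesion data are simulated, not the study's MR measurements.

```python
import numpy as np
from scipy.stats import ttest_ind

def kmeans_1d(values, k, iters=100):
    """Lloyd's K-means for 1-D data; returns the sorted cluster centers."""
    v = np.asarray(values, dtype=float)
    centers = np.quantile(v, (np.arange(k) + 0.5) / k)  # quantile initialization
    for _ in range(iters):
        labels = np.argmin(np.abs(v[:, None] - centers[None, :]), axis=1)
        new = np.array([v[labels == c].mean() if np.any(labels == c) else centers[c]
                        for c in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return np.sort(centers)

rng = np.random.default_rng(3)
# Simulated per-lesion ADC voxel values (x10^-3 mm^2/s), 10 lesions per class
benign = [rng.normal(1.6, 0.3, 200) for _ in range(10)]
malignant = [rng.normal(1.0, 0.3, 200) for _ in range(10)]

# ADC(L) = center of the low cluster from a 2-cluster partition of each lesion
adc_l_benign = [kmeans_1d(x, 2)[0] for x in benign]
adc_l_malignant = [kmeans_1d(x, 2)[0] for x in malignant]

t, p = ttest_ind(adc_l_benign, adc_l_malignant)
print(f"ADC(L), benign vs malignant: t = {t:.2f}, p = {p:.2g}")
```

The point of the partition is that the low-ADC cluster isolates the most cellular (restricted-diffusion) voxels, which can separate the classes even when whole-lesion mean ADC does not.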
Mean bond-length variations in crystals for ions bonded to oxygen
2017-01-01
Variations in mean bond length are examined in oxide and oxysalt crystals for 55 cation configurations bonded to O2−. Stepwise multiple regression analysis shows that mean bond length is correlated with bond-length distortion in 42 ion configurations at the 95% confidence level, with a mean coefficient of determination (〈R²〉) of 0.35. Previously published correlations between mean bond length and mean coordination number of the bonded anions are found not to be of general applicability to inorganic oxide and oxysalt structures. For two of 11 ions tested at the 95% confidence level, mean bond lengths predicted using a fixed radius for O2− are significantly more accurate than those predicted using an O2− radius dependent on coordination number, and are statistically identical otherwise. As a result, the currently accepted ionic radii for O2− in different coordinations are not justified by experimental data. The previously reported correlation between mean bond length and the mean electronegativity of the cations bonded to the oxygen atoms of the coordination polyhedron is shown to be statistically insignificant; similar results are obtained with regard to ionization energy. It is shown that a priori bond lengths calculated for many ion configurations in a single structure-type lead to a high correlation between a priori and observed mean bond lengths, but a priori bond lengths calculated for a single ion configuration in many different structure-types lead to negligible correlation between a priori and observed mean bond lengths. This indicates that structure type has a major effect on mean bond length, the magnitude of which goes beyond that of the other variables analyzed here.
ERIC Educational Resources Information Center
Groth, Randall E.
2010-01-01
In the recent past, qualitative research methods have become more prevalent in the field of statistics education. This paper offers thoughts on the process of framing a qualitative study by means of an illustrative example. The decisions that influenced the framing of a study of pre-service teachers' understanding of the concept of statistical…
Statistical Techniques for Signal Processing
1993-01-12
functions and extended influence functions of the associated underlying estimators. An interesting application of the influence function and its... and related filter structures. While the influence function is best known for its role in characterizing the robustness of estimators, the mathematical... statistics can be designed and analyzed for performance using the influence function as a tool. In particular, we have examined the mean-median
ERIC Educational Resources Information Center
Sharma, Kshitij; Chavez-Demoulin, Valérie; Dillenbourg, Pierre
2017-01-01
The statistics used in education research are based on central trends such as the mean or standard deviation, discarding outliers. This paper adopts another viewpoint that has emerged in statistics, called extreme value theory (EVT). EVT claims that the bulk of the normal distribution comprises mainly uninteresting variations, while the most…
Lee, F K-H; Chan, C C-L; Law, C-K
2009-02-01
Contrast enhanced computed tomography (CECT) has been used for delineation of treatment targets in radiotherapy. The altered Hounsfield units due to the injected contrast agent may affect radiation dose calculation. We investigated this effect on intensity modulated radiotherapy (IMRT) of nasopharyngeal carcinoma (NPC). Dose distributions of 15 IMRT plans were recalculated on CECT. Dose statistics for organs at risk (OAR) and treatment targets were recorded for the plain CT-calculated and CECT-calculated plans. Statistical significance of the differences was evaluated. Correlations were also tested among the magnitude of the calculated dose difference, tumor size, and level of contrast enhancement. Differences in nodal mean/median dose were statistically significant, but small (approximately 0.15 Gy for a 66 Gy prescription). In the vicinity of the carotid arteries, the difference in calculated dose was also statistically significant, but only with a mean of approximately 0.2 Gy. We did not observe any significant correlation between the difference in the calculated dose and the tumor size or level of enhancement. The results imply that the calculated dose difference is clinically insignificant and may be acceptable for IMRT planning.
Yamani, Nikoo; Changiz, Tahereh; Feizi, Awat; Kamali, Farahnaz
2018-01-01
To assess the trend of changes in the evaluation scores of faculty members and the discrepancy between administrators' and students' perspectives in a medical school from 2006 to 2015. This repeated cross-sectional study was conducted on the 10-year evaluation scores of all faculty members of a medical school (n=579) in an urban area of Iran. Data on evaluation scores given by students and administrators and the total of these scores were evaluated. Data were analyzed using descriptive and inferential statistics, including linear mixed effect models for repeated measures, via the SPSS software. There were statistically significant differences between the students' and administrators' perspectives over time (p < 0.001). The mean of the total evaluation scores also showed a statistically significant change over time (p < 0.001). Furthermore, the mean of changes over time in the total evaluation score between different departments was statistically significant (p < 0.001). The trend of changes in the students' evaluations was clear and positive, but the trend of the administrators' evaluations was unclear. Since the evaluation of faculty members is affected by many other factors, there is a need for more future studies.
Eng, Kevin H; Schiller, Emily; Morrell, Kayla
2015-11-03
Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
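The restricted mean survival promoted here is the area under the Kaplan-Meier curve up to a chosen truncation time. The abstract points to R's survival package; as a language-neutral illustration, here is a minimal Kaplan-Meier/RMST sketch in Python with invented follow-up data (ties are handled in input order, which is sufficient for a sketch).

```python
import numpy as np

def rmst(time, event, tau):
    """Restricted mean survival time: area under the Kaplan-Meier
    curve up to the truncation time tau (event = 1 means death)."""
    order = np.argsort(time, kind="stable")
    t = np.asarray(time, dtype=float)[order]
    d = np.asarray(event)[order]
    s, last_t, area = 1.0, 0.0, 0.0
    at_risk = len(t)
    for i in range(len(t)):
        if t[i] > tau:
            break
        area += s * (t[i] - last_t)       # survival is flat between times
        last_t = t[i]
        if d[i] == 1:
            s *= (at_risk - 1) / at_risk  # Kaplan-Meier drop at an event
        at_risk -= 1
    area += s * (tau - last_t)            # remaining flat segment up to tau
    return area

# Hypothetical follow-up times (months) and event indicators for one group
months = [3, 6, 6, 10, 14, 20, 24, 30]
events = [1, 1, 0, 1, 0, 1, 0, 0]
print(f"RMST at 24 months: {rmst(months, events, 24):.1f} months")
```

Unlike a dichotomized risk score, the RMST reads directly in clinical units ("expected months of survival over the first 24 months"), which is exactly the interpretability argument made above.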
Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks
2016-01-01
Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
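A minimal sketch of the Monte Carlo confidence-interval idea for a mediated effect a*b. It assumes independent normal sampling distributions for the two coefficients, a simplification of the paper's bivariate random-effects model, and all numbers are illustrative:

```python
import random

def mediated_effect_ci(a, se_a, b, se_b, n_draws=20000, alpha=0.05, seed=1):
    """Monte Carlo confidence interval for the mediated effect a*b.
    Assumes a and b have independent normal sampling distributions
    (the paper models their joint, between-trial distribution)."""
    rng = random.Random(seed)
    draws = sorted(rng.gauss(a, se_a) * rng.gauss(b, se_b) for _ in range(n_draws))
    lo = draws[int(alpha / 2 * n_draws)]
    hi = draws[int((1 - alpha / 2) * n_draws) - 1]
    return a * b, (lo, hi)
```

Because the product of two normals is skewed, the resulting interval is asymmetric around a*b, which is precisely why Monte Carlo intervals are preferred over a normal approximation here.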
Scaling Laws in Canopy Flows: A Wind-Tunnel Analysis
NASA Astrophysics Data System (ADS)
Segalini, Antonio; Fransson, Jens H. M.; Alfredsson, P. Henrik
2013-08-01
An analysis of velocity statistics and spectra measured above a wind-tunnel forest model is reported. Several measurement stations downstream of the forest edge have been investigated and it is observed that, while the mean velocity profile adjusts quickly to the new canopy boundary condition, the turbulence lags behind and shows a continuous penetration towards the free stream along the canopy model. The statistical profiles illustrate this growth and do not collapse when plotted as a function of the vertical coordinate. However, when the statistics are plotted as a function of the local mean velocity (normalized with a characteristic velocity scale), they do collapse, independently of the streamwise position and freestream velocity. A new scaling for the spectra of all three velocity components is proposed based on the velocity variance and integral time scale. This normalization improves the collapse of the spectra compared to existing scalings adopted in atmospheric measurements, and allows the determination of a universal function that provides the velocity spectrum. Furthermore, a comparison of the proposed scaling laws for two different canopy densities is shown, demonstrating that the vertical velocity variance is the statistical quantity most sensitive to the characteristics of the canopy roughness.
Redshift data and statistical inference
NASA Technical Reports Server (NTRS)
Newman, William I.; Haynes, Martha P.; Terzian, Yervant
1994-01-01
Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals, or bins, than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
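On the binning point, one common guideline is Sturges' rule, which ties the number of class intervals to the sample size; the sketch below is a generic illustration of that rule, not the analysis used in the paper:

```python
import math

def sturges_bins(n):
    """Sturges' rule: recommended number of histogram class intervals
    for a sample of size n (k = 1 + ceil(log2 n))."""
    return 1 + math.ceil(math.log2(n))
```

Using far more bins than such rules suggest leaves only a handful of observations per bin, so sampling noise alone produces apparent peaks and gaps.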
NASA Astrophysics Data System (ADS)
Lombaert, G.; Galvín, P.; François, S.; Degrande, G.
2014-09-01
Environmental vibrations due to railway traffic are predominantly due to dynamic axle loads caused by wheel and track unevenness and impact excitation by rail joints and wheel flats. Because of its irregular character, track unevenness is commonly processed statistically and represented by its power spectral density function or its root mean square (RMS) value in one-third octave bands. This statistical description does not uniquely define the track unevenness at a given site, however, and different track unevenness profiles matching the statistical description will lead to different predictions of dynamic axle loads and resulting ground vibration. This paper presents a methodology that allows quantifying the corresponding variability in ground vibration predictions. The procedure is derived assuming the geometry of the track and soil to be homogeneous along the track. The procedure is verified by means of Monte Carlo simulations and its usefulness for assessing the mismatch between predicted and measured ground vibrations is demonstrated in a case study. The results show that the response in time domain and its narrow band spectrum exhibit significant variability which is reduced when the running RMS value or the one-third octave band spectrum of the response is considered.
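A running RMS value of the kind used to summarize the response can be sketched with a simple sliding time-domain window (illustrative only; the paper also works with one-third octave band spectra, which this sketch does not reproduce):

```python
import math

def running_rms(signal, window):
    """Running root-mean-square of a sampled signal over a sliding
    window of `window` samples (hypothetical helper for illustration)."""
    out = []
    for i in range(len(signal) - window + 1):
        seg = signal[i:i + window]
        out.append(math.sqrt(sum(x * x for x in seg) / window))
    return out
```

Averaging the squared response over a window discards narrow-band phase detail, which is why such smoothed quantities show far less variability across unevenness realizations than the raw time history.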
NASA Astrophysics Data System (ADS)
Decraene, Carolina; Dijckmans, Arne; Reynders, Edwin P. B.
2018-05-01
A method is developed for computing the mean and variance of the diffuse field sound transmission loss of finite-sized layered wall and floor systems that consist of solid, fluid and/or poroelastic layers. This is achieved by coupling a transfer matrix model of the wall or floor to statistical energy analysis subsystem models of the adjacent room volumes. The modal behavior of the wall is approximately accounted for by projecting the wall displacement onto a set of sinusoidal lateral basis functions. This hybrid modal transfer matrix-statistical energy analysis method is validated on multiple wall systems: a thin steel plate, a polymethyl methacrylate panel, a thick brick wall, a sandwich panel, a double-leaf wall with poro-elastic material in the cavity, and a double-glazing unit. The predictions are compared with experimental data and with results obtained using alternative prediction methods such as the transfer matrix method with spatial windowing, the hybrid wave based-transfer matrix method, and the hybrid finite element-statistical energy analysis method. These comparisons confirm the prediction accuracy of the proposed method and its computational efficiency relative to the conventional hybrid finite element-statistical energy analysis method.
How to show that unicorn milk is a chronobiotic: the regression-to-the-mean statistical artifact.
Atkinson, G; Waterhouse, J; Reilly, T; Edwards, B
2001-11-01
Few chronobiologists may be aware of the regression-to-the-mean (RTM) statistical artifact, even though it may have far-reaching influences on chronobiological data. With the aid of simulated measurements of the circadian rhythm phase of body temperature and a completely bogus stimulus (unicorn milk), we explain what RTM is and provide examples relevant to chronobiology. We show how RTM may lead to erroneous conclusions regarding individual differences in phase responses to rhythm disturbances and how it may appear as though unicorn milk has phase-shifting effects and can successfully treat some circadian rhythm disorders. Guidelines are provided to ensure RTM effects are minimized in chronobiological investigations.
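The RTM artifact can be demonstrated in a few lines of simulation: subjects selected for an extreme first measurement look "improved" on remeasurement even though nothing was done to them. This sketch assumes normally distributed true scores and measurement noise; all parameter values are made up:

```python
import random

def rtm_demo(n=10000, true_sd=1.0, noise_sd=1.0, cutoff=1.0, seed=7):
    """Regression to the mean: select subjects whose first noisy measurement
    exceeds `cutoff`, then remeasure with NO intervention at all. The second
    measurement is lower on average purely because of measurement noise."""
    rng = random.Random(seed)
    first, second = [], []
    for _ in range(n):
        true = rng.gauss(0.0, true_sd)
        m1 = true + rng.gauss(0.0, noise_sd)
        if m1 > cutoff:
            first.append(m1)
            second.append(true + rng.gauss(0.0, noise_sd))
    return sum(first) / len(first), sum(second) / len(second)
```

Replace "no intervention" with a dose of unicorn milk and the apparent phase shift between the two measurements is exactly the spurious treatment effect the abstract warns about.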
[Practical aspects regarding sample size in clinical research].
Vega Ramos, B; Peraza Yanes, O; Herrera Correa, G; Saldívar Toraya, S
1996-01-01
Knowing the right sample size lets us judge whether the results published in medical papers come from a suitable design and support a proper conclusion according to the statistical analysis. To estimate the sample size we must consider the type I error, the type II error, the variance, the size of the effect, and the significance and power of the test. To decide which mathematical formula to use, we must define what kind of study we have: a prevalence study, a study of mean values, or a comparative one. In this paper we explain some basic topics of statistics and describe four simple examples of sample size estimation.
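For example, the standard formula for comparing two means combines exactly these ingredients; the sketch below assumes a two-sided alpha of 0.05 and 80% power (z-values 1.96 and 0.84), which are conventional defaults rather than values from this paper:

```python
import math

def n_per_group(sigma, delta, z_alpha=1.96, z_beta=0.84):
    """Sample size per group for detecting a difference `delta` between two
    means with common standard deviation `sigma`.
    Defaults correspond to two-sided alpha = 0.05 and power = 0.80."""
    return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)
```

For instance, detecting a half-standard-deviation difference (`sigma=1.0`, `delta=0.5`) requires 63 subjects per group at these settings, illustrating how quickly the required n grows as the effect size shrinks.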
Gazoorian, Christopher L.
2015-01-01
A graphical user interface, with an integrated spreadsheet summary report, has been developed to estimate and display the daily mean streamflows and statistics and to evaluate different water management or water withdrawal scenarios with the estimated monthly data. This package of regression equations, U.S. Geological Survey streamgage data, and spreadsheet application produces an interactive tool to estimate an unaltered daily streamflow hydrograph and streamflow statistics at ungaged sites in New York. Among other uses, the New York Streamflow Estimation Tool can assist water managers with permitting water withdrawals, implementing habitat protection, estimating contaminant loads, or determining the potential effect of chemical spills.
The effects of multiple repairs on Inconel 718 weld mechanical properties
NASA Technical Reports Server (NTRS)
Russell, C. K.; Nunes, A. C., Jr.; Moore, D.
1991-01-01
Inconel 718 weldments were repaired 3, 6, 9, and 13 times using the gas tungsten arc welding process. The welded panels were machined into mechanical test specimens, postweld heat treated, and nondestructively tested. Tensile properties and high cycle fatigue life were evaluated and the results compared to unrepaired weld properties. Mechanical property data were analyzed using the statistical methods of difference in means for tensile properties and difference in log means and Weibull analysis for high cycle fatigue properties. Statistical analysis performed on the data did not show a significant decrease in tensile or high cycle fatigue properties due to the repeated repairs. Some degradation was observed in all properties; however, it was minimal.
Petsch, Harold E.
1979-01-01
Statistical summaries of daily streamflow data for 189 stations west of the Continental Divide in Colorado are presented in this report. Duration tables, high-flow sequence tables, and low-flow sequence tables provide information about daily mean discharge. The mean, variance, standard deviation, skewness, and coefficient of variation are provided for monthly and annual flows. Percentages of average flow are provided for monthly flows and first-order serial-correlation coefficients are provided for annual flows. The text explains the nature and derivation of the data and illustrates applications of the tabulated information by examples. The data may be used by agencies and individuals engaged in water studies. (USGS)
Miranda de Sá, Antonio Mauricio F L; Infantosi, Antonio Fernando C; Lazarev, Vladimir V
2007-01-01
In the present work, a commonly used index for evaluating the Event-Related Synchronization and Desynchronization (ERS/ERD) in the EEG was expressed as a function of the Spectral F-Test (SFT), which is a statistical test for assessing if two sample spectra are from populations with identical theoretical spectra. The sampling distribution of SFT has been derived, allowing hence ERS/ERD to be evaluated under a statistical basis. An example of the technique was also provided in the EEG signals from 10 normal subjects during intermittent photic stimulation.
Weather related continuity and completeness on Deep Space Ka-band links: statistics and forecasting
NASA Technical Reports Server (NTRS)
Shambayati, Shervin
2006-01-01
In this paper the concept of link 'stability' as a means of measuring the continuity of the link is introduced and through it, along with the distributions of 'good' periods and 'bad' periods, the performance of the proposed Ka-band link design method using both forecasting and long-term statistics has been analyzed. The results indicate that the proposed link design method has relatively good continuity and completeness characteristics even when only long-term statistics are used and that the continuity performance further improves when forecasting is employed.
Harris, Randall J
2004-05-01
Obtaining predictable and esthetic root coverage has become important. Unfortunately, there is only a limited amount of information available on the long-term results of root coverage procedures. The goal of this study was to evaluate the short-term and long-term root coverage results obtained with an acellular dermal matrix and a subepithelial graft. An a priori power analysis was done to determine that 25 was an adequate sample size for each group in this study. Twenty-five patients treated with either an acellular dermal matrix or a subepithelial graft for root coverage were included in this study. The short-term (mean 12.3 to 13.2 weeks) and long-term (mean 48.1 to 49.2 months) results were compared. Additionally, various factors were evaluated to determine whether they could affect the results. This study was a retrospective study of patients in a fee-for-service private periodontal practice. The patients were not randomly assigned to treatment groups. The mean root coverages for the short-term acellular dermal matrix (93.4%), short-term subepithelial graft (96.6%), and long-term subepithelial graft (97.0%) were statistically similar. All three were statistically greater than the long-term acellular dermal matrix mean root coverage (65.8%). Similar results were noted in the change in recession. There were smaller probing reductions and less of an increase in keratinized tissue with the acellular dermal matrix than the subepithelial graft. None of the factors evaluated resulted in the acellular dermal graft having a statistically significant better result than the subepithelial graft. However, in long-term cases where multiple defects were treated with an acellular dermal matrix, the mean root coverage (70.8%) was greater than the mean root coverage in long-term cases where a single defect was treated with an acellular dermal matrix (50.0%). 
The mean results with the subepithelial graft held up with time better than the mean results with an acellular dermal matrix. However, the results were not universal. In 32.0% of the cases treated with an acellular dermal matrix, the results improved or remained stable with time.
Statistical inference for noisy nonlinear ecological dynamic systems.
Wood, Simon N
2010-08-26
Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
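The synthetic-likelihood construction can be sketched as follows. For brevity this version fits an independent normal to each summary statistic (a diagonal covariance) instead of the full covariance matrix described in the abstract, and the simulator and statistics are left as caller-supplied stand-ins:

```python
import math
import random

def synthetic_loglik(observed_stats, simulate, theta, n_sim=200, seed=3):
    """Gaussian synthetic likelihood of parameter `theta`:
    1. simulate n_sim data sets from the model at theta,
    2. reduce each to summary statistics (simulate() returns a tuple),
    3. fit a normal to each statistic and score the observed statistics.
    Diagonal covariance only -- a simplification of Wood's full-covariance
    synthetic likelihood."""
    rng = random.Random(seed)
    sims = [simulate(theta, rng) for _ in range(n_sim)]
    ll = 0.0
    for j in range(len(observed_stats)):
        col = [s[j] for s in sims]
        mu = sum(col) / n_sim
        var = sum((x - mu) ** 2 for x in col) / (n_sim - 1)
        ll += -0.5 * math.log(2 * math.pi * var) \
              - (observed_stats[j] - mu) ** 2 / (2 * var)
    return ll
```

Evaluating this log-likelihood over a grid or inside an MCMC sampler then gives parameter inference that never touches the intractable joint density of the raw chaotic trajectory.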
Wood, Molly S.; Fosness, Ryan L.
2013-01-01
The U.S. Geological Survey, in cooperation with the Bureau of Land Management (BLM), collected streamflow data in 2012 and estimated streamflow statistics for stream segments designated "Wild," "Scenic," or "Recreational" under the National Wild and Scenic Rivers System in the Owyhee Canyonlands Wilderness in southwestern Idaho. The streamflow statistics were used by BLM to develop and file a draft, federal reserved water right claim in autumn 2012 to protect federally designated "outstanding remarkable values" in the stream segments. BLM determined that the daily mean streamflow equaled or exceeded 20 and 80 percent of the time during bimonthly periods (two periods per month) and the bankfull streamflow are important streamflow thresholds for maintaining outstanding remarkable values. Prior to this study, streamflow statistics estimated using available datasets and tools for the Owyhee Canyonlands Wilderness were inaccurate for use in the water rights claim. Streamflow measurements were made at varying intervals during February–September 2012 at 14 monitoring sites; 2 of the monitoring sites were equipped with telemetered streamgaging equipment. Synthetic streamflow records were created for 11 of the 14 monitoring sites using a partial‑record method or a drainage-area-ratio method. Streamflow records were obtained directly from an operating, long-term streamgage at one monitoring site, and from discontinued streamgages at two monitoring sites. For 10 sites analyzed using the partial-record method, discrete measurements were related to daily mean streamflow at a nearby, telemetered “index” streamgage. Resulting regression equations were used to estimate daily mean and annual peak streamflow at the monitoring sites during the full period of record for the index sites. 
A synthetic streamflow record for Sheep Creek was developed using a drainage-area-ratio method, because measured streamflows did not relate well to any index site to allow use of the partial-record method. The synthetic and actual daily mean streamflow records were used to estimate daily mean streamflow that was exceeded 80, 50, and 20 percent of the time (80-, 50-, and 20-percent exceedances) for bimonthly and annual periods. Bankfull streamflow statistics were calculated by fitting the synthetic and actual annual peak streamflow records to a log Pearson Type III distribution using Bulletin 17B guidelines in the U.S. Geological Survey PeakFQ program. The coefficients of determination (R2) for the regressions between the monitoring and index sites ranged from 0.74 for Wickahoney Creek to 0.98 for the West Fork Bruneau River and Deep Creek. Confidence in computed streamflow statistics is highest among other sites for the East Fork Owyhee River and the West Fork Bruneau River on the basis of regression statistics, visual fit of the related data, and the range and number of streamflow measurements. Streamflow statistics for sites with the greatest uncertainty included Big Jacks, Little Jacks, Cottonwood, Wickahoney, and Sheep Creeks. The uncertainty in computed streamflow statistics was due to a number of factors which included the distance of index sites relative to monitoring sites, relatively low streamflow conditions that occurred during the study, and the limited number and range of streamflow measurements. However, the computed streamflow statistics are considered the best possible estimates given available datasets in the remote study area. Streamflow measurements over a wider range of hydrologic and climatic conditions would improve the relations between streamflow characteristics at monitoring and index sites. 
Additionally, field surveys are needed to verify if the streamflows selected for the water rights claims are sufficient for maintaining outstanding remarkable values in the Wild and Scenic rivers included in the study.
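The drainage-area-ratio transfer and the exceedance statistics can be sketched as follows. This is an illustrative simplification: the USGS computation involves site-specific regression equations and Bulletin 17B flood-frequency fitting in PeakFQ, none of which is reproduced here, and the unit exponent is an assumption:

```python
def drainage_area_ratio(q_index, area_ungaged, area_index, exponent=1.0):
    """Transfer daily mean flows from an index streamgage to an ungaged
    site by scaling with the drainage-area ratio (exponent 1.0 assumed)."""
    return [q * (area_ungaged / area_index) ** exponent for q in q_index]

def exceedance(flows, pct):
    """Flow equaled or exceeded `pct` percent of the time: a simple
    rank-based estimate on the sorted daily record."""
    s = sorted(flows, reverse=True)
    idx = min(len(s) - 1, int(round(pct / 100.0 * (len(s) - 1))))
    return s[idx]
```

For example, an ungaged site with half the drainage area of its index gage would be assigned half the gaged flow on each day, and the 80-percent exceedance of the resulting record is the low-flow threshold used in the bimonthly statistics.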
Increasing age in Achilles rupture patients over time.
Ho, Gavin; Tantigate, Direk; Kirschenbaum, Josh; Greisberg, Justin K; Vosseller, J Turner
2017-07-01
The changing demographics of Achilles tendon rupture (ATR) patients have not been fully investigated. However, there has been a general suspicion that this injury is occurring in an increasingly older population, in terms of mean age. The aim of this study was to objectively show an increase in age in Achilles tendon rupture patients over time. Published literature on Achilles tendon ruptures was searched for descriptive statistics on the demographics of patients in the studies, specifically mean and median age of Achilles tendon rupture patients, gender ratio, percentage of athletics-related injuries, percentage of smokers, and BMI. Linear regression analyses were performed to determine the trend of patient demographics over time. A Welch one-way ANOVA was carried out to identify any possible differences in data obtained from different types of studies. The patient demographics from 142 studies were recorded, with all ATR injuries occurring between the years 1953 and 2014. There was no significant difference in the mean age data reported by varying study types, i.e. randomized controlled trial, cohort study, case series, etc. (P=0.182). There was a statistically significant rise in mean age of ATR patients over time (P<0.0005). There was also a statistically significant drop in the percentage of male ATR patients (P=0.02). There was no significant trend for percentage of athletics-related injuries, smoking, or BMI. From 1953 to the present day, the mean age at which ATR occurs has been increasing by at least 0.721 years every five years. Over the same period, the percentage of female study patients with ATR injuries has been increasing by at least 0.6% every five years. Level III; Retrospective cohort study. Copyright © 2017 Elsevier Ltd. All rights reserved.
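The trend estimates come from ordinary least-squares regression of reported mean age on study year; a minimal sketch with invented data points (a 0.7-year rise per 5 years, matching the order of magnitude reported, not the study's actual data):

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx
```

With hypothetical points at years 0, 5, and 10 and mean ages 40.0, 40.7, and 41.4, the slope is 0.14 years of age per calendar year, i.e. 0.7 years per five-year period.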
Hebrani, Paria; Manteghi, Ali Akhoundpour; Behdani, Fatemeh; Hessami, Elham; Rezayat, Kambiz Akhavan; Marvast, Majid Nabizadeh; Rezayat, Amir Akhavan
2015-04-01
One of the major causes of death in schizophrenia is metabolic syndrome. Clozapine has the highest rate of weight gain among antipsychotics, and it has been shown that metformin can promote weight loss. We aimed to investigate the effect of metformin as an adjunctive therapy with clozapine to prevent metabolic syndrome in patients with schizophrenia. A total of 37 patients were evaluated: a metformin group of 19 cases and a placebo group of 18 cases. A Brief Psychiatric Rating Scale (BPRS) score and metabolic profiles were determined for all patients. All of the variables were also determined at 2, 8, 16, and 20 weeks after the onset of the study. The mean age of the metformin group was 47.2 ± 10.4 compared with 45.8 ± 10.2 for the placebo group. The difference in mean waist circumference and serum level of triglyceride at baseline compared with the end of the study showed a statistically significant difference between the two groups (P = 0.000). A statistically significant difference was also observed in a comparison of the mean difference of weight and body mass index at baseline compared with the end of the study (P = 0.000). There was a statistically significant difference in fasting blood sugar (P = 0.011) and serum high-density lipoprotein (P = 0.000) between the two groups, but this difference was not significant for mean BPRS scores, mean systolic and diastolic blood pressure, serum level of triiodothyronine, thyroxin and thyroid stimulating hormone, serum low-density lipoprotein, and serum cholesterol. Metformin could be considered an adjunctive therapy with clozapine to prevent metabolic syndrome in schizophrenic patients.
The Relation between Sarcopenia and Mortality in Patients at Intensive Care Unit.
Toptas, Mehmet; Yalcin, Mazhar; Akkoc, İbrahim; Demir, Eren; Metin, Cagatay; Savas, Yildiray; Kalyoncuoglu, Muhsin; Can, Mehmet Mustafa
2018-01-01
Psoas muscle area (PMA) can reflect the status of skeletal muscle in the whole body. It has also been reported that decreased PMA was associated with postoperative mortality or morbidity after several surgical procedures. In this study, we aimed to investigate the relation between PMA and mortality in all age groups in the intensive care unit (ICU). The study consisted of 362 consecutive patients. The demographic characteristics of patients, indications for ICU hospitalization, laboratory parameters, and clinical parameters, including mortality, length of stay, and surgery history, were obtained from intensive care archive records. The mean age was 61.2 ± 18.2 years, and the percentage of females was 33.3%. The mean duration of stay was 10.3 ± 24.4 days. The rates of death (exitus), partial healing, and healing were 25%, 70%, and 5%, respectively. The mean right, left, and total PMA were 8.7 ± 3.6, 8.9 ± 3.4, and 17.6 ± 6.9, respectively. The left and total PMA averages of the nonoperated patients were statistically significantly lower (p = 0.021, p = 0.043). The mean PMA of the deceased patients was statistically significantly lower than that of the recovered patients (p = 0.001, p = 0.001, p < 0.001). Dyspnoea, renal insufficiency, COPD, transfusion rate, operation rate, need for ventilation, and mean duration of hospitalization were statistically significantly higher in patients who died. There was a significant difference in operation types, anesthesia type, and clinic rates. Our data suggest that sarcopenia can be used for risk stratification in ICU patients. Future studies may use this technique to individualize postoperative interventions that may reduce the risk for an adverse discharge disposition related to critical illness, such as early mobilization, optimized nutritional support, and reduction of sedation and opioid dose.
Ethnic studies of dietary intakes of zinc, copper, iron, and calcium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, C.; Figueroa, M.; Tam, C.F.
1986-01-01
Immigrants, such as S.E. Asians who live in the L.A. area, often suffer high incidences of diseases. It is of interest to examine whether ethnic eating patterns influence dietary Zn, Cu, Fe, Ca, protein, and Kcal, which are essential for proper immune function. Three-day dietary intakes of adult ethnic groups, Asian (A) (N=18), Caucasian (C) (26), Black (B) (7), Latino (L) (12), Middle Easterner (ME) (9), and Filipino (F) (6), were analyzed for Zn, Cu, Fe, Ca, protein, and Kcal by Ohio Data Base Foods II (ODBF) and then statistically compared by PROPHET. Zn and Cu were also analyzed by hand calculation (HC). No statistical differences were observed for mean Zn between groups analyzed by ODBF, whereas HC of mean Zn between A vs C (A = 11.3 ± S.D. 2.9 mg vs C = 8.8 ± 2.8, P<0.01) and A vs L (11.3 ± 2.9 vs L = 8.9 ± 2.2, P<0.05) showed statistically significant differences. No differences were found for Cu between the groups. By ODBF, neither mean Cu nor mean Zn met 2/3 RDA for any of the groups. For Fe, no differences were found between groups and only 50% of the subjects met 2/3 RDA. Significant differences were observed for Ca only between A vs C and B vs C; both A and B had lower mean Ca than C. All groups had adequate protein. Mean Kcal of all groups were found to be at or about 2/3 RDA. Both insufficient Kcal and eating patterns contribute to inadequate Cu, Zn, and Fe intakes and hence may affect immune competency.
Fontarensky, Mikael; Alfidja, Agaïcha; Perignon, Renan; Schoenig, Arnaud; Perrier, Christophe; Mulliez, Aurélien; Guy, Laurent; Boyer, Louis
2015-07-01
To evaluate the accuracy of reduced-dose abdominal computed tomographic (CT) imaging by using a new-generation model-based iterative reconstruction (MBIR) to diagnose acute renal colic compared with standard-dose abdominal CT with 50% adaptive statistical iterative reconstruction (ASIR). This institutional review board-approved prospective study included 118 patients with symptoms of acute renal colic who underwent the following two successive CT examinations: standard-dose ASIR 50% and reduced-dose MBIR. Two radiologists independently reviewed both CT examinations for presence or absence of renal calculi, differential diagnoses, and associated abnormalities. The imaging findings, radiation dose estimates, and image quality of the two CT reconstruction methods were compared. Concordance was evaluated by κ coefficient, and descriptive statistics and the t test were used for statistical analysis. Interobserver agreement was 100% for the diagnosis of renal calculi (κ = 1). Renal calculus (τ = 98.7%; κ = 0.97) and obstructive upper urinary tract disease (τ = 98.16%; κ = 0.95) were detected, and differential or alternative diagnosis was performed (τ = 98.87%; κ = 0.95). MBIR allowed a dose reduction of 84% versus standard-dose ASIR 50% (mean volume CT dose index, 1.7 mGy ± 0.8 [standard deviation] vs 10.9 mGy ± 4.6; mean size-specific dose estimate, 2.2 mGy ± 0.7 vs 13.7 mGy ± 3.9; P < .001) without a conspicuous deterioration in image quality (reduced-dose MBIR vs ASIR 50% mean scores, 3.83 ± 0.49 vs 3.92 ± 0.27, respectively; P = .32) or increase in noise (reduced-dose MBIR vs ASIR 50% mean, respectively, 18.36 HU ± 2.53 vs 17.40 HU ± 3.42). Its main drawback remains the long time required for reconstruction (mean, 40 minutes). A reduced-dose protocol with MBIR allowed a dose reduction of 84% without increasing noise and without a conspicuous deterioration in image quality in patients suspected of having renal colic.
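Concordance via the κ coefficient can be sketched with a generic Cohen's kappa for two raters' categorical labels (an illustration of the statistic itself, not the authors' statistical software):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical labels:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance
    return (po - pe) / (1 - pe) if pe != 1 else 1.0
```

Perfect agreement yields κ = 1, while agreement no better than chance yields κ = 0, which is why κ values near 0.95 in the abstract indicate near-perfect concordance between the two reconstructions.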
Statistical modeling of 4D respiratory lung motion using diffeomorphic image registration.
Ehrhardt, Jan; Werner, René; Schmidt-Richberg, Alexander; Handels, Heinz
2011-02-01
Modeling of respiratory motion has become increasingly important in various applications of medical imaging (e.g., radiation therapy of lung cancer). Current modeling approaches are usually confined to intra-patient registration of 3D image data representing the individual patient's anatomy at different breathing phases. We propose an approach to generate a mean motion model of the lung based on thoracic 4D computed tomography (CT) data of different patients to extend the motion modeling capabilities. Our modeling process consists of three steps: an intra-subject registration to generate subject-specific motion models, the generation of an average shape and intensity atlas of the lung as anatomical reference frame, and the registration of the subject-specific motion models to the atlas in order to build a statistical 4D mean motion model (4D-MMM). Furthermore, we present methods to adapt the 4D mean motion model to a patient-specific lung geometry. In all steps, a symmetric diffeomorphic nonlinear intensity-based registration method was employed. The Log-Euclidean framework was used to compute statistics on the diffeomorphic transformations. The presented methods are then used to build a mean motion model of respiratory lung motion using thoracic 4D CT data sets of 17 patients. We evaluate the model by applying it for estimating respiratory motion of ten lung cancer patients. The prediction is evaluated with respect to landmark and tumor motion, and the quantitative analysis results in a mean target registration error (TRE) of 3.3 ±1.6 mm if lung dynamics are not impaired by large lung tumors or other lung disorders (e.g., emphysema). With regard to lung tumor motion, we show that prediction accuracy is independent of tumor size and tumor motion amplitude in the considered data set. However, tumors adhering to non-lung structures degrade local lung dynamics significantly and the model-based prediction accuracy is lower in these cases. 
The statistical respiratory motion model is capable of providing valuable prior knowledge in many fields of application. We present two examples of possible applications in radiation therapy and image-guided diagnosis.
Refractive Outcomes of 20 Eyes Undergoing ICL Implantation for Correction of Hyperopic Astigmatism.
Coskunseven, Efekan; Kavadarli, Isilay; Sahin, Onurcan; Kayhan, Belma; Pallikaris, Ioannis
2017-09-01
To analyze 1-week, 1-month, and 12-month postoperative refractive outcomes of eyes that underwent ICL implantation to correct hyperopic astigmatism. The study enrolled 20 eyes of patients with an average age of 32 years (range: 21 to 40 years). The outcomes of spherical and cylindrical refraction, uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), vault, and angle parameters were evaluated 1 week, 1 month, and 12 months postoperatively. The preoperative mean UDVA was 0.15 ± 0.11 (decimal) (20/133 Snellen) and increased to 0.74 ± 0.25 (20/27 Snellen) postoperatively, with a change of 0.59 (decimal) (20/33.9 Snellen) (P < .0001), which was statistically significant. The preoperative mean CDVA was 0.74 ± 0.25 (decimal) (20/27 Snellen) and increased to 0.78 ± 0.21 (20/25 Snellen), with a change of 0.03 (decimal) (20/666 Snellen) (P < .052), which was not statistically significant. The mean preoperative sphere was 6.86 ± 1.77 diopters (D) and the mean preoperative cylinder was -1.44 ± 0.88 D. The mean 12-month postoperative sphere decreased to 0.46 ± 0.89 D (P < .001) and cylinder decreased to -0.61 ± 0.46 D (P < .001), with a change of 6.40 D, both of which were statistically significant. The mean 1-month postoperative vault was 0.65 ± 0.13 mm and decreased to 0.613 ± 0.10 mm at 1 year postoperatively, with a change of 0.04 mm (P < .003). The preoperative/12-month and 1-month/12-month trabecular-iris angle (TIA), trabecular-iris space area 500 μm from the scleral spur (TISA500), and angle opening distance 500 μm from the scleral spur (AOD500) values were analyzed nasally, temporally, and inferiorly. All differences were statistically significant in the preoperative/12-month analysis. The only significant differences between the 1- and 12-month analyses were in the TISA500 inferior (P < .002) and AOD500 nasal (P = .031) values.
ICL hyperopic toric implantation is a safe method and provides stable refractive outcomes in patients with high hyperopia (up to 10.00 D) and astigmatism (up to 6.00 D). [J Refract Surg. 2017;33(9):604-609.]. Copyright 2017, SLACK Incorporated.
Peters, Marloes J M; Wierts, Roel; Jutten, Elisabeth M C; Halders, Servé G E A; Willems, Paul C P H; Brans, Boudewijn
2015-11-01
A complication after spinal fusion surgery is pseudarthrosis, but its radiological diagnosis is of limited value. (18)F-fluoride PET, with its ability to assess bone metabolic activity, could be of value. The goal of this study was to assess the clinical feasibility of calculating the static standardized uptake value (SUV) from a short dynamic scan without the use of blood sampling, thereby obtaining all dynamic and static parameters in a scan of only 30 min. This approach was tested on a retrospective patient population with persisting pain after spinal fusion surgery. In 16 patients, SUVs (SUVmax, SUVmean) and kinetic parameters (K1, k2, k3, vb, Ki,NLR, K1/k2, k3/(k2 + k3), Ki,patlak) were derived from static and dynamic PET/CT scans of operated and control regions of the spine, after intravenous administration of 156-214 MBq (18)F-fluoride. Parameter differences between control and operated regions, as well as between pseudarthrosis and fused segments, were evaluated. SUVmean at 30 and 60 min was calculated from kinetic parameters obtained from the dynamic data set (SUVmean,2TCM). Agreement between measured and calculated SUVs was evaluated through Bland-Altman plots. Overall, statistically significant differences between control and operated regions were observed for SUVmax, SUVmean, Ki,NLR, Ki,patlak, K1/k2 and k3/(k2 + k3). Diagnostic CT showed pseudarthrosis in 6/16 patients, while in 10/16 patients segments were fused. Of all parameters, only those describing the incorporation of bone [Ki,NLR, Ki,patlak, k3/(k2 + k3)] differed statistically significantly in the intervertebral disc space between the pseudarthrosis and fused patient groups. The mean values of the patient-specific blood clearance rate [Formula: see text] differed statistically significantly between the pseudarthrosis and fusion groups (p = 0.011).
This may correspond with the lack of statistical significance of the SUV values between the pseudarthrosis and fused patient groups. Bland-Altman plots show that the calculated SUVmean,2TCM values corresponded well with the measured SUVmean values. This study shows the feasibility of a 30-min dynamic (18)F-fluoride PET/CT scan, which may provide dynamic parameters clinically relevant to the diagnosis of pseudarthrosis.
Crossing statistics of laser light scattered through a nanofluid.
Arshadi Pirlar, M; Movahed, S M S; Razzaghi, D; Karimzadeh, R
2017-09-01
In this paper, we investigate the crossing statistics of speckle patterns formed in the Fresnel diffraction region by a laser beam scattering through a nanofluid. We extend zero-crossing statistics to assess the dynamical properties of the nanofluid. Based on the joint probability density function of the laser beam fluctuation and its time derivative, the theoretical frameworks for Gaussian and non-Gaussian regimes are revisited. We count the number of crossings not only at the zero level but at all available thresholds to determine the average speed of the moving particles. Because crossing statistics are determined within a probabilistic framework, Gaussianity is not assumed a priori; therefore, even in the presence of deviations from Gaussian fluctuations, this modified approach can compute relevant quantities, such as the mean speed, more precisely. We introduce the generalized total crossing, a weighted summation of crossings over all thresholds that quantifies small deviations from Gaussian statistics. This criterion can also control the contributions of noise and trends, allowing reliable physical quantities to be inferred. The characteristic time scale for successive crossings at a given threshold is defined. In our experimental setup, we find that increasing the sample temperature leads to greater consistency between the Gaussian and perturbative non-Gaussian predictions. The maximum number of crossings does not necessarily occur at the mean level, indicating that levels other than zero must be taken into account to achieve more accurate assessments.
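The threshold-crossing count at the heart of this approach can be sketched in a few lines. The toy implementation below (not the authors' code; the weights passed to the generalized total crossing are placeholders) counts sign changes of a sampled signal about an arbitrary level and forms a weighted sum over thresholds:

```python
import math

def crossings(signal, level):
    """Count sign changes of (signal - level) between successive samples."""
    return sum(1 for a, b in zip(signal, signal[1:])
               if (a - level) * (b - level) < 0)

def generalized_total_crossings(signal, levels, weights):
    """Weighted sum of crossing counts over many thresholds (schematic)."""
    return sum(w * crossings(signal, q) for q, w in zip(levels, weights))

# A sine sampled over four periods crosses each intermediate level eight times.
x = [8.0 * math.pi * i / 4000.0 + 0.1 for i in range(4000)]
s = [math.sin(v) for v in x]
print(crossings(s, 0.0), crossings(s, 0.5))  # 8 8
```

For a stochastic speckle signal the same counting, accumulated over many thresholds, is what feeds the statistics discussed in the abstract.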
Cınar, Yasin; Cingü, Abdullah Kürşat; Türkcü, Fatih Mehmet; Çınar, Tuba; Yüksel, Harun; Özkurt, Zeynep Gürsel; Çaça, Ihsan
2014-09-01
To compare outcomes of accelerated and conventional corneal cross-linking (CXL) for progressive keratoconus (KC). Patients were divided into two groups, the accelerated CXL group and the conventional CXL group. The uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), refraction and keratometric values were measured preoperatively and postoperatively. The data of the two groups were compared statistically. The mean UDVA and CDVA were better at six months postoperatively compared with preoperative values in both groups. While the change in UDVA and CDVA was statistically significant in the accelerated CXL group (p = 0.035 and p = 0.047, respectively), it did not reach statistical significance in the conventional CXL group (p = 0.184 and p = 0.113, respectively). The decreases in the mean corneal power (Km) and maximum keratometric value (Kmax) were statistically significant in both groups (p = 0.012 and 0.046, respectively, in the accelerated CXL group; p = 0.012 and 0.041, respectively, in the conventional CXL group). There was no statistically significant difference in visual and refractive results between the two groups (p > 0.05). Refractive and visual results of the accelerated CXL method and the conventional CXL method for the treatment of KC were similar over this short follow-up period. The accelerated CXL method is faster and provides higher patient throughput.
Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses
Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah
2015-01-01
Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
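The core idea of blending reference panels can be illustrated with a toy sketch (this is not the ADAPT-Mix implementation; the matrices, the Frobenius-error objective, and the grid search are illustrative assumptions): estimate a target LD correlation matrix as a convex combination of panel matrices, choosing the mixing weight that best matches the observed structure.

```python
def frob_err(A, B):
    """Squared Frobenius distance between two matrices (lists of lists)."""
    return sum((x - y) ** 2 for ra, rb in zip(A, B) for x, y in zip(ra, rb))

def mix(panels, weights):
    """Entrywise convex combination of panel correlation matrices."""
    n = len(panels[0])
    return [[sum(w * P[i][j] for w, P in zip(weights, panels)) for j in range(n)]
            for i in range(n)]

def best_weight(panels, target, step=0.01):
    """Grid-search the mixing weight on two panels minimizing squared error."""
    grid = [i * step for i in range(int(round(1 / step)) + 1)]
    return min(grid, key=lambda w: frob_err(mix(panels, (w, 1 - w)), target))

# Hypothetical LD matrices for two panels; the "observed" target is a 70/30 blend.
P1 = [[1.0, 0.8], [0.8, 1.0]]
P2 = [[1.0, 0.2], [0.2, 1.0]]
target = mix([P1, P2], (0.7, 0.3))
print(round(best_weight([P1, P2], target), 2))  # 0.7
```

The actual method estimates such mixtures locally along the genome and for arbitrary numbers of panels, but the convex-combination intuition is the same.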
Filter Tuning Using the Chi-Squared Statistic
NASA Technical Reports Server (NTRS)
Lilly-Salkowski, Tyler B.
2017-01-01
This paper examines the use of the Chi-squared statistic as a means of evaluating filter performance. The goal of the process is to characterize filter performance in the metric of covariance realism. The Chi-squared statistic is calculated to determine the realism of a covariance based on the prediction accuracy and the covariance values at a given point in time. Once calculated, it is the distribution of this statistic that provides insight into the accuracy of the covariance. The process of tuning an Extended Kalman Filter (EKF) for Aqua and Aura support is described, including examination of the measurement errors of available observation types and methods of dealing with potentially volatile atmospheric drag modeling. Predictive accuracy and the distribution of the Chi-squared statistic, calculated from EKF solutions, are assessed.
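A minimal sketch of the covariance-realism test described here: for a filter with state error e and covariance P, the normalized estimation error squared (NEES), e^T P^-1 e, follows a chi-squared distribution with degrees of freedom equal to the state dimension when the covariance is realistic. The 2-state example below is illustrative only, not the EKF tuning code itself:

```python
def nees(error, cov):
    """Normalized estimation error squared e^T P^-1 e for a 2-state example."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    ex, ey = error
    # Inverse of the 2x2 covariance, applied directly to the error vector.
    ix = (d * ex - b * ey) / det
    iy = (-c * ex + a * ey) / det
    return ex * ix + ey * iy

# With a realistic covariance, NEES averages the state dimension (here 2),
# i.e. it is chi-squared distributed with 2 degrees of freedom.
P = [[4.0, 0.0], [0.0, 1.0]]
print(nees((2.0, 1.0), P))  # 2.0
```

Tuning then amounts to adjusting filter noise parameters until the empirical distribution of this statistic over many epochs matches the theoretical chi-squared distribution.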
Ocular changes in primary hypothyroidism
2009-01-01
Background To determine the ocular changes related to hypothyroidism in newly diagnosed patients without orbitopathy. Findings Thirty-three patients diagnosed with primary overt hypothyroidism were enrolled in the study. All subjects underwent central corneal thickness (CCT), anterior chamber volume, depth and angle measurements with the Scheimpflug camera (Pentacam, Oculus) and cup-to-disc ratio (C/D), mean retinal thickness and mean retinal nerve fiber layer (RNFL) thickness measurements with optical coherence tomography (OCT), in addition to ophthalmological examination, preceding the replacement therapy and at the 1st, 3rd and 6th months of treatment. The mean age of the patients included in the study was 40.58 ± 1.32 years. The thyroid hormone levels returned to normal in all patients during the follow-up period; however, the mean intraocular pressure (IOP) showed no significant change. The mean CCT was 538.05 ± 3.85 μm initially and demonstrated no statistically significant change, nor did the anterior chamber volume, depth and angle measurements. The mean C/D ratio was 0.29 ± 0.03 and the mean retinal thickness was 255.83 ± 19.49 μm initially, and the treatment did not give rise to any significant change. The mean RNFL thickness was also stable during the control visits, so no statistically significant change was encountered. Conclusions Neither hypothyroidism nor its replacement therapy gave rise to any change in IOP, CCT, anterior chamber parameters, RNFL, retinal thickness or C/D ratio. PMID:20040111
Evidence-based orthodontics. Current statistical trends in published articles in one journal.
Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J
2010-09-01
To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. The original AJODO articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the share of original articles using statistics stayed relatively constant from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%) and an 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).
Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.
Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V
2018-04-01
A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
Global health business: the production and performativity of statistics in Sierra Leone and Germany.
Erikson, Susan L
2012-01-01
The global push for health statistics and electronic digital health information systems is about more than tracking health incidence and prevalence. It is also experienced on the ground as means to develop and maintain particular norms of health business, knowledge, and decision- and profit-making that are not innocent. Statistics make possible audit and accountability logics that undergird the management of health at a distance and that are increasingly necessary to the business of health. Health statistics are inextricable from their social milieus, yet as business artifacts they operate as if they are freely formed, objectively originated, and accurate. This article explicates health statistics as cultural forms and shows how they have been produced and performed in two very different countries: Sierra Leone and Germany. In both familiar and surprising ways, this article shows how statistics and their pursuit organize and discipline human behavior, constitute subject positions, and reify existing relations of power.
Participatory Equity and Student Outcomes in Living-Learning Programs of Differing Thematic Types
ERIC Educational Resources Information Center
Soldner, Matthew Edward
2011-01-01
This study evaluated participatory equity in varying thematic types of living-learning programs and, for a subset of student group x program type combinations found to be below equity, used latent mean modeling to determine whether statistically significant mean differences existed between the outcome scores of living-learning participants and…
Child-Centered Play Therapy in the Schools: Review and Meta-Analysis
ERIC Educational Resources Information Center
Ray, Dee C.; Armstrong, Stephen A.; Balkin, Richard S.; Jayne, Kimberly M.
2015-01-01
The authors conducted a meta-analysis and systematic review that examined 23 studies evaluating the effectiveness of child centered play therapy (CCPT) conducted in elementary schools. Meta-analysis results were explored using a random effects model for mean difference and mean gain effect size estimates. Results revealed statistically significant…
Teaching the Concept of the Sampling Distribution of the Mean
ERIC Educational Resources Information Center
Aguinis, Herman; Branstetter, Steven A.
2007-01-01
The authors use proven cognitive and learning principles and recent developments in the field of educational psychology to teach the concept of the sampling distribution of the mean, which is arguably one of the most central concepts in inferential statistics. The proposed pedagogical approach relies on cognitive load, contiguity, and experiential…
Climate change and the detection of trends in annual runoff
McCabe, G.J.; Wolock, D.M.
1997-01-01
This study examines the statistical likelihood of detecting a trend in annual runoff given an assumed change in mean annual runoff, the underlying year-to-year variability in runoff, and the serial correlation of annual runoff. Means, standard deviations, and lag-1 serial correlations of annual runoff were computed for 585 stream gages in the conterminous United States, and these statistics were used to compute the probability of detecting a prescribed trend in annual runoff. Assuming a linear 20% change in mean annual runoff over a 100 yr period and a 95% confidence level, the average probability of detecting a significant trend was 28% among the 585 stream gages. The largest probabilities of detecting a trend were in the northwestern U.S., the Great Lakes region, the northeastern U.S., the Appalachian Mountains, and parts of the northern Rocky Mountains. The smallest probabilities of trend detection were in the central and southwestern U.S. and in Florida. Low probabilities of trend detection were associated with low ratios of mean annual runoff to the standard deviation of annual runoff and with high lag-1 serial correlation in the data.
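A simplified version of such a power calculation can be sketched with a normal approximation (the study's exact procedure may differ; the effective-sample-size adjustment for lag-1 serial correlation used below is one common convention, not necessarily the authors'):

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def trend_detection_power(slope, sigma, n, r=0.0, z_crit=1.959964):
    """Approximate probability of detecting a linear trend in an annual series.

    Normal approximation to the OLS slope test; lag-1 serial correlation r is
    handled with the effective sample size n_eff = n * (1 - r) / (1 + r).
    A simplified sketch, not the study's exact procedure.
    """
    n_eff = n * (1.0 - r) / (1.0 + r)
    sxx = n_eff * (n_eff ** 2 - 1.0) / 12.0  # sum of squared time deviations
    se = sigma / math.sqrt(sxx)              # standard error of the slope
    return phi(abs(slope) / se - z_crit)

# A 20% change of a mean runoff of 100 over 100 yr (slope 0.2/yr), sd of 50:
print(round(trend_detection_power(0.2, 50.0, 100, r=0.2), 2))  # ≈ 0.09
```

The sketch reproduces the qualitative findings: power falls as the runoff standard deviation grows relative to the trend and as lag-1 serial correlation increases.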
Pope, L.M.; Arruda, J.A.; Fromm, C.H.
1988-01-01
The formation of carcinogenic trihalomethanes during the treatment of public surface water supplies has become a potentially serious problem. The U.S. Geological Survey, in cooperation with the Kansas Department of Health and Environment, investigated the potential for trihalomethane formation in water from 15 small, public water supply lakes in eastern Kansas from April 1984 through April 1986 in order to define the principal factors that affect or control the potential for trihalomethane formation during the water treatment process. Relations of mean concentrations of trihalomethane-formation potential to selected water quality and lake and watershed physical characteristics were investigated using correlation and regression analysis. Statistically significant, direct relations were developed between trihalomethanes produced in unfiltered and filtered lake water and mean concentrations of total and dissolved organic carbon. Correlation coefficients for these relations ranged from 0.86 to 0.93. Mean values of maximum depth of lake were shown to have statistically significant inverse relations to mean concentrations of trihalomethane-formation potential and total and dissolved organic carbon. Correlation coefficients for these relations ranged from -0.76 to -0.81. (USGS)
Experience and limited lighting may affect sleepiness of tunnel workers.
Lykouras, Dimosthenis; Karkoulias, Kiriakos; Patouchas, Dimitrios; Lakoumentas, John; Sampsonas, Fotis; Tranou, Magdalini-Konstantina; Faliagka, Evanthia; Tsakalidis, Athanasios; Spiropoulos, Kostas
2014-07-03
Working on shifts, especially the night shift, influences the endogenous sleep regulation system, leading to diminished sleep time and increased somnolence. We attempted to evaluate the impact of shift work on sleepiness and to correlate the sleepiness score with experience in a shift schedule. This cross-sectional study comprised 42 male and 2 female workers involved in a tunnel construction. They underwent spirometry and pulse oximetry and were asked to complete the Epworth Sleepiness Scale questionnaire. Statistical analysis revealed that workers with lower Epworth scores had a mean age of 43.6 years, compared with a mean age of 36.4 years for workers with higher scores. Furthermore, workers with lower Epworth scores had a mean of 14.8 shift years, while those with higher scores had a mean of 8 shift years. The shift schedule did not reveal any statistically significant correlation. Workers employed for a longer time had diminished sleepiness. However, there was no relationship between night shifts and sleepiness, possibly because of exposure to artificial lighting in the construction site.
The score statistic of the LD-lod analysis: detecting linkage adaptive to linkage disequilibrium.
Huang, J; Jiang, Y
2001-01-01
We study the properties of a modified lod score method for testing linkage that incorporates linkage disequilibrium (LD-lod). By examination of its score statistic, we show that the LD-lod score method adaptively combines two sources of information: (a) the IBD sharing score which is informative for linkage regardless of the existence of LD and (b) the contrast between allele-specific IBD sharing scores which is informative for linkage only in the presence of LD. We also consider the connection between the LD-lod score method and the transmission-disequilibrium test (TDT) for triad data and the mean test for affected sib pair (ASP) data. We show that, for triad data, the recessive LD-lod test is asymptotically equivalent to the TDT; and for ASP data, it is an adaptive combination of the TDT and the ASP mean test. We demonstrate that the LD-lod score method has relatively good statistical efficiency in comparison with the ASP mean test and the TDT for a broad range of LD and the genetic models considered in this report. Therefore, the LD-lod score method is an interesting approach for detecting linkage when the extent of LD is unknown, such as in a genome-wide screen with a dense set of genetic markers. Copyright 2001 S. Karger AG, Basel
Nasri Nasrabadi, Mohammad Reza; Razavi, Seyed Hadi
2010-04-01
In this work, we applied statistical experimental design to a fed-batch process for optimization of tricarboxylic acid (TCA) cycle intermediates in order to achieve high-level production of canthaxanthin from Dietzia natronolimnaea HS-1 cultured in beet molasses. A fractional factorial design (screening test) was first conducted on five TCA cycle intermediates. Of the five TCA cycle intermediates investigated via screening tests, alpha-ketoglutarate, oxaloacetate and succinate were selected based on their statistically significant (P<0.05) and positive effects on canthaxanthin production. These significant factors were optimized by means of response surface methodology (RSM) in order to achieve high-level production of canthaxanthin. The experimental results of the RSM were fitted with a second-order polynomial equation by means of a multiple regression technique to identify the relationship between canthaxanthin production and the three TCA cycle intermediates. By means of this statistical design under a fed-batch process, the optimum conditions required to achieve the highest level of canthaxanthin (13,172 ± 25 μg l(-1)) were determined as follows: alpha-ketoglutarate, 9.69 mM; oxaloacetate, 8.68 mM; succinate, 8.51 mM. Copyright 2009 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Hyatt, M.W.; Hubert, W.A.
2001-01-01
We assessed relative weight (Wr) distributions among 291 samples of stock-to-quality-length brook trout Salvelinus fontinalis, brown trout Salmo trutta, rainbow trout Oncorhynchus mykiss, and cutthroat trout O. clarki from lentic and lotic habitats. Statistics describing Wr sample distributions varied slightly among species and habitat types. The average sample was leptokurtic and slightly skewed to the right, with a standard deviation of about 10, but the shapes of the Wr distributions varied widely among samples. Twenty-two percent of the samples had nonnormal distributions, suggesting the need to evaluate sample distributions before applying statistical tests to determine whether assumptions are met. In general, our findings indicate that samples of about 100 stock-to-quality-length fish are needed to obtain confidence interval widths of four Wr units around the mean. Power analysis revealed that samples of about 50 stock-to-quality-length fish are needed to detect a 2% change in mean Wr at a relatively high level of power (beta = 0.01, alpha = 0.05).
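The "about 100 fish" figure follows from the standard relation between confidence-interval width and sample size. Assuming a known standard deviation and a normal approximation (a simplification of whatever exact procedure the authors used), a quick check reproduces it:

```python
import math

def n_for_ci_width(sd, width, z=1.959964):
    """Samples so a 95% CI for the mean spans `width` units (known-sd normal approx.)."""
    # Half-width = z * sd / sqrt(n)  =>  n = (2 * z * sd / width)^2
    return math.ceil((2.0 * z * sd / width) ** 2)

# sd of ~10 Wr units, target CI width of 4 Wr units -> about 100 fish
print(n_for_ci_width(10.0, 4.0))  # 97
```

Halving the target width quadruples the required sample, which is why tight Wr comparisons demand large samples.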
Synaptic Transmission Optimization Predicts Expression Loci of Long-Term Plasticity.
Costa, Rui Ponte; Padamsey, Zahid; D'Amour, James A; Emptage, Nigel J; Froemke, Robert C; Vogels, Tim P
2017-09-27
Long-term modifications of neuronal connections are critical for reliable memory storage in the brain. However, their locus of expression, pre- or postsynaptic, is highly variable. Here we introduce a theoretical framework in which long-term plasticity performs an optimization of the postsynaptic response statistics toward a given mean with minimal variance. Consequently, the state of the synapse at the time of plasticity induction determines the ratio of pre- and postsynaptic modifications. Our theory explains the experimentally observed expression loci of the hippocampal and neocortical synaptic potentiation studies we examined. Moreover, the theory predicts presynaptic expression of long-term depression, consistent with experimental observations. At inhibitory synapses, the theory suggests a statistically efficient excitatory-inhibitory balance in which changes in inhibitory postsynaptic response statistics specifically target the mean excitation. Our results provide a unifying theory for understanding the expression mechanisms and functions of long-term synaptic transmission plasticity. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
The effects of BleedArrest on hemorrhage control in a porcine model.
Gegel, Brian; Burgert, James; Loughren, Michael; Johnson, Don
2012-01-01
The purpose of this study was to examine the effectiveness of the hemostatic agent BleedArrest compared to a control. This was a prospective, experimental design employing an established porcine model of uncontrolled hemorrhage. The minimum number of animals (n = 10 per group) was used to obtain a statistically valid result. There were no statistically significant differences between the groups (P > .05), indicating that the groups were equivalent on the following parameters: activated clotting time, subject weights, core body temperatures, amount of one-minute hemorrhage, arterial blood pressures, and the amount and percentage of total blood volume. There was a significant difference in the amount of hemorrhage (P = .033) between BleedArrest (mean = 72 mL, SD = 72 mL) and control (mean = 317.30 mL, SD = 112.02 mL). BleedArrest was statistically and clinically superior at controlling hemorrhage compared to the standard pressure dressing control group. In conclusion, BleedArrest is an effective hemostatic agent for use in civilian and military trauma management.
Statistical mechanics framework for static granular matter.
Henkes, Silke; Chakraborty, Bulbul
2009-06-01
The physical properties of granular materials have been extensively studied in recent years. So far, however, there exists no theoretical framework which can explain the observations in a unified manner beyond the phenomenological jamming diagram. This work focuses on the case of static granular matter, for which we have constructed a statistical ensemble which mirrors equilibrium statistical mechanics. This ensemble, which is based on the conservation properties of the stress tensor, is distinct from the original Edwards ensemble and applies to packings of deformable grains. We combine it with a field-theoretical analysis of the packings, where the field is the Airy stress function derived from the force and torque balance conditions. In this framework, point J is characterized by a diverging stiffness of the pressure fluctuations. Separately, we present a phenomenological mean-field theory of the jamming transition, which incorporates the mean contact number as a variable. We link both approaches in the context of the marginal rigidity picture proposed by Wyart and others.
Collective behaviours: from biochemical kinetics to electronic circuits
Agliari, Elena; Barra, Adriano; Burioni, Raffaella; Di Biasio, Aldo; Uguzzoni, Guido
2013-01-01
In this work we aim to highlight a close analogy between cooperative behaviors in chemical kinetics and cybernetics; this is realized by using a common language for their description, namely mean-field statistical mechanics. First, we perform a one-to-one mapping between paradigmatic behaviors in chemical kinetics (i.e., non-cooperative, cooperative, ultra-sensitive, anti-cooperative) and in mean-field statistical mechanics (i.e., paramagnetic, high- and low-temperature ferromagnetic, anti-ferromagnetic). Interestingly, the statistical mechanics approach allows a unified, broad theory for all scenarios; in particular, the Michaelis-Menten, Hill and Adair equations are consistently recovered. This framework is then tested against experimental biological data with overall excellent agreement. Going one step further, we consistently read the whole mapping from a cybernetic perspective, highlighting deep structural analogies between the above-mentioned kinetics and fundamental building blocks in electronics (i.e., operational amplifiers, flashes, flip-flops), so as to build a clear bridge linking biochemical kinetics and cybernetics. PMID:24322327
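One concrete piece of the kinetics side of the mapping: the Hill equation, which the framework recovers, reduces to Michaelis-Menten kinetics when the cooperativity exponent n equals 1, while n > 1 produces the sigmoidal, ultra-sensitive response. A minimal numerical illustration:

```python
def hill(s, vmax, k, n):
    """Hill kinetics: v = Vmax * s^n / (K^n + s^n); n = 1 is Michaelis-Menten."""
    return vmax * s ** n / (k ** n + s ** n)

# Half-saturation occurs at s = K for any n; cooperativity (n > 1)
# sharpens the response above K and suppresses it below K.
print(hill(2.0, 1.0, 2.0, 1))  # 0.5
print(hill(2.0, 1.0, 2.0, 4))  # 0.5
```

In the magnetic analogy of the paper, this steepening with n plays the role of the sharpened response of a cooperative (ferromagnetic-like) system.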
Conditional statistics in a turbulent premixed flame derived from direct numerical simulation
NASA Technical Reports Server (NTRS)
Mantel, Thierry; Bilger, Robert W.
1994-01-01
The objective of this paper is to briefly introduce conditional moment closure (CMC) methods for premixed systems and to derive the transport equation for the conditional species mass fraction conditioned on the progress variable based on the enthalpy. Our statistical analysis will be based on the 3-D DNS database of Trouve and Poinsot available at the Center for Turbulence Research. The initial conditions and characteristics (turbulence, thermo-diffusive properties) as well as the numerical method utilized in the DNS of Trouve and Poinsot are presented, and some details concerning our statistical analysis are also given. From the analysis of DNS results, the effects of the position in the flame brush, of the Damkoehler and Lewis numbers on the conditional mean scalar dissipation, and conditional mean velocity are presented and discussed. Information concerning unconditional turbulent fluxes are also presented. The anomaly found in previous studies of counter-gradient diffusion for the turbulent flux of the progress variable is investigated.
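The conditional statistics discussed here rest on a simple operation: binning a scalar field by the value of the progress variable and averaging within each bin. A minimal sketch on synthetic data (the DNS fields themselves are not reproduced here, and the scalar profile below is purely illustrative):

```python
import random

# Conditional mean of a scalar (e.g. dissipation) conditioned on a
# progress variable c in [0, 1], estimated by binning. This is the basic
# operation behind conditional moment closure (CMC) statistics.
def conditional_mean(c_values, scalar_values, n_bins=10):
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for c, s in zip(c_values, scalar_values):
        b = min(int(c * n_bins), n_bins - 1)  # bin index for c in [0, 1]
        sums[b] += s
        counts[b] += 1
    return [sums[i] / counts[i] if counts[i] else None for i in range(n_bins)]

random.seed(0)
c = [random.random() for _ in range(10000)]
# Illustrative scalar that peaks mid-flame-brush, plus measurement noise:
scalar = [4.0 * ci * (1.0 - ci) + random.gauss(0.0, 0.05) for ci in c]
means = conditional_mean(c, scalar)
```

In an actual CMC analysis the bin averages would additionally be taken at fixed position in the flame brush, as described above.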
Text grouping in patent analysis using adaptive K-means clustering algorithm
NASA Astrophysics Data System (ADS)
Shanie, Tiara; Suprijadi, Jadi; Zulhanif
2017-03-01
Patents are a form of intellectual property, and patent analysis is essential for understanding the development of technology in each country and worldwide. This study uses patent documents about green tea retrieved from the Espacenet server. Patent documents related to technology in the field of tea are numerous and widespread, which makes information retrieval (IR) difficult for users. It is therefore necessary to categorize the documents into specific groups according to the related terms they contain. This study applies statistical text mining to the patent title data on green tea in two phases: a data preparation stage and a data analysis stage. The data preparation stage uses text mining methods, and the data analysis stage is carried out statistically, using a cluster analysis algorithm, the adaptive K-means clustering algorithm. The results show that, based on the maximum silhouette value, the analysis generates 87 clusters associated with fifteen terms that can be utilized to meet information retrieval needs.
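For readers unfamiliar with the silhouette criterion used to fix the number of clusters, the sketch below shows the idea on toy 1-D data. It uses plain Lloyd's k-means rather than the adaptive variant of the study, and all data are illustrative:

```python
import random

# Choose k by maximum mean silhouette: run k-means for each candidate k
# and keep the k whose clustering scores highest.
def kmeans(points, k, iters=50, seed=1):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(p - centers[j])) for p in points]
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels

def mean_silhouette(points, labels):
    scores = []
    for i, p in enumerate(points):
        own = labels[i]
        same = [abs(p - q) for j, (q, l) in enumerate(zip(points, labels))
                if l == own and j != i]
        if not same:  # silhouette is defined as 0 for singleton clusters
            scores.append(0.0)
            continue
        a = sum(same) / len(same)  # mean intra-cluster distance
        b = min(                   # mean distance to the nearest other cluster
            sum(abs(p - q) for q, l in zip(points, labels) if l == m)
            / sum(1 for l in labels if l == m)
            for m in set(labels) if m != own
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

points = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]  # two obvious groups
best_k = max([2, 3], key=lambda k: mean_silhouette(points, kmeans(points, k)))
```

On real patent titles the points would be document-term vectors and the distance a vector metric, but the silhouette-based selection of k works the same way.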
NASA Astrophysics Data System (ADS)
Liu, Deyang; An, Ping; Ma, Ran; Yang, Chao; Shen, Liquan; Li, Kai
2016-07-01
Three-dimensional (3-D) holoscopic imaging, also known as integral imaging, light field imaging, or plenoptic imaging, can provide natural and fatigue-free 3-D visualization. However, a large amount of data is required to represent the 3-D holoscopic content. Therefore, efficient coding schemes for this particular type of image are needed. A 3-D holoscopic image coding scheme with kernel-based minimum mean square error (MMSE) estimation is proposed. In the proposed scheme, the coding block is predicted by an MMSE estimator under statistical modeling. In order to obtain the signal's statistical behavior, kernel density estimation (KDE) is utilized to estimate the probability density function of the statistical model. As bandwidth estimation (BE) is a key issue in the KDE problem, we also propose a BE method based on the kernel trick. The experimental results demonstrate that the proposed scheme can achieve better rate-distortion performance and better visual rendering quality.
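The role KDE plays in the scheme can be sketched as follows. Silverman's rule-of-thumb bandwidth stands in for the paper's kernel-trick bandwidth estimator, whose details are not given in the abstract:

```python
import math
import random

# Gaussian kernel density estimation with Silverman's rule-of-thumb
# bandwidth (an illustrative stand-in for the paper's BE method).
def silverman_bandwidth(samples):
    n = len(samples)
    mean = sum(samples) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    return 1.06 * std * n ** (-0.2)

def kde(x, samples, h):
    # Sum of Gaussian kernels of width h centred on each sample.
    norm = 1.0 / (len(samples) * h * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(2000)]
h = silverman_bandwidth(data)
density_at_mode = kde(0.0, data, h)  # true density at 0 is 1/sqrt(2*pi)
```

In the coding scheme this estimated density feeds the MMSE predictor; here the estimate is simply checked against a known standard normal.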
De Hertogh, Benoît; De Meulder, Bertrand; Berger, Fabrice; Pierre, Michael; Bareke, Eric; Gaigneaux, Anthoula; Depiereux, Eric
2010-01-11
Recent reanalysis of spike-in datasets underscored the need for new and more accurate benchmark datasets for statistical microarray analysis. We present here a fresh method using biologically-relevant data to evaluate the performance of statistical methods. Our novel method ranks the probesets from a dataset composed of publicly-available biological microarray data and extracts subset matrices with precise information/noise ratios. Our method can be used to determine the capability of different methods to better estimate variance for a given number of replicates. The mean-variance and mean-fold change relationships of the matrices revealed a closer approximation of biological reality. Performance analysis refined the results from benchmarks published previously. We show that the Shrinkage t test (close to Limma) was the best of the methods tested, except when two replicates were examined, where the Regularized t test and the Window t test performed slightly better. The R scripts used for the analysis are available at http://urbm-cluster.urbm.fundp.ac.be/~bdemeulder/.
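The Shrinkage t test singled out here stabilises the per-gene variance by pulling it toward a pooled prior value, which matters precisely when replicates are few. The following is a hedged sketch of such a moderated statistic; the prior weight and prior variance below are illustrative placeholders, not the fitted empirical-Bayes values a real implementation (e.g. Limma) would estimate:

```python
import statistics

# Variance-shrinkage ("moderated") two-sample t statistic: the pooled
# sample variance is shrunk toward a prior variance s0_sq with prior
# weight d0 (both illustrative here), stabilising the denominator when
# the number of replicates is small.
def shrinkage_t(group_a, group_b, s0_sq, d0=3.0):
    na, nb = len(group_a), len(group_b)
    d = na + nb - 2  # residual degrees of freedom
    sp_sq = ((na - 1) * statistics.variance(group_a)
             + (nb - 1) * statistics.variance(group_b)) / d
    shrunk = (d0 * s0_sq + d * sp_sq) / (d0 + d)  # moderated variance
    se = (shrunk * (1 / na + 1 / nb)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / se

# Three replicates per group, a clearly shifted expression level:
t = shrinkage_t([5.1, 5.3, 4.9], [3.0, 3.2, 2.8], s0_sq=0.05)
```

With only three replicates per group, the shrunk variance keeps a single unluckily small sample variance from inflating the statistic, which is the behaviour the benchmark rewards.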
Uppal, S; Nadig, S; Jones, C; Nicolaides, A R; Coatesworth, A P
2004-06-01
The aim of this study was to compare laser palatoplasty with uvulectomy with punctate palatal diathermy as treatment modalities for snoring. The study design was a prospective, single-blind, randomized-controlled trial. Eighty-three patients entered the trial. After a mean follow-up period of more than 18 months there was no statistically significant difference between the two groups regarding the patient perception of benefit from surgery or the subjective improvement in snoring. However, there was a statistically significant difference in the degree of pain in the immediate postoperative period (mean difference = 22.14, 95% CI = 7.98-36.31, P = 0.003), with the pain being worse in the laser palatoplasty group. Relative risk of complications for laser palatoplasty was 1.42 (95% CI = 0.93-2.17). The snoring scores and Glasgow Benefit Inventory scores decreased with time in both the groups but there was no statistically significant difference between the two groups.
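The relative risk and its 95% confidence interval quoted above follow from standard 2x2-table formulas on the log scale. The counts in the sketch below are hypothetical, chosen only to illustrate the calculation; the trial's raw complication counts are not given in the abstract:

```python
import math

# Relative risk of an event in group 1 vs group 2, with a 95% CI
# computed on the log scale (standard large-sample formula).
def relative_risk_ci(a, n1, b, n2, z=1.96):
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical complication counts (24/42 vs 17/41), for illustration only:
rr, lo, hi = relative_risk_ci(24, 42, 17, 41)
```

A CI that straddles 1 (as the trial's 0.93 to 2.17 does) means the excess risk is not statistically significant at the 5% level.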
Body Weight Reducing Effect of Oral Boric Acid Intake
Aysan, Erhan; Sahin, Fikrettin; Telci, Dilek; Yalvac, Mehmet Emir; Emre, Sinem Hocaoglu; Karaca, Cetin; Muslumanoglu, Mahmut
2011-01-01
Background: Boric acid is widely used in biology, but its body weight reducing effect has not been researched. Methods: Twenty mice were divided into two equal groups. Control group mice drank standard tap water, while study group mice drank tap water with 0.28 mg/250 ml boric acid added, over five days. Total body weight changes, major organ histopathology, blood biochemistry, and urine and feces analyses were compared. Results: Study group mice lost a mean of 28.1% of their body weight, whereas the control group showed no weight loss and in fact a mean weight gain of 0.09% (p<0.001). Total drinking water and urine outputs were not statistically different. Cholesterol, LDL, AST, ALT, LDH, amylase, and urobilinogen levels were statistically significantly higher in the study group; the other variables were not statistically different. No histopathologic differences were detected on evaluation of all resected major organs. Conclusion: Low dose oral boric acid intake causes serious body weight reduction. Blood and urine analyses point to high glucose and lipid catabolism and moderate protein catabolism, but the mechanism is unclear. PMID:22135611
Body weight reducing effect of oral boric acid intake.
Aysan, Erhan; Sahin, Fikrettin; Telci, Dilek; Yalvac, Mehmet Emir; Emre, Sinem Hocaoglu; Karaca, Cetin; Muslumanoglu, Mahmut
2011-01-01
Boric acid is widely used in biology, but its body weight reducing effect has not been researched. Twenty mice were divided into two equal groups. Control group mice drank standard tap water, while study group mice drank tap water with 0.28 mg/250 ml boric acid added, over five days. Total body weight changes, major organ histopathology, blood biochemistry, and urine and feces analyses were compared. Study group mice lost a mean of 28.1% of their body weight, whereas the control group showed no weight loss and in fact a mean weight gain of 0.09% (p<0.001). Total drinking water and urine outputs were not statistically different. Cholesterol, LDL, AST, ALT, LDH, amylase, and urobilinogen levels were statistically significantly higher in the study group; the other variables were not statistically different. No histopathologic differences were detected on evaluation of all resected major organs. Low dose oral boric acid intake causes serious body weight reduction. Blood and urine analyses point to high glucose and lipid catabolism and moderate protein catabolism, but the mechanism is unclear.
Alexander, Neal; Cundill, Bonnie; Sabatelli, Lorenzo; Bethony, Jeffrey M.; Diemert, David; Hotez, Peter; Smith, Peter G.; Rodrigues, Laura C.; Brooker, Simon
2011-01-01
Vaccines against human helminths are being developed but the choice of optimal parasitological endpoints and effect measures to assess their efficacy has received little attention. Assuming negative binomial distributions for the parasite counts, we rank the statistical power of three measures of efficacy: ratio of mean parasite intensity at the end of the trial, the odds ratio of infection at the end of the trial, and the rate ratio of incidence of infection during the trial. We also use a modelling approach to estimate the likely impact of trial interventions on the force of infection, and hence statistical power. We conclude that (1) final mean parasite intensity is a suitable endpoint for later phase vaccine trials, and (2) mass effects of trial interventions are unlikely to appreciably reduce the force of infection in the community – and hence statistical power – unless there is a combination of high vaccine efficacy and a large proportion of the population enrolled. PMID:21435404
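Negative binomial counts of the kind assumed here are conveniently simulated as a gamma-Poisson mixture, and the efficacy measure ranked first above is the ratio of mean intensities. A sketch with illustrative parameter values (mean intensities, dispersion, and sample sizes are not from the paper):

```python
import math
import random

def poisson(lam, rng):
    # Knuth's multiplication method; adequate for the modest means here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def neg_binomial(mean, k_shape, rng):
    # Negative binomial as a gamma-Poisson mixture: a gamma-distributed
    # rate (shape k_shape, mean `mean`) fed into a Poisson draw.
    lam = rng.gammavariate(k_shape, mean / k_shape)
    return poisson(lam, rng)

rng = random.Random(42)
# Illustrative trial arms: 50% efficacy on mean intensity, heavy
# overdispersion (k = 0.5), 500 participants per arm.
control = [neg_binomial(20.0, 0.5, rng) for _ in range(500)]
vaccine = [neg_binomial(10.0, 0.5, rng) for _ in range(500)]
ratio = (sum(vaccine) / len(vaccine)) / (sum(control) / len(control))
```

Repeating such simulations under the null and alternative is the standard way to compare the statistical power of the three efficacy measures the paper ranks.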
Assessment of variations in thermal cycle life data of thermal barrier coated rods
NASA Astrophysics Data System (ADS)
Hendricks, R. C.; McDonald, G.
An analysis of thermal cycle life data for 22 thermal barrier coated (TBC) specimens was conducted. The ZrO2-8Y2O3/NiCrAlY plasma spray coated Rene 41 rods were tested in a Mach 0.3 Jet A/air burner flame. All specimens were subjected to the same coating and subsequent test procedures in an effort to control three parametric groups: material properties, geometry, and heat flux. Statistically, the data sample space had a mean of 1330 cycles with a standard deviation of 520 cycles. The data were described by normal or log-normal distributions, but other models could also apply; the sample size must be increased to clearly delineate a statistical failure model. The statistical methods were also applied to adhesive/cohesive strength data for 20 TBC discs of the same composition, with similar results. The sample space had a mean of 9 MPa with a standard deviation of 4.2 MPa.
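The two candidate failure models mentioned (normal and log-normal) can give quite different tail probabilities even when matched to the same reported moments (mean 1330 cycles, standard deviation 520 cycles), which is why a larger sample is needed to discriminate between them. A sketch of the comparison:

```python
import math

# Probability of failure before a given cycle count under a normal model
# and under a log-normal model matched to the same arithmetic moments.
def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def lognormal_cdf(x, mean, sd):
    # Convert arithmetic mean/sd to the underlying normal parameters.
    var_log = math.log(1.0 + (sd / mean) ** 2)
    mu_log = math.log(mean) - 0.5 * var_log
    return normal_cdf(math.log(x), mu_log, math.sqrt(var_log))

p_normal = normal_cdf(500.0, 1330.0, 520.0)
p_lognormal = lognormal_cdf(500.0, 1330.0, 520.0)
```

At 500 cycles the normal model predicts a failure probability several times larger than the log-normal model, so the choice of distribution matters for early-failure estimates even though both fit the bulk of the data.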
Assessment of variations in thermal cycle life data of thermal barrier coated rods
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Mcdonald, G.
1981-01-01
An analysis of thermal cycle life data for 22 thermal barrier coated (TBC) specimens was conducted. The ZrO2-8Y2O3/NiCrAlY plasma spray coated Rene 41 rods were tested in a Mach 0.3 Jet A/air burner flame. All specimens were subjected to the same coating and subsequent test procedures in an effort to control three parametric groups: material properties, geometry, and heat flux. Statistically, the data sample space had a mean of 1330 cycles with a standard deviation of 520 cycles. The data were described by normal or log-normal distributions, but other models could also apply; the sample size must be increased to clearly delineate a statistical failure model. The statistical methods were also applied to adhesive/cohesive strength data for 20 TBC discs of the same composition, with similar results. The sample space had a mean of 9 MPa with a standard deviation of 4.2 MPa.
Radon-222 concentrations in ground water and soil gas on Indian reservations in Wisconsin
DeWild, John F.; Krohelski, James T.
1995-01-01
For sites with wells finished in the sand and gravel aquifer, the coefficient of determination (R2) of the regression of concentration of radon-222 in ground water as a function of well depth is 0.003 and the significance level is 0.32, which indicates that there is not a statistically significant relation between radon-222 concentrations in ground water and well depth. The coefficient of determination of the regression of radon-222 in ground water and soil gas is 0.19 and the root mean square error of the regression line is 271 picocuries per liter. Even though the significance level (0.036) indicates a statistical relation, the root mean square error of the regression is so large that the regression equation would not give reliable predictions. Because of an inadequate number of samples, similar statistical analyses could not be performed for sites with wells finished in the crystalline and sedimentary bedrock aquifers.
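The quantities quoted above, the coefficient of determination (R²) and the root mean square error of a simple linear regression, can be sketched as follows. The data are synthetic, not the report's radon measurements:

```python
# Least-squares fit with the two diagnostics used in the report:
# coefficient of determination (R^2) and root-mean-square error (RMSE).
def linear_regression(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    fitted = [intercept + slope * x for x in xs]
    ss_res = sum((y - f) ** 2 for y, f in zip(ys, fitted))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot          # fraction of variance explained
    rmse = (ss_res / n) ** 0.5          # typical size of a residual
    return slope, intercept, r2, rmse

# Synthetic (depth, concentration) pairs for illustration:
xs = [10, 20, 30, 40, 50]
ys = [300, 350, 320, 380, 360]
slope, intercept, r2, rmse = linear_regression(xs, ys)
```

The report's point is visible in this framing: an R² near zero (0.003) means depth explains essentially none of the variance, and even a statistically significant fit is useless for prediction when the RMSE is large relative to the concentrations being predicted.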
Wavelet methodology to improve single unit isolation in primary motor cortex cells
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A.
2016-01-01
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein’s unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean squared error, root mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. PMID:25794461
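The core pipeline step, thresholding wavelet detail coefficients and reconstructing, can be sketched with a one-level Haar transform and soft thresholding. The study itself compares several mother wavelets (e.g. Daubechies 4) and threshold selection rules, which are not reproduced here:

```python
import math

# One-level Haar wavelet transform: pairwise averages (approximation)
# and pairwise differences (detail), both scaled by 1/sqrt(2).
def haar_forward(signal):
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    return approx, detail

def soft_threshold(coeffs, t):
    # Shrink each coefficient toward zero by t; zero out small ones.
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)])
    return out

noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]  # step signal + small noise
a, d = haar_forward(noisy)
denoised = haar_inverse(a, soft_threshold(d, 0.2))
```

Small detail coefficients (noise) are removed while the large-scale structure, here the step from 1 to 5, survives reconstruction; the study's thresholding schemes differ only in how the threshold t is chosen.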
Spatial trends in Pearson Type III statistical parameters
Lichty, R.W.; Karlinger, M.R.
1995-01-01
Spatial trends in the statistical parameters (mean, standard deviation, and skewness coefficient) of a Pearson Type III distribution of the logarithms of annual flood peaks for small rural basins (less than 90 km2) are delineated using a climate factor CT, (T=2-, 25-, and 100-yr recurrence intervals), which quantifies the effects of long-term climatic data (rainfall and pan evaporation) on observed T-yr floods. Maps showing trends in average parameter values demonstrate the geographically varying influence of climate on the magnitude of Pearson Type III statistical parameters. The spatial trends in variability of the parameter values characterize the sensitivity of statistical parameters to the interaction of basin-runoff characteristics (hydrology) and climate.
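A T-year flood quantile follows from the three Pearson Type III parameters mapped in the study (mean, standard deviation, and skew of the log peaks) via a frequency factor. The sketch below uses Kite's approximation to the frequency factor, with illustrative parameter values for log10 peaks:

```python
import math

def standard_normal_quantile(p):
    # Dependency-free inverse normal CDF by bisection on erf.
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def pearson3_quantile(mean_log, sd_log, skew, T):
    # Kite's frequency-factor approximation for the Pearson Type III
    # distribution; K adjusts the normal quantile z for skewness.
    z = standard_normal_quantile(1.0 - 1.0 / T)
    k = skew / 6.0
    K = (z + (z * z - 1) * k
         + (z ** 3 - 6 * z) * (k ** 2) / 3
         - (z * z - 1) * k ** 3
         + z * k ** 4
         + (k ** 5) / 3)
    return 10 ** (mean_log + K * sd_log)  # back-transform from log10 space

# Illustrative parameters (not from the study's maps): 100-yr peak flow.
q100 = pearson3_quantile(mean_log=2.5, sd_log=0.3, skew=0.2, T=100)
```

The study's maps effectively show how these three inputs, and hence quantiles like q100, vary geographically with the climate factor CT.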
Gaussian statistics for palaeomagnetic vectors
Love, J.J.; Constable, C.G.
2003-01-01
With the aim of treating the statistics of palaeomagnetic directions and intensities jointly and consistently, we represent the mean and the variance of palaeomagnetic vectors, at a particular site and of a particular polarity, by a probability density function in a Cartesian three-space of orthogonal magnetic-field components consisting of a single (unimodal) non-zero mean, spherically-symmetrical (isotropic) Gaussian function. For palaeomagnetic data of mixed polarities, we consider a bimodal distribution consisting of a pair of such symmetrical Gaussian functions, with equal, but opposite, means and equal variances. For both the Gaussian and bi-Gaussian distributions, and in the spherical three-space of intensity, inclination, and declination, we obtain analytical expressions for the marginal density functions, the cumulative distributions, and the expected values and variances for each spherical coordinate (including the angle with respect to the axis of symmetry of the distributions). The mathematical expressions for the intensity and off-axis angle are closed-form and especially manageable, with the intensity distribution being Rayleigh-Rician. In the limit of small relative vectorial dispersion, the Gaussian (bi-Gaussian) directional distribution approaches a Fisher (Bingham) distribution and the intensity distribution approaches a normal distribution. In the opposite limit of large relative vectorial dispersion, the directional distributions approach a spherically-uniform distribution and the intensity distribution approaches a Maxwell distribution. We quantify biases in estimating the properties of the vector field resulting from the use of simple arithmetic averages, such as estimates of the intensity or the inclination of the mean vector, or the variances of these quantities.
With the statistical framework developed here and using the maximum-likelihood method, which gives unbiased estimates in the limit of large data numbers, we demonstrate how to formulate the inverse problem, and how to estimate the mean and variance of the magnetic vector field, even when the data consist of mixed combinations of directions and intensities. We examine palaeomagnetic secular-variation data from Hawaii and Réunion, and although these two sites are on almost opposite latitudes, we find significant differences in the mean vector and differences in the local vectorial variances, with the Hawaiian data being particularly anisotropic. These observations are inconsistent with a description of the mean field as being a simple geocentric axial dipole and with secular variation being statistically symmetrical with respect to reflection through the equatorial plane. Finally, our analysis of palaeomagnetic acquisition data from the 1960 Kilauea flow in Hawaii and the Holocene Xitle flow in Mexico, is consistent with the widely held suspicion that directional data are more accurate than intensity data.
Gaussian statistics for palaeomagnetic vectors
NASA Astrophysics Data System (ADS)
Love, J. J.; Constable, C. G.
2003-03-01
With the aim of treating the statistics of palaeomagnetic directions and intensities jointly and consistently, we represent the mean and the variance of palaeomagnetic vectors, at a particular site and of a particular polarity, by a probability density function in a Cartesian three-space of orthogonal magnetic-field components consisting of a single (unimodal) non-zero mean, spherically-symmetrical (isotropic) Gaussian function. For palaeomagnetic data of mixed polarities, we consider a bimodal distribution consisting of a pair of such symmetrical Gaussian functions, with equal, but opposite, means and equal variances. For both the Gaussian and bi-Gaussian distributions, and in the spherical three-space of intensity, inclination, and declination, we obtain analytical expressions for the marginal density functions, the cumulative distributions, and the expected values and variances for each spherical coordinate (including the angle with respect to the axis of symmetry of the distributions). The mathematical expressions for the intensity and off-axis angle are closed-form and especially manageable, with the intensity distribution being Rayleigh-Rician. In the limit of small relative vectorial dispersion, the Gaussian (bi-Gaussian) directional distribution approaches a Fisher (Bingham) distribution and the intensity distribution approaches a normal distribution. In the opposite limit of large relative vectorial dispersion, the directional distributions approach a spherically-uniform distribution and the intensity distribution approaches a Maxwell distribution. We quantify biases in estimating the properties of the vector field resulting from the use of simple arithmetic averages, such as estimates of the intensity or the inclination of the mean vector, or the variances of these quantities. 
With the statistical framework developed here and using the maximum-likelihood method, which gives unbiased estimates in the limit of large data numbers, we demonstrate how to formulate the inverse problem, and how to estimate the mean and variance of the magnetic vector field, even when the data consist of mixed combinations of directions and intensities. We examine palaeomagnetic secular-variation data from Hawaii and Réunion, and although these two sites are on almost opposite latitudes, we find significant differences in the mean vector and differences in the local vectorial variances, with the Hawaiian data being particularly anisotropic. These observations are inconsistent with a description of the mean field as being a simple geocentric axial dipole and with secular variation being statistically symmetrical with respect to reflection through the equatorial plane. Finally, our analysis of palaeomagnetic acquisition data from the 1960 Kilauea flow in Hawaii and the Holocene Xitle flow in Mexico, is consistent with the widely held suspicion that directional data are more accurate than intensity data.
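The small-dispersion limit described above, in which the intensity distribution approaches a normal distribution about the mean intensity, is easy to check by simulating isotropic Gaussian vectors with a non-zero mean. Units and parameter values below are illustrative:

```python
import math
import random

# Isotropic Gaussian vectors about a non-zero mean along the symmetry
# axis: the model underlying the palaeomagnetic statistics above.
rng = random.Random(7)
mean_vec = (0.0, 0.0, 50.0)  # mean field along the axis (illustrative units)
sigma = 2.0                  # per-component dispersion (small relative to 50)

def draw_vector():
    return tuple(m + rng.gauss(0.0, sigma) for m in mean_vec)

# Intensity = vector length; in the small-dispersion limit it should be
# approximately normal with mean ~|mean_vec| and spread ~sigma.
intensities = [math.sqrt(sum(c * c for c in draw_vector())) for _ in range(5000)]
mean_intensity = sum(intensities) / len(intensities)
spread = (sum((x - mean_intensity) ** 2 for x in intensities)
          / len(intensities)) ** 0.5
```

Raising sigma far above the mean length would instead drive the intensity toward the Maxwell distribution of the large-dispersion limit.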
Koltun, G.F.
2013-01-01
This report presents the results of a study to assess potential water availability from the Atwood, Leesville, and Tappan Lakes, located within the Muskingum River Watershed, Ohio. The assessment was based on the criterion that water withdrawals should not appreciably affect maintenance of recreation-season pool levels in current use. To facilitate and simplify the assessment, it was assumed that historical lake operations were successful in maintaining seasonal pool levels, and that any discharges from lakes constituted either water that was discharged to prevent exceeding seasonal pool levels or discharges intended to meet minimum in-stream flow targets downstream from the lakes. It further was assumed that the volume of water discharged in excess of the minimum in-stream flow target is available for use without negatively impacting seasonal pool levels or downstream water uses and that all or part of it is subject to withdrawal. Historical daily outflow data for the lakes were used to determine the quantity of water that potentially could be withdrawn and the resulting quantity of water that would flow downstream (referred to as “flow-by”) on a daily basis as a function of all combinations of three hypothetical target minimum flow-by amounts (1, 2, and 3 times current minimum in-stream flow targets) and three pumping capacities (1, 2, and 3 million gallons per day). Using both U.S. Geological Survey streamgage data and lake-outflow data provided by the U.S. Army Corps of Engineers resulted in analytical periods ranging from 51 calendar years for the Atwood Lake to 73 calendar years for the Leesville and Tappan Lakes. The observed outflow time series and the computed time series of daily flow-by amounts and potential withdrawals were analyzed to compute and report order statistics (95th, 75th, 50th, 25th, 10th, and 5th percentiles) and means for the analytical period, in aggregate, and broken down by calendar month. 
In addition, surplus-water mass curve data were tabulated for each of the lakes. Monthly order statistics of computed withdrawals indicated that, for the three pumping capacities considered, increasing the target minimum flow-by amount tended to reduce the amount of water that can be withdrawn. The reduction was greatest in the lower percentiles of withdrawal; however, increasing the flow-by amount had no impact on potential withdrawals during high flow. In addition, for a given target minimum flow-by amount, increasing the pumping rate increased the total amount of water that could be withdrawn; however, that increase was less than a direct multiple of the increase in pumping rate for most flow statistics. Potential monthly withdrawals were observed to be more variable and more limited in some calendar months than others. Monthly order statistics and means of computed daily mean flow-by amounts indicated that flow-by amounts generally tended to be lowest during June–October and February. Increasing the target minimum flow-by amount for a given pumping rate resulted in some small increases in the magnitudes of the mean and 50th percentile and lower order statistics of computed mean flow-by, but had no effect on the magnitudes of the higher percentile statistics. Increasing the pumping rate for a given target minimum flow-by amount resulted in decreases in magnitudes of higher-percentile flow-by statistics by an amount equal to the flow equivalent of the increase in pumping rate; however, some lower percentile statistics remained unchanged.
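The order statistics reported for the flow-by and withdrawal series (95th through 5th percentiles) are plain percentiles of the daily values. A sketch with a synthetic flow record, using linear interpolation between closest ranks:

```python
# Percentile of a pre-sorted series by linear interpolation between the
# two closest ranks (the common "linear" definition).
def percentile(sorted_values, p):
    idx = (p / 100.0) * (len(sorted_values) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_values) - 1)
    frac = idx - lo
    return sorted_values[lo] * (1 - frac) + sorted_values[hi] * frac

# Stand-in daily record (the report's lake-outflow data are not included):
daily_flows = sorted(float(v) for v in range(1, 101))
levels = [95, 75, 50, 25, 10, 5]
stats = {p: percentile(daily_flows, p) for p in levels}
```

In the report these percentiles are computed for the whole analytical period and again for each calendar month, which is what reveals the June through October and February low-flow seasons.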
Assertiveness and problem solving in midwives
Yurtsal, Zeliha Burcu; Özdemir, Levent
2015-01-01
Background: The midwifery profession is required to bring solutions to problems, and a midwife is expected to be an assertive person who develops midwifery care. This study was planned to examine the relationship between assertiveness and problem-solving skills of midwives. Materials and Methods: This cross-sectional study was conducted with 201 midwives between July 2008 and February 2009 in the city center of Sivas. The Rathus Assertiveness Schedule (RAS) and Problem Solving Inventory (PSI) were used to determine the level of assertiveness and problem-solving skills of midwives. Statistical analyses included the mean, standard deviation, percentage, Student's t-test, ANOVA with Tukey HSD, Kruskal-Wallis, Fisher exact, Pearson correlation, and chi-square tests, with P < 0.05 considered significant. Results: The RAS mean scores and the PSI mean scores showed statistically significant differences in terms of a midwife's considering herself a member of the health team, expressing herself within the health care team, being able to say “no” when necessary, cooperating with her colleagues, and taking part in problem-solving skills training. A statistically significant negative correlation was found between the RAS and PSI scores: the RAS scores decreased while the problem-solving scores increased (r = -0.451, P < 0.01). Conclusions: There were statistically significant differences between assertiveness levels and problem-solving skills of midwives, and midwives who were assertive solved their problems better than others did. Assertiveness and problem-solving skills training will contribute to the success of the midwifery profession. Midwives who are able to solve problems and display assertive behaviors will contribute to the development of the midwifery profession. PMID:26793247
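The Pearson correlation reported between the two scales can be sketched as follows. The scores below are synthetic, chosen only to mimic the direction of the reported negative relationship, not the study's data:

```python
import math

# Pearson product-moment correlation coefficient between two score lists.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ras = [20, 35, 50, 65, 80, 95]    # assertiveness scores (synthetic)
psi = [120, 110, 95, 90, 70, 60]  # problem-solving scores (synthetic)
r = pearson_r(ras, psi)
```

A negative r, as in the study's r = -0.451, simply means that as one scale's score rises the other tends to fall.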
Wellek, Stefan
2017-02-28
In current practice, the most frequently applied approach to the handling of ties in the Mann-Whitney-Wilcoxon (MWW) test is based on the conditional distribution of the sum of mid-ranks, given the observed pattern of ties. Starting from this conditional version of the testing procedure, a sample size formula was derived and investigated by Zhao et al. (Stat Med 2008). In contrast, the approach we pursue here is a nonconditional one exploiting explicit representations for the variances of and the covariance between the two U-statistics estimators involved in the Mann-Whitney form of the test statistic. The accuracy of both ways of approximating the sample sizes required for attaining a prespecified level of power in the MWW test for superiority with arbitrarily tied data is comparatively evaluated by means of simulation. The key qualitative conclusions to be drawn from these numerical comparisons are as follows: With the sample sizes calculated by means of the respective formula, both versions of the test maintain the level and the prespecified power with about the same degree of accuracy. Despite the equivalence in terms of accuracy, the sample size estimates obtained by means of the new formula are in many cases markedly lower than those calculated for the conditional test. Perhaps a still more important advantage of the nonconditional approach based on U-statistics is that it can also be adopted for noninferiority trials. Copyright © 2016 John Wiley & Sons, Ltd.
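The conditional (mid-rank) version of the MWW test referred to above can be sketched as follows: mid-ranks for tied values, the U statistic, and the normal approximation with the standard tie-corrected variance. The data are illustrative:

```python
import math

# Mann-Whitney-Wilcoxon z statistic with mid-ranks for ties and the
# standard tie-corrected variance of U (normal approximation).
def mww_z(x, y):
    combined = sorted(x + y)
    ranks = {}  # mid-rank: average of the ranks spanned by each tied group
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2.0
        i = j
    n1, n2, n = len(x), len(y), len(x) + len(y)
    rank_sum = sum(ranks[v] for v in x)
    u = rank_sum - n1 * (n1 + 1) / 2.0
    mean_u = n1 * n2 / 2.0
    # Tie correction: subtract sum(t^3 - t) over tied groups, scaled.
    tie_term = sum(t ** 3 - t for t in
                   (sum(1 for v in combined if v == key) for key in ranks))
    var_u = n1 * n2 / 12.0 * ((n + 1) - tie_term / (n * (n - 1)))
    return (u - mean_u) / math.sqrt(var_u)

z = mww_z([1, 2, 2, 3, 5, 5, 6], [4, 5, 7, 7, 8, 9, 9])
```

The paper's nonconditional approach replaces this conditional variance with explicit expressions for the variances and covariance of the two underlying U-statistics estimators, which is what makes its sample-size formula transferable to noninferiority settings.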