Practical statistics in pain research.
Kim, Tae Kyun
2017-10-01
Pain is subjective, while statistics related to pain research are objective. This review was written to help researchers involved in pain research make statistical decisions. The main issues concern the levels of measurement of the scales often used in pain research, the choice between parametric and nonparametric statistical methods, and problems arising from repeated measurements. In the field of pain research, parametric statistics have often been applied erroneously. This is closely related to the scales of the data and to repeated measurements. The levels of measurement include nominal, ordinal, interval, and ratio scales, and the level of a scale affects the choice between parametric and nonparametric methods. In pain research, the most frequently used pain assessment scales are ordinal, which would include the visual analogue scale (VAS). An alternative view, however, regards the VAS as an interval or ratio scale, so that the use of parametric statistics is accepted in practice in some cases. Repeated measurements of the same subjects always complicate statistics: the measurements are inevitably correlated with one another, which precludes the application of one-way ANOVA, in which independence between the measurements is required. Repeated measures ANOVA (RM ANOVA), however, permits comparison of correlated measurements as long as the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when the assumptions of parametric statistics, such as normality and sphericity, are established.
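As an illustration of the decision rule this review advocates (check the parametric assumptions first, then choose the test), the following minimal Python sketch is added here; it is not from the paper, and the VAS data are invented.

```python
# Minimal sketch: test normality, then choose between a parametric and a
# nonparametric two-group comparison. The VAS scores are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
vas_a = rng.integers(0, 11, 30).astype(float)   # group A VAS scores (0-10)
vas_b = rng.integers(2, 11, 30).astype(float)   # group B VAS scores (0-10)

# Shapiro-Wilk normality check on each group
normal = all(stats.shapiro(g).pvalue > 0.05 for g in (vas_a, vas_b))

if normal:
    stat, p = stats.ttest_ind(vas_a, vas_b)      # parametric comparison
else:
    stat, p = stats.mannwhitneyu(vas_a, vas_b)   # nonparametric, ordinal-safe
print("t-test" if normal else "Mann-Whitney U", "p =", round(p, 4))
```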
Ries, Kernell G.
1999-01-01
A network of 148 low-flow partial-record stations was operated on streams in Massachusetts during the summers of 1989 through 1996. Streamflow measurements (including historical measurements), measured basin characteristics, and estimated streamflow statistics are provided in the report for each low-flow partial-record station. Also included for each station are location information, streamflow-gaging stations for which flows were correlated to those at the low-flow partial-record station, years of operation, and remarks indicating human influences on streamflows at the station. Three or four streamflow measurements were made each year for three years during times of low flow to obtain nine or ten measurements for each station. Measured flows at the low-flow partial-record stations were correlated with same-day mean flows at a nearby gaging station to estimate streamflow statistics for the low-flow partial-record stations. The estimated streamflow statistics include the 99-, 98-, 97-, 95-, 93-, 90-, 85-, 80-, 75-, 70-, 65-, 60-, 55-, and 50-percent duration flows; the 7-day, 10-year and 7-day, 2-year low flows; and the August median flow. Characteristics of the drainage basins for the stations that theoretically relate to the response of the station to climatic variations were measured from digital map data by use of an automated geographic information system procedure. Basin characteristics measured include drainage area; total stream length; mean basin slope; area of surficial stratified drift; area of wetlands; area of water bodies; and mean, maximum, and minimum basin elevation. Station descriptions and calculated streamflow statistics are also included in the report for the 50 continuous gaging stations used in correlations with the low-flow partial-record stations.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
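A small illustrative sketch of the core diagnostic-test metrics the review covers (sensitivity, specificity, accuracy, ROC AUC); the labels and scores below are fabricated, not drawn from the article.

```python
# Compute sensitivity, specificity, accuracy, and ROC AUC for a toy test.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

truth = np.array([1, 1, 0, 0, 1, 0, 1, 0])                   # disease status
score = np.array([0.9, 0.7, 0.4, 0.2, 0.6, 0.5, 0.8, 0.1])   # test output
pred = (score >= 0.5).astype(int)                            # dichotomized

tn, fp, fn, tp = confusion_matrix(truth, pred).ravel()
sensitivity = tp / (tp + fn)      # true positive rate
specificity = tn / (tn + fp)      # true negative rate
accuracy = (tp + tn) / truth.size
auc = roc_auc_score(truth, score)
print(sensitivity, specificity, accuracy, auc)
```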
Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization
NASA Technical Reports Server (NTRS)
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
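For a rough sense of how such an interval is assembled, a generic root-sum-square combination of bias and precision terms is sketched below; the term values are invented placeholders, not the ZEM-3 figures from the report.

```python
# Combine independent bias terms and a statistical precision term by
# root-sum-square into one expanded uncertainty (values are placeholders).
import numpy as np

bias_terms = np.array([0.8, 0.5, 0.3])  # % : e.g., geometry, probe placement
precision = 1.2                         # % : statistical (repeatability)
total = np.sqrt(np.sum(bias_terms**2) + precision**2)
print(f"combined uncertainty ~ +/- {total:.2f} %")
```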
ERIC Educational Resources Information Center
Nelson, Frank, Comp.
This report is a compilation of input and output measures and other statistics in reference to Idaho's public libraries, covering the period from October 1997 through September 1998. The introductory sections include notes on the statistics, definitions of performance measures, Idaho public library rankings for fiscal year 1996, and a state map…
Assaad, Houssein I; Choudhary, Pankaj K
2013-01-01
L-statistics form an important class of estimators in nonparametric statistics; members of this class include trimmed means, sample quantiles, and functions thereof. This article is devoted to theory and applications of L-statistics for repeated measurements data, wherein the measurements on the same subject are dependent and the measurements from different subjects are independent. This article has three main goals: (a) show that L-statistics are asymptotically normal for repeated measurements data; (b) present three statistical applications of this result, namely, location estimation using trimmed means, quantile estimation, and construction of tolerance intervals; (c) obtain a Bahadur representation for sample quantiles. These results are generalizations of similar results for independently and identically distributed data. The practical usefulness of these results is illustrated by analyzing a real data set involving measurement of systolic blood pressure. The properties of the proposed point and interval estimators are examined via simulation.
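Two of the L-statistics named here are one-liners in practice; the sketch below (with made-up numbers, not the article's blood pressure data) shows a trimmed mean and a sample quantile.

```python
# A trimmed mean and a sample quantile, two classical L-statistics.
import numpy as np
from scipy import stats

x = np.array([94.0, 98.0, 101.0, 103.0, 105.0, 110.0, 142.0])  # toy SBP data
print(stats.trim_mean(x, 0.1))  # mean after trimming 10% from each tail
print(np.quantile(x, 0.5))      # the median, itself an L-statistic
```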
Statistical speed of quantum states: Generalized quantum Fisher information and Schatten speed
NASA Astrophysics Data System (ADS)
Gessner, Manuel; Smerzi, Augusto
2018-02-01
We analyze families of measures for the quantum statistical speed which include as special cases the quantum Fisher information, the trace speed, i.e., the quantum statistical speed obtained from the trace distance, and more general quantifiers obtained from the family of Schatten norms. These measures quantify the statistical speed under generic quantum evolutions and are obtained by maximizing classical measures over all possible quantum measurements. We discuss general properties, optimal measurements, and upper bounds on the speed of separable states. We further provide a physical interpretation for the trace speed by linking it to an analog of the quantum Cramér-Rao bound for median-unbiased quantum phase estimation.
NASA Technical Reports Server (NTRS)
Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.
1983-01-01
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples, including complete listings of programs and input and output data.
Research Education in Undergraduate Occupational Therapy Programs.
ERIC Educational Resources Information Center
Petersen, Paul; And Others
1992-01-01
Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)
Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis
Ré, Miguel A.; Azad, Rajeev K.
2014-01-01
Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise because of a number of attributes including generalization to any number of probability distributions and association of weights to the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as, non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including that of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.
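The classical (two-distribution, Shannon-entropy) form of the JSD discussed above is easy to compute directly; the sketch below uses invented toy sequences and SciPy, and does not implement the paper's Tsallis-Markovian generalization.

```python
# Classical JSD between two nucleotide composition vectors. SciPy's
# jensenshannon returns the square root of the divergence, so square it.
from collections import Counter
import numpy as np
from scipy.spatial.distance import jensenshannon

def symbol_freqs(seq, alphabet="ACGT"):
    counts = Counter(seq)
    total = sum(counts[a] for a in alphabet)
    return np.array([counts[a] / total for a in alphabet])

p = symbol_freqs("ACGTACGTAACCGGTT")     # toy sequence 1
q = symbol_freqs("AAAACCGTACGTGGGG")     # toy sequence 2
print(jensenshannon(p, q, base=2) ** 2)  # divergence, in bits
```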
A Statistical Method for Syntactic Dialectometry
ERIC Educational Resources Information Center
Sanders, Nathan C.
2010-01-01
This dissertation establishes the utility and reliability of a statistical distance measure for syntactic dialectometry, expanding dialectometry's methods to include syntax as well as phonology and the lexicon. It establishes the measure's reliability by comparing its results to those of dialectology and phonological dialectometry on Swedish…
EHME: a new word database for research in Basque language.
Acha, Joana; Laka, Itziar; Landa, Josu; Salaburu, Pello
2014-11-14
This article presents EHME, the frequency dictionary of Basque structure, an online program that enables researchers in psycholinguistics to extract word and nonword stimuli based on a broad range of statistics concerning the properties of Basque words. The database consists of 22.7 million tokens, and the properties available include morphological structure frequency and word-similarity measures, apart from classical indexes: word frequency, orthographic structure, orthographic similarity, bigram and biphone frequency, and syllable-based measures. Measures are indexed at the lemma, morpheme, and word level. We include reliability and validation analyses. The application is freely available, and it enables the user to extract words based on concrete statistical criteria, as well as to obtain statistical characteristics from a list of words.
Fotina, I; Lütgendorf-Caucig, C; Stock, M; Pötter, R; Georg, D
2012-02-01
Inter-observer studies represent a valid method for the evaluation of target definition uncertainties and contouring guidelines. However, data from the literature do not yet give clear guidelines for reporting contouring variability. Thus, the purpose of this work was to compare and discuss various methods to determine variability on the basis of clinical cases and a literature review. In this study, 7 prostate and 8 lung cases were contoured on CT images by 8 experienced observers. Analysis of variability included descriptive statistics, calculation of overlap measures, and statistical measures of agreement. Cross tables with ratios and correlations were established for overlap parameters. It was shown that the minimal set of parameters to be reported should include at least one of three volume overlap measures (i.e., generalized conformity index, Jaccard coefficient, or conformation number). High correlation between these parameters and scatter of the results was observed. A combination of descriptive statistics, overlap measure, and statistical measure of agreement or reliability analysis is required to fully report the interrater variability in delineation.
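Of the overlap measures this study recommends reporting, the Jaccard coefficient is the simplest to state in code; the masks below are toy arrays, not the study's prostate or lung contours.

```python
# Jaccard coefficient (intersection over union) of two binary contour masks.
import numpy as np

a = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]], dtype=bool)  # observer 1
b = np.array([[0, 1, 1], [1, 1, 0], [0, 0, 0]], dtype=bool)  # observer 2
jaccard = np.logical_and(a, b).sum() / np.logical_or(a, b).sum()
print(jaccard)
```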
Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review
Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie
2015-01-01
Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.
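One of the regression approaches the review tallies, a negative binomial model of outlet counts against an area deprivation score, can be sketched as follows; the data are fabricated and the variable names are hypothetical. As the review's conclusion notes, residuals from such a model would still need checking for spatial autocorrelation.

```python
# Negative binomial GLM: neighbourhood outlet counts vs. deprivation score.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
deprivation = rng.normal(size=200)                      # area SES score
outlets = rng.poisson(np.exp(1.0 + 0.3 * deprivation))  # synthetic counts

X = sm.add_constant(deprivation)
fit = sm.GLM(outlets, X, family=sm.families.NegativeBinomial()).fit()
print(fit.params)  # intercept and deprivation effect (log scale)
```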
ERIC Educational Resources Information Center
Nguyen, Thuyuyen H.; Newby, Michael; Skordi, Panayiotis G.
2015-01-01
Statistics is a required subject of study in many academic disciplines, including business, education and psychology, and one that causes problems for many students. This has long been recognised and there have been a number of studies into students' attitudes towards statistics, particularly statistical anxiety. However, none of these studies…
The Effects of Measurement Error on Statistical Models for Analyzing Change. Final Report.
ERIC Educational Resources Information Center
Dunivant, Noel
The results of six major projects are discussed, including a comprehensive mathematical and statistical analysis of the problems caused by errors of measurement in linear models for assessing change. In a general matrix representation of the problem, several new analytic results are proved concerning the parameters which affect bias in…
75 FR 72611 - Assessments, Large Bank Pricing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-24
... the worst risk ranking and are included in the statistical analysis. Appendix 1 to the NPR describes the statistical analysis in detail. The percentage approximated by factors is based on the statistical model for that particular year. Actual weights assigned to each scorecard measure are largely based...
Conditional Probabilities and Collapse in Quantum Measurements
NASA Astrophysics Data System (ADS)
Laura, Roberto; Vanni, Leonardo
2008-09-01
We show that, by including both the system and the apparatus in the quantum description of the measurement process and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the postulate of collapse.
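For reference, the textbook collapse postulate that the derived operator is compared against is the Lüders rule; stated here for orientation (this is the standard form, not the paper's conditional-probability construction), a projective measurement with result k and projector P_k yields

```latex
\rho' = \frac{P_k \,\rho\, P_k}{\operatorname{Tr}\left(P_k \,\rho\right)}
```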
76 FR 34385 - Program Integrity: Gainful Employment-Debt Measures
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-13
... postsecondary education at a public institution. National Center for Education Statistics, 2004/2009 Beginning... reliable earnings information, including use of State data, survey data, or Bureau of Labor Statistics (BLS...
The SPARC Intercomparison of Middle Atmosphere Climatologies
NASA Technical Reports Server (NTRS)
Randel, William; Fleming, Eric; Geller, Marvin; Gelman, Mel; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra
2003-01-01
Our current confidence in 'observed' climatological winds and temperatures in the middle atmosphere (over altitudes approx. 10-80 km) is assessed by detailed intercomparisons of contemporary and historic data sets. These data sets include global meteorological analyses and assimilations, climatologies derived from research satellite measurements, and historical reference atmosphere circulation statistics. We also include comparisons with historical rocketsonde wind and temperature data, and with more recent lidar temperature measurements. The comparisons focus on a few basic circulation statistics, such as temperature, zonal wind, and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. Assimilated data sets provide the most realistic tropical variability, but substantial differences exist among current schemes.
Kansas's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; W. Keith Moser; Charles J. Barnett
2011-01-01
The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the Kansas inventory is presented...
34 CFR Appendix A to Subpart N of... - Sample Default Prevention Plan
Code of Federal Regulations, 2010 CFR
2010-07-01
... relevant default prevention statistics, including a statistical analysis of the borrowers who default on...'s delinquency status by obtaining reports from data managers and FFEL Program lenders. 5. Enhance... academic study. III. Statistics for Measuring Progress 1. The number of students enrolled at your...
Cosmic shear measurements with Dark Energy Survey Science Verification data
Becker, M. R.
2016-07-06
Here, we present measurements of weak gravitational lensing cosmic shear two-point statistics using Dark Energy Survey Science Verification data. We demonstrate that our results are robust to the choice of shear measurement pipeline, either ngmix or im3shape, and robust to the choice of two-point statistic, including both real and Fourier-space statistics. Our results pass a suite of null tests including tests for B-mode contamination and direct tests for any dependence of the two-point functions on a set of 16 observing conditions and galaxy properties, such as seeing, airmass, galaxy color, galaxy magnitude, etc. We use a large suite of simulations to compute the covariance matrix of the cosmic shear measurements and assign statistical significance to our null tests. We find that our covariance matrix is consistent with the halo model prediction, indicating that it has the appropriate level of halo sample variance. We also compare the same jackknife procedure applied to the data and the simulations in order to search for additional sources of noise not captured by the simulations. We find no statistically significant extra sources of noise in the data. The overall detection significance with tomography for our highest source density catalog is 9.7σ. Cosmological constraints from the measurements in this work are presented in a companion paper.
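The jackknife step mentioned here is generic enough to sketch; the following delete-one jackknife covariance over sky patches uses synthetic numbers and is not the DES pipeline.

```python
# Delete-one jackknife covariance of a binned two-point statistic.
import numpy as np

rng = np.random.default_rng(2)
xi = rng.normal(size=(50, 10))  # 50 sky patches x 10 angular bins (synthetic)

n = xi.shape[0]
jk = np.array([xi[np.arange(n) != i].mean(axis=0) for i in range(n)])
dev = jk - jk.mean(axis=0)
cov = (n - 1) / n * dev.T @ dev  # standard jackknife covariance estimate
print(cov.shape)                 # (10, 10)
```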
NASA Astrophysics Data System (ADS)
Azila Che Musa, Nor; Mahmud, Zamalia; Baharun, Norhayati
2017-09-01
One of the important skills required of any student learning statistics is knowing how to solve statistical problems correctly using appropriate statistical methods. This will enable them to arrive at a conclusion and make a significant contribution and decision for society. In this study, a group of 22 students majoring in statistics at UiTM Shah Alam were given problems relating to topics on testing of hypotheses which required them to solve the problems using the confidence interval, traditional, and p-value approaches. Hypothesis testing is one of the techniques used in solving real problems and it is listed as one of the difficult concepts for students to grasp. The objectives of this study are to explore students' perceived and actual ability in solving statistical problems and to determine which items in statistical problem solving students find difficult to grasp. Students' perceived and actual ability were measured based on instruments developed from the respective topics. Rasch measurement tools such as the Wright map and item measures for fit statistics were used to accomplish the objectives. Data were collected and analysed using the Winsteps 3.90 software, which is developed based on the Rasch measurement model. The results showed that students perceived themselves as moderately competent in solving the statistical problems using the confidence interval and p-value approaches even though their actual performance showed otherwise. Item measures for fit statistics also showed that the maximum estimated measures were found on two problems. These measures indicate that none of the students attempted these problems correctly, for reasons which include their lack of understanding of confidence intervals and probability values.
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that evaluate data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
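A minimal sketch of the segmented regression design discussed here, with level-change and trend-change terms around an intervention point; all numbers and names are invented.

```python
# Interrupted time series via segmented regression: y ~ baseline trend +
# level change + slope change after the intervention month t0.
import numpy as np
import statsmodels.api as sm

t = np.arange(24, dtype=float)   # months of observation
t0 = 12                          # intervention month
post = (t >= t0).astype(float)

rng = np.random.default_rng(3)
y = 10 + 0.1 * t - 2.0 * post - 0.2 * post * (t - t0) \
    + rng.normal(0, 0.5, t.size)

X = sm.add_constant(np.column_stack([t, post, post * (t - t0)]))
fit = sm.OLS(y, X).fit()
print(fit.params)  # intercept, baseline trend, level change, trend change
```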
North Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; David E. Haugen; Charles J. Barnett
2011-01-01
The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...
South Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Ronald J. Piva; Charles J. Barnett
2011-01-01
The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...
A statistical approach to instrument calibration
Robert R. Ziemer; David Strauss
1978-01-01
Summary - It has been found that two instruments will yield different numerical values when used to measure identical points. A statistical approach is presented that can be used to approximate the error associated with the calibration of instruments. Included are standard statistical tests that can be used to determine if a number of successive calibrations of the...
NASA Technical Reports Server (NTRS)
Matolak, D. W.; Apaza, Rafael; Foore, Lawrence R.
2006-01-01
We describe a recently completed wideband wireless channel characterization project for the 5 GHz Microwave Landing System (MLS) extension band, for airport surface areas. This work included mobile measurements at large and small airports, and fixed point-to-point measurements. Mobile measurements were made via transmission from the air traffic control tower (ATCT), or from an airport field site (AFS), to a receiving ground vehicle on the airport surface. The point-to-point measurements were between ATCT and AFSs. Detailed statistical channel models were developed from all these measurements. Measured quantities include propagation path loss and power delay profiles, from which we obtain delay spreads, frequency domain correlation (coherence bandwidths), fading amplitude statistics, and channel parameter correlations. In this paper we review the project motivation, measurement coordination, and illustrate measurement results. Example channel modeling results for several propagation conditions are also provided, highlighting new findings.
Regression Models for Identifying Noise Sources in Magnetic Resonance Images
Zhu, Hongtu; Li, Yimei; Ibrahim, Joseph G.; Shi, Xiaoyan; An, Hongyu; Chen, Yashen; Gao, Wei; Lin, Weili; Rowe, Daniel B.; Peterson, Bradley S.
2009-01-01
Stochastic noise, susceptibility artifacts, magnetic field and radiofrequency inhomogeneities, and other noise components in magnetic resonance images (MRIs) can introduce serious bias into any measurements made with those images. We formally introduce three regression models including a Rician regression model and two associated normal models to characterize stochastic noise in various magnetic resonance imaging modalities, including diffusion-weighted imaging (DWI) and functional MRI (fMRI). Estimation algorithms are introduced to maximize the likelihood function of the three regression models. We also develop a diagnostic procedure for systematically exploring MR images to identify noise components other than simple stochastic noise, and to detect discrepancies between the fitted regression models and MRI data. The diagnostic procedure includes goodness-of-fit statistics, measures of influence, and tools for graphical display. The goodness-of-fit statistics can assess the key assumptions of the three regression models, whereas measures of influence can isolate outliers caused by certain noise components, including motion artifacts. The tools for graphical display permit graphical visualization of the values for the goodness-of-fit statistic and influence measures. Finally, we conduct simulation studies to evaluate performance of these methods, and we analyze a real dataset to illustrate how our diagnostic procedure localizes subtle image artifacts by detecting intravoxel variability that is not captured by the regression models.
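As a pointer to what a Rician model means in practice, here is a sketch of the Rician log-likelihood of magnitude data using SciPy's rice distribution; it illustrates the noise model only and is not the authors' estimation algorithm.

```python
# Log-likelihood of MR magnitude data under Rician noise.
# SciPy parametrizes Rice by b = nu / sigma with scale = sigma.
import numpy as np
from scipy import stats

def rician_loglik(nu, sigma, observed):
    return stats.rice.logpdf(observed, b=nu / sigma, scale=sigma).sum()

rng = np.random.default_rng(4)
obs = stats.rice.rvs(b=3.0, scale=1.0, size=500, random_state=rng)
print(rician_loglik(3.0, 1.0, obs))  # evaluated at the true parameters
```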
The Journal Usage Statistics Portal (JUSP): Helping Libraries Measure Use and Impact
ERIC Educational Resources Information Center
Mihlrad, Leigh
2012-01-01
The Journal Usage Statistics Portal (JUSP) (jusp.mimas.ac.uk), created by five U.K. libraries in 2009, gives participating libraries a single point of access for electronic journal statistics. It provides its more than 160 participants, including 140+ academic libraries in the United Kingdom, as well as 21 publishers and 3 intermediaries, with…
Selected papers in the hydrologic sciences, 1986
Subitzky, Seymour
1987-01-01
Water-quality data from long-term (24 years), fixed- station monitoring at the Cape Fear River at Lock 1 near Kelly, N.C., and various measures of basin development are correlated. Subbasin population, number of acres of cropland in the subbasin, number of people employed in manufacturing, and tons of fertilizer applied in the basin are considered as measures of basinwide development activity. Linear correlations show statistically significant positive relations between both population and manufacturing activity and most of the dissolved constituents considered. Negative correlations were found between the acres of harvested cropland and most of the water-quality measures. The amount of fertilizer sold in the subbasin was not statistically related to the water-quality measures considered in this report. The statistical analysis was limited to several commonly used measures of water quality including specific conductance, pH, dissolved solids, several major dissolved ions, and a few nutrients. The major dissolved ions included in the analysis were calcium, sodium, potassium, magnesium, chloride, sulfate, silica, bicarbonate, and fluoride. The nutrients included were dissolved nitrite plus nitrate nitrogen, dissolved ammonia nitrogen, total nitrogen, dissolved phosphates, and total phosphorus. For the chemicals evaluated, manufacturing and population sources are more closely associated with water quality in the Cape Fear River at Lock 1 than are agricultural variables.
Primer of statistics in dental research: part I.
Shintani, Ayumi
2014-01-01
Statistics play essential roles in evidence-based dentistry (EBD) practice and research, ranging widely from formulating scientific questions, designing studies, and collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal among many dental researchers, in part due to the limitations of statistical authorities in explaining statistical principles to health researchers without elaborating complex mathematical concepts. This series of 2 articles aims to introduce dental researchers to 9 essential topics in statistics for conducting EBD, with intuitive examples. Part I of the series covers the first 5 topics: (1) statistical graphs, (2) how to deal with outliers, (3) p-values and confidence intervals, (4) testing equivalence, and (5) multiplicity adjustment. Part II will follow, covering the remaining topics: (6) selecting the proper statistical tests, (7) repeated measures analysis, (8) epidemiological considerations for causal association, and (9) analysis of agreement.
Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria
2009-09-01
Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
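Since the article's point is that the level of measurement dictates the summary, a tiny sketch of level-appropriate description may help; the toy data and column names are invented.

```python
# Describe a nominal variable with frequencies and a metric one numerically.
import pandas as pd

df = pd.DataFrame({
    "smoker": ["yes", "no", "no", "yes", "no"],  # nominal (categorical)
    "sbp": [118, 142, 131, 155, 126],            # metric, mmHg
})
print(df["smoker"].value_counts())               # counts per category
print(df["sbp"].agg(["mean", "std", "median"]))  # numeric summaries
print(df["sbp"].quantile([0.25, 0.75]))          # interquartile range bounds
```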
Measured, modeled, and causal conceptions of fitness
Abrams, Marshall
2012-01-01
This paper proposes partial answers to the following questions: in what senses can fitness differences plausibly be considered causes of evolution? What relationships are there between fitness concepts used in empirical research, modeling, and abstract theoretical proposals? How does the relevance of different fitness concepts depend on research questions and methodological constraints? The paper develops a novel taxonomy of fitness concepts, beginning with type fitness (a property of a genotype or phenotype), token fitness (a property of a particular individual), and purely mathematical fitness. Type fitness includes statistical type fitness, which can be measured from population data, and parametric type fitness, which is an underlying property estimated by statistical type fitnesses. Token fitness includes measurable token fitness, which can be measured on an individual, and tendential token fitness, which is assumed to be an underlying property of the individual in its environmental circumstances. Some of the paper's conclusions can be outlined as follows: claims that fitness differences do not cause evolution are reasonable when fitness is treated as statistical type fitness, measurable token fitness, or purely mathematical fitness. Some of the ways in which statistical methods are used in population genetics suggest that what natural selection involves are differences in parametric type fitnesses. Further, it's reasonable to think that differences in parametric type fitness can cause evolution. Tendential token fitnesses, however, are not themselves sufficient for natural selection. Though parametric type fitnesses are typically not directly measurable, they can be modeled with purely mathematical fitnesses and estimated by statistical type fitnesses, which in turn are defined in terms of measurable token fitnesses. The paper clarifies the ways in which fitnesses depend on pragmatic choices made by researchers.
A Statistical Analysis of Brain Morphology Using Wild Bootstrapping
Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.
2008-01-01
Methods for the analysis of brain morphology, including voxel-based morphometry and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects.
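The wild bootstrap itself is compact enough to sketch for an ordinary linear model; the heteroscedastic data below are simulated, and this is not the authors' surface-based implementation.

```python
# Wild bootstrap of a regression slope using Rademacher multipliers,
# which preserves heteroscedasticity in the resampled residuals.
import numpy as np

rng = np.random.default_rng(5)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n) * (1 + np.abs(x))  # unequal variance

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

slopes = []
for _ in range(1000):
    w = rng.choice([-1.0, 1.0], size=n)  # Rademacher weights
    y_star = X @ beta + resid * w
    slopes.append(np.linalg.lstsq(X, y_star, rcond=None)[0][1])
print(np.std(slopes))  # wild-bootstrap standard error of the slope
```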
Metz, Anneke M
2008-01-01
There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9%, improvement p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.
Statistical learning and auditory processing in children with music training: An ERP study.
Mandikal Vasuki, Pragati Rao; Sharma, Mridula; Ibrahim, Ronny; Arciuli, Joanne
2017-07-01
The question of whether musical training is associated with enhanced auditory and cognitive abilities in children is of considerable interest. In the present study, we compared children with music training versus those without music training across a range of auditory and cognitive measures, including the ability to implicitly detect statistical regularities in input (statistical learning). Statistical learning of regularities embedded in auditory and visual stimuli was measured in musically trained and age-matched untrained children between the ages of 9 and 11 years. In addition to collecting behavioural measures, we recorded electrophysiological measures to obtain an online measure of segmentation during the statistical learning tasks. Musically trained children showed better performance on melody discrimination, rhythm discrimination, frequency discrimination, and auditory statistical learning. Furthermore, grand-averaged ERPs showed that triplet onset (initial stimulus) elicited larger responses in the musically trained children during both auditory and visual statistical learning tasks. In addition, children's music skills were associated with performance on auditory and visual behavioural statistical learning tasks. Our data suggest that individual differences in musical skills are associated with children's ability to detect regularities. The ERP data suggest that musical training is associated with better encoding of both auditory and visual stimuli. Although causality must be explored in further research, these results may have implications for developing music-based remediation strategies for children with learning impairments.
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
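A bare-bones sketch of the Shewhart individuals-chart logic behind such monitoring follows; the check-standard values are invented, and classical practice would estimate sigma from moving ranges rather than the sample standard deviation used here.

```python
# Shewhart-style control limits: flag check-standard results outside
# the centerline +/- 3 sigma.
import numpy as np

checks = np.array([0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.47, 0.62])
center = checks.mean()
sigma = checks.std(ddof=1)  # simple estimate (see note above)
ucl, lcl = center + 3 * sigma, center - 3 * sigma
flagged = np.where((checks > ucl) | (checks < lcl))[0]
print(center, lcl, ucl, flagged)
```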
Simple Statistics: - Summarized!
ERIC Educational Resources Information Center
Blai, Boris, Jr.
Statistics are an essential tool for making sound decisions. The field is concerned with probability distribution models, the testing of hypotheses, significance tests, and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median, and mode. A second…
76 FR 55392 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-07
... which measure factors associated with birth and pregnancy rates, including contraception, infertility... Center for Health Statistics (NCHS), Centers for Disease Control and Prevention (CDC). Background and... statistics on ``family formation, growth, and dissolution,'' as well as ``determinants of health'' and...
Using Carbon Emissions Data to "Heat Up" Descriptive Statistics
ERIC Educational Resources Information Center
Brooks, Robert
2012-01-01
This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data has desirable characteristics including: choice of measure; skewness; and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)
PHYSICS OF NON-GAUSSIAN FIELDS AND THE COSMOLOGICAL GENUS STATISTIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, J. Berian, E-mail: berian@berkeley.edu
2012-05-20
We report a technique to calculate the impact of distinct physical processes inducing non-Gaussianity on the cosmological density field. A natural decomposition of the cosmic genus statistic into an orthogonal polynomial sequence allows complete expression of the scale-dependent evolution of the topology of large-scale structure, in which effects including galaxy bias, nonlinear gravitational evolution, and primordial non-Gaussianity may be delineated. The relationship of this decomposition to previous methods for analyzing the genus statistic is briefly considered and the following applications are made: (1) the expression of certain systematics affecting topological measurements, (2) the quantification of broad deformations from Gaussianity that appear in the genus statistic as measured in the Horizon Run simulation, and (3) the study of the evolution of the genus curve for simulations with primordial non-Gaussianity. These advances improve the treatment of flux-limited galaxy catalogs for use with this measurement and further the use of the genus statistic as a tool for exploring non-Gaussianity.
Methodological quality of behavioural weight loss studies: a systematic review
Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.
2018-01-01
Summary: This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January 2009 and December 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the utilization of more rigorous statistical approaches to loss to follow-up and better fidelity reporting.
A Simple Statistical Thermodynamics Experiment
ERIC Educational Resources Information Center
LoPresto, Michael C.
2010-01-01
Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
ASURV: Astronomical SURVival Statistics
NASA Astrophysics Data System (ADS)
Feigelson, E. D.; Nelson, P. I.; Isobe, T.; LaValley, M.
2014-06-01
ASURV (Astronomical SURVival Statistics) provides astronomy survival analysis for right- and left-censored data including the maximum-likelihood Kaplan-Meier estimator and several univariate two-sample tests, bivariate correlation measures, and linear regressions. ASURV is written in FORTRAN 77, and is stand-alone and does not call any specialized libraries.
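For orientation, the Kaplan-Meier estimator that ASURV implements for censored data can be reproduced in modern Python with the lifelines package (a different tool from ASURV's FORTRAN 77 code; the durations below are invented):

```python
# Kaplan-Meier survival estimate with right-censoring flags.
from lifelines import KaplanMeierFitter

durations = [5, 6, 6, 2, 4, 4, 7]  # toy measurements
observed = [1, 0, 1, 1, 1, 0, 1]   # 0 marks a censored value
kmf = KaplanMeierFitter().fit(durations, event_observed=observed)
print(kmf.survival_function_)
```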
Australia 31-GHz brightness temperature exceedance statistics
NASA Technical Reports Server (NTRS)
Gary, B. L.
1988-01-01
Water vapor radiometer measurements were made at DSS 43 during an 18-month period. Brightness temperatures at 31 GHz were subjected to a statistical analysis which included correction for the effects of occasional water on the radiometer radome. An exceedance plot was constructed; the 1 percent exceedance statistic occurs at 120 K, and the 5 percent exceedance statistic occurs at 70 K, compared with 75 K in Spain. These values are valid for all of the three-month groupings that were studied.
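In code, reading an exceedance level off a sample reduces to a percentile; the brightness temperatures below are synthetic stand-ins, not the DSS 43 data.

```python
# The 1% exceedance level is the 99th percentile of the sample.
import numpy as np

rng = np.random.default_rng(6)
tb = rng.gamma(shape=4.0, scale=12.0, size=10_000)  # fake 31-GHz Tb values, K
print(np.percentile(tb, 99))  # exceeded 1 percent of the time
print(np.percentile(tb, 95))  # exceeded 5 percent of the time
```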
Characterizations of linear sufficient statistics
NASA Technical Reports Server (NTRS)
Peters, B. C., Jr.; Reoner, R.; Decell, H. P., Jr.
1977-01-01
A surjective bounded linear operator T from a Banach space X to a Banach space Y must be a sufficient statistic for a dominated family of probability measures defined on the Borel sets of X. These results were applied to characterize linear sufficient statistics for families of the exponential type, including as special cases the Wishart and multivariate normal distributions. The latter result was used to establish precisely which procedures for sampling from a normal population had the property that the sample mean was a sufficient statistic.
Statistical Learning is Related to Early Literacy-Related Skills
Spencer, Mercedes; Kaschak, Michael P.; Jones, John L.; Lonigan, Christopher J.
2015-01-01
It has been demonstrated that statistical learning, or the ability to use statistical information to learn the structure of one's environment, plays a role in young children's acquisition of linguistic knowledge. Although most research on statistical learning has focused on language acquisition processes, such as the segmentation of words from fluent speech and the learning of syntactic structure, some recent studies have explored the extent to which individual differences in statistical learning are related to literacy-relevant knowledge and skills. The present study extends this literature by investigating the relations between two measures of statistical learning and multiple measures of skills that are critical to the development of literacy—oral language, vocabulary knowledge, and phonological processing—within a single model. Our sample included a total of 553 typically developing children from prekindergarten through second grade. Structural equation modeling revealed that statistical learning accounted for a unique portion of the variance in these literacy-related skills. Practical implications for instruction and assessment are discussed.
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in the presentation of AGR-1 measured data (Chapter 2) and the interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
ERIC Educational Resources Information Center
Andrejack, Kate, Comp.; Judge, Amy, Comp.; Simons, Janet, Comp.
This data book provides statistics on a range of indicators that measure critical aspects of children's lives in each of the 50 states and the District of Columbia. Statistics are provided in the following categories: (1) national rankings in population and family characteristics; (2) health and disabilities (including children lacking health…
ERIC Educational Resources Information Center
Judge, Amy, Comp.
This data book provides statistics on a range of indicators that measure critical aspects of children's lives in each of the 50 states and the District of Columbia. Statistics are provided in the following categories: (1) child health, including uninsured children, low birth weight babies, infant deaths, and immunizations; (2) child care and early…
Examples of Data Analysis with SPSS-X.
ERIC Educational Resources Information Center
MacFarland, Thomas W.
Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics using SPSS-X Release 3.0 for VAX/UNIX. Statistical measures covered include Chi-square analysis; Spearman's rank correlation coefficient; Student's t-test with two independent samples; Student's t-test with a paired sample; One-way analysis…
Noise characteristics of the Skylab S-193 altimeter altitude measurements
NASA Technical Reports Server (NTRS)
Hatch, W. E.
1975-01-01
The statistical characteristics of the SKYLAB S-193 altimeter altitude noise are considered. These results are reported in a concise format for use and analysis by the scientific community. In most instances the results have been grouped according to satellite pointing so that the effects of pointing on the statistical characteristics can be readily seen. The altimeter measurements and the processing techniques are described. The mathematical descriptions of the computer programs used for these results are included.
Measurement of atmospheric surface layer turbulence using unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Bailey, Sean; Canter, Caleb
2017-11-01
We describe measurements of the turbulence within the atmospheric surface layer using highly instrumented and autonomous unmanned aerial vehicles (UAVs). Results from the CLOUDMAP measurement campaign in Stillwater, Oklahoma are presented, including turbulence statistics measured during the transition from stably stratified to convective conditions. The measurements were made using pre-fabricated fixed-wing remote-control aircraft adapted to fly autonomously and carry multi-hole pressure probes as well as pressure, temperature and humidity sensors. Two aircraft were flown simultaneously, with one flying a flight path intended to profile the boundary layer up to 100 m and the other flying at a constant fixed altitude of 50 m. The evolution of various turbulence statistics was determined from these flights, including Reynolds stresses, correlations, spectra and structure functions. These results were compared to those measured by a sonic anemometer located on a 7.5 m tower. This work was supported by the National Science Foundation through Grant #CBET-1351411 and by National Science Foundation award #1539070, Collaboration Leading Operational UAS Development for Meteorology and Atmospheric Physics (CLOUDMAP).
Measurement of atmospheric surface layer turbulence using unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Witte, Brandon; Smith, Lorli; Schlagenhauf, Cornelia; Bailey, Sean
2016-11-01
We describe measurements of the turbulence within the atmospheric surface layer using highly instrumented and autonomous unmanned aerial vehicles (UAVs). Results from the CLOUDMAP measurement campaign in Stillwater, Oklahoma are presented, including turbulence statistics measured during the transition from stably stratified to convective conditions. The measurements were made using pre-fabricated fixed-wing remote-control aircraft adapted to fly autonomously and carry multi-hole pressure probes as well as pressure, temperature and humidity sensors. Two aircraft were flown simultaneously, with one flying a flight path intended to profile the boundary layer up to 100 m and the other flying at a constant fixed altitude of 50 m. The evolution of various turbulence statistics was determined from these flights, including Reynolds stresses, correlations, spectra and structure functions. These results were compared to those measured by a sonic anemometer located on a 7.5 m tower. This work was supported by the National Science Foundation through Grant #CBET-1351411 and by National Science Foundation award #1539070, Collaboration Leading Operational UAS Development for Meteorology and Atmospheric Physics (CLOUDMAP).
The kappa statistic in rehabilitation research: an examination.
Tooth, Leigh R; Ottenbacher, Kenneth J
2004-08-01
The number and sophistication of statistical procedures reported in medical rehabilitation research is increasing. Application of the principles and methods associated with evidence-based practice has contributed to the need for rehabilitation practitioners to understand quantitative methods in published articles. Outcomes measurement and determination of reliability are areas that have experienced rapid change during the past decade. In this study, distinctions between reliability and agreement are examined. Information is presented on analytical approaches for addressing reliability and agreement, with a focus on the application of the kappa statistic. The following assumptions are discussed: (1) kappa should be used with data measured on a categorical scale, (2) the patients or objects categorized should be independent, and (3) the observers or raters must make their measurement decisions and judgments independently. Several issues related to using kappa in measurement studies are described, including use of weighted kappa, methods of reporting kappa, the effect of bias and prevalence on kappa, and sample size and power requirements for kappa. The kappa statistic is useful for assessing agreement among raters, and it is being used more frequently in rehabilitation research. Correct interpretation of the kappa statistic depends on meeting the required assumptions and accurate reporting.
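For readers who want to try the statistic, a minimal sketch of unweighted and weighted kappa for two raters, using scikit-learn and made-up categorical ratings (not data from the study):

```python
# Hedged sketch: Cohen's kappa for two raters on a three-category scale.
# The ratings below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

rater_a = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2]  # hypothetical categorical ratings
rater_b = [0, 1, 1, 1, 0, 2, 2, 1, 0, 2]

kappa = cohen_kappa_score(rater_a, rater_b)                       # unweighted
kappa_lw = cohen_kappa_score(rater_a, rater_b, weights="linear")  # weighted
print(f"kappa = {kappa:.3f}, linearly weighted kappa = {kappa_lw:.3f}")
```

Weighted kappa credits near-misses on ordered categories, which is why it is often preferred when the rating scale is ordinal.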
Functional constraints on tooth morphology in carnivorous mammals
2012-01-01
Background: The range of potential morphologies resulting from evolution is limited by complex interacting processes, ranging from development to function. Quantifying these interactions is important for understanding adaptation and convergent evolution. Using three-dimensional reconstructions of carnivoran and dasyuromorph tooth rows, we compared statistical models of the relationship between tooth row shape and the opposing tooth row, a static feature, as well as measures of mandibular motion during chewing (occlusion), which are kinetic features. This is a new approach to quantifying functional integration because we use measures of movement and displacement, such as the amount the mandible translates laterally during occlusion, as opposed to conventional morphological measures, such as mandible length and geometric landmarks. By sampling two distantly related groups of ecologically similar mammals, we study carnivorous mammals in general rather than a specific group of mammals. Results: Statistical model comparisons demonstrate that the best performing models always include some measure of mandibular motion, indicating that functional and statistical models of tooth shape as purely a function of the opposing tooth row are too simple and that increased model complexity provides a better understanding of tooth form. The predictors of the best performing models always included the opposing tooth row shape and a relative linear measure of mandibular motion. Conclusions: Our results provide quantitative support of long-standing hypotheses of tooth row shape as being influenced by mandibular motion in addition to the opposing tooth row. Additionally, this study illustrates the utility and necessity of including kinetic features in analyses of morphological integration. PMID:22899809
Online incidental statistical learning of audiovisual word sequences in adults: a registered report.
Kuppuraj, Sengottuvel; Duta, Mihaela; Thompson, Paul; Bishop, Dorothy
2018-02-01
Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real-word auditory-picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from a continuous stream (an effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a factor of the statistical complexity of the condition and exposure. Third, our novel approach to measuring online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test-retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings, noting the benefits of online measures in tracking the learning process.
Intrex Subject/Title Inverted-File Characteristics.
ERIC Educational Resources Information Center
Uemura, Syunsuke
The characteristics of the Intrex subject/title inverted file are analyzed. Basic statistics of the inverted file are presented including various distributions of the index words and terms from which the file was derived, and statistics on stems, the file growth process, and redundancy measurements. A study of stems both with extremely high and…
Nebraska's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett
2011-01-01
The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data are...
ERIC Educational Resources Information Center
Children's Defense Fund, Washington, DC.
This data book provides statistics on a range of indicators that measure critical aspects of children's lives in each of the 50 states and the District of Columbia. Statistics are provided in the following categories: (1) population and family characteristics (including number of children under age 18 and age 5, percentage of population under age…
Murray, Christopher J L
2007-03-10
Health statistics are at the centre of an increasing number of worldwide health controversies. Several factors are sharpening the tension between the supply and demand for high quality health information, and the health-related Millennium Development Goals (MDGs) provide a high-profile example. With thousands of indicators recommended but few measured well, the worldwide health community needs to focus its efforts on improving measurement of a small set of priority areas. Priority indicators should be selected on the basis of public-health significance and several dimensions of measurability. Health statistics can be divided into three types: crude, corrected, and predicted. Health statistics are necessary inputs to planning and strategic decision making, programme implementation, monitoring progress towards targets, and assessment of what works and what does not. Crude statistics that are biased have no role in any of these steps; corrected statistics are preferred. For strategic decision making, when corrected statistics are unavailable, predicted statistics can play an important part. For monitoring progress towards agreed targets and assessment of what works and what does not, however, predicted statistics should not be used. Perhaps the most effective method to decrease controversy over health statistics and to encourage better primary data collection and the development of better analytical methods is a strong commitment to provision of an explicit data audit trail. This initiative would make available the primary data, all post-data collection adjustments, models including covariates used for farcasting and forecasting, and necessary documentation to the public.
Hashim, Muhammad Jawad
2010-09-01
Post-hoc secondary data analysis with no prespecified hypotheses has been discouraged by textbook authors and journal editors alike. Unfortunately no single term describes this phenomenon succinctly. I would like to coin the term "sigsearch" to define this practice and bring it within the teaching lexicon of statistics courses. Sigsearch would include any unplanned, post-hoc search for statistical significance using multiple comparisons of subgroups. It would also include data analysis with outcomes other than the prespecified primary outcome measure of a study as well as secondary data analyses of earlier research.
ERIC Educational Resources Information Center
Weatherup, Jim, Comp.
This glossary contains brief definitions of more than 550 special or technical terms used in scientific, technical, and social science research. Entries include various kinds of statistical measures, research variables, types of research tests, and research methodologies. Some computer terminology is also included. The glossary includes both…
DOT National Transportation Integrated Search
2011-01-01
Transportation Satellite Accounts (TSA), produced by the Bureau of Economic Analysis and the Bureau of Transportation Statistics, provide measures of national transportation output. TSA includes both in-house and for-hire transportation services. Fo...
A simple rain attenuation model for earth-space radio links operating at 10-35 GHz
NASA Technical Reports Server (NTRS)
Stutzman, W. L.; Yon, K. M.
1986-01-01
The simple attenuation model has been improved from an earlier version and now includes the effect of wave polarization. The model is for the prediction of rain attenuation statistics on earth-space communication links operating in the 10-35 GHz band. Simple calculations produce attenuation values as a function of average rain rate. These, together with rain rate statistics (either measured or predicted), can be used to predict annual rain attenuation statistics. In this paper, model predictions are compared to measured data from a database of 62 experiments performed in the U.S., Europe, and Japan. Comparisons are also made to predictions from other models.
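The abstract does not give the model's equations, but models of this family build on a power-law specific attenuation scaled by an effective path length. A hedged sketch with placeholder coefficients (not the paper's values) is shown below:

```python
# Generic power-law rain attenuation sketch: specific attenuation
# gamma = k * R**alpha (dB/km), multiplied by an effective path length.
# k, alpha, path_km, and reduction are placeholders for illustration only.
def rain_attenuation_db(rain_rate_mm_per_hr, k=0.0188, alpha=1.217,
                        path_km=5.0, reduction=0.7):
    gamma = k * rain_rate_mm_per_hr ** alpha   # specific attenuation, dB/km
    return gamma * path_km * reduction         # total path attenuation, dB

for rate in (5, 25, 50, 100):                  # average rain rates, mm/h
    print(f"R = {rate:3d} mm/h -> A = {rain_attenuation_db(rate):5.2f} dB")
```

Feeding a measured or predicted rain-rate distribution through such a function yields the annual attenuation statistics the abstract describes.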
Local sensitivity analysis for inverse problems solved by singular value decomposition
Hill, M.C.; Nolan, B.T.
2010-01-01
Local sensitivity analysis provides computationally frugal ways to evaluate models commonly used for resource management, risk assessment, and so on. This includes diagnosing inverse model convergence problems caused by parameter insensitivity and(or) parameter interdependence (correlation), understanding what aspects of the model and data contribute to measures of uncertainty, and identifying new data likely to reduce model uncertainty. Here, we consider sensitivity statistics relevant to models in which the process model parameters are transformed using singular value decomposition (SVD) to create SVD parameters for model calibration. The statistics considered include the PEST identifiability statistic, and combined use of the process-model parameter statistics composite scaled sensitivities and parameter correlation coefficients (CSS and PCC). The statistics are complementary in that the identifiability statistic integrates the effects of parameter sensitivity and interdependence, while CSS and PCC provide individual measures of sensitivity and interdependence. PCC quantifies correlations between pairs or larger sets of parameters; when a set of parameters is intercorrelated, the absolute value of PCC is close to 1.00 for all pairs in the set. The number of singular vectors to include in the calculation of the identifiability statistic is somewhat subjective and influences the statistic. To demonstrate the statistics, we use the USDA’s Root Zone Water Quality Model to simulate nitrogen fate and transport in the unsaturated zone of the Merced River Basin, CA. There are 16 log-transformed process-model parameters, including water content at field capacity (WFC) and bulk density (BD) for each of five soil layers. Calibration data consisted of 1,670 observations comprising soil moisture, soil water tension, aqueous nitrate and bromide concentrations, soil nitrate concentration, and organic matter content. All 16 of the SVD parameters could be estimated by regression based on the range of singular values. Identifiability statistic results varied based on the number of SVD parameters included. Identifiability statistics calculated for four SVD parameters indicate the same three most important process-model parameters as CSS/PCC (WFC1, WFC2, and BD2), but the order differed. Additionally, the identifiability statistic showed that BD1 was almost as dominant as WFC1. The CSS/PCC analysis showed that this results from BD1's high correlation with WFC1 (-0.94), and not from its individual sensitivity. Such distinctions, combined with analysis of how high correlations and(or) sensitivities result from the constructed model, can produce important insights into, for example, the use of sensitivity analysis to design monitoring networks. In conclusion, the statistics considered identified similar important parameters. They differ in that (1) working with CSS/PCC can be more awkward because sensitivity and interdependence are considered separately and (2) identifiability requires consideration of how many SVD parameters to include. A continuing challenge is to understand how these computationally efficient methods compare with computationally demanding global methods like Markov-Chain Monte Carlo given common nonlinear processes and the often even more nonlinear models.
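A toy numerical sketch of CSS and PCC follows, using one common definition of composite scaled sensitivity (the root mean square of dimensionless scaled sensitivities) and unit observation weights; the Jacobian is random, not from the Merced River Basin model:

```python
# Assumed-data sketch: composite scaled sensitivities (CSS) and parameter
# correlation coefficients (PCC) from a Jacobian J[i, j] = d(sim_i)/d(param_j).
import numpy as np

rng = np.random.default_rng(1)
J = rng.normal(size=(50, 3))           # hypothetical Jacobian: 50 obs, 3 params
J[:, 2] = 0.95 * J[:, 1] + 0.05 * rng.normal(size=50)  # induce interdependence
b = np.array([2.0, 0.5, 1.5])          # parameter values used for scaling

dss = J * b                            # dimensionless scaled sensitivities
css = np.sqrt((dss ** 2).mean(axis=0)) # one CSS per parameter

cov = np.linalg.inv(J.T @ J)           # parameter covariance (unit weights)
d = np.sqrt(np.diag(cov))
pcc = cov / np.outer(d, d)             # correlation matrix; |PCC| near 1 flags pairs

print("CSS:", css.round(3))
print("PCC:\n", pcc.round(2))
```

In this toy case the deliberately intercorrelated pair shows a large |PCC|, mirroring the BD1/WFC1 diagnosis described in the abstract.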
Making the Math/Science Connection.
ERIC Educational Resources Information Center
Sherman, Laurel Galbraith
1989-01-01
Suggestions are made for activities that combine the teaching of math and science. Math concepts include: graphing, estimating, measurement, statistics, geometry, and logic. Science topics include: plant reproduction, solar system, forces, longitude and latitude, Earth's magnetic field, nutrition, and heat. (IAH)
Xiong, Tengbin; Parekh-Bhurke, Sheetal; Loke, Yoon K; Sutton, Alex J; Eastwood, Alison J; Holland, Richard; Chen, Yen-Fu; Glenny, Anne-Marie; Deeks, Jonathan J; Altman, Doug G
2011-01-01
Objective: To investigate the agreement between direct and indirect comparisons of competing healthcare interventions. Design: Meta-epidemiological study based on a sample of meta-analyses of randomised controlled trials. Data sources: Cochrane Database of Systematic Reviews and PubMed. Inclusion criteria: Systematic reviews that provided sufficient data for both direct comparison and independent indirect comparisons of two interventions on the basis of a common comparator and in which the odds ratio could be used as the outcome statistic. Main outcome measure: Inconsistency measured by the difference in the log odds ratio between the direct and indirect methods. Results: The study included 112 independent trial networks (including 1552 trials with 478 775 patients in total) that allowed both direct and indirect comparison of two interventions. Indirect comparison had already been explicitly done in only 13 of the 85 Cochrane reviews included. The inconsistency between the direct and indirect comparison was statistically significant in 16 cases (14%, 95% confidence interval 9% to 22%). The statistically significant inconsistency was associated with fewer trials, subjectively assessed outcomes, and statistically significant effects of treatment in either direct or indirect comparisons. Owing to considerable inconsistency, many (14/39) of the statistically significant effects by direct comparison became non-significant when the direct and indirect estimates were combined. Conclusions: Significant inconsistency between direct and indirect comparisons may be more prevalent than previously observed. Direct and indirect estimates should be combined in mixed treatment comparisons only after adequate assessment of the consistency of the evidence. PMID:21846695
Children in the States Data Book, 1998.
ERIC Educational Resources Information Center
Children's Defense Fund, Washington, DC.
This data book from the Children's Defense Fund includes statistics on a range of indicators that measure critical aspects of children's lives in each of the states and the United States as a whole. Statistics are provided in the following categories: (1) population and family characteristics (number of children under age 18 and age 6, number of…
Examples of Data Analysis with SPSS/PC+ Studentware.
ERIC Educational Resources Information Center
MacFarland, Thomas W.
Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics with files previously created in WordPerfect 4.2 and Lotus 1-2-3 Version 1.A for the IBM PC+. The statistical measures covered include Student's t-test with two independent samples; Student's t-test with a paired sample; Chi-square analysis;…
An Empirical Investigation of Methods for Assessing Item Fit for Mixed Format Tests
ERIC Educational Resources Information Center
Chon, Kyong Hee; Lee, Won-Chan; Ansley, Timothy N.
2013-01-01
Empirical information regarding the performance of model-fit procedures has been a persistent need in measurement practice. Statistical procedures for evaluating item fit were applied to real test examples that consist of both dichotomously and polytomously scored items. The item fit statistics used in this study included PARSCALE's G²,…
The other half of the story: effect size analysis in quantitative research.
Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane
2013-01-01
Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
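A minimal sketch pairing a significance test with one of the effect size indices discussed (Cohen's d with a pooled standard deviation), using simulated scores for two independent groups:

```python
# Significance test plus effect size for two independent groups (made-up data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
control = rng.normal(70, 10, 40)
treatment = rng.normal(75, 10, 40)

t, p = stats.ttest_ind(treatment, control)

# Cohen's d with a pooled standard deviation
n1, n2 = len(treatment), len(control)
pooled_sd = np.sqrt(((n1 - 1) * treatment.std(ddof=1) ** 2 +
                     (n2 - 1) * control.std(ddof=1) ** 2) / (n1 + n2 - 2))
d = (treatment.mean() - control.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.4f}, Cohen's d = {d:.2f}")
```

Reporting d alongside p conveys practical significance: the same p value can correspond to trivially small or substantively large group differences depending on sample size.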
NASA Astrophysics Data System (ADS)
Adams, T.; Batra, P.; Bugel, L.; Camilleri, L.; Conrad, J. M.; de Gouvêa, A.; Fisher, P. H.; Formaggio, J. A.; Jenkins, J.; Karagiorgi, G.; Kobilarcik, T. R.; Kopp, S.; Kyle, G.; Loinaz, W. A.; Mason, D. A.; Milner, R.; Moore, R.; Morfín, J. G.; Nakamura, M.; Naples, D.; Nienaber, P.; Olness, F. I.; Owens, J. F.; Pate, S. F.; Pronin, A.; Seligman, W. G.; Shaevitz, M. H.; Schellman, H.; Schienbein, I.; Syphers, M. J.; Tait, T. M. P.; Takeuchi, T.; Tan, C. Y.; van de Water, R. G.; Yamamoto, R. K.; Yu, J. Y.
We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass), to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parametrized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint of "Beyond the Standard Model" physics.
Measuring Effectiveness in a Virtual Library
ERIC Educational Resources Information Center
Finch, Jannette L.
2010-01-01
Measuring quality of service in academic libraries traditionally includes quantifiable data such as collection size, staff counts, circulation numbers, reference service statistics, qualitative analyses of customer satisfaction, shelving accuracy, and building comfort. In the libraries of the third millennium, virtual worlds, Web content and…
Akuffo, Kwadwo Owusu; Beatty, Stephen; Peto, Tunde; Stack, Jim; Stringham, Jim; Kelly, David; Leung, Irene; Corcoran, Laura; Nolan, John M
2017-10-01
The purpose of this study was to evaluate the impact of supplemental macular carotenoids (including versus not including meso-zeaxanthin) in combination with coantioxidants on visual function in patients with nonadvanced age-related macular degeneration. In this study, 121 participants were randomly assigned to group 1 (Age-Related Eye Disease Study 2 formulation with a low dose [25 mg] of zinc and an addition of 10 mg meso-zeaxanthin; n = 60) or group 2 (Age-Related Eye Disease Study 2 formulation with a low dose [25 mg] of zinc; n = 61). Visual function was assessed using best-corrected visual acuity, contrast sensitivity (CS), glare disability, retinal straylight, photostress recovery time, reading performance, and the National Eye Institute Visual Function Questionnaire-25. Macular pigment was measured using customized heterochromatic flicker photometry. There was a statistically significant improvement in the primary outcome measure (letter CS at 6 cycles per degree [6 cpd]) over time (P = 0.013), and this observed improvement was statistically comparable between interventions (P = 0.881). Statistically significant improvements in several secondary outcome visual function measures (letter CS at 1.2 and 2.4 cpd; mesopic and photopic CS at all spatial frequencies; mesopic glare disability at 1.5, 3, and 6 cpd; photopic glare disability at 1.5, 3, 6, and 12 cpd; photostress recovery time; retinal straylight; mean and maximum reading speed) were also observed over time (P < 0.05, for all), and were statistically comparable between interventions (P > 0.05, for all). Statistically significant increases in macular pigment at all eccentricities were observed over time (P < 0.0005, for all), and the degree of augmentation was statistically comparable between interventions (P > 0.05). Antioxidant supplementation in patients with nonadvanced age-related macular degeneration results in significant increases in macular pigment and improvements in CS and other measures of visual function. (Clinical trial, http://www.isrctn.com/ISRCTN13894787).
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
Poole, Kerry; Mason, Howard
2007-03-15
To establish the relationship between quantitative tests of hand function and upper limb disability, as measured by the Disability of the Arm, Shoulder and Hand (DASH) questionnaire, in hand-arm vibration syndrome (HAVS). A total of 228 individuals with HAVS were included in this study. Each had undergone a full HAVS assessment by an experienced physician, including quantitative tests of vibrotactile and thermal perception thresholds, maximal hand-grip strength (HG) and the Purdue pegboard (PP) test. Individuals were also asked to complete a DASH questionnaire. Of the quantitative tests, PP and HG gave the best, statistically significant individual correlations with the DASH disability score (r² = 0.168 and 0.096, respectively). Stepwise linear regression analysis revealed that only PP and HG measurements were statistically significant predictors of upper limb disability (r² = 0.178). Overall, a combination of the PP and HG measurements, rather than each alone, gave slightly better discrimination, although not statistically significant, between normal and abnormal DASH scores, with a sensitivity of 73.1% and specificity of 64.3%. Measurements of manual dexterity and hand-grip strength using PP and HG may be useful in helping to confirm lack of upper limb function and 'perceived' disability in HAVS.
NASA Astrophysics Data System (ADS)
Boning, Duane S.; Chung, James E.
1998-11-01
Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
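The statistical decomposition step can be illustrated with a hedged sketch: thickness measured at the same within-die sites across many dies is split into a wafer-level mean, a die-position-dependent systematic component, and a random residual. The data below are simulated, not from the paper:

```python
# Toy decomposition of within-chip variation in ILD thickness (nm).
import numpy as np

rng = np.random.default_rng(3)
n_dies, n_sites = 20, 9
systematic = rng.normal(0.0, 30.0, n_sites)            # pattern-dependent offsets
thickness = 800.0 + systematic + rng.normal(0, 10.0, (n_dies, n_sites))

wafer_mean = thickness.mean()
site_effect = thickness.mean(axis=0) - wafer_mean      # estimated systematic part
residual = thickness - wafer_mean - site_effect        # random component

print("wafer mean:", round(wafer_mean, 1), "nm")
print("systematic (site) std:", round(site_effect.std(ddof=1), 1), "nm")
print("random residual std:", round(residual.std(ddof=1), 1), "nm")
```

Averaging across dies at each within-die location isolates the pattern-dependent component, which is exactly the signal that dummy-fill design rules aim to reduce.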
Statistical methods in personality assessment research.
Schinka, J A; LaLone, L; Broeckel, J A
1997-06-01
Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article, we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.
THE Role OF Anisotropy AND Intermittency IN Solar Wind/Magnetosphere Coupling
NASA Astrophysics Data System (ADS)
Jankovicova, D.; Voros, Z.
2006-12-01
Turbulent fluctuations are common in the solar wind as well as in the Earth's magnetosphere. The fluctuations of both magnetic field and plasma parameters exhibit non-Gaussian statistics. Neither the amplitude of these fluctuations nor their spectral characteristics can provide a full statistical description of multi-scale features in turbulence. This motivates a statistical approach that includes the estimation of experimentally accessible statistical moments. In this contribution, we directly estimate the third (skewness) and fourth (kurtosis) statistical moments from the available time series of magnetic measurements in the solar wind (ACE and WIND spacecraft) and in the Earth's magnetosphere (SYM-H index). We then evaluate how the statistical moments change during strong and weak solar wind/magnetosphere coupling intervals.
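The moment estimation itself is straightforward; a sketch on a synthetic series (not ACE/WIND data) shows how skewness and excess kurtosis of the increments flag non-Gaussian fluctuations:

```python
# Estimating third and fourth standardized moments from a time series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic signal with heavy-tailed (intermittent-like) fluctuations
b = rng.normal(0, 1, 4096).cumsum() * 0.01 + rng.standard_t(df=4, size=4096)

increments = np.diff(b)                  # fluctuations at the smallest scale
print("skewness:", stats.skew(increments))
print("kurtosis (excess):", stats.kurtosis(increments))  # 0 for a Gaussian
```

Excess kurtosis well above zero is the classic signature of intermittency that amplitude or spectral measures alone would miss.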
Quantifying the impact of between-study heterogeneity in multivariate meta-analyses
Jackson, Dan; White, Ian R; Riley, Richard D
2012-01-01
Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I² statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R² statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis and is used to derive a multivariate analogue of I². We also provide a multivariate H² statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I² statistic. Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
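The univariate building blocks generalized by the paper can be computed directly; a sketch with invented study effects and within-study variances:

```python
# Univariate Cochran's Q, H^2 = Q/df, and I^2 (study data invented).
import numpy as np

y = np.array([0.30, 0.10, 0.45, 0.60, 0.25])   # study effect estimates
v = np.array([0.04, 0.05, 0.03, 0.06, 0.04])   # within-study variances

w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)            # fixed-effect pooled estimate
Q = np.sum(w * (y - y_fixed) ** 2)             # Cochran's heterogeneity statistic
df = len(y) - 1
H2 = Q / df
I2 = max(0.0, (Q - df) / Q)                    # fraction of variation beyond chance
print(f"Q = {Q:.2f}, H^2 = {H2:.2f}, I^2 = {100 * I2:.1f}%")
```

The paper's contribution is the multivariate analogue of these quantities; the univariate versions above are the special case of a single pooled outcome.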
Statistical Model of Dynamic Markers of the Alzheimer's Pathological Cascade.
Balsis, Steve; Geraci, Lisa; Benge, Jared; Lowe, Deborah A; Choudhury, Tabina K; Tirso, Robert; Doody, Rachelle S
2018-05-05
Alzheimer's disease (AD) is a progressive disease reflected in markers across assessment modalities, including neuroimaging, cognitive testing, and evaluation of adaptive function. Identifying a single continuum of decline across assessment modalities in a single sample is statistically challenging because of the multivariate nature of the data. To address this challenge, we implemented advanced statistical analyses designed specifically to model complex data along a single continuum. We analyzed data from the Alzheimer's Disease Neuroimaging Initiative (ADNI; N = 1,056), focusing on indicators from assessments of magnetic resonance imaging (MRI) volume, fluorodeoxyglucose positron emission tomography (FDG-PET) metabolic activity, cognitive performance, and adaptive function. Item response theory was used to identify the continuum of decline. Then, through a process of statistical scaling, indicators across all modalities were linked to that continuum and analyzed. Findings revealed that measures of MRI volume, FDG-PET metabolic activity, and adaptive function added measurement precision beyond that provided by cognitive measures, particularly in the relatively mild range of disease severity. More specifically, MRI volume and FDG-PET metabolic activity become compromised in the very mild range of severity, followed by cognitive performance and finally adaptive function. Our statistically derived models of the AD pathological cascade are consistent with existing theoretical models.
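A sketch of the item response theory ingredient: a two-parameter logistic item characteristic curve over a latent severity continuum. The parameters are illustrative, not ADNI estimates:

```python
# 2PL item characteristic curves along a latent severity continuum theta.
import numpy as np

def p_2pl(theta, a, b):
    """Probability of an abnormal response: discrimination a, location b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)                  # latent severity continuum
print(p_2pl(theta, a=1.5, b=0.0).round(3))     # an item sensitive mid-continuum
print(p_2pl(theta, a=1.2, b=2.0).round(3))     # an item sensitive late in decline
```

Items with different locations b add measurement precision in different severity ranges, which is how imaging markers can sharpen the scale where cognitive items are still uninformative.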
Lambert, Nathaniel D.; Pankratz, V. Shane; Larrabee, Beth R.; Ogee-Nwankwo, Adaeze; Chen, Min-hsin; Icenogle, Joseph P.
2014-01-01
Rubella remains a social and economic burden due to the high incidence of congenital rubella syndrome (CRS) in some countries. For this reason, an accurate and efficient high-throughput measure of antibody response to vaccination is an important tool. In order to measure rubella-specific neutralizing antibodies in a large cohort of vaccinated individuals, a high-throughput immunocolorimetric system was developed. Statistical interpolation models were applied to the resulting titers to refine quantitative estimates of neutralizing antibody titers relative to the assayed neutralizing antibody dilutions. This assay, including the statistical methods developed, can be used to assess the neutralizing humoral immune response to rubella virus and may be adaptable for assessing the response to other viral vaccines and infectious agents. PMID:24391140
Fujiura, Glenn T; Rutkowski-Kmitta, Violet; Owen, Randall
2010-12-01
Statistics are critical in holding governments accountable for the well-being of citizens with disability. International initiatives are underway to improve the quality of disability statistics, but meaningful intellectual disability (ID) data are exceptionally rare. The status of ID data was evaluated in a review of 12 national statistical systems. Recurring data collection by national ministries was identified, and the availability of measures of poverty, exclusion, and disadvantage was assessed. A total of 131 recurring systems coordinated by 50 different ministries were identified. The majority included general disability, but less than 25% of the systems screened for ID. Of these, few provided policy-relevant data. The scope of ID data was dismal at best, though a significant statistical infrastructure exists for the integration of ID data. Advocacy will be necessary. There is no optimal form of data monitoring, and decisions regarding priorities in purpose, targeted audiences, and the goals for surveillance must be resolved.
NASA Technical Reports Server (NTRS)
Butler, C. M.; Hogge, J. E.
1978-01-01
Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
Rowlands, G J; Musoke, A J; Morzaria, S P; Nagda, S M; Ballingall, K T; McKeever, D J
2000-04-01
A statistically derived disease reaction index based on parasitological, clinical and haematological measurements observed in 309 Boran cattle, aged 5 to 8 months, following laboratory challenge with Theileria parva is described. Principal component analysis was applied to 13 measures, including first appearance of schizonts, first appearance of piroplasms and first occurrence of pyrexia, together with the duration and severity of these symptoms, and white blood cell count. The first principal component, which was based on approximately equal contributions of the 13 variables, provided the definition of the disease reaction index, expressed on a scale of 0-10. As well as providing a more objective measure of the severity of the reaction, the continuous nature of the index score enables more powerful statistical analysis of the data than was previously possible with clinically derived categories of non-, mild, moderate and severe reactions.
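A hedged sketch of the construction: the first principal component of standardized measurements, rescaled to a 0-10 index. The 13 real variables are replaced here by a synthetic matrix:

```python
# First-principal-component index rescaled to 0-10 (synthetic stand-in data).
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(309, 13))                     # stand-in for the 13 measures
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize each measure

# Leading principal component via the correlation-matrix eigendecomposition
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
pc1 = Z @ eigvecs[:, -1]                           # scores on the first component

index = 10 * (pc1 - pc1.min()) / (pc1.max() - pc1.min())  # rescale to 0-10
print("index range:", index.min(), "-", index.max())
```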
Measuring X-Ray Polarization in the Presence of Systematic Effects: Known Background
NASA Technical Reports Server (NTRS)
Elsner, Ronald F.; O'Dell, Stephen L.; Weisskopf, Martin C.
2012-01-01
The prospects for accomplishing x-ray polarization measurements of astronomical sources have grown in recent years, after a hiatus of more than 37 years. Unfortunately, accompanying this long hiatus has been some confusion over the statistical uncertainties associated with x-ray polarization measurements of these sources. We have initiated a program to perform the detailed calculations that will offer insights into the uncertainties associated with x-ray polarization measurements. Here we describe a mathematical formalism for determining the 1- and 2-parameter errors in the magnitude and position angle of x-ray (linear) polarization in the presence of a (polarized or unpolarized) background. We further review relevant statistics, including a clear distinction between the Minimum Detectable Polarization (MDP) and the accuracy of a polarization measurement.
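For reference, a commonly quoted form of the MDP at 99% confidence (our addition, not a formula stated in the abstract) is MDP99 = (4.29 / (mu * Rs)) * sqrt((Rs + Rb) / T), for modulation factor mu, source rate Rs, background rate Rb, and exposure time T:

```python
# Commonly quoted MDP99 formula; the input values are arbitrary examples.
import math

def mdp99(mu, rate_src, rate_bkg, t_sec):
    return 4.29 / (mu * rate_src) * math.sqrt((rate_src + rate_bkg) / t_sec)

print(f"MDP99 = {100 * mdp99(mu=0.5, rate_src=1.0, rate_bkg=0.2, t_sec=1e5):.2f}%")
```

Note the distinction the authors emphasize: MDP is a detection threshold, not the uncertainty on a measured polarization, which requires the 1- and 2-parameter error analysis they describe.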
Analysis strategies for longitudinal attachment loss data.
Beck, J D; Elter, J R
2000-02-01
The purpose of this invited review is to describe and discuss methods currently in use to quantify the progression of attachment loss in epidemiological studies of periodontal disease, and to make recommendations for specific analytic methods based upon the particular design of the study and the structure of the data. The review concentrates on the definition of incident attachment loss (ALOSS) and its component parts; measurement issues, including thresholds and regression to the mean; methods of accounting for longitudinal change, including changes in means, changes in proportions of affected sites, incidence density, the effect of tooth loss and reversals, and repeated events; statistical models of longitudinal change, including the incorporation of the time element, use of linear, logistic or Poisson regression or survival analysis, and statistical tests; site versus person level of analysis, including statistical adjustment for correlated data; and the strengths and limitations of ALOSS data. Examples from the Piedmont 65+ Dental Study are used to illustrate specific concepts. We conclude that incidence density is the preferred methodology for periodontal studies with more than one period of follow-up, and that analyses not employing methods for dealing with complex samples, correlated data, and repeated measures do not take advantage of our current understanding of the site- and person-level variables important in periodontal disease and may generate biased results.
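The recommended incidence density measure is simply events per unit of time at risk; a quick illustration with invented counts:

```python
# Incidence density: events per unit of person-time (here, site-years) at risk.
events = 37                    # sites with new attachment loss >= threshold
person_time = 812.5            # total site-years of follow-up (invented)
incidence_density = events / person_time
print(f"incidence density = {incidence_density:.3f} events per site-year")
```

Unlike cumulative incidence, this denominator naturally accommodates subjects and sites followed for unequal lengths of time across multiple follow-up periods.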
Public and patient involvement in quantitative health research: A statistical perspective.
Hannigan, Ailish
2018-06-19
The majority of studies included in recent reviews of impact for public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. This article explores the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, that is, sampling, measurement and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to and increasing participation of under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis and in deciding what variables to measure may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors, and this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors. Health Expectations published by John Wiley & Sons Ltd.
Hawaii: 2002 Economic Census. 2002 Educational Services, Geographic Area Series. EC02-61A-HI.
ERIC Educational Resources Information Center
US Department of Commerce, 2005
2005-01-01
The economic census furnishes an important part of the framework for such composite measures as the gross domestic product estimates, input/output measures, production and price indexes, and other statistical series that measure short-term changes in economic conditions. Specific uses of economic census data include the following: Policymaking…
Montana: 2002. 2002 Economic Census. Educational Services, Geographic Area Series. EC02-61A-MT
ERIC Educational Resources Information Center
US Department of Commerce, 2005
2005-01-01
The economic census furnishes an important part of the framework for such composite measures as the gross domestic product estimates, input/output measures, production and price indexes, and other statistical series that measure short-term changes in economic conditions. Specific uses of economic census data include the following: Policymaking…
CTS/Comstar communications link characterization experiment
NASA Technical Reports Server (NTRS)
Hodge, D. B.; Taylor, R. C.
1980-01-01
Measurements of angle of arrival and amplitude fluctuations on millimeter wavelength Earth-space communication links are described. Measurement of rainfall attenuation and radiometric temperature statistics and the assessment of the performance of a self-phased array as a receive antenna on an Earth-space link are also included.
Advances in Statistical Methods for Substance Abuse Prevention Research
MacKinnon, David P.; Lockwood, Chondra M.
2010-01-01
The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
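One of the developments named above, mediation analysis, can be sketched with the product-of-coefficients approach and a Sobel-type standard error. This is a generic illustration with simulated data, not the paper's worked example:

```python
# Product-of-coefficients mediation sketch: program X -> mediator M -> outcome Y.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
x = rng.binomial(1, 0.5, 500).astype(float)     # program vs control
m = 0.5 * x + rng.normal(size=500)              # mediator (e.g., peer norms)
y = 0.4 * m + 0.1 * x + rng.normal(size=500)    # outcome (e.g., substance use)

fit_m = sm.OLS(m, sm.add_constant(x)).fit()                        # M ~ X
fit_y = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()  # Y ~ X + M

a, se_a = fit_m.params[1], fit_m.bse[1]          # X -> M path
b, se_b = fit_y.params[2], fit_y.bse[2]          # M -> Y path, adjusting for X
ab = a * b                                       # indirect (mediated) effect
se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2) # first-order Sobel SE
print(f"indirect effect ab = {ab:.3f}, z = {ab / se_ab:.2f}")
```

In practice, bootstrap confidence intervals for ab are often preferred over the Sobel test because the product a*b is not normally distributed in small samples.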
Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas
2009-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment within individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology like SOCR in a sound pedagogical and scientific manner enhances students' overall understanding and suggests better long-term knowledge retention. PMID:19750185
Ion Channel Conductance Measurements on a Silicon-Based Platform
2006-01-01
…calculated using the molecular dynamics code GROMACS. Reasonable agreement is obtained in the simulated versus measured conductance over the range of… measurements of the lipid giga-seal characteristics have been performed, including AC conductance measurements and statistical analysis in order to… Dynamics kernel self-consistently coupled to Poisson equations using a P3M force field scheme and the GROMACS description of protein structure and…
ERIC Educational Resources Information Center
Steuerle, Eugene; McClung, Nelson
This technical study is concerned with both the statistical and policy effects of alternative definitions of poverty which result when the definition of means is altered by varying the time period (accounting period) over which income is measured or by including in the measure of means not only realized income, but also unrealized income and…
Statistical methods for change-point detection in surface temperature records
NASA Astrophysics Data System (ADS)
Pintar, A. L.; Possolo, A.; Zhang, N. F.
2013-09-01
We describe several statistical methods to detect possible change-points in a time series of values of surface temperature measured at a meteorological station, and to assess the statistical significance of such changes, taking into account the natural variability of the measured values, and the autocorrelations between them. These methods serve to determine whether the record may suffer from biases unrelated to the climate signal, hence whether there may be a need for adjustments as considered by M. J. Menne and C. N. Williams (2009) "Homogenization of Temperature Series via Pairwise Comparisons", Journal of Climate 22 (7), 1700-1717. We also review methods to characterize patterns of seasonality (seasonal decomposition using monthly medians or robust local regression), and explain the role they play in the imputation of missing values, and in enabling robust decompositions of the measured values into a seasonal component, a possible climate signal, and a station-specific remainder. The methods for change-point detection that we describe include statistical process control, wavelet multi-resolution analysis, adaptive weights smoothing, and a Bayesian procedure, all of which are applicable to single station records.
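Of the change-point methods listed, statistical process control is the simplest to make concrete. The sketch below is not the authors' code; the critical value and toy series are illustrative assumptions. It scans a standardized anomaly series with a two-sided CUSUM and flags the largest excursion:

    import numpy as np

    def cusum_changepoint(x, crit=1.36):
        """Two-sided CUSUM scan for a single mean shift. Removing the
        overall mean makes the cumulative sum a bridge, so
        max|S_k|/sqrt(n) can be compared against a Kolmogorov-style
        5% critical value of about 1.36. Returns the index of the
        suspected shift, or None if no significant shift is found."""
        x = np.asarray(x, dtype=float)
        z = (x - x.mean()) / x.std(ddof=1)   # standardized anomalies
        s = np.cumsum(z)
        k = int(np.argmax(np.abs(s)))        # largest cumulative excursion
        return k if np.abs(s[k]) / np.sqrt(len(x)) > crit else None

    # Toy record: a 0.8-sigma upward shift halfway through 200 monthly values
    rng = np.random.default_rng(0)
    series = np.concatenate([rng.normal(0, 1, 100), rng.normal(0.8, 1, 100)])
    print(cusum_changepoint(series))         # index near 100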
RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.
Glaab, Enrico; Schneider, Reinhard
2015-07-01
High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plots, heat maps and principal component analysis visualizations to interpret omics data and derived statistics. Availability: freely available at http://www.repexplore.tk. Contact: enrico.glaab@uni.lu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
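To convey the core idea in miniature (RepExplore itself builds on more sophisticated published probabilistic methods, which this does not reproduce), the sketch below keeps the technical-replicate variance alongside the average instead of discarding it; the function name and toy data are invented for the example:

    import numpy as np

    def replicate_summary(reps):
        """Summarize technical replicates (rows) per feature (columns),
        retaining the replicate variance that plain averaging discards."""
        reps = np.asarray(reps, dtype=float)
        mean = reps.mean(axis=0)
        var = reps.var(axis=0, ddof=1)        # technical variance per feature
        sem = np.sqrt(var / reps.shape[0])    # uncertainty of the average
        return mean, sem

    # Three technical replicates of four features
    mean, sem = replicate_summary([[1.0, 2.1, 3.0, 4.2],
                                   [1.1, 2.0, 2.9, 4.0],
                                   [0.9, 2.2, 3.1, 4.4]])
    print(mean, sem)   # downstream tests can weight features by 1/sem**2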
High order statistical signatures from source-driven measurements of subcritical fissile systems
NASA Astrophysics Data System (ADS)
Mattingly, John Kelly
1998-11-01
This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements.
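The abstract does not name the specific statistics, but a classic second-order counting signature in subcritical neutron noise analysis is the Feynman variance-to-mean excess (Feynman-Y); the following is a hedged illustration of that statistic on synthetic counts, not the author's method:

    import numpy as np

    def feynman_y(counts):
        """Excess of the variance-to-mean ratio of gated counts over the
        Poisson value of 1; positive values indicate correlated
        (e.g., fission-chain) events."""
        counts = np.asarray(counts, dtype=float)
        return counts.var(ddof=1) / counts.mean() - 1.0

    rng = np.random.default_rng(1)
    print(feynman_y(rng.poisson(10.0, 100_000)))   # ~0 for a Poisson source
    clustered = rng.poisson(5.0, 100_000) * rng.integers(1, 4, 100_000)
    print(feynman_y(clustered))                    # >0 for correlated counts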
Cohen, Alan A; Milot, Emmanuel; Li, Qing; Legault, Véronique; Fried, Linda P; Ferrucci, Luigi
2014-09-01
Measuring physiological dysregulation during aging could be a key tool both to understand underlying aging mechanisms and to predict clinical outcomes in patients. However, most existing indices are either circular or hard to interpret biologically. Recently, we showed that statistical distance of 14 common blood biomarkers (a measure of how strange an individual's biomarker profile is) was associated with age and mortality in the WHAS II data set, validating its use as a measure of physiological dysregulation. Here, we extend the analyses to other data sets (WHAS I and InCHIANTI) to assess the stability of the measure across populations. We found that the statistical criteria used to determine the original 14 biomarkers produced diverging results across populations; in other words, had we started with a different data set, we would have chosen a different set of markers. Nonetheless, the same 14 markers (or the subset of 12 available for InCHIANTI) produced highly similar predictions of age and mortality. We include analyses of all combinatorial subsets of the markers and show that results do not depend much on biomarker choice or data set, but that more markers produce a stronger signal. We conclude that statistical distance as a measure of physiological dysregulation is stable across populations in Europe and North America. Copyright © 2014 Elsevier Inc. All rights reserved.
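In related publications by this group, the "statistical distance" of a biomarker profile is the Mahalanobis distance from a reference population's centroid; assuming that reading, a minimal sketch with invented data:

    import numpy as np

    def mahalanobis_dysregulation(profile, reference):
        """Distance of one biomarker profile from a reference population,
        accounting for biomarker correlations (larger = more aberrant)."""
        reference = np.asarray(reference, dtype=float)
        mu = reference.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
        d = np.asarray(profile, dtype=float) - mu
        return float(np.sqrt(d @ cov_inv @ d))

    rng = np.random.default_rng(2)
    population = rng.normal(size=(500, 14))    # 500 subjects, 14 biomarkers
    print(mahalanobis_dysregulation(population[0], population))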
Aiken, Leona S; West, Stephen G; Millsap, Roger E
2008-01-01
In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory rather than field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the introductory statistics course (modally one year long), leaving little room for advanced study. Curricular enhancements were noted in statistics and, to a minor degree, in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Inferring Small Scale Dynamics from Aircraft Measurements of Tracers
NASA Technical Reports Server (NTRS)
Sparling, L. C.; Einaudi, Franco (Technical Monitor)
2000-01-01
The millions of ER-2 and DC-8 aircraft measurements of long-lived tracers in the Upper Troposphere/Lower Stratosphere (UT/LS) hold enormous potential as a source of statistical information about subgrid scale dynamics. Extracting this information however can be extremely difficult because the measurements are made along a 1-D transect through fields that are highly anisotropic in all three dimensions. Some of the challenges and limitations posed by both the instrumentation and platform are illustrated within the context of the problem of using the data to obtain an estimate of the dissipation scale. This presentation will also include some tutorial remarks about the conditional and two-point statistics used in the analysis.
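A second-order structure function is one standard two-point statistic computable along a 1-D flight transect; the sketch below is purely illustrative (the presentation's exact statistics are not specified), with red noise standing in for a tracer record:

    import numpy as np

    def structure_function(x, max_lag, order=2):
        """Two-point statistic along a 1-D transect: the mean of
        |x(s+r) - x(s)|**order as a function of separation r (samples)."""
        x = np.asarray(x, dtype=float)
        return np.array([np.mean(np.abs(x[r:] - x[:-r]) ** order)
                         for r in range(1, max_lag + 1)])

    rng = np.random.default_rng(3)
    tracer = np.cumsum(rng.normal(size=5000))   # red-noise tracer stand-in
    print(structure_function(tracer, max_lag=5))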
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
Productivity Continued to Increase in Many Industries during 1984.
ERIC Educational Resources Information Center
Herman, Arthur S.
1986-01-01
Productivity, as measured by output per employee hour, grew in 1984 in about three quarters of the industries for which the Bureau of Labor Statistics regularly publishes data. (A table shows productivity trends in industries measured by the Bureau, including mining, transportation and utilities, and trade and services.) (CT)
Standardizing power monitoring and control at exascale
Grant, Ryan E.; Levenhagen, Michael; Olivier, Stephen L.; ...
2016-10-20
Power API-the result of collaboration among national laboratories, universities, and major vendors-provides a range of standardized power management functions, from application-level control and measurement to facility-level accounting, including real-time and historical statistics gathering. Here, support is already available for Intel and AMD CPUs and standalone measurement devices.
How to Use Value-Added Measures Right
ERIC Educational Resources Information Center
Di Carlo, Matthew
2012-01-01
Value-added models are a specific type of "growth model," a diverse group of statistical techniques to isolate a teacher's impact on his or her students' testing progress while controlling for other measurable factors, such as student and school characteristics, that are outside that teacher's control. Opponents, including many teachers, argue…
48 CFR 1401.7001-4 - Acquisition performance measurement systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the...
Villagómez-Ornelas, Paloma; Hernández-López, Pedro; Carrasco-Enríquez, Brenda; Barrios-Sánchez, Karina; Pérez-Escamilla, Rafael; Melgar-Quiñónez, Hugo
2014-01-01
This article validates the statistical consistency of two food security scales: the Mexican Food Security Scale (EMSA) and the Latin American and Caribbean Food Security Scale (ELCSA). Validity tests were conducted in order to verify that both scales were consistent instruments, conformed by independent, properly calibrated and adequately sorted items, arranged in a continuum of severity. The following tests were developed: sorting of items; Cronbach's alpha analysis; parallelism of prevalence curves; Rasch models; sensitivity analysis through mean differences' hypothesis test. The tests showed that both scales meet the required attributes and are robust statistical instruments for food security measurement. This is relevant given that the lack of access to food indicator, included in multidimensional poverty measurement in Mexico, is calculated with EMSA.
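Of the tests listed, Cronbach's alpha is the most compact to illustrate; a minimal sketch on invented dichotomous responses (not EMSA/ELCSA data):

    import numpy as np

    def cronbach_alpha(items):
        """Internal-consistency coefficient for an item-response matrix
        (rows = respondents, columns = scale items)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

    # Five respondents answering three dichotomous food-security items
    responses = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0], [1, 1, 0]]
    print(round(cronbach_alpha(responses), 3))   # ~0.692 for this toy data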
Hagell, Peter; Westergren, Albert
Sample size is a major factor in statistical null hypothesis testing, which is the basis for many approaches to testing Rasch model fit. Few sample size recommendations for testing fit to the Rasch model concern the Rasch Unidimensional Measurement Models (RUMM) software, which features chi-square and ANOVA/F-ratio based fit statistics, including Bonferroni and algebraic sample size adjustments. This paper explores the occurrence of Type I errors with RUMM fit statistics, and the effects of algebraic sample size adjustments. Data simulated to fit the Rasch model, for 25-item dichotomous scales with sample sizes ranging from N = 50 to N = 2500, were analysed with and without algebraically adjusted sample sizes. Results suggest the occurrence of Type I errors with N ≤ 500, and that Bonferroni correction as well as downward algebraic sample size adjustment are useful to avoid such errors, whereas upward adjustment of smaller samples falsely signals misfit. Our observations suggest that sample sizes around N = 250 to N = 500 may provide a good balance for the statistical interpretation of the RUMM fit statistics studied here with respect to Type I errors, under the assumption of Rasch model fit within the examined frame of reference (i.e., about 25 item parameters well targeted to the sample).
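For orientation, data fitting the dichotomous Rasch model can be simulated as in the study's setup (25 items, varying N); the ability and difficulty distributions below are illustrative assumptions:

    import numpy as np

    def simulate_rasch(n_persons, item_difficulties, rng):
        """Dichotomous Rasch responses: P(correct) = logistic(theta - b)."""
        theta = rng.normal(size=(n_persons, 1))        # person abilities
        b = np.asarray(item_difficulties)[None, :]     # item difficulties
        p = 1.0 / (1.0 + np.exp(-(theta - b)))
        return (rng.random(p.shape) < p).astype(int)

    rng = np.random.default_rng(4)
    data = simulate_rasch(500, np.linspace(-2, 2, 25), rng)  # N=500, 25 items
    print(data.shape, data.mean())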
NASA Astrophysics Data System (ADS)
Chodera, John D.; Noé, Frank
2010-09-01
Discrete-state Markov (or master equation) models provide a useful simplified representation for characterizing the long-time statistical evolution of biomolecules in a manner that allows direct comparison with experiments as well as the elucidation of mechanistic pathways for an inherently stochastic process. A vital part of meaningful comparison with experiment is the characterization of the statistical uncertainty in the predicted experimental measurement, which may take the form of an equilibrium measurement of some spectroscopic signal, the time-evolution of this signal following a perturbation, or the observation of some statistic (such as the correlation function) of the equilibrium dynamics of a single molecule. Without meaningful error bars (which arise from both approximation and statistical error), there is no way to determine whether the deviations between model and experiment are statistically meaningful. Previous work has demonstrated that a Bayesian method that enforces microscopic reversibility can be used to characterize the statistical component of correlated uncertainties in state-to-state transition probabilities (and functions thereof) for a model inferred from molecular simulation data. Here, we extend this approach to include the uncertainty in observables that are functions of molecular conformation (such as surrogate spectroscopic signals) characterizing each state, permitting the full statistical uncertainty in computed spectroscopic experiments to be assessed. We test the approach in a simple model system to demonstrate that the computed uncertainties provide a useful indicator of statistical variation, and then apply it to the computation of the fluorescence autocorrelation function measured for a dye-labeled peptide previously studied by both experiment and simulation.
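To convey the flavor of propagating statistical uncertainty from transition counts to a computed observable, here is a deliberately simplified sketch that samples each transition-matrix row from an independent Dirichlet posterior. Unlike the method described above, it does not enforce microscopic reversibility, so it illustrates the idea rather than the paper's algorithm:

    import numpy as np

    def observable_uncertainty(counts, obs, n_samples=1000, rng=None):
        """Posterior mean and spread of an equilibrium expectation <obs>
        implied by a transition-count matrix (reversibility NOT enforced)."""
        rng = rng or np.random.default_rng()
        counts = np.asarray(counts, dtype=float)
        vals = []
        for _ in range(n_samples):
            T = np.array([rng.dirichlet(row + 1.0) for row in counts])
            evals, evecs = np.linalg.eig(T.T)
            pi = np.real(evecs[:, np.argmax(np.real(evals))])
            pi /= pi.sum()                    # stationary distribution
            vals.append(pi @ obs)             # equilibrium expectation
        return float(np.mean(vals)), float(np.std(vals))

    counts = np.array([[90, 10, 0], [8, 80, 12], [0, 15, 85]])
    print(observable_uncertainty(counts, obs=np.array([0.0, 0.5, 1.0])))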
Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses
Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.
2014-01-01
The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
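A minimal sketch of the trial-resampling idea: bootstrap over trials, recompute the correlation network each time, and read edge-wise uncertainty off the bootstrap spread. The dimensions and data are invented, and the paper's procedure (which also covers aggregate topology measures and canonical-correlation networks) is not reproduced here:

    import numpy as np

    def network_with_uncertainty(trials, n_boot=500, rng=None):
        """Correlation network between sensors plus bootstrap standard
        errors for each edge, resampling whole trials with replacement."""
        rng = rng or np.random.default_rng()
        trials = np.asarray(trials)        # (n_trials, n_sensors, n_times)
        n_trials = trials.shape[0]

        def edge_matrix(tr):
            # concatenate trials per sensor, then correlate sensor rows
            data = tr.transpose(1, 0, 2).reshape(tr.shape[1], -1)
            return np.corrcoef(data)

        boots = np.stack([edge_matrix(trials[rng.integers(0, n_trials, n_trials)])
                          for _ in range(n_boot)])
        return edge_matrix(trials), boots.std(axis=0)

    rng = np.random.default_rng(5)
    fake = rng.normal(size=(40, 6, 200))   # 40 trials, 6 sensors, 200 samples
    net, se = network_with_uncertainty(fake, rng=rng)
    print(net.shape, se.mean())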
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newsom, R. K.; Sivaraman, C.; Shippert, T. R.
Accurate height-resolved measurements of higher-order statistical moments of vertical velocity fluctuations are crucial for improved understanding of turbulent mixing and diffusion, convective initiation, and cloud life cycles. The Atmospheric Radiation Measurement (ARM) Climate Research Facility operates coherent Doppler lidar systems at several sites around the globe. These instruments provide measurements of clear-air vertical velocity profiles in the lower troposphere with a nominal temporal resolution of 1 sec and height resolution of 30 m. The purpose of the Doppler lidar vertical velocity statistics (DLWSTATS) value-added product (VAP) is to produce height- and time-resolved estimates of vertical velocity variance, skewness, and kurtosis from these raw measurements. The VAP also produces estimates of cloud properties, including cloud-base height (CBH), cloud frequency, cloud-base vertical velocity, and cloud-base updraft fraction.
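A minimal sketch of the moment computation at a single range gate (windowed variance, skewness, and kurtosis of a vertical-velocity series); the window length and synthetic series are illustrative, and the actual VAP involves additional processing not shown here:

    import numpy as np
    from scipy import stats

    def velocity_moments(w, window):
        """Windowed variance, skewness, and kurtosis profiles from a
        1-Hz vertical-velocity time series at one range gate."""
        w = np.asarray(w, dtype=float)
        n = len(w) // window
        segments = w[:n * window].reshape(n, window)
        return (segments.var(axis=1, ddof=1),
                stats.skew(segments, axis=1),
                stats.kurtosis(segments, axis=1, fisher=False))

    rng = np.random.default_rng(6)
    var, skw, kurt = velocity_moments(rng.normal(0, 0.5, 36000), window=1800)
    print(var.mean(), skw.mean(), kurt.mean())   # ~0.25, ~0, ~3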
Turbulent pressure fluctuations measured during CHATS
Steven P. Oncley; William J. Massman; Edward G. Patton
2008-01-01
Fast-response pressure fluctuations were included in the Canopy Horizontal Array of Turbulence Study (CHATS) at several heights within and just above the canopy in a walnut orchard. Two independent systems were intercompared and then separated. We present an evaluation of turbulence statistics - including the pressure transport term in the turbulence kinetic energy...
Monographs - SEER Publications
In-depth publications on topics in cancer statistics, including collaborative staging and registry data, cancer survival from a policy and clinical perspective, a description of cancer in American Indians/Alaska Natives, and measures of health disparities.
Chong, Ka Lung; Samsudin, Amir; Keng, Tee Chau; Kamalden, Tengku Ain; Ramli, Norlina
2017-02-01
To evaluate the effect of nocturnal intermittent peritoneal dialysis (NIPD) on intraocular pressure (IOP) and anterior segment optical coherence tomography (ASOCT) parameters. Systemic changes associated with NIPD were also analyzed. Observational study. Nonglaucomatous patients on NIPD underwent systemic and ocular assessment including mean arterial pressure (MAP), body weight, serum osmolarity, visual acuity, IOP measurement, and ASOCT within 2 hours both before and after NIPD. The Zhongshan Angle Assessment Program (ZAAP) was used to measure ASOCT parameters including anterior chamber depth, anterior chamber width, anterior chamber area, anterior chamber volume, lens vault, angle opening distance, trabecular-iris space area, and angle recess area. T tests and Pearson correlation tests were performed with P<0.05 considered statistically significant. A total of 46 eyes from 46 patients were included in the analysis. There were statistically significant reductions in IOP (-1.8±0.6 mm Hg, P=0.003), MAP (-11.9±3.1 mm Hg, P<0.001), body weight (-0.7±2.8 kg, P<0.001), and serum osmolarity (-3.4±2.0 mOsm/L, P=0.002) after NIPD. All the ASOCT parameters did not have any statistically significant changes after NIPD. There were no statistically significant correlations between the changes in IOP, MAP, body weight, and serum osmolarity (all P>0.05). NIPD results in reductions in IOP, MAP, body weight, and serum osmolarity in nonglaucomatous patients.
Network Management of the SPLICE Computer Network.
1982-12-01
Approved for public release; distribution unlimited. Network Management of the SPLICE Computer Network, by Craig E. Opal, Captain, United States Marine ... structure of the network must lend itself to change and reconfiguration; one author [Ref. 2: p.21] recommended that a global bus topology be adopted for ... statistics, trace statistics, snapshot statistics, artificial traffic generators, emulation, a network measurement center which includes control, collection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biassoni, Pietro
2009-01-01
In this thesis work we have measured the following upper limits at 90% confidence level for B meson decays (in units of 10^-6), using a statistics of 465.0 × 10^6 B B̄ pairs:
B(B0 → ηK0) < 1.6
B(B0 → ηη) < 1.4
B(B0 → η'η') < 2.1
B(B0 → ηΦ) < 0.52
B(B0 → ηω) < 1.6
B(B0 → η'Φ) < 1.2
B(B0 → η'ω) < 1.7
We have no observation of any decay mode; the statistical significance of our measurements is in the range 1.3-3.5 standard deviations. We have a 3.5σ evidence for B → ηω and a 3.1σ evidence for B → η'ω. The absence of an observation of B0 → ηK0 opens an issue related to the large difference compared to the charged mode B+ → ηK+, whose branching fraction is measured to be 3.7 ± 0.4 ± 0.1 [118]. Our results represent substantial improvements over the previous ones [109, 110, 111] and are consistent with theoretical predictions. All these results were presented at the Flavor Physics and CP Violation (FPCP) 2008 Conference, which took place in Taipei, Taiwan, and will soon be included in a paper to be submitted to Physical Review D. For the time-dependent analysis, we have reconstructed 1820 ± 48 flavor-tagged B0 → η'K0 events, using the final BABAR statistics of 467.4 × 10^6 B B̄ pairs. We use these events to measure the time-dependent asymmetry parameters S and C. We find S = 0.59 ± 0.08 ± 0.02 and C = -0.06 ± 0.06 ± 0.02. A non-zero value of C would represent a directly CP-non-conserving component in B0 → η'K0, while S would equal sin2β as measured in B0 → J/Ψ K0_S [108], a mixing-decay interference effect, provided the decay is dominated by amplitudes with a single weak phase. The new measured value of S can be considered in agreement with the expectations of the Standard Model, within the experimental and theoretical uncertainties. The inconsistency of our result for S with CP conservation (S = 0) has a significance of 7.1 standard deviations (statistical and systematic uncertainties included). Our result for the direct-CP-violation parameter C is 0.9 standard deviations from zero (statistical and systematic uncertainties included). Our results are in agreement with the previous ones [18]. Although the statistics is only 20% larger than that used in the previous measurement, we improved the error on S by 20% and the error on C by 14%. This is the smallest error ever achieved, by either BABAR or Belle, in a measurement of time-dependent CP-violation parameters in a b → s transition.
The chi-square test of independence.
McHugh, Mary L
2013-01-01
The Chi-square statistic is a non-parametric (distribution-free) tool designed to analyze group differences when the dependent variable is measured at a nominal level. Like all non-parametric statistics, the Chi-square is robust with respect to the distribution of the data. Specifically, it does not require equality of variances among the study groups or homoscedasticity in the data. It permits evaluation of both dichotomous independent variables and of multiple-group studies. Unlike many other non-parametric and some parametric statistics, the calculations needed to compute the Chi-square provide considerable information about how each of the groups performed in the study. This richness of detail allows the researcher to understand the results and thus to derive more detailed information from this statistic than from many others. The Chi-square is a significance statistic, and should be followed with a strength statistic. Cramér's V is the most common strength test used when a significant Chi-square result has been obtained. Advantages of the Chi-square include its robustness with respect to the distribution of the data, its ease of computation, the detailed information that can be derived from the test, its use in studies for which parametric assumptions cannot be met, and its flexibility in handling data from both two-group and multiple-group studies. Limitations include its sample size requirements, difficulty of interpretation when there are large numbers of categories (20 or more) in the independent or dependent variables, and the tendency of Cramér's V to produce relatively low correlation measures, even for highly significant results.
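A minimal worked example of the test plus the recommended follow-up strength statistic (Cramér's V), using an invented 2×3 contingency table:

    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows = two groups, columns = three outcome categories (invented counts)
    table = np.array([[20, 30, 10],
                      [10, 25, 25]])
    chi2, p, dof, expected = chi2_contingency(table)

    # Follow the significance test with a strength statistic
    n = table.sum()
    cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
    print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}, V={cramers_v:.2f}")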
Mental health benefits of interactions with nature in children and teenagers: a systematic review.
Tillmann, Suzanne; Tobin, Danielle; Avison, William; Gilliland, Jason
2018-06-27
It is commonly believed that nature has positive impacts on children's health, including physical, mental and social dimensions. This review focuses on how accessibility to, exposure to and engagement with nature affects the mental health of children and teenagers. Ten academic databases were used to systematically search and identify primary research papers in English or French from 1990 to 1 March 2017. Papers were included for review based on their incorporation of nature, children and teenagers (0-18 years), quantitative results and focus on mental health. Of the 35 papers included in the review, the majority focused on emotional well-being and attention deficit disorder/hyperactivity disorder. Other outcome measures included overall mental health, self-esteem, stress, resilience, depression and health-related quality of life. About half of all reported findings revealed statistically significant positive relationships between nature and mental health outcomes and almost half reported no statistical significance. Findings support the contention that nature positively influences mental health; however, in most cases, additional research with more rigorous study designs and objective measures of both nature and mental health outcomes are needed to confirm statistically significant relationships. Existing evidence is limited by the cross-sectional nature of most papers. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun
2017-11-01
This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports both modes, step-by-step analysis and an auto-computing process, respectively for preliminary evaluation and real-time computation. The proposed model was evaluated by recomputing prior research on the epidemiological measurement of diseases caused by either heavy metal exposures in the environment or clinical complications in hospital. The validity of the simulations was verified with commercial statistics software. The model was installed on a stand-alone computer and on a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.
The Conceptualization and Measurement of Equity in School Finance in Virginia.
ERIC Educational Resources Information Center
Verstegen, Deborah A.; Salmon, Richard G.
1989-01-01
Employed various statistical techniques to measure fiscal equity in Virginia. The new state aid system for financing education was unable to mitigate large and increasing disparities in education revenues between more and less affluent localities and a strong and growing linkage between revenue and wealth. Includes 34 footnotes. (MLH)
Mapping urban forest structure and function using hyperspectral imagery and lidar data
Michael Alonzo; Joseph P. McFadden; David J. Nowak; Dar A. Roberts
2016-01-01
Cities measure the structure and function of their urban forest resource to optimize forest management and the provision of ecosystem services. Measurements made using plot sampling methods yield useful results including citywide or land-use level estimates of species counts, leaf area, biomass, and air pollution reduction. However, these quantities are statistical...
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
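Among the statistical techniques mentioned, importance sampling is easy to make concrete: to estimate a rare failure probability, sample from a shifted density and reweight by the likelihood ratio. A toy sketch, with a Gaussian tail probability standing in for a rare system failure:

    import numpy as np

    # Estimate P(X > 6) for X ~ N(0, 1). Naive Monte Carlo almost never
    # samples the event; instead sample from N(6, 1) and reweight by the
    # likelihood ratio phi(y) / phi(y - 6) = exp(-6*y + 18).
    rng = np.random.default_rng(7)
    y = rng.normal(6.0, 1.0, 100_000)          # shifted proposal density
    weights = np.exp(-6.0 * y + 18.0)
    estimate = np.mean((y > 6.0) * weights)
    print(estimate)                            # close to the exact ~9.87e-10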
Eustachian tube diameter: Is it associated with chronic otitis media development?
Paltura, Ceki; Can, Tuba Selçuk; Yilmaz, Behice Kaniye; Dinç, Mehmet Emre; Develioğlu, Ömer Necati; Külekçi, Mehmet
To evaluate the effect of Eustachian tube (ET) diameter on chronic otitis media (COM) pathogenesis. Retrospective. Patients with unilateral COM were included in the study. The connection between the fibrocartilaginous and osseous segments of the ET on axial computed tomography (CT) images was defined, and the diameter of this segment was measured. The measurements were carried out bilaterally and statistically compared. A total of 154 patients (76 [49%] male, 78 [51%] female) were diagnosed with unilateral COM and included in the study. The mean diameter of the ET was 1.947 mm (standard deviation ±0.5247) for healthy ears and 1.788 mm (standard deviation ±0.5306) for diseased ears. The statistical analysis showed a significantly narrower ET diameter on the diseased ear side (p<0.01). Dysfunction or anatomical anomalies of the ET are correlated with COM. Measuring the bony diameter of the ET during routine temporal CT examination is recommended. Copyright © 2017 Elsevier Inc. All rights reserved.
Barber, Julie A; Thompson, Simon G
1998-01-01
Objective: To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study.
Design: Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline.
Main outcome measures: The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified.
Results: Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs.
Conclusions: The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice.
Key messages:
- Health economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trials.
- A review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpreted.
- Few publications presented adequate descriptive information for costs or performed appropriate statistical analyses.
- In at least two thirds of the papers, the main conclusions regarding costs were not justified.
- The analysis and reporting of health economic assessments within randomised controlled trials urgently need improving.
PMID:9794854
2016-12-22
included assessments and instruments, descriptive statistics were calculated. Independent-samples t-tests were conducted using participant survey scores ... integrity tests within a multimodal system. Both conditions included the Military Acute Concussion Evaluation (MACE) and an Ease-of-Use survey. Mean scores ... for the Ease-of-Use survey and mean test administration times for each measure were compared. Administrative feedback was also considered for
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
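As one concrete instance of the large-covariance estimators such reviews cover, shrinkage keeps the estimate well conditioned even when variables outnumber samples; a minimal sketch using scikit-learn's Ledoit-Wolf implementation on invented data:

    import numpy as np
    from sklearn.covariance import LedoitWolf

    # n = 50 samples of p = 200 variables: the sample covariance would be
    # singular, but the shrunk estimate remains invertible.
    rng = np.random.default_rng(8)
    X = rng.normal(size=(50, 200))
    lw = LedoitWolf().fit(X)
    print(lw.shrinkage_, np.linalg.cond(lw.covariance_))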
Summary Statistics for Homemade "Play Dough" -- Data Acquired at LLNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kallman, J S; Morales, K E; Whipple, R E
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough™-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU_D at 100 kVp to a low of about 1200 LMHU_D at 300 kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z_eff, to be near 10. LLNL prepared about 50 mL of the homemade "Play Dough" in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation, and entropy for (a) the four image segments and for (b) their digital gradient images. (A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first-order statistics'; those of the gradient image, 'second-order statistics.'
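A sketch of this style of first- and second-order statistics on a 1-D stand-in for the voxel data (the report works on 2-D image rows; the KDE grid and the entropy discretization below are illustrative choices, not the report's exact procedure):

    import numpy as np
    from scipy.stats import gaussian_kde

    def first_and_second_order_stats(lac):
        """Mean, standard deviation, and a KDE-based entropy for a voxel
        LAC sample and for its digital gradient (shift-by-one difference)."""
        lac = np.asarray(lac, dtype=float)
        gradient = np.abs(lac[1:] - lac[:-1])      # 1-D digital gradient
        results = []
        for v in (lac, gradient):
            kde = gaussian_kde(v)                  # Gaussian KDE, as in the report
            grid = np.linspace(v.min(), v.max(), 512)
            p = kde(grid)
            p /= p.sum()
            entropy = -np.sum(p * np.log2(p + 1e-30))  # bits, on this grid
            results.append((v.mean(), v.std(ddof=1), entropy))
        return results

    rng = np.random.default_rng(9)
    print(first_and_second_order_stats(rng.normal(2700, 300, 10000)))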
Cascioli, Vincenzo; Liu, Zhuofu; Heusch, Andrew; McCarthy, Peter W
2016-05-01
This study presents a method for objectively measuring in-chair movement (ICM) that shows correlation with subjective ratings of comfort and discomfort. Employing a cross-over controlled, single blind design, healthy young subjects (n = 21) sat for 18 min on each of the following surfaces: contoured foam, straight foam and wood. Force sensitive resistors attached to the sitting interface measured the relative movements of the subjects during sitting. The purpose of this study was to determine whether ICM could statistically distinguish between each seat material, including two with subtle design differences. In addition, this study investigated methodological considerations, in particular appropriate threshold selection and sitting duration, when analysing objective movement data. ICM appears to be able to statistically distinguish between similar foam surfaces, as long as appropriate ICM thresholds and sufficient sitting durations are present. A relationship between greater ICM and increased discomfort, and lesser ICM and increased comfort was also found. Copyright © 2016. Published by Elsevier Ltd.
Filter Tuning Using the Chi-Squared Statistic
NASA Technical Reports Server (NTRS)
Lilly-Salkowski, Tyler B.
2017-01-01
This paper examines the use of the Chi-square statistic as a means of evaluating filter performance. The goal of the process is to characterize the filter performance in the metric of covariance realism. The Chi-squared statistic is the value calculated to determine the realism of a covariance based on the prediction accuracy and the covariance values at a given point in time. Once calculated, it is the distribution of this statistic that provides insight on the accuracy of the covariance. The process of tuning an Extended Kalman Filter (EKF) for Aqua and Aura support is described, including examination of the measurement errors of available observation types, and methods of dealing with potentially volatile atmospheric drag modeling. Predictive accuracy and the distribution of the Chi-squared statistic, calculated from EKF solutions, are assessed.
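A minimal sketch of the statistic described: the normalized innovation squared, which for a realistic covariance follows a chi-square distribution with degrees of freedom equal to the residual dimension (so its average over epochs should approach that dimension). The residual and covariance values are invented:

    import numpy as np

    def innovation_chi2(residual, S):
        """Chi-square statistic e^T S^-1 e for one prediction residual e
        and its predicted covariance S."""
        r = np.asarray(residual, dtype=float)
        return float(r @ np.linalg.solve(S, r))

    # Position residual (km) and predicted 3x3 covariance at one epoch
    S = np.diag([0.04, 0.04, 0.09])
    print(innovation_chi2(np.array([0.15, -0.25, 0.10]), S))
    # Averaged over many epochs, values near 3 suggest a realistic covariance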
A Backscatter-Lidar Forward-Operator
NASA Astrophysics Data System (ADS)
Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland
2015-04-01
We have developed a forward-operator which is capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations based on the same measurement parameter: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component to assimilate backscatter-lidar measurements. As many weather services already maintain networks of backscatter lidars, such data are already acquired operationally. To estimate and quantify errors due to missing or uncertain aerosol information, we started sensitivity studies of several scattering parameters, such as the aerosol size and both the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e., applying the backscatter-lidar forward-operator to model output.
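The essence of such an operator is mapping model profiles of backscatter and extinction to the attenuated backscatter the instrument measures. A single-scattering sketch under assumed profiles and an assumed lidar ratio (the operator described above includes physics not modeled here):

    import numpy as np

    def attenuated_backscatter(beta, alpha, dz):
        """Virtual lidar profile: beta(z) * exp(-2 * integral of alpha),
        i.e., backscatter attenuated by two-way extinction."""
        tau = np.cumsum(alpha) * dz              # one-way optical depth
        return np.asarray(beta) * np.exp(-2.0 * tau)

    z = np.arange(0.0, 3000.0, 30.0)             # 30 m range gates
    beta = 1e-6 * (1.0 + 4.0 * np.exp(-((z - 1500.0) / 200.0) ** 2))
    alpha = 50.0 * beta                          # assumed lidar ratio of 50 sr
    print(attenuated_backscatter(beta, alpha, 30.0)[:3])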
ERIC Educational Resources Information Center
Martin, Stephanie M.
2010-01-01
The present paper examines the relationship between public school teacher salaries and the racial concentration and segregation of students in the district. A particularly rich set of control variables is included to better measure the effect of racial characteristics. Additional analyses included Metropolitan Statistical Area fixed effects and…
A survey and evaluations of histogram-based statistics in alignment-free sequence comparison.
Luczak, Brian B; James, Benjamin T; Girgis, Hani Z
2017-12-06
Since the dawn of the bioinformatics field, sequence alignment scores have been the main method for comparing sequences. However, alignment algorithms are quadratic, requiring long execution time. As alternatives, scientists have developed tens of alignment-free statistics for measuring the similarity between two sequences. We surveyed tens of alignment-free k-mer statistics. Additionally, we evaluated 33 statistics and multiplicative combinations between the statistics and/or their squares. These statistics are calculated on two k-mer histograms representing two sequences. Our evaluations using global alignment scores revealed that the majority of the statistics are sensitive and capable of finding similar sequences to a query sequence. Therefore, any of these statistics can filter out dissimilar sequences quickly. Further, we observed that multiplicative combinations of the statistics are highly correlated with the identity score. Furthermore, combinations involving sequence length difference or Earth Mover's distance, which takes the length difference into account, are always among the highest correlated paired statistics with identity scores. Similarly, paired statistics including length difference or Earth Mover's distance are among the best performers in finding the K-closest sequences. Interestingly, similar performance can be obtained using histograms of shorter words, resulting in reducing the memory requirement and increasing the speed remarkably. Moreover, we found that simple single statistics are sufficient for processing next-generation sequencing reads and for applications relying on local alignment. Finally, we measured the time requirement of each statistic. The survey and the evaluations will help scientists with identifying efficient alternatives to the costly alignment algorithm, saving thousands of computational hours. The source code of the benchmarking tool is available as Supplementary Materials. © The Author 2017. Published by Oxford University Press.
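A minimal sketch of the histogram-based setup: build k-mer count histograms for two sequences and evaluate one simple statistic on them (Manhattan distance here; the survey evaluates 33 statistics and their combinations):

    from collections import Counter

    def kmer_histogram(seq, k):
        """Counts of all overlapping k-mers in a sequence."""
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def manhattan(h1, h2):
        """One simple histogram statistic; missing k-mers count as zero."""
        return sum(abs(h1[key] - h2[key]) for key in set(h1) | set(h2))

    a = kmer_histogram("ACGTACGTACGTT", 3)
    b = kmer_histogram("ACGTTCGTACGAA", 3)
    print(manhattan(a, b))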
Goddard trajectory determination subsystem: Mathematical specifications
NASA Technical Reports Server (NTRS)
Wagner, W. E. (Editor); Velez, C. E. (Editor)
1972-01-01
The mathematical specifications of the Goddard trajectory determination subsystem of the flight dynamics system are presented. These specifications include the mathematical description of the coordinate systems, dynamic and measurement model, numerical integration techniques, and statistical estimation concepts.
48 CFR 1401.7001-4 - Acquisition performance measurement systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...
48 CFR 1401.7001-4 - Acquisition performance measurement systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...
48 CFR 1401.7001-4 - Acquisition performance measurement systems.
Code of Federal Regulations, 2011 CFR
2011-10-01
...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...
48 CFR 1401.7001-4 - Acquisition performance measurement systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...
An Overview of Interrater Agreement on Likert Scales for Researchers and Practitioners
O'Neill, Thomas A.
2017-01-01
Applications of interrater agreement (IRA) statistics for Likert scales are plentiful in research and practice. IRA may be implicated in job analysis, performance appraisal, panel interviews, and any other approach to gathering systematic observations. Any rating system involving subject-matter experts can also benefit from IRA as a measure of consensus. Further, IRA is fundamental to aggregation in multilevel research, which is becoming increasingly common in order to address nesting. Although several technical descriptions of a few specific IRA statistics exist, this paper aims to provide a tractable orientation to common IRA indices to support application. The introductory overview is written with the intent of facilitating contrasts among IRA statistics by critically reviewing equations, interpretations, strengths, and weaknesses. Statistics considered include r_wg, r*_wg, r′_wg, r_wg(p), average deviation (AD), a_wg, the standard deviation (S_wg), and the coefficient of variation (CV_wg). Equations support quick calculation and contrasting of different agreement indices. The article also includes a "quick reference" table and three figures in order to help readers identify how IRA statistics differ and how interpretations of IRA will depend strongly on the statistic employed. A brief consideration of recommended practices involving statistical and practical cutoff standards is presented, and conclusions are offered in light of the current literature. PMID:28553257
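For orientation, the most common index, r_wg for a single item, compares the observed rating variance with the variance expected under a uniform ("no agreement") null distribution; a minimal sketch with invented ratings:

    import numpy as np

    def rwg(ratings, n_options):
        """Single-item interrater agreement: 1 - (observed variance /
        variance of a uniform null over an n_options response scale)."""
        s2 = np.var(ratings, ddof=1)                # observed variance
        sigma_eu2 = (n_options ** 2 - 1) / 12.0     # uniform null variance
        return 1.0 - s2 / sigma_eu2

    # Seven raters scoring one 5-point Likert item
    print(round(rwg([4, 4, 5, 4, 3, 4, 4], n_options=5), 2))   # 0.83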
Alasbali, Tariq; Smith, Michael; Geffen, Noa; Trope, Graham E; Flanagan, John G; Jin, Yaping; Buys, Yvonne M
2009-01-01
To investigate the relationship between industry- vs nonindustry-funded publications comparing the efficacy of topical prostaglandin analogs by evaluating the correspondence between the statistical significance of the publication's main outcome measure and its abstract conclusions. Retrospective, observational cohort study. English publications comparing the ocular hypotensive efficacy between any or all of latanoprost, travoprost, and bimatoprost were searched from the MEDLINE database. Each article was reviewed by three independent observers and was evaluated for source of funding, study quality, statistically significant main outcome measure, correspondence between results of main outcome measure and abstract conclusion, number of intraocular pressure outcomes compared, and journal impact factor. Funding was determined by published disclosure or, in cases of no documented disclosure, the corresponding author was contacted directly to confirm industry funding. Discrepancies were resolved by consensus. The main outcome measure was correspondence between abstract conclusion and reported statistical significance of the publications' main outcome measure. Thirty-nine publications were included, of which 29 were industry funded and 10 were nonindustry funded. The published abstract conclusion was not consistent with the results of the main outcome measure in 18 (62%) of 29 of the industry-funded studies compared with zero (0%) of 10 of the nonindustry-funded studies (P = .0006). Twenty-six (90%) of the industry-funded studies had proindustry abstract conclusions. Twenty-four percent of the industry-funded publications had a statistically significant main outcome measure; however, 90% of the industry-funded studies had proindustry abstract conclusions. Both readers and reviewers should scrutinize publications carefully to ensure that data support the authors' conclusions.
Vetter, Thomas R
2017-11-01
Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
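A minimal sketch computing the quantities discussed (central tendency, dispersion, and a 95% confidence interval for a mean) on a hypothetical sample:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(10)
    scores = rng.normal(50, 12, 40)       # hypothetical outcome scores, n = 40

    mean, sd = scores.mean(), scores.std(ddof=1)
    median = np.median(scores)
    q1, q3 = np.percentile(scores, [25, 75])
    sem = sd / np.sqrt(len(scores))
    ci = stats.t.interval(0.95, df=len(scores) - 1, loc=mean, scale=sem)

    print(f"mean={mean:.1f} (SD {sd:.1f}), median={median:.1f} (IQR {q1:.1f}-{q3:.1f})")
    print(f"95% CI for the mean: {ci[0]:.1f} to {ci[1]:.1f}")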
Self-Esteem: A Comparison between Hong Kong Children and Newly Arrived Chinese Children
ERIC Educational Resources Information Center
Chan, Yiu Man; Chan, Christine Mei-Sheung
2004-01-01
The Self-esteem Inventory developed by Coopersmith (1967) was used to measure the self-esteem of 387 Chinese children. The sample included newly arrived mainland Chinese children and Hong Kong children. The results showed significant statistical differences when measuring the self-esteem level associated with the length of their stay in Hong Kong…
Atmospheric moisture's influence on fire behavior: surface moisture and plume dynamics.
Brian E. Potter; Joseph J. Charney; Lesley A. Fusina
2006-01-01
Nine measures of atmospheric surface moisture are tested for statistical relationships with fire size and number of fires using data from the Great Lakes region of the United States. The measures include relative humidity, water vapor mixing ratio, mixing ratio deficit, vapor pressure, vapor pressure deficit, dew point temperature, dew point depression, wet bulb...
Anthropometric Survey of U.S. Army Personnel: Summary Statistics, Interim Report for 1988
1989-03-01
measurement is taken at the maximum point of quiet respiration. Note: Breast tissue and latissimus dorsi muscle tissue are NOT included in this measurement...sides of the body; as opposed to medial (see Figure A-1). latissimus dorsi — the large muscle covering the lower half of the back above the waist and
Inference from the small scales of cosmic shear with current and future Dark Energy Survey data
MacCrann, N.; Aleksić, J.; Amara, A.; ...
2016-11-05
Cosmic shear is sensitive to fluctuations in the cosmological matter density field, including on small physical scales, where matter clustering is affected by baryonic physics in galaxies and galaxy clusters, such as star formation, supernovae feedback, and AGN feedback. While these processes muddy any cosmological information contained in small-scale cosmic shear measurements, they also mean that cosmic shear has the potential to constrain baryonic physics and galaxy formation. We perform an analysis of the Dark Energy Survey (DES) Science Verification (SV) cosmic shear measurements, now extended to smaller scales, and use the Mead et al. 2015 halo model to account for baryonic feedback. While the SV data have limited statistical power, we demonstrate using a simulated likelihood analysis that the final DES data will have the statistical power to differentiate among baryonic feedback scenarios. We also explore some of the difficulties in interpreting the small scales in cosmic shear measurements, presenting estimates of the size of several other systematic effects that make inference from small scales difficult, including uncertainty in the modelling of intrinsic alignment on nonlinear scales, 'lensing bias', and shape measurement selection effects. For the latter two, we make use of novel image simulations. While future cosmic shear datasets have the statistical power to constrain baryonic feedback scenarios, there are several systematic effects that require improved treatments in order to make robust conclusions about baryonic feedback.
A random-sum Wilcoxon statistic and its application to analysis of ROC and LROC data.
Tang, Liansheng Larry; Balakrishnan, N
2011-01-01
The Wilcoxon-Mann-Whitney statistic is commonly used for a distribution-free comparison of two groups. One requirement for its use is that the sample sizes of the two groups are fixed. This is violated in some of the applications such as medical imaging studies and diagnostic marker studies; in the former, the violation occurs since the number of correctly localized abnormal images is random, while in the latter the violation is due to some subjects not having observable measurements. For this reason, we propose here a random-sum Wilcoxon statistic for comparing two groups in the presence of ties, and derive its variance as well as its asymptotic distribution for large sample sizes. The proposed statistic includes the regular Wilcoxon rank-sum statistic. Finally, we apply the proposed statistic for summarizing location response operating characteristic data from a liver computed tomography study, and also for summarizing diagnostic accuracy of biomarker data.
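A hedged sketch of the regular Wilcoxon-Mann-Whitney comparison that the proposed random-sum statistic generalizes; the data are hypothetical, and scipy's midrank tie handling stands in for the paper's treatment of ties:

```python
from scipy.stats import mannwhitneyu

# Hypothetical diagnostic marker values for diseased and healthy groups
diseased = [3.1, 4.2, 4.2, 5.0, 6.3, 7.1]
healthy  = [1.2, 2.4, 2.4, 3.1, 3.9]

# Mann-Whitney U handles ties via midranks; U/(m*n) estimates
# P(X > Y) + 0.5*P(X = Y), the quantity the Wilcoxon statistic targets
u, p = mannwhitneyu(diseased, healthy, alternative='two-sided')
auc = u / (len(diseased) * len(healthy))
print(u, p, auc)
```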
Wood, Molly S.; Fosness, Ryan L.
2013-01-01
The U.S. Geological Survey, in cooperation with the Bureau of Land Management (BLM), collected streamflow data in 2012 and estimated streamflow statistics for stream segments designated "Wild," "Scenic," or "Recreational" under the National Wild and Scenic Rivers System in the Owyhee Canyonlands Wilderness in southwestern Idaho. The streamflow statistics were used by BLM to develop and file a draft federal reserved water right claim in autumn 2012 to protect federally designated "outstanding remarkable values" in the stream segments. BLM determined that the daily mean streamflows that are equaled or exceeded 20 and 80 percent of the time during bimonthly periods (two periods per month), together with the bankfull streamflow, are important streamflow thresholds for maintaining outstanding remarkable values. Prior to this study, streamflow statistics estimated using available datasets and tools for the Owyhee Canyonlands Wilderness were too inaccurate for use in the water rights claim. Streamflow measurements were made at varying intervals during February–September 2012 at 14 monitoring sites; 2 of the monitoring sites were equipped with telemetered streamgaging equipment. Synthetic streamflow records were created for 11 of the 14 monitoring sites using a partial-record method or a drainage-area-ratio method. Streamflow records were obtained directly from an operating, long-term streamgage at one monitoring site, and from discontinued streamgages at two monitoring sites. For 10 sites analyzed using the partial-record method, discrete measurements were related to daily mean streamflow at a nearby, telemetered "index" streamgage. Resulting regression equations were used to estimate daily mean and annual peak streamflow at the monitoring sites during the full period of record for the index sites. A synthetic streamflow record for Sheep Creek was developed using a drainage-area-ratio method, because measured streamflows did not relate well enough to any index site to allow use of the partial-record method. The synthetic and actual daily mean streamflow records were used to estimate daily mean streamflow that was exceeded 80, 50, and 20 percent of the time (80-, 50-, and 20-percent exceedances) for bimonthly and annual periods. Bankfull streamflow statistics were calculated by fitting the synthetic and actual annual peak streamflow records to a log-Pearson Type III distribution using Bulletin 17B guidelines in the U.S. Geological Survey PeakFQ program. The coefficients of determination (R2) for the regressions between the monitoring and index sites ranged from 0.74 for Wickahoney Creek to 0.98 for the West Fork Bruneau River and Deep Creek. Confidence in computed streamflow statistics is highest for the East Fork Owyhee River and the West Fork Bruneau River on the basis of regression statistics, visual fit of the related data, and the range and number of streamflow measurements. Streamflow statistics with the greatest uncertainty were those for Big Jacks, Little Jacks, Cottonwood, Wickahoney, and Sheep Creeks. The uncertainty in computed streamflow statistics was due to a number of factors, including the distance of index sites from monitoring sites, the relatively low streamflow conditions during the study, and the limited number and range of streamflow measurements. However, the computed streamflow statistics are considered the best possible estimates given available datasets in the remote study area.
Streamflow measurements over a wider range of hydrologic and climatic conditions would improve the relations between streamflow characteristics at monitoring and index sites. Additionally, field surveys are needed to verify whether the streamflows selected for the water rights claims are sufficient for maintaining outstanding remarkable values in the Wild and Scenic rivers included in the study.
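A minimal, illustrative sketch of the partial-record idea (hypothetical numbers; the report's exact regression form is not reproduced here): relate measured flows at a monitoring site to same-day flows at an index gage in log space, synthesize a record, and read off an exceedance statistic:

```python
import numpy as np

# Hypothetical paired observations: discrete measurements at the
# monitoring site and same-day mean flows at a telemetered index gage
q_site  = np.array([1.2, 2.5, 4.8, 9.1, 15.0])   # ft^3/s
q_index = np.array([10., 22., 41., 80., 130.])   # ft^3/s

# Fit log-log relation: log(q_site) = b0 + b1 * log(q_index)
b1, b0 = np.polyfit(np.log(q_index), np.log(q_site), 1)

# Apply to the index gage's daily record to synthesize a record at the
# monitoring site, then read exceedance statistics off the synthetic series
index_record = np.array([8., 15., 30., 55., 95., 150., 60., 25.])
synthetic = np.exp(b0 + b1 * np.log(index_record))
p80 = np.percentile(synthetic, 20)   # flow exceeded 80 percent of the time
print(b0, b1, p80)
```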
Accelerated testing of space batteries
NASA Technical Reports Server (NTRS)
Mccallum, J.; Thomas, R. E.; Waite, J. H.
1973-01-01
An accelerated life test program for space batteries is presented that fully satisfies empirical, statistical, and physical criteria for validity. The program includes thermal and other nonmechanical stress analyses as well as mechanical stress, strain, and rate of strain measurements.
20 CFR 661.205 - What is the role of the State Board?
Code of Federal Regulations, 2010 CFR
2010-04-01
... performance measures, including State adjusted levels of performance, to assess the effectiveness of the... employment statistics system described in section 15(e) of the Wagner-Peyser Act; and (i) Development of an...
Cheng, Chui Ling
2016-08-03
Statistical models were developed to estimate natural streamflow under low-flow conditions for streams with existing streamflow data at measurement sites on the Islands of Kauaʻi, Oʻahu, Molokaʻi, Maui, and Hawaiʻi. Streamflow statistics used to describe the low-flow characteristics are flow-duration discharges that are equaled or exceeded between 50 and 95 percent of the time during the 30-year base period 1984–2013. Record-augmentation techniques were applied to develop statistical models relating concurrent streamflow data at the measurement sites and long-term data from nearby continuous-record streamflow-gaging stations that were in operation during the base period and were selected as index stations. Existing data and subsequent low-flow analyses of the available data help to identify streams in under-represented geographic areas and hydrogeologic settings where additional data collection is suggested. Low-flow duration discharges were estimated for 107 measurement sites (including long-term and short-term continuous-record streamflow-gaging stations, and partial-record stations) and 27 index stations. The adequacy of statistical models was evaluated with correlation coefficients and modified Nash-Sutcliffe coefficients of efficiency, and a majority of the low-flow duration-discharge estimates are satisfactory based on these regression statistics. Molokaʻi and Hawaiʻi have the fewest measurement sites (that are not located on ephemeral stream reaches) at which flow-duration discharges were estimated, which can be partially explained by the limited number of index stations available on these islands that could be used for record augmentation. At measurement sites on some tributary streams, low-flow duration discharges could not be estimated because no adequate correlations could be developed with the index stations. These measurement sites are located on streams where duration-discharge estimates are available at long-term stations at other locations on the main stream channel to provide at least some definition of low-flow characteristics on that stream. In terms of general natural streamflow data availability, data are scarce in the leeward areas of all five islands, as many leeward streams are dry or have minimal flow. Other under-represented areas include central Oʻahu, central Maui, and southeastern Maui.
Statistical analysis of fNIRS data: a comprehensive review.
Tak, Sungho; Ye, Jong Chul
2014-01-15
Functional near-infrared spectroscopy (fNIRS) is a non-invasive method to measure brain activities using the changes of optical absorption in the brain through the intact skull. fNIRS has many advantages over other neuroimaging modalities such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), or magnetoencephalography (MEG), since it can directly measure blood oxygenation level changes related to neural activation with high temporal resolution. However, fNIRS signals are highly corrupted by measurement noises and physiology-based systemic interference. Careful statistical analyses are therefore required to extract neuronal activity-related signals from fNIRS data. In this paper, we provide an extensive review of historical developments of statistical analyses of fNIRS signal, which include motion artifact correction, short source-detector separation correction, principal component analysis (PCA)/independent component analysis (ICA), false discovery rate (FDR), serially-correlated errors, as well as inference techniques such as the standard t-test, F-test, analysis of variance (ANOVA), and statistical parameter mapping (SPM) framework. In addition, to provide a unified view of various existing inference techniques, we explain a linear mixed effect model with restricted maximum likelihood (ReML) variance estimation, and show that most of the existing inference methods for fNIRS analysis can be derived as special cases. Some of the open issues in statistical analysis are also described. Copyright © 2013 Elsevier Inc. All rights reserved.
van Mierlo, Trevor; Hyatt, Douglas; Ching, Andrew T
2016-01-01
Digital Health Social Networks (DHSNs) are common; however, there are few metrics that can be used to identify participation inequality. The objective of this study was to investigate whether the Gini coefficient, an economic measure of statistical dispersion traditionally used to measure income inequality, could be employed to measure DHSN inequality. Quarterly Gini coefficients were derived from four long-standing DHSNs. The combined data set included 625,736 posts that were generated from 15,181 actors over 18,671 days. The range of actors (8-2323), posts (29-28,684), and Gini coefficients (0.15-0.37) varied. Pearson correlations indicated statistically significant associations between number of actors and number of posts (0.527-0.835, p < .001), and Gini coefficients and number of posts (0.342-0.725, p < .001). However, the association between Gini coefficient and number of actors was only statistically significant for the addiction networks (0.619 and 0.276, p < .036). Linear regression models had positive but mixed R2 results (0.333-0.527). In all four regression models, the association between Gini coefficient and posts was statistically significant (t = 3.346-7.381, p < .002). However, unlike the Pearson correlations, the association between Gini coefficient and number of actors was only statistically significant in the two mental health networks (t = -4.305 and -5.934, p < .001). The Gini coefficient is helpful in measuring shifts in DHSN inequality. However, as a standalone metric, the Gini coefficient does not indicate optimal numbers or ratios of actors to posts, or effective network engagement. Further, mixed-methods research investigating quantitative performance metrics is required.
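For illustration, a small Python sketch of the Gini computation on hypothetical posts-per-actor counts (the study's data and exact estimator are not reproduced here):

```python
import numpy as np

def gini(counts):
    """Gini coefficient of posts per actor (0 = perfect equality,
    values approaching 1 = extreme participation inequality)."""
    x = np.sort(np.asarray(counts, dtype=float))
    n = x.size
    # Standard formula based on the ordered values
    return (2 * np.arange(1, n + 1) - n - 1).dot(x) / (n * x.sum())

# Hypothetical quarterly posts-per-actor counts from a DHSN
posts_per_actor = [1, 1, 2, 2, 3, 5, 8, 40]
print(round(gini(posts_per_actor), 3))
```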
Perea, Manuel; Urkia, Miriam; Davis, Colin J; Agirre, Ainhoa; Laseka, Edurne; Carreiras, Manuel
2006-11-01
We describe a Windows program that enables users to obtain a broad range of statistics concerning the properties of word and nonword stimuli in an agglutinative language (Basque), including measures of word frequency (at the whole-word and lemma levels), bigram and biphone frequency, orthographic similarity, orthographic and phonological structure, and syllable-based measures. It is designed for use by researchers in psycholinguistics, particularly those concerned with recognition of isolated words and morphology. In addition to providing standard orthographic and phonological neighborhood measures, the program can be used to obtain information about other forms of orthographic similarity, such as transposed-letter similarity and embedded-word similarity. It is available free of charge from www .uv.es/mperea/E-Hitz.zip.
The CTS 11.7 GHz angle of arrival experiment
NASA Technical Reports Server (NTRS)
Kwan, B. W.; Hodge, D. B.
1981-01-01
The objective of the experiment was to determine the statistical behavior of attenuation and angle of arrival on an Earth-space propagation path using the CTS 11.7 GHz beacon. Measurements performed from 1976 to 1978 form the data base for analysis. The statistics of the signal attenuation and phase variations due to atmospheric disturbances are presented. Rainfall rate distributions are also included to provide a link between the above effects on wave propagation and meteorological conditions.
Rigorous force field optimization principles based on statistical distance minimization
Vlcek, Lukas; Chialvo, Ariel A.
2015-10-12
We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model's static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
NASA Astrophysics Data System (ADS)
Cohen, D.; Michlmayr, G.; Or, D.
2012-04-01
Shearing of dense granular materials appears in many engineering and Earth sciences applications. Under a constant strain rate, the shearing stress at steady state oscillates with slow rises followed by rapid drops that are linked to the build up and failure of force chains. Experiments indicate that these drops display exponential statistics. Measurements of acoustic emissions during shearing indicates that the energy liberated by failure of these force chains has power-law statistics. Representing force chains as fibers, we use a stick-slip fiber bundle model to obtain analytical solutions of the statistical distribution of stress drops and failure energy. In the model, fibers stretch, fail, and regain strength during deformation. Fibers have Weibull-distributed threshold strengths with either quenched and annealed disorder. The shape of the distribution for drops and energy obtained from the model are similar to those measured during shearing experiments. This simple model may be useful to identify failure events linked to force chain failures. Future generalizations of the model that include different types of fiber failure may also allow identification of different types of granular failures that have distinct statistical acoustic emission signatures.
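A minimal, illustrative Python sketch of the simplest equal-load-sharing fiber bundle with Weibull-distributed thresholds, without the stick-slip strength recovery of the model described above, showing how stress drops arise under quasistatic loading:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000        # fibers standing in for force chains
k = 2.0           # Weibull shape parameter (threshold disorder)

# Equal-load-sharing bundle: with thresholds sorted ascending, the force
# just before the (m+1)-th weakest fiber fails is x_(m+1) * (n - m)
x = np.sort(rng.weibull(k, size=n))
load = x * (n - np.arange(n))

# Negative increments of the load sequence are the stress drops
drops = np.diff(load)
drops = -drops[drops < 0]
print(drops.mean(), drops.max())
```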
Detailed Uncertainty Analysis of the ZEM-3 Measurement System
NASA Technical Reports Server (NTRS)
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
The measurement of the Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood to report ZT values that are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The analysis quantifies the error in the electrical resistivity measurement arising from sample geometry tolerance, probe geometry tolerance, statistical error, and multimeter uncertainty. The uncertainty in the Seebeck coefficient includes probe wire correction factors, statistical error, multimeter uncertainty, and, most importantly, the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through the thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite-element analysis allows quantification of the phenomenon and provides an estimate of the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of approximately ±9–14% at high temperature and ±9% near room temperature.
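A hedged sketch of the kind of first-order propagation involved: for a power factor PF = S²/ρ, the relative uncertainties combine in quadrature (hypothetical values; the paper's detailed analysis includes the cold-finger correction and further terms):

```python
import math

# Hypothetical measured values with absolute uncertainties
S, dS     = 200e-6, 8e-6     # Seebeck coefficient, V/K
rho, drho = 1.0e-5, 3e-7     # electrical resistivity, ohm*m

pf = S**2 / rho              # power factor, W/(m*K^2)

# First-order (root-sum-square) propagation for PF = S^2 / rho
rel = math.sqrt((2 * dS / S)**2 + (drho / rho)**2)
print(pf, pf * rel, 100 * rel)   # value, absolute and percent uncertainty
```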
Lee, L.; Helsel, D.
2005-01-01
Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
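A simplified, single-detection-limit sketch of ROS in Python (hypothetical data; the actual package handles multiple limits via Helsel-Cohn plotting positions):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical data: six detected values and three values reported "<1.0"
detects = np.array([1.2, 1.6, 2.3, 3.4, 5.1, 8.0])   # sorted ascending
n_cens = 3
n = detects.size + n_cens

# Blom-type plotting positions for the ordered sample
pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)
z = norm.ppf(pp)

# Regress log(detects) on the normal quantiles of the upper positions
slope, intercept = np.polyfit(z[n_cens:], np.log(detects), 1)

# Impute the censored values from the fitted line (lowest positions),
# then compute summary statistics on the combined sample
imputed = np.exp(intercept + slope * z[:n_cens])
full = np.concatenate([imputed, detects])
print(full.mean(), full.std(ddof=1))
```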
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulliam, L.W.
The purpose of this descriptive, correlational study was to ascertain whether there is a relationship between social support and the nutritional status of patients receiving radiation therapy for cancer. The data collection instruments used included the Norbeck Social Support Questionnaire (NSSQ), the Personal Characteristics Form, the abbreviated Health History, the Flow Sheet for Nutritional Data, and the Interview Schedule. For the analysis of data, descriptive statistics were utilized to provide a profile of subjects, and correlational statistics were used to ascertain whether there were relationships among the indicators of nutritional status and the social support variables. A convenience sample comprised 50 cancer patients deemed curable by radiation therapy. Findings included significant decreases in anthropometric measurements and biochemical tests during therapy. Serial assessments of nutritional status, therefore, are recommended for all cancer patients during therapy in order to plan and implement strategies for meeting the self-care requisites for food and water. No statistically significant relationships were found between the social support variables as measured by the NSSQ and the indicators of nutritional status. This suggests that nurses can assist patients by fostering support from actual and potential nutritional confidants.
Estimating individual benefits of medical or behavioral treatments in severely ill patients.
Diaz, Francisco J
2017-01-01
There is a need for statistical methods appropriate for the analysis of clinical trials from a personalized-medicine viewpoint as opposed to the common statistical practice that simply examines average treatment effects. This article proposes an approach to quantifying, reporting and analyzing individual benefits of medical or behavioral treatments to severely ill patients with chronic conditions, using data from clinical trials. The approach is a new development of a published framework for measuring the severity of a chronic disease and the benefits treatments provide to individuals, which utilizes regression models with random coefficients. Here, a patient is considered to be severely ill if the patient's basal severity is close to one. This allows the derivation of a very flexible family of probability distributions of individual benefits that depend on treatment duration and the covariates included in the regression model. Our approach may enrich the statistical analysis of clinical trials of severely ill patients because it allows investigating the probability distribution of individual benefits in the patient population and the variables that influence it, and we can also measure the benefits achieved in specific patients including new patients. We illustrate our approach using data from a clinical trial of the anti-depressant imipramine.
Comparative Research Productivity Measures for Economic Departments.
ERIC Educational Resources Information Center
Huettner, David A.; Clark, William
1997-01-01
Develops a simple theoretical model to evaluate interdisciplinary differences in research productivity between economics departments and related subjects. Compares the research publishing statistics of economics, finance, psychology, geology, physics, oceanography, chemistry, and geophysics. Considers a number of factors including journal…
Elliott, Fiona; Oates, Liza; Schembri, Adrian; Mantri, Nitin
2017-01-01
Background: Wellness retreats use many complementary and alternative therapies within a holistic residential setting, yet few studies have evaluated the effect of retreat experiences on multiple dimensions of health and well-being, and no published studies have reported health outcomes in wellness tourists. Objectives: To assess the effect of a week-long wellness-retreat experience in wellness tourists. Design: A longitudinal observational study with outcomes assessed upon arrival and departure and 6 weeks after the retreat. Setting: A rural health retreat in Queensland, Australia. Interventions: A holistic, 1-week, residential, retreat experience that included many educational, therapeutic, and leisure activities and an organic, mostly plant-based diet. Outcome measures: Multiple outcome measures were performed upon arrival and departure and 6 weeks after the retreat. These included anthropometric measures, urinary pesticide metabolites, a food and health symptom questionnaire, the Five Factor Wellness Inventory, the General Self Efficacy questionnaire, the Pittsburgh Insomnia Rating Scale, the Depression Anxiety Stress Scale, the Profile of Mood States, and the Cogstate cognitive function test battery. Results: Statistically significant improvements (p < 0.05) were seen in almost all measures (n = 37) after 1 week and were sustained at 6 weeks (n = 17). There were statistically significant improvements (p < 0.001) in all anthropometric measures after 1 week, with reductions in abdominal girth (2.7 cm), weight (1.6 kg), and average systolic and diastolic pressure (−16.1 mmHg and −9.3 mmHg, respectively). Statistically significant improvements (p < 0.05) were also seen in psychological and health symptom measures. Urinary pesticide metabolites were detected in pooled urine samples before the retreat and were undetectable after the retreat. Conclusion: Retreat experiences can lead to substantial improvements in multiple dimensions of health and well-being that are maintained for 6 weeks. Further research that includes objective biomarkers and economic measures in different populations is required to determine the mechanisms of these effects and assess the value and relevance of retreat experiences to clinicians and health insurers. PMID:28068147
NASA Technical Reports Server (NTRS)
Talpe, Matthieu J.; Nerem, R. Steven; Forootan, Ehsan; Schmidt, Michael; Lemoine, Frank G.; Enderlin, Ellyn M.; Landerer, Felix W.
2017-01-01
We construct long-term time series of Greenland and Antarctic ice sheet mass change from satellite gravity measurements. A statistical reconstruction approach is developed based on a principal component analysis (PCA) to combine high-resolution spatial modes from the Gravity Recovery and Climate Experiment (GRACE) mission with the gravity information from conventional satellite tracking data. Uncertainties of this reconstruction are rigorously assessed; they include temporal limitations for short GRACE measurements, spatial limitations for the low-resolution conventional tracking data measurements, and limitations of the estimated statistical relationships between low- and high-degree potential coefficients reflected in the PCA modes. Trends of mass variations in Greenland and Antarctica are assessed against a number of previous studies. The resulting time series for Greenland show a higher rate of mass loss than other methods before 2000, while the Antarctic ice sheet appears heavily influenced by interannual variations.
ERIC Educational Resources Information Center
Harris, Margaret L.; Tabachnick, B. Robert
This paper describes test development efforts for measuring achievement of selected concepts in social studies. It includes descriptive item and test statistics for the tests developed. Twelve items were developed for each of 30 concepts. Subject specialists categorized the concepts into three major areas: Geographic Region, Man and Society, and…
Measures of Child Well-Being in Utah, 1998. Measuring Success One Kid at a Time.
ERIC Educational Resources Information Center
Haven, Terry, Ed.
This Kids Count report details statewide trends in the well-being of Utah's children. The statistical portrait is based on five general areas of children's well-being: (1) demographics; (2) health; (3) education; (4) safety; and (5) economic security. Key indicators in these areas include: (1) family composition; (2) prenatal care; (3) infant…
Willems, Mariël; Waninge, Aly; Hilgenkamp, Thessa I M; van Empelen, Pepijn; Krijnen, Wim P; van der Schans, Cees P; Melville, Craig A
2018-05-08
Promotion of a healthy lifestyle for people with intellectual disabilities is important; however, the effectiveness of lifestyle change interventions is unclear. This research will examine the effectiveness of lifestyle change interventions for people with intellectual disabilities. Randomized controlled trials (RCTs) of lifestyle change interventions for people with intellectual disabilities were included in a systematic review and meta-analysis. Data on study and intervention characteristics were extracted, as well as data on outcome measures and results. Internal validity of the selected papers was assessed using the Cochrane Collaboration's risk bias tool. Eight RCTs were included. Multiple outcome measures were used, whereby outcome measures targeting environmental factors and participation were lacking and personal outcome measures were mostly used by a single study. Risks of bias were found for all studies. Meta-analysis showed some effectiveness for lifestyle change interventions, and a statistically significant decrease was found for waist circumference. Some effectiveness was found for lifestyle change interventions for people with intellectual disabilities. However, the effects were only statistically significant for waist circumference, so current lifestyle change interventions may not be optimally tailored to meet the needs of people with intellectual disabilities. © 2018 John Wiley & Sons Ltd.
Hoffer, Kenneth J; Shammas, H John; Savini, Giacomo; Huang, Jinhai
2016-01-01
To evaluate the agreement between the measurements provided by a new optical biometer, the Aladdin, based on optical low-coherence interferometry (OLCI), and those provided by the most commonly used optical biometer (IOLMaster 500), based on partial-coherence interferometry (PCI). Multicenter clinical trial. Prospective evaluation of a diagnostic test. In this study, 2 samples of adult patients were enrolled, 1 in the United States and the other in China. The U.S. group included a sample of consecutive patients scheduled for cataract surgery. The China group included a sample of healthy subjects with no cataracts. In both cases, only 1 eye of each patient was analyzed. Axial length (AL), corneal power (in diopters [D]) (K), anterior chamber depth (ACD) (corneal epithelium to lens), and corneal astigmatism were measured. All values were analyzed using a paired t test, the Pearson product-moment correlation coefficient (r), and Bland-Altman plots. In the U.S. and China groups, the OLCI mean AL values did not show a statistically significant difference from PCI values and showed excellent agreement and correlation. In contrast, OLCI measured a lower mean K (−0.14 D) and deeper ACD measurements (U.S. +0.16 mm and China +0.05 mm). These differences were statistically significant (P < .0001). Vector analysis did not show a statistically significant difference in astigmatism measurements. Agreement between OLCI and PCI was good. However, the small but statistically significant differences in K and ACD measurements make constant optimization necessary when calculating the intraocular lens power using theoretical formulas. Dr. Hoffer licenses the registered trademark name Hoffer to Carl Zeiss-Meditec (PCI), Haag-Streit (Lenstar), Movu (Argos), Oculus (Pentacam, AXL), Nidek (AL-Scan), Tomey (OA-2000), Topcon EU Visia Imaging (Aladdin), Ziemer (Galilei G6), and all A-scan biometer manufacturers. Dr. Shammas licenses his formulas to Carl Zeiss-Meditec (PCI), Haag-Streit (Lenstar), Nidek (AL-Scan), and Topcon EU (Visia Imaging) (Aladdin). None of the other authors has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
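A minimal sketch of the Bland-Altman computation used in such agreement studies (hypothetical paired axial-length readings, not the study's data):

```python
import numpy as np

# Hypothetical paired axial length readings (mm) from the two biometers
olci = np.array([23.41, 24.02, 22.87, 25.10, 23.66])
pci  = np.array([23.43, 24.00, 22.90, 25.08, 23.70])

diff = olci - pci
bias = diff.mean()                            # mean difference (bias)
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement
print(bias, loa)
```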
Local statistics of retinal optic flow for self-motion through natural sceneries.
Calow, Dirk; Lappe, Markus
2007-12-01
Image analysis in the visual system is well adapted to the statistics of natural scenes. Investigations of natural image statistics have so far mainly focused on static features. The present study is dedicated to the measurement and the analysis of the statistics of optic flow generated on the retina during locomotion through natural environments. Natural locomotion includes bouncing and swaying of the head and eye movement reflexes that stabilize gaze onto interesting objects in the scene while walking. We investigate the dependencies of the local statistics of optic flow on the depth structure of the natural environment and on the ego-motion parameters. To measure these dependencies we estimate the mutual information between correlated data sets. We analyze the results with respect to the variation of the dependencies over the visual field, since the visual motions in the optic flow vary depending on visual field position. We find that retinal flow direction and retinal speed show only minor statistical interdependencies. Retinal speed is statistically tightly connected to the depth structure of the scene. Retinal flow direction is statistically mostly driven by the relation between the direction of gaze and the direction of ego-motion. These dependencies differ at different visual field positions such that certain areas of the visual field provide more information about ego-motion and other areas provide more information about depth. The statistical properties of natural optic flow may be used to tune the performance of artificial vision systems based on human imitating behavior, and may be useful for analyzing properties of natural vision systems.
Homeopathy for attention-deficit/hyperactivity disorder: a pilot randomized-controlled trial.
Jacobs, Jennifer; Williams, Anna-Leila; Girard, Christine; Njike, Valentine Yanchou; Katz, David
2005-10-01
The aim of this study was to carry out a preliminary trial evaluating the effectiveness of homeopathy in the treatment of attention-deficit/hyperactivity disorder (ADHD). This work was a randomized, double-blind, placebo-controlled trial. This study was conducted in a private homeopathic clinic in the Seattle metropolitan area. Subjects included children 6-12 years of age meeting Diagnostic and Statistical Manual of Mental Disorders 4th edition (DSM-IV) criteria for ADHD. Forty-three subjects were randomized to receive a homeopathic consultation and either an individualized homeopathic remedy or placebo. Patients were seen by homeopathic physicians every 6 weeks for 18 weeks. Outcome measures included the Conner's Global Index-Parent, Conner's Global Index- Teacher, Conner's Parent Rating Scale-Brief, Continuous Performance Test, and the Clinical Global Impression Scale. There were no statistically significant differences between homeopathic remedy and placebo groups on the primary or secondary outcome variables. However, there were statistically and clinically significant improvements in both groups on many of the outcome measures. This pilot study provides no evidence to support a therapeutic effect of individually selected homeopathic remedies in children with ADHD. A therapeutic effect of the homeopathic encounter is suggested and warrants further evaluation. Future studies should be carried out over a longer period of time and should include a control group that does not receive the homeopathic consultation. Comparison to conventional stimulant medication for ADHD also should be considered.
Influence of eye biometrics and corneal micro-structure on noncontact tonometry.
Jesus, Danilo A; Majewska, Małgorzata; Krzyżanowska-Berkowska, Patrycja; Iskander, D Robert
2017-01-01
Tonometry is widely used as the main screening tool supporting glaucoma diagnosis. Still, its accuracy could be improved if full knowledge about the variation of the corneal biomechanical properties was available. In this study, Optical Coherence Tomography (OCT) speckle statistics are used to infer the organisation of the corneal micro-structure and hence, to analyse its influence on intraocular pressure (IOP) measurements. Fifty-six subjects were recruited for this prospective study. Macro and micro-structural corneal parameters as well as subject age were considered. Macro-structural analysis included the parameters that are associated with the ocular anatomy, such as central corneal thickness (CCT), corneal radius, axial length, anterior chamber depth and white-to-white corneal diameter. Micro-structural parameters which included OCT speckle statistics were related to the internal organisation of the corneal tissue and its physiological changes during lifetime. The corneal speckle obtained from OCT was modelled with the Generalised Gamma (GG) distribution that is characterised with a scale parameter and two shape parameters. In macro-structure analysis, only CCT showed a statistically significant correlation with IOP (R2 = 0.25, p<0.001). The scale parameter and the ratio of the shape parameters of GG distribution showed statistically significant correlation with IOP (R2 = 0.19, p<0.001 and R2 = 0.17, p<0.001, respectively). For the studied group, a weak, although significant correlation was found between age and IOP (R2 = 0.053, p = 0.04). Forward stepwise regression showed that CCT and the scale parameter of the Generalised Gamma distribution can be combined in a regression model (R2 = 0.39, p<0.001) to study the role of the corneal structure on IOP. We show, for the first time, that corneal micro-structure influences the IOP measurements obtained from noncontact tonometry. OCT speckle statistics can be employed to learn about the corneal micro-structure and hence, to further calibrate the IOP measurements.
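An illustrative sketch of fitting a Generalised Gamma distribution to intensity data with scipy (surrogate data, not OCT speckle; scipy's (a, c) parameterisation stands in for the paper's two shape parameters, and the ratio a/c is shown only as a candidate shape-ratio feature):

```python
import numpy as np
from scipy.stats import gengamma

rng = np.random.default_rng(0)
# Surrogate "speckle amplitudes"; in the study these would be OCT
# intensities from a corneal region of interest
sample = rng.gamma(shape=2.0, scale=1.5, size=5000)

# Fit the Generalised Gamma with location pinned at zero
a, c, loc, scale = gengamma.fit(sample, floc=0)
print(a, c, scale, a / c)   # scale parameter and a shape ratio
```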
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
Performance of Reclassification Statistics in Comparing Risk Prediction Models
Paynter, Nina P.
2012-01-01
Concerns have been raised about the use of traditional measures of model fit in evaluating risk prediction models for clinical use, and reclassification tables have been suggested as an alternative means of assessing the clinical utility of a model. Several measures based on the table have been proposed, including the reclassification calibration (RC) statistic, the net reclassification improvement (NRI), and the integrated discrimination improvement (IDI), but the performance of these in practical settings has not been fully examined. We used simulations to estimate the type I error and power for these statistics in a number of scenarios, as well as the impact of the number and type of categories, when adding a new marker to an established or reference model. The type I error was found to be reasonable in most settings, and power was highest for the IDI, which was similar to the test of association. The relative power of the RC statistic, a test of calibration, and the NRI, a test of discrimination, varied depending on the model assumptions. These tools provide unique but complementary information. PMID:21294152
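For concreteness, a small sketch of the categorical NRI computation (hypothetical risk categories and outcomes; not the simulation code used in the paper):

```python
import numpy as np

def nri(old_cat, new_cat, event):
    """Net reclassification improvement from risk categories assigned by
    a reference model (old_cat) and an expanded model (new_cat)."""
    old, new, y = map(np.asarray, (old_cat, new_cat, event))
    up, down = new > old, new < old
    ev, ne = y == 1, y == 0
    nri_events = up[ev].mean() - down[ev].mean()
    nri_nonevents = down[ne].mean() - up[ne].mean()
    return nri_events + nri_nonevents

# Hypothetical risk categories (0 = low, 1 = intermediate, 2 = high)
old = [0, 1, 1, 2, 0, 1, 2, 0]
new = [1, 1, 2, 2, 0, 0, 2, 0]
event = [1, 0, 1, 1, 0, 0, 1, 0]
print(nri(old, new, event))
```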
Elangovan, Satheesh; Brogden, Kim A; Dawson, Deborah V; Blanchette, Derek; Pagan-Rivera, Keyla; Stanford, Clark M; Johnson, Georgia K; Recker, Erica; Bowers, Rob; Haynes, William G; Avila-Ortiz, Gustavo
2014-01-01
To examine the relationships between three measures of body fat (body mass index [BMI], waist circumference [WC], and total body fat percent) and markers of inflammation around dental implants in stable periodontal maintenance patients. Seventy-three subjects were enrolled in this cross-sectional assessment. The study visit consisted of a physical examination that included anthropometric measurements of body composition (BMI, WC, body fat %); intraoral assessments were performed (full-mouth plaque index, periodontal and peri-implant comprehensive examinations), and peri-implant sulcular fluid (PISF) was collected at the study implants. Levels of interleukin (IL)-1α, IL-1β, IL-6, IL-8, IL-10, IL-12, IL-17, tumor necrosis factor-α, C-reactive protein, osteoprotegerin, leptin, and adiponectin in the PISF were measured using multiplex proteomic immunoassays. Correlation analysis with body fat measures was then performed using appropriate statistical methods. After adjustments for covariates, regression analyses revealed a statistically significant correlation between IL-1β in PISF and WC (R = 0.33; P = .0047). In this study in stable periodontal maintenance patients, a modest but statistically significant positive correlation was observed between the levels of IL-1β, a major proinflammatory cytokine, in PISF and WC, a reliable measure of central obesity.
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics; the input data of this group are the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample; the input data of this group are the individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses; the input data of this group are pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation; the input data of this group are pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. ParallABEL is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
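ParallABEL itself is an R/Rmpi library; purely as an illustration of the first ("per-SNP") group of computations, a Python multiprocessing analogue on simulated genotypes (the function name snp_association and the trend statistic are hypothetical stand-ins):

```python
import numpy as np
from multiprocessing import Pool

def snp_association(args):
    """Per-SNP statistic (group 1): a simple allelic trend test via
    correlation between genotype dosage and a quantitative trait."""
    genotypes, trait = args
    r = np.corrcoef(genotypes, trait)[0, 1]
    n = trait.size
    return r * np.sqrt((n - 2) / (1 - r**2))   # t statistic

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_ind, n_snp = 500, 1000
    g = rng.integers(0, 3, size=(n_snp, n_ind))    # dosages 0/1/2
    y = rng.normal(size=n_ind)

    # Partition the SNP-wise work across processes, as ParallABEL does
    # for this group of computations
    with Pool(4) as pool:
        t_stats = pool.map(snp_association, [(g[i], y) for i in range(n_snp)])
    print(len(t_stats), max(t_stats))
```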
Statistical analysis and interpolation of compositional data in materials science.
Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M
2015-02-09
Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
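One standard CDA tool of the kind the paper builds on is the centered log-ratio (clr) transform, which maps compositions off the simplex before Euclidean statistics are applied; a minimal sketch (illustrative, not the paper's code):

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform: maps a composition from the simplex
    to Euclidean space, where means, distances, and interpolation behave."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))      # geometric mean of the parts
    return np.log(x / g)

# Hypothetical atomic concentrations of a three-component thin film
comp = np.array([0.60, 0.25, 0.15])
z = clr(comp)
print(z, z.sum())   # clr coordinates sum to zero
```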
Random forests for classification in ecology
Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.
2007-01-01
Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
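A minimal scikit-learn sketch of the workflow described (cross-validated accuracy plus RF variable importance), on synthetic stand-in data rather than the ecological datasets:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a presence/absence data set with
# environmental predictor variables
X, y = make_classification(n_samples=300, n_features=8,
                           n_informative=4, random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
print(cross_val_score(rf, X, y, cv=5).mean())   # classification accuracy

rf.fit(X, y)
print(rf.feature_importances_)   # RF's variable-importance measure
```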
Temperature and Voltage Offsets in High-ZT Thermoelectrics
NASA Astrophysics Data System (ADS)
Levy, George S.
2018-06-01
Thermodynamic temperature can take on different meanings. Kinetic temperature is an expectation value and a function of the kinetic energy distribution. Statistical temperature is a parameter of the distribution. Kinetic temperature and statistical temperature, identical in Maxwell-Boltzmann statistics, can differ in other statistics such as those of Fermi-Dirac or Bose-Einstein when a field is present. Thermal equilibrium corresponds to zero statistical temperature gradient, not zero kinetic temperature gradient. Since heat carriers in thermoelectrics are fermions, the difference between these two temperatures may explain voltage and temperature offsets observed during meticulous Seebeck measurements in which the temperature-voltage curve does not go through the origin. In conventional semiconductors, temperature offsets produced by fermionic electrical carriers are not observable because they are shorted by heat phonons in the lattice. In high-ZT materials, however, these offsets have been detected but attributed to faulty laboratory procedures. Additional supporting evidence for spontaneous voltages and temperature gradients includes data collected in epistatic experiments and in the plasma Q-machine. Device fabrication guidelines for testing the hypothesis are suggested, including using unipolar junctions stacked in a superlattice, alternating n/n+ and p/p+ junctions, and selecting appropriate dimensions, doping, and loading.
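For reference, a hedged sketch of the distinction in LaTeX, using standard textbook definitions rather than the paper's own notation: the statistical temperature T_s enters the Fermi-Dirac occupation as a distribution parameter, while a kinetic temperature can be defined from the mean kinetic energy by the classical-gas convention.

```latex
% Statistical temperature T_s as a distribution parameter (Fermi-Dirac);
% kinetic temperature T_kin from the mean kinetic energy
\[
  f(E) = \frac{1}{e^{(E-\mu)/k_B T_s} + 1},
  \qquad
  T_{\mathrm{kin}} \equiv \frac{2}{3 k_B}\,\langle E_{\mathrm{kin}} \rangle .
\]
% In the Maxwell-Boltzmann limit the two coincide; for a degenerate
% fermion gas, or in an applied field, they generally differ.
```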
Computers in the General Physics Laboratory.
ERIC Educational Resources Information Center
Preston, Daryl W.; Good, R. H.
1996-01-01
Provides ideas and outcomes for nine computer laboratory experiments using a commercial eight-bit analog-to-digital converter (ADC) interface. Experiments cover statistics; rotation; harmonic motion; voltage, current, and resistance; ADC conversions; temperature measurement; single slit diffraction; and radioactive decay. Includes necessary schematics. (MVL)
Hohn, M. Ed; Nuhfer, E.B.; Vinopal, R.J.; Klanderman, D.S.
1980-01-01
Classifying very fine-grained rocks through fabric elements provides information about depositional environments, but is subject to the biases of visual taxonomy. To evaluate the statistical significance of an empirical classification of very fine-grained rocks, samples from Devonian shales in four cored wells in West Virginia and Virginia were measured for 15 variables: quartz, illite, pyrite and expandable clays determined by X-ray diffraction; total sulfur, organic content, inorganic carbon, matrix density, bulk density, porosity, and silt; as well as density, sonic travel time, resistivity, and gamma-ray response measured from well logs. The four lithologic types comprised: (1) sharply banded shale, (2) thinly laminated shale, (3) lenticularly laminated shale, and (4) nonbanded shale. Univariate and multivariate analyses of variance showed that the lithologic classification reflects significant differences for the variables measured, differences that can be detected independently of stratigraphic effects. Little-known statistical methods found useful in this work included: the multivariate analysis of variance with more than one effect, simultaneous plotting of samples and variables on canonical variates, and the use of parametric ANOVA and MANOVA on ranked data. © 1980 Plenum Publishing Corporation.
Summary Statistics for Fun Dough Data Acquired at LLNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kallman, J S; Morales, K E; Whipple, R E
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a Play Dough™-like product, Fun Dough™, designated as PD. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2100 LMHU_D at 100 kVp to a low of about 1100 LMHU_D at 300 kVp. The standard deviation of each measurement is around 1% of the mean. The entropy covers the range from 3.9 to 4.6. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z_eff, to be near 8.5. LLNL prepared about 50 mL of the Fun Dough™ in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. Still, layers can plainly be seen in the reconstructed images, indicating that the bulk density of the material in the container is affected by voids and bubbles. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation, and entropy for (a) the four image segments and for (b) their digital gradient images. (A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first-order statistics'; those of the gradient image, 'second-order statistics.'
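An illustrative sketch of KDE-based first-order statistics (mean, standard deviation, entropy) on surrogate data; the report's LMHU_D units and actual voxel values are not reproduced:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
# Surrogate voxel values standing in for one CT image segment
voxels = rng.normal(loc=2100, scale=21, size=20_000)

kde = gaussian_kde(voxels)
grid = np.linspace(voxels.min(), voxels.max(), 512)
pdf = kde(grid)
pdf /= np.trapz(pdf, grid)               # renormalize on the finite grid

mean = np.trapz(grid * pdf, grid)
sd = np.sqrt(np.trapz((grid - mean)**2 * pdf, grid))
dx = grid[1] - grid[0]
entropy = -np.sum(pdf * np.log2(pdf + 1e-300) * dx)   # differential entropy, bits
print(mean, sd, entropy)
```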
Use of model calibration to achieve high accuracy in analysis of computer networks
Frogner, Bjorn; Guarro, Sergio; Scharf, Guy
2004-05-11
A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.
Local image statistics: maximum-entropy constructions and perceptual salience
Victor, Jonathan D.; Conte, Mary M.
2012-01-01
The space of visual signals is high-dimensional and natural visual images have a highly complex statistical structure. While many studies suggest that only a limited number of image statistics are used for perceptual judgments, a full understanding of visual function requires analysis not only of the impact of individual image statistics, but also, how they interact. In natural images, these statistical elements (luminance distributions, correlations of low and high order, edges, occlusions, etc.) are intermixed, and their effects are difficult to disentangle. Thus, there is a need for construction of stimuli in which one or more statistical elements are introduced in a controlled fashion, so that their individual and joint contributions can be analyzed. With this as motivation, we present algorithms to construct synthetic images in which local image statistics—including luminance distributions, pair-wise correlations, and higher-order correlations—are explicitly specified and all other statistics are determined implicitly by maximum-entropy. We then apply this approach to measure the sensitivity of the human visual system to local image statistics and to sample their interactions. PMID:22751397
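As a toy illustration of pinning one local statistic while leaving the rest maximally random, the sketch below generates binary (±1) textures whose nearest-neighbor horizontal correlation is fixed at a target value c by a symmetric two-state Markov chain along each row; a stay probability of (1+c)/2 gives adjacent-pixel correlation exactly c. This is a simplified one-dimensional analogue under assumed constraints, not the authors' construction algorithm.

```python
import numpy as np

def binary_texture(rows, cols, c, rng=None):
    """Rows are independent Markov chains with adjacent-pixel correlation c."""
    rng = np.random.default_rng(rng)
    stay = (1.0 + c) / 2.0
    img = np.empty((rows, cols), dtype=int)
    img[:, 0] = rng.choice([-1, 1], size=rows)      # unbiased first column
    keep = rng.random((rows, cols - 1)) < stay       # True -> repeat previous value
    for j in range(1, cols):
        img[:, j] = np.where(keep[:, j - 1], img[:, j - 1], -img[:, j - 1])
    return img

tex = binary_texture(64, 64, c=0.6, rng=0)
est = np.mean(tex[:, :-1] * tex[:, 1:])              # empirical pair correlation
print(f"target 0.6, measured {est:.3f}")
```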
A cross-national analysis of how economic inequality predicts biodiversity loss.
Holland, Tim G; Peterson, Garry D; Gonzalez, Andrew
2009-10-01
We used socioeconomic models that included economic inequality to predict biodiversity loss, measured as the proportion of threatened plant and vertebrate species, across 50 countries. Our main goal was to evaluate whether economic inequality, measured as the Gini index of income distribution, improved the explanatory power of our statistical models. We compared four models that included the following: only population density, economic footprint (i.e., the size of the economy relative to the country area), economic footprint and income inequality (Gini index), and an index of environmental governance. We also tested the environmental Kuznets curve hypothesis, but it was not supported by the data. Statistical comparisons of the models revealed that the model including both economic footprint and inequality was the best predictor of threatened species. It significantly outperformed population density alone and the environmental governance model according to the Akaike information criterion. Inequality was a significant predictor of biodiversity loss and significantly improved the fit of our models. These results confirm that socioeconomic inequality is an important factor to consider when predicting rates of anthropogenic biodiversity loss.
Assessment of cognitive safety in clinical drug development
Roiser, Jonathan P.; Nathan, Pradeep J.; Mander, Adrian P.; Adusei, Gabriel; Zavitz, Kenton H.; Blackwell, Andrew D.
2016-01-01
Cognitive impairment is increasingly recognised as an important potential adverse effect of medication. However, many drug development programmes do not incorporate sensitive cognitive measurements. Here, we review the rationale for cognitive safety assessment, and explain several basic methodological principles for measuring cognition during clinical drug development, including study design and statistical analysis, from Phase I through to postmarketing. The crucial issue of how cognition should be assessed is emphasized, especially the sensitivity of measurement. We also consider how best to interpret the magnitude of any identified effects, including comparison with benchmarks. We conclude by discussing strategies for the effective communication of cognitive risks. PMID:26610416
2013-01-01
Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
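A hedged sketch of the bootstrap procedure the abstract describes: the relative validity is computed as a ratio of one-way ANOVA F-statistics across patient groups, and its 95% confidence interval is obtained by resampling patients with replacement. The toy data, group structure, and variable names are assumptions, not the authors' dataset.

```python
import numpy as np
from scipy import stats

def anova_f(values, groups):
    """One-way ANOVA F-statistic across clinically defined groups."""
    samples = [values[groups == g] for g in np.unique(groups)]
    return stats.f_oneway(*samples).statistic

def bootstrap_rv(comparator, reference, groups, n_boot=500, rng=None):
    """95% percentile bootstrap CI for RV = F(comparator) / F(reference)."""
    rng = np.random.default_rng(rng)
    n = len(groups)
    rvs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)             # resample patients
        rvs.append(anova_f(comparator[idx], groups[idx]) /
                   anova_f(reference[idx], groups[idx]))
    return tuple(np.percentile(rvs, [2.5, 97.5]))

# toy data: three clinical groups; the reference discriminates more strongly
rng = np.random.default_rng(1)
groups = np.repeat([0, 1, 2], 150)
reference = groups * 1.0 + rng.normal(0, 1, 450)
comparator = 0.7 * reference + rng.normal(0, 0.7, 450)
print("95% bootstrap CI for RV:", bootstrap_rv(comparator, reference, groups, rng=2))
```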
Chang, Pao-Erh Paul; Yang, Jen-Chih Rena; Den, Walter; Wu, Chang-Fu
2014-09-01
Emissions of volatile organic compounds (VOCs) are among the most frequent environmental nuisance complaints in urban areas, especially where industrial districts are nearby. Unfortunately, identifying the emission sources responsible for VOCs is an inherently difficult task. In this study, we proposed a dynamic approach to gradually confine the location of potential VOC emission sources in an industrial complex, by combining multi-path open-path Fourier transform infrared spectrometry (OP-FTIR) measurement and the statistical method of principal component analysis (PCA). Closed-cell FTIR was further used to verify the VOC emission sources by measuring emitted VOCs from selected exhaust stacks at factories in the confined areas. Multiple open-path monitoring lines were deployed during a 3-month monitoring campaign in a complex industrial district. The emission patterns were identified and the locations of emissions were confined by the wind data collected simultaneously. N,N-Dimethylformamide (DMF), 2-butanone, toluene, and ethyl acetate, with mean concentrations of 80.0 ± 1.8, 34.5 ± 0.8, 103.7 ± 2.8, and 26.6 ± 0.7 ppbv, respectively, were identified as the major VOC mixture at all times of the day around the receptor site. As a toxic air pollutant, DMF was found in air samples at concentrations exceeding the ambient standard, despite the path-averaging effect of OP-FTIR upon concentration levels. The PCA identified three major emission sources, including the PU coating, chemical packaging, and lithographic printing industries. Applying instrumental measurement and statistical modeling, this study has established a systematic approach for locating emission sources. Statistical modeling (PCA) plays an important role in reducing the dimensionality of a large measured dataset and identifying underlying emission sources; instrumental measurement, in turn, helps verify the outcomes of the statistical modeling. The field study has demonstrated the feasibility of using multi-path OP-FTIR measurement, and wind data incorporated with the statistical modeling (PCA) may successfully identify the major emission sources in a complex industrial district.
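The sketch below illustrates, on invented data, the PCA step described above: a matrix of concentrations (rows = observation times, columns = compounds) is reduced to a few components whose loadings group compounds that vary together, hinting at common sources. The compound mixture and the two hypothetical sources are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA

compounds = ["DMF", "2-butanone", "toluene", "ethyl acetate"]
rng = np.random.default_rng(0)
# two hypothetical co-emitting sources with fixed compound signatures
source_a = rng.exponential(1.0, (200, 1)) * np.array([[80, 35, 0, 0]])
source_b = rng.exponential(1.0, (200, 1)) * np.array([[0, 0, 100, 27]])
X = source_a + source_b + rng.normal(0, 2, (200, 4))    # measured mixture + noise

pca = PCA(n_components=2)
pca.fit((X - X.mean(0)) / X.std(0))                      # standardize before PCA
for k, load in enumerate(pca.components_):
    print(f"PC{k+1} loadings:", dict(zip(compounds, np.round(load, 2))))
```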
Carlson, Jordan A; Sarkin, Andrew J; Levack, Ashley E; Sklar, Marisa; Tally, Steven R; Gilmer, Todd P; Groessl, Erik J
2011-08-01
Social health is important to measure when assessing outcomes in community mental health. Our objective was to validate social health scales using items from two broader, commonly used measures that assess mental health outcomes. Participants were 609 adults receiving psychological treatment services. Items were identified from the California Quality of Life (CA-QOL) and Mental Health Statistics Improvement Program (MHSIP) outcome measures by their conceptual correspondence with social health and compared to the Social Functioning Questionnaire (SFQ) using correlational analyses. Pearson correlations for the identified CA-QOL and MHSIP items with the SFQ ranged from .42 to .62, and the identified scale scores produced Pearson correlation coefficients of .56, .70, and .70 with the SFQ. Concurrent validity with social health was supported for the identified scales. The inclusion of these assessment tools allows community mental health programs to include social health in their assessments.
Effective Vaccine Communication during the Disneyland Measles Outbreak
Broniatowski, David Andre; Hilyard, Karen M.; Dredze, Mark
2016-01-01
Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4,686) collected during the 2014–2015 Disneyland measles outbreak for content including statistics, stories, or opinions containing bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line opinions, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support Fuzzy Trace Theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. PMID:27179915
Effective vaccine communication during the disneyland measles outbreak.
Broniatowski, David A; Hilyard, Karen M; Dredze, Mark
2016-06-14
Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4581) collected during the 2014-2015 Disneyland measles outbreak for content including statistics, stories, or bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line gists, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support Fuzzy Trace Theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kegel, T.M.
Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as a "quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements. 7 figs.
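As a minimal illustration of SPC applied to calibration data, the sketch below builds a Shewhart-style individuals chart from baseline check-standard measurements: a process in statistical control stays within the control limits, and points outside them signal assignable causes. The 3-sigma limits, baseline length, and numbers are illustrative conventions, not requirements of any particular quality standard.

```python
import numpy as np

def control_limits(baseline, k=3.0):
    """Center line and +/- k-sigma control limits from baseline measurements."""
    center, sigma = np.mean(baseline), np.std(baseline, ddof=1)
    return center, center - k * sigma, center + k * sigma

rng = np.random.default_rng(3)
baseline = rng.normal(10.000, 0.002, 30)       # historical check-standard values
center, lcl, ucl = control_limits(baseline)

new_points = rng.normal(10.000, 0.002, 10)
new_points[7] += 0.01                          # simulated assignable cause
for i, x in enumerate(new_points):
    flag = "ok" if lcl <= x <= ucl else "OUT OF CONTROL"
    print(f"measurement {i}: {x:.4f}  {flag}")
```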
Ayahuasca in adolescence: a neuropsychological assessment.
Doering-Silveira, Evelyn; Lopez, Enrique; Grob, Charles S; de Rios, Marlene Dobkin; Alonso, Luisa K; Tacla, Cristiane; Shirakawa, Itiro; Bertolucci, Paulo H; Da Silveira, Dartiu X
2005-06-01
The purpose of the study was to evaluate neuropsychologically adolescents who use ayahuasca in a religious context. A battery of neuropsychological tests was administered to adolescents who use ayahuasca. These subjects were compared to a matched control group of adolescents who did not use ayahuasca. The controls were matched with regard to sex, age, and education. The neuropsychological battery included tests of speeded attention, visual search, sequencing, psychomotor speed, verbal and visual abilities, memory, and mental flexibility. Differences between subjects and matched controls on neuropsychological measures were tested using independent t-tests. Overall, the statistical findings suggested that there was no significant difference between the two groups on neuropsychological measures. Even though the data overall support that there was no difference between ayahuasca users and matched controls on neuropsychological measures, further studies are necessary to confirm these findings.
Toplak, Maggie E; Sorge, Geoff B; Benoit, André; West, Richard F; Stanovich, Keith E
2010-07-01
The Iowa Gambling Task (IGT) has been used to study decision-making differences in many different clinical and developmental samples. It has been suggested that IGT performance captures abilities that are separable from cognitive abilities, including executive functions and intelligence. The purpose of the current review was to examine studies that have explicitly examined the relationship between IGT performance and these cognitive abilities. We included 43 studies that reported correlational analyses with IGT performance, including measures of inhibition, working memory, and set-shifting as indices of executive functions, as well as measures of verbal, nonverbal, and full-scale IQ as indices of intelligence. Overall, only a small proportion of the studies reported a statistically significant relationship between IGT performance and these cognitive abilities. The majority of studies reported a non-significant relationship. Of the minority of studies that reported statistically significant effects, effect sizes were, at best, small to modest, and confidence intervals were large, indicating that considerable variability in performance on the IGT is not captured by current measures of executive function and intelligence. These findings highlight the separability between decision-making on the IGT and cognitive abilities, which is consistent with recent conceptualizations that differentiate rationality from intelligence. 2010 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
McClain, Robert L.; Wright, John C.
2014-01-01
A description of shot noise and the role it plays in absorption and emission measurements using photodiode and photomultiplier tube detection systems is presented. This description includes derivations of useful forms of the shot noise equation based on Poisson counting statistics. This approach can deepen student understanding of a fundamental…
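A short numerical check of the Poisson counting statistics underlying shot noise: for a mean count N the standard deviation is sqrt(N), so the signal-to-noise ratio of a counting measurement grows as sqrt(N). The photon counts below are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
for mean_counts in (100, 10_000, 1_000_000):
    counts = rng.poisson(mean_counts, size=100_000)   # simulated photon counts
    snr = counts.mean() / counts.std()                # SNR of the counting process
    print(f"N = {mean_counts:>9,}: measured SNR = {snr:8.1f}, "
          f"sqrt(N) = {np.sqrt(mean_counts):8.1f}")
```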
Temporal Comparisons of Internet Topology
2014-06-01
Acronyms defined in the report include CAIDA (Cooperative Association for Internet Data Analysis), CDN (Content Delivery Network), CI (Confidence Interval), DoS (denial of service), and GMT (Greenwich Mean Time). Our methods include analysis of graph-theoretical measures, as well as complex network and statistical measures, to quantify the CAIDA data. Scamper, a tool that probes the Internet for topology analysis and performance [26], uses network diagnostic tools, such as traceroute and ping, to probe Internet topology.
Wu, Wenjing; Wang, Yan; Xu, Lulu
2014-01-01
The aim of this meta-analysis was to evaluate central corneal thickness (CCT) measurement differences between Pentacam (Oculus Inc., Germany) and ultrasound pachymetry (USP) in normal eyes (unoperated, myopic, and astigmatic eyes without corneal disease or topographic irregularity), in eyes after laser in situ keratomileusis (LASIK) or photorefractive keratectomy (PRK), and in keratoconic or keratoconus-suspect eyes. We assessed whether Pentacam and USP yield similar CCT differences in normal eyes, in thinner corneas after LASIK or PRK procedures, and in keratoconic or keratoconus-suspect eyes. Data sources, including PubMed, Medline, EMBASE, and the Cochrane Central Register of Controlled Trials on the Cochrane Library, were searched to find the relevant studies. The primary outcome measure was the CCT measurement difference between Pentacam and USP. Three groups of eyes were analyzed: normal eyes; LASIK or PRK eyes; and keratoconus-suspect or keratoconic eyes. Nineteen studies describing 1,908 eyes were enrolled in the normal group; Pentacam results were 1.47 μm (95% confidence interval (CI), -2.32 to 5.27) higher than USP, a difference that was not statistically significant (P = 0.45). Nine studies with a total of 539 eyes were included for corneas after LASIK or PRK; the mean difference in CCT measurement between Pentacam and ultrasound pachymetry was 1.03 μm (95% CI, -3.36 to 5.42), which was not statistically significant (P = 0.64). Four studies with a total of 185 eyes were included in the keratoconic or keratoconus-suspect group; here the mean difference was -6.33 μm (95% CI, -9.17 to -3.49), a statistically significant difference between Pentacam and ultrasound pachymetry in CCT measurement (P < 0.0001). Pentacam offers CCT results similar to ultrasound pachymetry in normal eyes and in thinner corneas after LASIK or PRK procedures. However, in keratoconic or keratoconus-suspect eyes, Pentacam slightly underestimates central corneal thickness compared with ultrasound pachymetry, which may result from the difficulty in fixation of keratoconic eyes, misalignment of the Pentacam, and variation of ultrasonic velocity due to histological deformation.
Cohen, Marc M; Elliott, Fiona; Oates, Liza; Schembri, Adrian; Mantri, Nitin
2017-02-01
Wellness retreats use many complementary and alternative therapies within a holistic residential setting, yet few studies have evaluated the effect of retreat experiences on multiple dimensions of health and well-being, and no published studies have reported health outcomes in wellness tourists. To assess the effect of a week-long wellness-retreat experience in wellness tourists. A longitudinal observational study with outcomes assessed upon arrival and departure and 6 weeks after the retreat. A rural health retreat in Queensland, Australia. A holistic, 1-week, residential, retreat experience that included many educational, therapeutic, and leisure activities and an organic, mostly plant-based diet. Multiple outcome measures were performed upon arrival and departure and 6 weeks after the retreat. These included anthropometric measures, urinary pesticide metabolites, a food and health symptom questionnaire, the Five Factor Wellness Inventory, the General Self Efficacy questionnaire, the Pittsburgh Insomnia Rating Scale, the Depression Anxiety Stress Scale, the Profile of Mood States, and the Cogstate cognitive function test battery. Statistically significant improvements (p < 0.05) were seen in almost all measures (n = 37) after 1 week and were sustained at 6 weeks (n = 17). There were statistically significant improvements (p < 0.001) in all anthropometric measures after 1 week, with reductions in abdominal girth (2.7 cm), weight (1.6 kg), and average systolic and diastolic pressure (-16.1 mmHg and -9.3 mmHg, respectively). Statistically significant improvements (p < 0.05) were also seen in psychological and health symptom measures. Urinary pesticide metabolites were detected in pooled urine samples before the retreat and were undetectable after the retreat. Retreat experiences can lead to substantial improvements in multiple dimensions of health and well-being that are maintained for 6 weeks. Further research that includes objective biomarkers and economic measures in different populations is required to determine the mechanisms of these effects and assess the value and relevance of retreat experiences to clinicians and health insurers.
A stochastic model of particle dispersion in turbulent reacting gaseous environments
NASA Astrophysics Data System (ADS)
Sun, Guangyuan; Lignell, David; Hewson, John
2012-11-01
We are performing fundamental studies of dispersive transport and time-temperature histories of Lagrangian particles in turbulent reacting flows. The particle-flow statistics, including the full particle temperature PDF, are of interest. A challenge in modeling particle motions is the accurate prediction of fine-scale aerosol-fluid interactions. A computationally affordable stochastic modeling approach, one-dimensional turbulence (ODT), is a proven method that captures the full range of length and time scales and provides detailed statistics of fine-scale turbulent-particle mixing and transport. Limited results of particle transport in ODT have been reported in non-reacting flow. Here, we extend ODT to particle transport in reacting flow. The results of particle transport in three flow configurations are presented: channel flow, homogeneous isotropic turbulence, and jet flames. We investigate the functional dependence of the statistics of particle-flow interactions, including (1) a parametric study with varying temperatures, Reynolds numbers, and particle Stokes numbers; (2) particle temperature histories and PDFs; and (3) time scales and the sensitivity to initial and boundary conditions. Flow statistics are compared to both experimental measurements and DNS data.
ERIC Educational Resources Information Center
Schneider, William R.
2011-01-01
The purpose of this study was to determine the relationship between statistics self-efficacy, statistics anxiety, and performance in introductory graduate statistics courses. The study design compared two statistics self-efficacy measures developed by Finney and Schraw (2003), a statistics anxiety measure developed by Cruise and Wilkins (1980),…
NASA Astrophysics Data System (ADS)
Sundberg, R.; Moberg, A.; Hind, A.
2012-08-01
A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.
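A schematic, much-simplified version of the two tests described above, with assumed functional forms rather than the authors' exact statistics: (1) a temporal correlation test between a proxy series and a forced simulation, and (2) a check of whether the forced simulation is closer to the proxy, in mean squared distance, than members of an unforced ensemble.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
t = np.arange(1000)
forcing = np.sin(2 * np.pi * t / 200.0)               # shared forced component
proxy = forcing + rng.normal(0, 1.0, t.size)          # noisy proxy record
forced_sim = forcing + rng.normal(0, 0.5, t.size)     # simulation with forcing
unforced = rng.normal(0, 1.1, (100, t.size))          # unforced ensemble

r, p = stats.pearsonr(proxy, forced_sim)              # test 1: shared variability
d_forced = np.mean((proxy - forced_sim) ** 2)         # test 2: distance ranking
d_unforced = np.mean((proxy - unforced) ** 2, axis=1)
frac_closer = np.mean(d_forced < d_unforced)
print(f"r = {r:.2f} (p = {p:.1e}); forced run closer than "
      f"{frac_closer:.0%} of unforced runs")
```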
Three-dimensional accuracy of different correction methods for cast implant bars
Kwon, Ji-Yung; Kim, Chang-Whe; Lim, Young-Jun; Kwon, Ho-Beom
2014-01-01
PURPOSE The aim of the present study was to evaluate the accuracy of three techniques for correction of cast implant bars. MATERIALS AND METHODS Thirty cast implant bars were fabricated on a metal master model. All cast implant bars were sectioned at 5 mm from the left gold cylinder using a disk of 0.3 mm thickness, and then each group of ten specimens was corrected by gas-air torch soldering, laser welding, or an additional casting technique. Three-dimensional evaluation, including horizontal, vertical, and twisting measurements, was based on measurement and comparison of (1) gap distances of the right abutment replica-gold cylinder interface at the buccal, distal, and lingual sides, (2) changes in bar length, and (3) axis angle changes of the right gold cylinders, at the post-correction measurement step for the three groups, with contact and non-contact coordinate measuring machines. One-way analysis of variance (ANOVA) and paired t-tests were performed at the significance level of 5%. RESULTS Gap distances of the cast implant bars after the correction procedure showed no statistically significant difference among groups. Changes in bar length between the pre-casting and post-correction measurements were statistically significant among groups. Axis angle changes of the right gold cylinders were not statistically significant among groups. CONCLUSION There was no statistically significant difference among the three techniques in horizontal, vertical, and axial errors, but the gas-air torch soldering technique showed the most consistent and accurate trend in the correction of implant bar error. The laser welding technique, however, showed a large mean and standard deviation in the vertical and twisting measurements and might be a technique-sensitive method. PMID:24605205
Hansell, P S; Hughes, C B; Caliandro, G; Russo, P; Budin, W C; Hartman, B; Hernandez, O C
1998-01-01
Caring for the human immunodeficiency virus (HIV)-infected child is challenging and affects the entire family system. Studies have shown that social support can mitigate caregiver stress and enhance coping; however, social support may not always result in a positive outcome for the recipient. To measure caregiver stress, coping, and social support, and to test the effect of a social support boosting intervention on levels of stress, coping, and social support among caregivers of children with HIV/acquired immune deficiency syndrome (AIDS). An experimental design was used with monthly social support boosting interventions implemented. The stratified randomized sample included 70 primary caregivers of children with HIV/AIDS. The sample strata were seropositive caregivers (biological parents) and seronegative caregivers (foster parents and extended family members). Study measures included the Derogatis Stress Profile, Family Crisis Oriented Personal Evaluation Scale, and the Tilden Interpersonal Relationship Inventory. Data were analyzed using descriptive statistics and repeated measures MANOVA. Statistically significant differences between the experimental and control groups were found on changes in the dependent variables over time when caregiver strata were included as a factor in the analysis; no statistically significant results were found when caregiver strata were combined. Univariate F tests indicated that the level of social support for caregivers who were seronegative in the experimental group was significantly different from that of seronegative caregivers in the control group and seropositive caregivers in both groups. No significant treatment group differences were found for seropositive caregivers. Seronegative caregivers derived substantial benefit from the social support boosting intervention. Seronegative caregivers who acquire a child with HIV/AIDS are confronted with a complex stressful situation; the critical need to enhance their social support is achievable through the intervention tested in this study.
Power, S; Mirza, M; Thakorlal, A; Ganai, B; Gavagan, L D; Given, M F; Lee, M J
2015-06-01
This prospective pilot study was undertaken to evaluate the feasibility and effectiveness of using a radiation absorbing shield to reduce operator dose from scatter during lower limb endovascular procedures. A commercially available bismuth shield system (RADPAD) was used. Sixty consecutive patients undergoing lower limb angioplasty were included. Thirty procedures were performed without the RADPAD (control group) and thirty with the RADPAD (study group). Two separate methods were used to measure dose to a single operator. Thermoluminescent dosimeter (TLD) badges were used to measure hand, eye, and unshielded body dose. A direct dosimeter with digital readout was also used to measure eye and unshielded body dose. To allow for variation between control and study groups, dose per unit time was calculated. TLD results demonstrated a significant reduction in median body dose per unit time for the study group compared with controls (p = 0.001), corresponding to a mean dose reduction rate of 65 %. Median eye and hand dose per unit time were also reduced in the study group compared with control group, however, this was not statistically significant (p = 0.081 for eye, p = 0.628 for hand). Direct dosimeter readings also showed statistically significant reduction in median unshielded body dose rate for the study group compared with controls (p = 0.037). Eye dose rate was reduced for the study group but this was not statistically significant (p = 0.142). Initial results are encouraging. Use of the shield resulted in a statistically significant reduction in unshielded dose to the operator's body. Measured dose to the eye and hand of operator were also reduced but did not reach statistical significance in this pilot study.
Filter Tuning Using the Chi-Squared Statistic
NASA Technical Reports Server (NTRS)
Lilly-Salkowski, Tyler
2017-01-01
The Goddard Space Flight Center (GSFC) Flight Dynamics Facility (FDF) performs orbit determination (OD) for the Aqua and Aura satellites. Both satellites are located in low Earth orbit (LEO), and are part of what is considered the A-Train satellite constellation. Both spacecraft are currently in the science phase of their respective missions. The FDF has recently been tasked with delivering definitive covariance for each satellite. The primary tool used for orbit determination on these missions is the Orbit Determination Toolkit developed by Analytical Graphics Inc. (AGI). This software uses an Extended Kalman Filter (EKF) to estimate the states of both spacecraft. The filter incorporates force modelling, ground station and space network measurements to determine spacecraft states. It also generates a covariance at each measurement. This covariance can be useful for evaluating the overall performance of the tracking data measurements and the filter itself. An accurate covariance is also useful for covariance propagation, which is utilized in collision avoidance operations. It is also valuable when attempting to determine if the current orbital solution will meet mission requirements in the future. This paper examines the use of the Chi-square statistic as a means of evaluating filter performance. The Chi-square statistic is calculated to determine the realism of a covariance based on the prediction accuracy and the covariance values at a given point in time. Once calculated, it is the distribution of this statistic that provides insight on the accuracy of the covariance. For the EKF to correctly calculate the covariance, error models associated with tracking data measurements must be accurately tuned. Overestimating or underestimating these error values can have detrimental effects on the overall filter performance. The filter incorporates ground station measurements, which can be tuned based on the accuracy of the individual ground stations. It also includes measurements from the NASA space network (SN), which can be affected by the assumed accuracy of the TDRS satellite state at the time of the measurement. The force modelling in the EKF is also an important factor that affects the propagation accuracy and covariance sizing. The dominant perturbation in the LEO orbit regime is atmospheric drag. Accurate accounting of the drag force is especially important for the accuracy of the propagated state. The implementation of a box-and-wing model to improve drag estimation accuracy, and its overall effect on the covariance, is explored. The process of tuning the EKF for Aqua and Aura support is described, including examination of the measurement errors of available observation types (Doppler and range), and methods of dealing with potentially volatile atmospheric drag modeling. Predictive accuracy and the distribution of the Chi-square statistic, calculated based on the ODTK EKF solutions, are assessed versus accepted norms for the orbit regime.
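A hedged sketch (not the ODTK implementation) of the covariance-realism idea described above: if a filter reports covariance P for a state error e, the scalar e^T P^{-1} e should follow a chi-square distribution with n degrees of freedom; an undersized or oversized P shifts the empirical distribution away from that expectation. All dimensions and values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3                                              # e.g. position error components
P = np.diag([25.0, 16.0, 9.0])                     # filter-reported covariance (m^2)

def chi2_samples(true_cov_scale, samples=5000):
    """Quadratic form e' P^{-1} e for errors drawn from a scaled true covariance."""
    errs = rng.multivariate_normal(np.zeros(n), true_cov_scale * P, samples)
    return np.einsum('ij,jk,ik->i', errs, np.linalg.inv(P), errs)

for scale, label in [(1.0, "realistic P"), (4.0, "P undersized"), (0.25, "P oversized")]:
    q = chi2_samples(scale)
    # the chi-square expectation is E[q] = n when P matches the true errors
    print(f"{label:12s}: mean chi-square = {q.mean():5.2f} (expect {n} if realistic)")
```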
Statistical Analysis of speckle noise reduction techniques for echocardiographic Images
NASA Astrophysics Data System (ADS)
Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar
2011-12-01
Echocardiography is a safe, easy and fast technology for diagnosing cardiac diseases. As in other ultrasound images, these images also contain speckle noise. In some cases this speckle noise is useful, such as in motion detection, but in general noise removal is required for better analysis of the image and proper diagnosis. Different adaptive and anisotropic filters are included for statistical analysis. Statistical parameters such as signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and root mean square error (RMSE) are calculated for performance measurement. One further consideration is that blurring may occur during speckle noise removal, so the filter should preferably be able to enhance edges while removing noise.
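For reference, the sketch below computes the three performance metrics named above (SNR, PSNR, and RMSE) for a noisy or despeckled image against a reference image, using the usual definitions; the 8-bit peak value and the multiplicative gamma speckle model are assumptions for illustration.

```python
import numpy as np

def rmse(ref, img):
    return np.sqrt(np.mean((ref.astype(float) - img.astype(float)) ** 2))

def snr_db(ref, img):
    noise = ref.astype(float) - img.astype(float)
    return 10 * np.log10(np.sum(ref.astype(float) ** 2) / np.sum(noise ** 2))

def psnr_db(ref, img, peak=255.0):
    return 20 * np.log10(peak / rmse(ref, img))

rng = np.random.default_rng(6)
ref = rng.uniform(50, 200, (128, 128))
# multiplicative speckle with unit mean (gamma shape 4, scale 0.25)
speckled = ref * rng.gamma(shape=4, scale=0.25, size=ref.shape)
print(f"RMSE={rmse(ref, speckled):.1f}, SNR={snr_db(ref, speckled):.1f} dB, "
      f"PSNR={psnr_db(ref, speckled):.1f} dB")
```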
Negative values of quasidistributions and quantum wave and number statistics
NASA Astrophysics Data System (ADS)
Peřina, J.; Křepelka, J.
2018-04-01
We consider nonclassical wave and number quantum statistics, and perform a decomposition of quasidistributions for nonlinear optical down-conversion processes using Bessel functions. We show that negative values of the quasidistribution do not directly represent probabilities; however, they directly influence measurable number statistics. Negative terms in the decomposition related to the nonclassical behavior with negative amplitudes of probability can be interpreted as positive amplitudes of probability in the negative orthogonal Bessel basis, whereas positive amplitudes of probability in the positive basis describe classical cases. However, probabilities are positive in all cases, including negative values of quasidistributions. Negative and positive contributions of decompositions to quasidistributions are estimated. The approach can be adapted to quantum coherence functions.
Hydroxyurea (hydroxycarbamide) for sickle cell disease.
Nevitt, Sarah J; Jones, Ashley P; Howard, Jo
2017-04-20
Sickle cell disease (SCD) is one of the most common inherited diseases worldwide. It is associated with lifelong morbidity and a reduced life expectancy. Hydroxyurea (hydroxycarbamide), an oral chemotherapeutic drug, ameliorates some of the clinical problems of SCD, in particular that of pain, by raising fetal haemoglobin. This is an update of a previously published Cochrane Review. To assess the effects of hydroxyurea therapy in people with SCD (all genotypes), of any age, regardless of setting. We searched the Cochrane Cystic Fibrosis and Genetic Disorders Group Haemoglobinopathies Register, comprising references identified from comprehensive electronic database searches and handsearches of relevant journals and abstract books of conference proceedings. We also searched online trial registries. Date of the most recent search: 16 January 2017. Randomised and quasi-randomised controlled trials, of one month or longer, comparing hydroxyurea with placebo, standard therapy or other interventions for people with SCD. Authors independently assessed studies for inclusion, carried out data extraction and assessed the risk of bias. Seventeen studies were identified in the searches; eight randomised controlled trials were included, recruiting 899 adults and children with SCD (haemoglobin SS (HbSS), haemoglobin SC (HbSC) or haemoglobin Sβ0 thalassaemia (HbSβ0thal) genotypes). Studies lasted from six to 30 months. Four studies (577 adults and children with HbSS or HbSβ0thal) compared hydroxyurea to placebo; three recruited individuals with only severe disease and one recruited individuals with all disease severities. There were statistically significant improvements in terms of pain alteration (using measures such as pain crisis frequency, duration, intensity, hospital admissions and opioid use), measures of fetal haemoglobin and neutrophil counts and fewer occurrences of acute chest syndrome and blood transfusions in the hydroxyurea groups. There were no consistent statistically significant differences in terms of quality of life and adverse events (including serious or life-threatening events). Seven deaths occurred during the studies, but the rates by treatment group were not statistically significantly different. Two studies (254 children with HbSS or HbSβ0thal also with risk of primary or secondary stroke) compared hydroxyurea and phlebotomy to transfusion and chelation; there were statistically significant improvements in terms of measures of fetal haemoglobin and neutrophil counts, but more occurrences of acute chest syndrome and infections in the hydroxyurea and phlebotomy group. There were no consistent statistically significant differences in terms of pain alteration and adverse events (including serious or life-threatening events). Two deaths occurred during the studies (one in the hydroxyurea treatment arm and one in the control arm), but the rates by treatment group were not statistically significantly different.
In the primary prevention study, no strokes occurred in either treatment group, but in the secondary prevention study, seven strokes occurred in the hydroxyurea and phlebotomy group (none in the transfusion and chelation group) and the study was terminated early. The quality of the evidence for the above two comparisons was judged as moderate to low: the studies contributing to these comparisons were mostly large and well designed (and at low risk of bias); however, evidence was limited and imprecise for some outcomes, such as quality of life, deaths during the studies and adverse events, and results are applicable only to individuals with HbSS and HbSβ0thal genotypes. Of the remaining two studies, one (22 children with HbSS or HbSβ0thal also at risk of stroke) compared hydroxyurea to observation; there were statistically significant improvements in terms of measures of fetal haemoglobin and neutrophil counts but no statistically significant differences in terms of adverse events (including serious or life-threatening events). The final study (44 adults and children with HbSC) compared treatment regimens with and without hydroxyurea; there was a statistically significant improvement in terms of measures of fetal haemoglobin, but no statistically significant differences in terms of adverse events (including serious or life-threatening events). No participants died in either of these studies and other outcomes relevant to the review were not reported. The quality of the evidence for the above two comparisons was judged to be very low due to the limited number of participants, the lack of statistical power (as both studies were terminated early with approximately only 20% of their target sample size recruited) and the lack of applicability to all age groups and genotypes. There is evidence to suggest that hydroxyurea is effective in decreasing the frequency of pain episodes and other acute complications in adults and children with sickle cell anaemia of HbSS or HbSβ0thal genotypes and in preventing life-threatening neurological events in those with sickle cell anaemia at risk of primary stroke by maintaining transcranial Doppler velocities. However, there is still insufficient evidence on the long-term benefits of hydroxyurea, particularly in preventing chronic complications of SCD, or recommending a standard dose or dose escalation to maximum tolerated dose. There is also insufficient evidence about the long-term risks of hydroxyurea, including its effects on fertility and reproduction. Evidence is also limited on the effects of hydroxyurea on individuals with HbSC genotype. Future studies should be designed to address such uncertainties.
Signal Processing Methods for Liquid Rocket Engine Combustion Stability Assessments
NASA Technical Reports Server (NTRS)
Kenny, R. Jeremy; Lee, Erik; Hulka, James R.; Casiano, Matthew
2011-01-01
The J2X Gas Generator engine design specifications include dynamic, spontaneous, and broadband combustion stability requirements. These requirements are verified empirically based on high-frequency chamber pressure measurements and analyses. Dynamic stability is determined from the dynamic pressure response due to an artificial perturbation of the combustion chamber pressure (bomb testing), and spontaneous and broadband stability are determined from the dynamic pressure responses during steady operation starting at specified power levels. J2X Workhorse Gas Generator testing included bomb tests with multiple hardware configurations and operating conditions, including a configuration used explicitly for the engine verification test series. This work covers signal processing techniques developed at Marshall Space Flight Center (MSFC) to help assess engine design stability requirements. Dynamic stability assessments were performed following both the CPIA 655 guidelines and an in-house, statistics-based approach developed at MSFC. The statistical approach was developed to better verify when the dynamic pressure amplitudes corresponding to a particular frequency returned to pre-bomb characteristics. This was accomplished by first determining the statistical characteristics of the pre-bomb dynamic levels. The pre-bomb statistical characterization provided 95% coverage bounds; these bounds were used as a quantitative measure to determine when the post-bomb signal returned to pre-bomb conditions. The time for post-bomb levels to acceptably return to pre-bomb levels was compared to the dominant frequency-dependent time recommended by CPIA 655. Results for multiple test configurations, including stable and unstable configurations, were reviewed. Spontaneous stability was assessed using two processes: 1) characterization of the ratio of the peak response amplitudes to the excited chamber acoustic mode amplitudes and 2) characterization of the variability of the peak response's frequency over the test duration. This characterization process assists in evaluating the discreteness of a signal as well as the stability of the chamber response. Broadband stability was assessed using a running root-mean-square evaluation. These techniques were also employed, in a comparative analysis, on available Fastrac data, and those results are presented here.
A new statistical method for characterizing the atmospheres of extrasolar planets
NASA Astrophysics Data System (ADS)
Henderson, Cassandra S.; Skemer, Andrew J.; Morley, Caroline V.; Fortney, Jonathan J.
2017-10-01
By detecting light from extrasolar planets, we can measure their compositions and bulk physical properties. The technologies used to make these measurements are still in their infancy, and a lack of self-consistency suggests that previous observations have underestimated their systematic errors. We demonstrate a statistical method, newly applied to exoplanet characterization, which uses a Bayesian formalism to account for underestimated error bars. We use this method to compare photometry of a substellar companion, GJ 758b, with custom atmospheric models. Our method produces a probability distribution of atmospheric model parameters, including temperature, gravity, cloud model (f_sed) and chemical abundance, for GJ 758b. This distribution is less sensitive to highly variant data and appropriately reflects a greater uncertainty on parameter fits.
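A schematic version of the error-bar inflation idea, with an assumed Gaussian likelihood rather than the authors' exact formalism: every reported uncertainty is scaled by a common free factor b that is inferred along with the model parameters, so underestimated error bars are penalized rather than ignored. The toy residuals below are invented.

```python
import numpy as np

def log_likelihood(residuals, sigma, log_b):
    """Gaussian log-likelihood with all error bars inflated by b = exp(log_b)."""
    s2 = (np.exp(log_b) * sigma) ** 2
    return -0.5 * np.sum(residuals ** 2 / s2 + np.log(2 * np.pi * s2))

# toy photometry: the true scatter is twice the reported error bars
rng = np.random.default_rng(7)
sigma = np.full(20, 0.05)                      # reported uncertainties
residuals = rng.normal(0, 0.10, 20)            # actual model-data residuals

grid = np.linspace(-1, 2, 301)                 # grid over log b
ll = [log_likelihood(residuals, sigma, g) for g in grid]
print("max-likelihood inflation factor b =", round(np.exp(grid[np.argmax(ll)]), 2))
```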
Report on the lunar ranging at McDonald Observatory, 1 February - 31 May 1976
NASA Technical Reports Server (NTRS)
Palm, C. S.; Wiant, J. R.
1976-01-01
The four spring lunations produced 105 acquisitions, including the 2000th range measurement made at McDonald Observatory. Statistics were normal for the spring months. Laser and electronics problems are noted. The Loran-C station delay was corrected. Preliminary doubles data is shown. New magnetic tape data formats are presented. R and D efforts include a new laser modification design.
Mathematics. Exceptional Child Education Curriculum K-12.
ERIC Educational Resources Information Center
Jordon, Thelma; And Others
The mathematics curriculum provides a framework of instruction for exceptional child education in grades K-12. Content areas include: numeration, whole numbers, rational numbers, real/complex numbers, calculator literacy, measurement, geometry, statistics, functions/relations, computer literacy, and pre-algebra. The guide is organized by content…
77 FR 33120 - Truth in Lending (Regulation Z)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-05
... FHFA's release of historical data on loan volumes and delinquency rates, including any tabulations or... with varying characteristics and to perform other statistical analyses that may assist the Bureau in... definitions of a ``qualified mortgage.'' For example, the Bureau is examining various measures of delinquency...
Feeny, Simon; Posso, Alberto; McDonald, Lachlan; Chuyen, Truong Thi Kim; Tung, Son Thanh
2018-01-01
A more holistic understanding of the benefits of sight-restoring cataract surgery requires a focus that goes beyond income and employment, to include a wider array of well-being measures. The objective of this study is to examine the monetary and non-monetary benefits of cataract surgery on both patients as well as their caregivers in Vietnam. Participants were randomly recruited from a Ho Chi Minh City hospital. A total of 82 cataract patients and 83 caregivers participated in the survey conducted for this study. Paired t-tests, Wilcoxon Signed Rank tests, and regression analysis were used to detect any statistically significant differences in various measures of well-being for patients and caregivers before and after surgery. There are statistically significant improvements in monetary and non-monetary measures of well-being for both patients and caregivers approximately three months after undergoing cataract surgery, compared with baseline assessments collected prior to surgery. Non-monetary measures of well-being include self-assessments of overall health, mental health, hope, self-efficacy, happiness and life satisfaction. For patients, the benefits included statistically significant improvements in earnings, mobility, self-care, the ability to undertake daily activities, self-assessed health and mental health, life satisfaction, hope, and self-efficacy (p<0.01). For caregivers, attendance at work improved alongside overall health, mental health, hope, self-efficacy, happiness and life satisfaction, three months post-surgery (p<0.01). Restoring sight has positive impacts for those suffering from cataracts and their caregivers. Sometimes the benefits are almost equal in their magnitude. The study has also demonstrated that many of these impacts are non-monetary in nature. It is clear that estimates of the rate of return to restoring sight that focus only on financial gains will underestimate the true returns to society of restoring sight from cataract surgeries.
High-Throughput Nanoindentation for Statistical and Spatial Property Determination
NASA Astrophysics Data System (ADS)
Hintsala, Eric D.; Hangen, Ude; Stauffer, Douglas D.
2018-04-01
Standard nanoindentation tests are "high throughput" compared to nearly all other mechanical tests, such as tension or compression. However, the typical rates of tens of tests per hour can be significantly improved. These higher testing rates enable otherwise impractical studies requiring several thousands of indents, such as high-resolution property mapping and detailed statistical studies. Nevertheless, care must be taken to avoid systematic errors in the measurement, including choice of the indentation depth/spacing to avoid overlap of plastic zones, pileup, and influence of neighboring microstructural features in the material being tested. Furthermore, since fast loading rates are required, the strain rate sensitivity must also be considered. A review of these effects is given, with the emphasis placed on making complementary standard nanoindentation measurements to address these issues. Experimental applications of the technique, including mapping of welds, microstructures, and composites with varying length scales, along with studying the effect of surface roughness on nominally homogeneous specimens, will be presented.
Smith, Paul F.
2017-01-01
Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins. PMID:29371855
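As a brief companion to the issues listed above, the sketch below (on invented data) runs two of the named diagnostics, Shapiro-Wilk for non-normality and Levene's test for unequal variances, and then applies tests that relax those assumptions. The sample sizes and distributions are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
control = rng.normal(1.0, 1.0, 12)             # small samples, as is common
treated = rng.lognormal(0.5, 0.8, 12)          # skewed, unequal variance

for name, x in [("control", control), ("treated", treated)]:
    w, p = stats.shapiro(x)                    # normality check
    print(f"Shapiro-Wilk {name}: W={w:.2f}, p={p:.3f}")
print("Levene:", stats.levene(control, treated))  # equality-of-variance check

# Welch's t-test does not assume equal variances; Mann-Whitney U does not
# assume normality. Effect sizes should be reported alongside either.
print("Welch t:", stats.ttest_ind(control, treated, equal_var=False))
print("Mann-Whitney:", stats.mannwhitneyu(control, treated))
```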
Fluorescent-Antibody Measurement Of Cancer-Cell Urokinase
NASA Technical Reports Server (NTRS)
Morrison, Dennis R.
1993-01-01
Combination of laboratory techniques provides measurements of amounts of urokinase in and between normal and cancer cells. Includes use of fluorescent antibodies specific against different forms of urokinase-type plasminogen activator, (uPA), fluorescence microscopy, quantitative analysis of images of sections of tumor tissue, and flow cytometry of different uPA's and deoxyribonucleic acid (DNA) found in suspended-tumor-cell preparations. Measurements provide statistical method for indicating or predicting metastatic potentials of some invasive tumors. Assessments of metastatic potentials based on such measurements used in determining appropriate follow-up procedures after surgical removal of tumors.
Cole, T J
2006-12-01
This article discusses statistical considerations for the design of a new study intended to provide an International Growth Standard for Preadolescent and Adolescent Children, including issues such as cross-sectional, longitudinal, and mixed designs; sample-size derivation for the number of populations and number of children per population; modeling of growth centiles of height, weight, and other measurements; and modeling of the adolescent growth spurt. The conclusions are that a mixed longitudinal design will provide information on both growth distance and velocity; samples of children from 5 to 10 sites should be suitable for an international standard (based on political rather than statistical arguments); the samples should be broadly uniform across age but oversampled during puberty, and should include data into adulthood. The LMS method is recommended for constructing measurement centiles, and parametric or semiparametric approaches are available to estimate the timing of the adolescent growth spurt in individuals. If the new standard is to be grafted onto the 2006 World Health Organization (WHO) reference, caution is needed at the join point of 5 years, where children from the new standard are likely to be appreciably more obese than those from the WHO reference, due to the rising trends in obesity and the time gap in data collection between the two surveys.
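For concreteness, the sketch below applies the z-score transformation of the LMS method recommended above: at each age the measurement distribution is summarized by a Box-Cox power (L), median (M), and coefficient of variation (S), and a measurement y converts to z = ((y/M)^L - 1)/(L*S), with the log form when L = 0. The parameter values are invented for illustration, not reference values.

```python
import numpy as np

def lms_zscore(y, L, M, S):
    """Box-Cox z-score: ((y/M)**L - 1) / (L*S), or log(y/M)/S when L == 0."""
    if L == 0:
        return np.log(y / M) / S
    return ((y / M) ** L - 1.0) / (L * S)

# hypothetical height-for-age parameters at a single age
L, M, S = 1.0, 140.0, 0.045
for height in (128.0, 140.0, 153.0):
    print(f"height {height} cm -> z = {lms_zscore(height, L, M, S):+.2f}")
```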
NASA Technical Reports Server (NTRS)
Coffin, T.
1986-01-01
A dynamic pressure data base and data base management system developed to characterize the Space Shuttle Main Engine (SSME) dynamic pressure environment is presented. The data base represents dynamic pressure measurements obtained during single engine hot firing tests of the SSME. Software is provided to permit statistical evaluation of selected measurements under specified operating conditions. An interpolation scheme is also included to estimate spectral trends with SSME power level.
Zonal average earth radiation budget measurements from satellites for climate studies
NASA Technical Reports Server (NTRS)
Ellis, J. S.; Haar, T. H. V.
1976-01-01
Data from 29 months of satellite radiation budget measurements, taken intermittently over the period 1964 through 1971, are composited into mean month, season and annual zonally averaged meridional profiles. Individual months, which comprise the 29 month set, were selected as representing the best available total flux data for compositing into large scale statistics for climate studies. A discussion of spatial resolution of the measurements along with an error analysis, including both the uncertainty and standard error of the mean, are presented.
Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco
2015-02-01
Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative to measure surge capacity. Hypothesis/Problem: The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included number of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
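A hedged sketch of the benchmark comparison described above: a new group's median length-of-stay is checked against a band derived from many derivation-phase simulation runs. All numbers are hypothetical.

    import numpy as np

    # Median LOS (minutes) from each derivation-phase simulation run
    derivation_medians = np.array([42, 38, 55, 47, 51, 44, 60, 39, 49, 53,
                                   46, 41, 58, 45, 50])
    lower, upper = np.percentile(derivation_medians, [2.5, 97.5])

    new_group_median = 63.0                   # application-phase group
    inside = lower <= new_group_median <= upper
    print(f"benchmark band [{lower:.1f}, {upper:.1f}] min; within limits: {inside}")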
Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data
NASA Astrophysics Data System (ADS)
Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.
2017-09-01
The key microstructural features providing strength to age-hardenable Al alloys are nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with results from a conventional method based on manual counting and measurements. By virtue of its data-analysis objectivity, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.
NASA Astrophysics Data System (ADS)
Yoo, Donghoon; Lee, Joohyun; Lee, Byeongchan; Kwon, Suyong; Koo, Junemo
2018-02-01
The Transient Hot-Wire Method (THWM) was developed to measure the absolute thermal conductivity of gases, liquids, melts, and solids with low uncertainty. The majority of nanofluid researchers have used THWM to measure the thermal conductivity of test fluids. Several reasons have been suggested for the discrepancies in these types of measurements, including nanofluid generation, nanofluid stability, and measurement challenges. The details of the transient hot-wire method, such as the test cell size, the temperature coefficient of resistance (TCR), and the sampling number, are further investigated to improve the accuracy and consistency of measurements made by different researchers. It was observed that smaller test apparatuses were better because they can delay the onset of natural convection. TCR values of a coated platinum wire were measured and statistically analyzed to reduce the uncertainty in thermal conductivity measurements. For validation, the thermal conductivities of ethylene glycol (EG) and water were measured and analyzed in the temperature range between 280 and 310 K. Furthermore, a detailed statistical analysis was conducted for such measurements, and the results confirmed the minimum number of samples required to achieve the desired resolution and precision of the measurements. It is further proposed that researchers fully report the information related to their measurements to validate the measurements and to avoid future inconsistent nanofluid data.
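For orientation, the standard transient hot-wire working equation gives a wire temperature rise that is linear in ln(t), with thermal conductivity k = q/(4π × slope) for heat input q per unit wire length. A minimal sketch with synthetic data (not the authors' apparatus or values):

    import numpy as np

    q = 0.7                                   # W/m, assumed heat input per unit length
    k_true = 0.60                             # W/(m K), water-like test value
    rng = np.random.default_rng(1)
    t = np.linspace(0.1, 1.0, 50)             # s, within the linear ln(t) regime
    dT = q / (4 * np.pi * k_true) * np.log(t) + 0.05 + rng.normal(0, 1e-3, t.size)

    slope, _ = np.polyfit(np.log(t), dT, 1)   # fit dT against ln(t)
    print(f"estimated k = {q / (4 * np.pi * slope):.3f} W/(m K)")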
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; and generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another.
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
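As one concrete instance of the simpler "exploratory" end of the hierarchy discussed above, a power-law learning curve, time_i = a * i^(-b), can be fitted to a single operator's consecutive operation times by linear regression on the log-log scale. The case data below are synthetic:

    import numpy as np

    rng = np.random.default_rng(2)
    i = np.arange(1, 61)                                         # case number
    times = 120 * i ** (-0.15) * rng.lognormal(0, 0.08, i.size)  # minutes

    slope, log_a = np.polyfit(np.log(i), np.log(times), 1)
    print(f"learning exponent = {-slope:.3f}, initial time ~ {np.exp(log_a):.0f} min")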
Sexual Assault Prevention and Response Climate DEOCS 4.1 Construct Validity Summary
2017-08-01
DEOCS, (7) examining variance and descriptive statistics, (8) examining the relationship among items/areas to reduce multicollinearity, and (9) selecting items that demonstrate the strongest scale properties. Included is a review of the 4.0 description and items, followed by the proposed ... Tables 1-7 for the description of each measure and corresponding items. Table 1. DEOCS 4.0 Perceptions of Safety Measure Description
Choi, Mankyu; Lee, Keon-Hyung
2008-01-01
In this study, the determinants of hospital profitability were evaluated using a sample of 142 hospitals that had undergone hospital standardization inspections by the South Korea Hospital Association over the 4-year period from 1998 to 2001. The measures of profitability used as dependent variables in this study were pretax return on assets, after-tax return on assets, basic earning power, pretax operating margin, and after-tax operating margin. Among those determinants, it was found that ownership type, teaching status, inventory turnover, and the average charge per adjusted inpatient day positively and statistically significantly affected all 5 of these profitability measures. However, the labor expenses per adjusted inpatient day and administrative expenses per adjusted inpatient day negatively and statistically significantly affected all 5 profitability measures. The debt ratio negatively and statistically significantly affected all 5 profitability measures, with the exception of basic earning power. None of the market factors assessed were shown to significantly affect profitability. In conclusion, the results of this study suggest that the profitability of hospitals can be improved despite deteriorating external environmental conditions by facilitating the formation of sound financial structures with optimal capital supplies, optimizing the management of total assets with special emphasis placed on inventory management, and introducing efficient control of fixed costs including labor and administrative expenses.
Song, Fujian; Xiong, Tengbin; Parekh-Bhurke, Sheetal; Loke, Yoon K; Sutton, Alex J; Eastwood, Alison J; Holland, Richard; Chen, Yen-Fu; Glenny, Anne-Marie; Deeks, Jonathan J; Altman, Doug G
2011-08-16
To investigate the agreement between direct and indirect comparisons of competing healthcare interventions. Meta-epidemiological study based on a sample of meta-analyses of randomised controlled trials. Data sources: Cochrane Database of Systematic Reviews and PubMed. Inclusion criteria: systematic reviews that provided sufficient data for both direct comparison and independent indirect comparisons of two interventions on the basis of a common comparator and in which the odds ratio could be used as the outcome statistic. Inconsistency was measured by the difference in the log odds ratio between the direct and indirect methods. The study included 112 independent trial networks (including 1552 trials with 478,775 patients in total) that allowed both direct and indirect comparison of two interventions. Indirect comparison had already been explicitly done in only 13 of the 85 Cochrane reviews included. The inconsistency between the direct and indirect comparison was statistically significant in 16 cases (14%, 95% confidence interval 9% to 22%). The statistically significant inconsistency was associated with fewer trials, subjectively assessed outcomes, and statistically significant effects of treatment in either direct or indirect comparisons. Owing to considerable inconsistency, many (14/39) of the statistically significant effects by direct comparison became non-significant when the direct and indirect estimates were combined. Significant inconsistency between direct and indirect comparisons may be more prevalent than previously observed. Direct and indirect estimates should be combined in mixed treatment comparisons only after adequate assessment of the consistency of the evidence.
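The adjusted indirect comparison underlying the study's inconsistency measure can be sketched as follows: with a common comparator C, the indirect log odds ratio for A versus B is logOR(AC) - logOR(BC), with variances adding, and inconsistency is the difference between the direct and indirect estimates. All inputs here are hypothetical:

    import numpy as np
    from scipy.stats import norm

    log_or_ac, se_ac = 0.40, 0.15        # A vs C
    log_or_bc, se_bc = 0.10, 0.18        # B vs C
    log_or_dir, se_dir = 0.22, 0.20      # direct A vs B

    log_or_ind = log_or_ac - log_or_bc   # indirect A vs B
    se_ind = np.sqrt(se_ac**2 + se_bc**2)

    diff = log_or_dir - log_or_ind       # inconsistency
    z = diff / np.sqrt(se_dir**2 + se_ind**2)
    print(f"inconsistency = {diff:.3f}, z = {z:.2f}, p = {2 * norm.sf(abs(z)):.3f}")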
The slip resistance of common footwear materials measured with two slipmeters.
Chang, W R; Matz, S
2001-12-01
The slip resistance of 16 commonly used footwear materials was measured with the Brungraber Mark II and the English XL on 3 floor surfaces under surface conditions of dry, wet, oily and oily wet. Three samples were used for each material combination and surface condition. The results of a one-way ANOVA indicated that the differences among different samples were statistically significant for a large number of material combinations and surface conditions. The results indicated that the ranking of materials based on their slip resistance values depends strongly on the slipmeters, floor surfaces and surface conditions. For contaminated surfaces including wet, oily and oily wet surfaces, the slip resistance obtained with the English XL was usually higher than that measured with the Brungraber Mark II. The correlation coefficients between the slip resistance obtained with these two slipmeters calculated for different surface conditions indicated a strong correlation with statistical significance.
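A quick sketch of the two analyses named above, with invented readings: a one-way ANOVA across the three samples of one material and surface combination, and the correlation between readings from the two slipmeters.

    import numpy as np
    from scipy import stats

    # Three samples of one material/surface combination
    s1, s2, s3 = [0.31, 0.33, 0.30], [0.35, 0.36, 0.37], [0.29, 0.28, 0.30]
    print(stats.f_oneway(s1, s2, s3))

    # Paired slip-resistance readings across conditions, one per slipmeter
    mark_ii = np.array([0.25, 0.31, 0.42, 0.18, 0.55, 0.38])
    english_xl = np.array([0.29, 0.36, 0.47, 0.22, 0.61, 0.41])
    r, p = stats.pearsonr(mark_ii, english_xl)
    print(f"r = {r:.2f}, p = {p:.4f}")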
Use of comorbidity measures to predict the risk of death in Brazilian in-patients.
Martins, Monica
2010-06-01
To assess the use of comorbidity measures to predict the risk of death in Brazilian in-patients. Data from the Sistema de Informações Hospitalares do Sistema Unico de Saúde (Unified Health System Hospital Information System) were used, which enables only one secondary diagnosis to be recorded. A total of 1,607,697 hospitalizations were selected, all of which occurred in Brazil, between 2003 and 2004, and whose main diagnoses were: ischemic heart disease, congestive cardiac failure, stroke and pneumonia. Charlson Index and Elixhauser comorbidities were the comorbidity measures used. In addition, the simple record of a certain secondary diagnosis was also used. Logistic regression was applied to assess the impact of comorbidity measures on the estimate of risk of death. The baseline model included the following variables: age, sex and main diagnosis. Models to predict death were assessed, based on C-statistic and Hosmer-Lemeshow test. Hospital mortality rate was 10.4% and mean length of stay was 5.7 days. The majority (52%) of hospitalizations occurred among men and mean age was 62.6 years. Of all hospitalizations, 5.4% included a recorded secondary diagnosis, although the odds ratio between death and presence of comorbidity was 1.93. The baseline model showed a discriminatory capacity (C-statistic) of 0.685. The improvement in the models, attributed to the introduction of comorbidity indices, was poor, equivalent to zero when C-statistic with only two digits was considered. Although the introduction of three comorbidity measures in distinct models to predict death improved the predictive capacity of the baseline model, the values obtained are still considered insufficient. The accuracy of this type of measure is influenced by the completeness of the source of information. In this sense, high underreporting of secondary diagnosis, in addition to the well-known lack of space to note down this type of information in the Sistema de Informações Hospitalares, are the main explanatory factors for the results found.
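A small sketch of the modelling step described above: fit a logistic model for in-hospital death and report its C-statistic (area under the ROC curve). The data are simulated to echo the reported rates; the real study used age, sex, main diagnosis, and the comorbidity measures.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    n = 5000
    age = rng.normal(62, 15, n)
    comorbidity = rng.binomial(1, 0.054, n)          # ~5.4% with a recorded diagnosis
    logit = -2.15 + 0.03 * (age - 62) + np.log(1.93) * comorbidity
    death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = np.column_stack([age, comorbidity])
    model = LogisticRegression(max_iter=1000).fit(X, death)
    auc = roc_auc_score(death, model.predict_proba(X)[:, 1])
    print(f"C-statistic = {auc:.3f}")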
Hammond, Flora M; Sherer, Mark; Malec, James F; Zafonte, Ross D; Dikmen, Sureyya; Bogner, Jennifer; Bell, Kathleen R; Barber, Jason; Temkin, Nancy
2018-06-07
Despite limited evidence to support the use of amantadine to enhance cognitive function after traumatic brain injury (TBI), the clinical use for this purpose is highly prevalent and is often based on inferred belief systems. The aim of this study was to assess the effect of amantadine on cognition among individuals with a history of TBI and behavioral disturbance using a parallel-group, randomized, double-blind, placebo-controlled trial of amantadine 100 mg twice daily versus placebo for 60 days. Included in the study were 119 individuals with two or more neuropsychological measures greater than 1 standard deviation below normative means from a larger study of 168 individuals with chronic TBI (>6 months post-injury) and irritability. Cognitive function was measured at treatment days 0, 28, and 60 with a battery of neuropsychological tests. Composite indices were generated: a General Cognitive Index (including all measures), a Learning Memory Index (learning/memory measures), and an Attention/Processing Speed Index (attention and executive function measures). Repeated-measures analysis of variance revealed statistically significant between-group differences favoring the placebo group at day 28 for General Cognitive Index (p = 0.002) and Learning Memory Index (p = 0.001), but not Attention/Processing Speed Index (p = 0.25), whereas no statistically significant between-group differences were found at day 60. There were no statistically significant between-group differences on adverse events. Cognitive function in individuals with chronic TBI is not improved by amantadine 100 mg twice daily. In the first 28 days of use, amantadine may impede cognitive processing. However, the effect size was small and mean scores for both groups were generally within expectations for persons with history of complicated mild-to-severe TBI, suggesting that changes observed across assessments may not have functional significance. The use of amantadine to enhance cognitive function is not supported by these findings.
Cro, Suzie; Mehta, Saahil; Farhadi, Jian; Coomber, Billie; Cornelius, Victoria
2018-01-01
Essential strategies are needed to help reduce the number of post-operative complications and associated costs for breast cancer patients undergoing reconstructive breast surgery. Evidence suggests that local heat preconditioning could help improve the provision of this procedure by reducing skin necrosis. Before testing the effectiveness of heat preconditioning in a definitive randomised controlled trial (RCT), we must first establish the best way to measure skin necrosis and estimate the event rate using this definition. PREHEAT is a single-blind randomised controlled feasibility trial comparing local heat preconditioning, using a hot water bottle, against standard care for skin necrosis among breast cancer patients undergoing reconstructive breast surgery. The primary objective of this study is to determine the best way to measure skin necrosis and to estimate the event rate using this definition in each trial arm. Secondary feasibility objectives include estimating recruitment and 30-day follow-up retention rates, levels of compliance with the heating protocol, length of stay in hospital and the rates of surgical versus conservative management of skin necrosis. The information from these objectives will inform the design of a larger definitive effectiveness and cost-effectiveness RCT. This article describes the PREHEAT trial protocol and detailed statistical analysis plan, which includes the pre-specified criteria and process for establishing the best way to measure necrosis. This study will provide the evidence needed to establish the best way to measure skin necrosis, to use as the primary outcome in a future RCT to definitively test the effectiveness of local heat preconditioning. The pre-specified statistical analysis plan, developed prior to unblinded data extraction, sets out the analysis strategy and a comparative framework to support a committee evaluation of skin necrosis measurements. It will increase the transparency of the data analysis for the PREHEAT trial. ISRCTN: ISRCTN15744669. Registered 25 February 2015.
Continuous diffraction of molecules and disordered molecular crystals
Yefanov, Oleksandr M.; Ayyer, Kartik; White, Thomas A.; Barty, Anton; Morgan, Andrew; Mariani, Valerio; Oberthuer, Dominik; Pande, Kanupriya
2017-01-01
The intensities of far-field diffraction patterns of orientationally aligned molecules obey Wilson statistics, whether those molecules are in isolation (giving rise to a continuous diffraction pattern) or arranged in a crystal (giving rise to Bragg peaks). Ensembles of molecules in several orientations, but uncorrelated in position, give rise to the incoherent sum of the diffraction from those objects, modifying the statistics in a similar way as crystal twinning modifies the distribution of Bragg intensities. This situation arises in the continuous diffraction of laser-aligned molecules or translationally disordered molecular crystals. This paper develops the analysis of the intensity statistics of such continuous diffraction to obtain parameters such as scaling, beam coherence and the number of contributing independent object orientations. When measured, continuous molecular diffraction is generally weak and accompanied by a background that far exceeds the strength of the signal. Instead of just relying upon the smallest measured intensities or their mean value to guide the subtraction of the background, it is shown how all measured values can be utilized to estimate the background, noise and signal, by employing a modified ‘noisy Wilson’ distribution that explicitly includes the background. Parameters relating to the background and signal quantities can be estimated from the moments of the measured intensities. The analysis method is demonstrated on previously published continuous diffraction data measured from crystals of photosystem II [Ayyer et al. (2016), Nature, 530, 202–206]. PMID:28808434
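A schematic method-of-moments sketch in the spirit of the 'noisy Wilson' idea, under the simplifying assumption of an acentric Wilson (exponential) signal plus an independent Gaussian background; the third central moment of the measured intensities then comes from the signal alone. This is a stand-in illustration, not the paper's actual estimator:

    import numpy as np

    rng = np.random.default_rng(4)
    signal = rng.exponential(scale=5.0, size=200_000)    # Wilson mean intensity
    background = rng.normal(20.0, 3.0, size=200_000)
    I = signal + background                              # what is measured

    m3 = np.mean((I - I.mean()) ** 3)     # equals 2*Sigma^3 under the assumptions
    sigma = (m3 / 2.0) ** (1.0 / 3.0)
    mu_b = I.mean() - sigma               # background mean
    sd_b = np.sqrt(I.var() - sigma**2)    # background standard deviation
    print(f"signal mean ~ {sigma:.2f}, background ~ {mu_b:.2f} +/- {sd_b:.2f}")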
Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen
2017-01-01
Introduction: A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods: In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion: Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found for UTM with all eleven autism-related assessments with cross-validation R² values ranging from 0.12 to 0.48. PMID:28068407
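A minimal illustration of the leave-one-out cross-validation used above to keep the classification honest: each participant is classified by a model trained on all the others. The features are synthetic stand-ins for the urinary toxic-metal profiles:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut

    rng = np.random.default_rng(5)
    X = np.vstack([rng.normal(0.0, 1.0, (50, 10)),   # neurotypical controls
                   rng.normal(0.6, 1.0, (67, 10))])  # ASD group, shifted profiles
    y = np.array([0] * 50 + [1] * 67)

    correct = 0
    for train, test in LeaveOneOut().split(X):
        clf = LogisticRegression(max_iter=1000).fit(X[train], y[train])
        correct += int(clf.predict(X[test])[0] == y[test][0])
    print(f"LOO accuracy = {correct / len(y):.2f}")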
Eash, David A.; Barnes, Kimberlee K.; O'Shea, Padraic S.
2016-09-19
A statewide study was conducted to develop regression equations for estimating three selected spring and three selected fall low-flow frequency statistics for ungaged stream sites in Iowa. The estimation equations developed for the six low-flow frequency statistics include spring (April through June) 1-, 7-, and 30-day mean low flows for a recurrence interval of 10 years and fall (October through December) 1-, 7-, and 30-day mean low flows for a recurrence interval of 10 years. Estimates of the three selected spring statistics are provided for 241 U.S. Geological Survey continuous-record streamgages, and estimates of the three selected fall statistics are provided for 238 of these streamgages, using data through June 2014. Because only 9 years of fall streamflow record were available, three streamgages included in the development of the spring regression equations were not included in the development of the fall regression equations. Because of regulation, diversion, or urbanization, 30 of the 241 streamgages were not included in the development of the regression equations. The study area includes Iowa and adjacent areas within 50 miles of the Iowa border. Because trend analyses indicated statistically significant positive trends when considering the period of record for most of the streamgages, the longest, most recent period of record without a significant trend was determined for each streamgage for use in the study. Geographic information system software was used to measure 63 selected basin characteristics for each of the 211 streamgages used to develop the regional regression equations. The study area was divided into three low-flow regions that were defined in a previous study for the development of regional regression equations. Because several streamgages included in the development of regional regression equations have estimates of zero flow calculated from observed streamflow for selected spring and fall low-flow frequency statistics, the final equations for the three low-flow regions were developed using two types of regression analyses: left-censored and generalized-least-squares regression analyses. A total of 211 streamgages were included in the development of nine spring regression equations (three equations for each of the three low-flow regions). A total of 208 streamgages were included in the development of nine fall regression equations (three equations for each of the three low-flow regions). A censoring threshold was used to develop 15 left-censored regression equations to estimate the three fall low-flow frequency statistics for each of the three low-flow regions and to estimate the three spring low-flow frequency statistics for the southern and northwest regions. For the northeast region, generalized-least-squares regression was used to develop three equations to estimate the three spring low-flow frequency statistics. For the northeast region, average standard errors of prediction range from 32.4 to 48.4 percent for the spring equations and average standard errors of estimate range from 56.4 to 73.8 percent for the fall equations. For the northwest region, average standard errors of estimate range from 58.9 to 62.1 percent for the spring equations and from 83.2 to 109.4 percent for the fall equations.
For the southern region, average standard errors of estimate range from 43.2 to 64.0 percent for the spring equations and from 78.1 to 78.7 percent for the fall equations. The regression equations are applicable only to stream sites in Iowa with low flows not substantially affected by regulation, diversion, or urbanization and with basin characteristics within the range of those used to develop the equations. The regression equations will be implemented within the U.S. Geological Survey StreamStats Web-based geographic information system application. StreamStats allows users to click on any ungaged stream site and compute estimates of the six selected spring and fall low-flow statistics; in addition, 90-percent prediction intervals and the measured basin characteristics for the ungaged site are provided. StreamStats also allows users to click on any Iowa streamgage to obtain computed estimates for the six selected spring and fall low-flow statistics.
Four modes of optical parametric operation for squeezed state generation
NASA Astrophysics Data System (ADS)
Andersen, U. L.; Buchler, B. C.; Lam, P. K.; Wu, J. W.; Gao, J. R.; Bachor, H.-A.
2003-11-01
We report a versatile instrument, based on a monolithic optical parametric amplifier, which reliably generates four different types of squeezed light. We obtained vacuum squeezing, low power amplitude squeezing, phase squeezing and bright amplitude squeezing. We show a complete analysis of this light, including a full quantum state tomography. In addition we demonstrate the direct detection of the squeezed state statistics without the aid of a spectrum analyser. This technique makes the nonclassical properties directly visible and allows complete measurement of the statistical moments of the squeezed quadrature.
FY07 LDRD Final Report Neutron Capture Cross-Section Measurements at DANCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, W; Agvaanluvsan, U; Wilk, P
2008-02-08
We have measured neutron capture cross sections intended to address defense science problems including mix and the Quantification of Margins and Uncertainties (QMU), and provide details about statistical decay of excited nuclei. A major part of this project included developing the ability to produce radioactive targets. The cross-section measurements were made using the white neutron source at the Los Alamos Neutron Science Center, the detector array called DANCE (The Detector for Advanced Neutron Capture Experiments) and targets important for astrophysics and stockpile stewardship. DANCE is at the leading edge of neutron capture physics and represents a major leap forward in capability. The detector array was recently built with LDRD money. Our measurements are a significant part of the early results from the new experimental DANCE facility. Neutron capture reactions are important for basic nuclear science, including astrophysics and the statistics of the γ-ray cascades, and for applied science, including stockpile science and technology. We were most interested in neutron capture with neutron energies in the range between 1 eV and a few hundred keV, with targets important to basic science, and the s-process in particular. Of particular interest were neutron capture cross-section measurements of rare isotopes, especially radioactive isotopes. A strong collaboration between universities and Los Alamos due to the Academic Alliance was in place at the start of our project. Our project gave Livermore leverage in focusing on Livermore interests. The Lawrence Livermore Laboratory did not have a resident expert in cross-section measurements; this project allowed us to develop this expertise. For many radionuclides, the cross sections for destruction, especially (n,γ), are not well known, and there is no adequate model that describes neutron capture. The modeling problem is significant because, at low energies where capture reactions are important, the neutron reaction cross sections show resonance behavior or follow 1/v of the incident neutrons. In the case of odd-odd nuclei, the modeling problem is particularly difficult because degenerate states (rotational bands) present in even-even nuclei have separated in energy. Our work included interpretation of the γ-ray spectra to compare with the Statistical Model and provides information on level density and statistical decay. Neutron capture cross sections are of programmatic interest to defense sciences because many elements were added to nuclear devices in order to determine various details of the nuclear detonation, including fission yields, fusion yields, and mix. Both product nuclei created by (n,2n) reactions and reactant nuclei are transmuted by neutron capture during the explosion. Very few of the (n,γ) cross sections for reactions that create products measured by radiochemists have ever been experimentally determined; most are calculated by radiochemical equivalences. Our new experimentally measured capture cross sections directly impact our knowledge about the uncertainties in device performances, which enhances our capability of carrying out our stockpile stewardship program. Europium and gadolinium cross sections are important for both astrophysics and defense programs. Measurements made prior to this project on stable europium targets differ by 30-40%, which was considered to be significantly disparate.
Of the gadolinium isotopes, 151Gd is important for stockpile stewardship and 153Gd is of high interest to astrophysics, and neither of these (radioactive) gadolinium (n,γ) cross sections has been measured. Additional stable gadolinium isotopes, including 157Gd and 160Gd, are of interest to astrophysics. Historical measurements of gadolinium isotopes, including 152Gd and 154Gd, had disagreements similar to the 30-40% disagreements found in the historical europium data. Actinide capture cross-section measurements are important both for stockpile stewardship and for nuclear forensics. We focused on the 242mAm(n,γ) measurement, as there was no existing capture measurement for this isotope. The cross-section measurements (cross section vs. E_n) were made at the Detector for Advanced Neutron Capture Experiments. DANCE comprises a highly segmented array of barium fluoride (BaF2) crystals specifically designed for neutron capture-gamma measurements using small radioactive targets (less than one milligram). A picture of half the array, along with a photo of one crystal, is shown in Fig. 1. DANCE provides the world's leading capability for measurements of neutron capture cross sections with radioactive targets. DANCE is a 4π calorimeter and uses the intense spallation neutron source at the Lujan Center at the Los Alamos National Laboratory. The detector array consists of 159 barium fluoride crystals arranged in a sphere around the target.
Intrarater Reliability and Other Psychometrics of the Health Promoting Activities Scale (HPAS).
Muskett, Rachel; Bourke-Taylor, Helen; Hewitt, Alana
The Health Promoting Activities Scale (HPAS) measures the self-rated frequency with which adults participate in activities that promote health. We evaluated the internal consistency, construct validity, and intrarater reliability of the HPAS with a cohort of mothers (N = 56) of school-age children. We used an online survey that included the HPAS and measures of mental and physical health. Statistical analysis included intraclass correlation coefficients (ICCs), measurement error, error range, limits of agreement, and minimum detectable change (MDC). The HPAS showed good internal consistency (Cronbach's α = .73). Construct validity was supported by a significant difference in HPAS scores among participants grouped by physical activity level; no other differences were significant. Results included a high aggregate ICC of .90 and an MDC of 5 points. Our evaluation of the HPAS revealed good reliability and stability, suggesting suitability for ongoing evaluation as an outcome measure.
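For readers unfamiliar with the reliability quantities reported above, the standard error of measurement (SEM) and the minimum detectable change (MDC95) follow from the score standard deviation and the ICC; the SD below is assumed for illustration (a value near 5.7 reproduces the reported 5-point MDC):

    import math

    sd, icc = 5.7, 0.90                    # SD assumed; ICC as reported
    sem = sd * math.sqrt(1 - icc)          # standard error of measurement
    mdc95 = 1.96 * math.sqrt(2) * sem      # minimum detectable change, 95% level
    print(f"SEM = {sem:.2f} points, MDC95 = {mdc95:.1f} points")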
Effects of singing training on the speaking voice of voice majors.
Mendes, Ana P; Brown, W S; Rothman, Howard B; Sapienza, Christine
2004-03-01
This longitudinal study gathered data with regard to the question: Does singing training have an effect on the speaking voice? Fourteen voice majors (12 females and 2 males; age range 17 to 20 years) were recorded once a semester for four consecutive semesters, while sustaining vowels and reading the "Rainbow Passage." Acoustic measures included speaking fundamental frequency (SFF) and sound pressure level (SPL). Perturbation measures included jitter, shimmer, and harmonic-to-noise ratio. Temporal measures included sentence, consonant, and diphthong durations. Results revealed that, as the number of semesters increased, the SFF increased while jitter and shimmer slightly decreased. Repeated-measures analysis, however, indicated that none of the acoustic, temporal, or perturbation differences were statistically significant. These results confirm earlier cross-sectional studies that compared singers with nonsingers, in that singing training mostly affects the singing voice and rarely the speaking voice.
Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar
NASA Astrophysics Data System (ADS)
Lottman, Brian Todd
1998-09-01
This work proposes advanced techniques for measuring the spatial wind field statistics near and inside clouds using a vertically pointing solid-state coherent Doppler lidar on a fixed ground-based platform. The coherent Doppler lidar is an ideal instrument for high spatial and temporal resolution velocity estimates. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate for the mean velocity over a sensing volume is produced by estimating the mean spectra. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of estimators, termed novel estimators, is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. Performance of traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer simulated data. Wind field statistics are produced from actual data for a cloud deck and for multi-layer clouds. Unique results include detection of possible spectral signatures for rain, estimates for the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates for simple wind field statistics between cloud layers.
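One of the simpler wind-field statistics mentioned above, the second-order structure function D(r) = <(v(x + r) - v(x))^2>, can be estimated directly from a profile of range-gate velocities. The profile here is synthetic:

    import numpy as np

    rng = np.random.default_rng(6)
    v = np.cumsum(rng.normal(0, 0.1, 400))   # toy correlated velocity profile
    dz = 30.0                                # m, assumed range-gate spacing

    for lag in (1, 2, 4, 8):
        D = np.mean((v[lag:] - v[:-lag]) ** 2)
        print(f"D({lag * dz:.0f} m) = {D:.3f} m^2/s^2")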
Assessment of the beryllium lymphocyte proliferation test using statistical process control.
Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M
2006-10-01
Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that were beyond what would be expected due to chance alone. Patterns of test results suggested that variations were systematic. We conclude that laboratories performing the BeBLPT or other similar biological assays of immunological response could benefit from a statistical approach such as SPC to improve quality management.
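A sketch of the SPC approach described above: an individuals (X) chart for the stimulation index, with control limits estimated from the average moving range of an in-control baseline period (the constant 1.128 is the standard d2 value for moving ranges of two). The data and the injected shift are simulated:

    import numpy as np

    rng = np.random.default_rng(7)
    si = rng.normal(1.5, 0.2, 60)            # stimulation index per test run
    si[45:] += 0.8                           # injected systematic shift

    baseline = si[:40]                       # assumed in-control period
    sigma = np.abs(np.diff(baseline)).mean() / 1.128
    center = baseline.mean()
    ucl, lcl = center + 3 * sigma, center - 3 * sigma
    out = np.where((si > ucl) | (si < lcl))[0]
    print(f"limits [{lcl:.2f}, {ucl:.2f}]; out-of-control runs: {out}")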
Koh, Dong-Hee; Locke, Sarah J.; Chen, Yu-Cheng; Purdue, Mark P.; Friesen, Melissa C.
2016-01-01
Background: Retrospective exposure assessment of occupational lead exposure in population-based studies requires historical exposure information from many occupations and industries. Methods: We reviewed published US exposure monitoring studies to identify lead exposure measurement data. We developed an occupational lead exposure database from the 175 identified papers containing 1,111 sets of lead concentration summary statistics (21% area air, 47% personal air, 32% blood). We also extracted ancillary exposure-related information, including job, industry, task/location, year collected, sampling strategy, control measures in place, and sampling and analytical methods. Results: Measurements were published between 1940 and 2010 and represented 27 2-digit standardized industry classification codes. The majority of the measurements were related to lead-based paint work, joining or cutting metal using heat, primary and secondary metal manufacturing, and lead acid battery manufacturing. Conclusions: This database can be used in future statistical analyses to characterize differences in lead exposure across time, jobs, and industries. PMID:25968240
Depressive Personality Disorder: A Comparison of Three Self-Report Measures
ERIC Educational Resources Information Center
Miller, Joshua D.; Tant, Adam; Bagby, R. Michael
2010-01-01
Depressive personality disorder (DPD) was included in the appendix of the "Diagnostic and Statistical Manual of Mental Disorders", Fourth Edition ("DSM-IV") for further study. Questions abound regarding this disorder in terms of its distinctiveness from extant diagnostic constructs and clinical significance. This study examines…
Minnesota's forest resources in 2004
Patrick D. Miles; Gary J. Brand; Manfred E. Mielke
2006-01-01
This report presents forest statistics based on the five annual inventory panels measured from 2000 through 2004. Forest area is estimated at 16.2 million acres or 32 percent of the total land area in the State. Important pests in Minnesota forests include the forest tent caterpillar and spruce budworm.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-21
... Officer, (202) 452-3829, Division of Research and Statistics, Board of Governors of the Federal Reserve... the information collection, including the validity of the methodology and assumptions used; (c) Ways... measures (such as regulatory or accounting). The agencies' burden estimates for these information...
ERIC Educational Resources Information Center
Brzezinski, Michal
2010-01-01
This paper examines the evolution of income affluence (richness) in Poland during 1998-2007. Using household survey data, the paper estimates several statistical indices of income affluence including income share of the top percentiles, population share of individuals receiving incomes higher than the richness line, and measures that take into…
Australian Vocational Education and Training Statistics: VET in Schools, 2008
ERIC Educational Resources Information Center
National Centre for Vocational Education Research (NCVER), 2010
2010-01-01
This report presents information about senior secondary school students undertaking vocational education and training (VET) through the program known as "VET in Schools" during 2008. It includes information on participation, students, courses and qualifications, and subjects. The information on key performance measures and program…
NASA Astrophysics Data System (ADS)
Nironi, Chiara; Salizzoni, Pietro; Marro, Massimo; Mejean, Patrick; Grosjean, Nathalie; Soulhac, Lionel
2015-09-01
The prediction of the probability density function (PDF) of a pollutant concentration within atmospheric flows is of primary importance in estimating the hazard related to accidental releases of toxic or flammable substances and their effects on human health. This need motivates studies devoted to the characterization of concentration statistics of pollutant dispersion in the lower atmosphere, and their dependence on the parameters controlling their emissions. As is known from previous experimental results, concentration fluctuations are significantly influenced by the diameter of the source and its elevation. In this study, we aim to further investigate the dependence of the dispersion process on the source configuration, including source size, elevation and emission velocity. To that end we study experimentally the influence of these parameters on the statistics of the concentration of a passive scalar, measured at several distances downwind of the source. We analyze the spatial distribution of the first four moments of the concentration PDFs, with a focus on the variance, its dissipation and production and its spectral density. The information provided by the dataset, completed by estimates of the intermittency factors, allows us to discuss the role of the main mechanisms controlling the scalar dispersion and their link to the form of the PDF. The latter is shown to be very well approximated by a Gamma distribution, irrespective of the emission conditions and the distance from the source. Concentration measurements are complemented by a detailed description of the velocity statistics, including direct estimates of the Eulerian integral length scales from two-point correlations, a measurement that has rarely been presented to date.
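Given the finding above that a Gamma distribution describes the concentration PDF well, a minimal check of that kind can be sketched as follows, with synthetic samples standing in for measured concentrations and the intermittency factor taken simply as the fraction of samples above a small threshold:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    c = rng.gamma(shape=1.8, scale=0.5, size=20_000)   # stand-in concentrations

    k, loc, theta = stats.gamma.fit(c, floc=0)         # location fixed at zero
    fitted = stats.gamma(k, loc=loc, scale=theta)
    print(f"mean {c.mean():.3f} vs {fitted.mean():.3f}, "
          f"skew {stats.skew(c):.2f} vs {float(fitted.stats(moments='s')):.2f}")
    print(f"intermittency factor ~ {np.mean(c > 1e-3):.2f}")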
Documentation of the U.S. Geological Survey Oceanographic Time-Series Measurement Database
Montgomery, Ellyn T.; Martini, Marinna A.; Lightsom, Frances L.; Butman, Bradford
2008-01-02
This report describes the instrumentation and platforms used to make the measurements; the methods used to process, apply quality-control criteria, and archive the data; the data storage format; and how the data are released and distributed. The report also includes instructions on how to access the data from the online database at http://stellwagen.er.usgs.gov/. As of 2016, the database contains about 5,000 files, which may include observations of current velocity, wave statistics, ocean temperature, conductivity, pressure, and light transmission at one or more depths over some duration of time.
Metikaridis, T Damianos; Hadjipavlou, Alexander; Artemiadis, Artemios; Chrousos, George; Darviri, Christina
2016-05-20
Studies have shown that stress is implicated in the cause of neck pain (NP). The purpose of this study was to examine the effect of a simple, zero-cost stress management program on patients suffering from NP. This was a parallel-group randomized clinical study. People suffering from chronic non-specific NP were randomly chosen to participate in an eight-week stress management program (N = 28) (including diaphragmatic breathing and progressive muscle relaxation) or in a no-intervention control condition (N = 25). Self-report measures were used for the evaluation of various variables at the beginning and at the end of the eight-week monitoring period. Descriptive and inferential statistical methods were used for the statistical analysis. At the end of the monitoring period, the intervention group showed a statistically significant reduction in stress and anxiety (p = 0.03, p = 0.01), reported stress-related symptoms (p = 0.003), percentage of disability due to NP (p = 0.000), and NP intensity (p = 0.002). At the same time, daily routine satisfaction levels were elevated (p = 0.019). No statistically significant difference was observed in cortisol measurements. Stress management has positive effects on NP patients.
System and method for diagnosing EGR performance using NOx sensor
Mazur, Christopher John
2003-12-23
A method and system for diagnosing a condition of an EGR valve used in an engine system. The EGR valve controls the portion of exhaust gases produced by such engine system and fed back to an intake of such engine system. The engine system includes a NOx sensor for measuring NOx in such exhaust. The method includes: determining a time rate of change in NOx measured by the NOx sensor; comparing the determined time rate of change in the measured NOx with a predetermined expected time rate of change in measured NOx; and determining the condition of the EGR valve as a function of such comparison. The method also includes: determining from NOx measured by the NOx sensor and engine operating conditions indications of instances when samples of such measured NOx are greater than an expected maximum NOx level for such engine condition and less than an expected minimum NOx level for such engine condition; and determining the condition of the EGR valve as a function of a statistical analysis of such indications. The method includes determining whether the NOx sensor is faulty and wherein the EGR condition determining includes determining whether the NOx sensor is faulty.
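An illustrative sketch (not the patented implementation) of the two checks described: comparing the measured rate of change of NOx against an expected value, and counting samples outside expected min/max bounds for the operating condition. All thresholds and map values are hypothetical:

    import numpy as np

    t = np.linspace(0.0, 5.0, 251)                  # s
    nox = 180 + 40 * (1 - np.exp(-t / 1.5))         # ppm, simulated sensor trace

    rate = np.gradient(nox, t)                      # measured dNOx/dt
    expected_rate, tol = 26.7, 10.0                 # assumed map value and tolerance
    rate_ok = abs(rate[0] - expected_rate) < tol

    nox_min, nox_max = 150.0, 230.0                 # assumed expected band
    exceed = np.mean((nox < nox_min) | (nox > nox_max))
    egr_ok = rate_ok and exceed < 0.05              # simple statistical criterion
    print(f"rate check {rate_ok}, exceedance fraction {exceed:.2%}, EGR ok: {egr_ok}")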
Raschke, Robert A; Groves, Robert H; Khurana, Hargobind S; Nikhanj, Nidhi; Utter, Ethel; Hartling, Didi; Stoffer, Brenda; Nunn, Kristina; Tryon, Shona; Bruner, Michelle; Calleja, Maria; Curry, Steven C
2017-01-01
Sepsis is a leading cause of mortality and morbidity in hospitalised patients. The Centers for Medicare and Medicaid Services (CMS) mandated that US hospitals report sepsis bundle compliance rate as a quality process measure in October 2015. The specific aim of our study was to improve the CMS sepsis bundle compliance rate from 30% to 40% across 20 acute care hospitals in our healthcare system within 1 year. The study included all adult inpatients with sepsis sampled according to CMS specifications from October 2015 to September 2016. The CMS sepsis bundle compliance rate was tracked monthly using statistical process control charting. A baseline rate of 28.5% with 99% control limits was established. We implemented multiple interventions including computerised decision support systems (CDSSs) to increase compliance with the most commonly missing bundle elements. Compliance reached 42% (99% statistical process control limits 18.4%-38.6%) as CDSS was implemented system-wide, but this improvement was not sustained after CMS changed specifications of the outcome measure. Difficulties encountered elucidate shortcomings of our study methodology and of the CMS sepsis bundle compliance rate as a quality process measure.
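A sketch of the control charting described above: a p-chart for monthly bundle compliance with 99% limits around the baseline rate (z = 2.576 rather than the usual 3-sigma). The monthly counts are hypothetical:

    import numpy as np

    p_bar, z = 0.285, 2.576                         # baseline rate; 99% limits
    n = np.array([140, 152, 138, 160, 149, 155])    # cases sampled per month
    compliant = np.array([38, 45, 41, 70, 66, 68])

    p = compliant / n
    hw = z * np.sqrt(p_bar * (1 - p_bar) / n)       # half-width per month
    for month, (pi, lo, hi) in enumerate(zip(p, p_bar - hw, p_bar + hw), 1):
        print(f"month {month}: p = {pi:.2f}, limits [{lo:.2f}, {hi:.2f}]")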
Cosmology constraints from shear peak statistics in Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Kacprzak, T.; Kirk, D.; Friedrich, O.; Amara, A.; Refregier, A.; Marian, L.; Dietrich, J. P.; Suchyta, E.; Aleksić, J.; Bacon, D.; Becker, M. R.; Bonnett, C.; Bridle, S. L.; Chang, C.; Eifler, T. F.; Hartley, W. G.; Huff, E. M.; Krause, E.; MacCrann, N.; Melchior, P.; Nicola, A.; Samuroff, S.; Sheldon, E.; Troxel, M. A.; Weller, J.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Armstrong, R.; Benoit-Lévy, A.; Bernstein, G. M.; Bernstein, R. A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Evrard, A. E.; Neto, A. Fausti; Flaugher, B.; Fosalba, P.; Frieman, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jarvis, M.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Miller, C. J.; Miquel, R.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Zhang, Y.; DES Collaboration
2016-12-01
Shear peak statistics has gained a lot of attention recently as a practical alternative to the two-point statistics for constraining cosmological parameters. We perform a shear peak statistics analysis of the Dark Energy Survey (DES) Science Verification (SV) data, using weak gravitational lensing measurements from a 139 deg² field. We measure the abundance of peaks identified in aperture mass maps, as a function of their signal-to-noise ratio, in the signal-to-noise range 0 < S/N < 4. Peaks with S/N > 4 would require significant corrections, which is why we do not include them in our analysis. We compare our results to the cosmological constraints from the two-point analysis on the SV field and find them to be in good agreement in both the central value and its uncertainty. We discuss prospects for future peak statistics analysis with upcoming DES data.
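Stripped of the lensing specifics, the peak statistic itself reduces to finding local maxima of a signal-to-noise map and histogramming them over the analysed S/N range. A toy sketch on a synthetic map (real aperture-mass maps and their noise estimation are far more involved):

    import numpy as np
    from scipy.ndimage import maximum_filter

    rng = np.random.default_rng(9)
    snr_map = rng.normal(0, 1, (512, 512))          # toy S/N field

    # A pixel is a peak if it equals the maximum of its 3x3 neighbourhood
    peaks = snr_map[snr_map == maximum_filter(snr_map, size=3)]
    counts, edges = np.histogram(peaks, bins=8, range=(0.0, 4.0))
    for lo, c in zip(edges[:-1], counts):
        print(f"S/N in [{lo:.1f}, {lo + 0.5:.1f}): {c} peaks")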
Probing the Statistical Properties of Unknown Texts: Application to the Voynich Manuscript
Amancio, Diego R.; Altmann, Eduardo G.; Rybski, Diego; Oliveira, Osvaldo N.; Costa, Luciano da F.
2013-01-01
While the use of statistical physics methods to analyze large corpora has been useful to unveil many patterns in texts, no comprehensive investigation has been performed on the interdependence between syntactic and semantic factors. In this study we propose a framework for determining whether a text (e.g., written in an unknown alphabet) is compatible with a natural language and to which language it could belong. The approach is based on three types of statistical measurements, i.e. obtained from first-order statistics of word properties in a text, from the topology of complex networks representing texts, and from intermittency concepts where text is treated as a time series. Comparative experiments were performed with the New Testament in 15 different languages and with distinct books in English and Portuguese in order to quantify the dependency of the different measurements on the language and on the story being told in the book. The metrics found to be informative in distinguishing real texts from their shuffled versions include assortativity, degree and selectivity of words. As an illustration, we analyze an undeciphered medieval manuscript known as the Voynich Manuscript. We show that it is mostly compatible with natural languages and incompatible with random texts. We also obtain candidates for keywords of the Voynich Manuscript which could be helpful in the effort of deciphering it. Because we were able to identify statistical measurements that are more dependent on the syntax than on the semantics, the framework may also serve for text analysis in language-dependent applications. PMID:23844002
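A toy version of two of the network measurements listed above, word degree and degree assortativity, computed from a co-occurrence graph over adjacent words; a few sentences stand in for a full corpus:

    import networkx as nx

    words = ("the quick brown fox jumps over the lazy dog "
             "the dog barks at the quick fox").split()

    G = nx.Graph()
    G.add_edges_from(zip(words[:-1], words[1:]))    # adjacent-word co-occurrence

    top = sorted(G.degree(), key=lambda kv: -kv[1])[:3]
    print("highest-degree words:", top)
    print(f"degree assortativity = {nx.degree_assortativity_coefficient(G):.2f}")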
Water Quality Sensing and Spatio-Temporal Monitoring Structure with Autocorrelation Kernel Methods.
Vizcaíno, Iván P; Carrera, Enrique V; Muñoz-Romero, Sergio; Cumbal, Luis H; Rojo-Álvarez, José Luis
2017-10-16
Pollution on water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniform spatio-temporal sampled data structure to characterize complex dynamic phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatial-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information of the quality parameter measurements through Support Vector Regression (SVR) based on Mercer's kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatial-temporal maps. The best interpolation performance in terms of mean absolute error was the SVR with Mercer's kernel given by either the Mahalanobis spatial-temporal covariance matrix or by the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which points out the relevance of including a priori knowledge of the problem.
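A minimal sketch of the general approach, assuming scikit-learn's SVR: spatio-temporal samples are regressed and then evaluated on a dense grid to produce an interpolated map. The study's Mahalanobis and autocorrelation kernels are replaced by a standard RBF kernel for brevity, and all data below are synthetic.

```python
# Sketch of kernel-based spatio-temporal interpolation (RBF stand-in kernel).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
# Training samples: (distance along river [km], time [days]) -> quality value.
X = rng.uniform([0, 0], [10, 30], size=(80, 2))
y = np.sin(X[:, 0]) + 0.1 * X[:, 1] + rng.normal(scale=0.1, size=80)

model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)

# Dense spatio-temporal grid for the interpolated map.
xs, ts = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 30, 50))
grid = np.column_stack([xs.ravel(), ts.ravel()])
z = model.predict(grid).reshape(xs.shape)
print("interpolated field:", z.shape, "| training MAE:",
      np.abs(model.predict(X) - y).mean().round(3))
```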
ParallABEL: an R library for generalized parallelization of genome-wide association studies
2010-01-01
Background Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. Acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files is arduous. Results Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Conclusions Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL. PMID:20429914
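ParallABEL itself is an R library built on Rmpi. Purely as an illustration of its first computation group (statistics computed independently per SNP), the sketch below partitions a per-SNP summary across worker processes using Python's multiprocessing; all names and data are invented.

```python
# Analogous sketch of SNP-wise partitioning (illustration only, not ParallABEL).
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(2)
n_individuals, n_snps = 500, 2000
genotypes = rng.integers(0, 3, size=(n_individuals, n_snps))  # 0/1/2 allele counts

def snp_statistic(column):
    """Per-SNP summary: minor allele frequency (independent across SNPs)."""
    af = column.mean() / 2.0
    return min(af, 1.0 - af)

if __name__ == "__main__":
    # Each worker handles a slice of SNP columns; results are merged by map.
    with Pool(processes=4) as pool:
        mafs = pool.map(snp_statistic, [genotypes[:, j] for j in range(n_snps)])
    print("first five MAFs:", np.round(mafs[:5], 3))
```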
Evaluation of different models to estimate the global solar radiation on inclined surface
NASA Astrophysics Data System (ADS)
Demain, C.; Journée, M.; Bertrand, C.
2012-04-01
Global and diffuse solar radiation intensities are, in general, measured on horizontal surfaces, whereas stationary solar conversion systems (both flat-plate solar collectors and solar photovoltaics) are mounted on inclined surfaces to maximize the amount of solar radiation incident on the collector surface. Consequently, the solar radiation incident on a tilted surface has to be determined by converting measurements made on the horizontal surface to the tilted surface of interest. This study evaluates the performance of 14 models that transpose 10-minute, hourly, and daily diffuse solar irradiation from horizontal to inclined surfaces. Solar radiation data from 8 months (April to November 2011), covering diverse atmospheric conditions and solar altitudes, measured on the roof of the radiation tower of the Royal Meteorological Institute of Belgium in Uccle (longitude 4.35°, latitude 50.79°), were used for validation purposes. The individual model performance is assessed by an inter-comparison between the calculated and measured global solar radiation on the south-oriented surface tilted at 50.79°, using statistical methods. The relative performance of the different models under different sky conditions has been studied. Comparison of the statistical errors of the different radiation models as a function of the clearness index shows that some models perform better under one type of sky condition. Combining models that perform best under different sky conditions can reduce the statistical error between measured and estimated global solar radiation. As the models described in this paper were developed for hourly data inputs, statistical error indices are lowest for hourly data and increase for 10-minute and daily data.
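For orientation, one of the simplest transposition schemes is the isotropic-sky model of Liu and Jordan, sketched below. The paper compares 14 such models; the inputs here are illustrative numbers, not the Uccle measurements.

```python
# Isotropic-sky transposition of horizontal irradiance to a tilted plane.
import numpy as np

def isotropic_tilted_irradiance(B_h, D_h, theta_z, theta_i, beta, albedo=0.2):
    """Global irradiance on a tilted plane from horizontal components.
    B_h, D_h: horizontal beam and diffuse irradiance (W/m^2)
    theta_z, theta_i: solar zenith angle and angle of incidence (rad)
    beta: tilt angle of the plane (rad)."""
    G_h = B_h + D_h
    R_b = max(np.cos(theta_i), 0.0) / max(np.cos(theta_z), 1e-6)  # beam geometry factor
    sky = D_h * (1.0 + np.cos(beta)) / 2.0               # isotropic sky diffuse
    ground = G_h * albedo * (1.0 - np.cos(beta)) / 2.0   # ground-reflected term
    return B_h * R_b + sky + ground

# South-facing plane tilted at 50.79 deg (the validation setup's tilt angle).
print(isotropic_tilted_irradiance(B_h=400.0, D_h=200.0,
                                  theta_z=np.radians(60), theta_i=np.radians(20),
                                  beta=np.radians(50.79)))
```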
Schwartz, Jennifer; Wang, Yongfei; Qin, Li; Schwamm, Lee H; Fonarow, Gregg C; Cormier, Nicole; Dorsey, Karen; McNamara, Robert L; Suter, Lisa G; Krumholz, Harlan M; Bernheim, Susannah M
2017-11-01
The Centers for Medicare & Medicaid Services publicly reports a hospital-level stroke mortality measure that lacks stroke severity risk adjustment. Our objective was to describe novel measures of stroke mortality suitable for public reporting that incorporate stroke severity into risk adjustment. We linked data from the American Heart Association/American Stroke Association Get With The Guidelines-Stroke registry with Medicare fee-for-service claims data to develop the measures. We used logistic regression for variable selection in risk model development. We developed 3 risk-standardized mortality models for patients with acute ischemic stroke, all of which include the National Institutes of Health Stroke Scale score: one that includes other risk variables derived only from claims data (claims model); one that includes other risk variables derived from claims and clinical variables that could be obtained from electronic health record data (hybrid model); and one that includes other risk variables that could be derived only from electronic health record data (electronic health record model). The cohort used to develop and validate the risk models consisted of 188,975 hospital admissions at 1,511 hospitals. The claims, hybrid, and electronic health record risk models included 20, 21, and 9 risk-adjustment variables, respectively; the C statistics were 0.81, 0.82, and 0.79, respectively (as compared with the current publicly reported model C statistic of 0.75); the risk-standardized mortality rates ranged from 10.7% to 19.0%, 10.7% to 19.1%, and 10.8% to 20.3%, respectively; the median risk-standardized mortality rate was 14.5% for all measures; and the odds of mortality for a high-mortality hospital (+1 SD) were 1.51, 1.52, and 1.52 times those for a low-mortality hospital (-1 SD), respectively. We developed 3 quality measures that demonstrate better discrimination than the Centers for Medicare & Medicaid Services' existing stroke mortality measure, adjust for stroke severity, and could be implemented in a variety of settings. © 2017 American Heart Association, Inc.
Guidelines for Genome-Scale Analysis of Biological Rhythms.
Hughes, Michael E; Abruzzi, Katherine C; Allada, Ravi; Anafi, Ron; Arpat, Alaaddin Bulak; Asher, Gad; Baldi, Pierre; de Bekker, Charissa; Bell-Pedersen, Deborah; Blau, Justin; Brown, Steve; Ceriani, M Fernanda; Chen, Zheng; Chiu, Joanna C; Cox, Juergen; Crowell, Alexander M; DeBruyne, Jason P; Dijk, Derk-Jan; DiTacchio, Luciano; Doyle, Francis J; Duffield, Giles E; Dunlap, Jay C; Eckel-Mahan, Kristin; Esser, Karyn A; FitzGerald, Garret A; Forger, Daniel B; Francey, Lauren J; Fu, Ying-Hui; Gachon, Frédéric; Gatfield, David; de Goede, Paul; Golden, Susan S; Green, Carla; Harer, John; Harmer, Stacey; Haspel, Jeff; Hastings, Michael H; Herzel, Hanspeter; Herzog, Erik D; Hoffmann, Christy; Hong, Christian; Hughey, Jacob J; Hurley, Jennifer M; de la Iglesia, Horacio O; Johnson, Carl; Kay, Steve A; Koike, Nobuya; Kornacker, Karl; Kramer, Achim; Lamia, Katja; Leise, Tanya; Lewis, Scott A; Li, Jiajia; Li, Xiaodong; Liu, Andrew C; Loros, Jennifer J; Martino, Tami A; Menet, Jerome S; Merrow, Martha; Millar, Andrew J; Mockler, Todd; Naef, Felix; Nagoshi, Emi; Nitabach, Michael N; Olmedo, Maria; Nusinow, Dmitri A; Ptáček, Louis J; Rand, David; Reddy, Akhilesh B; Robles, Maria S; Roenneberg, Till; Rosbash, Michael; Ruben, Marc D; Rund, Samuel S C; Sancar, Aziz; Sassone-Corsi, Paolo; Sehgal, Amita; Sherrill-Mix, Scott; Skene, Debra J; Storch, Kai-Florian; Takahashi, Joseph S; Ueda, Hiroki R; Wang, Han; Weitz, Charles; Westermark, Pål O; Wijnen, Herman; Xu, Ying; Wu, Gang; Yoo, Seung-Hee; Young, Michael; Zhang, Eric Erquan; Zielinski, Tomasz; Hogenesch, John B
2017-10-01
Genome biology approaches have made enormous contributions to our understanding of biological rhythms, particularly in identifying outputs of the clock, including RNAs, proteins, and metabolites, whose abundance oscillates throughout the day. These methods hold significant promise for future discovery, particularly when combined with computational modeling. However, genome-scale experiments are costly and laborious, yielding "big data" that are conceptually and statistically difficult to analyze. There is no obvious consensus regarding design or analysis. Here we discuss the relevant technical considerations to generate reproducible, statistically sound, and broadly useful genome-scale data. Rather than suggest a set of rigid rules, we aim to codify principles by which investigators, reviewers, and readers of the primary literature can evaluate the suitability of different experimental designs for measuring different aspects of biological rhythms. We introduce CircaInSilico, a web-based application for generating synthetic genome biology data to benchmark statistical methods for studying biological rhythms. Finally, we discuss several unmet analytical needs, including applications to clinical medicine, and suggest productive avenues to address them.
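As a hedged illustration of the kind of rhythm-detection statistic such benchmarks exercise (not a method prescribed by the guidelines themselves), the sketch below fits a 24-h cosinor, i.e. harmonic regression, to a synthetic expression time series.

```python
# Cosinor (harmonic regression) fit to a synthetic rhythmic time series.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 48, 2.0)   # hours: 2-h sampling over 2 days
y = 10 + 3 * np.cos(2 * np.pi * (t - 6) / 24) + rng.normal(scale=0.5, size=t.size)

# Linear least squares on [1, cos(wt), sin(wt)] with w = 2*pi/24.
w = 2 * np.pi / 24.0
X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
mesor, a, b = beta
amplitude = np.hypot(a, b)
acrophase_h = (np.arctan2(b, a) % (2 * np.pi)) / w   # peak time in hours
print(f"mesor={mesor:.2f}, amplitude={amplitude:.2f}, acrophase={acrophase_h:.1f} h")
```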
Statistical Distributions of Optical Flares from Gamma-Ray Bursts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yi, Shuang-Xi; Yu, Hai; Wang, F. Y.
2017-07-20
We statistically study gamma-ray burst (GRB) optical flares from the Swift/UVOT catalog. We compile 119 optical flares, including 77 flares with redshift measurements. Some tight correlations among the timescales of optical flares are found. For example, the rise time is correlated with the decay time, and the duration time is correlated with the peak time of optical flares. These two tight correlations indicate that longer rise times are associated with longer decay times of optical flares and also suggest that broader optical flares peak at later times, which is consistent with the corresponding correlations of X-ray flares. We also study the frequency distributions of optical flare parameters, including the duration time, rise time, decay time, peak time, and waiting time. Similar power-law distributions for optical and X-ray flares are found. Our statistical results imply that GRB optical flares and X-ray flares may share a similar physical origin, and both of them are possibly related to central engine activities.
NASA Astrophysics Data System (ADS)
Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie
2017-12-01
To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical inference. In his project, the intern, Sam, investigated whether patients' blood could be sent through pneumatic post without influencing the measurement of particular blood components. We asked, in the process of making a statistical inference, how are reasons and actions coordinated to reduce uncertainty? For the analysis, we used the semantic theory of inferentialism, specifically, the concept of webs of reasons and actions—complexes of interconnected reasons for facts and actions; these reasons include premises and conclusions, inferential relations, implications, motives for action, and utility of tools for specific purposes in a particular context. Analysis of interviews with Sam, his supervisor and teacher as well as video data of Sam in the classroom showed that many of Sam's actions aimed to reduce variability, rule out errors, and thus reduce uncertainties so as to arrive at a valid inference. Interestingly, the decisive factor was not the outcome of a t test but of the reference change value, a clinical chemical measure of analytic and biological variability. With insights from this case study, we expect that students can be better supported in connecting statistics with context and in dealing with uncertainty.
Consensus building for interlaboratory studies, key comparisons, and meta-analysis
NASA Astrophysics Data System (ADS)
Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza
2017-06-01
Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian procedure; and the Linear Pool. These three procedures have been implemented and made widely accessible in a Web-based application (NIST Consensus Builder). We illustrate principles, statistical models, and data reduction procedures in four examples: (i) the measurement of the Newtonian constant of gravitation; (ii) the measurement of the half-lives of radioactive isotopes of caesium and strontium; (iii) the comparison of two alternative treatments for carotid artery stenosis; and (iv) a key comparison where the measurand was the calibration factor of a radio-frequency power sensor.
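A compact sketch of the first of the three procedures, the DerSimonian-Laird method-of-moments estimator: the between-laboratory (dark-uncertainty) variance tau^2 is estimated from Cochran's Q and then used to inflate the weights. The five measured values below are invented, and this omits the refinements available in the NIST Consensus Builder.

```python
# DerSimonian-Laird consensus value with dark-uncertainty variance tau^2.
import numpy as np

x = np.array([10.1, 9.7, 10.4, 9.9, 10.8])    # measured values from 5 labs
u = np.array([0.20, 0.15, 0.30, 0.25, 0.20])  # stated standard uncertainties

w = 1.0 / u**2                                # fixed-effect weights
xbar = np.sum(w * x) / np.sum(w)
Q = np.sum(w * (x - xbar) ** 2)               # Cochran's heterogeneity statistic
k = len(x)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1.0 / (u**2 + tau2)                  # weights inflated by tau^2
consensus = np.sum(w_star * x) / np.sum(w_star)
u_consensus = np.sqrt(1.0 / np.sum(w_star))
print(f"tau^2={tau2:.4f}, consensus={consensus:.3f} +/- {u_consensus:.3f}")
```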
The AMRL Anthropometric Data Bank Library: Volumes 1-5
1977-10-01
crinion arc (#127) which were not measured on bald and balding men. Non-metric variables on the tape include somatotype ratings, both by the Sheldon...158). An analysis of the somatotype material was published as A Statistical Comparison of the Body Typing Methods of Hooton and Sheldon by C
Some Psychometric and Design Implications of Game-Based Learning Analytics
ERIC Educational Resources Information Center
Gibson, David; Clarke-Midura, Jody
2013-01-01
The rise of digital game and simulation-based learning applications has led to new approaches in educational measurement that take account of patterns in time, high resolution paths of action, and clusters of virtual performance artifacts. The new approaches, which depart from traditional statistical analyses, include data mining, machine…
Tips, Tropes, and Trivia: Ideas for Teaching Educational Research.
ERIC Educational Resources Information Center
Stallings, William M.; And Others
The collective experience of more than 50 years has led to the development of approaches that have enhanced student comprehension in the teaching of educational research methods, statistics, and measurement. Tips for teachers include using illustrative problems with one-digit numbers, using common situations and everyday objects to illustrate…
Diffuse Prior Monotonic Likelihood Ratio Test for Evaluation of Fused Image Quality Measures
2011-02-01
852–864. [25] W. Mendenhall, R. L. Scheaffer, and D. D. Wackerly, Mathematical Statistics With Applications, 3rd ed. Boston, MA: Duxbury Press, 1986...Professor and holds the Robert W. Wieseman Chaired Research Professorship in Electrical Engineering. His research interests include signal
A New Look at Bias in Aptitude Tests.
ERIC Educational Resources Information Center
Scheuneman, Janice Dowd
1981-01-01
Statistical bias in measurement and ethnic-group bias in testing are discussed, reviewing predictive and construct validity studies. Item bias is reconceptualized to include distance of item content from respondent's experience. Differing values of mean and standard deviation for bias parameter are analyzed in a simulation. References are…
Statistical Aspects of Reliability, Maintainability, and Availability.
1987-10-01
A total of 33 research reports were issued, and 35 papers were published in scientific journals or are in press. Research topics included optimal assembly of systems, multistate system theory, testing whether new is better than used, nonparametric survival function estimation, measuring information in censored models, generalizations of total positivity and
AQAK: A Library Anxiety Scale for Undergraduate Students
ERIC Educational Resources Information Center
Anwar, Mumtaz A.; Al-Qallaf, Charlene L.; Al-Kandari, Noriah M.; Al-Ansari, Husain A.
2012-01-01
The library environment has drastically changed since 1992 when Bostick's Library Anxiety Scale was developed. This project aimed to develop a scale specifically for undergraduate students. A three-stage study was conducted, using students of Kuwait University. A variety of statistical measures, including factor analysis, were used to process the…
1995 Matched Anthropometric Database of U.S. Marine Corps Personnel: Summary Statistics
1996-09-01
the maximum point of quiet respiration. Note: Breast tissue and latissimus dorsi muscle tissue are NOT included in this measurement if they extend...form one side of the pelvic cavity. Latissimus Dorsi - a large muscle located on the lower back that extends from just above the waist to behind
Rubalcava, J; Gómez-García, F; Ríos-Reina, J L
2012-01-01
Knowledge of the radiogrammetric characteristics of a specific skeletal segment in a healthy population is of the utmost clinical importance. The main justification for this study is that there is no published description of the radiogrammetric parameter of acetabular anteversion in a healthy Mexican adult population. A prospective, descriptive and cross-sectional study was conducted. Individuals of both genders older than 18 years and orthopedically healthy were included. They underwent a two-dimensional axial tomographic study of both hips to measure the acetabular anteversion angles. The statistical analysis consisted of obtaining central trend and scatter measurements. A multivariate analysis of variance (ANOVA) and statistical significance were performed. 118 individuals were studied, 60 males and 58 females, with a mean age of 47.7 +/- 16.7, and a range of 18-85 years. The anteversion of the entire group was 18.6 degrees +/- 4.1 degrees. Anteversion in males was 17.3 degrees +/- 3.5 degrees (10 degrees - 25 degrees) and in females 19.8 degrees +/- 4.7 degrees (10 degrees - 31 degrees). There were no statistically significant differences (p < or = 0.05) between right and left anteversion in the entire group. However, there were statistically significant differences (p < or = 0.005) on both the right and left sides when males and females were compared. Our study showed that there are great variations in the anteversion ranges of a healthy population. When our results are compared with those published by other authors, the mean of most measurements exceeds 15 degrees. This should be useful to make therapeutic decisions that involve acetabular anteversion.
An evaluation of various methods of treatment for Legg-Calvé-Perthes disease.
Wang, L; Bowen, J R; Puniak, M A; Guille, J T; Glutting, J
1995-05-01
An analysis of 5 methods of treatment for Legg-Calvé-Perthes disease was done on 124 patients with 141 affected hips. Before treatment, all groups were statistically similar concerning initial Mose measurement, age at onset of the disease, gender, and Catterall class. Treatments included the Scottish Rite orthosis (41 hips), nonweight bearing and exercises (41 hips), Petrie cast (29 hips), femoral varus osteotomy (15 hips), or Salter osteotomy (15 hips). Hips treated by the Scottish Rite orthosis showed significantly worsening Mose measurements across time (repeated measures analysis of variance, post hoc analyses, p < 0.05). For the other 4 treatment methods, there was no statistically significant change. At follow-up, the Mose measurements for hips treated with the Scottish Rite orthosis were significantly worse than those for hips treated by nonweight bearing and exercises, Petrie cast, varus osteotomy, or Salter osteotomy (repeated measures analysis of variance, post hoc analyses, p < 0.05). There was, however, no significant difference in the distribution of hips according to the Stulberg et al classification at the last follow-up.
Scherrer, Carol S.; Jacobson, Susan
2002-01-01
The roles of academic health sciences librarians are continually evolving as librarians initiate new programs and services in response to developments in computer technology and user demands. However, statistics currently collected by libraries do not accurately reflect or measure these new roles. It is essential for librarians to document, measure, and evaluate these new activities to continue to meet the needs of users and to ensure the viability of their professional role. To determine what new measures should be compiled, the authors examined current statistics, user demands, professional literature, and current activities of librarians as reported in abstracts of poster sessions at Medical Library Association annual meetings. Three new categories of services to be measured are proposed. The first, consultation, groups activities such as quality filtering and individual point-of-need instruction. The second, outreach, includes activities such as working as liaisons, participating in grand rounds or morning report, and providing continuing education. The third area, Web authoring, encompasses activities such as designing Web pages, creating online tutorials, and developing new products. Adding these three measures to those already being collected will provide a more accurate and complete depiction of the services offered by academic health sciences librarians. PMID:11999174
Challenges of Big Data Analysis.
Fan, Jianqing; Han, Fang; Liu, Han
2014-06-01
Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
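One of the named challenges, spurious correlation, is easy to demonstrate numerically: with a fixed sample size, the largest sample correlation between a response and a growing set of completely unrelated predictors keeps rising. A small simulation (illustrative only):

```python
# Spurious correlation grows with dimensionality even under pure noise.
import numpy as np

rng = np.random.default_rng(4)
n = 50
y = rng.normal(size=n)
for p in (10, 1_000, 100_000):
    X = rng.normal(size=(n, p))     # predictors independent of y
    # Sample correlation of y with each column; true correlations are all zero.
    Xc = (X - X.mean(0)) / X.std(0)
    yc = (y - y.mean()) / y.std()
    r = Xc.T @ yc / n
    print(f"p={p:>6}: max |spurious correlation| = {np.abs(r).max():.2f}")
```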
Laser-diagnostic mapping of temperature and soot statistics in a 2-m diameter turbulent pool fire
Kearney, Sean P.; Grasser, Thomas W.
2017-08-10
We present spatial profiles of temperature and soot-volume-fraction statistics from a sooting 2-m base diameter turbulent pool fire, burning a 10%-toluene / 90%-methanol fuel mixture. Dual-pump coherent anti-Stokes Raman scattering and laser-induced incandescence are utilized to obtain radial profiles of temperature and soot probability density functions (pdf) as well as estimates of temperature/soot joint statistics at three vertical heights above the surface of the methanol/toluene fuel pool. Results are presented both in the fuel vapor-dome region at ¼ base diameter and in the actively burning region at ½ and ¾ diameters above the fuel surface. The spatial evolution of the soot and temperature pdfs is discussed and profiles of the temperature and soot mean and rms statistics are provided. Joint temperature/soot statistics are presented as spatially resolved conditional averages across the fire plume, and in terms of a joint pdf obtained by including measurements from multiple spatial locations.
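The data-reduction step described in the last sentence can be sketched generically: bin simultaneous temperature/soot samples into a joint histogram and compute conditional averages of soot given temperature. The sample values below are synthetic stand-ins, not the measured data.

```python
# Joint pdf and conditional averages from paired temperature/soot samples.
import numpy as np

rng = np.random.default_rng(9)
T = rng.normal(1400.0, 250.0, size=5000)   # temperature samples (K), synthetic
fv = np.exp(-(T - 1600.0) ** 2 / 8e4) + rng.normal(0, 0.05, 5000)  # soot (a.u.)

# Joint pdf over an ensemble of simultaneous samples.
joint_pdf, T_edges, fv_edges = np.histogram2d(T, fv, bins=30, density=True)

# Conditional average of soot given temperature, bin by bin.
bins = np.digitize(T, T_edges[1:-1])
cond_mean = [fv[bins == b].mean() for b in range(30) if np.any(bins == b)]
print("joint pdf shape:", joint_pdf.shape, "| conditional means over",
      len(cond_mean), "temperature bins")
```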
Statistical model for speckle pattern optimization.
Su, Yong; Zhang, Qingchuan; Gao, Zeren
2017-11-27
Image registration is the key technique of optical metrologies such as digital image correlation (DIC), particle image velocimetry (PIV), and speckle metrology. Its performance depends critically on the quality of the image pattern, and thus pattern optimization attracts extensive attention. In this article, a statistical model is built to optimize speckle patterns that are composed of randomly positioned speckles. It is found that the process of speckle pattern generation is essentially a filtered Poisson process. The dependence of measurement errors (including systematic errors, random errors, and overall errors) upon speckle pattern generation parameters is characterized analytically. By minimizing the errors, formulas for the optimal speckle radius are presented. Although the primary motivation is from the field of DIC, we believe that scholars in other optical measurement communities, such as PIV and speckle metrology, will benefit from these discussions.
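A minimal sketch of the generation process the model refers to, a filtered Poisson process: a Poisson-distributed number of speckles is dropped at uniform random positions and filtered with a Gaussian spot profile. The parameters are illustrative, not the paper's optima.

```python
# Speckle pattern generation as a filtered Poisson process.
import numpy as np

rng = np.random.default_rng(5)
size, density, radius = 256, 0.02, 3.0   # image size, speckles/pixel, speckle radius

n_speckles = rng.poisson(density * size * size)   # Poisson-distributed count
xy = rng.uniform(0, size, size=(n_speckles, 2))   # uniform random positions

yy, xx = np.mgrid[0:size, 0:size]
image = np.zeros((size, size))
for cx, cy in xy:
    # Filter each point event with a Gaussian speckle profile.
    image += np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * radius**2))
print(f"{n_speckles} speckles, mean intensity {image.mean():.3f}")
```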
Molshatzki, Noa; Drory, Yaacov; Myers, Vicki; Goldbourt, Uri; Benyamini, Yael; Steinberg, David M; Gerber, Yariv
2011-07-01
The relationship of risk factors to outcomes has traditionally been assessed by measures of association such as odds ratio or hazard ratio and their statistical significance from an adjusted model. However, a strong, highly significant association does not guarantee a gain in stratification capacity. Using recently developed model performance indices, we evaluated the incremental discriminatory power of individual and neighborhood socioeconomic status (SES) measures after myocardial infarction (MI). Consecutive patients aged ≤65 years (N=1178) discharged from 8 hospitals in central Israel after incident MI in 1992 to 1993 were followed up through 2005. A basic model (demographic variables, traditional cardiovascular risk factors, and disease severity indicators) was compared with an extended model including SES measures (education, income, employment, living with a steady partner, and neighborhood SES) in terms of Harrell c statistic, integrated discrimination improvement (IDI), and net reclassification improvement (NRI). During the 13-year follow-up, 326 (28%) patients died. Cox proportional hazards models showed that all SES measures were significantly and independently associated with mortality. Furthermore, compared with the basic model, the extended model yielded substantial gains (all P<0.001) in c statistic (0.723 to 0.757), NRI (15.2%), IDI (5.9%), and relative IDI (32%). Improvement was observed both for sensitivity (classification of events) and specificity (classification of nonevents). This study illustrates the additional insights that can be gained from considering the IDI and NRI measures of model performance and suggests that, among community patients with incident MI, incorporating SES measures into a clinical-based model substantially improves long-term mortality risk prediction.
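A hedged sketch of the performance comparison described above, using simulated data and logistic models as a stand-in for the study's Cox models: the c statistic is computed for a basic and an SES-extended model, and the IDI is the difference of discrimination slopes.

```python
# Gain in c statistic and IDI when extra covariates are added (simulated data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 1178
basic = rng.normal(size=(n, 3))   # demographic/clinical covariates
ses = rng.normal(size=(n, 2))     # socioeconomic covariates
logit = 0.8 * basic[:, 0] + 0.6 * ses[:, 0] - 1.0
death = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

ext = np.hstack([basic, ses])
p_basic = LogisticRegression().fit(basic, death).predict_proba(basic)[:, 1]
p_ext = LogisticRegression().fit(ext, death).predict_proba(ext)[:, 1]

# IDI: gain in discrimination slope (mean predicted risk, events minus nonevents).
idi = (p_ext[death == 1].mean() - p_basic[death == 1].mean()) \
    - (p_ext[death == 0].mean() - p_basic[death == 0].mean())
print(f"c: {roc_auc_score(death, p_basic):.3f} -> {roc_auc_score(death, p_ext):.3f}, "
      f"IDI = {idi:.3f}")
```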
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
Testing alternative ground water models using cross-validation and other methods
Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.
2007-01-01
Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
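For the information criteria named above, a small helper shows how alternative least-squares calibrations can be ranked by AICc and BIC; the residual sums of squares, sample size, and parameter counts below are illustrative, not the Maggia Valley results.

```python
# Ranking alternative models with AICc and BIC (least-squares likelihood).
import numpy as np

def aicc_bic(ssr, n, k):
    """AICc and BIC for a least-squares model: k parameters, n observations."""
    loglik = -0.5 * n * (np.log(2 * np.pi * ssr / n) + 1)   # maximized log-likelihood
    aic = 2 * k - 2 * loglik
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)              # small-sample correction
    bic = k * np.log(n) - 2 * loglik
    return aicc, bic

# Three hypothetical alternative models: (sum of squared residuals, parameters).
for name, ssr, k in [("1 K-zone", 12.0, 2), ("3 K-zones", 9.5, 4), ("5 K-zones", 9.3, 6)]:
    aicc, bic = aicc_bic(ssr, n=100, k=k)
    print(f"{name}: AICc={aicc:.1f}, BIC={bic:.1f}")   # smaller is better
```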
Bahtiri, Elton; Islami, Hilmi; Hoxha, Rexhep; Gashi, Afrim; Thaçi, Kujtim; Karakulak, Çağla; Thaçi, Shpetim; Qorraj Bytyqi, Hasime
2017-03-01
Proton pump inhibitors (PPIs) are a widely used class of drugs because of a generally acceptable safety profile. Among recently raised safety issues of the long-term use of PPIs is the increased risk of developing hypomagnesemia. As there have been very few prospective studies measuring serum magnesium levels before and after PPI therapy, we aimed to prospectively assess the potential association between PPI therapy for 12 months and the risk of hypomagnesemia as well as the incidence of new-onset hypomagnesemia during the study. In addition, the association of PPI therapy with the risk of hypocalcemia was assessed. The study included 250 patients with normal serum magnesium and total calcium levels, who underwent a long-term PPI treatment. Serum magnesium, total calcium, and parathormone (PTH) levels were measured at baseline and after 12 months. Of the 250 study participants, 209 completed 12 months of treatment and were included in the statistical analysis. The Wilcoxon signed rank test showed no statistically significant differences in serum magnesium levels between measurements at two different time points. However, there were statistically significant differences in serum total calcium and PTH levels in PPI users. Stable serum magnesium levels were demonstrated after 12 months and no association between PPI use and risk of hypomagnesemia was shown in the general population. Significant reductions of serum total calcium levels were demonstrated among PPI users; nevertheless, further research is required before recommending any serum calcium and PTH level monitoring in patients initiated on long-term PPI therapy.
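The paired comparison reported above can be reproduced in outline with SciPy's Wilcoxon signed-rank test; the serum magnesium values below are invented for illustration and simulate the study's null result of no true change.

```python
# Paired Wilcoxon signed-rank test on simulated before/after measurements.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(7)
baseline = rng.normal(0.85, 0.06, size=209)            # serum Mg (mmol/L), baseline
month12 = baseline + rng.normal(0.0, 0.03, size=209)   # no true change at 12 months

stat, p = wilcoxon(baseline, month12)
print(f"Wilcoxon signed-rank: W={stat:.0f}, p={p:.3f}")
```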
Resnick, Cory M; Dentino, Kelley; Katz, Eliot; Mulliken, John B; Padwa, Bonnie L
2016-09-01
Tongue-lip adhesion (TLA) is commonly used to relieve obstructive sleep apnea (OSA) in infants with Robin sequence (RS), but few studies have evaluated its efficacy with objective measures. The purpose of this study was to measure TLA outcomes using polysomnography. Our hypothesis was that TLA relieves OSA in most infants. This is a retrospective study of infants with RS who underwent TLA from 2011 to 2014 and had at least a postoperative polysomnogram. Predictor variables included demographic and birth characteristics, surgeon, syndromic diagnosis, GILLS score, preoperative OSA severity, and clinical course. A successful outcome was defined as minimal OSA (apnea-hypopnea index score < 5) on postoperative polysomnogram and no need for additional airway intervention. Descriptive, bivariate, and regression statistics were computed, and statistical significance was set at P < .05. Eighteen subjects who had TLA at a mean age of 28 ± 4.7 days were included. Thirteen (72.2%) had a confirmed or suspected syndrome, and the mean GILLS score was 3 ± 0.3. All parameters trended toward improvement from the preoperative to postoperative polysomnograms, and improvement in OSA severity, oxygen saturation nadir, and arousals per hour was statistically significant (P < .02). This effect was significant across categories of surgeon, syndrome, and GILLS score. Nine subjects (50%) met the criteria for a successful outcome. Bivariate and regression analyses did not demonstrate a significant relationship between success and any predictor variable. TLA improved airway obstruction in all infants with RS but resolved OSA in only nine patients, and success was unpredictable.
A new u-statistic with superior design sensitivity in matched observational studies.
Rosenbaum, Paul R
2011-09-01
In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments-that is, it often has good Pitman efficiency-but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.
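The quoted setting (250 pair differences, Normal with expectation 1/2 and variance 1) is easy to simulate. The sketch below estimates only the ordinary randomization power of Wilcoxon's signed-rank test, not the sensitivity-analysis power, which additionally allows an unobserved bias of magnitude Gamma and is what drops to 0.08 in the paper's example.

```python
# Monte Carlo power of the Wilcoxon signed-rank test at the quoted effect size.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(8)
n, reps, alpha = 250, 2000, 0.05
rejections = sum(
    wilcoxon(rng.normal(0.5, 1.0, size=n)).pvalue < alpha for _ in range(reps)
)
print(f"estimated power: {rejections / reps:.3f}")   # essentially 1 at this effect size
```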
Trickey, Amber W; Crosby, Moira E; Singh, Monika; Dort, Jonathan M
2014-12-01
The application of evidence-based medicine to patient care requires unique skills of the physician. Advancing residents' abilities to accurately evaluate the quality of evidence is built on understanding of fundamental research concepts. The American Board of Surgery In-Training Examination (ABSITE) provides a relevant measure of surgical residents' knowledge of research design and statistics. We implemented a research education curriculum in an independent academic medical center general residency program, and assessed the effect on ABSITE scores. The curriculum consisted of five 1-hour monthly research and statistics lectures. The lectures were presented before the 2012 and 2013 examinations. Forty residents completing ABSITE examinations from 2007 to 2013 were included in the study. Two investigators independently identified research-related item topics from examination summary reports. Correct and incorrect responses were compared precurriculum and postcurriculum. Regression models were calculated to estimate improvement in postcurriculum scores, adjusted for individuals' scores over time and postgraduate year level. Residents demonstrated significant improvement in postcurriculum examination scores for research and statistics items. Correct responses increased 27% (P < .001). Residents were 5 times more likely to achieve a perfect score on research and statistics items postcurriculum (P < .001). Residents at all levels demonstrated improved research and statistics scores after receiving the curriculum. Because the ABSITE includes a wide spectrum of research topics, sustained improvements suggest a genuine level of understanding that will promote lifelong evaluation and clinical application of the surgical literature.
The estimation of the measurement results with using statistical methods
NASA Astrophysics Data System (ADS)
Velychko, O.; Gordiyenko, T.
2015-02-01
A number of international standards and guides describe various statistical methods that can be applied to the management, control, and improvement of processes for the purpose of analyzing technical measurement results. This paper analyzes international standards and guides on statistical methods for the estimation of measurement results and gives recommendations for their application in laboratories. For this analysis, cause-and-effect Ishikawa diagrams concerning the application of statistical methods to the estimation of measurement results were constructed.
Rokstad, Anne Marie Mork; Døble, Betty Sandvik; Engedal, Knut; Kirkevold, Øyvind; Benth, Jūratė Šaltytė; Selbaek, Geir
2017-06-01
The objective of the study was to evaluate the impact of the Dementia ABC educational programme on the participants' competence in person-centred care and on their level of job satisfaction. The development of person-centred care for people with dementia is highly recommended, and staff training that enhances such an approach may positively influence job satisfaction and the possibility of recruiting and retaining competent care staff. The study is a longitudinal survey, following participants over a period of 24 months with a 6-month follow-up after completion of the programme. A total of 1,795 participants from 90 municipalities in Norway are included, and 580 from 52 municipalities completed all measurements. The person-centred care assessment tool (P-CAT) is used to evaluate person-centredness. The psychosocial workplace environment and job satisfaction questionnaire is used to investigate job satisfaction. Measurements are made at baseline, and after 12, 24 and 30 months. A statistically significant increase in the mean P-CAT subscore of person-centred practice and the P-CAT total score is found at 12, 24 and 30 months compared to baseline. A statistically significant decrease in scores in the P-CAT subscore for organisational support is found at all points of measurement compared to baseline. Statistically significant increases in satisfaction with workload, personal and professional development, demands balanced with qualifications and variation in job tasks as elements of job satisfaction are reported. The evaluation of the Dementia ABC educational programme identifies statistically significant increases in scores of person-centredness and job satisfaction, indicating that the training has a positive impact. The results indicate that a multicomponent training programme including written material, multidisciplinary reflection groups and workshops has a positive impact on the development of person-centred care practice and the job satisfaction of care staff. © 2016 John Wiley & Sons Ltd.
Navy Personnel Survey (NPS) 1990 Survey Report, Statistical Tables. Volume 1. Enlisted Personnel.
1991-08-01
Respondents were asked to provide demographic data and to indicate their attitudes or opinions on rotation/permanent change of station (PCS) moves...and measured military members' attitudes and opinions in various areas, including rotation/permanent change of station moves, recruiting duty, pay and...about the Organizational Climate: Use the space below to make any comments you wish about the organizational climate, including EO issues and sexual harassment
Quantum work in the Bohmian framework
NASA Astrophysics Data System (ADS)
Sampaio, R.; Suomela, S.; Ala-Nissila, T.; Anders, J.; Philbin, T. G.
2018-01-01
At nonzero temperature classical systems exhibit statistical fluctuations of thermodynamic quantities arising from the variation of the system's initial conditions and its interaction with the environment. The fluctuating work, for example, is characterized by the ensemble of system trajectories in phase space and, by including the probabilities for various trajectories to occur, a work distribution can be constructed. However, without phase-space trajectories, the task of constructing a work probability distribution in the quantum regime has proven elusive. Here we use quantum trajectories in phase space and define fluctuating work as power integrated along the trajectories, in complete analogy to classical statistical physics. The resulting work probability distribution is valid for any quantum evolution, including cases with coherences in the energy basis. We demonstrate the quantum work probability distribution and its properties with an exactly solvable example of a driven quantum harmonic oscillator. An important feature of the work distribution is its dependence on the initial statistical mixture of pure states, which is reflected in higher moments of the work. The proposed approach introduces a fundamentally different perspective on quantum thermodynamics, allowing full thermodynamic characterization of the dynamics of quantum systems, including the measurement process.
Common pitfalls in statistical analysis: Measures of agreement.
Ranganathan, Priya; Pramesh, C S; Aggarwal, Rakesh
2017-01-01
Agreement between measurements refers to the degree of concordance between two (or more) sets of measurements. Statistical methods to test agreement are used to assess inter-rater variability or to decide whether one technique for measuring a variable can substitute another. In this article, we look at statistical measures of agreement for different types of data and discuss the differences between these and those for assessing correlation.
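For instance, for categorical ratings by two observers a standard agreement statistic is Cohen's kappa; a minimal example, assuming scikit-learn and invented ratings:

```python
# Cohen's kappa for two raters on categorical data (illustrative ratings).
from sklearn.metrics import cohen_kappa_score

rater_a = ["mild", "severe", "mild", "moderate", "severe", "mild", "moderate"]
rater_b = ["mild", "severe", "moderate", "moderate", "severe", "mild", "mild"]

# Kappa corrects raw percent agreement for agreement expected by chance.
print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")
```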
Russell, Christopher J; Shiroishi, Mark S; Siantz, Elizabeth; Wu, Brian W; Patino, Cecilia M
2016-03-08
Ventilator-associated respiratory infections (tracheobronchitis, pneumonia) contribute significant morbidity and mortality to adults receiving care in intensive care units (ICU). Administration of broad-spectrum intravenous antibiotics, the current standard of care, may have systemic adverse effects. The efficacy of aerosolized antibiotics for treatment of ventilator-associated respiratory infections remains unclear. Our objective was to conduct a systematic review of the efficacy of aerosolized antibiotics in the treatment of ventilator-associated pneumonia (VAP) and tracheobronchitis (VAT), using the Cochrane Collaboration guidelines. We conducted a search of three databases (PubMed, Web of Knowledge and the Cochrane Collaboration) for randomized, controlled trials studying the use of nebulized antibiotics in VAP and VAT that measured clinical cure (e.g., change in Clinical Pulmonary Infection Score) as an outcome measurement. We augmented the electronic searches with hand searches of the references for any narrative review articles as well as any article included in the systematic review. Included studies were examined for risk of bias using the Cochrane Handbook's "Risk of Bias" assessment tool. Six studies met full inclusion criteria. For the systemic review's primary outcome (clinical cure), two studies found clinically and statistically significant improvements in measures of VAP cure while four found no statistically significant difference in measurements of cure. No studies found inferiority of aerosolized antibiotics. The included studies had various degrees of biases, particularly in the performance and detection bias domains. Given that outcome measures of clinical cure were not uniform, we were unable to conduct a meta-analysis. There is insufficient evidence for the use of inhaled antibiotic therapy as primary or adjuvant treatment of VAP or VAT. Additional, better-powered randomized-controlled trials are needed to assess the efficacy of inhaled antibiotic therapy for VAP and VAT.
Kavanaugh, Michael J; So, Joanne D; Park, Peter J; Davis, Konrad L
2017-02-01
Risk stratification with the Modified Early Warning System (MEWS) or electronic cardiac arrest trigger (eCART) has been utilized with ward patients to preemptively identify high-risk patients who might benefit from enhanced monitoring, including early intensive care unit (ICU) transfer. In-hospital mortality from cardiac arrest is ∼80%, making preventative interventions an important focus area. ICUs have lower patient-to-nurse ratios than wards, resulting in less emphasis on the development of ICU early warning systems. Our institution developed an early warning dashboard (EWD) identifying patients who may benefit from earlier interventions. Using the adverse outcomes of cardiac arrest, ICU mortality, and ICU readmission, a retrospective case-control study was performed using three demographic items (age, diabetes, and morbid obesity) and 24 EWD-measured items, including vital signs, laboratory values, ventilator information, and other clinical information, to validate the EWD. Ten statistically significant items were identified for cardiac arrest and 13 for ICU death. Identified items included heart rate, dialysis, leukocytosis, and lactate. The ICU readmission outcome was compared to controls from both ICU patients and ward patients, and statistical significance was identified for respiratory rate >30. With several statistically significant data elements, the EWD parameters have been incorporated into advanced clinical decision algorithms to identify at-risk ICU patients. Earlier identification and treatment of organ failure in the ICU improve outcomes, and the EWD can serve as a safety measure for at-risk in-house patients and can also extend critical care expertise through telemedicine to smaller hospitals.
Vanniyasingam, Thuva; Daly, Caitlin; Jin, Xuejing; Zhang, Yuan; Foster, Gary; Cunningham, Charles; Thabane, Lehana
2018-06-01
This study reviews simulation studies of discrete choice experiments (DCEs) to determine (i) how survey design features affect statistical efficiency and (ii) to appraise their reporting quality. Statistical efficiency was measured using relative design (D-) efficiency, D-optimality, or D-error. For this systematic survey, we searched Journal Storage (JSTOR), Science Direct, PubMed, and OVID, which included a search within EMBASE. Searches were conducted up to 2016 for simulation studies investigating the impact of DCE design features on statistical efficiency. Studies were screened and data were extracted independently and in duplicate. Results for each included study were summarized by design characteristic. Previously developed criteria for the reporting quality of simulation studies were also adapted and applied to each included study. Of 371 potentially relevant studies, 9 were found to be eligible, with several varying in study objectives. Statistical efficiency improved when increasing the number of choice tasks or alternatives; decreasing the number of attributes or attribute levels; using an unrestricted continuous "manipulator" attribute; using model-based approaches with covariates incorporating response behaviour; using sampling approaches that incorporate previous knowledge of response behaviour; incorporating heterogeneity in a model-based design; correctly specifying Bayesian priors; minimizing parameter prior variances; and using an appropriate method to create the DCE design for the research question. The simulation studies performed well in terms of reporting quality. Improvement is needed with regard to clearly specifying study objectives, number of failures, random number generators, starting seeds, and the software used. These results identify the best approaches to structuring a DCE. An investigator can manipulate design characteristics to help reduce response burden and increase statistical efficiency. Because the studies varied in their objectives, conclusions were drawn on several design characteristics; however, the validity of each conclusion was limited. Further research should be conducted to explore these conclusions in various design settings and scenarios. Additional reviews exploring other statistical efficiency outcomes and databases could also be performed to enhance the conclusions identified in this review.
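To make the efficiency criterion concrete: D-error is typically computed from the determinant of the inverse of a normalized information matrix for a coded design. Below is a minimal sketch, assuming a linear-model approximation and an illustrative effects-coded factorial design rather than any of the reviewed designs.

```python
import numpy as np

def d_error(X):
    """D-error of a coded design matrix X (lower is better): determinant
    of the inverse normalized information matrix, scaled by the number
    of parameters K."""
    n, k = X.shape
    info = X.T @ X / n               # normalized information matrix
    return np.linalg.det(np.linalg.inv(info)) ** (1.0 / k)

# Illustrative effects-coded 2^3 factorial design: 8 tasks, 3 attributes.
X = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)],
             dtype=float)
print(d_error(X))   # orthogonal design -> D-error = 1.0 here
```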
A Geostatistical Scaling Approach for the Generation of Non Gaussian Random Variables and Increments
NASA Astrophysics Data System (ADS)
Guadagnini, Alberto; Neuman, Shlomo P.; Riva, Monica; Panzeri, Marco
2016-04-01
We address manifestations of non-Gaussian statistical scaling displayed by many variables, Y, and their (spatial or temporal) increments. Evidence of such behavior includes symmetry of increment distributions at all separation distances (or lags), with sharp peaks and heavy tails that tend to decay asymptotically as lag increases. Variables reported to exhibit such distributions include quantities of direct relevance to the hydrogeological sciences, e.g., porosity, log permeability, electrical resistivity, soil and sediment texture, sediment transport rate, rainfall, and measured and simulated turbulent fluid velocity, among others. No model known to us captures all of the documented statistical scaling behaviors in a unique and consistent manner. We recently proposed a generalized sub-Gaussian (GSG) model which reconciles within a unique theoretical framework the probability distributions of a target variable and its increments. We presented an algorithm to generate unconditional random realizations of statistically isotropic or anisotropic GSG functions and illustrated it in two dimensions. In this context, we demonstrated the feasibility of estimating all key parameters of a GSG model underlying a single realization of Y by jointly analyzing spatial moments of Y data and corresponding increments. Here, we extend our GSG model to account for noisy measurements of Y at a discrete set of points in space (or time), present an algorithm to generate conditional realizations of the corresponding isotropic or anisotropic random field, and explore them on one- and two-dimensional synthetic test cases.
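The sub-Gaussian construction lends itself to a compact illustration. Below is a minimal one-dimensional sketch assuming the simplest variant of the idea, Y = U·G with G a correlated Gaussian field and U a lognormal subordinator; the exponential covariance, grid, and parameters are illustrative choices, not the authors' calibrated model.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 200)

# Correlated Gaussian field G with an (illustrative) exponential covariance.
cov = np.exp(-np.abs(x[:, None] - x[None, :]))
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(x)))

def gsg_realization():
    """One realization of Y = U * G: a lognormal random scale U applied
    to a correlated Gaussian field G."""
    G = L @ rng.standard_normal(len(x))
    U = rng.lognormal(mean=0.0, sigma=0.6)
    return U * G

# Mixing over the random scale U makes pooled increments symmetric,
# sharply peaked, and heavy-tailed (positive excess kurtosis).
incr = np.concatenate([np.diff(gsg_realization()) for _ in range(500)])
z = (incr - incr.mean()) / incr.std()
print("excess kurtosis of increments:", (z ** 4).mean() - 3.0)
```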
Arthroscopy for treating temporomandibular joint disorders.
Currie, Roger
2011-01-01
The Cochrane Oral Health Group Trials Register, the Cochrane Central Register of Controlled Trials (CENTRAL), Medline, Embase, Lilacs, Allied and Complementary Medicine Database (AMED) and CINAHL databases were searched. In addition, the reference lists of the included articles were checked and 14 journals were hand searched. Randomised controlled clinical trials (RCTs) of arthroscopy for treating TMDs were included. There were no restrictions regarding the language or date of publication. Two review authors independently extracted data, and three review authors independently assessed the risk of bias of included trials. The authors of the selected articles were contacted for additional information. Pooling of trials was only attempted if at least two trials with comparable protocols, the same conditions, and similar outcome measurements were available. Statistical analysis was performed in accordance with the Cochrane Collaboration guidelines. Seven RCTs (n = 349) met the inclusion criteria. All the studies were either at high or unclear risk of bias. Pain was evaluated after six months in two studies. No statistically significant differences were found between the arthroscopy and nonsurgical groups (standardised mean difference (SMD) = 0.004; 95% confidence interval (CI) -0.46 to 0.55, P = 0.81). Two studies analysed pain 12 months after surgery (arthroscopy and arthrocentesis) in 81 patients. No statistically significant differences were found (mean difference (MD) = 0.10; 95% CI -1.46 to 1.66, P = 0.90). Three studies analysed the same outcome in patients who had undergone arthroscopic surgery or open surgery, and a statistically significant difference was found after 12 months (SMD = 0.45; 95% CI 0.01 to 0.89, P = 0.05) in favour of open surgery. Two studies compared six different clinical outcomes (interincisal opening over 35 mm; maximum protrusion over 5 mm; click; crepitation; tenderness on palpation in the TMJ and the jaw muscles) 12 months after arthroscopy and open surgery. The outcome measures did not show statistically significant differences (odds ratio (OR) = 1.00; 95% CI 0.45 to 2.21, P = 1.00). Two studies compared the maximum interincisal opening after 12 months of postsurgical follow-up. A statistically significant difference in favour of the arthroscopy group was observed (MD = 5.28; 95% CI 3.46 to 7.10, P < 0.0001). Two studies compared mandibular function after 12 months of follow-up, with 40 patients evaluated. The outcome measure was mandibular functionality (MFIQ). The difference was not statistically significant (MD = 1.58; 95% CI -0.78 to 3.94, P = 0.19). Both arthroscopy and nonsurgical treatments reduced pain after six months. When compared with arthroscopy, open surgery was more effective at reducing pain after 12 months. Nevertheless, there were no differences in mandibular functionality or in other outcomes in clinical evaluations. Arthroscopy led to greater improvement in maximum interincisal opening after 12 months than arthrocentesis; however, there was no difference in pain.
Padmanabhan, Shyam; Dommy, Ahila; Guru, Sanjeela R.; Joseph, Ajesh
2017-01-01
Aim: Periodontists frequently experience difficulty in accurately assessing and treating furcation areas affected by periodontal disease. Furcation involvement (FI) most commonly affects the mandibular molars. Diagnosis of furcation-involved teeth is mainly by assessment of probing pocket depth, clinical attachment level, furcation entrance probing, and intraoral periapical radiographs. Three-dimensional imaging has provided the clinician with an advantage in the assessment of bone morphology. Thus, the present study aimed to compare the diagnostic efficacy of cone-beam computed tomography (CBCT) against direct intrasurgical measurements of furcation defects in mandibular molars. Subjects and Methods: The study population included 14 patients with 25 mandibular molar furcation sites. CBCT was performed to measure height, width, and depth of furcation defects of mandibular molars with Grade II and Grade III FI. Intrasurgical measurements of the FI were made during periodontal flap surgery in indicated teeth and compared with the CBCT measurements. Statistical analysis was done using the paired t-test and a Bland–Altman plot. Results: The CBCT versus intrasurgical furcation measurements were 2.18 ± 0.86 mm and 2.30 ± 0.89 mm for furcation height, 1.87 ± 0.52 mm and 1.84 ± 0.49 mm for furcation width, and 3.81 ± 1.37 mm and 4.05 ± 1.49 mm for furcation depth, respectively. Results showed no statistically significant difference between the measured parameters, indicating that the two methods were statistically similar. Conclusion: The accuracy of assessment of mandibular molar FI by CBCT was comparable to that of direct surgical measurements. These findings indicate that CBCT is an excellent adjunctive diagnostic tool in periodontal treatment planning. PMID:29042732
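A minimal sketch of the paired comparison reported above, using SciPy's paired t-test; the furcation-depth values here are illustrative stand-ins, not the study's data.

```python
import numpy as np
from scipy import stats

# Illustrative paired measurements (mm): CBCT vs. direct intrasurgical.
cbct = np.array([3.5, 4.1, 2.9, 3.8, 4.4, 3.2])
direct = np.array([3.7, 4.3, 3.1, 3.9, 4.8, 3.3])

t, p = stats.ttest_rel(cbct, direct)
print(f"t = {t:.2f}, p = {p:.3f}")  # p > 0.05 -> no evidence the methods differ
```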
Refractive errors in patients with newly diagnosed diabetes mellitus.
Yarbağ, Abdülhekim; Yazar, Hayrullah; Akdoğan, Mehmet; Pekgör, Ahmet; Kaleli, Suleyman
2015-01-01
Diabetes mellitus is a complex metabolic disorder that involves the small blood vessels, often causing widespread tissue damage, including to the eye, where it can alter refractive error. In patients with newly diagnosed diabetes mellitus who have unstable blood glucose levels, measured refraction may be inaccurate. We aimed to investigate refraction in patients who were recently diagnosed with diabetes and treated at our centre. This prospective study was performed from February 2013 to January 2014. Patients were diagnosed with diabetes mellitus using laboratory biochemical tests and clinical examination. Venous fasting plasma glucose (fpg) levels were measured along with refractive errors. Two measurements were taken: initially and after four weeks. The difference between the initial and final refractive measurements was evaluated. The patients were 100 males and 30 females newly diagnosed with type II DM. The refractive and fpg levels were measured twice in all patients. The average values of the initial measurements were as follows: fpg level, 415 mg/dl; average refractive value, +2.5 D (dioptres). The average end-of-period measurements were: fpg, 203 mg/dl; average refractive value, +0.75 D. There was a statistically significant difference between the initial and four-week measurements of fasting plasma glucose (p<0.05), a statistically significant relationship between changes in fpg and changes in spectacle prescription (p<0.05), and a statistically significant disappearance of blurred vision (a success rate greater than 50%) (p<0.05). No effects of age or sex were detected in any of these results (p>0.05). Refractive error is affected in patients with newly diagnosed diabetes mellitus; therefore, plasma glucose levels should be considered in the selection of glasses.
Statistical Reviewers Improve Reporting in Biomedical Articles: A Randomized Trial
Cobo, Erik; Selva-O'Callagham, Albert; Ribera, Josep-Maria; Cardellach, Francesc; Dominguez, Ruth; Vilardell, Miquel
2007-01-01
Background Although peer review is widely considered to be the most credible way of selecting manuscripts and improving the quality of accepted papers in scientific journals, there is little evidence to support its use. Our aim was to estimate the effects on manuscript quality of either adding a statistical peer reviewer or suggesting the use of checklists such as CONSORT or STARD to clinical reviewers, or both. Methodology and Principal Findings Interventions were defined as 1) the addition of a statistical reviewer to the clinical peer review process, and 2) suggesting reporting guidelines to reviewers; with “no statistical expert” and “no checklist” as controls. The two interventions were crossed in a 2×2 balanced factorial design including original research articles consecutively selected, between May 2004 and March 2005, by the Medicina Clinica (Barc) editorial committee. We randomized manuscripts to minimize differences in terms of baseline quality and type of study (intervention, longitudinal, cross-sectional, others). Sample-size calculations indicated that 100 papers provide 80% power to test a 55% standardized difference. We specified the main outcome as the increment in quality of papers as measured on the Goodman Scale. Two blinded evaluators rated the quality of manuscripts at initial submission and at the final post-peer-review version. Of the 327 manuscripts submitted to the journal, 131 were accepted for further review, and 129 were randomized. Of those, 14 that were lost to follow-up showed no differences in initial quality from the followed-up papers. Hence, 115 were included in the main analysis, with 16 rejected for publication after peer review. 21 (18.3%) of the 115 included papers were interventions, 46 (40.0%) were longitudinal designs, 28 (24.3%) cross-sectional and 20 (17.4%) others. The 16 (13.9%) rejected papers had a significantly lower initial score on the overall Goodman scale than accepted papers (difference 15.0, 95% CI: 4.6 to 24.4). Suggesting a guideline to the reviewers had no effect on change in overall quality as measured by the Goodman scale (0.9, 95% CI: −0.3 to +2.1). The estimated effect of adding a statistical reviewer was 5.5 (95% CI: 4.3 to 6.7), showing a significant improvement in quality. Conclusions and Significance This prospective randomized study shows the positive effect of adding a statistical reviewer to the field-expert peers in improving manuscript quality. We did not find a statistically significant positive effect of suggesting reviewers use reporting guidelines. PMID:17389922
The extension of total gain (TG) statistic in survival models: properties and applications.
Choodari-Oskooei, Babak; Royston, Patrick; Parmar, Mahesh K B
2015-07-01
The results of multivariable regression models are usually summarized in the form of parameter estimates for the covariates, goodness-of-fit statistics, and the relevant p-values. These statistics do not inform us about whether covariate information will lead to any substantial improvement in prediction. Predictive ability measures can be used for this purpose since they provide important information about the practical significance of prognostic factors. R²-type indices are the most familiar forms of such measures in survival models, but they all have limitations and none is widely used. In this paper, we extend the total gain (TG) measure, proposed for a logistic regression model, to survival models and explore its properties using simulations and real data. TG is based on the binary regression quantile plot, otherwise known as the predictiveness curve. Standardised TG ranges from 0 (no explanatory power) to 1 ('perfect' explanatory power). The results of our simulations show that unlike many of the other R²-type predictive ability measures, TG is independent of random censoring. It increases as the effect of a covariate increases and can be applied to different types of survival models, including models with time-dependent covariate effects. We also apply TG to quantify the predictive ability of multivariable prognostic models developed in several disease areas. Overall, TG performs well in our simulation studies and can be recommended as a measure to quantify the predictive ability in survival models.
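A minimal sketch of the logistic-regression version of TG that the paper extends, assuming the usual construction: integrate the absolute distance between the predictiveness curve (fitted risks ordered by risk quantile) and the overall prevalence, then standardize by the maximum attainable value 2π(1−π). The data and model below are simulated for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2))
y = (rng.random(500) < 1 / (1 + np.exp(-1.5 * X[:, 0]))).astype(int)

risk = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
prev = y.mean()

# Predictiveness curve: fitted risks ordered by risk quantile v in (0, 1].
curve = np.sort(risk)
tg = np.mean(np.abs(curve - prev))        # approximates the integral over v
std_tg = tg / (2 * prev * (1 - prev))     # 0 = no explanatory power, 1 = 'perfect'
print(f"standardized TG = {std_tg:.2f}")
```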
Thermal equilibrium and statistical thermometers in special relativity.
Cubero, David; Casado-Pascual, Jesús; Dunkel, Jörn; Talkner, Peter; Hänggi, Peter
2007-10-26
There is an intense debate in the recent literature about the correct generalization of Maxwell's velocity distribution in special relativity. The most frequently discussed candidate distributions include the Jüttner function as well as modifications thereof. Here we report results from fully relativistic one-dimensional molecular dynamics simulations that resolve the ambiguity. The numerical evidence unequivocally favors the Jüttner distribution. Moreover, our simulations illustrate that the concept of "thermal equilibrium" extends naturally to special relativity only if a many-particle system is spatially confined. They make evident that "temperature" can be statistically defined and measured in an observer-frame-independent way.
Evaluating Cellular Polyfunctionality with a Novel Polyfunctionality Index
Larsen, Martin; Sauce, Delphine; Arnaud, Laurent; Fastenackels, Solène; Appay, Victor; Gorochov, Guy
2012-01-01
Functional evaluation of naturally occurring or vaccination-induced T cell responses in mice, men and monkeys has in recent years advanced from single-parameter (e.g. IFN-γ-secretion) to much more complex multidimensional measurements. Co-secretion of multiple functional molecules (such as cytokines and chemokines) at the single-cell level is now measurable, due primarily to major advances in multiparametric flow cytometry. The very extensive and complex datasets generated by this technology raise the demand for proper analytical tools that enable the analysis of combinatorial functional properties of T cells, hence polyfunctionality. Presently, multidimensional functional measures are analysed either by evaluating all combinations of parameters individually or by summing frequencies of combinations that include the same number of simultaneous functions. Often these evaluations are visualized as pie charts. Whereas pie charts effectively represent and compare average polyfunctionality profiles of particular T cell subsets or patient groups, they do not document the degree or variation of polyfunctionality within a group, nor do they allow more sophisticated statistical analysis. Here we propose a novel polyfunctionality index that numerically evaluates the degree and variation of polyfunctionality, and enables comparative and correlative parametric and non-parametric statistical tests. Moreover, it allows the usage of more advanced statistical approaches, such as cluster analysis. We believe that the polyfunctionality index will render polyfunctionality an appropriate end-point measure in future studies of T cell responsiveness. PMID:22860124
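In the spirit of the index described above (the exact published formula should be taken from the article itself), here is a minimal sketch that weights the frequency of cells co-expressing i of n functions by (i/n)^q; the exponent q and the frequencies are illustrative.

```python
import numpy as np

def polyfunctionality_index(freqs, q=1.0):
    """freqs[i] = fraction of cells expressing exactly i of n functions
    (i = 0..n). Weight each fraction by (i/n)**q and sum; higher values
    indicate a more polyfunctional response."""
    freqs = np.asarray(freqs, float)
    n = len(freqs) - 1
    weights = (np.arange(n + 1) / n) ** q
    return float(np.sum(freqs * weights))

# Illustrative: 5 measured functions; most cells express 0-1 functions.
freqs = [0.40, 0.30, 0.15, 0.08, 0.05, 0.02]
print(polyfunctionality_index(freqs))
```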
NASA Astrophysics Data System (ADS)
Darvishzadeh, R.; Skidmore, A. K.; Mirzaie, M.; Atzberger, C.; Schlerf, M.
2014-12-01
Accurate estimation of grassland biomass at peak productivity can provide crucial information regarding the functioning and productivity of rangelands. Hyperspectral remote sensing has proved valuable for the estimation of vegetation biophysical parameters such as biomass using different statistical techniques. However, in statistical analysis of hyperspectral data, multicollinearity is a common problem due to the large number of correlated hyperspectral reflectance measurements. The aim of this study was to examine the prospects for above-ground biomass estimation in a heterogeneous Mediterranean rangeland employing multivariate calibration methods. Canopy spectral measurements were made in the field using a GER 3700 spectroradiometer, along with concomitant in situ measurements of above-ground biomass for 170 sample plots. Multivariate calibrations including partial least squares regression (PLSR), principal component regression (PCR), and least-squares support vector machine (LS-SVM) were used to estimate the above-ground biomass. The prediction accuracy of the multivariate calibration methods was assessed using cross-validated R2 and RMSE. The best model performance was obtained using LS-SVM and then PLSR, both calibrated with the first-derivative reflectance dataset (R2cv = 0.88 and 0.86, RMSEcv = 1.15 and 1.07, respectively). The weakest prediction accuracy appeared when PCR was used (R2cv = 0.31 and RMSEcv = 2.48). The obtained results highlight the importance of multivariate calibration methods for biomass estimation when hyperspectral data are used.
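A minimal sketch of one of the calibrations compared above, cross-validated PLSR on first-derivative spectra; the spectra and biomass values are synthetic stand-ins for the field measurements, and the component count is an illustrative choice.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
refl = np.cumsum(rng.standard_normal((170, 300)), axis=1)  # 170 plots, 300 bands
biomass = refl[:, 120] * 0.05 + rng.normal(0, 0.5, 170)    # synthetic target

X = np.diff(refl, axis=1)  # first-derivative spectra
pred = cross_val_predict(PLSRegression(n_components=8), X, biomass, cv=10).ravel()

ss_res = np.sum((biomass - pred) ** 2)
r2cv = 1 - ss_res / np.sum((biomass - biomass.mean()) ** 2)
rmsecv = np.sqrt(ss_res / len(biomass))
print(f"R2cv = {r2cv:.2f}, RMSEcv = {rmsecv:.2f}")
```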
Angular Baryon Acoustic Oscillation measure at z=2.225 from the SDSS quasar survey
NASA Astrophysics Data System (ADS)
de Carvalho, E.; Bernui, A.; Carvalho, G. C.; Novaes, C. P.; Xavier, H. S.
2018-04-01
Following a quasi model-independent approach we measure the transversal BAO mode at high redshift using the two-point angular correlation function (2PACF). The analyses done here are only possible now with the quasar catalogue from the twelfth data release (DR12Q) from the Sloan Digital Sky Survey, because it is spatially dense enough to allow the measurement of the angular BAO signature with moderate statistical significance and acceptable precision. Our analyses with quasars in the redshift interval z in [2.20,2.25] produce the angular BAO scale θBAO = 1.77° ± 0.31° with a statistical significance of 2.12 σ (i.e., 97% confidence level), calculated through a likelihood analysis performed using the theoretical covariance matrix sourced by the analytical power spectra expected in the ΛCDM concordance model. Additionally, we show that the BAO signal is robust—although with less statistical significance—under diverse bin-size choices and under small displacements of the quasars' angular coordinates. Finally, we also performed cosmological parameter analyses comparing the θBAO predictions for wCDM and w(a)CDM models with angular BAO data available in the literature, including the measurement obtained here, jointly with CMB data. The constraints on the parameters ΩM, w0 and wa are in excellent agreement with the ΛCDM concordance model.
NASA Technical Reports Server (NTRS)
Racette, Paul; Lang, Roger; Zhang, Zhao-Nan; Zacharias, David; Krebs, Carolyn A. (Technical Monitor)
2002-01-01
Radiometers must be periodically calibrated because the receiver response fluctuates. Many techniques exist to correct for the time-varying response of a radiometer receiver. An analytical technique has been developed that uses generalized least squares regression (LSR) to predict the performance of a wide variety of calibration algorithms. The total measurement uncertainty, including the uncertainty of the calibration, can be computed using LSR. The uncertainties of the calibration samples used in the regression are based upon treating the receiver fluctuations as non-stationary processes. Signals originating from the different sources of emission are treated as simultaneously existing random processes. Thus, the radiometer output is a series of samples obtained from these random processes. The samples are treated as random variables, but because the underlying processes are non-stationary, the statistics of the samples are treated as non-stationary. The statistics of the calibration samples depend upon the time for which the samples are to be applied. The statistics of the random variables are equated to the mean statistics of the non-stationary processes over the interval defined by the time of the calibration sample and the time at which it is applied. This analysis opens the opportunity for experimental investigation into the underlying properties of receiver non-stationarity through the use of multiple calibration references. In this presentation we will discuss the application of LSR to the analysis of various calibration algorithms, requirements for experimental verification of the theory, and preliminary results from analyzing experimental measurements.
Space, race, and poverty: Spatial inequalities in walkable neighborhood amenities?
Aldstadt, Jared; Whalen, John; White, Kellee; Castro, Marcia C.; Williams, David R.
2017-01-01
BACKGROUND Multiple and varied benefits have been suggested for increased neighborhood walkability. However, spatial inequalities in neighborhood walkability likely exist and may be attributable, in part, to residential segregation. OBJECTIVE Utilizing a spatial demographic perspective, we evaluated potential spatial inequalities in walkable neighborhood amenities across census tracts in Boston, MA (US). METHODS The independent variables included minority racial/ethnic population percentages and percent of families in poverty. Walkable neighborhood amenities were assessed with a composite measure. Spatial autocorrelation in key study variables was first calculated with the Global Moran's I statistic. Then, Spearman correlations between neighborhood socio-demographic characteristics and walkable neighborhood amenities were calculated, as well as Spearman correlations accounting for spatial autocorrelation. We fit ordinary least squares (OLS) regression and spatial autoregressive models, when appropriate, as a final step. RESULTS Significant positive spatial autocorrelation was found in neighborhood socio-demographic characteristics (e.g. census tract percent Black), but not in walkable neighborhood amenities or in the OLS regression residuals. Spearman correlations between neighborhood socio-demographic characteristics and walkable neighborhood amenities were not statistically significant, nor were neighborhood socio-demographic characteristics significantly associated with walkable neighborhood amenities in OLS regression models. CONCLUSIONS Our results suggest that there is residential segregation in Boston and that spatial inequalities do not necessarily show up using a composite measure. COMMENTS Future research in other geographic areas (including international contexts) and using different definitions of neighborhoods (including small-area definitions) should evaluate whether spatial inequalities are found using composite measures, but should also use measures of specific neighborhood amenities. PMID:29046612
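A minimal sketch of the Global Moran's I computation used in the analysis, with a small illustrative contiguity matrix standing in for real census-tract neighbour relations.

```python
import numpy as np

def morans_i(y, W):
    """Global Moran's I: spatial autocorrelation of y under weights W
    (W[i, j] > 0 where units i and j are neighbours, zero diagonal)."""
    y = np.asarray(y, float)
    z = y - y.mean()
    n = len(y)
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

# Illustrative: 4 tracts on a line (rook contiguity) with a smooth gradient.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
print(morans_i([10, 12, 15, 18], W))   # positive -> similar values cluster
```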
Interpretation of 2-probe turbulence measurements in an axisymmetric contraction
NASA Technical Reports Server (NTRS)
Marion-Moulin, C.; Tan-Atichat, J.; Nagib, H. M.
1983-01-01
Simultaneous measurements of the streamwise and radial velocity components at two points, one on and one off the centerline with variable radial separation, were digitally recorded and processed at several stations along a four-to-one contraction with controlled upstream turbulence conditions. Various statistical quantities are presented, including spectra and coherence functions. The integral scales L_ux, L_um, L_vx, and L_vm were also estimated, and their variation along the contraction is examined.
A validity problem in measuring exposure to mass media campaigns.
Brown, J D; Bauman, K E; Padgett, C A
1990-01-01
Recognition of radio and television messages included in three mass media campaigns designed to keep adolescents from starting to smoke cigarettes was measured in six treatment and four control cities (Standard Metropolitan Statistical Areas) in the southeastern United States. The telephone survey of 574 randomly selected adolescents found high recognition of campaign messages even in the areas where the campaigns had not been broadcast. Campaign messages that differed significantly from other anti-smoking messages were less likely to be falsely recognized. These results reinforce the need to include true control groups in mass media evaluations and to construct distinctive messages if exposure is an important aspect of campaign evaluation.
Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power
Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon
2016-01-01
An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%–155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%–71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power. PMID:28479943
Cosmology constraints from shear peak statistics in Dark Energy Survey Science Verification data
Kacprzak, T.; Kirk, D.; Friedrich, O.; ...
2016-08-19
Shear peak statistics has gained a lot of attention recently as a practical alternative to the two-point statistics for constraining cosmological parameters. We perform a shear peak statistics analysis of the Dark Energy Survey (DES) Science Verification (SV) data, using weak gravitational lensing measurements from a 139 deg² field. We measure the abundance of peaks identified in aperture mass maps, as a function of their signal-to-noise ratio, in the signal-to-noise range 0 < S/N < 4. To predict the peak counts as a function of cosmological parameters we use a suite of N-body simulations spanning 158 models with varying Ωm and σ8, fixing w = -1, Ωb = 0.04, h = 0.7 and ns = 1, to which we have applied the DES SV mask and redshift distribution. In our fiducial analysis we measure σ8(Ωm/0.3)^0.6 = 0.77 ± 0.07, after marginalising over the shear multiplicative bias and the error on the mean redshift of the galaxy sample. We introduce models of intrinsic alignments, blending, and source contamination by cluster members. These models indicate that peaks with S/N > 4 would require significant corrections, which is why we do not include them in our analysis. We compare our results to the cosmological constraints from the two-point analysis on the SV field and find them to be in good agreement in both the central value and its uncertainty. We also discuss prospects for future peak statistics analysis with upcoming DES data.
Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents
NASA Astrophysics Data System (ADS)
Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.
2016-12-01
Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
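A minimal sketch of the windowed Shannon entropy idea applied to a monitoring time-series; the synthetic signal, window length, and bin count are illustrative choices, not the monitoring data analysed above.

```python
import numpy as np

def shannon_entropy(x, bin_edges):
    """Shannon entropy (bits) of x binned on fixed, pre-computed edges."""
    counts, _ = np.histogram(x, bins=bin_edges)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(7)
quiet = rng.normal(0.0, 0.1, 2000)    # low-variability background
unrest = rng.normal(0.0, 1.0, 2000)   # noisier period, e.g. during effusion
signal = np.concatenate([quiet, unrest])

# Fixed global bins keep windows comparable; entropy rises with randomness.
edges = np.histogram_bin_edges(signal, bins=32)
window = 500
entropy = [shannon_entropy(signal[i:i + window], edges)
           for i in range(0, len(signal) - window + 1, window)]
print(np.round(entropy, 2))   # entropy steps up when variability increases
```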
Rankings Methodology Hurts Public Institutions
ERIC Educational Resources Information Center
Van Der Werf, Martin
2007-01-01
In the 1980s, when the "U.S. News & World Report" rankings of colleges were based solely on reputation, the nation's public universities were well represented at the top. However, as soon as the magazine began including its "measures of excellence," statistics intended to define quality, public universities nearly disappeared from the top. As the…
Kevin M. Potter; Christopher W. Woodall; Christopher M. Oswalt; Basil V. III Iannone; Songlin Fei
2015-01-01
Biodiversity is expected to convey numerous functional benefits to forested ecosystems, including increased productivity and resilience. When assessing biodiversity, however, statistics that account for evolutionary relationships among species may be more ecologically meaningful than traditional measures such as species richness. In three broad-scale studies, we...
ERIC Educational Resources Information Center
DiLullo, Camille; McGee, Patricia; Kriebel, Richard M.
2011-01-01
The characteristic profile of Millennial Generation students, driving many educational reforms, can be challenged by research in a number of fields including cognition, learning style, neurology, and psychology. This evidence suggests that the current aggregate view of the Millennial student may be less than accurate. Statistics show that…
Railroad safety program, task 2
NASA Technical Reports Server (NTRS)
1983-01-01
Aspects of railroad safety and the preparation of a National Inspection Plan (NIP) for rail safety improvement are examined. Methodology for the allocation of inspection resources, preparation of a NIP instruction manual, and recommendations for future NIP, are described. A statistical analysis of regional rail accidents is presented with causes and suggested preventive measures included.
The Condition of K-12 Public Education in Maine: 2008
ERIC Educational Resources Information Center
Center for Education Policy, Applied Research, and Evaluation, 2008
2008-01-01
"Education Indicators" are facts and statistics that help to describe a public education system. They are tools which are useful in examining and measuring the effectiveness of the system. Examples include information such as the amount of local funds raised to support local schools, per pupil expenditures, pupil-teacher ratios, and…
The Condition of K-12 Public Education in Maine: 2007
ERIC Educational Resources Information Center
Gravelle, Paula B.; Silvernail, David L.
2007-01-01
"Education Indicators" are facts and statistics that help to describe a public education system. They are tools which are useful in examining and measuring the effectiveness of the system. Examples include information such as the amount of local funds raised to support local schools, per pupil expenditures, pupil-teacher ratios, and…
Report of the 64th National Conference on Weights and Measures
NASA Astrophysics Data System (ADS)
Wollin, H. F.; Babeoq, L. E.; Heffernan, A. P.
1980-03-01
Major issues discussed at this conference include metric conversion in the United States, particularly the conversion of gasoline dispensers; problems relating to the quantity fill of packaged commodities, especially as affected by moisture loss; and a statistical approach to package checking. Federal grain inspection and a legal metrology control system are also discussed.
Kiss High Blood Pressure Goodbye: The Relationship between Dark Chocolate and Hypertension
ERIC Educational Resources Information Center
Nordmoe, Eric D.
2008-01-01
This article reports on a delicious finding from a recent study claiming a causal link between dark chocolate consumption and blood pressure reductions. In the article, I provide ideas for using this study to whet student appetites for a discussion of statistical ideas, including experimental design, measurement error and inference methods.
Awareness of Biotechnological Application: A Study among University Geography Students
ERIC Educational Resources Information Center
Ozel, Ali; Terzi, Irfan; Ozel, Emine
2009-01-01
The aim of this study is to measure differences in awareness of biotechnological applications among university geography students. To that end, an awareness scale was developed by the researchers. The survey included 102 students from six different universities at various academic levels. The findings of the survey were evaluated both descriptively and statistically.…
Novice Interpretations of Progress Monitoring Graphs: Extreme Values and Graphical Aids
ERIC Educational Resources Information Center
Newell, Kirsten W.; Christ, Theodore J.
2017-01-01
Curriculum-Based Measurement of Reading (CBM-R) is frequently used to monitor instructional effects and evaluate response to instruction. Educators often view the data graphically on a time-series graph that might include a variety of statistical and visual aids, which are intended to facilitate the interpretation. This study evaluated the effects…
Diagnosis of Cognitive Errors by Statistical Pattern Recognition Methods.
ERIC Educational Resources Information Center
Tatsuoka, Kikumi K.; Tatsuoka, Maurice M.
The rule space model permits measurement of cognitive skill acquisition, diagnosis of cognitive errors, and detection of the strengths and weaknesses of knowledge possessed by individuals. Two ways to classify an individual into his or her most plausible latent state of knowledge include: (1) hypothesis testing--Bayes' decision rules for minimum…
Psycho-Motor Needs Assessment of Virginia School Children.
ERIC Educational Resources Information Center
Glen Haven Achievement Center, Fort Collins, CO.
An effort to assess psycho-motor (P-M) needs among Virginia children in K-4 and in special primary classes for the educable mentally retarded is presented. Included are methods for selecting, combining, and developing evaluation measures, which are verified statistically by analyses of data collected from a stratified sample of approximately 4,500…
Computation of the Molenaar Sijtsma Statistic
NASA Astrophysics Data System (ADS)
Andries van der Ark, L.
The Molenaar Sijtsma statistic is an estimate of the reliability of a test score. In some special cases, computation of the Molenaar Sijtsma statistic requires provisional measures. These provisional measures have not been fully described in the literature, and we show that they have not been implemented in the software. We describe the required provisional measures so as to allow computation of the Molenaar Sijtsma statistic for all data sets.
Ground-based lidar measurements from Ny-Ålesund during ASTAR 2007: a statistical overview
NASA Astrophysics Data System (ADS)
Hoffmann, A.; Ritter, C.; Stock, M.; Shiobara, M.; Lampert, A.; Maturilli, M.; Orgis, T.; Neuber, R.; Herber, A.
2009-07-01
During the Arctic Study of Tropospheric Aerosol, Clouds and Radiation (ASTAR) in March and April 2007, measurements obtained at the AWIPEV Research station in Ny-Ålesund, Spitsbergen (operated by the Alfred-Wegener-Institute for Polar and Marine Research and the Institut polaire français Paul-Emile Victor), supported the airborne campaign. This included lidar data from the Koldewey Aerosol Raman Lidar (KARL) and the Micro Pulse Lidar (MPL), located in the atmospheric observatory, as well as photometer data and the daily launched radiosonde. The MPL provides nearly continuous measurements; the KARL was switched on whenever weather conditions allowed observations (145 h in 61 days). From 1 March to 30 April, 71 meteorological balloon soundings were performed and compared with the corresponding MPL measurements; photometer measurements are available from 18 March. For the KARL data, a statistical overview based on the optical properties backscatter ratio and volume depolarization is given. The altitudes at which the named features occur (subvisible and visible ice, water and mixed-phase clouds, and aerosol layers), as well as their dependence on different air-mass origins, are analyzed. Although spring 2007 was characterized by rather clean conditions, diverse case studies of cloud and aerosol occurrence during March and April 2007 are presented in more detail, including temporal development and main optical properties such as backscatter, depolarization and extinction coefficients. Links between air-mass origins and optical properties can be presumed but need further evidence.
Westenburg, C.L.; La Camera, R. J.
1996-01-01
The U.S. Geological Survey, in support of the U.S. Department of Energy, Yucca Mountain Site Characterization Project, collects, compiles, and summarizes hydrologic data in the Yucca Mountain region. The data are collected to allow assessments of ground-water resources during studies to determine the potential suitability of Yucca Mountain for storing high-level nuclear waste. Data on ground-water levels at 36 sites, ground-water discharge at 6 sites, and ground-water withdrawals within Crater Flat, Jackass Flats, Mercury Valley, and the Amargosa Desert are presented for calendar year 1994. Data collected prior to 1994 are graphically presented and data collected by other agencies (or as part of other programs) are included to further indicate variations of ground-water levels, discharges, and withdrawals through time. A statistical summary of ground-water levels at seven wells in Jackass Flats is presented. The statistical summary includes the number of measurements, the maximum, minimum, and median water-level altitudes, and the average deviation of measured water-level altitudes for selected baseline periods and for calendar years 1992-94.
NASA Astrophysics Data System (ADS)
Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.
2018-01-01
This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.
Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias
2017-01-31
Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. Time-series measurements are susceptible to statistical change due to process variation. The aim was to evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process, according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting point in understanding the rehabilitation process using a real-time measurement approach.
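A minimal sketch of the kind of individuals control chart statistical process control relies on, assuming the common moving-range estimate of sigma (MR-bar/1.128); the PASAT-like scores below are illustrative, not patient data.

```python
import numpy as np

def spc_limits(baseline):
    """Individuals-chart centre line and 3-sigma limits, with sigma
    estimated from the mean moving range (MR-bar / 1.128)."""
    b = np.asarray(baseline, float)
    sigma = np.mean(np.abs(np.diff(b))) / 1.128
    return b.mean(), b.mean() - 3 * sigma, b.mean() + 3 * sigma

# Illustrative weekly scores: 6 baseline points, then treatment phase.
baseline = [31, 33, 30, 32, 34, 31]
follow_up = [35, 38, 41, 45, 47, 50]

centre, lcl, ucl = spc_limits(baseline)
signals = [score > ucl for score in follow_up]  # points signalling change
print(f"centre {centre:.1f}, limits ({lcl:.1f}, {ucl:.1f}), signals: {signals}")
```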
Instrument comparison for Aerosolized Titanium Dioxide
NASA Astrophysics Data System (ADS)
Ranpara, Anand
Recent toxicological studies have shown that the surface area of ultrafine particles (UFP; i.e., particles with diameters less than 0.1 micrometer) has a stronger correlation with adverse health effects than does the mass of these particles. Ultrafine titanium dioxide (TiO2) particles are widely used in industry, and their use is associated with adverse health outcomes, such as microvascular dysfunction and pulmonary damage. The primary aim of this experimental study was to compare a variety of laboratory and industrial hygiene (IH) field study instruments, all measuring the same aerosolized TiO2. The study also examined intra-instrument variability between measurements made by two apparently identical devices of the same type of instrument placed side by side. The types of instruments studied were (1) DustTrak(TM) DRX, (2) Personal DataRAM(TM) (PDR), (3) GRIMM, (4) diffusion charger (DC), and (5) scanning mobility particle sizer (SMPS). Two devices of each of the four IH field study instrument types were used to measure six levels of mass concentration of fine and ultrafine TiO2 aerosols in controlled chamber tests. Metrics evaluated included real-time mass, active surface area, and number/geometric surface area distributions, as well as off-line gravimetric mass and morphology on filters. DustTrak(TM) DRXs and PDRs were used for mass concentration measurements. DCs were used for active surface area concentration measurements. GRIMMs were used for number concentration measurements. The SMPS was used for inter-instrument comparisons of surface area and number concentrations. The results indicated that the two apparently identical devices of each of the DRX and PDR were not statistically different from each other across all trials with both sizes of powder (p < 5%). The mean difference between mass concentrations measured by the two DustTrak DRX devices was smaller than that measured by the two PDR devices. DustTrak DRX measurements were closer to the reference method, gravimetric mass concentration, than the PDRs. The two apparently identical DC devices were statistically different from each other for fine particles but not for UFP. The DC devices and the SMPS were statistically different from each other for both particle sizes. The two apparently identical GRIMM devices were statistically different from each other for fine particles. Results of the GRIMM devices were statistically different from the SMPS for UFP but not for fine particles. These observations suggest that inter-device (within-instrument) and inter-instrument agreement depends on particle size and instrument characteristics when measuring nanoparticles at different concentration levels.
Reverberant acoustic energy in auditoria that comprise systems of coupled rooms
NASA Astrophysics Data System (ADS)
Summers, Jason E.
2003-11-01
A frequency-dependent model for reverberant energy in coupled rooms is developed and compared with measurements for a 1:10 scale model and for Bass Hall, Ft. Worth, TX. At high frequencies, prior statistical-acoustics models are improved by geometrical-acoustics corrections for decay within sub-rooms and for energy transfer between sub-rooms. Comparisons of computational geometrical-acoustics predictions based on beam-axis tracing with scale model measurements indicate errors resulting from tail-correction assuming constant quadratic growth of reflection density. Using ray tracing in the late part corrects this error. For mid-frequencies, the models are modified to account for wave effects at coupling apertures by including power transmission coefficients. Similarly, statistical-acoustics models are improved through more accurate estimates of power transmission. Scale model measurements are in accord with the predicted behavior. The edge-diffraction model is adapted to study transmission through apertures. Multiple-order scattering is theoretically and experimentally shown to be inaccurate due to neglect of slope diffraction. At low frequencies, perturbation models qualitatively explain scale model measurements. Measurements confirm the relation of coupling strength to the unperturbed pressure distribution on coupling surfaces. Measurements in Bass Hall exhibit effects of the coupled stage house. High-frequency predictions of the statistical-acoustics and geometrical-acoustics models and predictions for coupling apertures all agree with measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Power, S.; Mirza, M.; Thakorlal, A.
Purpose: This prospective pilot study was undertaken to evaluate the feasibility and effectiveness of using a radiation-absorbing shield to reduce operator dose from scatter during lower limb endovascular procedures. Materials and Methods: A commercially available bismuth shield system (RADPAD) was used. Sixty consecutive patients undergoing lower limb angioplasty were included. Thirty procedures were performed without the RADPAD (control group) and thirty with the RADPAD (study group). Two separate methods were used to measure dose to a single operator. Thermoluminescent dosimeter (TLD) badges were used to measure hand, eye, and unshielded body dose. A direct dosimeter with digital readout was also used to measure eye and unshielded body dose. To allow for variation between control and study groups, dose per unit time was calculated. Results: TLD results demonstrated a significant reduction in median body dose per unit time for the study group compared with controls (p = 0.001), corresponding to a mean dose reduction rate of 65%. Median eye and hand dose per unit time were also reduced in the study group compared with the control group; however, this was not statistically significant (p = 0.081 for eye, p = 0.628 for hand). Direct dosimeter readings also showed a statistically significant reduction in median unshielded body dose rate for the study group compared with controls (p = 0.037). Eye dose rate was reduced for the study group but this was not statistically significant (p = 0.142). Conclusion: Initial results are encouraging. Use of the shield resulted in a statistically significant reduction in unshielded dose to the operator’s body. Measured doses to the eye and hand of the operator were also reduced but did not reach statistical significance in this pilot study.
Sample Skewness as a Statistical Measurement of Neuronal Tuning Sharpness
Samonds, Jason M.; Potetz, Brian R.; Lee, Tai Sing
2014-01-01
We propose using the statistical measurement of the sample skewness of the distribution of mean firing rates of a tuning curve to quantify sharpness of tuning. For some features, like binocular disparity, tuning curves are best described by relatively complex and sometimes diverse functions, making it difficult to quantify sharpness with a single function and parameter. Skewness provides a robust nonparametric measure of tuning curve sharpness that is invariant with respect to the mean and variance of the tuning curve and is straightforward to apply to a wide range of tuning, including simple orientation tuning curves and complex object tuning curves that often cannot even be described parametrically. Because skewness does not depend on a specific model or function of tuning, it is especially appealing in cases of sharpening where recurrent interactions among neurons produce sharper tuning curves that deviate in a complex manner from the feedforward function of tuning. Since tuning curves for all neurons are not typically well described by a single parametric function, this model independence additionally allows skewness to be applied to all recorded neurons, maximizing the statistical power of a set of data. We also compare skewness with other nonparametric measures of tuning curve sharpness and selectivity. Compared to the other nonparametric measures tested, skewness is best suited to capturing the sharpness of multimodal tuning curves defined by narrow peaks (maxima) and broad valleys (minima). Finally, we provide a more formal definition of sharpness using a shape-based information gain measure and show that skewness is correlated with this definition. PMID:24555451
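A minimal sketch of the proposed measure, computing the sample skewness of mean firing rates across stimulus conditions with SciPy; the two tuning curves below are illustrative, not recorded data.

```python
import numpy as np
from scipy.stats import skew

# Mean firing rates (spikes/s) across stimulus conditions (illustrative).
sharp = np.array([2, 3, 2, 4, 30, 4, 2, 3, 2], float)   # narrow peak, broad valley
broad = np.array([8, 10, 12, 14, 15, 14, 12, 10, 8], float)

# Higher sample skewness -> sharper tuning, independent of mean and variance.
print(skew(sharp), skew(broad))
```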
Marcus, Dawn A; Bernstein, Cheryl D; Haq, Adeel; Breuer, Paula
2014-06-01
Fibromyalgia is associated with substantial functional disability. Current drug and non-drug treatments result in statistically significant but numerically small improvements in typical numeric measures of pain severity and fibromyalgia impact. The aim of the present study was to evaluate additional measures of pain severity and functional outcome that might be affected by fibromyalgia treatment. This retrospective review evaluated outcomes from 274 adults with fibromyalgia who participated in a six-week, multidisciplinary treatment programme. Pain and function were evaluated on the first and final treatment visit. Pain was evaluated using an 11-point numerical scale to determine clinically meaningful pain reduction (decrease ≥ 2 points) and from a pain drawing. Function was evaluated by measuring active range of motion (ROM), walking distance and speed, upper extremity exercise repetitions, and self-reports of daily activities. Numerical rating scores for pain decreased by 10-13% (p < 0.01) and Fibromyalgia Impact Questionnaire (FIQ) scores decreased by 20% (p < 0.001). More substantial improvements were noted when using alternative measures. Clinically meaningful pain relief was achieved by 37% of patients, and the body area affected by pain decreased by 31%. ROM showed significant improvements in straight leg raise and cervical motion, without improvements in lumbar ROM. Daily walking distance increased fourfold and arm exercise repetitions doubled. Despite modest albeit statistically significant improvements in standard measures of pain severity and the FIQ, more substantial pain improvement was noted when utilizing alternative measures of pain and functional improvement. Alternative symptom assessment measures might be important outcome measures to include in drug and non-drug studies to better understand fibromyalgia treatment effectiveness. © 2013 John Wiley & Sons, Ltd.
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
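For readers unfamiliar with the two pooling models named in the recommendations, the following Python sketch shows inverse-variance fixed-effect pooling and DerSimonian-Laird random-effects pooling; this is the generic textbook calculation, not code from the statement, and the effect sizes are invented:

    import numpy as np

    effects = np.array([0.30, 0.45, 0.12, 0.50, 0.28])  # per-study effect sizes
    se = np.array([0.10, 0.15, 0.12, 0.20, 0.09])       # per-study standard errors

    w = 1 / se**2                                       # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)

    q = np.sum(w * (effects - fixed) ** 2)              # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance

    w_re = 1 / (se**2 + tau2)                           # random-effects weights
    random_eff = np.sum(w_re * effects) / np.sum(w_re)
    print(fixed, random_eff, tau2)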
Favre-Averaged Turbulence Statistics in Variable Density Mixing of Buoyant Jets
NASA Astrophysics Data System (ADS)
Charonko, John; Prestridge, Kathy
2014-11-01
Variable density mixing of a heavy fluid jet with lower density ambient fluid in a subsonic wind tunnel was experimentally studied using Particle Image Velocimetry and Planar Laser Induced Fluorescence to simultaneously measure velocity and density. Flows involving the mixing of fluids with large density ratios are important in a range of physical problems including atmospheric and oceanic flows, industrial processes, and inertial confinement fusion. Here we focus on buoyant jets with coflow. Results from two different Atwood numbers, 0.1 (Boussinesq limit) and 0.6 (non-Boussinesq case), reveal that buoyancy is important for most of the turbulent quantities measured. Statistical characteristics of the mixing that are important for modeling these flows, such as the PDFs of density and density gradients, turbulent kinetic energy, Favre-averaged Reynolds stress, turbulent mass flux velocity, density-specific volume correlation, and density power spectra, were also examined and compared with previous direct numerical simulations. Additionally, a method for directly estimating Reynolds-averaged velocity statistics on a per-pixel basis is extended to Favre averages, yielding improved accuracy and spatial resolution as compared to traditional post-processing of velocity and density fields.
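A minimal Python sketch of the Favre (density-weighted) averaging underlying the statistics listed above; the density and velocity samples are synthetic, not the experiment's data:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000                       # samples at one measurement point
    rho = 1.0 + 0.4 * rng.random(n)   # instantaneous density
    u = rng.normal(5.0, 1.0, n) + 0.5 * (rho - rho.mean())  # correlated velocity

    rho_bar = rho.mean()                     # Reynolds average of density
    u_tilde = (rho * u).mean() / rho_bar     # Favre average: <rho u> / <rho>
    u_dd = u - u_tilde                       # Favre fluctuation u''
    favre_stress = (rho * u_dd * u_dd).mean() / rho_bar  # Favre-averaged stress

    print(u.mean(), u_tilde, favre_stress)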
Inference of median difference based on the Box-Cox model in randomized clinical trials.
Maruo, K; Isogawa, N; Gosho, M
2015-05-10
In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
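The following Python sketch illustrates the general idea, not the authors' proposed method or their analysis packages: skewed data are compared on the Box-Cox-transformed scale, and a median difference is recovered on the original scale by back-transformation (all data simulated):

    import numpy as np
    from scipy import stats, special

    rng = np.random.default_rng(1)
    a = rng.lognormal(mean=1.0, sigma=0.5, size=120)  # treatment group (skewed)
    b = rng.lognormal(mean=1.2, sigma=0.5, size=120)  # control group (skewed)

    pooled = np.concatenate([a, b])
    _, lam = stats.boxcox(pooled)                     # common transformation
    ta, tb = special.boxcox(a, lam), special.boxcox(b, lam)

    t_stat, p = stats.ttest_ind(ta, tb)               # test on transformed scale

    # Back-transform the group means of the transformed data; because the
    # transformation is monotone, these estimate the group medians.
    med_a = special.inv_boxcox(ta.mean(), lam)
    med_b = special.inv_boxcox(tb.mean(), lam)
    print(p, med_a - med_b)                 # median difference, original scale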
Improving the Validity of Activity of Daily Living Dependency Risk Assessment
Clark, Daniel O.; Stump, Timothy E.; Tu, Wanzhu; Miller, Douglas K.
2015-01-01
Objectives: Efforts to prevent activity of daily living (ADL) dependency may be improved through models that assess older adults' dependency risk. We evaluated whether cognition and gait speed measures improve the predictive validity of interview-based models. Method: Participants were 8,095 self-respondents in the 2006 Health and Retirement Survey who were aged 65 years or over and independent in five ADLs. Incident ADL dependency was determined from the 2008 interview. Models were developed using random 2/3rd cohorts and validated in the remaining 1/3rd. Results: Compared to a c-statistic of 0.79 in the best interview model, the model including cognitive measures had c-statistics of 0.82 and 0.80, while the best fitting gait speed model had c-statistics of 0.83 and 0.79 in the development and validation cohorts, respectively. Conclusion: Two relatively brief models, one that requires an in-person assessment and one that does not, had excellent validity for predicting incident ADL dependency but did not significantly improve the predictive validity of the best fitting interview-based models. PMID:24652867
Multiplicative processes in visual cognition
NASA Astrophysics Data System (ADS)
Credidio, H. F.; Teixeira, E. N.; Reis, S. D. S.; Moreira, A. A.; Andrade, J. S.
2014-03-01
The Central Limit Theorem (CLT) is certainly one of the most important results in the field of statistics. The simple fact that the addition of many random variables can generate the same probability curve, elucidated the underlying process for a broad spectrum of natural systems, ranging from the statistical distribution of human heights to the distribution of measurement errors, to mention a few. An extension of the CLT can be applied to multiplicative processes, where a given measure is the result of the product of many random variables. The statistical signature of these processes is rather ubiquitous, appearing in a diverse range of natural phenomena, including the distributions of incomes, body weights, rainfall, and fragment sizes in a rock crushing process. Here we corroborate results from previous studies which indicate the presence of multiplicative processes in a particular type of visual cognition task, namely, the visual search for hidden objects. Precisely, our results from eye-tracking experiments show that the distribution of fixation times during visual search obeys a log-normal pattern, while the fixational radii of gyration follow a power-law behavior.
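A small simulation in Python makes the multiplicative-CLT point concrete: products of many positive random factors are approximately log-normal, so their logarithms typically pass a normality test while the raw products do not (the factor distribution here is arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    factors = rng.uniform(0.5, 1.5, size=(2000, 200))  # 200 random factors each
    products = factors.prod(axis=1)                    # multiplicative process

    log_p = np.log(products)
    print(stats.normaltest(log_p).pvalue)    # typically large: log is ~normal
    print(stats.normaltest(products).pvalue) # tiny: raw products are skewed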
NASA Astrophysics Data System (ADS)
Thomas, E. G.; Shepherd, S. G.
2017-12-01
Global patterns of ionospheric convection have been widely studied in terms of the interplanetary magnetic field (IMF) magnitude and orientation in both the Northern and Southern Hemispheres using observations from the Super Dual Auroral Radar Network (SuperDARN). The dynamic range of driving conditions under which existing SuperDARN statistical models are valid is currently limited to periods when the high-latitude convection pattern remains above about 60° geomagnetic latitude. Cousins and Shepherd [2010] found this to correspond to intervals when the solar wind electric field Esw < 4.1 mV/m and IMF Bz is negative. Conversely, under northward IMF conditions (Bz > 0) the high-latitude radars often experience difficulties in measuring convection above about 85° geomagnetic latitude. In this presentation, we introduce a new statistical model of ionospheric convection which is valid for much more dominant IMF Bz conditions than was previously possible by including velocity measurements from the newly constructed tiers of radars in the Northern Hemisphere at midlatitudes and in the polar cap. This new model (TS17) is compared to previous statistical models derived from high-latitude SuperDARN observations (RG96, PSR10, CS10) and its impact on instantaneous Map Potential solutions is examined.
THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.
Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil
2016-10-01
In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors of bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principle thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.
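A minimal Python sketch of the kind of second-order GLCM statistic studied above, computing Cluster Shade for a synthetic image; it assumes scikit-image (version 0.19 or later, where the function is named graycomatrix) and the standard Cluster-Shade definition, and is not the authors' pipeline:

    import numpy as np
    from skimage.feature import graycomatrix

    rng = np.random.default_rng(9)
    img = rng.integers(0, 8, size=(64, 64), dtype=np.uint8)  # 8 gray levels

    # Normalized, symmetric co-occurrence matrix for distance 1, angle 0
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=8,
                        symmetric=True, normed=True)[:, :, 0, 0]

    i, j = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
    mu_i = np.sum(i * glcm)
    mu_j = np.sum(j * glcm)
    cluster_shade = np.sum(((i + j - mu_i - mu_j) ** 3) * glcm)
    print(cluster_shade)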
Porter, Anna K; Wen, Fang; Herring, Amy H; Rodríguez, Daniel A; Messer, Lynne C; Laraia, Barbara A; Evenson, Kelly R
2018-06-01
Reliable and stable environmental audit instruments are needed to successfully identify the physical and social attributes that may influence physical activity. This study described the reliability and stability of the PIN3 environmental audit instrument in both urban and rural neighborhoods. Four randomly sampled road segments in and around a one-quarter mile buffer of participants' residences from the Pregnancy, Infection, and Nutrition (PIN3) study were rated twice, approximately 2 weeks apart. One year later, 253 of the year 1 sampled roads were re-audited. The instrument included 43 measures that resulted in 73 item scores for calculation of percent overall agreement, kappa statistics, and log-linear models. For same-day reliability, 81% of items had moderate to outstanding kappa statistics (kappas ≥ 0.4). Two-week reliability was slightly lower, with 77% of items having moderate to outstanding agreement using kappa statistics. One-year stability had 68% of items showing moderate to outstanding agreement using kappa statistics. The reliability of the audit measures was largely consistent when comparing urban to rural locations, with only 8% of items exhibiting significant differences (α < 0.05) by urbanicity. The PIN3 instrument is a reliable and stable audit tool for studies assessing neighborhood attributes in urban and rural environments.
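A minimal Python sketch of the agreement statistics used above (percent agreement and Cohen's kappa) for one audit item rated twice; the ratings are invented:

    import numpy as np

    def kappa(r1, r2):
        # Percent agreement and Cohen's kappa for two sets of ratings
        r1, r2 = np.asarray(r1), np.asarray(r2)
        cats = np.union1d(r1, r2)
        p_o = np.mean(r1 == r2)                              # observed agreement
        p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)  # chance
        return p_o, (p_o - p_e) / (1 - p_e)

    rater_visit1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    rater_visit2 = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
    agree, k = kappa(rater_visit1, rater_visit2)
    print(agree, k)   # kappa >= 0.4 was read as moderate or better agreement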
NASA Astrophysics Data System (ADS)
Zhang, Biyao; Liu, Xiangnan; Liu, Meiling; Wang, Dongmin
2017-04-01
This paper addresses the assessment and interpretation of the canopy-air temperature difference (Tc-Ta) distribution as an indicator for discriminating between heavy metal stress levels. The Tc-Ta distribution is simulated by coupling the energy balance equation with a modified leaf angle distribution. Statistical indices, including the average value (AVG), standard deviation (SD), median, and span of Tc-Ta in the field of view of a digital thermal imager, are calculated to describe the Tc-Ta distribution quantitatively and thus serve as stress indicators. In the application, two rice-growing sites under "mild" and "severe" stress levels were selected as study areas. A total of 96 thermal images obtained from field measurements across three growth stages were used to test the theoretical variation of the Tc-Ta distribution. The results demonstrated that the statistical indices calculated from both simulated and measured data exhibited an upward trend as the stress level became more serious, because heavy metal stress raises the temperature of only a portion of the leaves in the canopy. Meteorological factors, with the exception of wind speed, barely affected the sensitivity of the statistical indices. Among the statistical indices, AVG and SD were demonstrated to be the better indicators for discriminating stress levels.
TDRS orbit determination by radio interferometry
NASA Technical Reports Server (NTRS)
Pavloff, Michael S.
1994-01-01
In support of a NASA study on the application of radio interferometry to satellite orbit determination, MITRE developed a simulation tool for assessing interferometry tracking accuracy. The Orbit Determination Accuracy Estimator (ODAE) models the general batch maximum likelihood orbit determination algorithms of the Goddard Trajectory Determination System (GTDS) with the group and phase delay measurements from radio interferometry. ODAE models the statistical properties of tracking error sources, including inherent observable imprecision, atmospheric delays, clock offsets, station location uncertainty, and measurement biases, and, through Monte Carlo simulation, calculates the statistical properties of errors in the predicted satellite state vector. This paper presents results from ODAE application to orbit determination of the Tracking and Data Relay Satellite (TDRS) by radio interferometry. Conclusions about optimal ground station locations for interferometric tracking of TDRS are presented, along with a discussion of operational advantages of radio interferometry.
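The following schematic Python sketch shows the Monte Carlo pattern described above in miniature: perturb a measurement model with several error sources and collect statistics of the resulting estimator error. It is a generic illustration under invented numbers, not ODAE code:

    import numpy as np

    rng = np.random.default_rng(3)

    def estimate_state(measurements):
        # Stand-in for a batch least-squares orbit solution
        return measurements.mean()

    truth = 10.0
    errors = []
    for _ in range(5000):
        noise = rng.normal(0.0, 0.05, size=20)   # observable imprecision
        bias = rng.normal(0.0, 0.02)             # per-run measurement bias
        clock = rng.normal(0.0, 0.01)            # clock offset
        meas = truth + noise + bias + clock
        errors.append(estimate_state(meas) - truth)

    errors = np.asarray(errors)
    print(errors.mean(), errors.std())   # bias and spread of the estimator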
ERIC Educational Resources Information Center
Merrill, Ray M.; Chatterley, Amanda; Shields, Eric C.
2005-01-01
This study explored the effectiveness of selected statistical measures at motivating or maintaining regular exercise among college students. The study also considered whether ease in understanding these statistical measures was associated with perceived effectiveness at motivating or maintaining regular exercise. Analyses were based on a…
Al-Badriyeh, Daoud; Alameri, Marwah; Al-Okka, Randa
2017-01-01
Objective: To perform a first-time analysis of the cost-effectiveness (CE) literature on chemotherapies, of all types, in cancer, in terms of trends and change over time, including the influence of industry funding. Design: Systematic review. Setting: A wide range of cancer-related research settings within healthcare, including health systems, hospitals and medical centres. Participants: All comparative CE research of drug-based cancer therapies published in the period 1986 to 2015. Primary and secondary outcome measures: Primary outcomes are the literature trends in relation to journal subject category, authorship, research design, data sources, funds and consultation involvement. An additional outcome measure is the association between industry funding and study outcomes. Analysis: Descriptive statistics and the χ2, Fisher exact or Somers' D tests were used for non-parametric comparisons, with a p value of <0.05 as the statistical significance threshold. Results: A total of 574 publications were analysed. The drug-related CE literature expands over time, with increased publishing in the healthcare sciences and services journal subject category (p<0.001). Retrospective data collection in studies increased over time (p<0.001). The usage of prospective data, however, has been decreasing (p<0.001) in relation to randomised clinical trials (RCTs), but is unchanging for non-RCT studies. Industry-sponsored CE studies have especially been increasing (p<0.001), in contrast to those sponsored by other sources. While paid consultation involvement grew throughout the years, the declaration of funding for this is relatively limited. Importantly, there is evidence that industry funding is associated with results favourable to the sponsor (p<0.001). Conclusions: This analysis demonstrates clear trends in how the CE cancer research is presented to the practicing community, including in relation to journals, study designs, authorship and consultation, together with increased financial sponsorship by pharmaceutical industries, which may influence study outcomes more than other funding sources. PMID:28131999
The statistical distribution of aerosol properties in southern West Africa
NASA Astrophysics Data System (ADS)
Haslett, Sophie; Taylor, Jonathan; Flynn, Michael; Bower, Keith; Dorsey, James; Crawford, Ian; Brito, Joel; Denjean, Cyrielle; Bourrianne, Thierry; Burnet, Frederic; Batenburg, Anneke; Schulz, Christiane; Schneider, Johannes; Borrmann, Stephan; Sauer, Daniel; Duplissy, Jonathan; Lee, James; Vaughan, Adam; Coe, Hugh
2017-04-01
The population and economy in southern West Africa have been growing at an exceptional rate in recent years and this trend is expected to continue, with the population projected to more than double to 800 million by 2050. This will result in a dramatic increase in anthropogenic pollutants, already estimated to have tripled between 1950 and 2000 (Lamarque et al., 2010). It is known that aerosols can modify the radiative properties of clouds. As such, the entrainment of anthropogenic aerosol into the large banks of clouds forming during the onset of the West African Monsoon could have a substantial impact on the region's response to climate change. Such projections, however, are greatly limited by the scarcity of observations in this part of the world. As part of the Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa (DACCIWA) project, three research aircraft were deployed, each carrying equipment capable of measuring aerosol properties in-situ. Instrumentation included Aerosol Mass Spectrometers (AMS), Single Particle Soot Photometers (SP2), Condensation Particle Counters (CPC) and Scanning Mobility Particle Sizers (SMPS). Throughout the intensive aircraft campaign, 155 hours of scientific flights covered an area including large parts of Benin, Togo, Ghana and parts of Côte D'Ivoire. Approximately 70 hours were dedicated to the measurement of cloud-aerosol interactions, with many other flights producing data contributing towards this objective. Using datasets collected during this campaign period, it is possible to build a robust statistical understanding of aerosol properties in this region for the first time, including size distributions and optical and chemical properties. Here, we describe preliminary results from aerosol measurements on board the three aircraft. These have been used to describe aerosol properties throughout the region and time period encompassed by the DACCIWA aircraft campaign. Such statistics will be invaluable for improving future projections of cloud properties and radiative effects in the region.
2011-01-01
Background: Despite more than a decade of research on hospitalists and their performance, disagreement still exists regarding whether and how hospital-based physicians improve the quality of inpatient care delivery. This systematic review summarizes the findings from 65 comparative evaluations to determine whether hospitalists provide a higher quality of inpatient care relative to traditional inpatient physicians who maintain hospital privileges with concurrent outpatient practices. Methods: Articles on hospitalist performance published between January 1996 and December 2010 were identified through MEDLINE, Embase, Science Citation Index, CINAHL, NHS Economic Evaluation Database and a hand-search of reference lists, key journals and editorials. Comparative evaluations presenting original, quantitative data on processes, efficiency or clinical outcome measures of care between hospitalists, community-based physicians and traditional academic attending physicians were included (n = 65). After proposing a conceptual framework for evaluating inpatient physician performance, major findings on quality are summarized according to their percentage change, direction and statistical significance. Results: The majority of reviewed articles demonstrated that hospitalists are efficient providers of inpatient care on the basis of reductions in their patients' average length of stay (69%) and total hospital costs (70%); however, the clinical quality of hospitalist care appears to be comparable to that provided by their colleagues. The methodological quality of hospitalist evaluations remains a concern and has not improved over time. Persistent issues include insufficient reporting of source or sample populations (n = 30), patients lost to follow-up (n = 42) and estimates of effect or random variability (n = 35); inappropriate use of statistical tests (n = 55); and failure to adjust for established confounders (n = 37). Conclusions: Future research should include an expanded focus on the specific structures of care that differentiate hospitalists from other inpatient physician groups as well as the development of better conceptual and statistical models that identify and measure underlying mechanisms driving provider-outcome associations in quality. PMID:21592322
Comparison of measurement- and proxy-based Vs30 values in California
Yong, Alan K.
2016-01-01
This study was prompted by the recent availability of a significant amount of openly accessible measured VS30 values and the desire to investigate the trend of using proxy-based models to predict VS30 in the absence of measurements. Comparisons between measured and model-based values were performed. The measured data included 503 VS30 values collected from various projects for 482 seismographic station sites in California. Six proxy-based models—employing geologic mapping, topographic slope, and terrain classification—were also considered. Included was a new terrain class model based on the Yong et al. (2012) approach but recalibrated with updated measured VS30 values. Using the measured VS30 data as the metric for performance, the predictive capabilities of the six models were determined to be statistically indistinguishable. This study also found three models that tend to underpredict VS30 at lower velocities (NEHRP Site Classes D–E) and overpredict at higher velocities (Site Classes B–C).
Performance index for virtual reality phacoemulsification surgery
NASA Astrophysics Data System (ADS)
Söderberg, Per; Laurell, Carl-Gustaf; Simawi, Wamidh; Skarman, Eva; Nordqvist, Per; Nordh, Leif
2007-02-01
We have developed a virtual reality (VR) simulator for phacoemulsification (phaco) surgery. The current work aimed at developing a performance index that characterizes the performance of an individual trainee. We recorded measurements of 28 response variables during three iterated surgical sessions in 9 subjects naive to cataract surgery and 6 experienced cataract surgeons, separately for the sculpting phase and the evacuation phase of phacoemulsification surgery. We further defined a specific performance index for a specific measurement variable and a total performance index for a specific trainee. The total performance index was relatively evenly distributed for both the sculpting and the evacuation phases, indicating that parametric statistics can be used for future comparisons of total average performance indices between groups. The current total performance index for an individual weights all included measurement variables equally. It is possible that a future development of the system will indicate that a trainee can be better characterized if the various measurement variables are given specific weights. The total performance index developed here for a trainee is, statistically, an independent observation of that particular trainee.
Water Masses in the Eastern Mediterranean Sea: An Analysis of Measured Isotopic Oxygen
NASA Astrophysics Data System (ADS)
de Ruggiero, Paola; Zanchettin, Davide; Bensi, Manuel; Hainbucher, Dagmar; Stenni, Barbara; Pierini, Stefano; Rubino, Angelo
2018-04-01
We investigate aspects of the water mass structure of the Adriatic and Ionian basins (Eastern Mediterranean Sea) and their interdecadal variability through statistical analyses focused on δ18O measurements carried out in 1985, 1990, and 2011. In particular, the more recent δ18O measurements extend throughout the entire water column and constitute, to the best of our knowledge, the largest synoptic dataset encompassing different sub-basins of the Mediterranean Sea. We study the statistical linkages between temperature, salinity, dissolved oxygen and δ18O. We find that δ18O is largely independent from the other parameters, and it can be used to trace major water masses that are typically found in the basins, including the Adriatic Dense Water, the Levantine Intermediate Water, and the Cretan Intermediate and Dense Waters. Finally, we explore the possibility of using δ18O concentration as a proxy for dominant modes of large-scale oceanic variability in the Mediterranean Sea.
First high-statistics and high-resolution recoil-ion data from the WITCH retardation spectrometer
NASA Astrophysics Data System (ADS)
Finlay, P.; Breitenfeldt, M.; Porobić, T.; Wursten, E.; Ban, G.; Beck, M.; Couratin, C.; Fabian, X.; Fléchard, X.; Friedag, P.; Glück, F.; Herlert, A.; Knecht, A.; Kozlov, V. Y.; Liénard, E.; Soti, G.; Tandecki, M.; Traykov, E.; Van Gorp, S.; Weinheimer, Ch.; Zákoucký, D.; Severijns, N.
2016-07-01
The first high-statistics and high-resolution data set for the integrated recoil-ion energy spectrum following the β+ decay of 35Ar has been collected with the WITCH retardation spectrometer located at CERN-ISOLDE. Over 25 million recoil-ion events were recorded on a large-area multichannel plate (MCP) detector with a time-stamp precision of 2 ns and a position resolution of 0.1 mm due to the newly upgraded data acquisition based on the LPC Caen FASTER protocol. The number of recoil ions was measured for more than 15 different settings of the retardation potential, complemented by dedicated background and half-life measurements. Previously unidentified systematic effects, including an energy-dependent efficiency of the main MCP and a radiation-induced time-dependent background, have been identified and incorporated into the analysis. However, further understanding and treatment of the radiation-induced background requires additional dedicated measurements and remains the current limiting factor in extracting a beta-neutrino angular correlation coefficient for 35Ar decay using the WITCH spectrometer.
Lunt, Mark
2015-07-01
In the first article in this series we explored the use of linear regression to predict an outcome variable from a number of predictive factors. It assumed that the predictive factors were measured on an interval scale. However, this article shows how categorical variables can also be included in a linear regression model, enabling predictions to be made separately for different groups and allowing for testing the hypothesis that the outcome differs between groups. The use of interaction terms to measure whether the effect of a particular predictor variable differs between groups is also explained. An alternative approach to testing the difference between groups of the effect of a given predictor, which consists of measuring the effect in each group separately and seeing whether the statistical significance differs between the groups, is shown to be misleading. © The Author 2013. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
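A minimal Python sketch of the approach described in this article, using simulated data and statsmodels' formula interface: a categorical grouping variable enters the linear model as dummy variables, and a group-by-predictor interaction tests whether the slope differs between groups:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n = 120
    df = pd.DataFrame({
        "x": rng.normal(50, 10, n),                  # interval-scale predictor
        "group": rng.choice(["A", "B", "C"], size=n),
    })
    slopes = df["group"].map({"A": 0.5, "B": 0.5, "C": 0.9})
    df["y"] = 2 + slopes * df["x"] + rng.normal(0, 5, n)

    # Main effects plus interaction: C(group) creates dummy variables
    fit = smf.ols("y ~ x * C(group)", data=df).fit()
    print(fit.summary())   # x:C(group)[T.C] term tests the slope difference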
Objective Assessment of Vergence after Treatment of Concussion-Related CI: A Pilot Study
Scheiman, Mitchell; Talasan, Henry; Mitchell, Gladys L; Alvarez, Tara L.
2016-01-01
Purpose: To evaluate changes in objective measures of disparity vergence after office-based vision therapy (OBVT) for concussion-related convergence insufficiency (CI), and determine the feasibility of using this objective assessment as an outcome measure in a clinical trial. Methods: This was a prospective, observational trial. All participants were treated with weekly OBVT with home reinforcement. Participants included two adolescents and three young adults with concussion-related, symptomatic CI. The primary outcome measure was average peak velocity for 4-degree symmetrical convergence steps. Other objective outcome measures of disparity vergence included time to peak velocity, latency, accuracy, settling time, and main sequence. We also evaluated saccadic eye movements using the same outcome measures. Changes in clinical measures (near point of convergence, positive fusional vergence at near, Convergence Insufficiency Symptom Survey (CISS) score) were evaluated. Results: There were statistically significant and clinically meaningful changes in all clinical measures for convergence. Four of the five subjects met clinical success criteria. For the objective measures, we found a statistically significant increase in peak velocity, response accuracy to 4° symmetrical convergence and divergence step stimuli and the main sequence ratio for convergence step stimuli. Objective saccadic eye movements (5° and 10°) appeared normal pre-OBVT, and did not show any significant change after treatment. Conclusions: This is the first report of the use of objective measures of disparity vergence as outcome measures for concussion-related convergence insufficiency. These measures provide additional information that is not accessible with clinical tests about underlying physiological mechanisms leading to changes in clinical findings and symptoms. The study results also demonstrate that patients with concussion can tolerate the visual demands (over 200 vergence and versional eye movements) during the 25-minute testing time and suggest that these measures could be used in a large-scale randomized clinical trial of concussion-related CI as outcome measures. PMID:27464574
Objective Assessment of Vergence after Treatment of Concussion-Related CI: A Pilot Study.
Scheiman, Mitchell M; Talasan, Henry; Mitchell, G Lynn; Alvarez, Tara L
2017-01-01
To evaluate changes in objective measures of disparity vergence after office-based vision therapy (OBVT) for concussion-related convergence insufficiency (CI) and determine the feasibility of using this objective assessment as an outcome measure in a clinical trial. This was a prospective, observational trial. All participants were treated with weekly OBVT with home reinforcement. Participants included two adolescents and three young adults with concussion-related, symptomatic CI. The primary outcome measure was average peak velocity for 4° symmetrical convergence steps. Other objective outcome measures of disparity vergence included time to peak velocity, latency, accuracy, settling time, and main sequence. We also evaluated saccadic eye movements using the same outcome measures. Changes in clinical measures (near point of convergence, positive fusional vergence at near, Convergence Insufficiency Symptom Survey [CISS] score) were evaluated. There were statistically significant and clinically meaningful changes in all clinical measures for convergence. Four of the five subjects met clinical success criteria. For the objective measures, we found a statistically significant increase in peak velocity, response accuracy to 4° symmetrical convergence and divergence step stimuli, and the main sequence ratio for convergence step stimuli. Objective saccadic eye movements (5 and 10°) appeared normal pre-OBVT and did not show any significant change after treatment. This is the first report of the use of objective measures of disparity vergence as outcome measures for concussion-related convergence insufficiency. These measures provide additional information that is not accessible with clinical tests about underlying physiological mechanisms leading to changes in clinical findings and symptoms. The study results also demonstrate that patients with concussion can tolerate the visual demands (over 200 vergence and versional eye movements) during the 25-minute testing time and suggest that these measures could be used in a large-scale randomized clinical trial of concussion-related CI as outcome measures.
Ultrasound assessment of bladder wall thickness as a screening test for detrusor instability.
Abou-Gamrah, Amgad; Fawzy, Mounir; Sammour, Hazem; Tadros, Sherif
2014-05-01
The aim of the current study was to evaluate the diagnostic accuracy of transvaginal ultrasound measurement of bladder wall thickness (BWT) in the diagnosis of overactive bladder (OAB). The current prospective study was conducted at Ain Shams University Maternity Hospital over 2 years. Patients presenting to the urogynecology outpatient clinic with symptoms of urinary frequency, urgency, nocturia and/or urge incontinence were included in this study. The allocated patients were divided into two groups. Group 1 (study group): fifty (50) patients with a urodynamic diagnosis of detrusor instability (OAB). Group 2 (control): fifty (50) patients with a urodynamic diagnosis of stress incontinence. Using a transvaginal probe, BWT was measured at the thickest part of three sites: (a) the dome of the bladder, (b) the trigone, and (c) the anterior wall of the bladder. The average of the three measurements was taken as the mean bladder wall thickness. A total of 100 patients with lower urinary tract symptoms were finally analyzed. There were no statistically significant differences between the groups regarding age, parity and body mass index, while disease duration was significantly longer in group 2. Excluding urgency, there were statistically significant differences (P < 0.001) in lower urinary tract symptoms, namely frequency, urgency incontinence, coital incontinence and nocturia. Patients in group 1 were more positive for symptoms of frequency, urgency incontinence, and nocturia, while patients in group 2 were more positive for coital incontinence. The thickness of the trigone, dome and anterior wall, and the mean BWT, were significantly higher in group 1 than in group 2. A receiver operating characteristic curve was constructed to estimate the association between mean BWT and prediction of OAB in patients with lower urinary tract symptoms. A mean BWT of 4.78 mm was the best cut-off value for prediction of OAB, with a sensitivity of 90% and a specificity of 78%. Mean BWT > 4.78 mm was significantly associated with OAB, as denoted by the significantly large area under the curve (AUC); the AUC was 0.905. In women with lower urinary tract symptoms, transvaginally measured mean BWT seems to be an effective non-invasive diagnostic tool for prediction of OAB.
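A minimal Python sketch of how such a best cut-off can be chosen from a receiver operating characteristic curve via the Youden index; the BWT values are simulated, not the study's data:

    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(5)
    bwt = np.concatenate([rng.normal(5.5, 0.8, 50),   # detrusor instability group
                          rng.normal(4.2, 0.8, 50)])  # stress incontinence group
    oab = np.concatenate([np.ones(50), np.zeros(50)])

    fpr, tpr, thresholds = roc_curve(oab, bwt)
    best = np.argmax(tpr - fpr)                        # Youden's J statistic
    print(roc_auc_score(oab, bwt))                     # area under the curve
    print(thresholds[best], tpr[best], 1 - fpr[best])  # cut-off, sensitivity, specificity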
Perlin, Mark William
2015-01-01
Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI-1 value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI-1) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI-1 increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information. A quantitative CPI number adds little meaningful information beyond the analyst's initial qualitative assessment that a person's DNA is included in a mixture. Statistical methods for reporting on DNA mixture evidence should be scientifically validated before they are relied upon by criminal justice. PMID:26605124
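For concreteness, a minimal Python sketch of the CPI statistic discussed above, using the standard textbook formula (the per-locus probability of inclusion is the squared sum of the population frequencies of the alleles observed in the mixture, and CPI is the product across loci); the allele frequencies are invented:

    import numpy as np

    loci_allele_freqs = [
        [0.10, 0.20, 0.15],        # locus 1: frequencies of observed alleles
        [0.05, 0.30],              # locus 2
        [0.12, 0.18, 0.25, 0.08],  # locus 3
    ]

    pi_per_locus = [sum(freqs) ** 2 for freqs in loci_allele_freqs]
    cpi = np.prod(pi_per_locus)
    print(cpi, 1 / cpi)   # CPI and the reported "1 in ..." inclusion statistic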
Abar, Orhan; Charnigo, Richard J.; Rayapati, Abner
2017-01-01
Association rule mining has received significant attention from both the data mining and machine learning communities. While data mining researchers focus more on designing efficient algorithms to mine rules from large datasets, the learning community has explored applications of rule mining to classification. A major problem with rule mining algorithms is the explosion of rules even for moderate-sized datasets, making it very difficult for end users to identify both statistically significant and potentially novel rules that could lead to interesting new insights and hypotheses. Researchers have proposed many domain-independent interestingness measures with which one can rank the rules and potentially glean useful rules from the top-ranked ones. However, these measures have not been fully explored for rule mining in clinical datasets, owing to the relatively large sizes of the datasets often encountered in healthcare and also due to limited access to domain experts for review/analysis. In this paper, using an electronic medical record (EMR) dataset of diagnoses and medications from over three million patient visits to the University of Kentucky medical center and affiliated clinics, we conduct a thorough evaluation of dozens of interestingness measures proposed in data mining literature, including some new composite measures. Using cumulative relevance metrics from information retrieval, we compare these interestingness measures against human judgments obtained from a practicing psychiatrist for association rules involving the depressive disorders class as the consequent. Our results not only surface new interesting associations for depressive disorders but also indicate classes of interestingness measures that weight rule novelty and statistical strength in contrasting ways, offering new insights for end users in identifying interesting rules. PMID:28736771
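A minimal Python sketch of a few widely used domain-independent interestingness measures for a rule A -> B, computed from a 2x2 contingency table of invented transaction counts (these are generic definitions, not the paper's composite measures):

    # Counts: both A and B; A only; B only; neither
    n_ab, n_a_only, n_b_only, n_neither = 120, 80, 300, 500
    n = n_ab + n_a_only + n_b_only + n_neither

    p_a = (n_ab + n_a_only) / n
    p_b = (n_ab + n_b_only) / n
    p_ab = n_ab / n

    support = p_ab
    confidence = p_ab / p_a                  # P(B | A)
    lift = confidence / p_b                  # >1 suggests positive association
    leverage = p_ab - p_a * p_b              # difference from independence
    conviction = (1 - p_b) / (1 - confidence) if confidence < 1 else float("inf")

    print(support, confidence, lift, leverage, conviction)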
Accounting for measurement error: a critical but often overlooked process.
Harris, Edward F; Smith, Richard N
2009-12-01
Due to instrument imprecision and human inconsistencies, measurements are not free of error. Technical error of measurement (TEM) is the variability encountered between dimensions when the same specimens are measured at multiple sessions. A goal of a data collection regimen is to minimise TEM. The few studies that actually quantify TEM, regardless of discipline, report that it is substantial and can affect results and inferences. This paper reviews some statistical approaches for identifying and controlling TEM. Statistically, TEM is part of the residual ('unexplained') variance in a statistical test, so accounting for TEM, which requires repeated measurements, enhances the chances of finding a statistically significant difference if one exists. The aim of this paper was to review and discuss common statistical designs relating to types of error and statistical approaches to error accountability. This paper addresses issues of landmark location, validity, technical and systematic error, analysis of variance, scaled measures and correlation coefficients in order to guide the reader towards correct identification of true experimental differences. Researchers commonly infer characteristics about populations from comparatively restricted study samples. Most inferences are statistical and, aside from concerns about adequate accounting for known sources of variation with the research design, an important source of variability is measurement error. Variability in locating landmarks that define variables is obvious in odontometrics, cephalometrics and anthropometry, but the same concerns about measurement accuracy and precision extend to all disciplines. With increasing accessibility to computer-assisted methods of data collection, the ease of incorporating repeated measures into statistical designs has improved. Accounting for this technical source of variation increases the chance of finding biologically true differences when they exist.
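A minimal Python sketch of the duplicate-measurement calculation behind TEM (Dahlberg's formula), with invented paired measurements in mm; relative TEM expresses the error as a percentage of the mean:

    import numpy as np

    session1 = np.array([10.2, 11.5, 9.8, 12.1, 10.9, 11.0])
    session2 = np.array([10.4, 11.3, 9.9, 12.4, 10.6, 11.1])

    d = session1 - session2
    tem = np.sqrt(np.sum(d ** 2) / (2 * len(d)))              # same units as the data
    rel_tem = 100 * tem / np.mean(np.r_[session1, session2])  # percent of mean
    print(tem, rel_tem)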
Sources of international migration statistics in Africa.
1984-01-01
The sources of international migration data for Africa may be classified into 2 main categories: 1) administrative records and 2) censuses and survey data. Both categories are sources for the direct measurement of migration, but the 2nd category can be used for the indirect estimation of net international migration. The administrative records from which data on international migration may be derived include 1) entry/departure cards or forms completed at international borders, 2) residence/work permits issued to aliens, and 3) general population registers and registers of aliens. The statistics derived from the entry/departure cards may be described as 1) land frontier control statistics and 2) port control statistics. The former refer to data derived from movements across land borders and the latter refer to information collected at international airports and seaports. Other administrative records which are potential sources of statistics on international migration in some African countries include some limited population registers, records of the registration of aliens, and particulars of residence/work permits issued to aliens. Although frontier control data are considered the most important source of international migration statistics, in many African countries these data are too deficient to provide a satisfactory indication of the level of international migration. Thus decennial population censuses and/or sample surveys are the major sources of the available statistics on the stock and characteristics of international migration. Indirect methods can be used to supplement census data with intercensal estimates of net migration using census data on the total population. This indirect method of obtaining information on migration can be used to evaluate estimates derived from frontier control records, and it also offers the means of obtaining alternative information on international migration in African countries which have not directly investigated migration topics in their censuses or surveys.
Gooding, Holly C; Ning, Hongyan; Gillman, Matthew W; Shay, Christina; Allen, Norrina; Goff, David C; Lloyd-Jones, Donald; Chiuve, Stephanie
2017-09-01
Few tools exist for assessing the risk for early atherosclerotic cardiovascular disease (ASCVD) events in young adults. To assess the performance of the Healthy Heart Score (HHS), a lifestyle-based tool that estimates ASCVD events in older adults, for ASCVD events occurring before 55 years of age. This prospective cohort study included 4893 US adults aged 18 to 30 years from the Coronary Artery Risk Development in Young Adults (CARDIA) study. Participants underwent measurement of lifestyle factors from March 25, 1985, through June 7, 1986, and were followed up for a median of 27.1 years (interquartile range, 26.9-27.2 years). Data for this study were analyzed from February 24 through December 12, 2016. The HHS includes age, smoking status, body mass index, alcohol intake, exercise, and a diet score composed of self-reported daily intake of cereal fiber, fruits and/or vegetables, nuts, sugar-sweetened beverages, and red and/or processed meats. The HHS in the CARDIA study was calculated using sex-specific equations produced by its derivation cohorts. The ability of the HHS to assess the 25-year risk for ASCVD (death from coronary heart disease, nonfatal myocardial infarction, and fatal or nonfatal ischemic stroke) in the total sample, in race- and sex-specific subgroups, and in those with and without clinical ASCVD risk factors at baseline. Model discrimination was assessed with the Harrell C statistic; model calibration, with Greenwood-Nam-D'Agostino statistics. The study population of 4893 participants included 2205 men (45.1%) and 2688 women (54.9%) with a mean (SD) age at baseline of 24.8 (3.6) years; 2483 (50.7%) were black; and 427 (8.7%) had at least 1 clinical ASCVD risk factor (hypertension, hyperlipidemia, or diabetes types 1 and 2). Among these participants, 64 premature ASCVD events occurred in women and 99 in men. The HHS showed moderate discrimination for ASCVD risk assessment in this diverse population of mostly healthy young adults (C statistic, 0.71; 95% CI, 0.66-0.76); it performed better in men (C statistic, 0.74; 95% CI, 0.68-0.79) than in women (C statistic, 0.69; 95% CI, 0.62-0.75); in white (C statistic, 0.77; 95% CI, 0.71-0.84) than in black (C statistic, 0.66; 95% CI, 0.60-0.72) participants; and in those without (C statistic, 0.71; 95% CI, 0.66-0.76) vs with (C statistic, 0.64; 95% CI, 0.55-0.73) clinical risk factors at baseline. The HHS was adequately calibrated overall and within each subgroup. The HHS, when measured in younger persons without ASCVD risk factors, performs moderately well in assessing risk for ASCVD events by early middle age. Its reliance on self-reported, modifiable lifestyle factors makes it an attractive tool for risk assessment and counseling for early ASCVD prevention.
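For readers unfamiliar with the discrimination metric reported above, a minimal Python sketch of the C statistic for a binary outcome (the proportion of event/non-event pairs in which the event case received the higher risk score), using invented scores:

    import numpy as np

    def c_statistic(risk, event):
        # Pairwise concordance; ties between a case and a control count 0.5
        risk, event = np.asarray(risk, float), np.asarray(event, bool)
        cases, controls = risk[event], risk[~event]
        conc = tie = 0.0
        for r_case in cases:
            conc += np.sum(r_case > controls)
            tie += np.sum(r_case == controls)
        return (conc + 0.5 * tie) / (len(cases) * len(controls))

    risk = [0.9, 0.7, 0.8, 0.3, 0.4, 0.2, 0.6, 0.1]
    event = [1, 1, 0, 0, 1, 0, 0, 0]
    print(c_statistic(risk, event))  # 0.5 = chance, 1.0 = perfect discrimination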
Counterfactual entanglement and nonlocal correlations in separable states
NASA Astrophysics Data System (ADS)
Cohen, Oliver
1999-07-01
It is shown that the outcomes of measurements on systems in separable mixed states can be partitioned, via subsequent measurements on a disentangled extraneous system, into subensembles that display the statistics of entangled states. This motivates the introduction of the concept of "counterfactual" entanglement, which can be associated with all separable mixed states, including those that are factorable. This type of entanglement gives rise to a kind of postselection-induced Bell inequality violation. The significance of counterfactual entanglement, and its physical implications, are assessed.
Enzinger, Ewald; Morrison, Geoffrey Stewart; Ochoa, Felipe
2016-01-01
The new paradigm for the evaluation of the strength of forensic evidence includes: The use of the likelihood-ratio framework. The use of relevant data, quantitative measurements, and statistical models. Empirical testing of validity and reliability under conditions reflecting those of the case under investigation. Transparency as to decisions made and procedures employed. The present paper illustrates the use of the new paradigm to evaluate strength of evidence under conditions reflecting those of a real forensic-voice-comparison case. The offender recording was from a landline telephone system, had background office noise, and was saved in a compressed format. The suspect recording included substantial reverberation and ventilation system noise, and was saved in a different compressed format. The present paper includes descriptions of the selection of the relevant hypotheses, sampling of data from the relevant population, simulation of suspect and offender recording conditions, and acoustic measurement and statistical modelling procedures. The present paper also explores the use of different techniques to compensate for the mismatch in recording conditions. It also examines how system performance would have differed had the suspect recording been of better quality. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
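A toy Python sketch of the likelihood-ratio framework mentioned above, with both hypotheses modelled as Gaussians fitted to invented same-speaker and different-speaker score distributions; real forensic-voice-comparison systems are far more elaborate:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    same_scores = rng.normal(2.0, 0.8, 200)    # scores from same-speaker pairs
    diff_scores = rng.normal(-1.0, 1.2, 2000)  # scores from different-speaker pairs

    same_pdf = stats.norm(same_scores.mean(), same_scores.std(ddof=1))
    diff_pdf = stats.norm(diff_scores.mean(), diff_scores.std(ddof=1))

    x = 1.5                                    # score for the case comparison
    lr = same_pdf.pdf(x) / diff_pdf.pdf(x)     # p(x | same) / p(x | different)
    print(lr, np.log10(lr))                    # LR and log10-LR for reporting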
NASA Astrophysics Data System (ADS)
Vanchikova, E. V.; Shamrikova, E. V.; Bespyatykh, N. V.; Kyz'yurova, E. V.; Kondratenok, B. M.
2015-02-01
Metrological characteristics—precision, trueness, and accuracy—of the results of measurements of the exchangeable acidity and its components by the potentiometric titration method were studied on the basis of multiple analyses of the soil samples with the examination of statistical data for the outliers and their correspondence to the normal distribution. Measurement errors were estimated. The applied method was certified by the Metrological Center of the Uralian Branch of the Russian Academy of Sciences (certificate no. 88-17641-094-2013) and included in the Federal Information Fund on Assurance of Measurements (FR 1.31.2013.16382).
Ladd, David E.; Law, George S.
2007-01-01
The U.S. Geological Survey (USGS) provides streamflow and other stream-related information needed to protect people and property from floods, to plan and manage water resources, and to protect water quality in the streams. Streamflow statistics provided by the USGS, such as the 100-year flood and the 7-day 10-year low flow, frequently are used by engineers, land managers, biologists, and many others to help guide decisions in their everyday work. In addition to streamflow statistics, resource managers often need to know the physical and climatic characteristics (basin characteristics) of the drainage basins for locations of interest to help them understand the mechanisms that control water availability and water quality at these locations. StreamStats is a Web-enabled geographic information system (GIS) application that makes it easy for users to obtain streamflow statistics, basin characteristics, and other information for USGS data-collection stations and for ungaged sites of interest. If a user selects the location of a data-collection station, StreamStats will provide previously published information for the station from a database. If a user selects a location where no data are available (an ungaged site), StreamStats will run a GIS program to delineate a drainage basin boundary, measure basin characteristics, and estimate streamflow statistics based on USGS streamflow prediction methods. A user can download a GIS feature class of the drainage basin boundary with attributes including the measured basin characteristics and streamflow estimates.
Quality of statistical reporting in developmental disability journals.
Namasivayam, Aravind K; Yan, Tina; Wong, Wing Yiu Stephanie; van Lieshout, Pascal
2015-12-01
Null hypothesis significance testing (NHST) dominates quantitative data analysis, but its use is controversial and has been heavily criticized. The American Psychological Association has advocated the reporting of effect sizes (ES), confidence intervals (CIs), and statistical power analysis to complement NHST results and provide a more comprehensive understanding of research findings. The aim of this paper is to carry out a sample survey of statistical reporting practices in two journals with the highest h5-index scores in the areas of developmental disability and rehabilitation. Using a checklist that includes critical recommendations by the American Psychological Association, we examined 100 randomly selected articles out of 456 articles reporting inferential statistics in the year 2013 in the Journal of Autism and Developmental Disorders (JADD) and Research in Developmental Disabilities (RDD). The results showed that for both journals, ES were reported only about half the time (JADD 59.3%; RDD 55.87%). These findings are similar to psychology journals, but are in stark contrast to ES reporting in educational journals (73%). Furthermore, a priori power and sample size determination (JADD 10%; RDD 6%), along with reporting and interpreting precision measures (CI: JADD 13.33%; RDD 16.67%), were the least reported metrics in these journals, but not dissimilar to journals in other disciplines. To advance the science in developmental disability and rehabilitation and to bridge the research-to-practice divide, reforms in statistical reporting, such as providing supplemental measures to NHST, are clearly needed.
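As context for the reporting practices surveyed here, an effect size and its confidence interval are straightforward to compute alongside an NHST result. A minimal sketch for Cohen's d with a large-sample CI (illustrative only; the Hedges-Olkin variance approximation is assumed):

```python
import numpy as np
from scipy import stats

def cohens_d_with_ci(x, y, alpha=0.05):
    """Cohen's d for two independent groups, with an approximate CI."""
    nx, ny = len(x), len(y)
    sp = np.sqrt(((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1))
                 / (nx + ny - 2))                     # pooled SD
    d = (np.mean(x) - np.mean(y)) / sp
    # Large-sample standard error of d (Hedges & Olkin approximation)
    se = np.sqrt((nx + ny) / (nx * ny) + d**2 / (2 * (nx + ny)))
    z = stats.norm.ppf(1 - alpha / 2)
    return d, (d - z * se, d + z * se)

rng = np.random.default_rng(0)
x, y = rng.normal(0.4, 1, 40), rng.normal(0.0, 1, 40)
d, ci = cohens_d_with_ci(x, y)
print(f"d = {d:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```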
Elhai, Jon D; Palmieri, Patrick A
2011-08-01
We present an update of recent literature (since 2007) exploring the factor structure of posttraumatic stress disorder (PTSD) symptom measures. Research supporting a four-factor emotional numbing model and a four-factor dysphoria model is presented, with these models fitting better than all other models examined. Variables accounting for factor structure differences are reviewed, including PTSD query instructions, type of PTSD measure, extent of trauma exposure, ethnicity, and timing of administration. Methodological and statistical limitations with recent studies are presented. Finally, a research agenda and recommendations are offered to push this research area forward, including suggestions to validate PTSD’s factors against external measures of psychopathology, test moderators of factor structure, and examine heterogeneity of symptom presentations based on factor structure examination.
Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J
2016-03-01
Previously published statistical models of driving posture have been effective for vehicle design but have not taken into account the effects of age. The present study developed new statistical models for predicting driving posture. Driving postures of 90 U.S. drivers with a wide range of age and body size were measured in a laboratory mockup in nine package conditions. Posture-prediction models for female and male drivers were separately developed by employing a stepwise regression technique using age, body dimensions, vehicle package conditions, and two-way interactions, among other variables. Driving posture was significantly associated with age, and the effects of other variables depended on age. A set of posture-prediction models is presented for women and men. The results are compared with a previously developed model. The present study is the first study of driver posture to include a large cohort of older drivers and the first to report a significant effect of age. The posture-prediction models can be used to position computational human models or crash-test dummies for vehicle design and assessment. © 2015, Human Factors and Ergonomics Society.
Machine learning to analyze images of shocked materials for precise and accurate measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dresselhaus-Cooper, Leora; Howard, Marylesa; Hock, Margaret C.
A supervised machine learning algorithm, called locally adaptive discriminant analysis (LADA), has been developed to locate boundaries between identifiable image features that have varying intensities. LADA is an adaptation of image segmentation, which includes techniques that find the positions of image features (classes) using statistical intensity distributions for each class in the image. In order to place a pixel in the proper class, LADA considers the intensity at that pixel and the distribution of intensities in local (nearby) pixels. This paper presents the use of LADA to provide, with statistical uncertainties, the positions and shapes of features within ultrafast images of shock waves. We demonstrate the ability to locate image features including crystals, density changes associated with shock waves, and material jetting caused by shock waves. This algorithm can analyze images that exhibit a wide range of physical phenomena because it does not rely on comparison to a model. LADA enables analysis of images from shock physics with statistical rigor independent of underlying models or simulations.
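A rough idea of locally adaptive, distribution-based pixel classification can be conveyed in a few lines. The sketch below is not the published LADA algorithm, only a simplified stand-in that scores each pixel jointly on its own intensity and its neighbourhood mean under per-class Gaussian intensity models:

```python
import numpy as np
from scipy import ndimage

def segment_local(image, class_means, class_sds, radius=2):
    """Assign each pixel to the class whose Gaussian intensity model best
    explains both the pixel itself and the mean of its local neighbourhood.
    A simplified sketch in the spirit of LADA, not the published algorithm."""
    local_mean = ndimage.uniform_filter(image, size=2 * radius + 1)
    best_ll, labels = None, np.zeros(image.shape, dtype=int)
    for k, (mu, sd) in enumerate(zip(class_means, class_sds)):
        # Joint log-likelihood of pixel intensity and local-mean intensity
        ll = -((image - mu) ** 2 + (local_mean - mu) ** 2) / (2 * sd**2) - np.log(sd)
        if best_ll is None:
            best_ll = ll
        else:
            labels = np.where(ll > best_ll, k, labels)
            best_ll = np.maximum(ll, best_ll)
    return labels
```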
Loring, David W; Larrabee, Glenn J
2006-06-01
The Halstead-Reitan Battery has been instrumental in the development of neuropsychological practice in the United States. Although Reitan administered both the Wechsler-Bellevue Intelligence Scale and Halstead's test battery when evaluating Halstead's theory of biologic intelligence, the relative sensitivity of each test battery to brain damage continues to be an area of controversy. Because Reitan did not perform direct parametric analysis to contrast group performances, we reanalyze Reitan's original validation data from both Halstead (Reitan, 1955) and Wechsler batteries (Reitan, 1959a) and calculate effect sizes and probability levels using traditional parametric approaches. Eight of the 10 tests comprising Halstead's original Impairment Index, as well as the Impairment Index itself, statistically differentiated patients with unequivocal brain damage from controls. In addition, 13 of 14 Wechsler measures including Full-Scale IQ also differed statistically between groups (Brain Damage Full-Scale IQ = 96.2; Control Group Full Scale IQ = 112.6). We suggest that differences in the statistical properties of each battery (e.g., raw scores vs. standardized scores) likely contribute to classification characteristics including test sensitivity and specificity.
Measuring Student Learning in Social Statistics: A Pretest-Posttest Study of Knowledge Gain
ERIC Educational Resources Information Center
Delucchi, Michael
2014-01-01
This study used a pretest-posttest design to measure student learning in undergraduate statistics. Data were derived from 185 students enrolled in six different sections of a social statistics course taught over a seven-year period by the same sociology instructor. The pretest-posttest instrument reveals statistically significant gains in…
Assessment of Muscle Fatigue Associated with Prolonged Standing in the Workplace
Omar, Abdul Rahman; Saman, Alias Mohd; Othman, Ibrahim
2012-01-01
Objectives The objectives of this study were to determine the psychological fatigue and analyze the muscle activity of production workers performing process jobs while standing for prolonged time periods. Methods The psychological fatigue experienced by the workers was obtained through questionnaire surveys. Meanwhile, muscle activity was analyzed using surface electromyography (sEMG) measurement. Lower extremity muscles, including the erector spinae, tibialis anterior, and gastrocnemius, were concurrently measured during more than five hours of standing. Twenty male production workers in a metal stamping company participated as subjects in this study. The subjects were required to undergo questionnaire surveys and sEMG measurement. Results The questionnaire surveys found that all subjects experienced psychological fatigue due to prolonged standing jobs. Similarly, muscle fatigue was identified through sEMG measurement. Based on the non-parametric Spearman's rank-order correlation, the left erector spinae showed a moderate, statistically significant positive correlation (rs = 0.552, p < 0.05) between the results of the questionnaire surveys and the sEMG measurement. Conclusion Based on this study, the authors concluded that prolonged standing contributed to psychological fatigue and muscle fatigue among the production workers. PMID:22953228
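The reported association can be reproduced in form with scipy's Spearman rank-order correlation; the paired data below are invented for illustration:

```python
from scipy.stats import spearmanr

# Hypothetical paired observations: questionnaire fatigue scores and an
# sEMG fatigue indicator for the same workers (illustrative values only).
survey = [3, 5, 4, 2, 5, 4, 3, 5, 2, 4]
semg   = [0.21, 0.55, 0.40, 0.15, 0.60, 0.35, 0.30, 0.52, 0.18, 0.44]

rs, p = spearmanr(survey, semg)
print(f"rs = {rs:.3f}, p = {p:.3f}")  # e.g. a moderate positive correlation
```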
NASA Astrophysics Data System (ADS)
Oregui, M.; Li, Z.; Dollevoet, R.
2015-03-01
In this paper, the feasibility of the Frequency Response Function (FRF)-based statistical method to identify the characteristic frequencies of railway track defects is studied. The method compares a damaged track state to a healthy state based on non-destructive field hammer test measurements. First, a study is carried out to investigate the repeatability of hammer tests in railway tracks. By changing the excitation and measurement locations it is shown that the variability introduced by the test process is negligible. Second, following the concepts of control charts employed in process monitoring, a method to define an approximate healthy state is introduced by using hammer test measurements at locations without visual damage. Then, the feasibility study includes an investigation into squats (i.e. a major type of rail surface defect) of varying severity. The identified frequency ranges related to squats agree with those found in an extensively validated vehicle-borne detection system. Therefore, the FRF-based statistical method in combination with the non-destructive hammer test measurements has the potential to be employed to identify the characteristic frequencies of damaged conditions in railway tracks in the frequency range of 300-3000 Hz.
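The FRF itself is typically estimated from hammer-test records with the H1 estimator, the ratio of the force-response cross-spectrum to the force auto-spectrum. A generic sketch (not the authors' processing chain; the signals below are synthetic):

```python
import numpy as np
from scipy import signal

def frf_h1(force, response, fs, nperseg=1024):
    """H1 frequency response function estimate from hammer-test data:
    H1(f) = S_fx(f) / S_ff(f).  Generic sketch, not the paper's code."""
    f, S_ff = signal.welch(force, fs=fs, nperseg=nperseg)
    _, S_fx = signal.csd(force, response, fs=fs, nperseg=nperseg)
    return f, S_fx / S_ff

# Hypothetical signals: impact force and rail response sampled at 10 kHz.
fs = 10_000
rng = np.random.default_rng(1)
force = rng.normal(size=fs)                               # stand-in excitation
response = np.convolve(force, np.exp(-np.arange(200) / 20), mode="same")
f, H = frf_h1(force, response, fs)
```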
The large sample size fallacy.
Lantz, Björn
2013-06-01
Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
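The fallacy is easy to demonstrate by simulation: with a large enough sample, a trivial true effect yields an extreme p-value. A minimal sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200_000                      # very large samples
x = rng.normal(0.00, 1, n)
y = rng.normal(0.02, 1, n)       # trivially small true effect (d = 0.02)

t, p = stats.ttest_ind(x, y)
d = (y.mean() - x.mean()) / np.sqrt((x.var(ddof=1) + y.var(ddof=1)) / 2)
print(f"p = {p:.1e}, d = {d:.3f}")   # p can be tiny while d stays trivial
```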
Kratochwill, Thomas R; Levin, Joel R
2014-04-01
In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
A critique of Rasch residual fit statistics.
Karabatsos, G
2000-01-01
In test analysis involving the Rasch model, a large degree of importance is placed on the "objective" measurement of individual abilities and item difficulties. The degree to which the objectivity properties are attained, of course, depends on the degree to which the data fit the Rasch model. It is therefore important to utilize fit statistics that accurately and reliably detect the person-item response inconsistencies that threaten the measurement objectivity of persons and items. Given this argument, it is somewhat surprising that far more emphasis is placed on the objective measurement of persons and items than on the measurement quality of Rasch fit statistics. This paper provides a critical analysis of the residual fit statistics of the Rasch model, arguably the most often used fit statistics, in an effort to illustrate that the task of Rasch fit analysis is not as simple and straightforward as it appears to be. The faulty statistical properties of the residual fit statistics do not allow either a convenient or a straightforward approach to Rasch fit analysis. For instance, given a residual fit statistic, the use of a single minimum critical value for misfit diagnosis across different testing situations, where the situations vary in sample and test properties, leads to both the overdetection and underdetection of misfit. To improve this situation, it is argued that psychometricians need to implement residual-free Rasch fit statistics that are based on the number of Guttman response errors, or use indices that are statistically optimal in detecting measurement disturbances.
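For concreteness, the residual fit statistics under discussion are usually the outfit (unweighted) and infit (information-weighted) mean squares of standardized Rasch residuals. A textbook-style sketch, assuming person and item parameters are already known:

```python
import numpy as np

def rasch_residual_fit(X, theta, beta):
    """Outfit and infit mean-square statistics for dichotomous Rasch data.
    X: persons x items matrix of 0/1 responses.  Textbook-style sketch."""
    P = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))  # expected scores
    W = P * (1 - P)                                          # response variance
    Z2 = (X - P) ** 2 / W                                    # squared std. residuals
    outfit_items = Z2.mean(axis=0)                           # unweighted
    infit_items = (W * Z2).sum(axis=0) / W.sum(axis=0)       # information-weighted
    return outfit_items, infit_items
```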
Booth, Vicky; Masud, Tahir; Bath-Hextall, Fiona
Balance impairment can result in falls and reduced activities of daily living and function. Virtual reality and interactive gaming systems provide a novel and potentially environmentally flexible treatment option to improve postural stability and reduce falls in balance-impaired populations. There are no existing systematic reviews in this topic area. Objective: to search, critically appraise, and synthesise the best available evidence on whether virtual reality interventions, including interactive gaming systems, are effective at improving balance in adults with impaired balance. Types of participants: adults with impaired, altered, or reduced balance, identified either through a reduced balance outcome measure score or through an increased risk or incidence of falls. Types of interventions: any virtual reality or interactive gaming system used within a rehabilitative setting. The primary outcome was an objective measure of balance (i.e., a balance outcome measure such as the Berg Balance Score) or the number and/or incidence of falls. Secondary outcome measures of interest included any adverse effects experienced, an outcome measure indicating functional balance (i.e., walking speed), quality of life (through use of an objective measure, i.e., EuroQOL), and the number of days in hospital due to falls. Types of studies: randomised controlled trials (RCTs). A three-stage strategy searched the following electronic databases: The Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, AMED, CINAHL, PsycINFO, PsycBITE, OTseeker, Ei Compendex, Inspec, Current Controlled Trials, and the National Institute of Health Clinical Trials Database. The methodological quality of each included study was independently assessed using the Joanna Briggs Institute Meta Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI) to systematically comment on the influence of bias. Data were individually extracted from the included studies using the standardised JBI data extraction tool from JBI-MAStARI. Data were analysed using Review Manager 5 software. Results were expressed as mean difference (MD) with 95% confidence intervals for continuous outcomes. Meta-analysis was not possible because of the variation among the interventions and the small number of included trials; hence, a description of the results is given. Four studies were included in the systematic review. All the included studies used different types of virtual reality or interactive gaming interventions. Two of the included studies used the same balance outcome measure; there was a notable inconsistency of balance outcome measurement across all the included studies. No data were given regarding falls in any of the studies. A secondary outcome, the 10 m walk test, was recorded in two of the studies. The four included studies had small sample sizes and poor methodological quality. Despite the presentation of statistically significant results, the clinical significance is questionable. The review cannot recommend the inclusion of virtual reality or interactive gaming systems in the rehabilitation of balance impairment based on the results of the four included studies. Further investigation in this topic area is required.
Measuring Microaggression and Organizational Climate Factors in Military Units
2011-04-01
…(i.e., items) to accurately assess what we intend for them to measure. To assess construct and convergent validity, the author assessed the statistical… sample indicated both convergent and construct validity of the microaggression scale. Table 5 presents these statistics. …models. As shown in Table 7, the measurement models had acceptable fit indices. That is, the Chi-square statistics were at their minimum; although the…
Using Asymptotic Results to Obtain a Confidence Interval for the Population Median
ERIC Educational Resources Information Center
Jamshidian, M.; Khatoonabadi, M.
2007-01-01
Almost all introductory and intermediate level statistics textbooks include the topic of confidence interval for the population mean. Almost all these texts introduce the median as a robust measure of central tendency. Only a few of these books, however, cover inference on the population median and in particular confidence interval for the median.…
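One standard asymptotic construction, which may or may not match the textbook treatments surveyed here, picks order statistics at ranks given by the normal approximation to Binomial(n, 1/2). A sketch:

```python
import numpy as np
from scipy.stats import norm

def median_ci(sample, conf=0.95):
    """Approximate distribution-free CI for the population median based on
    order statistics, using the normal approximation to Binomial(n, 1/2)."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    z = norm.ppf(0.5 + conf / 2)
    lo = int(np.floor((n - z * np.sqrt(n)) / 2))   # lower rank (0-based)
    hi = int(np.ceil((n + z * np.sqrt(n)) / 2))    # upper rank
    lo, hi = max(lo, 0), min(hi, n - 1)
    return x[lo], x[hi]

rng = np.random.default_rng(3)
print(median_ci(rng.exponential(size=101)))   # works for skewed data too
```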
Financial Literacy of 15-Year-Olds: Results from PISA 2015. Data Point. NCES 2017-086
ERIC Educational Resources Information Center
Gonzales, Patrick; Sen, Anindita
2017-01-01
On May 24, the National Center for Education Statistics released Financial Literacy of 15-year-olds: Results from PISA 2015. The PISA assessment of financial literacy measured students' knowledge and understanding of fundamental elements of the financial world, including financial concepts, products, and risks, and their ability to apply what they…
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
Measures of Child Well-Being in Utah, 2000. How Are the Children?
ERIC Educational Resources Information Center
Haven, Terry, Ed.
This Kids Count report details statewide trends in the well-being of Utah's children. The statistical portrait is based on 22 indicators of children's well-being, including: (1) prenatal care; (2) low birth weight infants; (3) infant mortality; (4) child death rates; (5) child injury deaths; (6) child abuse; (7) injury hospital discharges; (8)…
Biomass statistics for Vermont - 1983
Thomas S. Frieswyk; Anne M. Malley
1986-01-01
A new measure of the forest resource has been added to the fourth forest inventory of Vermont. The inventory, which was conducted in 1982-83, included estimates of aboveground tree biomass on timberland. There are approximately 413 million green tons of wood and bark in the aboveground portion of all trees, which equates to an average of 93 green tons per acre...
Monitoring the Future: Questionnaire Responses from the Nation's High School Seniors, 1980.
ERIC Educational Resources Information Center
Bachman, Jerald G.; And Others
This report presents descriptive statistical results from a 1980 national survey of high school seniors concerning their values, behaviors, and lifestyle. It is the sixth in a series. Questionnaires were filled out by 16,524 seniors in 107 public and 20 private high schools. Student response rate was 82%. Content areas measured include the…
ERIC Educational Resources Information Center
Jeynes, William H.
2007-01-01
A meta-analysis is undertaken, including 52 studies, to determine the influence of parental involvement on the educational outcomes of urban secondary school children. Statistical analyses are done to determine the overall impact of parental involvement as well as specific components of parental involvement. Four different measures of educational…
A detailed description of the sequential probability ratio test for 2-IMU FDI
NASA Technical Reports Server (NTRS)
Rich, T. M.
1976-01-01
The sequential probability ratio test (SPRT) for 2-IMU FDI (inertial measuring unit failure detection/isolation) is described. The SPRT is a statistical technique for detecting and isolating soft IMU failures originally developed for the strapdown inertial reference unit. The flowchart of a subroutine incorporating the 2-IMU SPRT is included.
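Wald's SPRT accumulates log-likelihood-ratio increments and compares the running sum against two thresholds set by the desired error rates. A generic sketch (not the 2-IMU flight subroutine):

```python
import math

def sprt(llr_increments, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test: accumulate log-likelihood
    ratios until one of the two thresholds is crossed.  Generic sketch."""
    upper = math.log((1 - beta) / alpha)    # decide H1 (e.g. IMU failure)
    lower = math.log(beta / (1 - alpha))    # decide H0 (no failure)
    s = 0.0
    for k, inc in enumerate(llr_increments, start=1):
        s += inc
        if s >= upper:
            return "H1", k
        if s <= lower:
            return "H0", k
    return "continue", len(llr_increments)   # no decision yet; keep sampling
```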
Code of Federal Regulations, 2013 CFR
2013-07-01
.... You may extend the sampling time to improve measurement accuracy of PM emissions, using good..., you may omit speed, torque, and power points from the duty-cycle regression statistics if the... mapped. (2) For variable-speed engines without low-speed governors, you may omit torque and power points...
Code of Federal Regulations, 2012 CFR
2012-07-01
.... You may extend the sampling time to improve measurement accuracy of PM emissions, using good..., you may omit speed, torque, and power points from the duty-cycle regression statistics if the... mapped. (2) For variable-speed engines without low-speed governors, you may omit torque and power points...
Measures of Child Well-Being in Utah, 1999. Kids under Construction.
ERIC Educational Resources Information Center
Haven, Terry, Ed.
This Kids Count report details statewide trends in the well-being of Utah's children. The statistical portrait is based on four general areas of children's well-being: (1) health; (2) education; (3) safety; and (4) economic security. Key indicators in these areas include: (1) prenatal care; (2) infant mortality; (3) low birth weight babies; (4)…
Greenhouse Effect Detection Experiment (GEDEX). Selected data sets
NASA Technical Reports Server (NTRS)
Olsen, Lola M.; Warnock, Archibald, III
1992-01-01
This CD-ROM contains selected data sets compiled by the participants of the Greenhouse Effect Detection Experiment (GEDEX) workshop on atmospheric temperature. The data sets include surface, upper air, and/or satellite-derived measurements of temperature, solar irradiance, clouds, greenhouse gases, fluxes, albedo, aerosols, ozone, and water vapor, along with Southern Oscillation Indices and Quasi-Biennial Oscillation statistics.
Geographic analysis of forest health indicators using spatial scan statistics
John W. Coulston; Kurt H. Riitters
2003-01-01
Forest health analysts seek to define the location, extent, and magnitude of changes in forest ecosystems, to explain the observed changes when possible, and to draw attention to the unexplained changes for further investigation. The data come from a variety of sources including satellite images, field plot measurements, and low-altitude aerial surveys. Indicators...
[Evaluation of using statistical methods in selected national medical journals].
Sych, Z
1996-01-01
The paper evaluates the frequency with which statistical methods were applied in works published in six selected national medical journals in the years 1988-1992. The journals chosen for analysis were: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny, and Zdrowie Publiczne. A number of works matching the average of the remaining journals was randomly selected from the respective volumes of Pol. Tyg. Lek. The analysis did not include works, whether national or international, in which no statistical analysis was implemented; review papers, case reports, reviews of books, handbooks, and monographs, reports from scientific congresses, and papers on historical topics were likewise excluded. The number of works was determined for each volume. Next, the mode of selecting a suitable sample in the respective studies was analyzed, distinguishing two categories: random and purposive (targeted) selection. Attention was also paid to the presence of a control sample in the individual works, and to the completeness of the sample characteristics, using three categories: complete, partial, and lacking. The results of the studies are presented in tables and figures (Tab. 1, 3). The rate of employing statistical methods in the analyzed works was determined for the relevant volumes of the six journals for the years 1988-1992, along with the number of works in which no statistical methods were used, and the frequency of applying individual statistical methods was analyzed. Prominence was given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) and to the most important methods of mathematical statistics, such as parametric tests of significance, analysis of variance (in single and dual classifications), non-parametric tests of significance, and correlation and regression. Works using multiple correlation, multiple regression, or more complex methods of studying relationships between two or more variables were grouped with those using correlation and regression, as were other methods, e.g., statistical methods used in epidemiology (coefficients of incidence and morbidity, standardization of coefficients, survival tables), factor analysis by the Jacobi-Hotelling method, taxonomic methods, and others. On the basis of the performed studies, it was established that the frequency of employing statistical methods in the six selected national medical journals in the years 1988-1992 was 61.1-66.0% of the analyzed works (Tab. 3), generally similar to the frequency reported for English-language medical journals. On the whole, no significant differences were found in the frequency of applied statistical methods (Tab. 4) or in the frequency of random samples (Tab. 3) across the respective years 1988-1992.
The most frequently used statistical methods in the analyzed works for 1988-1992 were measures of position (44.2-55.6%), measures of dispersion (32.5-38.5%), and parametric tests of significance (26.3-33.1% of the analyzed works) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical studies and in postgraduate training for physicians and scientific-didactic workers.
NASA Technical Reports Server (NTRS)
Coffin, T.
1986-01-01
A dynamic pressure data base and data base management system developed to characterize the Space Shuttle Main Engine (SSME) dynamic pressure environment is described. The data base represents dynamic pressure measurements obtained during single engine hot firing tests of the SSME. Software is provided to permit statistical evaluation of selected measurements under specified operating conditions. An interpolation scheme is also included to estimate spectral trends with SSME power level. Flow dynamic environments in high performance rocket engines are discussed.
NASA Technical Reports Server (NTRS)
Coffin, T.
1986-01-01
A dynamic pressure data base and data base management system developed to characterize the Space Shuttle Main Engine (SSME) dynamic pressure environment is reported. The data base represents dynamic pressure measurements obtained during single engine hot firing tests of the SSME. Software is provided to permit statistical evaluation of selected measurements under specified operating conditions. An interpolation scheme is included to estimate spectral trends with SSME power level. Flow dynamic environments in high performance rocket engines are described.
Relationship between preventable hospital deaths and other measures of safety: an exploratory study.
Hogan, Helen; Healey, Frances; Neale, Graham; Thomson, Richard; Vincent, Charles; Black, Nick
2014-06-01
To explore associations between the proportion of hospital deaths that are preventable and other measures of safety. Retrospective case record review to provide estimates of preventable death proportions. Simple monotonic correlations using Spearman's rank correlation coefficient to establish the relationship with eight other measures of patient safety. Ten English acute hospital trusts. One thousand patients who died during 2009. The proportion of preventable deaths varied between hospitals (3-8%) but was not statistically significant (P = 0.94). Only one of the eight measures of safety (Methicillin-resistant Staphylococcus aureus bacteraemia rate) was clinically and statistically significantly associated with preventable death proportion (r = 0.73; P < 0.02). There were no significant associations with the other measures including hospital standardized mortality ratios (r = -0.01). There was a suggestion that preventable deaths may be more strongly associated with some other measures of outcome than with process or with structure measures. The exploratory nature of this study inevitably limited its power to provide definitive results. The observed relationships between safety measures suggest that a larger more powerful study is needed to establish the inter-relationship of different measures of safety (structure, process and outcome), in particular the widely used standardized mortality ratios. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
MIRO Continuum Calibration for Asteroid Mode
NASA Technical Reports Server (NTRS)
Lee, Seungwon
2011-01-01
MIRO (Microwave Instrument for the Rosetta Orbiter) is a lightweight, uncooled, dual-frequency heterodyne radiometer. MIRO encountered asteroid Steins in 2008, and during the flyby, MIRO used the Asteroid Mode to measure the emission spectrum of Steins. The Asteroid Mode is one of the seven modes of MIRO operation, and is designed to increase the length of time that a spectral line is in the MIRO pass-band during a flyby of an object. This software is used to calibrate the continuum measurement of Steins' emission power during the asteroid flyby. The MIRO raw measurement data need to be calibrated in order to obtain physically meaningful data. This software calibrates the MIRO raw measurements in digital units to brightness temperature in Kelvin. The software uses two calibration sequences that are included in the Asteroid Mode, one at the beginning of the mode and the other at the end. The first six frames contain the measurement of a cold calibration target, while the last six frames measure a warm calibration target. The targets have known temperatures and are used to provide the reference power and gain needed to convert MIRO measurements into brightness temperature. The software was developed to calibrate MIRO continuum measurements from Asteroid Mode. It determines the relationship between the raw digital unit measured by MIRO and the equivalent brightness temperature by analyzing data from the calibration frames. The derived relationship is then applied to non-calibration frames, which are the measurements of an object of interest such as asteroids and other planetary objects that MIRO encounters during its operation. The software characterizes the gain fluctuations statistically and determines which method to use to estimate the gain between calibration frames. For example, if the fluctuation is lower than a statistically significant level, the averaging method is used to estimate the gain between the calibration frames. If the fluctuation is found to be statistically significant, a linear interpolation of gain and reference power is used instead.
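The cold/warm-target scheme described above is classic two-point radiometric calibration. A generic sketch, with hypothetical names, of how the two targets yield a gain and offset that convert counts to brightness temperature (this is not the MIRO flight software):

```python
import numpy as np

def two_point_calibration(counts, cold_counts, hot_counts, t_cold, t_hot):
    """Convert raw radiometer counts to brightness temperature using cold
    and warm calibration targets of known temperature.  Generic sketch."""
    gain = (np.mean(hot_counts) - np.mean(cold_counts)) / (t_hot - t_cold)
    offset = np.mean(cold_counts) - gain * t_cold
    return (np.asarray(counts) - offset) / gain   # brightness temperature (K)
```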
ERIC Educational Resources Information Center
Shim, Wonsik "Jeff"; McClure, Charles R.; Fraser, Bruce T.; Bertot, John Carlo
This manual provides a beginning approach for research libraries to better describe the use and users of their networked services. The manual also aims to increase the visibility and importance of developing such statistics and measures. Specific objectives are: to identify selected key statistics and measures that can describe use and users of…
ERIC Educational Resources Information Center
Osler, James Edward
2014-01-01
This monograph provides an epistemological rationale for the design of a novel post hoc statistical measure called "Tri-Center Analysis". This new statistic is designed to analyze the post hoc outcomes of the Tri-Squared Test. In Tri-Center Analysis, trichotomous inferential parametric statistical measures are calculated from…
Lorig, Kate; Ritter, Philip L; Turner, Ralph M; English, Kathleen; Laurent, Diana D; Greenberg, Jay
2016-12-15
Diabetes self-management education has been shown to be effective in controlled trials. The 6-week Better Choices, Better Health-Diabetes (BCBH-D) self-management program was also associated with an improvement in health outcomes in a 6-month translation study. The objective of this study was to determine whether a national translation of the BCBH-D self-management program, offered both Web-based and face-to-face, was associated with improvements in health outcomes (including HbA1c) and health behaviors (including recommended medical tests) 1 year after intervention. Web-based programs were administered nationally, whereas face-to-face workshops took place in Atlanta, Indianapolis, and St Louis. Self-report questionnaires were either Web-based or administered by mail, at baseline and 1 year, and collected health and health-behavior measures. HbA1c blood samples were collected via mailed kits. A previous 6-month study found statistically significant improvements in 13 of 14 outcome measures, including HbA1c. For this study, paired t test compared baseline with 1-year outcomes. Subgroup analyses determined whether participants with specific conditions improved (high HbA1c, depression, hypoglycemia, nonadherence to medication, no aerobic exercise). The percentage of participants with improvements in effect size of at least 0.4 in at least 1 of the 5 measures was calculated. A total of 857 participants with 1-year data (69.7% of baseline participants) demonstrated statistically significant 1-year improvements in 13 of 15 outcome measures; 79.9% (685/857) of participants showed improvements in effect size of 0.4 or greater in at least 1 of the 5 criterial measures. Participants had small but significant benefits in multiple measures. Improvements previously noted at 6 months were maintained or amplified at 1 year. ©Kate Lorig, Philip L Ritter, Ralph M Turner, Kathleen English, Diana D Laurent, Jay Greenberg. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.12.2016.
The probability density function (PDF) of Lagrangian Turbulence
NASA Astrophysics Data System (ADS)
Birnir, B.
2012-12-01
The statistical theory of Lagrangian turbulence is derived from the stochastic Navier-Stokes equation. Assuming that the noise in fully-developed turbulence is a generic noise determined by the general theorems of probability, the central limit theorem and the large deviation principle, we are able to formulate and solve the Kolmogorov-Hopf equation for the invariant measure of the stochastic Navier-Stokes equations. The intermittency corrections to the scaling exponents of the structure functions require a multiplicative noise (multiplying the fluid velocity) in the stochastic Navier-Stokes equation. We let this multiplicative noise consist of a simple (Poisson) jump process and then show how the Feynman-Kac formula produces the log-Poissonian processes found by She and Leveque, Waymire and Dubrulle. These log-Poissonian processes give the intermittency corrections that agree with modern direct Navier-Stokes simulations (DNS) and experiments. The probability density function (PDF) plays a key role when direct Navier-Stokes simulations or experimental results are compared to theory. The statistical theory of turbulence, including the scaling of the structure functions, is determined by the invariant measure of the Navier-Stokes equation, and the PDFs for the various statistics (one-point, two-point, N-point) can be obtained by taking the trace of the corresponding invariant measures. Hopf derived in 1952 a functional equation for the characteristic function (Fourier transform) of the invariant measure. In distinction to the nonlinear Navier-Stokes equation, this is a linear functional differential equation. The PDFs obtained from the invariant measures for the velocity differences (two-point statistics) are shown to be the four-parameter generalized hyperbolic distributions found by Barndorff-Nielsen. These PDFs have heavy tails and a convex peak at the origin. A suitable projection of the Kolmogorov-Hopf equation is the differential equation determining the generalized hyperbolic distributions. We then compare these PDFs with DNS results and experimental data.
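For reference, the generalized hyperbolic density referred to above, in Barndorff-Nielsen's usual parameterization (the "four-parameter" subfamily fixes λ), is

$$ f(x;\lambda,\alpha,\beta,\delta,\mu)=\frac{(\gamma/\delta)^{\lambda}}{\sqrt{2\pi}\,K_{\lambda}(\delta\gamma)}\; e^{\beta(x-\mu)}\; \frac{K_{\lambda-1/2}\!\big(\alpha\sqrt{\delta^{2}+(x-\mu)^{2}}\big)}{\big(\sqrt{\delta^{2}+(x-\mu)^{2}}/\alpha\big)^{1/2-\lambda}}, \qquad \gamma=\sqrt{\alpha^{2}-\beta^{2}}, $$

where $K_{\nu}$ is the modified Bessel function of the second kind; the semi-heavy tails and the convex peak at the origin follow from its asymptotics.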
NASA Astrophysics Data System (ADS)
Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.
2018-01-01
In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as such a measure. We also introduce a new teaching method in the elementary statistics class. In contrast to the traditional elementary statistics course, we introduce a simulation-based inference method to conduct hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
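A permutation test is the canonical example of the simulation-based inference advocated here; a minimal sketch for a difference in means:

```python
import numpy as np

def permutation_test(x, y, n_perm=10_000, seed=0):
    """Simulation-based inference: permutation (randomization) test for a
    difference in group means, as used in simulation-based courses."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = np.mean(x) - np.mean(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # re-randomize group labels
        diff = pooled[: len(x)].mean() - pooled[len(x):].mean()
        count += abs(diff) >= abs(observed)
    return observed, count / n_perm              # two-sided simulation p-value
```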
Morphometricity as a measure of the neuroanatomical signature of a trait.
Sabuncu, Mert R; Ge, Tian; Holmes, Avram J; Smoller, Jordan W; Buckner, Randy L; Fischl, Bruce
2016-09-27
Complex physiological and behavioral traits, including neurological and psychiatric disorders, often associate with distributed anatomical variation. This paper introduces a global metric, called morphometricity, as a measure of the anatomical signature of different traits. Morphometricity is defined as the proportion of phenotypic variation that can be explained by macroscopic brain morphology. We estimate morphometricity via a linear mixed-effects model that uses an anatomical similarity matrix computed based on measurements derived from structural brain MRI scans. We examined over 3,800 unique MRI scans from nine large-scale studies to estimate the morphometricity of a range of phenotypes, including clinical diagnoses such as Alzheimer's disease, and nonclinical traits such as measures of cognition. Our results demonstrate that morphometricity can provide novel insights about the neuroanatomical correlates of a diverse set of traits, revealing associations that might not be detectable through traditional statistical techniques.
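The authors estimate morphometricity with a restricted-maximum-likelihood linear mixed-effects model. A much cruder method-of-moments (Haseman-Elston style) sketch of the same quantity, regressing phenotype cross-products on anatomical similarity, conveys the idea (this is an assumption-laden simplification, not the paper's estimator):

```python
import numpy as np

def morphometricity_he(K, y):
    """Rough moment-based estimate of the fraction of phenotypic variance
    explained by an anatomical similarity matrix K (n x n, from brain MRI
    measurements).  Simplified sketch of the mixed-model idea only."""
    y = (y - y.mean()) / y.std()
    n = len(y)
    off = ~np.eye(n, dtype=bool)                 # off-diagonal entries only
    prod = np.outer(y, y)[off]                   # phenotype cross-products
    k = K[off]
    # Regress cross-products on similarity: the slope estimates the
    # proportion of variance captured by morphology.
    slope = np.polyfit(k, prod, 1)[0]
    return float(np.clip(slope, 0.0, 1.0))
```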
An experimental investigation of a three dimensional wall jet. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Catalano, G. D.
1977-01-01
One and two point statistical properties are measured in the flow fields of a coflowing turbulent jet. Two different confining surfaces (one flat, one with large curvature) are placed adjacent to the lip of the circular nozzle; and the resultant effects on the flow field are determined. The one point quantities measured include mean velocities, turbulent intensities, velocity and concentration autocorrelations and power spectral densities, and intermittencies. From the autocorrelation curves, the Taylor microscale and the integral length scale are calculated. Two point quantities measured include velocity and concentration space-time correlations and pressure velocity correlations. From the velocity space-time correlations, iso-correlation contours are constructed along with the lines of maximum maximorum. These lines allow a picture of the flow pattern to be determined. The pressures monitored in the pressure velocity correlations are measured both in the flow field and at the surface of the confining wall(s).
Morphometricity as a measure of the neuroanatomical signature of a trait
Sabuncu, Mert R.; Ge, Tian; Holmes, Avram J.; Smoller, Jordan W.; Buckner, Randy L.; Fischl, Bruce
2016-01-01
Complex physiological and behavioral traits, including neurological and psychiatric disorders, often associate with distributed anatomical variation. This paper introduces a global metric, called morphometricity, as a measure of the anatomical signature of different traits. Morphometricity is defined as the proportion of phenotypic variation that can be explained by macroscopic brain morphology. We estimate morphometricity via a linear mixed-effects model that uses an anatomical similarity matrix computed based on measurements derived from structural brain MRI scans. We examined over 3,800 unique MRI scans from nine large-scale studies to estimate the morphometricity of a range of phenotypes, including clinical diagnoses such as Alzheimer’s disease, and nonclinical traits such as measures of cognition. Our results demonstrate that morphometricity can provide novel insights about the neuroanatomical correlates of a diverse set of traits, revealing associations that might not be detectable through traditional statistical techniques. PMID:27613854
Feaster, Toby D.; Lee, Kathyrn G.
2017-08-28
Low-flow statistics are needed by water-resource engineers, planners, and managers to protect and manage the water resources of Alabama. The accuracy of these statistics is influenced by such factors as length of record and specific hydrologic conditions measured in those records. As such, it is generally recommended that flow statistics be updated about every 10 years to provide improved and representative low-flow characteristics. The previous investigation of low-flow characteristics for Alabama included data through September 1990. Since that time, Alabama has experienced several historic droughts highlighting the need to update the low-flow characteristics at U.S. Geological Survey streamgaging stations. Consequently, this investigation was undertaken in cooperation with a number of State and local agencies to update low-flow frequency and flow-duration statistics at 210 continuous-record streamgaging stations in Alabama and 67 stations from basins that are shared with surrounding States. The flow characteristics were computed on the basis of available data through March 2014.
Usher, Kim; Park, Tanya; Foster, Kim; Buettner, Petra
2013-07-01
To test the effect of a nurse-led intervention on weight gain in people with serious mental illness prescribed and taking second generation antipsychotic medication. Weight gain and obesity have reached epidemic proportions in the general population, with the prevalence of Metabolic Syndrome reaching 20-25% of the global population. People with serious mental illness are at even higher risk, particularly those taking second generation antipsychotic medication. An experimental randomized controlled trial was undertaken. The control group received a 12-week healthy lifestyle booklet. In addition to the booklet, the intervention group received weekly nutrition and exercise education, exercise sessions, and nurse support. Participants (n = 101) were assessed at baseline and 12 weeks. Data were collected between March 2008-December 2010. Seven outcome measures were used: body measurements included girth (cm), weight (kg), height (cm), and body mass index (kg/m²); questionnaires included the medication compliance questionnaire, the Drug Attitude Inventory, the Liverpool University Neuroleptic Side Effect Rating Scale, and the Medical Outcomes Study Short Form 36. Differences in primary outcome measures between baseline and 12-week follow-up were compared between intervention and control groups using standard bivariate statistical tests. The study was conducted between 2008-2010. The analysis of outcome measures for the control group (n = 50) and intervention group (n = 51) was not statistically significant. There was a mean weight change of -0·74 kg at 12 weeks for the intervention group (n = 51), while the control group (n = 50) had a mean weight change of -0·17 kg at 12 weeks. The results were not statistically significant. © 2012 Blackwell Publishing Ltd.
An adaptive approach to the dynamic allocation of buffer storage. M.S. Thesis
NASA Technical Reports Server (NTRS)
Crooke, S. C.
1970-01-01
Several strategies for the dynamic allocation of buffer storage are simulated and compared. The basic algorithms investigated, using actual statistics observed in the Univac 1108 EXEC 8 System, include the buddy method and the first-fit method. Modifications are made to the basic methods in an effort to improve and to measure allocation performance. A simulation model of an adaptive strategy is developed which permits interchanging the two different methods, the buddy and the first-fit methods with some modifications. Using an adaptive strategy, each method may be employed in the statistical environment in which its performance is superior to the other method.
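For illustration, the first-fit strategy simply walks the free list and carves the request out of the first sufficiently large hole. A toy sketch (allocation only; the study's simulations also covered the buddy method and block release/coalescing):

```python
class FirstFitAllocator:
    """Toy first-fit allocator over a free list of (offset, size) holes.
    Illustrates the strategy only, not the simulated EXEC 8 environment."""
    def __init__(self, size):
        self.free = [(0, size)]

    def allocate(self, request):
        for i, (off, size) in enumerate(self.free):
            if size >= request:                 # first hole big enough wins
                if size == request:
                    self.free.pop(i)
                else:
                    self.free[i] = (off + request, size - request)
                return off
        return None                             # no hole large enough

alloc = FirstFitAllocator(1024)
print(alloc.allocate(100), alloc.allocate(300))   # -> 0 100
```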
Dalgin, Rebecca Spirito; Dalgin, M Halim; Metzger, Scott J
2018-05-01
This article focuses on the impact of a peer run warm line as part of the psychiatric recovery process. It utilized data including the Recovery Assessment Scale, community integration measures and crisis service usage. Longitudinal statistical analysis was completed on 48 sets of data from 2011, 2012, and 2013. Although no statistically significant differences were observed for the RAS score, community integration data showed increases in visits to primary care doctors, leisure/recreation activities and socialization with others. This study highlights the complexity of psychiatric recovery and that nonclinical peer services like peer run warm lines may be critical to the process.
Modeling longitudinal data, I: principles of multivariate analysis.
Ravani, Pietro; Barrett, Brendan; Parfrey, Patrick
2009-01-01
Statistical models are used to study the relationship between exposure and disease while accounting for the potential role of other factors' impact on outcomes. This adjustment is useful to obtain unbiased estimates of true effects or to predict future outcomes. Statistical models include a systematic component and an error component. The systematic component explains the variability of the response variable as a function of the predictors and is summarized in the effect estimates (model coefficients). The error element of the model represents the variability in the data unexplained by the model and is used to build measures of precision around the point estimates (confidence intervals).
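In symbols, for a generic linear specification (notation assumed here, not taken from the article):

$$ y_i = \underbrace{\beta_0 + \beta_1 x_{i1} + \cdots + \beta_p x_{ip}}_{\text{systematic component}} + \underbrace{\varepsilon_i}_{\text{error component}}, \qquad \hat{\beta}_j \pm z_{1-\alpha/2}\,\widehat{\mathrm{SE}}(\hat{\beta}_j), $$

where the fitted coefficients summarize the systematic component and the residual variability feeds the standard errors used to build confidence intervals around the point estimates.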
ERIC Educational Resources Information Center
Idris, Khairiani; Yang, Kai-Lin
2017-01-01
This article reports the results of a mixed-methods approach to develop and validate an instrument to measure Indonesian pre-service teachers' conceptions of statistics. First, a phenomenographic study involving a sample of 44 participants uncovered six categories of conceptions of statistics. Second, an instrument of conceptions of statistics was…
Pang, Jingxiang; Fu, Jialei; Yang, Meina; Zhao, Xiaolei; van Wijk, Eduard; Wang, Mei; Fan, Hua; Han, Jinxiang
2016-03-01
In the practice and principle of Chinese medicine, herbal materials are classified according to their therapeutic properties. 'Cold' and 'heat' are the most important classes of Chinese medicinal herbs according to the theory of traditional Chinese medicine (TCM). In this work, delayed luminescence (DL) was measured for different samples of Chinese medicinal herbs using a sensitive photon multiplier detection system. A comparison of DL parameters, including mean intensity and statistic entropy, was undertaken to discriminate between the 'cold' and 'heat' properties of Chinese medicinal herbs. The results suggest that there are significant differences in mean intensity and statistic entropy and using this method combined with statistical analysis may provide novel parameters for the characterization of Chinese medicinal herbs in relation to their energetic properties. Copyright © 2015 John Wiley & Sons, Ltd.
Statistical Optimality in Multipartite Ranking and Ordinal Regression.
Uematsu, Kazuki; Lee, Yoonkyung
2015-05-01
Statistical optimality in multipartite ranking is investigated as an extension of bipartite ranking. We consider the optimality of ranking algorithms through minimization of the theoretical risk which combines pairwise ranking errors of ordinal categories with differential ranking costs. The extension shows that for a certain class of convex loss functions including exponential loss, the optimal ranking function can be represented as a ratio of weighted conditional probability of upper categories to lower categories, where the weights are given by the misranking costs. This result also bridges traditional ranking methods such as proportional odds model in statistics with various ranking algorithms in machine learning. Further, the analysis of multipartite ranking with different costs provides a new perspective on non-smooth list-wise ranking measures such as the discounted cumulative gain and preference learning. We illustrate our findings with simulation study and real data analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kacprzak, T.; Kirk, D.; Friedrich, O.
Shear peak statistics has gained a lot of attention recently as a practical alternative to the two-point statistics for constraining cosmological parameters. We perform a shear peak statistics analysis of the Dark Energy Survey (DES) Science Verification (SV) data, using weak gravitational lensing measurements from a 139 deg$^2$ field. We measure the abundance of peaks identified in aperture mass maps, as a function of their signal-to-noise ratio, in the signal-to-noise range $0 < \mathcal{S}/\mathcal{N} < 4$. To predict the peak counts as a function of cosmological parameters we use a suite of $N$-body simulations spanning 158 models with varying $\Omega_{\rm m}$ and $\sigma_8$, fixing $w = -1$, $\Omega_{\rm b} = 0.04$, $h = 0.7$ and $n_s = 1$, to which we have applied the DES SV mask and redshift distribution. In our fiducial analysis we measure $\sigma_{8}(\Omega_{\rm m}/0.3)^{0.6} = 0.77 \pm 0.07$, after marginalising over the shear multiplicative bias and the error on the mean redshift of the galaxy sample. We introduce models of intrinsic alignments, blending, and source contamination by cluster members. These models indicate that peaks with $\mathcal{S}/\mathcal{N} > 4$ would require significant corrections, which is why we do not include them in our analysis. We compare our results to the cosmological constraints from the two-point analysis on the SV field and find them to be in good agreement in both the central value and its uncertainty. Finally, we discuss prospects for future peak statistics analysis with upcoming DES data.
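Peak abundance in an aperture-mass map reduces, in essence, to counting local maxima of the signal-to-noise field in S/N bins. A generic sketch on a synthetic map (not the DES pipeline):

```python
import numpy as np
from scipy import ndimage

def peak_counts(snr_map, bins):
    """Count local maxima of an aperture-mass S/N map in S/N bins.
    Generic peak-statistics sketch on a pixel grid."""
    is_peak = snr_map == ndimage.maximum_filter(snr_map, size=3)
    hist, _ = np.histogram(snr_map[is_peak], bins=bins)
    return hist

rng = np.random.default_rng(7)
snr = ndimage.gaussian_filter(rng.normal(size=(512, 512)), 4)  # synthetic map
print(peak_counts(snr / snr.std(), bins=np.linspace(0, 4, 9)))
```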
NASA Technical Reports Server (NTRS)
1989-01-01
An assessment is made of quantitative methods and measures for evaluating launch commit criteria (LCC) performance trends. A statistical performance trending pilot study was conducted and its results compared with STS-26 mission data. This study used four selected shuttle measurement types (solid rocket booster, external tank, space shuttle main engine, and range safety switch safe-and-arm device) from the five missions prior to mission 51-L. After obtaining raw data coordinates, each set of measurements was processed to obtain statistical confidence bounds and mean data profiles for each of the selected measurement types. STS-26 measurements were compared to the statistical data base profiles to verify the statistical capability of assessing occurrences of data trend anomalies and abnormal time-varying operational conditions associated with data amplitude and phase shifts.
Measurement of the hyperelastic properties of 44 pathological ex vivo breast tissue samples
NASA Astrophysics Data System (ADS)
O'Hagan, Joseph J.; Samani, Abbas
2009-04-01
The elastic and hyperelastic properties of biological soft tissues have been of interest to the medical community. There are several biomedical applications where parameters characterizing such properties are critical for a reliable clinical outcome. These applications include surgery planning, needle biopsy, and brachytherapy, where tissue biomechanical modeling is involved. Another important application is interpreting nonlinear elastography images. While there has been considerable research on the measurement of the linear elastic modulus of small tissue samples, little research has been conducted on measuring parameters that characterize the nonlinear elasticity of tissues included in tissue slice specimens. This work presents hyperelastic measurement results for 44 pathological ex vivo breast tissue samples. For each sample, five hyperelastic models have been used, including the Yeoh, N = 2 polynomial, N = 1 Ogden, Arruda-Boyce, and Veronda-Westmann models. Results show that the Yeoh, polynomial, and Ogden models are the most accurate in terms of fitting experimental data. The results indicate that almost all of the parameters corresponding to the pathological tissues are between two times and over two orders of magnitude larger than those of normal tissues, with C11 showing the most significant difference. Furthermore, statistical analysis indicates that C02 of the Yeoh model, and C11 and C20 of the polynomial model, have very good potential for cancer classification, as they show statistically significant differences for various cancer types, especially for invasive lobular carcinoma. In addition to the potential for use in cancer classification, the presented data are very important for applications such as surgery planning and virtual-reality-based clinician training systems where accurate nonlinear tissue response modeling is required.
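As an illustration of how such parameters are obtained, a Yeoh-type model can be fitted to uniaxial stress-stretch data by least squares: for an incompressible material under uniaxial extension, the Cauchy stress follows directly from a strain-energy function W(I1). A sketch on synthetic data (all values invented, not from the paper's protocol):

```python
import numpy as np
from scipy.optimize import curve_fit

def yeoh_uniaxial_stress(lam, c10, c20, c30):
    """Cauchy stress for incompressible uniaxial extension under the Yeoh
    model W = sum_i Ci0 (I1 - 3)^i, with I1 = lam^2 + 2/lam."""
    I1 = lam**2 + 2.0 / lam
    dWdI1 = c10 + 2 * c20 * (I1 - 3) + 3 * c30 * (I1 - 3) ** 2
    return 2 * (lam**2 - 1.0 / lam) * dWdI1

# Hypothetical uniaxial data: stretch ratios and stresses (kPa)
lam = np.linspace(1.0, 1.3, 10)
stress = yeoh_uniaxial_stress(lam, 5.0, 8.0, 20.0)   # synthetic "measurements"
params, _ = curve_fit(yeoh_uniaxial_stress, lam, stress, p0=[1, 1, 1])
print(params)   # recovered C10, C20, C30
```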
Measurement of the photon statistics and the noise figure of a fiber-optic parametric amplifier.
Voss, Paul L; Tang, Renyong; Kumar, Prem
2003-04-01
We report measurement of the noise statistics of spontaneous parametric fluorescence in a fiber parametric amplifier with single-mode, single-photon resolution. We employ optical homodyne tomography for this purpose, which also provides a self-calibrating measurement of the noise figure of the amplifier. The measured photon statistics agree with quantum-mechanical predictions, and the amplifier's noise figure is found to be almost quantum limited.
Finding intonational boundaries using acoustic cues related to the voice source
NASA Astrophysics Data System (ADS)
Choi, Jeung-Yoon; Hasegawa-Johnson, Mark; Cole, Jennifer
2005-10-01
Acoustic cues related to the voice source, including harmonic structure and spectral tilt, were examined for relevance to prosodic boundary detection. The measurements considered here comprise five categories: duration, pitch, harmonic structure, spectral tilt, and amplitude. Distributions of the measurements and statistical analysis show that the measurements may be used to differentiate between prosodic categories. Detection experiments on the Boston University Radio Speech Corpus show equal error detection rates around 70% for accent and boundary detection, using only the acoustic measurements described, without any lexical or syntactic information. Further investigation of the detection results shows that duration and amplitude measurements, and, to a lesser degree, pitch measurements, are useful for detecting accents, while all voice source measurements except pitch measurements are useful for boundary detection.
Ač, Alexander; Malenovský, Zbyněk; Urban, Otmar; Hanuš, Jan; Zitová, Martina; Navrátil, Martin; Vráblová, Martina; Olejníčková, Julie; Špunda, Vladimír; Marek, Michal
2012-01-01
We explored the ability of reflectance vegetation indexes (VIs) related to chlorophyll fluorescence emission (R686/R630, R740/R800) and the de-epoxidation state of xanthophyll cycle pigments (PRI, calculated as (R531 − R570)/(R531 + R570)) to track changes in the CO2 assimilation rate and Light Use Efficiency (LUE) in montane grassland and Norway spruce forest ecosystems, at both the leaf and canopy levels. VIs were measured at two research plots using a ground-based high spatial/spectral resolution imaging spectroscopy technique. No significant relationship between VIs and leaf light-saturated CO2 assimilation (AMAX) was detected in instantaneous measurements of grassland under steady-state irradiance conditions. Once the temporal dimension and daily irradiance variation were included in the experimental setup, statistically significant changes in VIs related to the tested physiological parameters were revealed. ΔPRI and Δ(R686/R630) of grassland plant leaves under a dark-to-full-sunlight transition on the scale of minutes were significantly related to AMAX (R² = 0.51). In the daily course, the variation of VIs measured at one-hour intervals correlated well with the variation of Gross Primary Production (GPP), Net Ecosystem Exchange (NEE), and LUE estimated via the eddy-covariance flux tower. Statistical results were weaker in the case of the grassland ecosystem, with the strongest statistical relation of the index R686/R630 with NEE and GPP. PMID:22701368
Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie
2016-01-01
Background Quality of reporting for Randomized Clinical Trials (RCTs) in oncology has been analyzed in several systematic reviews but, in this setting, there is a paucity of data on outcome definitions and on the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe these two reporting aspects for OBS and RCTs in oncology. Methods From a list of 19 medical journals, three were retained for analysis after random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess the quality of reporting of the description of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify study covariates potentially associated with concordance of tests between the Methods and Results sections. Results 826 studies were included in the review, of which 698 were OBS. Variables were described in the Methods section for all OBS studies, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement between the statistical tests reported in the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the adjusted odds ratio (aOR) for the third group compared to the first group was aOR Grp3 = 0.52 [0.31–0.89] (P value = 0.009). Conclusion Variables in OBS and the primary endpoint in RCTs are reported and described with high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always observed. Therefore, we encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies. PMID:27716793
Babu, Giridhara R; Murthy, G V S; Ana, Yamuna; Patel, Prital; Deepa, R; Neelon, Sara E Benjamin; Kinra, Sanjay; Reddy, K Srinath
2018-01-01
AIM To perform a meta-analysis of the association of obesity with hypertension and type 2 diabetes mellitus (T2DM) among adults in India. METHODS To conduct the meta-analysis, we performed a comprehensive electronic literature search in PubMed, CINAHL Plus, and Google Scholar. We restricted the analysis to studies with documentation of some measure of obesity, namely body mass index, waist-hip ratio, or waist circumference, and a diagnosis of hypertension or T2DM. By obtaining summary estimates of all included studies, the meta-analysis was performed using both RevMan version 5 and the "metan" command in STATA version 11. Heterogeneity was measured by the I² statistic. Funnel plot analysis was performed to assess publication bias. RESULTS Of the 956 studies screened, 18 met the eligibility criteria. The pooled odds ratio between obesity and hypertension was 3.82 (95%CI: 3.39 to 4.25). The heterogeneity around this estimate (I² statistic) was 0%, indicating low variability. The pooled odds ratio from the included studies showed a statistically significant association between obesity and T2DM (OR = 1.14, 95%CI: 1.04 to 1.24) with a high degree of variability. CONCLUSION Despite methodological differences, obesity showed a significant, potentially plausible association with hypertension and T2DM in studies conducted in India. Obesity being a modifiable risk factor, our study informs the setting of policy priorities and intervention efforts to prevent debilitating complications. PMID:29359028
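A minimal sketch of the pooling arithmetic used in such a meta-analysis, assuming a fixed-effect inverse-variance model; the odds ratios and confidence intervals below are invented inputs, not the review's data:

```python
import numpy as np

# Inverse-variance (fixed-effect) pooling of study odds ratios, plus Cochran's Q
# and the I^2 heterogeneity statistic.
or_vals = np.array([3.5, 4.1, 3.9])
ci_lo   = np.array([2.8, 3.2, 2.9])
ci_hi   = np.array([4.4, 5.3, 5.2])

y  = np.log(or_vals)                                # log odds ratios
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE recovered from CI width
w  = 1.0 / se**2
pooled_or = np.exp(np.sum(w * y) / np.sum(w))
Q  = np.sum(w * (y - np.log(pooled_or))**2)         # Cochran's Q
I2 = max(0.0, (Q - (len(y) - 1)) / Q) * 100 if Q > 0 else 0.0
print(pooled_or, I2)
```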
A statistical framework for protein quantitation in bottom-up MS-based proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al., 2008) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu
Increased nuchal translucency in normal karyotype fetuses
De Domenico, Roberta; Faraci, Marianna; Hyseni, Entela; Di Prima, Fosca A. F.; Valenti, Oriana; Monte, Santo; Giorgio, Elsa; Renda, Eliana
2011-01-01
Nuchal translucency (NT) measurement between 11 and 14 weeks' gestation is a reliable marker for chromosomal abnormalities, including trisomy 21. However, even when conventional karyotyping is normal, increased NT is predictive of adverse pregnancy outcome, because it is associated with several fetal malformations, congenital heart defects, genetic syndromes, intrauterine death, and miscarriages; the majority of these structural anomalies are undetectable before birth. The risk is proportional to the nuchal translucency thickness; in fact, it increases significantly once the measurement reaches 3.5 mm or more. However, when these chromosomally normal fetuses with an enlarged NT survive, and a detailed ultrasound examination and echocardiography fail to reveal any abnormalities, the rates of adverse outcome and postnatal developmental delay are not statistically increased compared to the general population. These parents can be confidently reassured that the residual chance of structural anomalies and abnormal neurodevelopment may not be higher than in the general population. PMID:22439071
A LES-based Eulerian-Lagrangian approach to predict the dynamics of bubble plumes
NASA Astrophysics Data System (ADS)
Fraga, Bruño; Stoesser, Thorsten; Lai, Chris C. K.; Socolofsky, Scott A.
2016-01-01
An approach for Eulerian-Lagrangian large-eddy simulation of bubble plume dynamics is presented and its performance evaluated. The main numerical novelties consist in defining the gas-liquid coupling based on the bubble size to mesh resolution ratio (Dp/Δx) and the interpolation between Eulerian and Lagrangian frameworks through the use of delta functions. The model's performance is thoroughly validated for a bubble plume in a cubic tank in initially quiescent water using experimental data obtained from high-resolution ADV and PIV measurements. The predicted time-averaged velocities and second-order statistics show good agreement with the measurements, including the reproduction of the anisotropic nature of the plume's turbulence. Further, the predicted Eulerian and Lagrangian velocity fields, second-order turbulence statistics and interfacial gas-liquid forces are quantified and discussed as well as the visualization of the time-averaged primary and secondary flow structure in the tank.
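A minimal 1D sketch of delta-function coupling of this kind, assuming a Peskin-style four-point kernel; the paper's actual kernel and its 3D implementation may differ:

```python
import numpy as np

def peskin_delta(r):
    """Four-point regularized delta kernel (support |r| < 2 grid cells)."""
    r = np.abs(np.asarray(r, dtype=float))
    out = np.zeros_like(r)
    m1 = r < 1
    m2 = (r >= 1) & (r < 2)
    out[m1] = (3 - 2*r[m1] + np.sqrt(1 + 4*r[m1] - 4*r[m1]**2)) / 8
    out[m2] = (5 - 2*r[m2] - np.sqrt(-7 + 12*r[m2] - 4*r[m2]**2)) / 8
    return out

def spread_forces(x_p, f_p, x_grid, dx):
    """Spread Lagrangian point forces f_p at positions x_p onto the Eulerian grid."""
    F = np.zeros_like(x_grid)
    for xp, fp in zip(x_p, f_p):
        F += fp * peskin_delta((x_grid - xp) / dx) / dx   # kernel has units 1/length
    return F

x_grid = np.linspace(0.0, 1.0, 101)
F = spread_forces([0.37], [1.0], x_grid, dx=0.01)
print(F.sum() * 0.01)   # ~1.0: the point force is conserved on the grid
```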
Comparison of Kalman filter and optimal smoother estimates of spacecraft attitude
NASA Technical Reports Server (NTRS)
Sedlak, J.
1994-01-01
Given a valid system model and adequate observability, a Kalman filter will converge toward the true system state with error statistics given by the estimated error covariance matrix. The errors generally do not continue to decrease. Rather, a balance is reached between the gain of information from new measurements and the loss of information during propagation. The errors can be further reduced, however, by a second pass through the data with an optimal smoother. This algorithm obtains the optimally weighted average of forward and backward propagating Kalman filters. It roughly halves the error covariance by including future as well as past measurements in each estimate. This paper investigates whether such benefits actually accrue in the application of an optimal smoother to spacecraft attitude determination. Tests are performed both with actual spacecraft data from the Extreme Ultraviolet Explorer (EUVE) and with simulated data for which the true state vector and noise statistics are exactly known.
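A scalar sketch of the forward-backward combination, with illustrative numbers, shows why the smoother roughly halves the covariance when the two filters carry comparable information:

```python
# Optimally combine forward- and backward-filter estimates of a scalar state.
# x_f, P_f come from a forward Kalman pass; x_b, P_b from a backward pass over
# the same data (values here are illustrative, not EUVE data).
def combine(x_f, P_f, x_b, P_b):
    P_s = 1.0 / (1.0/P_f + 1.0/P_b)      # smoothed variance
    x_s = P_s * (x_f/P_f + x_b/P_b)      # inverse-variance weighted mean
    return x_s, P_s

x_s, P_s = combine(1.02, 0.04, 0.98, 0.04)
print(x_s, P_s)   # variance 0.02: roughly half of either filter alone
```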
Unbiased estimation of oceanic mean rainfall from satellite borne radiometer measurements
NASA Technical Reports Server (NTRS)
Mittal, M. C.
1981-01-01
The statistical properties of radar-derived rainfall obtained during the GARP Atlantic Tropical Experiment (GATE) are used to derive quantitative estimates of the spatial and temporal sampling errors associated with estimating rainfall from brightness temperature measurements such as would be obtained from a satellite-borne microwave radiometer employing a practical-size antenna aperture. A basis for a method of correcting the so-called beam-filling problem, i.e., the effect of nonuniformity of rainfall over the radiometer beamwidth, is provided. The method presented employs the statistical properties of the observations themselves, without need for physical assumptions beyond those associated with the radiative transfer model. The simulation results presented offer a validation of the estimated accuracy that can be achieved, and the graphs included permit evaluation of the effect of the antenna resolution on both the temporal and spatial sampling errors.
Närhi, Mikko; Wetzel, Benjamin; Billet, Cyril; Toenger, Shanti; Sylvestre, Thibaut; Merolla, Jean-Marc; Morandotti, Roberto; Dias, Frederic; Genty, Goëry; Dudley, John M.
2016-01-01
Modulation instability is a fundamental process of nonlinear science, leading to the unstable breakup of a constant amplitude solution of a physical system. There has been particular interest in studying modulation instability in the cubic nonlinear Schrödinger equation, a generic model for a host of nonlinear systems including superfluids, fibre optics, plasmas and Bose–Einstein condensates. Modulation instability is also a significant area of study in the context of understanding the emergence of high amplitude events that satisfy rogue wave statistical criteria. Here, exploiting advances in ultrafast optical metrology, we perform real-time measurements in an optical fibre system of the unstable breakup of a continuous wave field, simultaneously characterizing emergent modulation instability breather pulses and their associated statistics. Our results allow quantitative comparison between experiment, modelling and theory, and are expected to open new perspectives on studies of instability dynamics in physics. PMID:27991513
Follett, Peter A; Hennessey, Michael K
2007-04-01
Quarantine measures including treatments are applied to exported fruit and vegetable commodities to control regulatory fruit fly pests and to reduce the likelihood of their introduction into new areas. Nonhost status can be an effective measure used to achieve quarantine security. As with quarantine treatments, nonhost status can stand alone as a measure if there is high efficacy and statistical confidence. The numbers of insects or fruit tested during investigation of nonhost status will determine the level of statistical confidence. If the level of confidence of nonhost status is not high, then additional measures may be required to achieve quarantine security as part of a systems approach. Certain countries require that either 99.99 or 99.9968% mortality, as a measure of efficacy, at the 95% confidence level, be achieved by a quarantine treatment to meet quarantine security. This article outlines how the level of confidence in nonhost status can be quantified so that its equivalency to traditional quarantine treatments may be demonstrated. Incorporating sample size and confidence levels into host status testing protocols along with efficacy will lead to greater consistency by regulatory decision-makers in interpreting results and, therefore, to more technically sound decisions on host status.
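The sample-size arithmetic can be sketched under a simple binomial model with zero surviving insects (an assumption for illustration, not the article's published formula):

```python
import math

# Smallest n such that, if true mortality were below the required efficacy,
# observing zero survivors among n treated insects would have probability
# at most (1 - confidence):  efficacy**n <= 1 - confidence.
def required_n(efficacy, confidence):
    return math.ceil(math.log(1.0 - confidence) / math.log(efficacy))

print(required_n(0.999968, 0.95))   # ~9.4e4 insects for 99.9968% at 95% confidence
print(required_n(0.9999, 0.95))     # ~3.0e4 insects for 99.99% at 95% confidence
```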
Environmental correlates to behavioral health outcomes in Alzheimer's special care units.
Zeisel, John; Silverstein, Nina M; Hyde, Joan; Levkoff, Sue; Lawton, M Powell; Holmes, William
2003-10-01
We systematically measured the associations between environmental design features of nursing home special care units and the incidence of aggression, agitation, social withdrawal, depression, and psychotic problems among persons living there who have Alzheimer's disease or a related disorder. We developed and tested a model of critical health-related environmental design features in settings for people with Alzheimer's disease. We used hierarchical linear modeling statistical techniques to assess associations between seven environmental design features and behavioral health measures for 427 residents in 15 special care units. Behavioral health measures included the Cohen-Mansfield physical agitation, verbal agitation, and aggressive behavior scales; the Multidimensional Observation Scale for Elderly Subjects depression and social withdrawal scales; and the BEHAVE-AD (psychotic symptom list) misidentification and paranoid delusions scales. Statistical controls were included for the influence of, among others, cognitive status, need for assistance with activities of daily living, prescription drug use, amount of Alzheimer's staff training, and staff-to-resident ratio. Although hierarchical linear modeling minimizes the risk of Type I (false-positive) error, this exploratory study also pays special attention to avoiding Type II error (the failure to recognize possible relationships between behavioral health characteristics and independent variables). We found associations between each behavioral health measure and particular environmental design features, as well as between behavioral health measures and both resident and nonenvironmental facility variables. This research demonstrates the potential that environment has for contributing to the improvement of Alzheimer's symptoms. A balanced combination of pharmacologic, behavioral, and environmental approaches is likely to be most effective in improving the health, behavior, and quality of life of people with Alzheimer's disease.
Knox, Andrew F; Bryant, Alan R
2016-05-01
Controversy exists regarding the structural and functional causes of hallux limitus, including metatarsus primus elevatus, a long first metatarsal, first-ray hypermobility, the shape of the first metatarsal head, and the presence of hallux interphalangeus. Some articles have reported on the radiographic evaluation of these measurements in feet affected by hallux limitus, but no study has directly compared the affected and unaffected feet in patients with unilateral hallux limitus. This case-control pilot study aimed to establish whether any such differences exist. Dorsoplantar and lateral weightbearing radiographs of both feet in 30 patients with unilateral hallux limitus were assessed for grade of disease, lateral intermetatarsal angle, metatarsal protrusion distance, plantar gapping at the first metatarsocuneiform joint, metatarsal head shape, and hallux abductus interphalangeus angle. Data analysis was performed using a statistical software program. Mean radiographic measurements for affected and unaffected feet demonstrated that metatarsus primus elevatus, a short first metatarsal, first-ray hypermobility, a flat metatarsal head shape, and hallux interphalangeus were prevalent in both feet. There was no statistically significant difference between feet for any of the radiographic parameters measured (Mann-Whitney U tests, independent-samples t tests, and Pearson χ² tests: P > .05). No significant differences exist in the presence of the structural risk factors examined between affected and unaffected feet in patients with unilateral hallux limitus. The influence of other intrinsic factors, including footedness and family history, should be investigated further.
An examination of competition and efficiency for hospital industry in Turkey.
Özgen Narcı, Hacer; Ozcan, Yasar A; Şahin, İsmet; Tarcan, Menderes; Narcı, Mustafa
2015-12-01
The two particular reforms that have been undertaken under the Health Transformation Program in Turkey are enhancing efficiency and increasing competition. However, there is a lack of information about the relationship between competition and hospital efficiency. The purpose of this paper is to analyze the effect of competition on technical efficiency in the hospital industry in Turkey. The target population included all public and private general hospitals that were open in 2010 in Turkey (n = 1,224). Of these, 1,103 hospitals met the selection criteria and were included in the study. Data were obtained from the Turkish Statistical Institute, the Ministry of Health, and a field survey. Technical efficiency of hospitals was estimated using Data Envelopment Analysis with five outputs and five inputs. The intensity of competition among hospitals was measured by objective and subjective measures. Objective competition was measured using the Herfindahl-Hirschman Index, and subjective competition was measured based on the perceptions of top-level hospital managers. Multivariate Tobit regression was used to investigate the relationship between competition and efficiency while controlling for the effects of demand and supply characteristics of the market and of hospital traits. Efficiency results showed that 17% of hospitals were technically efficient. Regression analyses showed that the degree of competition among general hospitals did not have a statistically significant relationship with hospitals' technical efficiency. To conclude, hospital efficiency in Turkey does not seem to be affected by the intensity of competition among hospitals.
Pupil Size in Outdoor Environments
2007-04-06
[Front-matter excerpt: list of tables, including descriptive statistics for pupils measured over a luminance range (Table 3), N in each strata for all pupil measurements (Table 4), and descriptive statistics stratified against eye color (Table 5) and gender (Table 6); remainder truncated.]
Predictors of persistent pain after total knee arthroplasty: a systematic review and meta-analysis.
Lewis, G N; Rice, D A; McNair, P J; Kluger, M
2015-04-01
Several studies have identified clinical, psychosocial, patient characteristic, and perioperative variables that are associated with persistent postsurgical pain; however, the relative effect of these variables has yet to be quantified. The aim of this study was to provide a systematic review and meta-analysis of predictor variables associated with persistent pain after total knee arthroplasty (TKA). Included studies were required to measure predictor variables prior to or at the time of surgery, include a pain outcome measure at least 3 months post-TKA, and include a statistical analysis of the effect of the predictor variable(s) on the outcome measure. Counts were undertaken of the number of times each predictor was analysed and the number of times it was found to have a significant relationship with persistent pain. Separate meta-analyses were performed to determine the effect size of each predictor on persistent pain. Outcomes from studies implementing uni- and multivariable statistical models were analysed separately. Thirty-two studies involving almost 30,000 patients were included in the review. Preoperative pain was the predictor that most commonly demonstrated a significant relationship with persistent pain across uni- and multivariable analyses. In the meta-analyses of data from univariate models, the largest effect sizes were found for other pain sites, catastrophizing, and depression. For data from multivariate models, significant effects were evident for catastrophizing, preoperative pain, mental health, and comorbidities. Catastrophizing, mental health, preoperative knee pain, and pain at other sites are the strongest independent predictors of persistent pain after TKA.
Two statistics for evaluating parameter identifiability and error reduction
Doherty, John; Hunt, Randall J.
2009-01-01
Two statistics are presented that can be used to rank input parameters utilized by a model in terms of their relative identifiability based on a given or possible future calibration dataset. Identifiability is defined here as the capability of model calibration to constrain parameters used by a model. Both statistics require that the sensitivity of each model parameter be calculated for each model output for which there are actual or presumed field measurements. Singular value decomposition (SVD) of the weighted sensitivity matrix is then undertaken to quantify the relation between the parameters and observations that, in turn, allows selection of calibration solution and null spaces spanned by unit orthogonal vectors. The first statistic presented, "parameter identifiability", is quantitatively defined as the direction cosine between a parameter and its projection onto the calibration solution space. This varies between zero and one, with zero indicating complete non-identifiability and one indicating complete identifiability. The second statistic, "relative error reduction", indicates the extent to which the calibration process reduces error in estimation of a parameter from its pre-calibration level, where its value must be assigned purely on the basis of prior expert knowledge. This is more sophisticated than identifiability in that it takes greater account of the noise associated with the calibration dataset. Like identifiability, it has a maximum value of one (which can only be achieved if there is no measurement noise). Conceptually it can fall to zero, and even below zero if a calibration problem is poorly posed. An example, based on a coupled groundwater/surface-water model, is included that demonstrates the utility of the statistics.
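A minimal sketch of the identifiability computation as defined above, using a hypothetical sensitivity matrix in which the last four parameters are only weakly sensed:

```python
import numpy as np

# Identifiability of each parameter as the direction cosine between the unit
# parameter axis and its projection onto the calibration solution space,
# computed from an SVD of the weighted sensitivity (Jacobian) matrix.
rng = np.random.default_rng(0)
J = rng.normal(size=(50, 8))     # hypothetical 50 observations x 8 parameters
J[:, 4:] *= 1e-4                 # last four parameters barely affect the outputs

U, s, Vt = np.linalg.svd(J, full_matrices=False)
k = int(np.sum(s > 1e-2 * s[0]))              # dimension of the solution space
ident = np.sqrt((Vt[:k, :]**2).sum(axis=0))   # projection length of each axis
print(ident)   # ~1 for well-constrained parameters, ~0 for weakly sensed ones
```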
Ries, Kernell G.; Eng, Ken
2010-01-01
The U.S. Geological Survey, in cooperation with the Maryland Department of the Environment, operated a network of 20 low-flow partial-record stations during 2008 in a region that extends from southwest of Baltimore to the northeastern corner of Maryland to obtain estimates of selected streamflow statistics at the station locations. The study area is expected to face a substantial influx of new residents and businesses as a result of military and civilian personnel transfers associated with the Federal Base Realignment and Closure Act of 2005. The estimated streamflow statistics, which include monthly 85-percent duration flows, the 10-year recurrence-interval minimum base flow, and the 7-day, 10-year low flow, are needed to provide a better understanding of the availability of water resources in the area to be affected by base-realignment activities. Streamflow measurements collected for this study at the low-flow partial-record stations and measurements collected previously for 8 of the 20 stations were related to concurrent daily flows at nearby index streamgages to estimate the streamflow statistics. Three methods were used to estimate the streamflow statistics and two methods were used to select the index streamgages. Of the three methods used to estimate the streamflow statistics, two of them--the Moments and MOVE1 methods--rely on correlating the streamflow measurements at the low-flow partial-record stations with concurrent streamflows at nearby, hydrologically similar index streamgages to determine the estimates. These methods, recommended for use by the U.S. Geological Survey, generally require about 10 streamflow measurements at the low-flow partial-record station. The third method transfers the streamflow statistics from the index streamgage to the partial-record station based on the average of the ratios of the measured streamflows at the partial-record station to the concurrent streamflows at the index streamgage. This method can be used with as few as one pair of streamflow measurements made on a single streamflow recession at the low-flow partial-record station, although additional pairs of measurements will increase the accuracy of the estimates. Errors associated with the two correlation methods generally were lower than the errors associated with the flow-ratio method, but the advantages of the flow-ratio method are that it can produce reasonably accurate estimates from streamflow measurements much faster and at lower cost than estimates obtained using the correlation methods. The two index-streamgage selection methods were (1) selection based on the highest correlation coefficient between the low-flow partial-record station and the index streamgages, and (2) selection based on Euclidean distance, where the Euclidean distance was computed as a function of geographic proximity and the basin characteristics: drainage area, percentage of forested area, percentage of impervious area, and the base-flow recession time constant, t. Method 1 generally selected index streamgages that were significantly closer to the low-flow partial-record stations than method 2. The errors associated with the estimated streamflow statistics generally were lower for method 1 than for method 2, but the differences were not statistically significant. The flow-ratio method for estimating streamflow statistics at low-flow partial-record stations was shown to be independent from the two correlation-based estimation methods. 
As a result, final estimates were determined for eight low-flow partial-record stations by weighting estimates from the flow-ratio method with estimates from one of the two correlation methods according to the respective variances of the estimates. Average standard errors of estimate for the final estimates ranged from 7.0 to 90.0 percent, with an average value of 26.5 percent. Average standard errors of estimate for the weighted estimates were, on average, 4.3 percent less than the best average standard errors of estimate from the individual methods.
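A minimal sketch of the MOVE.1 record-extension idea named above, in log space with hypothetical flows; the USGS procedure includes refinements not shown here:

```python
import numpy as np

# MOVE.1 (Maintenance Of Variance Extension, type 1): relate measured low flows
# y at the partial-record station to concurrent flows x at the index streamgage,
# preserving the mean and variance of the log-transformed record.
def move1_estimate(x, y, x_new):
    lx, ly = np.log(np.asarray(x, float)), np.log(np.asarray(y, float))
    b = np.sign(np.corrcoef(lx, ly)[0, 1]) * ly.std(ddof=1) / lx.std(ddof=1)
    return np.exp(ly.mean() + b * (np.log(x_new) - lx.mean()))

# e.g. transfer the index gage's 7-day, 10-year low flow (hypothetical values)
print(move1_estimate([12, 9, 15, 7, 11], [3.1, 2.2, 4.0, 1.8, 2.9], x_new=6.5))
```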
Ma, Yue; Yin, Fei; Zhang, Tao; Zhou, Xiaohua Andrew; Li, Xiaosong
2016-01-01
Spatial scan statistics are widely used in various fields. The performance of these statistics is influenced by parameters, such as maximum spatial cluster size, and can be improved by parameter selection using performance measures. Current performance measures are based on the presence of clusters and are thus inapplicable to data sets without known clusters. In this work, we propose a novel overall performance measure called maximum clustering set-proportion (MCS-P), which is based on the likelihood of the union of detected clusters and the applied dataset. MCS-P was compared with existing performance measures in a simulation study to select the maximum spatial cluster size. Results of other performance measures, such as sensitivity and misclassification, suggest that the spatial scan statistic achieves accurate results in most scenarios with the maximum spatial cluster sizes selected using MCS-P. Given that previously known clusters are not required in the proposed strategy, selection of the optimal maximum cluster size with MCS-P can improve the performance of the scan statistic in applications without identified clusters. PMID:26820646
Chibwe, Leah; Geier, Mitra C; Nakamura, Jun; Tanguay, Robert L; Aitken, Michael D; Simonich, Staci L Massey
2015-12-01
The formation of more polar and toxic polycyclic aromatic hydrocarbon (PAH) transformation products is one of the concerns associated with the bioremediation of PAH-contaminated soils. Soil contaminated with coal tar (prebioremediation) from a former manufactured gas plant (MGP) site was treated in a laboratory scale bioreactor (postbioremediation) and extracted using pressurized liquid extraction. The soil extracts were fractionated, based on polarity, and analyzed for 88 PAHs (unsubstituted, oxygenated, nitrated, and heterocyclic PAHs). The PAH concentrations in the soil tested, postbioremediation, were lower than their regulatory maximum allowable concentrations (MACs), with the exception of the higher molecular weight PAHs (BaA, BkF, BbF, BaP, and IcdP), most of which did not undergo significant biodegradation. The soil extract fractions were tested for genotoxicity using the DT40 chicken lymphocyte bioassay and developmental toxicity using the embryonic zebrafish (Danio rerio) bioassay. A statistically significant increase in genotoxicity was measured in the unfractionated soil extract, as well as in four polar soil extract fractions, postbioremediation (p < 0.05). In addition, a statistically significant increase in developmental toxicity was measured in one polar soil extract fraction, postbioremediation (p < 0.05). A series of morphological abnormalities, including peculiar caudal fin malformations and hyperpigmentation in the tail, were measured in several soil extract fractions in embryonic zebrafish, both pre- and postbioremediation. The increased toxicity measured postbioremediation is not likely due to the 88 PAHs measured in this study (including quinones), because most were not present in the toxic polar fractions and/or because their concentrations did not increase postbioremediation. However, the increased toxicity measured postbioremediation is likely due to hydroxylated and carboxylated transformation products of the 3- and 4-ring PAHs (PHE, 1MPHE, 2MPHE, PRY, BaA, and FLA) that were most degraded.
Enhanced detection and visualization of anomalies in spectral imagery
NASA Astrophysics Data System (ADS)
Basener, William F.; Messinger, David W.
2009-05-01
Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects in a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images, using the entire canonical target set for generation of ROC curves. TAD is compared against several statistics-based detectors, including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes, simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal components projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.
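For reference, a minimal global RX detector of the kind used as a baseline above; the local and subspace RX variants differ in how the background statistics are estimated:

```python
import numpy as np

# Global RX: Mahalanobis distance of each pixel spectrum from the scene-wide
# background mean and covariance.
def rx_scores(cube):
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    d = X - X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    return np.einsum('ij,jk,ik->i', d, cov_inv, d).reshape(h, w)

cube = np.random.default_rng(1).normal(size=(64, 64, 30))   # synthetic cube
print(rx_scores(cube).max())
```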
Site-conditions map for Portugal based on VS measurements: methodology and final model
NASA Astrophysics Data System (ADS)
Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, Carlos
2017-04-01
In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (Vs) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses, and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves, and refraction microtremor. Invasive measurements were performed at selected locations in order to compare the Vs profiles obtained from invasive and non-invasive techniques. In general, there was good agreement in the subsurface Vs structure obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process that includes defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling, and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic, and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and consequently for some geographical regions.
Dai, Qi; Yang, Yanchun; Wang, Tianming
2008-10-15
Many proposed statistical measures can efficiently compare biological sequences to further infer their structures, functions, and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use information on the k-word distributions, a Markov model, or both. Motivated by adding k-word distributions to a Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences, and phylogenetic analysis. This offers a systematic and quantitative experimental assessment of our measures. Moreover, we compared our results with those based on alignment and alignment-free methods. We grouped our experiments into two sets. The first one, performed via ROC (receiver operating characteristic) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences in a database and to discriminate functionally related regulatory sequences from unrelated sequences. The second one aims at assessing how well our statistical measures can be used for phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, which incorporate k-word distributions into a Markov model, are more efficient.
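A minimal sketch of the shared k-word ingredient of such alignment-free measures; the wre.k.r and S2.k.r statistics additionally re-weight these counts against a Markov background, which is not shown here:

```python
import numpy as np
from itertools import product

# Count k-word (k-mer) frequencies over a fixed alphabet and compare two
# sequences with a simple cosine similarity.
def kmer_freqs(seq, k=3, alphabet='ACGT'):
    words = [''.join(p) for p in product(alphabet, repeat=k)]
    idx = {w: i for i, w in enumerate(words)}
    v = np.zeros(len(words))
    for i in range(len(seq) - k + 1):
        v[idx[seq[i:i+k]]] += 1
    return v / v.sum()

a, b = kmer_freqs('ACGTACGTGGAACT'), kmer_freqs('ACGTTCGTGGTACT')
print(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))  # cosine similarity
```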
Biomass statistics for New Hampshire - 1983
Thomas S. Frieswyk; Anne M. Malley
1986-01-01
A new measure of the forest resource has been added to the fourth forest inventory of New Hampshire. The inventory, which was conducted in 1982-83, included estimates of aboveground tree biomass on timberland. There are approximately 502 million green tons of wood and bark in the aboveground portion of all trees, or 104 green tons per acre. Fifty-five percent or 275...
Aboveground tree biomass statistics for Maine: 1982
Eric H. Wharton; Thomas S. Frieswyk; Anne M. Malley
1985-01-01
Traditional measures of volume inadequately describe the total aboveground wood resource. The 1980-82 inventory of Maine included estimates of aboveground tree biomass on timberland. There are nearly 1,504.4 million green tons of wood and bark in all trees above the ground level, or 88.2 green tons per acre of timberland. Most of the biomass is in growing stock, but 49...
Quality Assurance for Rapid Airfield Construction
2008-05-01
necessary to conduct a volume-replacement density test for in-place soil. This density test, which was developed during this investigation, involves...the test both simpler and quicker. The Clegg hammer results are the primary means of judging compaction; thus, the requirements for density tests are...minimized through a stepwise acceptance procedure. Statistical criteria for evaluating Clegg hammer and density measurements are also included
ERIC Educational Resources Information Center
Burmester, Kristen O'Rourke
2011-01-01
Classrooms are a primary site of evidence about learning. Yet classroom proceedings often occur behind closed doors and hence evidence of student learning is observable only to the classroom teacher. The informal and undocumented nature of this information means that it is rarely included in statistical models or quantifiable analyses. This…
Intelligence--Group Administered, Grades 7 and Above. Annotated Bibliography of Tests.
ERIC Educational Resources Information Center
Educational Testing Service, Princeton, NJ. Test Collection.
Most of the 47 tests included in this bibliography assess intelligence and provide an actual I.Q. score or other score with similar statistical properties. Many of the tests are designed to measure occupational qualifications or to aid in career guidance. Although all ages are represented, the majority of tests are targeted to grade 7 and above. A…
The Effects of Conditioned Reinforcement for Reading on Reading Comprehension for 5th Graders
ERIC Educational Resources Information Center
Cumiskey Moore, Colleen
2017-01-01
In three experiments, I tested the effects of the conditioned reinforcement for reading (R+Reading) on reading comprehension with 5th graders. In Experiment 1, I conducted a series of statistical analyses with data from 18 participants for one year. I administered 4 pre/post measurements for reading repertoires which included: 1) state-wide…
ERIC Educational Resources Information Center
Madheswaran, S.
2007-01-01
Policy makers confronted with the need to introduce health and safety regulations often wonder how to value the benefits of these regulations. One way that a monetary value could be placed on reductions in health risks, including risk of death, is through understanding how people are compensated for the different risks they take. While there is an…
Measures of Child Well-Being in Utah, 1997. State and County Profiles of Child Well-Being.
ERIC Educational Resources Information Center
Haven, Terry, Ed.
This Utah Kids Count report examines statewide trends in the well-being of Utah's children. The statistical portrait is based on five general areas of children's well-being: (1) demographics; (2) health; (3) education; (4) safety; and (5) economic security. Key indicators in these five areas include the following: (1) population; (2) poverty; (3)…
Why McNemar's Procedure Needs to Be Included in the Business Statistics Curriculum
ERIC Educational Resources Information Center
Berenson, Mark L.; Koppel, Nicole B.
2005-01-01
In business research situations it is often of interest to examine the differences in the responses in repeated measurements of the same subjects or from among matched or paired subjects. A simple and useful procedure for comparing differences between proportions in two related samples was devised by McNemar (1947) nearly 60 years ago. Although…
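A minimal sketch of McNemar's procedure on the two discordant-pair counts b and c, with the common continuity correction and illustrative counts:

```python
from scipy.stats import chi2

# McNemar's test for paired proportions: b and c are the two discordant-pair
# counts from a 2x2 table of matched (e.g. before/after) responses.
def mcnemar(b, c, correction=True):
    stat = (abs(b - c) - (1 if correction else 0))**2 / (b + c)
    return stat, chi2.sf(stat, df=1)

stat, p = mcnemar(b=25, c=10)   # illustrative counts
print(stat, p)                  # 5.6, p ~ 0.018
```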
Frings, Andreas; Linke, Stephan J; Bauer, Eva L; Druchkiv, Vasyl; Katz, Toam; Steinberg, Johannes
2015-01-01
This study was initiated to evaluate biomechanical changes in the cornea after laser in situ keratomileusis (LASIK), measured using the Corvis ST tonometer (CST). University Medical Center Hamburg-Eppendorf, Germany, and Care Vision Refractive Centers, Germany. Retrospective cohort study. This retrospective study included 37 eyes of 37 refractive patients. All CST measurements were performed 1 day before surgery and at the 1-month follow-up examination. The LASIK procedure included mechanical flap preparation using a Moria SBK microkeratome and an Allegretto excimer laser platform. Statistically significant differences were observed for mean first applanation length, mean first and second deflection lengths, mean first and second deflection amplitudes, radius of curvature, and peak distance. Significant positive correlations were found between the change (Δ) in radius of curvature and manifest refraction spherical equivalent (MRSE), ablation depth (AD), and Δintraocular pressure, as well as between AD and ΔHC-time. Each diopter of myopic correction in MRSE resulted in an increase in Δradius of curvature of 0.2 mm. Several CST parameters were statistically significantly altered by LASIK, indicating that flap creation, ablation, or both significantly change the ability of the cornea to absorb or dissipate energy.
Hoyle, R H
1991-02-01
Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.
Problems With Risk Reclassification Methods for Evaluating Prediction Models
Pepe, Margaret S.
2011-01-01
For comparing the performance of a baseline risk prediction model with one that includes an additional predictor, a risk reclassification analysis strategy has been proposed. The first step is to cross-classify risks calculated according to the 2 models for all study subjects. Summary measures including the percentage of reclassification and the percentage of correct reclassification are calculated, along with 2 reclassification calibration statistics. The author shows that interpretations of the proposed summary measures and P values are problematic. The author's recommendation is to display the reclassification table, because it shows interesting information, but to use alternative methods for summarizing and comparing model performance. The Net Reclassification Index has been suggested as one alternative method. The author argues for reporting components of the Net Reclassification Index because they are more clinically relevant than is the single numerical summary measure. PMID:21555714
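A minimal sketch of the component-wise reporting argued for above; the function and input values are illustrative:

```python
import numpy as np

# Event and non-event components of the Net Reclassification Index; risk_old
# and risk_new are predicted risks from the baseline and expanded models, and
# y is the 0/1 outcome.
def nri_components(risk_old, risk_new, y):
    risk_old, risk_new = np.asarray(risk_old), np.asarray(risk_new)
    y = np.asarray(y).astype(bool)
    up, down = risk_new > risk_old, risk_new < risk_old
    nri_events = up[y].mean() - down[y].mean()        # net share of events moved up
    nri_nonevents = down[~y].mean() - up[~y].mean()   # net share of non-events moved down
    return nri_events, nri_nonevents                  # their sum is the overall NRI

print(nri_components([0.1, 0.3, 0.2, 0.6], [0.2, 0.2, 0.4, 0.7], [0, 0, 1, 1]))
```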
Measurement of Muon Neutrino Quasielastic Scattering on Carbon
NASA Astrophysics Data System (ADS)
Aguilar-Arevalo, A. A.; Bazarko, A. O.; Brice, S. J.; Brown, B. C.; Bugel, L.; Cao, J.; Coney, L.; Conrad, J. M.; Cox, D. C.; Curioni, A.; Djurcic, Z.; Finley, D. A.; Fleming, B. T.; Ford, R.; Garcia, F. G.; Garvey, G. T.; Green, C.; Green, J. A.; Hart, T. L.; Hawker, E.; Imlay, R.; Johnson, R. A.; Kasper, P.; Katori, T.; Kobilarcik, T.; Kourbanis, I.; Koutsoliotas, S.; Laird, E. M.; Link, J. M.; Liu, Y.; Liu, Y.; Louis, W. C.; Mahn, K. B. M.; Marsh, W.; Martin, P. S.; McGregor, G.; Metcalf, W.; Meyers, P. D.; Mills, F.; Mills, G. B.; Monroe, J.; Moore, C. D.; Nelson, R. H.; Nienaber, P.; Ouedraogo, S.; Patterson, R. B.; Perevalov, D.; Polly, C. C.; Prebys, E.; Raaf, J. L.; Ray, H.; Roe, B. P.; Russell, A. D.; Sandberg, V.; Schirato, R.; Schmitz, D.; Shaevitz, M. H.; Shoemaker, F. C.; Smith, D.; Sorel, M.; Spentzouris, P.; Stancu, I.; Stefanski, R. J.; Sung, M.; Tanaka, H. A.; Tayloe, R.; Tzanov, M.; van de Water, R.; Wascko, M. O.; White, D. H.; Wilking, M. J.; Yang, H. J.; Zeller, G. P.; Zimmerman, E. D.
2008-01-01
The observation of neutrino oscillations is clear evidence for physics beyond the standard model. To make precise measurements of this phenomenon, neutrino oscillation experiments, including MiniBooNE, require an accurate description of neutrino charged current quasielastic (CCQE) cross sections to predict signal samples. Using a high-statistics sample of ν_μ CCQE events, MiniBooNE finds that a simple Fermi gas model, with appropriate adjustments, accurately characterizes the CCQE events observed in a carbon-based detector. The extracted parameters include an effective axial mass, M_A^eff = 1.23 ± 0.20 GeV, that describes the four-momentum dependence of the axial-vector form factor of the nucleon, and a Pauli-suppression parameter, κ = 1.019 ± 0.011. Such a modified Fermi gas model may also be used by future accelerator-based experiments measuring neutrino oscillations on nuclear targets.
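For context, the axial-vector form factor is conventionally taken to have a dipole form, and a minimal evaluation with the extracted effective axial mass might look as follows (g_A ~ -1.267 from neutron beta decay is an assumed input, not a value from this measurement):

```python
# Standard dipole parameterization of the nucleon axial-vector form factor:
#   F_A(Q^2) = g_A / (1 + Q^2 / M_A^2)^2,  with Q^2 and M_A in GeV units.
def F_A(Q2, M_A=1.23, g_A=-1.267):
    return g_A / (1.0 + Q2 / M_A**2)**2

print(F_A(0.5))   # form factor at Q^2 = 0.5 GeV^2
```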
Statistical Reform in School Psychology Research: A Synthesis
ERIC Educational Resources Information Center
Swaminathan, Hariharan; Rogers, H. Jane
2007-01-01
Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.
Estimating labile particulate iron concentrations in coastal waters from remote sensing data
NASA Astrophysics Data System (ADS)
McGaraghan, Anna R.; Kudela, Raphael M.
2012-02-01
Owing to the difficulties inherent in measuring trace metals and the importance of iron as a limiting nutrient for biological systems, the ability to monitor particulate iron concentration remotely is desirable. This study examines the relationship between labile particulate iron, described here as weak acid leachable particulate iron or total dissolvable iron, and easily obtained bio-optical measurements. We develop a bio-optical proxy that can be used to estimate large-scale patterns of labile iron concentrations in surface waters, and we extend this by including other environmental variables in a multiple linear regression statistical model. By utilizing a ratio of optical backscatter and fluorescence obtained by satellite, we identify patterns in iron concentrations confirmed by traditional shipboard sampling. This basic relationship is improved with the addition of other environmental parameters in the statistical linear regression model. The optical proxy detects known temporal and spatial trends in average surface iron concentrations in Monterey Bay. The proxy is robust in that similar performance was obtained using two independent particulate iron data sets, but it exhibits weaker correlations than the full statistical model. This proxy will be a valuable tool for oceanographers seeking to monitor iron concentrations in coastal regions and allows for better understanding of the variability of labile particulate iron in surface waters to complement direct measurement of leachable particulate or total dissolvable iron.
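A minimal sketch of the proxy-plus-regression idea, with entirely synthetic variables and coefficients standing in for the study's predictors:

```python
import numpy as np

# A bio-optical proxy (backscatter / fluorescence) refined by an ordinary
# least-squares multiple regression on additional environmental variables.
rng = np.random.default_rng(1)
n = 200
bb  = rng.lognormal(-6.0, 0.3, n)     # optical backscatter (synthetic)
fl  = rng.lognormal(0.0, 0.4, n)      # chlorophyll fluorescence (synthetic)
sst = rng.normal(14.0, 2.0, n)        # sea-surface temperature (synthetic)
proxy = bb / fl                       # simple backscatter-to-fluorescence proxy
fe = 0.8 * proxy + 0.05 * (16.0 - sst) + rng.normal(0.0, 0.1, n)  # "iron"

X = np.column_stack([np.ones(n), proxy, sst])
beta, *_ = np.linalg.lstsq(X, fe, rcond=None)   # multiple linear regression fit
print(beta)                                     # intercept and coefficients
```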
Kelley, George A.; Kelley, Kristi S.
2013-01-01
Purpose. Conduct a systematic review of previous meta-analyses addressing the effects of exercise in the treatment of overweight and obese children and adolescents. Methods. Previous meta-analyses of randomized controlled exercise trials that assessed adiposity in overweight and obese children and adolescents were included by searching nine electronic databases and cross-referencing from retrieved studies. Methodological quality was assessed using the Assessment of Multiple Systematic Reviews (AMSTAR) Instrument. The alpha level for statistical significance was set at P ≤ 0.05. Results. Of the 308 studies reviewed, two aggregate data meta-analyses representing 14 and 17 studies and 481 and 701 boys and girls met all eligibility criteria. Methodological quality was 64% and 73%. For both studies, statistically significant reductions in percent body fat were observed (P = 0.006 and P < 0.00001). The number-needed-to-treat (NNT) was 4 and 3, with an estimated 24.5 and 31.5 million overweight and obese children in the world potentially benefitting, 2.8 and 3.6 million in the US. No other measures of adiposity (BMI-related measures, body weight, and central obesity) were statistically significant. Conclusions. Exercise is efficacious for reducing percent body fat in overweight and obese children and adolescents. Insufficient evidence exists to suggest that exercise reduces other measures of adiposity. PMID:24455215
Choroidal Thickness Analysis in Patients with Usher Syndrome Type 2 Using EDI OCT.
Colombo, L; Sala, B; Montesano, G; Pierrottet, C; De Cillà, S; Maltese, P; Bertelli, M; Rossetti, L
2015-01-01
To characterize Usher Syndrome type 2 by analyzing choroidal thickness and comparing the data with those reported in the published literature on RP and healthy subjects. Methods. 20 eyes of 10 patients with clinical signs and a genetic diagnosis of Usher Syndrome type 2 were studied. Each patient underwent a complete ophthalmologic examination including Best Corrected Visual Acuity (BCVA), intraocular pressure (IOP), axial length (AL), automated visual field (VF), and EDI OCT. Both retinal and choroidal measures were taken. Statistical analysis was performed to correlate choroidal thickness with age, BCVA, IOP, AL, VF, and RT. Comparisons with data from healthy people and nonsyndromic RP patients were performed. Results. Mean subfoveal choroidal thickness (SFCT) was 248.21 ± 79.88 microns. SFCT was statistically significantly correlated with age (correlation coefficient -0.7248179, p < 0.01). No statistically significant correlation was found between SFCT and BCVA, IOP, AL, VF, or RT. SFCT was reduced compared to healthy subjects (p < 0.01). No difference was found when compared to choroidal thickness in nonsyndromic RP patients (p = 0.2138). Conclusions. Our study demonstrated in vivo choroidal thickness reduction in patients with Usher Syndrome type 2. These data are important for the comprehension of mechanisms of disease and for the evaluation of therapeutic approaches.
NASA Technical Reports Server (NTRS)
Alberts, J. R.; Burden, H. W.; Hawes, N.; Ronca, A. E.
1996-01-01
To assess prenatal and postnatal developmental status in the offspring of a group of animals, it is typical to examine fetuses from some of the dams as well as infants born to the remaining dams. Statistical limitations often arise, particularly when the animals are rare or especially precious, because all offspring of a dam represent only a single statistical observation; littermates are not independent observations (biologically or statistically). We describe a study in which pregnant laboratory rats were laparotomized on day 7 of gestation (GD7) to ascertain the number and distribution of uterine implantation sites and were subjected to a simulated 10-day space shuttle flight. After the simulated landing on GD18, rats were unilaterally hysterectomized, thus providing a sample of fetuses from 10 independent uteruses, followed by successful vaginal delivery on GD22, yielding postnatal samples from 10 uteruses. A broad profile of maternal and offspring morphologic and physiologic measures indicated that these novel sampling procedures did not compromise maternal well-being and maintained normal offspring development and function. Measures included maternal organ weights and hormone concentrations, offspring body size, growth, organ weights, sexual differentiation, and catecholamine concentrations.
OPEN PROBLEM: Orbits' statistics in chaotic dynamical systems
NASA Astrophysics Data System (ADS)
Arnold, V.
2008-07-01
This paper shows how the measurement of the stochasticity degree of a finite sequence of real numbers, published by Kolmogorov in Italian in a journal of insurance statistics, can be usefully applied to measure the objective stochasticity degree of sequences originating from dynamical systems theory and from number theory. Namely, whenever the value of Kolmogorov's stochasticity parameter of a given sequence of numbers is too small (or too big), one may conclude that the conjecture describing this sequence as a sample of independent values of a random variable is highly improbable. Kolmogorov used this strategy fighting (in a paper in 'Doklady', 1940) against Lysenko, who had tried to disprove the classical genetic laws of Mendel experimentally. Calculating the value of his stochasticity parameter for the numbers in Lysenko's experiment reports, Kolmogorov deduced that, while these numbers differed from the exact fulfilment of Mendel's 3 : 1 law, any smaller deviation would be a manifestation of falsification of the reported numbers. The calculation of the values of the stochasticity parameter would be useful for many other generators of pseudorandom numbers and for many other chaotically looking statistics, including even the distribution of prime numbers (discussed in this paper as an example).
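As a minimal sketch of the statistic involved, assuming the stochasticity parameter is the familiar Kolmogorov quantity λₙ = √n · supₓ|Fₙ(x) − F(x)|: scipy's kstest returns the sup-norm deviation and its p-value under the Kolmogorov distribution, so a λₙ that is too small (p near 1) or too large (p near 0) both argue against independent sampling.

```python
# Kolmogorov's stochasticity parameter for a finite sequence:
# lambda_n = sqrt(n) * sup_x |F_n(x) - F(x)|, judged against the
# Kolmogorov distribution. F here is uniform on [0, 1); the sequence
# is the fractional parts of k*sqrt(2), a classic equidistributed example.
import numpy as np
from scipy import stats

n = 1000
seq = np.modf(np.arange(1, n + 1) * np.sqrt(2.0))[0]  # fractional parts

d, p = stats.kstest(seq, "uniform")  # d = sup-norm deviation
lam = np.sqrt(n) * d                 # the stochasticity parameter
print(f"lambda_n = {lam:.3f}, p = {p:.3f}")
```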
An Analysis of Attitudes toward Statistics: Gender Differences among Advertising Majors.
ERIC Educational Resources Information Center
Fullerton, Jami A.; Umphrey, Don
This study measures advertising students' attitudes toward statistics. Subjects, 275 undergraduate advertising students from two southwestern United States universities, completed a questionnaire used to gauge students' attitudes toward statistics by measuring 6 underlying factors: (1) students' interest and future applicability; (2) relationship…
Measurement-device-independent entanglement-based quantum key distribution
NASA Astrophysics Data System (ADS)
Yang, Xiuqing; Wei, Kejin; Ma, Haiqiang; Sun, Shihai; Liu, Hongwei; Yin, Zhenqiang; Li, Zuohan; Lian, Shibin; Du, Yungang; Wu, Lingan
2016-05-01
We present a quantum key distribution protocol in a model in which the legitimate users gather statistics, as in the measurement-device-independent entanglement witness, to certify the sources and the measurement devices. We show that the task of measurement-device-independent quantum communication can be accomplished based on monogamy of entanglement, and that it is fairly loss-tolerant, even including source and detector flaws. We derive a tight bound for collective attacks on the Holevo information between the authorized parties and the eavesdropper. With this bound, the final secret key rate in the presence of source flaws can be obtained. The results show that long-distance quantum cryptography over 144 km can be made secure using only standard threshold detectors.
Developing and validating a nutrition knowledge questionnaire: key methods and considerations.
Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina
2017-10-01
To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed, and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire has been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
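As a sketch of the item-analysis step (vi), classical test theory scores each item by its difficulty (proportion answering correctly) and its discrimination (correlation of the item with the rest of the test). The data and the flagging cutoffs below are illustrative assumptions, not values from the paper.

```python
# Classical item analysis: difficulty and discrimination indices for a
# 0/1-scored knowledge questionnaire. Data and cutoffs are illustrative.
import numpy as np

rng = np.random.default_rng(1)
responses = (rng.random((300, 10)) < 0.6).astype(float)  # 300 respondents x 10 items

difficulty = responses.mean(axis=0)  # proportion correct per item
totals = responses.sum(axis=1)
# Discrimination: correlation of each item with the total of the *other* items
discrimination = np.array([
    np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
    for j in range(responses.shape[1])
])

for j, (dif, dis) in enumerate(zip(difficulty, discrimination)):
    flag = " <- review" if not (0.2 <= dif <= 0.8) or dis < 0.2 else ""
    print(f"item {j}: difficulty={dif:.2f}, discrimination={dis:.2f}{flag}")
```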
Primary school children are able to perform basic life-saving first aid measures.
Bollig, Georg; Wahl, Hans Alvin; Svendsen, Martin Veel
2009-06-01
First aid measures can be life-saving. Starting first aid education early may strengthen interest, motivation and the ability to provide first aid. The aim was to determine whether a first aid teaching program comprising 5 lessons (45 min each) of theoretical and practical training for 6-7-year-old children can influence their performance in a first aid scenario. 228 primary school children aged 6-7 years were included in the study, 102 girls and 126 boys. One child was 5 years old. 117 children were taught basic first aid measures, and 111 without training served as the control group. In the test scenario the children had to provide first aid to an unconscious victim after a cycle accident. The course participants were retested after 6 months. Statistically significant differences between course participants and those without training were shown for all tested items, including correct assessment of consciousness (p<0.001), correct assessment of breathing (p<0.001), knowledge of the correct emergency telephone number (p<0.001), giving correct emergency call information (p<0.001), knowledge of the correct recovery position (p<0.001), and correct airway management (p<0.001). Retesting after 6 months showed statistically significant differences for 5 of 6 tested items. 6-7-year-old children can give basic first aid to an unconscious patient. A course with 5 lessons leads to a significant increase in first aid knowledge and skills. Knowledge retention is good after 6 months. All primary school children should receive first aid training starting in first grade.
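The between-group results above are comparisons of proportions; a hedged sketch of one such comparison as a chi-squared test on a 2x2 table follows. The counts are invented for illustration and are not the study's data.

```python
# One trained-vs-control item as a 2x2 chi-squared test. Counts invented.
import numpy as np
from scipy.stats import chi2_contingency

#                 correct  incorrect
table = np.array([[95, 22],    # trained (n = 117)
                  [30, 81]])   # control (n = 111)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```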
Planck 2015 results. XVI. Isotropy and statistics of the CMB
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Akrami, Y.; Aluri, P. K.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Casaponsa, B.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Contreras, D.; Couchot, F.; Coulais, A.; Crill, B. P.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fantaye, Y.; Fergusson, J.; Fernandez-Cobos, R.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huang, Z.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kim, J.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Liu, H.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Marinucci, D.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Pant, N.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Rotti, A.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Souradeep, T.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.
2016-09-01
We test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The "Cold Spot" is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.
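A toy sketch of the simplest one-point tests named above (skewness and kurtosis of map values) is given below; a real CMB analysis compares masked, component-separated maps against ensembles of statistically isotropic simulations, which this does not attempt.

```python
# Toy one-point Gaussianity check: skewness and kurtosis of "map" values.
# A Gaussian sample stands in for an actual CMB temperature map.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
pixels = rng.normal(0.0, 100e-6, 500_000)  # stand-in map values (K), illustrative

print("skewness:", stats.skew(pixels))
print("excess kurtosis:", stats.kurtosis(pixels))
print("skewtest p:", stats.skewtest(pixels).pvalue)
print("kurtosistest p:", stats.kurtosistest(pixels).pvalue)
```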
Planck 2015 results: XVI. Isotropy and statistics of the CMB
Ade, P. A. R.; Aghanim, N.; Akrami, Y.; ...
2016-09-20
In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The “Cold Spot” is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.
Development of rotorcraft interior noise control concepts. Phase 2: Full scale testing, revision 1
NASA Technical Reports Server (NTRS)
Yoerkie, C. A.; Gintoli, P. J.; Moore, J. A.
1986-01-01
The phase 2 effort consisted of a series of ground and flight test measurements to obtain data for validation of the Statistical Energy Analysis (SEA) model. Included in the ground tests were various transfer function measurements between vibratory and acoustic subsystems, vibration and acoustic decay rate measurements, and coherent source measurements. The bulk of these, the vibration transfer functions, were used for SEA model validation, while the others provided information for characterization of damping and reverberation time of the subsystems. The flight test program included measurements of cabin and cockpit sound pressure level, frame and panel vibration level, and vibration levels at the main transmission attachment locations. Comparisons between measured and predicted subsystem excitation levels from both ground and flight testing were evaluated. The ground test data show good correlation with predictions of vibration levels throughout the cabin overhead for all excitations. The flight test results also indicate excellent correlation of in-flight sound pressure measurements to sound pressure levels predicted by the SEA model, where the average aircraft speech interference level is predicted within 0.2 dB.
Buliung, Ron N; Larsen, Kristian; Faulkner, Guy E J; Stone, Michelle R
2013-09-01
School route measurement often involves estimating the shortest network path. We challenged the relatively uncritical adoption of this method in school travel research and tested the route discordance hypothesis that several types of difference exist between shortest network paths and reported school routes. We constructed the mapped and shortest path through network routes for a sample of 759 children aged 9 to 13 years in grades 5 and 6 (boys = 45%, girls = 54%, unreported gender = 1%), in Toronto, Ontario, Canada. We used Wilcoxon signed-rank tests to compare reported with shortest-path route measures including distance, route directness, intersection crossings, and route overlap. Measurement difference was explored by mode and location. We found statistical evidence of route discordance for walkers and children who were driven and detected it more often for inner suburban cases. Evidence of route discordance varied by mode and school location. We found statistically significant differences for route structure and built environment variables measured along reported and geographic information systems-based shortest-path school routes. Uncertainty produced by the shortest-path approach challenges its conceptual and empirical validity in school travel research.
Larsen, Kristian; Faulkner, Guy E. J.; Stone, Michelle R.
2013-01-01
Objectives. School route measurement often involves estimating the shortest network path. We challenged the relatively uncritical adoption of this method in school travel research and tested the route discordance hypothesis that several types of difference exist between shortest network paths and reported school routes. Methods. We constructed the mapped and shortest path through network routes for a sample of 759 children aged 9 to 13 years in grades 5 and 6 (boys = 45%, girls = 54%, unreported gender = 1%), in Toronto, Ontario, Canada. We used Wilcoxon signed-rank tests to compare reported with shortest-path route measures including distance, route directness, intersection crossings, and route overlap. Measurement difference was explored by mode and location. Results. We found statistical evidence of route discordance for walkers and children who were driven and detected it more often for inner suburban cases. Evidence of route discordance varied by mode and school location. Conclusions. We found statistically significant differences for route structure and built environment variables measured along reported and geographic information systems–based shortest-path school routes. Uncertainty produced by the shortest-path approach challenges its conceptual and empirical validity in school travel research. PMID:23865648
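A hedged sketch of the paired comparison described in both versions of this abstract: a Wilcoxon signed-rank test of reported versus shortest-path route distances, paired per child. The distances are synthetic placeholders.

```python
# Wilcoxon signed-rank test on paired reported vs. shortest-path route
# distances. Data are synthetic placeholders, not the study's routes.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(3)
shortest = rng.uniform(300, 2000, 100)            # shortest network path (m)
reported = shortest * rng.uniform(1.0, 1.3, 100)  # reported routes tend to be longer

stat, p = wilcoxon(reported, shortest)
print(f"W = {stat:.0f}, p = {p:.2g}")
```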
Summary and Statistical Analysis of the First AIAA Sonic Boom Prediction Workshop
NASA Technical Reports Server (NTRS)
Park, Michael A.; Morgenstern, John M.
2014-01-01
A summary is provided for the First AIAA Sonic Boom Workshop held 11 January 2014 in conjunction with AIAA SciTech 2014. Near-field pressure signatures extracted from computational fluid dynamics solutions are gathered from nineteen participants representing three countries for the two required cases, an axisymmetric body and a simple delta wing body. Structured multiblock, unstructured mixed-element, unstructured tetrahedral, overset, and Cartesian cut-cell methods are used by the participants. Participants provided signatures computed on participant-generated and solution-adapted grids. Signatures are also provided for a series of uniformly refined workshop-provided grids. These submissions are propagated to the ground and loudness measures are computed. This allows the grid convergence of a loudness measure and a validation metric (difference norm between computed and wind-tunnel-measured near-field signatures) to be studied for the first time. Statistical analysis is also presented for these measures. An optional configuration includes fuselage, wing, tail, flow-through nacelles, and blade sting. This full configuration exhibits more variation in eleven submissions than the sixty submissions provided for each required case. Recommendations are provided for potential improvements to the analysis methods and a possible subsequent workshop.
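A sketch of the kind of validation metric mentioned, a difference norm between a computed and a measured near-field signature interpolated to a common axis, is shown below. The signature shapes are invented stand-ins, and the root-mean-square norm is one plausible choice, not necessarily the workshop's.

```python
# Signature validation metric: RMS difference norm between a computed
# and a measured near-field pressure signature on a common grid.
import numpy as np

x = np.linspace(0.0, 1.0, 400)                     # normalized along-track coordinate
measured = np.sin(8 * np.pi * x) * np.exp(-3 * x)  # stand-in wind-tunnel signature
x_cfd = np.linspace(0.0, 1.0, 257)                 # CFD output on its own grid
computed = np.sin(8 * np.pi * x_cfd) * np.exp(-3 * x_cfd) + 0.01

computed_on_x = np.interp(x, x_cfd, computed)      # interpolate to the common axis
rms = np.sqrt(np.mean((computed_on_x - measured) ** 2))
print(f"RMS difference norm = {rms:.4f}")
```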
On the Statistical Analysis of X-ray Polarization Measurements
NASA Technical Reports Server (NTRS)
Strohmayer, T. E.; Kallman, T. R.
2013-01-01
In many polarimetry applications, including observations in the X-ray band, the measurement of a polarization signal can be reduced to the detection and quantification of a deviation from uniformity of a distribution of measured angles of the form α + β cos²(φ − φ₀), with 0 < φ < π. We explore the statistics of such polarization measurements using both Monte Carlo simulations as well as analytic calculations based on the appropriate probability distributions. We derive relations for the number of counts required to reach a given detection level (parameterized by β, the "number of σ's" of the measurement) appropriate for measuring the modulation amplitude α by itself (single interesting parameter case) or jointly with the position angle φ₀ (two interesting parameters case). We show that for the former case, when the intrinsic amplitude is equal to the well-known minimum detectable polarization (MDP), it is, on average, detected at the 3σ level. For the latter case, when one requires a joint measurement at the same confidence level, more counts are needed, by a factor of approximately 2.2, than required to achieve the MDP level. We find that the position angle uncertainty at 1σ confidence is well described by the relation σ_φ₀ = 28.5°/β.
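In the spirit of the Monte Carlo simulations described, the sketch below draws photon angles from p(φ) ∝ α + β cos²(φ − φ₀) by rejection sampling and recovers the modulation and position angle from the 2φ Fourier moments. This estimator is a generic one chosen for illustration, not necessarily the authors' procedure.

```python
# Monte Carlo sketch of a modulation-curve polarization measurement:
# sample angles from p(phi) ∝ alpha + beta*cos^2(phi - phi0) on (0, pi),
# then estimate modulation and position angle from 2*phi Fourier moments.
import numpy as np

rng = np.random.default_rng(4)
alpha, beta, phi0, n = 1.0, 0.4, np.pi / 3, 50_000

samples, pmax = [], alpha + beta          # rejection sampling
while len(samples) < n:
    phi = rng.uniform(0.0, np.pi, n)
    keep = rng.uniform(0.0, pmax, n) < alpha + beta * np.cos(phi - phi0) ** 2
    samples.extend(phi[keep])
phi = np.array(samples[:n])

c, s = np.mean(np.cos(2 * phi)), np.mean(np.sin(2 * phi))
mu_hat = 2.0 * np.hypot(c, s)              # estimated modulation fraction
phi0_hat = 0.5 * np.arctan2(s, c) % np.pi  # estimated position angle
mu_true = beta / (2.0 * alpha + beta)      # modulation implied by the curve above
print(f"mu_hat = {mu_hat:.3f} (true {mu_true:.3f}), "
      f"phi0_hat = {np.degrees(phi0_hat):.1f} deg (true {np.degrees(phi0):.1f})")
```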
Hewett, Paul; Bullock, William H
2014-01-01
For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m3. The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m3. With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value (TLV) of 3 ppm, the nitrogen dioxide exposure profile merits an exposure rating of AIHA exposure category 1. However, using the newly adopted TLV of 0.2 ppm, the exposure profile receives an exposure rating of category 4. Further evaluation is recommended to determine the current status of nitrogen dioxide exposures.
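A sketch of the parametric compliance statistics referred to above, assuming lognormally distributed exposures: the geometric mean, geometric standard deviation, estimated 95th percentile, and exceedance fraction against a limit. The measurements are invented; only the 0.020 mg/m3 guideline figure is taken from the abstract.

```python
# Lognormal compliance statistics for an exposure data set: GM, GSD,
# 95th percentile, and exceedance fraction. Measurements are invented.
import math
import numpy as np

rng = np.random.default_rng(5)
x = rng.lognormal(mean=math.log(0.008), sigma=0.5, size=156)  # mg/m3, synthetic

logs = np.log(x)
mu, sd = logs.mean(), logs.std(ddof=1)
gm, gsd = math.exp(mu), math.exp(sd)   # geometric mean and standard deviation
p95 = math.exp(mu + 1.645 * sd)        # lognormal 95th percentile

limit = 0.020                          # mg/m3 guideline cited in the abstract
exceed = 1.0 - 0.5 * (1.0 + math.erf((math.log(limit) - mu) / (sd * math.sqrt(2))))
print(f"GM = {gm:.4f}, GSD = {gsd:.2f}, X95 = {p95:.4f}, "
      f"exceedance fraction = {exceed:.2%}")
```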
NASA Astrophysics Data System (ADS)
Rich, Grayson Currie
The COHERENT Collaboration has produced the first-ever observation, with a significance of 6.7σ, of a process consistent with coherent, elastic neutrino-nucleus scattering (CEνNS) as first predicted and described by D.Z. Freedman in 1974. The physics of the CEνNS process is presented along with its relationship to future measurements in the arenas of nuclear physics, fundamental particle physics, and astroparticle physics, where the newly observed interaction presents a viable tool for investigations into numerous outstanding questions about the nature of the universe. To enable the CEνNS observation with a 14.6-kg CsI[Na] detector, new measurements of the response of CsI[Na] to low-energy nuclear recoils, the only mechanism by which CEνNS is detectable, were carried out at Triangle Universities Nuclear Laboratory; these measurements are detailed, and an effective nuclear-recoil quenching factor of (8.78 ± 1.66)% is established for CsI[Na] in the recoil-energy range of 5-30 keV, based on new and literature data. Following separate analyses of the CEνNS-search data by groups at the University of Chicago and the Moscow Engineering and Physics Institute, information from simulations, calculations, and ancillary measurements was used to inform statistical analyses of the collected data. Based on input from the Chicago analysis, the number of CEνNS events expected from the Standard Model is 173 ± 48; interpretation as a simple counting experiment finds 136 ± 31 CEνNS counts in the data, while a two-dimensional profile likelihood fit yields 134 ± 22 CEνNS counts. Details of the simulations, calculations, and supporting measurements are discussed, in addition to the statistical procedures. Finally, potential improvements to the CsI[Na]-based CEνNS measurement are presented along with future possibilities for the COHERENT Collaboration, including new CEνNS detectors and measurement of the neutrino-induced neutron spallation process.
Interrelationship between rheumatoid arthritis and periodontitis.
Rajkarnikar, J; Thomas, B S; Rao, S K
2013-01-01
Periodontal medicine is a rapidly emerging branch of periodontology focusing on the strong relationship between periodontal health and systemic health. It is speculated that a common dysregulation of the immune inflammatory response links periodontitis with rheumatoid arthritis (RA). The aim was to determine whether there is any relationship between periodontal disease and rheumatoid arthritis. A total of 100 patients were included in the present study and divided into two groups: one group (cases) included 50 patients attending the Department of Orthopedics, Kasturba Medical College, Manipal, who were diagnosed with rheumatoid arthritis. The control group included 50 patients attending the Department of Oral Medicine, Manipal College of Dental Sciences, Manipal, age- and gender-matched with the rheumatoid arthritis group. Specific measures for periodontitis included plaque index, gingival index, number of missing teeth, and radiographic alveolar bone loss scores. Measures of rheumatoid arthritis included health assessment questionnaires and levels of C-reactive protein (CRP) and erythrocyte sedimentation rate (ESR). Various periodontal parameters were compared between the cases and controls. The average alveolar bone loss was statistically more severe in the rheumatoid arthritis (RA) group than in the controls, although plaque indices were similar in both groups. The gingival index was statistically higher in the RA group. The erythrocyte sedimentation rate (ESR) and C-reactive protein (CRP) levels of RA patients were also significantly associated with the severity of periodontal disease. There was a significant association between rheumatoid arthritis and periodontitis, which may be due to a common underlying dysregulation of the inflammatory response in these individuals.
Granato, Gregory E.
2014-01-01
The U.S. Geological Survey (USGS) developed the Stochastic Empirical Loading and Dilution Model (SELDM) in cooperation with the Federal Highway Administration (FHWA) to indicate the risk for stormwater concentrations, flows, and loads to be above user-selected water-quality goals and the potential effectiveness of mitigation measures to reduce such risks. SELDM models the potential effect of mitigation measures by using Monte Carlo methods with statistics that approximate the net effects of structural and nonstructural best management practices (BMPs). In this report, structural BMPs are defined as the components of the drainage pathway between the source of runoff and a stormwater discharge location that affect the volume, timing, or quality of runoff. SELDM uses a simple stochastic statistical model of BMP performance to develop planning-level estimates of runoff-event characteristics. This statistical approach can be used to represent a single BMP or an assemblage of BMPs. The SELDM BMP-treatment module has provisions for stochastic modeling of three stormwater treatments: volume reduction, hydrograph extension, and water-quality treatment. In SELDM, these three treatment variables are modeled by using the trapezoidal distribution and the rank correlation with the associated highway-runoff variables. This report describes methods for calculating the trapezoidal-distribution statistics and rank correlation coefficients for stochastic modeling of volume reduction, hydrograph extension, and water-quality treatment by structural stormwater BMPs and provides the calculated values for these variables. This report also provides robust methods for estimating the minimum irreducible concentration (MIC), which is the lowest expected effluent concentration from a particular BMP site or a class of BMPs. These statistics are different from the statistics commonly used to characterize or compare BMPs. They are designed to provide a stochastic transfer function to approximate the quantity, duration, and quality of BMP effluent given the associated inflow values for a population of storm events. A database application and several spreadsheet tools are included in the digital media accompanying this report for further documentation of methods and for future use. In this study, analyses were done with data extracted from a modified copy of the January 2012 version of International Stormwater Best Management Practices Database, designated herein as the January 2012a version. Statistics for volume reduction, hydrograph extension, and water-quality treatment were developed with selected data. Sufficient data were available to estimate statistics for 5 to 10 BMP categories by using data from 40 to more than 165 monitoring sites. Water-quality treatment statistics were developed for 13 runoff-quality constituents commonly measured in highway and urban runoff studies including turbidity, sediment and solids; nutrients; total metals; organic carbon; and fecal coliforms. The medians of the best-fit statistics for each category were selected to construct generalized cumulative distribution functions for the three treatment variables. For volume reduction and hydrograph extension, interpretation of available data indicates that selection of a Spearman’s rho value that is the average of the median and maximum values for the BMP category may help generate realistic simulation results in SELDM. The median rho value may be selected to help generate realistic simulation results for water-quality treatment variables. 
MIC statistics were developed for 12 runoff-quality constituents commonly measured in highway and urban runoff studies by using data from 11 BMP categories and more than 167 monitoring sites. Four statistical techniques were applied for estimating MIC values with monitoring data from each site. These techniques produce a range of lower-bound estimates for each site. Four MIC estimators are proposed as alternatives for selecting a value from among the estimates from multiple sites. Correlation analysis indicates that the MIC estimates from multiple sites were weakly correlated with the geometric mean of inflow values, which indicates that there may be a qualitative or semiquantitative link between the inflow quality and the MIC. Correlations probably are weak because the MIC is influenced by the inflow water quality and the capability of each individual BMP site to reduce inflow concentrations.
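A sketch of the stochastic transfer-function idea described above: a treatment variable drawn from a trapezoidal distribution by inverse-CDF sampling and tied to inflow through a Gaussian copula so the pair has (approximately) a target rank correlation. All parameter values are illustrative, not SELDM's.

```python
# SELDM-style stochastic treatment variable: trapezoidal distribution
# sampled by inverse CDF, rank-correlated to inflow via a Gaussian copula.
import numpy as np
from scipy.stats import norm, spearmanr

def trapezoid_ppf(u, a, b, c, d):
    """Inverse CDF of a trapezoidal distribution with corners a<=b<=c<=d."""
    h = 2.0 / (d + c - a - b)        # plateau height of the pdf
    fb = h * (b - a) / 2.0           # CDF at b
    fc = fb + h * (c - b)            # CDF at c
    x = np.empty_like(u)
    lo, mid, hi = u < fb, (u >= fb) & (u < fc), u >= fc
    x[lo] = a + np.sqrt(2.0 * u[lo] * (b - a) / h)
    x[mid] = b + (u[mid] - fb) / h
    x[hi] = d - np.sqrt(2.0 * (1.0 - u[hi]) * (d - c) / h)
    return x

rng = np.random.default_rng(6)
n, rho = 10_000, 0.6                 # target (approximate) rank correlation
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

inflow = np.exp(norm.cdf(z1) * 2.0)                         # placeholder inflow volumes
retained = trapezoid_ppf(norm.cdf(z2), 0.0, 0.2, 0.5, 0.9)  # fraction retained
print("Spearman rho:", round(spearmanr(inflow, retained)[0], 3))
```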
Single-molecule conductance studies of photo-active and photochromic molecules
NASA Astrophysics Data System (ADS)
Tam, E. S.; Parks, J. J.; Santiago-Berrios, M. B.; Zhong, Y.-W.; Abruna, H. D.; Ralph, D. C.
2010-03-01
We perform statistical measurements of single molecule conductance in repeatedly-formed metal-molecule-metal junctions at room temperature. Our results on diaminoalkanes are consistent with those reported by the Venkataraman group. We focus on photo-active and photochromic molecules, including a series of transition-metal complexes with different metal centers and endgroups. We compare the trend in conductance across the family of complexes with that expected from electrochemical measurements. We will also report initial results on the voltage dependence of single-molecule conductances and the effects of optical excitations.
Land mobile satellite propagation measurements in Japan using ETS-V satellite
NASA Technical Reports Server (NTRS)
Obara, Noriaki; Tanaka, Kenji; Yamamoto, Shin-Ichi; Wakana, Hiromitsu
1993-01-01
Propagation characteristics of land mobile satellite communications channels have been investigated actively in recent years. Information on propagation characteristics associated with multipath fading and shadowing is required to design commercial land mobile satellite communications systems, including the protocol and error correction method. CRL (Communications Research Laboratory) has carried out propagation measurements using the Engineering Test Satellite-V (ETS-V) at L band (1.5 GHz) along main roads in Japan with a medium-gain antenna having an autotracking capability. This paper presents the propagation statistics obtained in this campaign.
NASA Astrophysics Data System (ADS)
Faigon, A.; Martinez Vazquez, I.; Carbonetto, S.; García Inza, M.; G
2017-01-01
A floating-gate dosimeter was designed and fabricated in a standard CMOS technology. The design guidelines and characterization are presented. The characterization included controlled charging of the floating gate by tunneling, and its discharge under irradiation while the transistor drain current, whose change is the measure of the absorbed dose, was monitored. The resolution of the device is close to 1 cGy, satisfying the requirements of most radiation therapy dosimetry. Pending statistical validation, the dosimeter is a potential candidate for widespread in-vivo monitoring of radiotherapy treatments.
NASA Technical Reports Server (NTRS)
Edwards, S. F.; Kantsios, A. G.; Voros, J. P.; Stewart, W. F.
1975-01-01
The development of a radiometric technique for determining the spectral and total normal emittance of materials heated to temperatures of 800, 1100, and 1300 K by direct comparison with National Bureau of Standards (NBS) reference specimens is discussed. Emittances are measured over the spectral range of 1 to 15 microns and are statistically compared with NBS reference specimens. Results are included for NBS reference specimens, Rene 41, alundum, zirconia, AISI type 321 stainless steel, nickel 201, and a space-shuttle reusable surface insulation.
Eisner, Brian H; Kambadakone, Avinash; Monga, Manoj; Anderson, James K; Thoreson, Andrew A; Lee, Hang; Dretler, Stephen P; Sahani, Dushyant V
2009-04-01
We determined the most accurate method of measuring urinary stones on computerized tomography. For the in vitro portion of the study 24 calculi, including 12 calcium oxalate monohydrate and 12 uric acid stones, that had been previously collected at our clinic were measured manually with hand calipers as the gold standard measurement. The calculi were then embedded into human kidney-sized potatoes and scanned using 64-slice multidetector computerized tomography. Computerized tomography measurements were performed at 4 window settings, including standard soft tissue windows (window width 320 and window level 50), standard bone windows (window width 1120 and window level 300), 5.13x magnified soft tissue windows and 5.13x magnified bone windows. Maximum stone dimensions were recorded. For the in vivo portion of the study 41 patients with distal ureteral stones who underwent noncontrast computerized tomography and subsequently spontaneously passed the stones were analyzed. All analyzed stones were 100% calcium oxalate monohydrate or mixed, calcium-based stones. Stones were prospectively collected at the clinic and the largest diameter was measured with digital calipers as the gold standard. This was compared to computerized tomography measurements using 4.0x magnified soft tissue windows and 4.0x magnified bone windows. Statistical comparisons were performed using Pearson's correlation and the paired t test. In the in vitro portion of the study the most accurate measurements were obtained using 5.13x magnified bone windows, with a mean 0.13 mm difference from caliper measurement (p = 0.6). Measurements performed in the soft tissue window with and without magnification, and in the bone window without magnification, were significantly different from hand caliper measurements (mean difference 1.2, 1.9 and 1.4 mm; p = 0.003, <0.001 and 0.0002, respectively). When comparing measurement errors between stones of different composition in vitro, the error for calcium oxalate calculi was significantly different from the gold standard for all methods except bone window settings with magnification. For uric acid calculi measurement error was observed only in standard soft tissue window settings. In vivo, 4.0x magnified bone windows were superior to 4.0x magnified soft tissue windows in measurement accuracy. Magnified bone window measurements were not statistically different from digital caliper measurements (mean underestimation vs digital caliper 0.3 mm, p = 0.4), while magnified soft tissue window measurements were (mean underestimation 1.4 mm, p = 0.001). In this study magnified bone windows were the most accurate method of stone measurement in vitro and in vivo. We therefore recommend the routine use of magnified bone windows for computerized tomography measurement of stones. In vitro, the measurement error for calcium oxalate stones was greater than that for uric acid stones, suggesting that stone composition may be responsible for measurement inaccuracies.
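A minimal sketch of the statistical comparison described, a paired t test of CT window measurements against the caliper gold standard with Pearson's correlation, using invented measurements:

```python
# Paired t test of CT stone measurements against caliper gold-standard
# sizes, plus Pearson correlation. All measurements are invented.
import numpy as np
from scipy.stats import ttest_rel, pearsonr

rng = np.random.default_rng(7)
caliper = rng.uniform(3.0, 8.0, 41)             # mm, gold standard
bone_win = caliper + rng.normal(-0.3, 0.8, 41)  # magnified bone window
soft_win = caliper + rng.normal(-1.4, 0.8, 41)  # magnified soft tissue window

for name, ct in [("bone window", bone_win), ("soft tissue window", soft_win)]:
    t, p = ttest_rel(ct, caliper)
    r, _ = pearsonr(ct, caliper)
    print(f"{name}: mean diff = {np.mean(ct - caliper):+.2f} mm, "
          f"p = {p:.3f}, r = {r:.2f}")
```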
Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring.
Zhang, Yuhai; Shang, Lei; Wang, Rui; Zhao, Qinbo; Li, Chanjuan; Xu, Yongyong; Su, Haixia
2012-11-23
In medical training, statistics is considered a very difficult course to learn and teach. Studies have found that students' attitudes toward statistics can influence their learning process, so measuring, evaluating and monitoring changes in these attitudes is important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics-28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistics or mathematics class. Age, level of statistical education, research experience, specialty and mathematics background may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To ensure all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor changes during a course. Necessary assistance should be offered to those students who develop negative attitudes.
Thompson, Ronald E.; Hoffman, Scott A.
2006-01-01
A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations to concurrent daily mean flows at continuous-record stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The prediction methodology used to develop the regression equations was originally devised for estimating low-flow frequencies; this study and a companion study found that it also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and the equations used to predict them. Caution is indicated in using the predicted statistics for small drainage areas. Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
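A hedged sketch of the regression approach described: relating a flow statistic to GIS-derived basin characteristics by ordinary least squares in log space, then predicting at an ungaged site. The predictors, coefficients, and data are invented stand-ins.

```python
# Regional regression of a flow statistic on basin characteristics
# (log-log OLS), then prediction at an ungaged site. Data are invented.
import numpy as np

rng = np.random.default_rng(8)
n = 17                                  # continuous-record stations
area = rng.uniform(5.0, 200.0, n)       # drainage area (mi^2)
slope = rng.uniform(1.0, 15.0, n)       # mean basin slope (%)
q50 = 0.8 * area * slope**0.2 * np.exp(rng.normal(0.0, 0.1, n))  # synthetic flow statistic

# OLS in log space: log(Q) = b0 + b1*log(area) + b2*log(slope)
X = np.column_stack([np.ones(n), np.log(area), np.log(slope)])
b, *_ = np.linalg.lstsq(X, np.log(q50), rcond=None)

x_new = np.array([1.0, np.log(42.0), np.log(6.5)])  # hypothetical ungaged basin
print("predicted flow statistic:", round(float(np.exp(x_new @ b)), 1))
```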
High cumulants of conserved charges and their statistical uncertainties
NASA Astrophysics Data System (ADS)
Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu
2017-10-01
We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with the corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted average of the cumulants therefore depends on the available statistics. Despite this effect, however, it is found that the three-sigma rule of thumb is still applicable when the statistics exceed one million events. Supported by NSFC (11405088, 11521064, 11647093), the Major State Basic Research Development Program of China (2014CB845402) and the Ministry of Science and Technology (MoST) (2016YFE0104800).
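A sketch of how the statistical uncertainty on a high cumulant is commonly estimated, here a bootstrap error on the fourth cumulant (C4) of event-by-event net-charge-like numbers. The Skellam stand-in distribution and sample size are illustrative; for it, C4 equals the sum of the two Poisson means.

```python
# Bootstrap uncertainty on the 4th cumulant (C4) of an event-by-event
# net-charge distribution. Skellam stand-in data; C4_true = 10 + 8 = 18.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
events = rng.poisson(10, 200_000) - rng.poisson(8, 200_000)  # net-charge stand-in

def c4(x):
    return stats.kstat(x, 4)  # unbiased k-statistic estimator of the 4th cumulant

boot = np.array([c4(rng.choice(events, events.size)) for _ in range(50)])
print(f"C4 = {c4(events):.2f} +/- {boot.std(ddof=1):.2f} (bootstrap)")
```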
A Statistical Framework for the Functional Analysis of Metagenomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharon, Itai; Pati, Amrita; Markowitz, Victor
2008-10-01
Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. They present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations which can be used for removing seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.
Patton, Charles J.; Gilroy, Edward J.
1999-01-01
Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.
NASA Astrophysics Data System (ADS)
Adams, W. K.; Perkins, K. K.; Podolefsky, N. S.; Dubson, M.; Finkelstein, N. D.; Wieman, C. E.
2006-06-01
The Colorado Learning Attitudes about Science Survey (CLASS) is a new instrument designed to measure student beliefs about physics and about learning physics. This instrument extends previous work by probing additional aspects of student beliefs and by using wording suitable for students in a wide variety of physics courses. The CLASS has been validated using interviews, reliability studies, and extensive statistical analyses of responses from over 5000 students. In addition, a new methodology for determining useful and statistically robust categories of student beliefs has been developed. This paper serves as the foundation for an extensive study of how student beliefs impact and are impacted by their educational experiences. For example, this survey measures the following: that most teaching practices cause substantial drops in student scores; that a student’s likelihood of becoming a physics major correlates with their “Personal Interest” score; and that, for a majority of student populations, women’s scores in some categories, including “Personal Interest” and “Real World Connections,” are significantly different from men’s scores.
Nutritional and food insecurity of construction workers.
de Lima Brasil, Evi Clayton; de Araújo, Lindemberg Medeiros; de Toledo Vianna, Rodrigo Pinheiro
2016-06-27
Construction workers have intensive contact with their workplace and are possibly susceptible to Nutritional and Food Insecurity. This paper assessed the Food Security status, diet and anthropometric measures of workers in the Construction Industry living in the city of João Pessoa, PB. This cross-sectional study included 59 workers housed at construction sites. The workers were given the Brazilian Scale for Measuring Food Insecurity and Nutrition, had anthropometric measures taken and completed the Diet Quality Index, comparing their eating at the construction site and at home. Statistical analyses described the mean, standard deviation, frequency and Pearson correlations. Food Insecurity was reported by 71.2% of the workers, and 69.5% were overweight. The mean values of the Healthy Eating Index suggested that the workers' diets were in need of modification. There were statistically significant inverse associations among the Healthy Eating Index and Body Mass Index, waist circumference, percentage of total fat and cholesterol. Values obtained using the Scale showed Food Insecurity coupled with high excess weight and dietary inadequacies, revealing that these workers are at risk for health problems.
Accuracy of Physical Self-Description Among Chronic Exercisers and Non-Exercisers.
Berning, Joseph M; DeBeliso, Mark; Sevene, Trish G; Adams, Kent J; Salmon, Paul; Stamford, Bryant A
2014-11-06
This study addressed the role of chronic exercise in enhancing physical self-description as measured by self-estimated percent body fat. Accuracy of physical self-description was determined in normal-weight, regularly exercising and non-exercising males with similar body mass indices (BMIs), and females with similar BMIs (n = 42 males and 45 females, of whom 23 males and 23 females met criteria to be considered chronic exercisers). Statistical analyses were conducted to determine the degree of agreement between self-estimated percent body fat and actual laboratory measurements (hydrostatic weighing). Three statistical techniques were employed: Pearson correlation coefficients, Bland-Altman plots, and regression analysis. Agreement between measured and self-estimated percent body fat was superior for males and females who exercised chronically, compared to non-exercisers. The clinical implications are as follows. Satisfaction with one's body can be influenced by several factors, including self-perceived body composition. Dissatisfaction can contribute to maladaptive and destructive weight management behaviors. The present study suggests that regular exercise provides a basis for more positive weight management behaviors by enhancing the accuracy of self-assessed body composition.
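A sketch of the agreement analysis named above, Bland-Altman bias and 95% limits of agreement between self-estimated and hydrostatically measured percent body fat, using invented data:

```python
# Bland-Altman agreement between self-estimated and measured %body fat.
# Data are invented placeholders, not the study's measurements.
import numpy as np

rng = np.random.default_rng(10)
measured = rng.uniform(10.0, 30.0, 46)          # laboratory %fat
self_est = measured + rng.normal(1.0, 3.0, 46)  # self-estimates: biased, noisy

diff = self_est - measured
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:+.2f} %fat, 95% limits of agreement = "
      f"[{bias - half_width:.2f}, {bias + half_width:.2f}]")
```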