Zhang, Ying; Sun, Jin; Zhang, Yun-Jiao; Chai, Qian-Yun; Zhang, Kang; Ma, Hong-Li; Wu, Xiao-Ke; Liu, Jian-Ping
2016-10-21
Although Traditional Chinese Medicine (TCM) has been widely used in clinical settings, a major remaining challenge is to evaluate its efficacy scientifically. This randomized controlled trial aims to evaluate the efficacy and safety of berberine in the treatment of patients with polycystic ovary syndrome. To improve the transparency and research quality of this clinical trial, we prepared this statistical analysis plan (SAP). The trial design, primary and secondary outcomes, and safety outcomes were declared to reduce selection biases in data analysis and result reporting. We specified detailed methods for data management and statistical analyses. Statistics in corresponding tables, listings, and graphs were outlined. The SAP provides more detailed information than the trial protocol on data management and statistical analysis methods. Any post hoc analyses can be identified by referring to this SAP, and possible selection bias and performance bias will be reduced in the trial. This study is registered at ClinicalTrials.gov, NCT01138930, registered on 7 June 2010.
75 FR 72611 - Assessments, Large Bank Pricing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-24
... the worst risk ranking and are included in the statistical analysis. Appendix 1 to the NPR describes the statistical analysis in detail. [12] The percentage approximated by factors is based on the statistical model for that particular year. Actual weights assigned to each scorecard measure are largely based...
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A
2015-02-22
The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. ISRCTN70923932.
Nebraska's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett
2011-01-01
The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Tables of various important resource statistics are presented. A detailed analysis of the inventory data is...
Kansas's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; W. Keith Moser; Charles J. Barnett
2011-01-01
The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...
Advances in Statistical Methods for Substance Abuse Prevention Research
MacKinnon, David P.; Lockwood, Chondra M.
2010-01-01
The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
North Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; David E. Haugen; Charles J. Barnett
2011-01-01
The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...
South Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Ronald J. Piva; Charles J. Barnett
2011-01-01
The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...
Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems
NASA Technical Reports Server (NTRS)
He, Yuning; Davies, Misty Dawn
2014-01-01
The analysis of a safety-critical system often requires detailed knowledge of safe regions and their highdimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.
Multiple comparison analysis testing in ANOVA.
McHugh, Mary L
2011-01-01
The Analysis of Variance (ANOVA) test has long been an important tool for researchers conducting studies on multiple experimental groups and one or more control groups. However, ANOVA cannot provide detailed information on differences among the various study groups, or on complex combinations of study groups. To fully understand group differences in an ANOVA, researchers must conduct tests of the differences between particular pairs of experimental and control groups. Tests conducted on subsets of data tested previously in another analysis are called post hoc tests. Post hoc tests that provide this type of detailed information for ANOVA results are called "multiple comparison analysis" tests. The most commonly used multiple comparison analysis statistics include the following tests: Tukey, Newman-Keuls, Scheffé, Bonferroni and Dunnett. These statistical tools each have specific uses, advantages and disadvantages. Some are best used for testing theory while others are useful in generating new theory. Selection of the appropriate post hoc test will provide researchers with the most detailed information while limiting Type 1 errors due to alpha inflation.
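As a minimal illustration of the omnibus-then-post-hoc workflow this abstract describes, the sketch below runs a one-way ANOVA followed by Tukey's HSD on three made-up groups; it assumes SciPy 1.8+ for `scipy.stats.tukey_hsd`.

```python
# Sketch: one-way ANOVA followed by Tukey's HSD post hoc test.
# The group data are illustrative, invented samples.
from scipy import stats

group_a = [24.1, 25.3, 26.0, 24.8, 25.5]
group_b = [27.2, 28.1, 27.8, 28.5, 27.0]
group_c = [24.5, 25.0, 24.2, 25.8, 24.9]

# Omnibus test: is at least one group mean different?
f_stat, p_omnibus = stats.f_oneway(group_a, group_b, group_c)

# Post hoc: which pairs differ? Tukey's HSD controls the family-wise
# Type 1 error rate across all pairwise comparisons.
tukey = stats.tukey_hsd(group_a, group_b, group_c)
p_a_vs_b = tukey.pvalue[0][1]  # pairwise p-value for groups A and B
p_a_vs_c = tukey.pvalue[0][2]  # pairwise p-value for groups A and C
```

Here the omnibus test flags a difference, and the post hoc matrix localises it to the pairs involving group B, which is exactly the extra detail the abstract says ANOVA alone cannot give.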
Certification of medical librarians, 1949--1977 statistical analysis.
Schmidt, D
1979-01-01
The Medical Library Association's Code for Training and Certification of Medical Librarians was in effect from 1949 to August 1977, a period during which 3,216 individuals were certified. Statistics on each type of certificate granted each year are provided. Because 54.5% of those granted certification were awarded it in the last three-year, two-month period of the code's existence, these applications are reviewed in greater detail. Statistics on MLA membership, sex, residence, library school, and method of meeting requirements are detailed. Questions relating to certification under the code now in existence are raised. PMID:427287
2015-09-01
then examines the correlation between violence and police militarization. A statistical analysis of crime data found an inverse relationship between levels of reported violence and...events. The research began with a detailed statistical
CADDIS Volume 4. Data Analysis: Basic Analyses
Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.
The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.
Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve
2013-09-01
The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. This will minimise analytical bias and conforms to current best practice in conducting clinical trials.
A note on generalized Genome Scan Meta-Analysis statistics
Koziol, James A; Feng, Anne C
2005-01-01
Background: Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version of the GSMA statistic, so that different studies could be ascribed different weights for analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results: We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, and we examine the limiting distribution of the GSMA statistics under the order statistic formulation, quantifying the relevance of the pairwise correlations of the GSMA statistics across different bins on this limiting distribution. We also remark on aggregate criteria and multiple testing for determining significance of GSMA results. Conclusion: Theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930
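As a rough sketch of the rank-based GSMA idea, and not the authors' exact formulation, the snippet below sums per-study bin ranks with optional study weights and assesses one bin's significance by within-study permutation. All data are simulated, and `numpy.random.Generator.permuted` requires NumPy 1.20+.

```python
# Sketch of a weighted rank-sum GSMA-style statistic on simulated scans.
import numpy as np

rng = np.random.default_rng(0)
n_studies, n_bins = 5, 120

# Hypothetical per-study evidence scores for each genome bin,
# with a shared signal planted in bin 7
scores = rng.normal(size=(n_studies, n_bins))
scores[:, 7] += 2.0

# Rank bins within each study (highest evidence -> highest rank)
ranks = scores.argsort(axis=1).argsort(axis=1) + 1

weights = np.ones(n_studies)  # equal weights; unequal weights also allowed
gsma = (weights[:, None] * ranks).sum(axis=0)

# Null distribution by permuting ranks independently within each study
perm = np.array([
    (weights[:, None] * rng.permuted(ranks, axis=1)).sum(axis=0)
    for _ in range(2000)
])
p_bin7 = (perm[:, 7] >= gsma[7]).mean()
```

The planted signal pushes bin 7 near the top rank in every study, so its summed rank lands far out in the permutation null, which is the behaviour the GSMA statistic is designed to detect.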
NASA Technical Reports Server (NTRS)
1990-01-01
Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.
Role of microstructure on twin nucleation and growth in HCP titanium: A statistical study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arul Kumar, M.; Wroński, M.; McCabe, Rodney James
2018-02-01
In this study, a detailed statistical analysis is performed using Electron Back Scatter Diffraction (EBSD) to establish the effect of microstructure on twin nucleation and growth in deformed commercial purity hexagonal close packed (HCP) titanium. Rolled titanium samples are compressed along rolling, transverse and normal directions to establish statistical correlations for {10–12}, {11–21}, and {11–22} twins. A recently developed automated EBSD-twinning analysis software is employed for the statistical analysis. Finally, the analysis provides the following key findings: (I) grain size and strain dependence is different for twin nucleation and growth; (II) twinning statistics can be generalized for the HCP metals magnesium, zirconium and titanium; and (III) complex microstructure, where grain shape and size distribution is heterogeneous, requires multi-point statistical correlations.
Revisiting photon-statistics effects on multiphoton ionization
NASA Astrophysics Data System (ADS)
Mouloudakis, G.; Lambropoulos, P.
2018-05-01
We present a detailed analysis of the effects of photon statistics on multiphoton ionization. Through a detailed study of the role of intermediate states, we evaluate the conditions under which the premise of nonresonant processes is valid. The limitations of its validity are manifested in the dependence of the process on the stochastic properties of the radiation and found to be quite sensitive to the intensity. The results are quantified through detailed calculations for coherent, chaotic, and squeezed vacuum radiation. Their significance in the context of recent developments in radiation sources such as the short-wavelength free-electron laser and squeezed vacuum radiation is also discussed.
Some Tests of Randomness with Applications
1981-02-01
freedom. For further details, the reader is referred to Gnanadesikan (1977, p. 169), wherein other relevant tests are also given. Graphical tests, as...sample from a gamma distribution. J. Am. Statist. Assoc. 71, 480-7. Gnanadesikan, R. (1977). Methods for Statistical Data Analysis of Multivariate
Yu, Xiaojin; Liu, Pei; Min, Jie; Chen, Qiguang
2009-01-01
To explore the application of regression on order statistics (ROS) in estimating nondetects for food exposure assessment. Regression on order statistics was adopted in the analysis of a cadmium residual data set from global food contaminant monitoring; the mean residual was estimated using SAS programming and compared with the results from substitution methods. The results show that the ROS method clearly outperforms substitution methods, being robust and convenient for posterior analysis. Regression on order statistics is worth adopting, but more effort should be devoted to the details of applying this method.
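A bare-bones version of the ROS idea can be sketched as follows, assuming a lognormal distribution and a single detection limit below all detected values; the numbers are invented, and the abstract's own analysis used SAS rather than Python.

```python
# Sketch of regression on order statistics (ROS) for left-censored
# (nondetect) data, assuming a lognormal distribution.
import numpy as np
from scipy import stats

detects = np.array([0.8, 1.2, 1.9, 2.5, 3.1, 4.4])   # measured values
n_nondetect = 4                                       # below detection limit

n = len(detects) + n_nondetect
# Plotting positions for the detected (largest) order statistics
order = np.arange(n - len(detects) + 1, n + 1)
pp = (order - 0.375) / (n + 0.25)          # Blom plotting positions
z = stats.norm.ppf(pp)                     # normal quantiles

# Regress log(detects) on the normal quantiles
slope, intercept, *_ = stats.linregress(z, np.log(detects))

# Impute nondetects from the fitted line at the censored positions
z_cens = stats.norm.ppf((np.arange(1, n_nondetect + 1) - 0.375) / (n + 0.25))
imputed = np.exp(intercept + slope * z_cens)

# Mean estimate combining imputed nondetects with observed values
mean_est = np.concatenate([imputed, detects]).mean()
```

Unlike substitution (e.g. replacing every nondetect with half the detection limit), the imputed values follow the fitted distribution's lower tail, which is what makes ROS robust for the posterior analysis the abstract mentions.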
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad Allen
EDENx is a multivariate data visualization tool that allows interactive user driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks while EDEN is more appropriate for detailed data investigations.
Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.
Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve
2013-10-01
The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into consideration aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document is prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team as specified in the study protocol, and detailed in the study case report form, were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with description of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events, is described. The primary, secondary and tertiary outcomes are described along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the study. This will minimise analytic bias and conforms to current best practice in conducting clinical trials.
ERIC Educational Resources Information Center
Robison, M. Henry; Christophersen, Kjell A.
2008-01-01
The purpose of this volume is to present the results of the economic impact analysis in detail by gender and entry level of education. On the data entry side, gender and entry level of education are important variables that help characterize the student body profile. This profile data links to national statistical databases which are already…
[Artificial neural networks for decision making in urologic oncology].
Remzi, M; Djavan, B
2007-06-01
This chapter presents a detailed introduction to Artificial Neural Networks (ANNs) and their contribution to modern Urologic Oncology. It includes a description of ANN methodology, points out the differences between artificial intelligence and traditional statistical models in terms of usefulness for patients and clinicians, and discusses the advantages of ANNs over current statistical analysis.
Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L
2012-04-25
The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
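The binomial confidence intervals quoted in this survey (e.g. 17% with a 95% CI of 10-26%, from a sample of 100 papers) can be illustrated with an exact Clopper-Pearson interval. This is a sketch of one standard method, not necessarily the one the authors used, and the helper function name is our own.

```python
# Exact (Clopper-Pearson) binomial confidence interval via the
# beta-distribution quantile relationship.
from scipy import stats

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided CI for a proportion of k successes in n trials."""
    lo = stats.beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = stats.beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# 17 of 100 studies: conclusions not clearly justified by the results
lo, hi = clopper_pearson(17, 100)
```

The resulting interval is consistent with the reported 10-26% once rounded to whole percentages, and the same function reproduces the interval for the 39% finding.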
Human Deception Detection from Whole Body Motion Analysis
2015-12-01
9.3.2. Prediction Probability The output reports from SPSS detail the stepwise procedures for each series of analyses using Wald statistic values for...statistical significance in determining replication, but instead used a combination of significance and direction of means to determine partial or...and the independents need not be unbound. All data were analyzed utilizing the Statistical Package for Social Sciences (SPSS, v.19.0, Chicago, IL
Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance
Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield
2013-01-01
The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...
Automatically inserted technical details improve radiology report accuracy.
Abujudeh, Hani H; Govindan, Siddharth; Narin, Ozden; Johnson, Jamlik Omari; Thrall, James H; Rosenthal, Daniel I
2011-09-01
To assess the effect of automatically inserted technical details on the concordance of a radiology report header with the actual procedure performed. The study was IRB approved and informed consent was waived. We obtained radiology report audit data from the hospital's compliance office from the period of January 2005 through December 2009, spanning a total of 20 financial quarters. A "discordance percentage" was defined as the percentage of total studies in which a procedure code change was made during auditing. Using chi-square analysis, we compared discordance percentages between reports with manually inserted technical details (MITD) and automatically inserted technical details (AITD). The second-quarter data of 2007 were not included in the analysis, as the switch from MITD to AITD occurred during this quarter. The hospital's compliance office audited 9,110 studies from 2005-2009. Excluding the 564 studies in the second quarter of 2007, we analyzed a total of 8,546 studies, 3,948 with MITD and 4,598 with AITD. The discordance percentage in the MITD group was 3.95% (156/3,948; range per quarter, 1.5-6.1%). The AITD discordance percentage was 1.37% (63/4,598; range per quarter, 0.0-2.6%). A chi-square analysis determined a statistically significant difference between the 2 groups (P < 0.001). There was a statistically significant improvement in the concordance of a radiology report header with the performed procedure using automatically inserted technical details compared to manually inserted details.
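The group comparison in this abstract can be re-derived from the reported counts with a standard chi-square test on a 2x2 table; a sketch using SciPy:

```python
# Chi-square test of independence on the discordant/concordant counts
# reported in the abstract (156/3,948 MITD vs 63/4,598 AITD).
from scipy.stats import chi2_contingency

table = [[156, 3948 - 156],   # manually inserted technical details
         [63, 4598 - 63]]     # automatically inserted technical details

chi2, p, dof, expected = chi2_contingency(table)
```

With samples this large and proportions of 3.95% versus 1.37%, the test yields P < 0.001, matching the significance level the abstract reports.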
Detailed Uncertainty Analysis of the ZEM-3 Measurement System
NASA Technical Reports Server (NTRS)
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
The measurement of Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It therefore stands to reason that the measurement uncertainty must be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multimeter uncertainty. The uncertainty on the Seebeck coefficient includes probe wire correction factors, statistical error, multimeter uncertainty, and, most importantly, the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through the thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon and provides an estimate of the uncertainty on the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of +9/−14% at high temperature and ±9% near room temperature.
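The geometry and instrument contributions to resistivity error described above combine in quadrature under the usual assumption of independent uncertainties. A minimal sketch, with invented tolerances rather than the actual ZEM-3 values:

```python
# Root-sum-square propagation of independent relative uncertainties
# for a four-probe resistivity measurement, rho = R * w * t / L.
# All values below are illustrative, not the ZEM-3 specifications.
import math

R, u_R = 1.5e-3, 1.0e-5    # resistance (ohm) and its uncertainty
w, u_w = 2.0e-3, 2.0e-5    # sample width (m)
t, u_t = 2.0e-3, 2.0e-5    # sample thickness (m)
L, u_L = 8.0e-3, 1.0e-4    # probe separation (m)

rho = R * w * t / L        # resistivity (ohm*m)

# Combine independent relative uncertainties in quadrature
rel = math.sqrt((u_R / R) ** 2 + (u_w / w) ** 2
                + (u_t / t) ** 2 + (u_L / L) ** 2)
u_rho = rho * rel
```

Because each factor enters the formula to the first power, its relative uncertainty enters the quadrature sum directly; the cold-finger effect, by contrast, is a systematic bias and cannot be folded in this way.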
Statistical analysis of RHIC beam position monitors performance
NASA Astrophysics Data System (ADS)
Calaga, R.; Tomás, R.
2004-04-01
A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
ERIC Educational Resources Information Center
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
ERIC Educational Resources Information Center
Thompson, Terry J.; Gould, Karen J.
2005-01-01
In recent years the Metropolitan School District of Wayne Township in Indianapolis has been awash in data. In attempts to improve levels of student achievement, the authors collected all manner of statistical details about students and schools and attempted to perform data analysis as part of the school improvement process. The authors were never…
Simulation on a car interior aerodynamic noise control based on statistical energy analysis
NASA Astrophysics Data System (ADS)
Chen, Xin; Wang, Dengfeng; Ma, Zhengdong
2012-09-01
How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces proves to be the key factor for controlling car interior aerodynamic noise at high frequency and high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are loaded onto the model to obtain valid results for car interior noise analysis. The model is a solid foundation for further optimization of car interior noise control. After comprehensive SEA analysis identifies the subsystems whose power contributions to car interior noise are most sensitive, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing shows that the interior acoustic performance can be improved by using a detailed SEA model, comprising more than 80 subsystems, together with the calculation of unsteady aerodynamic pressure on body surfaces and improved sound/damping properties of materials: a reduction of more than 2 dB is achieved at centre frequencies above 800 Hz. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.
Macro scale models for freight railroad terminals.
DOT National Transportation Integrated Search
2016-03-02
The project has developed a yard capacity model for macro-level analysis. The study considers the detailed sequence and scheduling in classification yards and their impacts on yard capacity, simulates typical freight railroad terminals, and statistic...
Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity
NASA Astrophysics Data System (ADS)
Tanaka, Hiroki; Aizawa, Yoji
2017-02-01
The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes at different magnitude levels. It is known that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from an analysis of the conditional probability. First, we propose the Embedding Equation Theory (EET), in which the conditional probability is described by two kinds of correlation coefficients: one for the magnitude correlation and the other for the inter-event time correlation. The scaling law of each correlation coefficient is clearly determined from numerical data analysis carried out with the Preliminary Determination of Epicenters (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Second, the EET is used to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Although we cannot prove the universality of the multi-fractal relation in seismic activity theoretically, the theoretical results reproduce all numerical data in our analysis well, and several common features, or invariant aspects, are clearly observed. In particular, for stationary ensembles the multi-fractal relation appears to obey an invariant curve, and for non-stationary (moving-time) ensembles in the aftershock regime it appears to satisfy a certain invariant curve at any moving time. We emphasize that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.
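The Weibull approximation reported in this abstract can be sketched in a few lines. The catalog below is a synthetic stand-in (the actual PDE/JMA data are not reproduced here), so the fitted parameters are purely illustrative:

```python
import numpy as np
from scipy import stats

# Synthetic event times standing in for an earthquake catalog
# (hypothetical data -- not the PDE or JMA catalogs themselves).
rng = np.random.default_rng(0)
event_times = np.sort(rng.uniform(0.0, 1000.0, size=500))
intervals = np.diff(event_times)  # interoccurrence (inter-event) times

# Fit a two-parameter Weibull distribution (location fixed at zero),
# the form the abstract reports as a good first approximation.
shape, loc, scale = stats.weibull_min.fit(intervals, floc=0.0)
```

For a memoryless (Poisson-like) catalog the fitted shape parameter comes out close to 1, i.e. an exponential distribution; clustered seismicity typically yields a shape below 1.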
Crash analysis, statistics & information notebook 2008
DOT National Transportation Integrated Search
2008-01-01
Traditionally, crash data are often presented as single fact sheets highlighting a single factor such as Vehicle Type or Road Type. This document will try to show how the risk factors interrelate to produce a crash. Complete detailed analys...
Trends in total column ozone measurements
NASA Technical Reports Server (NTRS)
Rowland, F. S.; Angell, J.; Attmannspacher, W.; Bloomfield, P.; Bojkov, R. D.; Harris, N.; Komhyr, W.; Mcfarland, M.; Mcpeters, R.; Stolarski, R. S.
1989-01-01
It is important to ensure the best available data are used in any determination of possible trends in total ozone in order to have the most accurate estimates of any trends and the associated uncertainties. Accordingly, the existing total ozone records were examined in considerable detail. Once the best data set has been produced, the statistical analysis must examine the data for any effects that might indicate changes in the behavior of global total ozone. The changes at any individual measuring station could be local in nature, and herein, particular attention was paid to the seasonal and latitudinal variations of total ozone, because two dimensional photochemical models indicate that any changes in total ozone would be most pronounced at high latitudes during the winter months. The conclusions derived from this detailed examination of available total ozone can be split into two categories, one concerning the quality and the other the statistical analysis of the total ozone record.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis
Steele, Joe; Bastola, Dhundy
2014-01-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
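As a minimal illustration of the word-frequency methods this review covers, the D2 statistic between two sequences is simply the inner product of their k-mer (word) count vectors. The sequences below are made up for the example:

```python
from collections import Counter

def kmer_counts(seq, k):
    """Count overlapping k-mers (words) in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k=3):
    """D2 statistic: inner product of the two k-mer count vectors."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

# Alignment-free comparison of two short (made-up) DNA fragments.
score = d2("ACGTACGTAC", "ACGTTTACGT", k=3)  # -> 10
```

Because only word counts are compared, no alignment is computed, which is what makes such measures robust to rearrangement events like genetic shuffling.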
NASA Technical Reports Server (NTRS)
Keegan, W. B.
1974-01-01
In order to produce cost effective environmental test programs, the test specifications must be realistic and to be useful, they must be available early in the life of a program. This paper describes a method for achieving such specifications for subsystems by utilizing the results of a statistical analysis of data acquired at subsystem mounting locations during system level environmental tests. The paper describes the details of this statistical analysis. The resultant recommended levels are a function of the subsystems' mounting location in the spacecraft. Methods of determining this mounting 'zone' are described. Recommendations are then made as to which of the various problem areas encountered should be pursued further.
NASA Astrophysics Data System (ADS)
Barré, Anthony; Suard, Frédéric; Gérard, Mathias; Montaru, Maxime; Riu, Delphine
2014-01-01
This paper describes the statistical analysis of battery parameters recorded during electric-vehicle use to study battery ageing. These data permit traditional battery ageing investigation based on the evolution of capacity fade and resistance rise. The measured variables are examined in order to explain the correlation between battery ageing and operating conditions during the experiments; such a study enables us to identify the main ageing factors. Detailed explorations of statistical dependencies then reveal the factors responsible for battery ageing phenomena, and predictive battery ageing models are built from this approach. The results demonstrate and quantify a relationship between the measured variables and global observations of battery ageing, and also allow accurate battery ageing diagnosis through the predictive models.
Multivariate analysis: A statistical approach for computations
NASA Astrophysics Data System (ADS)
Michu, Sachin; Kaushik, Vandana
2014-10-01
Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and, more recently, in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) for image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content; the problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation-coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
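The correlation-matrix anomaly detection idea can be sketched as follows. The feature names and thresholds are assumptions for illustration, not the paper's method: a baseline correlation matrix is learned from normal traffic, and windows whose correlation structure departs from it score high.

```python
import numpy as np

rng = np.random.default_rng(1)

def traffic_window(correlated=True, n=200):
    """Synthetic window of 3 traffic features (hypothetical: packet rate,
    byte rate, flow count); under normal traffic two features co-move."""
    w = rng.normal(size=(n, 3))
    if correlated:
        w[:, 1] = w[:, 0] + 0.1 * rng.normal(size=n)
    return w

# Baseline correlation-coefficient matrix learned from normal traffic.
baseline = np.corrcoef(traffic_window(), rowvar=False)

def anomaly_score(window):
    """Distance between a window's correlation matrix and the baseline."""
    return np.linalg.norm(np.corrcoef(window, rowvar=False) - baseline)

score_normal = anomaly_score(traffic_window())        # small
score_attack = anomaly_score(traffic_window(False))   # large: correlation broken
```

An attack that decouples normally correlated features (e.g. many small scanning probes) perturbs the correlation matrix even when per-feature volumes look unremarkable.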
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Consequently, the NASA Trend Analysis Techniques standard recommends a variety of statistical methodologies that can be applied to time-series data. The major goal of this working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some additional techniques, and to identify patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, and straight-line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period; further investigations to deal with these issues are being conducted.
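A minimal sketch of the Kendall rank-correlation trend test mentioned above, on synthetic monthly problem-report counts (illustrative only; the MSFC Problem Assessment System data are not public):

```python
import numpy as np
from scipy import stats

# Synthetic monthly problem-report counts with an underlying downward trend.
rng = np.random.default_rng(2)
months = np.arange(24)
reports = np.maximum(0.0, 20.0 - 0.7 * months + rng.normal(0.0, 2.0, 24))

# Kendall's rank correlation between time and counts: tau < 0 with a small
# p-value indicates a statistically significant downward trend.
tau, p_value = stats.kendalltau(months, reports)
```

Being rank-based, the test is robust to the skewed, count-valued data typical of problem reports, though, as the abstract cautions, runs of zero counts tie many ranks and weaken the test in small samples.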
The Content of Statistical Requirements for Authors in Biomedical Research Journals
Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang
2016-01-01
Background: Robust statistical design, sound statistical analysis, and standardized presentation are important to enhance the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers can give statistical issues serious consideration not only at the stage of data analysis but also at the stage of methodological design. Methods: Detailed statistical instructions for authors were downloaded from the homepage of each included journal or obtained directly from the editors via email. We then described the types and numbers of statistical guidelines introduced by different press groups. Items of statistical reporting guidelines, as well as particular requirements, were summarized by frequency and grouped into design, method of analysis, and presentation. Finally, updated statistical guidelines and particular requirements for improvement were summed up. Results: In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, set out particular statistical requirements. Most of the particular requirements focused on the performance of statistical analysis and transparency in statistical reporting, including "address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation," and "statistical methods and the reasons." Conclusions: Statistical requirements for authors are becoming increasingly comprehensive. They remind researchers that they should give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would provide stronger evidence and make critical appraisal of evidence more accessible. PMID:27748343
LED traffic signal replacement schedules : facilitating smooth freight flows.
DOT National Transportation Integrated Search
2011-11-01
This research details a field study of LED traffic signals in Missouri and develops a replacement schedule based on key findings. Rates of degradation were statistically analyzed using analysis of variance (ANOVA). Results of this research will pro...
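The one-way ANOVA used for degradation rates can be sketched as below. The cohort groupings and luminous-intensity losses are hypothetical numbers for illustration, not the Missouri field data:

```python
from scipy import stats

# Hypothetical luminous-intensity losses (%) for LED signal heads grouped
# by installation year -- illustrative values only.
cohort_2004 = [12.1, 13.4, 11.8, 14.0, 12.9]
cohort_2006 = [9.5, 10.2, 8.8, 9.9, 10.5]
cohort_2008 = [5.1, 6.0, 4.8, 5.7, 5.5]

# One-way ANOVA: do mean degradation levels differ across cohorts?
f_stat, p_value = stats.f_oneway(cohort_2004, cohort_2006, cohort_2008)
```

A small p-value indicates that at least one cohort degrades at a different mean rate, which is the kind of evidence a replacement schedule can be staged on.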
The need for conducting forensic analysis of decommissioned bridges.
DOT National Transportation Integrated Search
2014-01-01
A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting models based upon ...
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
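The metamodeling idea reviewed here can be sketched with a quadratic response surface, one of the techniques the paper surveys. The "expensive analysis code" below is a stand-in function, an assumption for the example; in practice it would be a detailed simulation:

```python
import numpy as np

# Stand-in for an expensive analysis code (assumption: the real use case
# would call a detailed simulation here, not a closed-form function).
def expensive_analysis(x1, x2):
    return np.sin(x1) + 0.5 * x2 ** 2

# Design of experiments: a small grid of sample points in [-1, 1]^2.
g1, g2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
X1, X2 = g1.ravel(), g2.ravel()
y = expensive_analysis(X1, X2)

# Quadratic response-surface metamodel fitted by least squares.
A = np.column_stack([np.ones_like(X1), X1, X2, X1 * X2, X1 ** 2, X2 ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def metamodel(a, b):
    """Cheap approximation of expensive_analysis at (a, b)."""
    return coef @ np.array([1.0, a, b, a * b, a ** 2, b ** 2])
```

Once fitted, the metamodel is orders of magnitude cheaper to evaluate and can be handed to an optimizer in place of the original code, with the caveats about deterministic experiments that the paper discusses.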
Billot, Laurent; Lindley, Richard I; Harvey, Lisa A; Maulik, Pallab K; Hackett, Maree L; Murthy, Gudlavalleti Vs; Anderson, Craig S; Shamanna, Bindiganavale R; Jan, Stephen; Walker, Marion; Forster, Anne; Langhorne, Peter; Verma, Shweta J; Felix, Cynthia; Alim, Mohammed; Gandhi, Dorcas Bc; Pandian, Jeyaraj Durai
2017-02-01
Background: In low- and middle-income countries, few patients receive organized rehabilitation after stroke, yet the burden of chronic diseases such as stroke is increasing in these countries. Affordable models of effective rehabilitation could have a major impact. The ATTEND trial is evaluating a family-led, caregiver-delivered rehabilitation program after stroke. Objective: To publish the detailed statistical analysis plan for the ATTEND trial prior to trial unblinding. Methods: Based upon the published registration and protocol, the blinded steering committee and management team, led by the trial statistician, have developed a statistical analysis plan. The plan has been informed by the chosen outcome measures, the data collection forms, and knowledge of key baseline data. Results: The resulting statistical analysis plan is consistent with best practice and will allow open and transparent reporting. Conclusions: Publication of the trial statistical analysis plan reduces potential bias in trial reporting and clearly outlines pre-specified analyses. Clinical Trial Registrations: India CTRI/2013/04/003557; Australian New Zealand Clinical Trials Registry ACTRN1261000078752; Universal Trial Number U1111-1138-6707.
A statistical physics viewpoint on the dynamics of the bouncing ball
NASA Astrophysics Data System (ADS)
Chastaing, Jean-Yonnel; Géminard, Jean-Christophe; Bertin, Eric
2016-06-01
We compute, from a statistical physics perspective, the dynamics of a bouncing ball maintained in a chaotic regime through collisions with a plate undergoing aperiodic vibration. We analyze in detail the energy exchanges between the bead and the vibrating plate, and show that the coupling between the bead and the plate can be modeled in terms of both a dissipative process and an injection mechanism driven by an energy reservoir. An analysis of the injection statistics in terms of a fluctuation relation is also provided.
NASA Astrophysics Data System (ADS)
Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena S.; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.
2014-12-01
Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full-field, multi-material analytical technique, presented here as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline, and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique's application to inter-sample comparison was demonstrated through analysis of the principal component (PC) weights. It was found that SSM could provide highly detailed qualitative and quantitative insight into archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical, and forensic applications, including identification, finite element analysis (FEA), and reconstruction from partial datasets. PMID:29216199
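The PCA step of a statistical shape model can be sketched as follows. The landmark data here are synthetic (a stand-in for the registered micro-CT surfaces), with one dominant mode of variation planted so the decomposition is easy to check:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical aligned landmark data: 30 specimens, each a flattened vector
# of 10 2-D landmarks (20 coordinates), already registered to a baseline.
mean_shape = rng.normal(size=20)
mode = rng.normal(size=20)          # one dominant mode of shape variation
shapes = mean_shape + np.outer(rng.normal(size=30), mode)
shapes += 0.01 * rng.normal(size=shapes.shape)  # small measurement noise

# Statistical shape model via PCA: centre the data, then take the SVD.
centred = shapes - shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)   # variance fraction per component

# PC weights (scores), used for inter-sample comparison as in the abstract.
scores = centred @ Vt.T
```

Plotting specimens by their first few PC weights is what enables the inter- and intra-sample comparisons described above; new shapes can be reconstructed as the mean plus a weighted sum of the leading modes.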
An issue of literacy on pediatric arterial hypertension
NASA Astrophysics Data System (ADS)
Teodoro, M. Filomena; Romana, Andreia; Simão, Carla
2017-11-01
Arterial hypertension in pediatric age is a public health problem whose prevalence has increased significantly over time. Pediatric arterial hypertension (PAH) is under-diagnosed in most cases; it is a highly prevalent disease that appears without notice and has multiple consequences for children's health and for the adults they will become. Children's caregivers and close family must know of the existence of PAH, the negative consequences associated with it, and the risk factors, and, finally, must practice prevention. In [12, 13] a statistical data analysis can be found using a simpler questionnaire, introduced in [4], aimed at a preliminary study of caregivers' awareness of PAH; a continuation of that analysis is detailed in [14]. An extension of the questionnaire was built, applied to a distinct population, and filled in online. The statistical approach is partially reproduced in the present work. Several statistical models were estimated using different approaches, namely multivariate analysis (factor analysis) and other methods adequate for the kind of data under study.
Ninety six gasoline samples were collected from around the U.S. in Autumn 2004. A detailed hydrocarbon analysis was performed on each sample resulting in a data set of approximately 300 chemicals per sample. Statistical analyses were performed on the entire suite of reported chem...
Implementation of Head Start Planned Variation: 1970-1971. Part II.
ERIC Educational Resources Information Center
Lukas, Carol Van Deusen; Wohlleb, Cynthia
This volume of appendices is Part II of a study of program implementation in 12 models of Head Start Planned Variation. It presents details of the data analysis, copies of data collection instruments, and additional analyses and statistics. The appendices are: (A) Analysis of Variance Designs, (B) Copies of Instruments, (C) Additional Analyses,…
Testing for detailed balance in a financial market
NASA Astrophysics Data System (ADS)
Fiebig, H. R.; Musgrove, D. P.
2015-06-01
We test a historical price-time series from a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to the usage prevalent in economic theory, the term equilibrium here is tied to the returns rather than to the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set; S is then analyzed by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
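A much coarser check of detailed balance than the paper's action-functional method (this sketch is not the authors' procedure) is to discretize returns into states and compare the empirical transition counts n[i, j] against n[j, i], since detailed balance requires the probability flows between each pair of states to match:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic i.i.d. "returns" (an i.i.d. sequence is trivially reversible,
# so detailed balance should hold); real input would be index returns.
returns = rng.normal(size=20_000)
states = np.digitize(returns, [-0.5, 0.5])  # 3 coarse return states

# Empirical transition counts n[i, j] between consecutive states.
n = np.zeros((3, 3))
for a, b in zip(states[:-1], states[1:]):
    n[a, b] += 1

# Detailed balance in count form: n[i, j] should match n[j, i] within noise.
asymmetry = np.abs(n - n.T) / np.maximum(n + n.T, 1.0)
max_asymmetry = asymmetry.max()
```

For reversible dynamics the relative asymmetry stays at the sampling-noise level; a persistent, significant asymmetry between some pair of states would signal a violation of detailed balance.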
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC.
To compile its projections of future employment levels, the Bureau of Labor Statistics (BLS) combines the following five interlinked models in a six-step process: a labor force model, an econometric model of the U.S. economy, an industry activity model, an industry labor demand model, and an occupational labor demand model. The BLS was asked to…
Hu, Yiwen; Chen, Jiahui; Hu, Guping; Yu, Jianchen; Zhu, Xun; Lin, Yongcheng; Chen, Shengping; Yuan, Jie
2015-01-07
Every year, hundreds of new compounds are discovered from the metabolites of marine organisms. Finding new and useful compounds is one of the crucial drivers for this field of research. Here we describe the statistics of bioactive compounds discovered from marine organisms from 1985 to 2012. This work is based on our database, which contains information on more than 15,000 chemical substances including 4196 bioactive marine natural products. We performed a comprehensive statistical analysis to understand the characteristics of the novel bioactive compounds and detail temporal trends, chemical structures, species distribution, and research progress. We hope this meta-analysis will provide useful information for research into the bioactivity of marine natural products and drug development.
Lawlor, Debbie A; Peters, Tim J; Howe, Laura D; Noble, Sian M; Kipping, Ruth R; Jago, Russell
2013-07-24
The Active For Life Year 5 (AFLY5) randomised controlled trial protocol was published in this journal in 2011. It provided a summary analysis plan. This publication is an update of that protocol and provides a detailed analysis plan. This update provides a detailed analysis plan of the effectiveness and cost-effectiveness of the AFLY5 intervention. The plan includes details of how variables will be quality control checked and the criteria used to define derived variables. Details of four key analyses are provided: (a) effectiveness analysis 1 (the effect of the AFLY5 intervention on primary and secondary outcomes at the end of the school year in which the intervention is delivered); (b) mediation analyses (secondary analyses examining the extent to which any effects of the intervention are mediated via self-efficacy, parental support and knowledge, through which the intervention is theoretically believed to act); (c) effectiveness analysis 2 (the effect of the AFLY5 intervention on primary and secondary outcomes 12 months after the end of the intervention) and (d) cost effectiveness analysis (the cost-effectiveness of the AFLY5 intervention). The details include how the intention to treat and per-protocol analyses were defined and planned sensitivity analyses for dealing with missing data. A set of dummy tables are provided in Additional file 1. This detailed analysis plan was written prior to any analyst having access to any data and was approved by the AFLY5 Trial Steering Committee. Its publication will ensure that analyses are in accordance with an a priori plan related to the trial objectives and not driven by knowledge of the data. ISRCTN50133740.
Characterization of Low-Molecular-Weight Heparins by Strong Anion-Exchange Chromatography.
Sadowski, Radosław; Gadzała-Kopciuch, Renata; Kowalkowski, Tomasz; Widomski, Paweł; Jujeczka, Ludwik; Buszewski, Bogusław
2017-11-01
Currently, detailed structural characterization of low-molecular-weight heparin (LMWH) products is an analytical subject of great interest. In this work, we carried out a comprehensive structural analysis of LMWHs and applied a modified pharmacopeial method, as well as methods developed by other researchers, to the analysis of novel biosimilar LMWH products; and, for the first time, compared the qualitative and quantitative composition of commercially available drugs (enoxaparin, nadroparin, and dalteparin). For this purpose, we used strong anion-exchange (SAX) chromatography with spectrophotometric detection because this method is more helpful, easier, and faster than other separation techniques for the detailed disaccharide analysis of new LMWH drugs. In addition, we subjected the obtained results to statistical analysis (factor analysis, t-test, and Newman-Keuls post hoc test).
Plane-wave decomposition by spherical-convolution microphone array
NASA Astrophysics Data System (ADS)
Rafaely, Boaz; Park, Munhum
2004-05-01
Reverberant sound fields are widely studied, as they have a significant influence on the acoustic performance of enclosures in a variety of applications. For example, the intelligibility of speech in lecture rooms, the quality of music in auditoria, the noise level in offices, and the production of 3D sound in living rooms are all affected by the enclosed sound field. These sound fields are typically studied through frequency response measurements or statistical measures such as reverberation time, which do not provide detailed spatial information. The aim of the work presented in this seminar is the detailed analysis of reverberant sound fields. A measurement and analysis system based on acoustic theory and signal processing, designed around a spherical microphone array, is presented. Detailed analysis is achieved by decomposition of the sound field into waves, using spherical Fourier transform and spherical convolution. The presentation will include theoretical review, simulation studies, and initial experimental results.
Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler
2016-01-01
This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
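The Monte Carlo propagation step can be sketched for the free-stream Mach number. The elemental uncertainties below are illustrative assumptions, not the facility's actual values:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 100_000

# Hypothetical elemental uncertainties (illustrative only): total and
# static pressure measured with Gaussian random errors.
p_total = rng.normal(101_325.0, 150.0, N)    # Pa
p_static = rng.normal(18_000.0, 100.0, N)    # Pa
gamma = 1.4

# Isentropic relation for Mach number from the total/static pressure ratio.
mach = np.sqrt(2.0 / (gamma - 1.0)
               * ((p_total / p_static) ** ((gamma - 1.0) / gamma) - 1.0))

mach_mean = mach.mean()              # nominal value
mach_random_unc = mach.std(ddof=1)   # 1-sigma random uncertainty
```

Systematic contributions would be modeled as offsets drawn once per replication rather than per sample; the total uncertainty then combines the random and systematic spreads, as in the paper's reporting convention.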
An illustrated analysis of North Carolina traffic crash statistics for 2006
DOT National Transportation Integrated Search
2006-01-01
This book's purpose is to serve as a resource for traffic safety experts, public officials, and those interested in making North Carolina's roadways, sidewalks, and pedal-cyclist paths safe for all of us who travel on them. It presents detail...
Life expectancy evaluation and development of a replacement schedule for LED traffic signals.
DOT National Transportation Integrated Search
2011-03-01
This research details a field study of LED traffic signals in Missouri and develops a replacement schedule based on key findings. Rates of degradation were statistically analyzed using analysis of variance (ANOVA). Results of this research will p...
ERIC Educational Resources Information Center
Bernstein, Michael I.
1982-01-01
Steps a school board can take to minimize the risk of age discrimination suits include reviewing all written policies, forms, files, and collective bargaining agreements for age discriminatory items; preparing a detailed statistical analysis of the age of personnel; and reviewing reduction-in-force procedures. (Author/MLF)
Controlling excited-state contamination in nucleon matrix elements
Yoon, Boram; Gupta, Rajan; Bhattacharya, Tanmoy; ...
2016-06-08
We present a detailed analysis of methods to reduce statistical errors and excited-state contamination in the calculation of matrix elements of quark bilinear operators in nucleon states. All the calculations were done on a 2+1-flavor ensemble with lattices of size 32³ × 64 generated using the rational hybrid Monte Carlo algorithm at a = 0.081 fm and with Mπ = 312 MeV. The statistical precision of the data is improved using the all-mode-averaging method. We compare two methods for reducing excited-state contamination: a variational analysis and a two-state fit to data at multiple values of the source-sink separation t_sep. We show that both methods can be tuned to significantly reduce excited-state contamination and discuss their relative advantages and cost effectiveness. Finally, a detailed analysis of the size of source smearing used in the calculation of quark propagators and the range of values of t_sep needed to demonstrate convergence of the isovector charges of the nucleon to the t_sep → ∞ estimates is presented.
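A toy example of why t_sep matters: for a two-state correlator, the effective mass plateaus at the ground-state energy only once the excited state has decayed. The energies and amplitudes below are made-up lattice-unit values, and this two-point sketch merely stands in for the paper's full two-state fits of three-point functions.

```python
import math

# Toy two-state correlator C(t) = A0*exp(-E0*t) + A1*exp(-E1*t); the excited
# state (E1 > E0) contaminates small t and dies off as t grows.
E0, E1, A0, A1 = 0.5, 1.0, 1.0, 0.5   # invented lattice-unit values

def corr(t):
    return A0 * math.exp(-E0 * t) + A1 * math.exp(-E1 * t)

def m_eff(t):
    """Effective mass log(C(t)/C(t+1)); plateaus at E0 once the
    excited-state contamination has decayed."""
    return math.log(corr(t) / corr(t + 1))
```

In practice the statistical noise also grows with t, which is exactly the trade-off that motivates fitting both states instead of waiting for the plateau.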
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.
Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy
2014-11-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression.
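A minimal word-analysis example: the D2 statistic is the inner product of the k-mer count vectors of two sequences. A sketch with k = 2 and tiny toy sequences:

```python
from collections import Counter

def kmer_counts(seq, k):
    """Overlapping k-mer (word) counts of a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k=2):
    """D2 statistic: sum over shared words of the product of their counts."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

score = d2("ACGTACGT", "ACGTT")   # shared 2-mers: AC, CG, GT
```

Because only word counts are compared, no alignment is computed, which is what makes such measures robust to rearrangement and cheap on large inputs.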
HydroApps: An R package for statistical simulation to use in regional analysis
NASA Astrophysics Data System (ADS)
Ganora, D.
2013-12-01
The HydroApps package is a newborn R extension initially developed to support a recent model for flood frequency estimation developed for applications in Northwestern Italy; it also contains some general tools for regional analyses and can easily be extended to include other statistical models. The package is currently at an experimental level of development. HydroApps is a corollary of the SSEM project for regional flood frequency analysis, although it was developed independently to support various instances of regional analyses. Its aim is to provide a basis for interplay between statistical simulation and practical operational use. In particular, the main module of the package deals with building the confidence bands of flood frequency curves expressed by means of their L-moments. Other functions include pre-processing and visualization of hydrologic time series and analysis of the optimal design flood under uncertainty, as well as tools useful in water resources management for the estimation of flow duration curves and their sensitivity to water withdrawals. Particular attention is devoted to code granularity, i.e. the level of detail and aggregation of the code: greater detail means more low-level functions, which entails more flexibility but reduces ease of use in practice. A balance between detail and simplicity is necessary and can be achieved with appropriate wrapper functions and specific help pages for each working block. From a more general viewpoint, the package does not have a truly user-friendly interface, but it runs on multiple operating systems and is easy to update, like many other open-source projects. The HydroApps functions and their features are reported in order to share ideas and materials and to improve the 'technological' and information transfer between the scientific community and final users such as policy makers.
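The L-moment machinery underlying such flood frequency curves can be sketched via sample probability-weighted moments; this is the generic textbook estimator, not HydroApps code.

```python
from math import comb

def sample_l_moments(data):
    """First three sample L-moments via probability-weighted moments:
    b_r = (1/n) * sum_{j=r..n-1} [C(j, r) / C(n-1, r)] * x_(j), x sorted."""
    x = sorted(data)
    n = len(x)
    b = [sum(comb(j, r) / comb(n - 1, r) * x[j] for j in range(r, n)) / n
         for r in range(3)]
    l1 = b[0]                        # location (equals the sample mean)
    l2 = 2 * b[1] - b[0]             # L-scale
    l3 = 6 * b[2] - 6 * b[1] + b[0]  # third L-moment (zero for symmetric data)
    return l1, l2, l3

l1, l2, l3 = sample_l_moments([1.0, 2.0, 3.0, 4.0, 5.0])
```

Ratios such as t3 = l3/l2 are what regional analyses typically pool across sites before fitting a growth curve.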
Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy
NASA Astrophysics Data System (ADS)
Limandri, S.; Robledo, J.; Tirao, G.
2018-06-01
High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of certain spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform multivariate statistical analysis, such as principal component analysis. In this work the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as of the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation-state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
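A principal component analysis of mixture spectra can be sketched with plain SVD; the Gaussian "line shapes" below are illustrative stand-ins for measured Kβ spectra, not real reference data.

```python
import numpy as np

# Synthetic emission spectra: linear mixtures of two Gaussian "line shapes"
# standing in for Mn2+ / Mn4+ reference spectra (all values illustrative).
e = np.linspace(0.0, 1.0, 200)                 # energy axis, arbitrary units
shape_a = np.exp(-((e - 0.35) ** 2) / 0.002)
shape_b = np.exp(-((e - 0.65) ** 2) / 0.002)
frac = np.linspace(0.0, 1.0, 11)               # Mn4+-like fraction per sample
spectra = np.outer(frac, shape_a) + np.outer(1 - frac, shape_b)

# PCA by SVD of the mean-centred data matrix
centred = spectra - spectra.mean(axis=0)
_, s, components = np.linalg.svd(centred, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()            # variance fraction per component
```

For a two-member mixture series the centred data matrix is effectively rank one, so a single component captures essentially all the variance; the scores along that component track the oxidation-state fraction.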
Texture as a basis for acoustic classification of substrate in the nearshore region
NASA Astrophysics Data System (ADS)
Dennison, A.; Wattrus, N. J.
2016-12-01
Segmentation and classification of substrate type at two locations in Lake Superior are predicted using multivariate statistical processing of textural measures derived from shallow-water, high-resolution multibeam bathymetric data. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical characteristics of a sonar backscatter mosaic depend on substrate type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing can capture the pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. Preliminary results from an analysis of bathymetric data and ground-truth samples collected from the Amnicon River, Superior, Wisconsin, and the Lester River, Duluth, Minnesota, demonstrate the ability to process and develop a novel classification scheme for the bottom type in two geomorphologically distinct areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Lloyd A.; Paresol, Bernard
This report of the geostatistical analysis results of the fire fuels response variables, custom reaction intensity and total dead fuels, is but a part of an SRS 2010 vegetation inventory project. For a detailed description of the project, theory, and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report "SRS 2010 Vegetation Inventory GeoStatistical Mapping Report" (Edwards & Parresol 2013).
Secondary School Mathematics Curriculum Improvement Study Information Bulletin 7.
ERIC Educational Resources Information Center
Secondary School Mathematics Curriculum Improvement Study, New York, NY.
The background, objectives, and design of Secondary School Mathematics Curriculum Improvement Study (SSMCIS) are summarized. Details are given of the content of the text series, "Unified Modern Mathematics," in the areas of algebra, geometry, linear algebra, probability and statistics, analysis (calculus), logic, and computer…
Image encryption based on a delayed fractional-order chaotic logistic system
NASA Astrophysics Data System (ADS)
Wang, Zhen; Huang, Xia; Li, Ning; Song, Xiao-Na
2012-05-01
A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve its security. The scheme is described in detail, with security analyses including correlation analysis, information entropy analysis, run-statistics analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security.
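As a simplified illustration of the keystream idea, using the ordinary integer-order logistic map without the paper's time-varying delay and fractional derivative:

```python
def keystream(n, x0=0.4567, r=3.99, burn_in=100):
    """Byte stream from iterates of the logistic map x <- r*x*(1-x)."""
    x = x0
    for _ in range(burn_in):                # discard the transient
        x = r * x * (1 - x)
    out = bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 10 ** 6) % 256)  # mix in less-significant digits
    return bytes(out)

def xor_cipher(data: bytes, x0=0.4567) -> bytes:
    """XOR with the keystream; applying it twice decrypts (XOR involution)."""
    return bytes(a ^ b for a, b in zip(data, keystream(len(data), x0=x0)))

plain = bytes(range(64))                    # stand-in for a row of image pixels
cipher = xor_cipher(plain)
```

The initial condition x0 acts as the secret key; chaotic sensitivity to x0 is what the paper's key sensitivity analysis quantifies.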
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCord, R.A.; Olson, R.J.
1988-01-01
Environmental research and assessment activities at Oak Ridge National Laboratory (ORNL) include the analysis of spatial and temporal patterns of ecosystem response at a landscape scale. Analysis through use of a geographic information system (GIS) involves an interaction between the user and thematic data sets frequently expressed as maps. A portion of GIS analysis has a mathematical or statistical aspect, especially for the analysis of temporal patterns. ARC/INFO is an excellent tool for manipulating GIS data and producing the appropriate map graphics. INFO also has some limited ability to produce statistical tabulation. At ORNL we have extended our capabilities by graphically interfacing ARC/INFO and SAS/GRAPH to provide a combined mapping and statistical graphics environment. With the data management, statistical, and graphics capabilities of SAS added to ARC/INFO, we have expanded the analytical and graphical dimensions of the GIS environment. Pie or bar charts, frequency curves, hydrographs, or scatter plots as produced by SAS can be added to maps from attribute data associated with ARC/INFO coverages. Numerous small, simplified graphs can also become a source of complex map "symbols." These additions extend the dimensions of GIS graphics to include time, details of the thematic composition, distribution, and interrelationships. 7 refs., 3 figs.
Jiang, Wei; Yu, Weichuan
2017-02-15
In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R-package is available at: http://bioinformatics.ust.hk/Jlfdr.html .
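For contrast with the Jlfdr approach, the commonly used inverse-variance fixed-effect meta-analysis of per-study summary statistics looks like this; the effect sizes and standard errors below are hypothetical.

```python
import math

def fixed_effect_meta(effects, std_errs):
    """Inverse-variance fixed-effect meta-analysis of per-study summaries.
    Returns the pooled effect, its standard error, and the z statistic."""
    weights = [1.0 / se ** 2 for se in std_errs]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se, pooled / pooled_se

# Three hypothetical per-GWAS effect estimates for one variant
est, se, z = fixed_effect_meta([0.20, 0.25, 0.30], [0.05, 0.08, 0.10])
```

Pooling like this assumes a common underlying effect across studies, which is exactly where it loses power on heterogeneous datasets relative to the joint-lfdr framing described above.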
NASA Technical Reports Server (NTRS)
Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.
1992-01-01
The current controversy over regularity vs. clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models and that pseudorandom-number-generated models bias the observed nearest-neighbor statistics towards regularity. Interpretation of these nearest-neighbor statistics is discussed for many cases of superpositions of clustering, randomness, and regularity. A detailed analysis is carried out of cumulus cloud field spatial distributions based upon Landsat, AVHRR, and Skylab data, showing that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
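The nearest-neighbor test against a Poisson null can be sketched directly: for a homogeneous 2-D Poisson process of intensity λ, the NN distance CDF is G(r) = 1 - exp(-λπr²), so the empirical fraction of points within the theoretical median distance should sit near 0.5 (lower values suggest regularity, higher values clustering). The point pattern below is simulated, not cloud data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
pts = rng.random((n, 2))                 # simulated random "cloud centres"

# Nearest-neighbour distance of each point (brute force is fine at n = 1000)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
nn = d.min(axis=1)

# Poisson null: G(r) = 1 - exp(-lam*pi*r^2); median = sqrt(ln 2 / (lam*pi))
lam = n                                  # unit square => intensity = n
r_median = np.sqrt(np.log(2.0) / (lam * np.pi))
frac_below = (nn <= r_median).mean()     # ~0.5 for complete randomness
```

Edge effects pull the empirical fraction slightly below one half, which is one reason the choice of null model (true Poisson vs. pseudorandom constructions) matters in this debate.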
Hu, Yiwen; Chen, Jiahui; Hu, Guping; Yu, Jianchen; Zhu, Xun; Lin, Yongcheng; Chen, Shengping; Yuan, Jie
2015-01-01
Every year, hundreds of new compounds are discovered from the metabolites of marine organisms. Finding new and useful compounds is one of the crucial drivers for this field of research. Here we describe the statistics of bioactive compounds discovered from marine organisms from 1985 to 2012. This work is based on our database, which contains information on more than 15,000 chemical substances including 4196 bioactive marine natural products. We performed a comprehensive statistical analysis to understand the characteristics of the novel bioactive compounds and detail temporal trends, chemical structures, species distribution, and research progress. We hope this meta-analysis will provide useful information for research into the bioactivity of marine natural products and drug development. PMID:25574736
Analysis of Variance in Statistical Image Processing
NASA Astrophysics Data System (ADS)
Kurz, Ludwik; Hafed Benteftifa, M.
1997-04-01
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
Project Sell, Title VII: Final Evaluation 1970-1971.
ERIC Educational Resources Information Center
Condon, Elaine C.; And Others
This evaluative report consists of two parts. The first is a narrative report which represents a summary by the evaluation team and recommendations regarding project activities; the second part provides a statistical analysis of project achievements. Details are provided on evaluation techniques, staff, management, instructional materials,…
Methods for collection and analysis of aquatic biological and microbiological samples
Greeson, Phillip E.; Ehlke, T.A.; Irwin, G.A.; Lium, B.W.; Slack, K.V.
1977-01-01
Chapter A4 contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 discusses biological sampling and sampling statistics. The statistical procedures are accompanied by examples. Part 2 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity, and bioassays. Each method is summarized, and the application, interferences, apparatus, reagents, collection, analysis, calculations, reporting of results, precision and references are given. Part 3 consists of a glossary. Part 4 is a list of taxonomic references.
On the structure and phase transitions of power-law Poissonian ensembles
NASA Astrophysics Data System (ADS)
Eliazar, Iddo; Oshanin, Gleb
2012-10-01
Power-law Poissonian ensembles are Poisson processes that are defined on the positive half-line, and that are governed by power-law intensities. Power-law Poissonian ensembles are stochastic objects of fundamental significance; they uniquely display an array of fractal features and they uniquely generate a span of important applications. In this paper we apply three different methods—oligarchic analysis, Lorenzian analysis and heterogeneity analysis—to explore power-law Poissonian ensembles. The amalgamation of these analyses, combined with the topology of power-law Poissonian ensembles, establishes a detailed and multi-faceted picture of the statistical structure and the statistical phase transitions of these elemental ensembles.
Schmitt, M; Groß, K; Grub, J; Heib, F
2015-06-01
Contact angle determination by the sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. Therefore one of the most important tasks in this area is to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques to analyse dynamic contact angle measurements (sessile drop) in detail which are applicable to axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point) but also the dependent analysis are explained in detail for the first time. These approaches lead to contact angle data and different access to specific contact angles which are independent of user skills and the subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is dynamically measured by the sessile drop technique while inclining the sample plate. The triple points, the inclination angles, the downhill (advancing motion) and the uphill angles (receding motion) obtained by high-precision drop shape analysis are statistically analysed, both independently and dependently. Due to the small distance covered in the dependent analysis (<0.4 mm) and the dominance of counted events with small velocity, the measurements are less influenced by motion dynamics and the procedure can be called "slow moving" analysis. The presented procedures are especially sensitive to the range from static to "slow moving" dynamic contact angle determination.
They are characterised by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, specific motion relations for drops on inclined surfaces and detailed observations on the reactivity of the freshly cleaned silicon wafer surface, which results in acceleration behaviour (reactive de-wetting), are presented.
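The sigmoid fit mentioned above can be illustrated on synthetic data: if the plateau value L is treated as known, a logit transform reduces the fit to linear regression. All parameter values are invented for the sketch and do not reflect the paper's fitting procedure in detail.

```python
import numpy as np

# Synthetic contact-angle-vs-inclination trace following
#   theta(a) = L / (1 + exp(-k*(a - a0)))
# with invented parameters (stand-in for a measured sessile-drop trace).
L, k_true, a0_true = 90.0, 1.2, 0.5
a = np.linspace(-5.0, 5.0, 41)           # inclination angle, degrees
theta = L / (1.0 + np.exp(-k_true * (a - a0_true)))

# Treating the plateau L as known, the logit transform linearises the fit:
#   ln(theta / (L - theta)) = k*(a - a0)
z = np.log(theta / (L - theta))
k_hat, intercept = np.polyfit(a, z, 1)
a0_hat = -intercept / k_hat
```

On noisy measured traces a nonlinear least-squares fit of the sigmoid itself would be preferable, but the linearised version shows why the parameters (slope k, inflection a0) are identifiable independently of operator judgement.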
Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A
2014-07-01
A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
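A minimal self-organizing map in the spirit of this workflow, reduced to a 1-D toy map on two synthetic sample groups; this is a generic SOM sketch, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "feature table": two well-separated groups of 2-D samples
group_a = rng.normal([0.0, 0.0], 0.1, size=(50, 2))
group_b = rng.normal([5.0, 5.0], 0.1, size=(50, 2))
data = np.vstack([group_a, group_b])

# Minimal 1-D self-organizing map with 4 nodes
n_nodes = 4
weights = rng.random((n_nodes, 2))
for epoch in range(50):
    lr = 0.5 * (1 - epoch / 50) + 0.01            # decaying learning rate
    sigma = max(1.5 * (1 - epoch / 50), 0.3)      # shrinking neighbourhood
    for x in data[rng.permutation(len(data))]:
        bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
        grid_dist = np.abs(np.arange(n_nodes) - bmu)
        h = np.exp(-grid_dist ** 2 / (2 * sigma ** 2))  # neighbourhood kernel
        weights += lr * h[:, None] * (x - weights)

def bmu_of(x):
    """Index of the best-matching unit (closest node) for a sample."""
    return int(np.argmin(np.linalg.norm(weights - np.asarray(x), axis=1)))
```

After training, correlated samples share nearby best-matching units, which is the property that lets heat-map "Gestalt" comparisons group co-varying metabolite features.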
Belianinov, Alex; Panchapakesan, G.; Lin, Wenzhi; ...
2014-12-02
Atomic level spatial variability of electronic structure in the Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near-neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data, including separation of atomic identities, proximity, and local configuration effects, and can be universally applicable to chemically and electronically inhomogeneous surfaces.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E
2014-04-01
This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments, and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between-case variance to total variance (between-case plus within-case variance), how to compute power using a macro, and how to use d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax covers how to read data, compute fixed- and random-effects average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and conduct various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs.
A Fishy Problem for Advanced Students
ERIC Educational Resources Information Center
Patterson, Richard A.
1977-01-01
While developing a research course for gifted high school students, improvements were made in a local pond. Students worked for a semester learning research techniques, statistical analysis, and limnology. At the end of the course, the three students produced a joint scientific paper detailing their study of the pond. (MA)
Systems Analysis of Alternative Architectures for Riverine Warfare in 2010
2006-12-01
propose system of systems improvements for the RF in 2010. With the RF currently working to establish a command structure, train and equip its forces...opposing force. Measures of performance such as time to first enemy detection and loss exchange ratio were collected from MANA. A detailed statistical
78 FR 50373 - Proposed Information Collection; Comment Request; Annual Capital Expenditures Survey
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-19
... source of detailed comprehensive statistics on actual business spending for non-farm companies, non- governmental companies, organizations, and associations operating in the United States. Both employer and nonemployer companies are included in the survey. The Bureau of Economic Analysis, the primary Federal user of...
Engaging Business Students in Quantitative Skills Development
ERIC Educational Resources Information Center
Cronin, Anthony; Carroll, Paula
2015-01-01
In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…
Errors in logic and statistics plague a meta-analysis
USDA-ARS?s Scientific Manuscript database
The non-target effects of transgenic insecticidal crops has been a topic of debate for over a decade and many laboratory and field studies have addressed the issue in numerous countries. In 2009 Lovei et al. (Transgenic Insecticidal Crops and Natural Enemies: A Detailed Review of Laboratory Studies)...
Xu, Tong-kai; Sun, Zhi-hui; Jiang, Yong
2012-03-01
To evaluate the dimensional stability and detail reproduction of five addition silicone impression materials after autoclave sterilization. Impressions were made on the ISO 4823 standard mold containing several marking lines, in five kinds of addition silicone. All the impressions were sterilized by high temperature and pressure (135 °C, 212.8 kPa) for 25 min. Linear measurements of pre-sterilization and post-sterilization were made with a measuring microscope. Statistical analysis utilized single-factor analysis with pair-wise comparison of mean values when appropriate. Hypothesis testing was conducted at alpha = 0.05. No significant difference was found between the pre-sterilization and post-sterilization conditions for all locations, and all absolute values of the linear rate of change were less than 8%. Autoclave sterilization did not affect the surface detail reproduction of the five impression materials. The dimensional stability and detail reproduction of the five addition silicone impression materials in the study were unaffected by autoclave sterilization.
How Do the Metabolic Effects of Chronic Stress Influence Breast Cancer Biology
2013-04-01
meta-analysis. International Journal of Cancer. 2003;107:1023-9. 3. Song M, Lee K-M, Kang D. Breast Cancer Prevention Based on Gene-Environment... PCR system. Details of the statistical analysis are provided in the supplemental methods section. Adipocyte glucose consumption and... life events and breast cancer risk: a meta-analysis. International Journal of Cancer. 2003;107:1023-9.
On the frequency-magnitude distribution of converging boundaries
NASA Astrophysics Data System (ADS)
Marzocchi, W.; Laura, S.; Heuret, A.; Funiciello, F.
2011-12-01
The occurrence of the last mega-thrust earthquake in Japan has starkly highlighted the risk posed to society by such events in terms of social and economic losses, even at large spatial scales. The primary component of a balanced and objective mitigation of the impact of these earthquakes is a correct forecast of where such events may occur in the future. To date, there is a wide range of opinions about where mega-thrust earthquakes can occur. Here, we present a detailed statistical analysis of a database of worldwide interplate earthquakes occurring at current subduction zones. The database has been recently published in the framework of the EURYI Project 'Convergent margins and seismogenesis: defining the risk of great earthquakes by using statistical data and modelling', and it provides a unique opportunity to explore in detail the seismogenic process in subducting lithosphere. In particular, the statistical analysis of this database allows us to explore many interesting scientific issues, such as the existence of different frequency-magnitude distributions across trenches, the quantitative characterization of the subduction zones most likely to produce mega-thrust earthquakes, and the prominent features that characterize converging boundaries with different seismic activity. Beyond their scientific importance, such issues may improve our mega-thrust earthquake forecasting capability.
Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario
2014-01-01
Background: selecting the correct statistical test and data mining method depends strongly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are examined using several medical examples. We present two ordinal-variable clustering examples, ordinal variables being more challenging to analyze, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold-standard groups of malignant and benign cases that had been identified by clinical tests. Results: the sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: using a clustering algorithm appropriate to the measurement scale of the variables in the study ensures high performance. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables. PMID:24672565
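The comparison against gold-standard malignant/benign labels described above reduces to computing sensitivity, specificity, and accuracy from a confusion matrix. A minimal Python sketch (an illustration, not code from the paper; the toy labels below are invented):

```python
def confusion_metrics(y_true, y_pred):
    # y_true/y_pred: 1 = malignant (positive), 0 = benign (negative).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(y_true)
    return sensitivity, specificity, accuracy
```

In practice the predicted labels would come from the chosen clustering algorithm, with clusters mapped to the gold-standard groups before scoring.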
Lu, Z. Q. J.; Lowhorn, N. D.; Wong-Ng, W.; Zhang, W.; Thomas, E. L.; Otani, M.; Green, M. L.; Tran, T. N.; Caylor, C.; Dilley, N. R.; Downey, A.; Edwards, B.; Elsner, N.; Ghamaty, S.; Hogan, T.; Jie, Q.; Li, Q.; Martin, J.; Nolas, G.; Obara, H.; Sharp, J.; Venkatasubramanian, R.; Willigan, R.; Yang, J.; Tritt, T.
2009-01-01
In an effort to develop a Standard Reference Material (SRM™) for Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials—undoped Bi2Te3 and Constantan (55 % Cu and 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses on the interlaboratory measurement results and the statistical methodology for analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations. PMID:27504212
Lajus, Dmitry; Sukhikh, Natalia; Alekseev, Victor
2015-01-01
Interest in cryptic species has increased significantly with current progress in genetic methods. The large number of cryptic species suggests that the resolution of traditional morphological techniques may be insufficient for taxonomical research. However, some species now considered to be cryptic may, in fact, be designated pseudocryptic after close morphological examination. Thus the “cryptic or pseudocryptic” dilemma speaks to the resolution of morphological analysis and its utility for identifying species. We address this dilemma first by systematically reviewing data published from 1980 to 2013 on cryptic species of Copepoda and then by performing an in-depth morphological study of the former Eurytemora affinis complex of cryptic species. Analyzing the published data showed that, in 5 of 24 revisions eligible for systematic review, cryptic species assignment was based solely on the genetic variation of forms, without detailed morphological analysis to confirm the assignment. Therefore, some newly described cryptic species might be designated pseudocryptic under more detailed morphological analysis, as happened with the Eurytemora affinis complex. Recent genetic analyses of the complex found high levels of heterogeneity without morphological differences; on this basis the complex was argued to be cryptic. However, subsequent detailed morphological analyses allowed a number of valid species to be described. Our own study of this species complex, using in-depth statistical analyses not usually applied when describing new species, confirmed considerable differences between the former cryptic species. In particular, fluctuating asymmetry (FA), the random variation of left and right structures, differed significantly between forms and provided independent information about their status. Our work showed that multivariate statistical approaches, such as principal component analysis, can be powerful techniques for the morphological discrimination of cryptic taxa.
Despite increasing cryptic species designations, morphological techniques have great potential in determining copepod taxonomy. PMID:26120427
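One common way to quantify fluctuating asymmetry is via the signed left-right differences: their mean captures directional asymmetry, while their variance around that mean is an FA index. A minimal Python sketch (the index choice and toy measurements are assumptions, not taken from the paper):

```python
import statistics

def fluctuating_asymmetry(left, right):
    # Signed differences L - R across individuals for one bilateral trait.
    d = [l - r for l, r in zip(left, right)]
    directional = statistics.mean(d)     # systematic left-right bias
    fa_index = statistics.variance(d)    # FA: random variation around that bias
    return directional, fa_index
```

Comparing this index between putative species then provides information independent of mean trait size.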
The use of open source bioinformatics tools to dissect transcriptomic data.
Nitsche, Benjamin M; Ram, Arthur F J; Meyer, Vera
2012-01-01
Microarrays are a valuable technology to study fungal physiology on a transcriptomic level. Various microarray platforms are available, comprising both single- and two-channel arrays. Despite different technologies, preprocessing of microarray data generally includes quality control, background correction, normalization, and summarization of probe-level data. Subsequently, depending on the experimental design, diverse statistical analyses can be performed, including the identification of differentially expressed genes and the construction of gene coexpression networks. We describe how Bioconductor, a collection of open source and open development packages for the statistical programming language R, can be used for dissecting microarray data. We provide fundamental details that facilitate the process of getting started with R and Bioconductor. Using two publicly available microarray datasets from Aspergillus niger, we give detailed protocols on how to identify differentially expressed genes and how to construct gene coexpression networks.
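Calling differentially expressed genes typically involves multiple-testing correction at a chosen false discovery rate. As a hedged illustration of that one step (in Python rather than the R/Bioconductor stack the chapter actually uses), the Benjamini-Hochberg procedure can be sketched as:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    # Returns the (0-based) indices of hypotheses rejected at FDR level alpha.
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        # Largest rank whose p-value clears the BH threshold alpha * rank / m.
        if pvals[i] <= alpha * rank / m:
            k = rank
    return sorted(order[:k])
```

In Bioconductor this step is usually done with built-in adjusted p-values rather than by hand; the sketch only shows the logic.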
Smith, Derek R
2010-12-01
Although bibliometric analysis affords significant insight into the progression and distribution of information within a particular research field, detailed longitudinal studies of this type are rare within the field of nursing. This study aimed to investigate, from a bibliometric perspective, the progression and trends of core international nursing journals over the longest possible time period. A detailed bibliometric analysis was undertaken among 7 core international nursing periodicals using custom historical data sourced from the Thomson Reuters Journal Citation Reports®. In the 32 years between 1977 and 2008, the number of citations received by these 7 journals increased over 700%. A sustained and statistically significant (p<0.001) 3-fold increase was also observed in the average impact factor score during this period. Statistical analysis revealed that all periodicals experienced significant (p<0.001) improvements in their impact factors over time, with gains ranging from approximately 2- to 78-fold. Overall, this study provides one of the most comprehensive, longitudinal bibliometric analyses ever conducted in the field of nursing. Impressive and continual impact factor gains suggest that published nursing research is being increasingly seen, heard and cited in the international academic community. Copyright © 2010 Elsevier Ltd. All rights reserved.
Statistical properties of radiation from VUV and X-ray free electron laser
NASA Astrophysics Data System (ADS)
Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.
1998-03-01
The paper presents a comprehensive analysis of the statistical properties of the radiation from a self-amplified spontaneous emission (SASE) free electron laser operating in linear and nonlinear mode. The investigation has been performed in a one-dimensional approximation assuming the electron pulse length to be much larger than a coherence length of the radiation. The following statistical properties of the SASE FEL radiation have been studied in detail: time and spectral field correlations, distribution of the fluctuations of the instantaneous radiation power, distribution of the energy in the electron bunch, distribution of the radiation energy after the monochromator installed at the FEL amplifier exit and radiation spectrum. The linear high gain limit is studied analytically. It is shown that the radiation from a SASE FEL operating in the linear regime possesses all the features corresponding to completely chaotic polarized radiation. A detailed study of statistical properties of the radiation from a SASE FEL operating in linear and nonlinear regime has been performed by means of time-dependent simulation codes. All numerical results presented in the paper have been calculated for the 70 nm SASE FEL at the TESLA Test Facility being under construction at DESY.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noo, F; Guo, Z
2016-06-15
Purpose: Penalized-weighted least-square reconstruction has become an important research topic in CT, to reduce dose without affecting image quality. Two components impact image quality in this reconstruction: the statistical weights and the use of an edge-preserving penalty term. We are interested in assessing the influence of statistical weights on their own, without the edge-preserving feature. Methods: The influence of statistical weights on image quality was assessed in terms of low-contrast detail detection using LROC analysis. The task amounted to detecting and localizing a 6-mm lesion with random contrast inside the FORBILD head phantom. A two-alternative forced-choice experiment was used with two human observers performing the task. Reconstructions without and with statistical weights were compared, both using the same quadratic penalty term. The beam energy was set to 30 keV to amplify spatial differences in attenuation and thereby the role of statistical weights. A fan-beam data acquisition geometry was used. Results: Visual inspection of images clearly showed a difference in noise between the two reconstruction methods. As expected, the reconstruction without statistical weights exhibited noise streaks. The other reconstruction appeared better in this aspect, but presented other disturbing noise patterns and artifacts induced by the weights. The LROC analysis yielded the following 95-percent confidence interval for the difference in reader-averaged AUC (reconstruction without weights minus reconstruction with weights): [0.0026, 0.0599]. The mean AUC value was 0.9094. Conclusion: We have investigated the impact of statistical weights without the use of an edge-preserving penalty in penalized weighted least-square reconstruction. A decrease rather than an increase in image quality was observed when using statistical weights.
Thus, the observers were better able to cope with the noise streaks than with the noise patterns and artifacts induced by the statistical weights. It may be that different results would be obtained if the penalty term were used with a pixel-dependent weight. F. Noo receives research support from Siemens Healthcare GmbH.
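The AUC reported above is, in a forced-choice setting, equivalent to a Mann-Whitney statistic: the probability that a lesion-present score exceeds a lesion-absent score. A minimal Python sketch (illustrative only; the scores below are invented):

```python
def auc_mann_whitney(neg_scores, pos_scores):
    # AUC = P(pos > neg) + 0.5 * P(pos == neg), estimated over all pairs.
    wins = ties = 0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))
```

LROC analysis additionally requires correct localization for a positive to count; this sketch shows only the underlying AUC computation.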
NASA Astrophysics Data System (ADS)
Mfumu Kihumba, Antoine; Vanclooster, Marnik
2013-04-01
Drinking water in Kinshasa, the capital of the Democratic Republic of Congo, is provided by extracting groundwater from the local aquifer, particularly in peripheral areas. The exploited groundwater body is mainly unconfined and located within a continuous detrital aquifer, primarily composed of sedimentary formations. However, the aquifer is subjected to an increasing threat of anthropogenic pollution pressure. Understanding the detailed origin of this pollution pressure is important for sustainable drinking water management in Kinshasa. The present study aims to explain the observed nitrate pollution problem, nitrate being considered a good tracer for other pollution threats. The analysis is made in terms of readily available physical attributes using a statistical modelling approach. For the nitrate data, use was made of a historical groundwater quality assessment study, for which the data were re-analysed. The physical attributes are related to the topography, land use, geology and hydrogeology of the region. Prior to the statistical modelling, intrinsic and specific vulnerability for nitrate pollution was assessed. This vulnerability assessment showed that the alluvium area in the northern part of the region is the most vulnerable area. This area consists of urban land use with poor sanitation. Re-analysis of the nitrate pollution data demonstrated that the spatial variability of nitrate concentrations in the groundwater body is high, and coherent with the fragmented land use of the region and the intrinsic and specific vulnerability maps. For the statistical modelling, use was made of multiple regression and regression tree analysis. The results demonstrated the significant impact of land use variables on the Kinshasa groundwater nitrate pollution and the need for a detailed delineation of groundwater capture zones around the monitoring stations. Key words: groundwater, isotopic, Kinshasa, modelling, pollution, physico-chemical.
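As an illustration of the regression step (a deliberately simplified single-predictor stand-in for the multiple regression used in the study; the data relating nitrate to an urban land-use fraction are invented), ordinary least squares can be sketched as:

```python
def ols_fit(x, y):
    # Ordinary least squares for y = a + b*x (single predictor).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx          # slope: change in nitrate per unit land-use fraction
    a = my - b * mx        # intercept
    return a, b

# Hypothetical example: nitrate concentration vs. urban land-use fraction.
intercept, slope = ols_fit([0.0, 0.2, 0.4, 0.6], [5.0, 9.0, 13.0, 17.0])
```

The multiple-regression case extends this to several attributes at once, and the regression tree instead partitions the attribute space recursively.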
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Mew, D. A.; DeHope, A.
Attribution of the origin of an illicit drug relies on identification of compounds indicative of its clandestine production and is a key component of many modern forensic investigations. The results of these studies can yield detailed information on method of manufacture, starting material source, and final product - all critical forensic evidence. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic fentanyl, N-(1-phenylethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods, all previously published fentanyl synthetic routes or hybrid versions thereof, were studied in an effort to identify and classify route-specific signatures. 160 distinct compounds and inorganic species were identified using gas and liquid chromatographies combined with mass spectrometric methods (GC-MS and LC-MS/MS-TOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS). The complexity of the resultant data matrix motivated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 87 route-specific CAS were classified, and a statistical model capable of predicting the method of fentanyl synthesis was validated and tested against CAS profiles from crude fentanyl products deposited on and later extracted from two operationally relevant surfaces: stainless steel and vinyl tile. This work provides the most detailed fentanyl CAS investigation to date, using orthogonal mass spectral data to identify CAS of forensic significance for illicit drug detection, profiling, and attribution.
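The paper's route classification uses PLS-DA; as a deliberately simplified stand-in, the core idea of assigning an unknown sample to the nearest route signature can be sketched with a nearest-centroid classifier (the two-feature CAS profiles and route labels below are invented):

```python
import math

def train_centroids(profiles, labels):
    # Average the CAS feature vectors for each synthesis route.
    groups = {}
    for vec, lab in zip(profiles, labels):
        groups.setdefault(lab, []).append(vec)
    return {lab: [sum(col) / len(col) for col in zip(*vecs)]
            for lab, vecs in groups.items()}

def classify(centroids, vec):
    # Assign the sample to the route whose centroid is closest (Euclidean).
    return min(centroids, key=lambda lab: math.dist(centroids[lab], vec))
```

PLS-DA differs by first projecting the high-dimensional signature matrix onto latent components that maximize covariance with the class labels, which matters when features are many and collinear.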
Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon
2015-11-03
Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
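The first mapDIA step, normalization by total intensity sums, can be sketched as follows (an illustrative reimplementation in Python, not mapDIA's actual C++ source; the runs and intensity values are invented):

```python
def total_intensity_normalize(runs):
    # runs: list of fragment-intensity lists, one per MS run.
    # Scale each run so its total intensity equals the grand mean total,
    # removing run-to-run differences in overall signal.
    totals = [sum(r) for r in runs]
    target = sum(totals) / len(totals)
    return [[x * target / t for x in r] for r, t in zip(runs, totals)]
```

mapDIA's alternative local normalization applies the same idea within retention-time windows instead of over whole runs.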
Barber, Julie A; Thompson, Simon G
1998-01-01
Objective To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study. Design Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline. Main outcome measures The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified. Results Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs. Conclusions The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice. 
Key messagesHealth economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trialsA review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpretedFew publications presented adequate descriptive information for costs or performed appropriate statistical analysesIn at least two thirds of the papers, the main conclusions regarding costs were not justifiedThe analysis and reporting of health economic assessments within randomised controlled trials urgently need improving PMID:9794854
NASA Astrophysics Data System (ADS)
Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard
2018-07-01
This work details the analysis of wafer level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.
The other half of the story: effect size analysis in quantitative research.
Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane
2013-01-01
Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
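One effect size index commonly paired with the two-sample t-test is Cohen's d with a pooled standard deviation. A minimal Python sketch (toy group data invented for illustration):

```python
import math
import statistics

def cohens_d(a, b):
    # Pooled-SD Cohen's d for two independent groups:
    # standardized difference between group means.
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (statistics.mean(a) - statistics.mean(b)) / pooled_sd
```

Reporting d alongside the p-value lets readers judge practical significance, since a tiny difference can be statistically significant in a large sample.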
Trends in study design and the statistical methods employed in a leading general medicine journal.
Gosho, M; Sato, Y; Nagashima, K; Takahashi, S
2018-02-01
Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate study design and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Due to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (eg Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and Fine-Gray proportional hazard model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. These methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel design, such as adaptive dose selection and sample size re-estimation, was sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in the light of the information found in some publications. 
Use of adaptive design with interim analyses is increasing after the presentation of the FDA guidance for adaptive design. © 2017 John Wiley & Sons Ltd.
Identification of the isomers using principal component analysis (PCA) method
NASA Astrophysics Data System (ADS)
Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur
2016-03-01
In this work, we have carried out a detailed statistical analysis of experimental mass spectra from xylene isomers. Principal Component Analysis (PCA) was used to identify the isomers, which cannot be distinguished using conventional statistical methods for the interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as an energy source for the ionisation processes. We collected data that were analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method for distinguishing isomers that cannot be identified by conventional mass analysis of dissociative ionisation processes in these molecules. The PCA results, depending on the laser pulse energy and the background pressure in the spectrometer, are presented in this work.
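A minimal sketch of how PCA extracts the leading component from mass-spectral feature vectors: center the data, form the covariance matrix, and find its dominant eigenvector by power iteration (pure Python; the toy data are invented and far smaller than real spectra):

```python
import math

def first_pc(data, iters=200):
    # data: list of feature vectors (e.g., intensities at selected m/z values).
    n, d = len(data), len(data[0])
    means = [sum(col) / n for col in zip(*data)]
    X = [[x - m for x, m in zip(row, means)] for row in data]
    # Sample covariance matrix.
    C = [[sum(X[k][i] * X[k][j] for k in range(n)) / (n - 1)
          for j in range(d)] for i in range(d)]
    # Power iteration converges to the leading eigenvector (first PC).
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v
```

Projecting each spectrum onto the first few such components gives the low-dimensional scores in which isomer classes separate.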
Statistical analysis of Turbine Engine Diagnostic (TED) field test data
NASA Astrophysics Data System (ADS)
Taylor, Malcolm S.; Monyak, John T.
1994-11-01
During the summer of 1993, a field test of turbine engine diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some of whose considerations are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights elicited are reported.
2017-08-30
as being three-fold: 1) a measurement of the integrity of both the central and peripheral visual processing centers; 2) an indicator of detail... visual assessment task integral to the Army's Class 1 Flight Physical (Ginsburg, 1981 and 1984; Bachman & Behar, 1986). During a Class 1 flight... systems. Meta-analysis has been defined as the statistical analysis of a collection of analytical results for the purpose of integrating the findings
Guam's forest resources, 2002.
Joseph A. Donnegan; Sarah L. Butler; Walter Grabowiecki; Bruce A. Hiserote; David. Limtiaco
2004-01-01
The Forest Inventory and Analysis Program collected, analyzed, and summarized field data on 46 forested plots on the island of Guam. Estimates of forest area, tree stem volume and biomass, the numbers of trees, tree damages, and the distribution of tree sizes were summarized for this statistical sample. Detailed tables and graphical highlights provide a summary of Guam...
Methods for the evaluation of alternative disaster warning systems
NASA Technical Reports Server (NTRS)
Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.
1977-01-01
For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.
Spatial prediction of landslide hazard using discriminant analysis and GIS
Peter V. Gorsevski; Paul Gessler; Randy B. Foltz
2000-01-01
Environmental attributes relevant for spatial prediction of landslides triggered by rain and snowmelt events were derived from digital elevation model (DEM). Those data in conjunction with statistics and geographic information system (GIS) provided a detailed basis for spatial prediction of landslide hazard. The spatial prediction of landslide hazard in this paper is...
Estimated Effects of Retirement Revision on Retention of Navy Tactical Pilots.
1986-12-01
detailed explanation of the procedure and proofs can be found in Hanushek and Jackson [Ref. 44]. ... VI. RESULTS AND ANALYSIS A. DESCRIPTIVE... Introduction to Econometrics, pp. 242-243, Prentice-Hall, 1978. 44. Hanushek, Eric and Jackson, John, Statistical Methods for Social Scientists, p. 188
A constant current charge technique for low Earth orbit life testing
NASA Technical Reports Server (NTRS)
Glueck, Peter
1991-01-01
A constant current charge technique for low earth orbit testing of nickel cadmium cells is presented. The method mimics the familiar taper charge of the constant potential technique while maintaining cell independence for statistical analysis. A detailed example application is provided and the advantages and disadvantages of this technique are discussed.
Palau's forest resources, 2003.
Joseph A. Donnegan; Sarah L. Butler; Olaf Kuegler; Brent J. Stroud; Bruce A. Hiserote; Kashgar. Rengulbai
2007-01-01
The Forest Inventory and Analysis Program collected, analyzed, and summarized field data on 54 forested plots on the islands in the Republic of Palau. Estimates of forest area, tree stem volume and biomass, the numbers of trees, tree damages, and the distribution of tree sizes were summarized for this statistical sample. Detailed tables and graphical highlights provide...
Mediation Analysis in a Latent Growth Curve Modeling Framework
ERIC Educational Resources Information Center
von Soest, Tilmann; Hagtvet, Knut A.
2011-01-01
This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…
California forests: trends, problems, and opportunities.
Charles L. Bolsinger
1980-01-01
The most recent information on forest area in California, volume of timber, ownership of forest resources, and rate of use and replenishment is summarized. An analysis of physical opportunities to increase timber production is presented, along with a discussion of problems relating to timber production. Also included are detailed statistical tables; a brief historical...
Conditional statistics in a turbulent premixed flame derived from direct numerical simulation
NASA Technical Reports Server (NTRS)
Mantel, Thierry; Bilger, Robert W.
1994-01-01
The objective of this paper is to briefly introduce conditional moment closure (CMC) methods for premixed systems and to derive the transport equation for the conditional species mass fraction, conditioned on the progress variable based on the enthalpy. Our statistical analysis is based on the 3-D DNS database of Trouve and Poinsot available at the Center for Turbulence Research. The initial conditions and characteristics (turbulence, thermo-diffusive properties) as well as the numerical method utilized in the DNS of Trouve and Poinsot are presented, and some details concerning our statistical analysis are also given. From the analysis of the DNS results, the effects of position in the flame brush and of the Damkoehler and Lewis numbers on the conditional mean scalar dissipation and conditional mean velocity are presented and discussed. Information concerning unconditional turbulent fluxes is also presented. The anomaly of counter-gradient diffusion for the turbulent flux of the progress variable, found in previous studies, is investigated.
NASA Astrophysics Data System (ADS)
Yan, Wang-Ji; Ren, Wei-Xin
2018-01-01
This study applies the theoretical findings on the circularly-symmetric complex normal ratio distribution of Yan and Ren (2016) [1,2] to transmissibility-based modal analysis from a statistical viewpoint. A probabilistic model of the transmissibility function in the vicinity of the resonant frequency is formulated in the modal domain, and some insightful comments are offered. It is theoretically revealed that the statistics of the transmissibility function around the resonant frequency depend solely on the 'noise-to-signal' ratio and the mode shapes. Building on this probabilistic model, the study poses modal identification within a Bayesian framework by borrowing a novel paradigm. Implementation issues unique to the proposed approach are resolved by a Lagrange multiplier approach. The study also explores the possibility of applying Bayesian analysis to distinguish harmonic components from structural ones. The approaches are verified with simulated data and experimental test data. The uncertainty behavior due to variation of different factors is also discussed in detail.
Statistical analysis of the determinations of the Sun's Galactocentric distance
NASA Astrophysics Data System (ADS)
Malkin, Zinovy
2013-02-01
Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.
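The metrological consistency check described above can be sketched with hypothetical R0-style values in kpc (not the paper's 53 measurements): a weighted mean together with the reduced chi-square (Birge ratio), which is near 1 when quoted errors are internally consistent.

```python
import math

# Weighted mean of independent measurements v_i with quoted errors e_i,
# plus the reduced chi-square as an internal-consistency diagnostic.
# Values below are illustrative, not the published R0 data.
def combine(values, errors):
    w = [1.0 / e ** 2 for e in errors]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    err = math.sqrt(1.0 / sum(w))
    chi2 = sum(((v - mean) / e) ** 2 for v, e in zip(values, errors))
    return mean, err, chi2 / (len(values) - 1)  # reduced chi-square ~ 1 if consistent

vals = [8.0, 8.3, 8.1, 8.4, 7.9]
errs = [0.3, 0.2, 0.4, 0.3, 0.4]
mean, err, birge = combine(vals, errs)
```

A reduced chi-square well below 1 suggests overestimated errors; well above 1, underestimated errors or a real inconsistency.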
ERIC Educational Resources Information Center
Berkner, Lutz; Wei, Christina Chang
2006-01-01
This report, based on data from the 2003-04 National Postsecondary Student Aid Study (NASAS:04), provides detailed information about undergraduate tuition and total price of attendance at various types of institutions, the percentage of students receiving various types of financial aid, and the average amounts that they received. In 2003-04,…
NASA Astrophysics Data System (ADS)
Zeng, Bangze; Zhu, Youpan; Li, Zemin; Hu, Dechao; Luo, Lin; Zhao, Deli; Huang, Juan
2014-11-01
Owing to low contrast, heavy noise, and poor visual quality in infrared images, targets are difficult to observe and identify. This paper presents an improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering (AHSS-GF). Because the human eye is highly sensitive to edges and lines, the authors extract details and textures using gradient filtering. A new histogram is computed by summing the original histogram over a fixed window, and histogram statistical stretching is then applied with the minimum value as the cut-off point. After appropriate weights are assigned to the details and the background, the detail-enhanced result is obtained. The results indicate that image contrast can be improved and that details and textures can be enhanced effectively.
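A minimal grayscale sketch of the general idea (assumed simplifications, not the authors' algorithm: a first-difference gradient stands in for their gradient filter, and a plain linear stretch for the adaptive histogram stretch):

```python
# Extract a detail layer with a simple gradient filter, stretch the
# base image, then recombine with fixed weights. Pure-Python toy,
# operating on a small list-of-lists "image".
def enhance(img, w_detail=0.6, w_base=1.0):
    h, w = len(img), len(img[0])
    # detail layer: first-difference gradient magnitude
    detail = [[abs(img[y][min(x + 1, w - 1)] - img[y][x]) +
               abs(img[min(y + 1, h - 1)][x] - img[y][x])
               for x in range(w)] for y in range(h)]
    lo = min(min(r) for r in img)
    hi = max(max(r) for r in img)
    # linear stretch of the base image to [0, 255]
    base = [[(v - lo) * 255.0 / (hi - lo) for v in row] for row in img]
    return [[min(255.0, base[y][x] * w_base + detail[y][x] * w_detail)
             for x in range(w)] for y in range(h)]

img = [[10, 10, 10, 60],
       [10, 10, 60, 60],
       [10, 60, 60, 60]]
out = enhance(img)  # edges get boosted by the detail term
```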
RipleyGUI: software for analyzing spatial patterns in 3D cell distributions
Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik
2013-01-01
The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
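The K-function the software is built around can be sketched in a few lines. This is an uncorrected estimator (an assumed simplification; RipleyGUI itself offers more careful treatment of boundaries): K(r) is the expected number of further points within distance r of a typical point, divided by the intensity n/V.

```python
import math, random

# Naive 3D Ripley's K estimate without edge correction, on a toy
# point set; not RipleyGUI's implementation.
def ripley_k(points, r, volume):
    n = len(points)
    pairs = sum(1 for i in range(n) for j in range(n)
                if i != j and math.dist(points[i], points[j]) <= r)
    return volume * pairs / (n * (n - 1))

random.seed(1)
pts = [(random.random(), random.random(), random.random()) for _ in range(300)]
k = ripley_k(pts, r=0.2, volume=1.0)
# Under complete spatial randomness K(r) ≈ (4/3)*pi*r^3; the uncorrected
# estimate sits somewhat below that because of boundary losses.
```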
Statistical plant set estimation using Schroeder-phased multisinusoidal input design
NASA Technical Reports Server (NTRS)
Bayard, D. S.
1992-01-01
A frequency domain method is developed for plant set estimation. The estimation of a plant 'set' rather than a point estimate is required to support many methods of modern robust control design. The approach here is based on using a Schroeder-phased multisinusoid input design which has the special property of placing input energy only at the discrete frequency points used in the computation. A detailed analysis of the statistical properties of the frequency domain estimator is given, leading to exact expressions for the probability distribution of the estimation error, and many important properties. It is shown that, for any nominal parametric plant estimate, one can use these results to construct an overbound on the additive uncertainty to any prescribed statistical confidence. The 'soft' bound thus obtained can be used to replace 'hard' bounds presently used in many robust control analysis and synthesis methods.
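The "special property" of the input can be illustrated with a small sketch (assumed flat amplitude spectrum): a Schroeder-phased multisine places energy only at harmonics k·f0, and the phases phi_k = -pi·k·(k-1)/N keep the crest factor (peak/RMS) far below that of a zero-phase sum of the same cosines.

```python
import math

# Generate one period of an N-harmonic multisine, with Schroeder
# phases or with all phases zero, and compare crest factors.
def multisine(n_harm, n_samples, schroeder=True):
    x = []
    for i in range(n_samples):
        t = i / n_samples  # one fundamental period
        s = 0.0
        for k in range(1, n_harm + 1):
            phi = -math.pi * k * (k - 1) / n_harm if schroeder else 0.0
            s += math.cos(2.0 * math.pi * k * t + phi)
        x.append(s)
    return x

def crest_factor(x):
    rms = math.sqrt(sum(v * v for v in x) / len(x))
    return max(abs(v) for v in x) / rms

cf_schroeder = crest_factor(multisine(31, 4096))
cf_zero = crest_factor(multisine(31, 4096, schroeder=False))  # all peaks align at t=0
```

The low crest factor lets the experimenter inject more energy per frequency line without saturating actuators, which is why such inputs suit frequency-domain set estimation.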
The statistical analysis of circadian phase and amplitude in constant-routine core-temperature data
NASA Technical Reports Server (NTRS)
Brown, E. N.; Czeisler, C. A.
1992-01-01
Accurate estimation of the phases and amplitude of the endogenous circadian pacemaker from constant-routine core-temperature series is crucial for making inferences about the properties of the human biological clock from data collected under this protocol. This paper presents a set of statistical methods based on a harmonic-regression-plus-correlated-noise model for estimating the phases and the amplitude of the endogenous circadian pacemaker from constant-routine core-temperature data. The methods include a Bayesian Monte Carlo procedure for computing the uncertainty in these circadian functions. We illustrate the techniques with a detailed study of a single subject's core-temperature series and describe their relationship to other statistical methods for circadian data analysis. In our laboratory, these methods have been successfully used to analyze more than 300 constant routines and provide a highly reliable means of extracting phase and amplitude information from core-temperature data.
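A minimal harmonic-regression sketch conveys the phase and amplitude extraction (ordinary least squares with a known 24 h period; the paper's actual model adds correlated noise and Bayesian uncertainty estimates). On an even grid over a full period the cos/sin regressors are orthogonal, so the fit reduces to simple projections.

```python
import math, random

# Fit y(t) = mesor + A*cos(w*(t - phase)) with w = 2*pi/period,
# via projections onto cos and sin. Synthetic data, not a real
# constant-routine core-temperature series.
def fit_harmonic(times, y, period=24.0):
    n = len(y)
    w = 2.0 * math.pi / period
    mesor = sum(y) / n
    a = 2.0 / n * sum(yi * math.cos(w * t) for t, yi in zip(times, y))
    b = 2.0 / n * sum(yi * math.sin(w * t) for t, yi in zip(times, y))
    amplitude = math.hypot(a, b)
    phase = (math.atan2(b, a) / w) % period  # hours after t=0 of the fitted peak
    return mesor, amplitude, phase

random.seed(2)
ts = [i * 24.0 / 288 for i in range(288)]  # 5-minute samples over 24 h
temp = [37.0 + 0.4 * math.cos(2.0 * math.pi * (t - 16.0) / 24.0)
        + random.gauss(0.0, 0.05) for t in ts]
mesor, amp, phase = fit_harmonic(ts, temp)  # ≈ 37.0, 0.4, 16.0
```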
Universal self-similarity of propagating populations
NASA Astrophysics Data System (ADS)
Eliazar, Iddo; Klafter, Joseph
2010-07-01
This paper explores the universal self-similarity of propagating populations. The following general propagation model is considered: particles are randomly emitted from the origin of a d -dimensional Euclidean space and propagate randomly and independently of each other in space; all particles share a statistically common—yet arbitrary—motion pattern; each particle has its own random propagation parameters—emission epoch, motion frequency, and motion amplitude. The universally self-similar statistics of the particles’ displacements and first passage times (FPTs) are analyzed: statistics which are invariant with respect to the details of the displacement and FPT measurements and with respect to the particles’ underlying motion pattern. Analysis concludes that the universally self-similar statistics are governed by Poisson processes with power-law intensities and by the Fréchet and Weibull extreme-value laws.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
An airborne combined radiometric and magnetic survey was performed for the Department of Energy (DOE) over the Durango A, Durango B, Durango C, and Durango D Detail Areas of southwestern Colorado. The Durango A Detail Area is within the coverage of the Needle Mountains and Silverton 15' map sheets, and the Pole Creek Mountain, Rio Grande Pyramid, Emerald Lake, Granite Peak, Vallecito Reservoir, and Lemon Reservoir 7.5' map sheets of the National Topographic Map Series (NTMS). The Durango B Detail Area is within the coverage of the Silverton 15' map sheet and the Wetterhorn Peak, Uncompahgre Peak, Lake City, Redcloud Peak, Lake San Cristobal, Pole Creek Mountain, and Finger Mesa 7.5' map sheets of the NTMS. The Durango C Detail Area is within the coverage of the Platoro and Wolf Creek Pass 15' map sheets of the NTMS. The Durango D Detail Area is within the coverage of the Granite Lake, Cimarrona Peak, Bear Mountain, and Oakbrush Ridge 7.5' map sheets of the NTMS. Radiometric data were corrected for live time, aircraft and equipment background, cosmic background, atmospheric radon, Compton scatter, and altitude dependence. The corrected data were statistically evaluated, gridded, and contoured to produce maps of the radiometric variables, uranium, potassium, and thorium; their ratios; and the residual magnetic field. These maps have been analyzed in order to produce a multi-variant analysis contour map based on the radiometric response of the individual geological units. A geochemical analysis has been performed, using the radiometric and magnetic contour maps, the multi-variant analysis map, and factor analysis techniques, to produce a geochemical analysis map for the area.
Illinois travel statistics, 2009
DOT National Transportation Integrated Search
2010-01-01
The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2001
DOT National Transportation Integrated Search
2002-01-01
The 2001 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2003
DOT National Transportation Integrated Search
2004-01-01
The 2003 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2010
DOT National Transportation Integrated Search
2011-01-01
The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2005
DOT National Transportation Integrated Search
2006-01-01
The 2005 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2007
DOT National Transportation Integrated Search
2008-01-01
The 2007 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2000
DOT National Transportation Integrated Search
2001-01-01
The 2000 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2006
DOT National Transportation Integrated Search
2007-01-01
The 2006 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2002
DOT National Transportation Integrated Search
2003-01-01
The 2002 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2008
DOT National Transportation Integrated Search
2009-01-01
The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2004
DOT National Transportation Integrated Search
2005-01-01
The 2004 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 1999
DOT National Transportation Integrated Search
2000-01-01
The 1999 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-06-01
Case-study methodology is often used to study change in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification of the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. This approach to assessing change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, that is, the lack of analysis methods for noncontinuous data (such as counts, rates, and proportions of events) that may be used in case-study designs.
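The classic Reliable Change Index that the note builds on can be sketched directly (the note's binary-outcome modification is not reproduced here, and the scores below are hypothetical):

```python
import math

# Jacobson & Truax (1991) Reliable Change Index: |RCI| > 1.96
# indicates change beyond measurement error at the 5% level.
def reliable_change_index(pre, post, sd_baseline, reliability):
    sem = sd_baseline * math.sqrt(1.0 - reliability)  # standard error of measurement
    s_diff = math.sqrt(2.0) * sem                     # SE of a difference score
    return (post - pre) / s_diff

# Hypothetical severity scores before and after treatment for stuttering
rci = reliable_change_index(pre=30.0, post=18.0, sd_baseline=8.0, reliability=0.9)
```

Here a negative RCI beyond -1.96 would indicate a reliable decrease in severity.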
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
Terrain-analysis procedures for modeling radar backscatter
Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis
1978-01-01
The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscatter modeling. Radar is especially sensitive to surface-relief changes at the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated from sets of field measurements by a comprehensive set of seven programmed calculations for radar-backscatter modeling. The seven programs are: 1) formatting of data into readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base-length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is here presented for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.
A new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system
NASA Astrophysics Data System (ADS)
Wang, Zhen; Huang, Xia; Li, Yu-Xia; Song, Xiao-Na
2013-01-01
We propose a new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system. In the process of generating the key stream, the system parameters and the derivative order are embedded in the algorithm to enhance security. The algorithm is evaluated through a series of security analyses, including correlation analysis, information entropy analysis, run-statistics analysis, mean-variance gray-value analysis, and key sensitivity analysis. The experimental results demonstrate that the proposed image encryption scheme has a large key space and high security for practical image encryption.
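The keystream-cipher pattern common to this family of schemes can be illustrated with a stand-in map (an assumption for brevity: the logistic map below is NOT the paper's fractional-order hyperchaotic Lorenz system, but it shows the same XOR structure and key sensitivity):

```python
# Illustrative chaos-based keystream cipher on toy byte data.
def keystream(n, x0, r=3.99):
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)           # logistic-map iteration
        out.append(int(x * 256) % 256)  # quantize the state to a byte
    return out

def xor_cipher(data, key_x0):
    return bytes(b ^ k for b, k in zip(data, keystream(len(data), key_x0)))

pixels = bytes(range(64))           # toy "image" data
enc = xor_cipher(pixels, 0.61)
dec = xor_cipher(enc, 0.61)         # correct key recovers the data
bad = xor_cipher(enc, 0.6100001)    # tiny key change yields garbage (key sensitivity)
```

The sensitivity of chaotic trajectories to the initial condition is what makes the key space effectively continuous in such designs.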
ERIC Educational Resources Information Center
West Virginia Higher Education Policy Commission, 2004
2004-01-01
The West Virginia Higher Education Facilities Information System was formed as a method for instituting statewide standardization of space use and classification; to serve as a vehicle for statewide data acquisition; and to provide statistical data that contributes to detailed institutional planning analysis. The result thus far is the production…
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 1 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in radiation. (BT)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-04
... this formula and is designed to calculate the amount of money that may be lost on a portfolio over a... to statistical analysis, such as OTC Bulletin Board or Pink Sheet issues or issues trading below a... considered matched and institutional delivery details are sent to DTC for settlement. Completion of the money...
Traeger, Adrian C; Skinner, Ian W; Hübscher, Markus; Lee, Hopin; Moseley, G Lorimer; Nicholas, Michael K; Henschke, Nicholas; Refshauge, Kathryn M; Blyth, Fiona M; Main, Chris J; Hush, Julia M; Pearce, Garry; Lo, Serigne; McAuley, James H
Statistical analysis plans increase the transparency of decisions made in the analysis of clinical trial results. The purpose of this paper is to detail the planned analyses for the PREVENT trial, a randomized, placebo-controlled trial of patient education for acute low back pain. We report the pre-specified principles, methods, and procedures to be adhered to in the main analysis of the PREVENT trial data. The primary outcome analysis will be based on Mixed Models for Repeated Measures (MMRM), which can test treatment effects at specific time points, and the assumptions of this analysis are outlined. We also outline the treatment of secondary outcomes and planned sensitivity analyses. We provide decisions regarding the treatment of missing data, handling of descriptive and process measure data, and blinded review procedures. Making public the pre-specified statistical analysis plan for the PREVENT trial minimizes the potential for bias in the analysis of trial data, and in the interpretation and reporting of trial results. ACTRN12612001180808 (https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?ACTRN=12612001180808). Copyright © 2017 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Publicado por Elsevier Editora Ltda. All rights reserved.
Shim, Minsun; Kim, Yong-Chan; Kye, Su Yeon; Park, Keeho
2016-08-01
How the news media cover cancer may have profound significance for cancer prevention and control; however, little is known about the actual content of cancer news coverage in Korea. This research thus aimed to examine news portrayal of specific cancer types with respect to threat and efficacy, and to investigate whether news portrayal corresponds to actual cancer statistics. A content analysis of 1,138 cancer news stories was conducted, using a representative sample from 23 news outlets (television, newspapers, and other news media) in Korea over a 5-year period from 2008 to 2012. Cancer incidence and mortality rates were obtained from the Korean Statistical Information Service. Results suggest that threat was most prominent in news stories on pancreatic cancer (with 87% of the articles containing threat information with specific details), followed by liver (80%) and lung cancers (70%), and least in stomach cancer (41%). Efficacy information with details was conveyed most often in articles on colorectal (54%), skin (54%), and liver (50%) cancers, and least in thyroid cancer (17%). In terms of discrepancies between news portrayal and actual statistics, the threat of pancreatic and liver cancers was overreported, whereas the threat of stomach and prostate cancers was underreported. Efficacy information regarding cervical and colorectal cancers was overrepresented in the news relative to cancer statistics; efficacy of lung and thyroid cancers was underreported. Findings provide important implications for medical professionals to understand news information about particular cancers as a basis for public (mis)perception, and to communicate effectively about cancer risk with the public and patients.
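The over/under-representation comparison can be expressed as a simple index (hypothetical numbers below, not the study's data): a cancer's share of news stories divided by its share of the actual burden, with values above 1 meaning overreported.

```python
# Toy representation index comparing news share with mortality share.
def representation_ratio(news_count, total_news, deaths, total_deaths):
    return (news_count / total_news) / (deaths / total_deaths)

# Hypothetical example: 120 of 1,138 stories vs 300 of 10,000 deaths
ratio = representation_ratio(news_count=120, total_news=1138,
                             deaths=300, total_deaths=10000)  # > 1: overreported
```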
Olavarría, Verónica V; Arima, Hisatomi; Anderson, Craig S; Brunser, Alejandro; Muñoz-Venturelli, Paula; Billot, Laurent; Lavados, Pablo M
2017-02-01
Background: The HEADPOST Pilot is a proof-of-concept, open, prospective, multicenter, international, cluster randomized, phase IIb controlled trial with masked outcome assessment. The trial will test whether the lying-flat head position, initiated within 12 h of onset of acute ischemic stroke involving the anterior circulation, increases cerebral blood flow in the middle cerebral arteries, as measured by transcranial Doppler. The study will also assess the safety and feasibility of patients lying flat for ≥24 h. The trial was conducted in centers in three countries with the ability to perform early transcranial Doppler. A feature of this trial was that patients were randomized to a position according to the month of admission to hospital. Objective: To outline in detail the predetermined statistical analysis plan for the HEADPOST Pilot study. Methods: All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item appropriate descriptive statistical analyses are planned, with comparisons made between randomized groups. For the outcomes, the statistical comparisons to be made between groups are planned and described. Results: This statistical analysis plan was developed for the analysis of the results of the HEADPOST Pilot study to be transparent, available, verifiable, and predetermined before data lock. Conclusions: We have developed a statistical analysis plan for the HEADPOST Pilot study which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. Trial registration: The study is registered under HEADPOST-Pilot, ClinicalTrials.gov Identifier NCT01706094.
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-square (χ²) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet, and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated statistical tools and with matching those tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997.
It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing statistically oriented papers submitted to the Astrophysical Journal, giving talks at meetings (including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting), and publishing papers of astrostatistical content.
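As an illustration of the Kolmogorov-Smirnov test the abstract mentions among astronomy's "old and familiar" tools, the minimal sketch below (invented data and parameters, not from the SCCA) shows a KS test rejecting a misspecified normal model for skewed data while accepting a lognormal one:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical skewed measurements (e.g. luminosities), actually lognormal.
data = rng.lognormal(mean=0.0, sigma=1.0, size=500)

# One-sample KS test against a (wrong) normal model fitted by moments.
mu, sd = data.mean(), data.std(ddof=1)
ks_normal = stats.kstest(data, "norm", args=(mu, sd))

# KS test against the correct lognormal family fitted to the data.
shape, loc, scale = stats.lognorm.fit(data, floc=0)
ks_lognorm = stats.kstest(data, "lognorm", args=(shape, loc, scale))

print(f"normal model:    D={ks_normal.statistic:.3f}, p={ks_normal.pvalue:.2e}")
print(f"lognormal model: D={ks_lognorm.statistic:.3f}, p={ks_lognorm.pvalue:.2f}")
```

Note that using parameters estimated from the same sample biases KS p-values upward, one of the subtleties that motivated consulting services such as the SCCA.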
Neti, Prasad V.S.V.; Howell, Roger W.
2010-01-01
Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049–1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods: The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
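The Poisson-lognormal idea in this abstract can be illustrated with a small simulation (hypothetical parameters, not the paper's data): when each cell's activity is lognormally distributed and the observed track count is Poisson given that activity, the variance-to-mean (Fano) ratio of the counts greatly exceeds the value of 1 expected for a pure Poisson process:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells = 10_000

# Hypothetical lognormal activity per cell (mean track count ~ 3.7).
activity = rng.lognormal(mean=1.0, sigma=0.8, size=n_cells)

# Observed tracks: Poisson counting noise on top of the LN activity (P-LN model).
tracks = rng.poisson(activity)

# Fano factor (variance / mean) is ~1 for pure Poisson counting;
# lognormal cell-to-cell heterogeneity inflates it well above 1.
fano_poisson = rng.poisson(activity.mean(), size=n_cells).var() / activity.mean()
fano_pln = tracks.var() / tracks.mean()

print(f"Fano factor, pure Poisson:       {fano_poisson:.2f}")
print(f"Fano factor, Poisson-lognormal:  {fano_pln:.2f}")
```

This is the qualitative signature the paper's statistical tests exploit: the counting process broadens, but does not erase, the underlying lognormal activity distribution.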
Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions
NASA Technical Reports Server (NTRS)
Mcguire, Stephen C.
1987-01-01
The use of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon is described. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark-gluon plasma. The techniques include chi-square null hypothesis tests, discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, one Fe event at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
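A chi-square null hypothesis test of azimuthal isotropy, of the kind listed above, can be sketched as follows (simulated angles with an injected asymmetry; the event, bin count, and parameters are all hypothetical, not JACEE data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical azimuthal angles (radians) of secondary particles in one event:
# half isotropic, half clustered near phi = pi to mimic an azimuthal asymmetry.
n = 400
phi = np.concatenate([
    rng.uniform(0, 2 * np.pi, size=n // 2),
    rng.normal(np.pi, 0.4, size=n // 2) % (2 * np.pi),
])

# Bin the angles and test against the flat (isotropic) null hypothesis.
counts, _ = np.histogram(phi, bins=12, range=(0, 2 * np.pi))
expected = np.full(12, n / 12)
chi2, p = stats.chisquare(counts, expected)

print(f"chi2 = {chi2:.1f} on {len(counts) - 1} dof, p = {p:.2e}")
```

The abstract's closing caveat applies directly: with low multiplicities the expected counts per bin become small and the chi-square approximation degrades.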
Haneuse, Sebastien; Lee, Kyu Ha
2016-05-01
Hospital readmission is a key marker of quality of health care. Notwithstanding its widespread use, however, it remains controversial in part because statistical methods used to analyze readmission, primarily logistic regression and related models, may not appropriately account for patients who die before experiencing a readmission event within the time frame of interest. Toward resolving this, we describe and illustrate the semi-competing risks framework, which refers to the general setting where scientific interest lies with some nonterminal event (eg, readmission), the occurrence of which is subject to a terminal event (eg, death). Although several statistical analysis methods have been proposed for semi-competing risks data, we describe in detail the use of illness-death models primarily because of their relation to well-known methods for survival analysis and the availability of software. We also describe and consider in detail several existing approaches that could, in principle, be used to analyze semi-competing risks data, including composite end point and competing risks analyses. Throughout we illustrate the ideas and methods using data on N=49 763 Medicare beneficiaries hospitalized between 2011 and 2013 with a principal discharge diagnosis of heart failure. © 2016 American Heart Association, Inc.
Thybo, Kasper Højgaard; Jakobsen, Janus Christian; Hägi-Pedersen, Daniel; Pedersen, Niels Anker; Dahl, Jørgen Berg; Schrøder, Henrik Morville; Bülow, Hans Henrik; Bjørck, Jan Gottfrid; Overgaard, Søren; Mathiesen, Ole; Wetterslev, Jørn
2017-10-10
Effective postoperative pain management is essential for the rehabilitation of the surgical patient. The PANSAID trial evaluates the analgesic effects and safety of the combination of paracetamol and ibuprofen. This paper describes in detail the statistical analysis plan for the primary publication to prevent outcome reporting bias and data-driven analysis results. The PANSAID trial is a multicentre, randomised, controlled, parallel, four-group clinical trial comparing the beneficial and harmful effects of different doses and combinations of paracetamol and ibuprofen in patients having total hip arthroplasty. Patients, caregivers, physicians, investigators, and statisticians are blinded to the intervention. The two co-primary outcomes are 24-h consumption of morphine and the proportion of patients with one or more serious adverse events within 90 days after surgery. Secondary outcomes are pain scores during mobilisation and at rest at 6 and 24 h postoperatively, and the proportion of patients with one or more adverse events within 24 h postoperatively. PANSAID will provide a large trial with low risk of bias regarding the benefits and harms of the combination of paracetamol and ibuprofen used in a perioperative setting. ClinicalTrials.gov identifier: NCT02571361. Registered on 7 October 2015.
Variational Bayesian Parameter Estimation Techniques for the General Linear Model
Starke, Ludger; Ostwald, Dirk
2017-01-01
Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
NASA Astrophysics Data System (ADS)
Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrence, and mitigate their effects. For this purpose, we augmented the web-GIS called “CLIMATE” to include a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing, and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression, and the copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach allows one to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shear, Trevor Allan
Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold-coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.
Disutility analysis of oil spills: graphs and trends.
Ventikos, Nikolaos P; Sotiropoulos, Foivos S
2014-04-15
This paper reports the results of an analysis of oil spill cost data assembled from a worldwide pollution database that mainly includes data from the International Oil Pollution Compensation Fund. The purpose of the study is to analyze the conditions of marine pollution accidents and the factors that impact the costs of oil spills worldwide. The accidents are classified into categories based on their characteristics, and the cases are compared using charts to show how the costs are affected under all conditions. This study can be used as a helpful reference for developing a detailed statistical model that is capable of reliably and realistically estimating the total costs of oil spills. To illustrate the differences identified by this statistical analysis, the results are compared with the results of previous studies, and the findings are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Linear regression analysis: part 14 of a series on evaluation of scientific publications.
Schneider, Astrid; Hommel, Gerhard; Blettner, Maria
2010-11-01
Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
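A minimal worked example of the univariable linear regression this tutorial introduces (entirely invented medical data, chosen only to mirror the article's setting):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical data: systolic blood pressure (mmHg) vs. age (years),
# with a true slope of 0.8 mmHg/year plus additive noise.
age = rng.uniform(30, 70, size=100)
sbp = 100 + 0.8 * age + rng.normal(0, 8, size=100)

res = stats.linregress(age, sbp)

print(f"slope = {res.slope:.2f} mmHg/year "
      f"(approx. 95% CI half-width {1.96 * res.stderr:.2f})")
print(f"intercept = {res.intercept:.1f} mmHg, R^2 = {res.rvalue**2:.2f}")
```

Reading such output critically, slope, confidence interval, and R² together rather than the p-value alone, is exactly the interpretive skill the article aims to build.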
Mayer, Brian P.; DeHope, Alan J.; Mew, Daniel A.; ...
2016-03-24
Attribution of the origin of an illicit drug relies on identification of compounds indicative of its clandestine production and is a key component of many modern forensic investigations. Here, the results of these studies can yield detailed information on method of manufacture, starting material source, and final product, all critical forensic evidence. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic fentanyl, N-(1-phenylethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods, all previously published fentanyl synthetic routes or hybrid versions thereof, were studied in an effort to identify and classify route-specific signatures. A total of 160 distinct compounds and inorganic species were identified using gas and liquid chromatographies combined with mass spectrometric methods (gas chromatography/mass spectrometry (GC/MS) and liquid chromatography–tandem mass spectrometry-time of-flight (LC–MS/MS-TOF)) in conjunction with inductively coupled plasma mass spectrometry (ICPMS). The complexity of the resultant data matrix urged the use of multivariate statistical analysis. Using partial least-squares-discriminant analysis (PLS-DA), 87 route-specific CAS were classified and a statistical model capable of predicting the method of fentanyl synthesis was validated and tested against CAS profiles from crude fentanyl products deposited and later extracted from two operationally relevant surfaces: stainless steel and vinyl tile. Finally, this work provides the most detailed fentanyl CAS investigation to date by using orthogonal mass spectral data to identify CAS of forensic significance for illicit drug detection, profiling, and attribution.
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
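Two of the approximations this review evaluates can be sketched numerically. The formulas below are common rules of thumb of the kinds described (range-based SD estimation and a quartile-based mean estimate), not necessarily the exact variants the authors tested; the sample is simulated:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical trial arm whose mean and SD went unreported; suppose only
# n, min, max, and the quartiles survive in the publication.
sample = rng.normal(50, 10, size=400)
q1, med, q3 = np.percentile(sample, [25, 50, 75])
lo, hi = sample.min(), sample.max()

# Range-based SD approximation: for large n, the range of roughly normal
# data spans about 6 SDs (range/4 is the usual rule for small samples).
sd_from_range = (hi - lo) / 6

# Quartile-based mean approximation: mean ~ (q1 + median + q3) / 3.
mean_from_quartiles = (q1 + med + q3) / 3

print(f"true mean/SD:   {sample.mean():.1f} / {sample.std(ddof=1):.1f}")
print(f"approx mean/SD: {mean_from_quartiles:.1f} / {sd_from_range:.1f}")
```

As the review's illustrative meta-analyses show, such approximations can preserve precision better than omitting the trial outright, though performance depends on the underlying distribution.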
Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S
2016-11-01
There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
NASA Astrophysics Data System (ADS)
Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem
2018-05-01
In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages, and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.
Statistical models of lunar rocks and regolith
NASA Technical Reports Server (NTRS)
Marcus, A. H.
1973-01-01
The mathematical, statistical, and computational approaches used in the investigation of the interrelationship of lunar fragmental material, regolith, lunar rocks, and lunar craters are described. The first two phases of the work explored the sensitivity of the production model of fragmental material to mathematical assumptions, and then completed earlier studies on the survival of lunar surface rocks with respect to competing processes. The third phase combined earlier work into a detailed statistical analysis and probabilistic model of regolith formation by lithologically distinct layers, interpreted as modified crater ejecta blankets. The fourth phase of the work dealt with problems encountered in combining the results of the entire project into a comprehensive, multipurpose computer simulation model for the craters and regolith. Highlights of each phase of research are given.
Mars Pathfinder Near-Field Rock Distribution Re-Evaluation
NASA Technical Reports Server (NTRS)
Haldemann, A. F. C.; Golombek, M. P.
2003-01-01
We have completed analysis of a new near-field rock count at the Mars Pathfinder landing site and determined that the previously published rock count suggesting 16% cumulative fractional area (CFA) covered by rocks is incorrect. The earlier value is not so much wrong (our new CFA is 20%) as right for the wrong reason: both the old and the new CFAs are consistent with remote sensing data; however, the earlier determination incorrectly calculated rock coverage using apparent width rather than average diameter. Here we present details of the new rock database and the new statistics, as well as the importance of using rock average diameter for rock population statistics. The changes to the near-field data do not affect the far-field rock statistics.
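The CFA calculation at issue can be sketched as follows, using the average-diameter convention the abstract advocates. The rock-size distribution, count, and mapped area here are invented for illustration, not the Pathfinder database:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical near-field survey: average diameters (m) of counted rocks
# within a mapped area. CFA = total projected rock area / mapped area.
diameters = rng.lognormal(mean=-2.0, sigma=0.7, size=500)  # ~0.14 m typical
mapped_area = 50.0  # m^2

def cumulative_fractional_area(d, area):
    """Treat each rock as a circle of its average diameter d,
    so projected area = pi * (d/2)**2, and sum over all rocks."""
    return np.sum(np.pi * (d / 2.0) ** 2) / area

cfa = cumulative_fractional_area(diameters, mapped_area)
print(f"CFA = {cfa:.1%}")
```

Substituting apparent width for average diameter in this formula systematically biases the per-rock areas, which is the error the re-evaluation corrects.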
Design optimization of a prescribed vibration system using conjoint value analysis
NASA Astrophysics Data System (ADS)
Malinga, Bongani; Buckner, Gregory D.
2016-12-01
This article details a novel design optimization strategy for a prescribed vibration system (PVS) used to mechanically filter solids from fluids in oil and gas drilling operations. A dynamic model of the PVS is developed, and the effects of disturbance torques are detailed. This model is used to predict the effects of design parameters on system performance and efficiency, as quantified by system attributes. Conjoint value analysis, a statistical technique commonly used in marketing science, is utilized to incorporate designer preferences. This approach effectively quantifies and optimizes preference-based trade-offs in the design process. The effects of designer preferences on system performance and efficiency are simulated. This novel optimization strategy yields improvements in all system attributes across all simulated vibration profiles, and is applicable to other industrial electromechanical systems.
ERIC Educational Resources Information Center
Kapoor, Kanta
2010-01-01
Purpose: The purpose of this paper is to quantify the use of electronic journals in comparison with the print collections in the Guru Gobind Singh Indraprastha University Library. Design/methodology/approach: A detailed analysis was made of the use of lending services, the Xerox facility and usage of electronic journals such as Science Direct,…
Evaluation of scheduling techniques for payload activity planning
NASA Technical Reports Server (NTRS)
Bullington, Stanley F.
1991-01-01
Two tasks related to payload activity planning and scheduling were performed. The first task involved making a comparison of space mission activity scheduling problems with production scheduling problems. The second task consisted of a statistical analysis of the output of runs of the Experiment Scheduling Program (ESP). Details of the work which was performed on these two tasks are presented.
NASA Technical Reports Server (NTRS)
1980-01-01
An area around the Munich-Riem airport was divided into 32 clusters of different noise exposure and subjects were drawn from each cluster for a social survey and for psychological, medical, and physiological testing. Extensive acoustical measurements were also carried out in each cluster. The results were then subjected to detailed statistical analysis.
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 9 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in medical laboratory technology. (BT)
Analysis of satellite data on energetic particles of ionospheric origin
NASA Technical Reports Server (NTRS)
Sharp, R. D.; Johnson, R. G.; Shelley, E. G.
1976-01-01
The principal result of this program has been the completion of a detailed statistical study of the properties of precipitating O(+) and H(+) ions during two principal magnetic storms. The results of the analysis of selected data from the ion mass spectrometer experiments on satellites are given, with emphasis on the morphology of the O(+) ions of ionospheric origin with energies in the range 0.7 ≤ E ≤ 12 keV that were discovered with this experiment.
Statistical Analysis of 30 Years Rainfall Data: A Case Study
NASA Astrophysics Data System (ADS)
Arvind, G.; Ashok Kumar, P.; Girish Karthi, S.; Suribabu, C. R.
2017-07-01
Rainfall is a prime input for various engineering designs such as hydraulic structures, bridges and culverts, canals, storm water sewers and road drainage systems. A detailed statistical analysis of each region is essential to estimate the relevant input values for the design and analysis of engineering structures and also for crop planning. A rain gauge station located in Trichy district, where agriculture is the prime occupation, is selected for statistical analysis. The daily rainfall data for a period of 30 years are used to understand the normal rainfall, deficit rainfall, excess rainfall and seasonal rainfall of the selected circle headquarters. Further, the various plotting position formulae available are used to evaluate the return periods of monthly, seasonal and annual rainfall. This analysis will provide useful information for water resources planners, farmers and urban engineers to assess the availability of water and create storage accordingly. The mean, standard deviation and coefficient of variation of monthly and annual rainfall were calculated to check the rainfall variability. From the calculated results, the rainfall pattern is found to be erratic. The best-fit probability distribution was identified based on the minimum deviation between actual and estimated values. The scientific results of the analysis paved the way to determine the proper onset and withdrawal of the monsoon, results which were used for land preparation and sowing.
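The return-period calculation described above can be sketched with the Weibull plotting position, one of the common plotting position formulae (the 30-year rainfall series below is invented, not the Trichy record):

```python
import numpy as np

# Hypothetical annual rainfall totals (mm) for a 30-year record.
rainfall = np.array([
    812, 940, 1105, 760, 998, 1230, 870, 1010, 1150, 690,
    930, 1080, 845, 1175, 920, 1005, 780, 1090, 960, 1140,
    700, 985, 1060, 830, 1200, 905, 1025, 755, 1115, 975,
], dtype=float)

n = len(rainfall)
cv = rainfall.std(ddof=1) / rainfall.mean()  # coefficient of variation

# Weibull plotting position: rank the series in descending order, then
# P(exceedance) = m / (n + 1) for rank m, and return period T = 1 / P.
ranked = np.sort(rainfall)[::-1]
m = np.arange(1, n + 1)
T = (n + 1) / m

print(f"mean = {rainfall.mean():.0f} mm, CV = {cv:.2f}")
print(f"largest annual total {ranked[0]:.0f} mm has return period ~{T[0]:.0f} years")
```

Other plotting position formulae (Gringorten, Hazen, etc.) differ only in the constants of the rank formula, which is why the paper compares several of them.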
Evaluating statistical cloud schemes: What can we gain from ground-based remote sensing?
NASA Astrophysics Data System (ADS)
Grützun, V.; Quaas, J.; Morcrette, C. J.; Ament, F.
2013-09-01
Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based remote sensing such as lidar, microwave, and radar to evaluate prognostic distribution moments using the "perfect model approach." This means that we employ a high-resolution weather model as virtual reality and retrieve full three-dimensional atmospheric quantities and virtual ground-based observations. We then use statistics from the virtual observation to validate the modeled 3-D statistics. Since the data are entirely consistent, any discrepancy occurring is due to the method. Focusing on total water mixing ratio, we find that the mean ratio can be evaluated decently but that it strongly depends on the meteorological conditions as to whether the variance and skewness are reliable. Using some simple schematic description of different synoptic conditions, we show how statistics obtained from point or line measurements can be poor at representing the full three-dimensional distribution of water in the atmosphere. We argue that a careful analysis of measurement data and detailed knowledge of the meteorological situation is necessary to judge whether we can use the data for an evaluation of higher moments of the humidity distribution used by a statistical cloud scheme.
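The distribution moments such a statistical cloud scheme carries, and which the evaluation above targets, can be computed from a sample as follows (the subgrid sample here is drawn from a gamma distribution purely for illustration; real schemes assume various PDF families):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical subgrid sample of total-water mixing ratio (g/kg) in one
# grid box: positively skewed, as statistical cloud schemes often assume.
qt = rng.gamma(shape=4.0, scale=0.5, size=5000)

mean = qt.mean()            # first moment
var = qt.var(ddof=1)        # second central moment
skew = stats.skew(qt)       # third standardized moment

print(f"mean = {mean:.2f} g/kg, variance = {var:.2f}, skewness = {skew:.2f}")
```

The paper's point is that a lidar or radar samples only a line or point through the grid box, so estimates of the variance and especially the skewness from such measurements can misrepresent the full three-dimensional distribution.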
Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi
2017-01-01
Background: The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective: To describe the data management process and statistical analysis plan. Methods: The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion: According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration: ClinicalTrials.gov number, NCT01374022. PMID:28977255
Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skalski, John
2003-11-01
The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community.
Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure that tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
Westfall, Jacob; Kenny, David A; Judd, Charles M
2014-10-01
Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
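The plateau result described above can be illustrated with a small Monte Carlo sketch. This is our own illustrative simulation, not the authors' procedure or their Web application: it assumes a stimuli-within-condition design (an even number of stimuli, half per condition), analyzes per-stimulus means with a two-sample test, and uses a normal approximation to the t critical value.

```python
import numpy as np

def stimulus_level_power(n_subj, n_stim, effect=0.8, sd_subj=1.0, sd_stim=1.0,
                         sd_resid=1.0, n_sims=1000, seed=1):
    """Monte Carlo power for a stimuli-within-condition design, analyzed at the
    stimulus level. n_stim is assumed even (half the stimuli per condition)."""
    rng = np.random.default_rng(seed)
    half = n_stim // 2
    cond = np.repeat([-0.5, 0.5], half)                 # condition code per stimulus
    hits = 0
    for _ in range(n_sims):
        stim = rng.normal(0, sd_stim, n_stim)           # stimulus random intercepts
        subj = rng.normal(0, sd_subj, (n_subj, 1))      # participant random intercepts
        y = effect * cond + stim + subj + rng.normal(0, sd_resid, (n_subj, n_stim))
        m = y.mean(axis=0)                              # per-stimulus means
        a, b = m[cond > 0], m[cond < 0]
        se = np.sqrt(a.var(ddof=1) / half + b.var(ddof=1) / half)
        if abs((a.mean() - b.mean()) / se) > 1.96:      # normal approx. to t critical value
            hits += 1
    return hits / n_sims
```

Because the per-stimulus means retain stimulus variance no matter how many participants respond, the estimated power saturates below 1 as `n_subj` grows, which is the qualitative behavior the article describes.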
CISN ShakeAlert Earthquake Early Warning System Monitoring Tools
NASA Astrophysics Data System (ADS)
Henson, I. H.; Allen, R. M.; Neuhauser, D. S.
2015-12-01
CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.
NASA Astrophysics Data System (ADS)
Whittaker, Kara A.; McShane, Dan
2013-02-01
A large storm event in southwest Washington State triggered over 2500 landslides and provided an opportunity to assess two slope stability screening tools. The statistical analysis conducted demonstrated that both screening tools are effective at predicting where landslides were likely to take place (Whittaker and McShane, 2012). Here we reply to two discussions of this article related to the development of the slope stability screening tools and the accuracy and scale of the spatial data used. Neither of the discussions addresses our statistical analysis or results. We provide greater detail on our sampling criteria and also elaborate on the policy and management implications of our findings and how they complement those of a separate investigation of landslides resulting from the same storm. The conclusions made in Whittaker and McShane (2012) stand as originally published unless future analysis indicates otherwise.
Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps
NASA Astrophysics Data System (ADS)
Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.
2017-10-01
The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant, C W; Lenderman, J S; Gansemer, J D
This document is an update to the 'ADIS Algorithm Evaluation Project Plan' specified in the Statement of Work for the US-VISIT Identity Matching Algorithm Evaluation Program, as deliverable II.D.1. The original plan was delivered in August 2010. This document modifies the plan to reflect revised deliverables resulting from delays in obtaining a database refresh, and describes the revised schedule of the program deliverables. The detailed description of the processes used, the statistical analysis processes, and the results of the statistical analysis will be described fully in the program deliverables. The US-VISIT Identity Matching Algorithm Evaluation Program is work performed by Lawrence Livermore National Laboratory (LLNL) under IAA HSHQVT-07-X-00002 P00004 from the Department of Homeland Security (DHS).
Docking studies on NSAID/COX-2 isozyme complexes using Contact Statistics analysis
NASA Astrophysics Data System (ADS)
Ermondi, Giuseppe; Caron, Giulia; Lawrence, Raelene; Longo, Dario
2004-11-01
The selective inhibition of COX-2 isozymes should lead to a new generation of NSAIDs with significantly reduced side effects; e.g. celecoxib (Celebrex®) and rofecoxib (Vioxx®). To obtain inhibitors with higher selectivity it has become essential to gain additional insight into the details of the interactions between COX isozymes and NSAIDs. Although X-ray structures of COX-2 complexed with a small number of ligands are available, experimental data are missing for two well-known selective COX-2 inhibitors (rofecoxib and nimesulide) and docking results reported are controversial. We use a combination of a traditional docking procedure with a new computational tool (Contact Statistics analysis) that identifies the best orientation among a number of solutions to shed some light on this topic.
NASA Technical Reports Server (NTRS)
Hansen, K. L.
1982-01-01
The effect of population and certain environmental characteristics on the availability of land for siting nuclear power plants was assessed. The study area, consisting of the 48 contiguous states, was divided into 5 kilometer (km) square grid cells, yielding a total of 600,000 cells. Through the use of a modern geographic information system, it was possible to provide a detailed analysis of a quite large area. Numerous maps and statistical tables were produced, the detail of which was limited only by available data. Evaluation issues included population density, restricted lands, seismic hardening, site preparation, water availability, and cost factors.
The New Maia Detector System: Methods For High Definition Trace Element Imaging Of Natural Material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan, C. G.; School of Physics, University of Melbourne, Parkville VIC; CODES Centre of Excellence, University of Tasmania, Hobart TAS
2010-04-06
Motivated by the need for megapixel high definition trace element imaging to capture intricate detail in natural material, together with faster acquisition and improved counting statistics in elemental imaging, a large energy-dispersive detector array called Maia has been developed by CSIRO and BNL for SXRF imaging on the XFM beamline at the Australian Synchrotron. A 96 detector prototype demonstrated the capacity of the system for real-time deconvolution of complex spectral data using an embedded implementation of the Dynamic Analysis method and acquiring highly detailed images up to 77 M pixels spanning large areas of complex mineral sample sections.
Statistical Analysis of NAS Parallel Benchmarks and LINPACK Results
NASA Technical Reports Server (NTRS)
Meuer, Hans-Werner; Simon, Horst D.; Strohmeier, Erich; Lasinski, T. A. (Technical Monitor)
1994-01-01
In the last three years extensive performance data have been reported for parallel machines both based on the NAS Parallel Benchmarks and on LINPACK. In this study we have used the reported benchmark results and performed a number of statistical experiments using factor, cluster, and regression analyses. In addition to the performance results of LINPACK and the eight NAS parallel benchmarks, we have also included the peak performance of the machine, and the LINPACK n and n(sub 1/2) values. Some of the results and observations can be summarized as follows: 1) All benchmarks are strongly correlated with peak performance. 2) LINPACK and EP each have a unique signature. 3) The remaining NPB can be grouped into three groups as follows: (CG and IS), (LU and SP), and (MG, FT, and BT). Hence three (or four with EP) benchmarks are sufficient to characterize the overall NPB performance. Our poster presentation will follow a standard poster format and will present the data of our statistical analysis in detail.
NASA Astrophysics Data System (ADS)
Jaranowski, Piotr; Królak, Andrzej
2000-03-01
We develop the analytic and numerical tools for data analysis of the continuous gravitational-wave signals from spinning neutron stars for ground-based laser interferometric detectors. The statistical data analysis method that we investigate is maximum likelihood detection which for the case of Gaussian noise reduces to matched filtering. We study in detail the statistical properties of the optimum functional that needs to be calculated in order to detect the gravitational-wave signal and estimate its parameters. We find it particularly useful to divide the parameter space into elementary cells such that the values of the optimal functional are statistically independent in different cells. We derive formulas for false alarm and detection probabilities both for the optimal and the suboptimal filters. We assess the computational requirements needed to do the signal search. We compare a number of criteria to build sufficiently accurate templates for our data analysis scheme. We verify the validity of our concepts and formulas by means of the Monte Carlo simulations. We present algorithms by which one can estimate the parameters of the continuous signals accurately. We find, confirming earlier work of other authors, that given a 100 Gflops computational power an all-sky search for observation time of 7 days and directed search for observation time of 120 days are possible whereas an all-sky search for 120 days of observation time is computationally prohibitive.
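In the Gaussian-noise case mentioned above, the optimal statistic reduces to matched filtering, i.e. correlating the data with the expected waveform. A minimal sketch with a toy sinusoidal template (our own illustration; the actual continuous-wave signal model, with spin-down and modulation parameters, is far richer):

```python
import numpy as np

def matched_filter_snr(data, template, sigma):
    """Matched-filter statistic for a known template in white Gaussian noise
    of standard deviation sigma; distributed ~N(0,1) when no signal is present."""
    data = np.asarray(data, float)
    template = np.asarray(template, float)
    return np.dot(data, template) / (sigma * np.sqrt(np.dot(template, template)))

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 2000)
template = np.sin(2 * np.pi * 50 * t)       # toy monochromatic "continuous wave"
noise = rng.normal(0, 1.0, t.size)
snr_noise = matched_filter_snr(noise, template, 1.0)                    # no signal
snr_signal = matched_filter_snr(noise + 0.3 * template, template, 1.0)  # weak signal added
```

Thresholding this statistic in each parameter-space cell is what yields the false alarm and detection probabilities discussed in the abstract.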
NASA Astrophysics Data System (ADS)
Hacker, Joshua; Vandenberghe, Francois; Jung, Byoung-Jo; Snyder, Chris
2017-04-01
Effective assimilation of cloud-affected radiance observations from space-borne imagers, with the aim of improving cloud analysis and forecasting, has proven to be difficult. Large observation biases, nonlinear observation operators, and non-Gaussian innovation statistics present many challenges. Ensemble-variational data assimilation (EnVar) systems offer the benefits of flow-dependent background error statistics from an ensemble, and the ability of variational minimization to handle nonlinearity. The specific benefits of ensemble statistics, relative to the static background errors more commonly used in variational systems, have not been quantified for the problem of assimilating cloudy radiances. A simple experiment framework is constructed with a regional NWP model and an operational variational data assimilation system, to provide a basis for understanding the importance of ensemble statistics in cloudy radiance assimilation. Restricting the observations to those corresponding to clouds in the background forecast leads to innovations that are more Gaussian. The number of large innovations is reduced compared to the more general case of all observations, but not eliminated. The Huber norm is investigated to handle the fat tails of the distributions and to allow more observations to be assimilated without the need for strict background checks that eliminate them. Comparing assimilation using only ensemble background error statistics with assimilation using only static background error statistics elucidates the importance of the ensemble statistics. Although the cost functions in both experiments converge to similar values after sufficient outer-loop iterations, the resulting cloud water, ice, and snow content are greater in the ensemble-based analysis. The subsequent forecasts from the ensemble-based analysis also retain more condensed water species, indicating that the local environment is more supportive of clouds.
In this presentation we provide details that explain the apparent benefit from using ensembles for cloudy radiance assimilation in an EnVar context.
Developing and validating a nutrition knowledge questionnaire: key methods and considerations.
Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina
2017-10-01
To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
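Step (vi) above can be sketched with classical item statistics. A minimal illustration (function and variable names are ours, not the paper's): difficulty as the proportion answering correctly, and discrimination as the corrected item-total correlation.

```python
import numpy as np

def item_analysis(responses):
    """Classical item analysis for 0/1-scored items.
    responses: rows = respondents, columns = items.
    Returns (difficulty, discrimination): difficulty is the proportion correct;
    discrimination is each item's correlation with the total score of the
    remaining items (corrected item-total correlation)."""
    X = np.asarray(responses, float)
    difficulty = X.mean(axis=0)
    totals = X.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(X[:, j], totals - X[:, j])[0, 1] for j in range(X.shape[1])
    ])
    return difficulty, discrimination
```

Items with difficulty near 0 or 1, or with low or negative discrimination, are the usual candidates for removal during scale purification.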
Borghesi, Christian; Raynal, Jean-Claude; Bouchaud, Jean-Philippe
2012-01-01
We study in detail the turnout rate statistics for 77 elections in 11 different countries. We show that the empirical results established in a previous paper for French elections appear to hold much more generally. We find in particular that the spatial correlation of turnout rates decays logarithmically with distance in all cases. This result is quantitatively reproduced by a decision model that assumes that each voter makes up his mind as a result of three influence terms: one totally idiosyncratic component, one city-specific term with short-ranged fluctuations in space, and one long-ranged correlated field which propagates diffusively in space. A detailed analysis reveals several interesting features: for example, different countries have different degrees of local heterogeneity and seem to be characterized by a different propensity for individuals to conform to the cultural norm. We furthermore find clear signs of herding (i.e., strongly correlated decisions at the individual level) in some countries, but not in others. PMID:22615762
Dynamics of land change in India: a fine-scale spatial analysis
NASA Astrophysics Data System (ADS)
Meiyappan, P.; Roy, P. S.; Sharma, Y.; Jain, A. K.; Ramachandran, R.; Joshi, P. K.
2015-12-01
Land is scarce in India: India occupies 2.4% of the world's land area but supports over one sixth of the world's human and livestock population. This high population-to-land ratio, combined with socioeconomic development and increasing consumption, has placed tremendous pressure on India's land resources for food, feed, and fuel. In this talk, we present contemporary (1985 to 2005) spatial estimates of land change in India using national-level analysis of Landsat imagery. Further, we investigate the causes of the spatial patterns of change using two complementary lines of evidence. First, we use statistical models estimated at the macro scale to understand the spatial relationships between land change patterns and their concomitant drivers. This analysis, using our newly compiled socioeconomic database at the village level (~630,000 units), is 100x higher in spatial resolution than existing datasets and covers over 200 variables. The detailed socioeconomic data enabled fine-scale spatial analysis with the Landsat data. Second, we synthesized information from over 130 survey-based case studies on land use drivers in India to complement our macro-scale analysis. The case studies are especially useful for identifying unobserved variables (e.g., farmers' attitudes towards risk). Ours is the most detailed analysis of contemporary land change in India, both in terms of national extent and in the use of detailed spatial information on land change, socioeconomic factors, and the synthesis of case studies.
An assessment of training needs for the lumber manufacturing industry in the eastern United States
Joseph Denig; Scott Page; Yuhua Su; Karen Martinson
2008-01-01
A training needs assessment of the primary forest products industry was conducted for 33 eastern states. This publication presents in detail the statistical analysis of the study. Of the 2,570 lumber manufacturing companies, consisting of firms with more than six employees for the U.S. Department of Labor Standard Industrial Classification Code 2421, the response rate...
ERIC Educational Resources Information Center
Mantri, Archana
2014-01-01
The intent of the study presented in this paper is to show that the model of problem-based learning (PBL) can be made scalable by designing curriculum around a set of open-ended problems (OEPs). The detailed statistical analysis of the data collected to measure the effects of traditional and PBL instructions for three courses in Electronics and…
Nateglinide versus repaglinide for type 2 diabetes mellitus in China.
Li, Chanjuan; Xia, Jielai; Zhang, Gaokui; Wang, Suzhen; Wang, Ling
2009-12-01
The purpose of this study is to evaluate the efficacy and safety of nateglinide tablets in comparison with repaglinide tablets as control in treating type 2 diabetes mellitus in China. Pooled analysis with an analysis of covariance (ANCOVA) method was applied to assess efficacy and safety based on original data collected from four independent randomized clinical trials with similar research protocols; meta-analysis was applied based on the outcomes of the four studies. The results by meta-analysis were comparable to those obtained by pooled analysis. The means of HbA(1c) and fasting blood glucose in both the nateglinide and repaglinide groups were reduced significantly after 12 weeks, with no statistically significant difference in reduction between the two groups. The adverse reaction rates were 9.89 and 6.51% in the nateglinide and repaglinide groups, respectively, with the rate difference showing no statistical significance; the odds ratio of the adverse reaction rate (95% confidence interval) was 1.59 (0.99, 2.55). Both nateglinide and repaglinide have similarly significant effects on reducing HbA(1c) and FBG. The adverse reaction rate in the nateglinide group is higher than that in the repaglinide group, but the difference is not statistically significant, as revealed in the four clinical trials detailed below.
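An odds ratio and confidence interval of the kind quoted above can be computed for any 2x2 table with the standard Woolf (log-normal) method. A minimal sketch with hypothetical counts (the trials' actual group sizes are not given here):

```python
import math

def odds_ratio_ci(events1, nonevents1, events2, nonevents2, z=1.96):
    """Odds ratio of group 1 vs group 2 with a Woolf 95% confidence interval
    (normal approximation on the log-odds scale)."""
    or_ = (events1 * nonevents2) / (nonevents1 * events2)
    se_log = math.sqrt(1 / events1 + 1 / nonevents1 + 1 / events2 + 1 / nonevents2)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper
```

An interval that includes 1, like the (0.99, 2.55) reported above, indicates that the difference in adverse reaction rates is not statistically significant at the 5% level.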
Constrained Stochastic Extended Redundancy Analysis.
DeSarbo, Wayne S; Hwang, Heungsun; Stadler Blank, Ashley; Kappe, Eelco
2015-06-01
We devise a new statistical methodology called constrained stochastic extended redundancy analysis (CSERA) to examine the comparative impact of various conceptual factors, or drivers, as well as the specific predictor variables that contribute to each driver on designated dependent variable(s). The technical details of the proposed methodology, the maximum likelihood estimation algorithm, and model selection heuristics are discussed. A sports marketing consumer psychology application is provided in a Major League Baseball (MLB) context where the effects of six conceptual drivers of game attendance and their defining predictor variables are estimated. Results compare favorably to those obtained using traditional extended redundancy analysis (ERA).
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
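For intuition, the single-marker statistic at the heart of model-based linkage analysis is the LOD score. A stripped-down sketch for fully informative meioses (an illustration of the statistic itself, not of S.A.G.E.'s LODLINK implementation):

```python
import math

def lod_score(recombinants, nonrecombinants, theta):
    """Two-point LOD score: log10 likelihood ratio of recombination fraction
    theta against the no-linkage value theta = 0.5."""
    if not 0 < theta < 0.5:
        raise ValueError("theta must lie in (0, 0.5)")
    n = recombinants + nonrecombinants
    loglik = (recombinants * math.log10(theta)
              + nonrecombinants * math.log10(1 - theta))
    return loglik - n * math.log10(0.5)
```

By convention a LOD of 3 or more is taken as significant evidence of linkage; for example, 2 recombinants out of 20 meioses evaluated at theta = 0.1 gives a LOD just above 3.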
Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements
NASA Astrophysics Data System (ADS)
Papa, A. R.; Akel, A. F.
2009-05-01
Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduced a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to other traditional methods (Fourier, for example), while at the same time allowing an almost continuous tracking of both the amplitude and frequency of signals as time goes by. This advantage opens some possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible directions for future work are advanced.
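The localization property referred to above is easy to see with the simplest wavelet, the Haar transform: each detail coefficient records a local difference, so a transient shows up at a definite time, unlike in a global Fourier spectrum. A minimal one-level sketch (our own illustration, not the analysis pipeline used for the Vassouras records):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail): scaled local sums (low frequency)
    and local differences (high frequency). An odd trailing sample is dropped."""
    x = np.asarray(x, float)
    if x.size % 2:
        x = x[:-1]
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d
```

The transform is orthogonal, so signal energy is preserved between the two bands, and a spike at sample k appears in detail coefficient k // 2.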
WASP (Write a Scientific Paper) using Excel - 2: Pivot tables.
Grech, Victor
2018-02-01
Data analysis at the descriptive stage and the eventual presentation of results requires the tabulation and summarisation of data. This exercise should always precede inferential statistics. Pivot tables and pivot charts are one of Excel's most powerful and underutilised features, with tabulation functions that immensely facilitate descriptive statistics. Pivot tables permit users to dynamically summarise and cross-tabulate data, create tables in several dimensions, offer a range of summary statistics and can be modified interactively with instant outputs. Large and detailed datasets are thereby easily manipulated making pivot tables arguably the best way to explore, summarise and present data from many different angles. This second paper in the WASP series in Early Human Development provides pointers for pivot table manipulation in Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
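The cross-tabulation that a pivot table performs can be stated precisely. A minimal Python analogue (illustrative only; the paper itself concerns Excel's built-in feature): group records by a row field and a column field, then aggregate a value field.

```python
from collections import defaultdict

def pivot(records, index, column, value, aggfunc=sum):
    """Cross-tabulate a list of dict records: one row per `index` level,
    one column per `column` level, aggregating `value` with `aggfunc`."""
    cells = defaultdict(list)
    for rec in records:
        cells[(rec[index], rec[column])].append(rec[value])
    table = defaultdict(dict)
    for (i, c), vals in cells.items():
        table[i][c] = aggfunc(vals)
    return {i: cols for i, cols in table.items()}
```

Swapping `aggfunc` (e.g. `len` for counts, or a mean) mirrors changing the summary statistic in a pivot table's value field.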
Statistical Analysis of Large-Scale Structure of Universe
NASA Astrophysics Data System (ADS)
Tugay, A. V.
While galaxy cluster catalogs were compiled many decades ago, other structural elements of the cosmic web have been detected with confidence only in the most recent works. For example, extragalactic filaments have been described by velocity fields and the SDSS galaxy distribution in recent years. The large-scale structure of the Universe could also be mapped in the future using ATHENA observations in X-rays and SKA in the radio band. Until detailed observations become available for most of the volume of the Universe, some integral statistical parameters can be used for its description. Methods such as the galaxy correlation function, power spectrum, statistical moments, and peak statistics are commonly used to this end. The parameters of the power spectrum and other statistics are important for constraining models of dark matter, dark energy, inflation, and brane cosmology. In the present work we describe the growth of large-scale density fluctuations in the one- and three-dimensional cases with Fourier harmonics of hydrodynamical parameters. As a result we obtain a power-law relation for the matter power spectrum.
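The power spectrum mentioned above is the squared magnitude of the Fourier harmonics of the overdensity field. A minimal one-dimensional sketch (the normalization convention is ours, for illustration):

```python
import numpy as np

def power_spectrum(delta):
    """Power spectrum P(k) = |delta_k|^2 / N of a 1-D overdensity field,
    indexed by integer wavenumber (FFT bin of numpy's rfft)."""
    delta = np.asarray(delta, float)
    delta_k = np.fft.rfft(delta)
    return np.abs(delta_k) ** 2 / delta.size
```

A single cosine mode of wavenumber k0 puts all its power in bin k0, with P(k0) = N/4 under this normalization.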
A generalization of random matrix theory and its application to statistical physics.
Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H
2017-02-01
To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first determine, analytically and numerically, how auto-correlations affect the eigenvalue distribution of the correlation matrix. We then introduce ARRMT with a detailed procedure for implementing the method. Finally, we illustrate the method using two examples: inflation rates and air pressure data for 95 US cities.
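A minimal sketch of the baseline that random matrix theory provides before auto-correlations are modelled as ARRMT does: for independent white-noise series, the eigenvalues of the empirical correlation matrix crowd into the Marchenko-Pastur band, so eigenvalues outside it signal genuine cross-correlations. This is an illustration, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 2000                      # 50 series, 2000 observations each
X = rng.standard_normal((N, T))      # independent white-noise series

C = np.corrcoef(X)                   # N x N empirical correlation matrix
eigvals = np.linalg.eigvalsh(C)

# For uncorrelated data the eigenvalues concentrate in the Marchenko-
# Pastur band [(1 - sqrt(N/T))**2, (1 + sqrt(N/T))**2] as T grows.
q = N / T
lo_edge, hi_edge = (1 - q ** 0.5) ** 2, (1 + q ** 0.5) ** 2
print(eigvals.min(), eigvals.max(), (lo_edge, hi_edge))
```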
Simulation skill of APCC set of global climate models for Asian summer monsoon rainfall variability
NASA Astrophysics Data System (ADS)
Singh, U. K.; Singh, G. P.; Singh, Vikas
2015-04-01
The performance of 11 Asia-Pacific Economic Cooperation Climate Center (APCC) global climate models (both coupled and uncoupled) in simulating seasonal summer (June-August) monsoon rainfall variability over Asia (especially over India and East Asia) has been evaluated in detail using hindcast data (3 months in advance) generated by APCC, which provides regional climate information products based on multi-model ensemble dynamical seasonal prediction systems. The skill of each global climate model over Asia was tested separately in detail for a period of 21 years (1983-2003), and the simulated Asian summer monsoon rainfall (ASMR) was verified using various statistical measures for the Indian and East Asian land masses separately. The analysis found a large variation in spatial ASMR simulated by the uncoupled models compared to the coupled models (like the Predictive Ocean Atmosphere Model for Australia, the National Centers for Environmental Prediction and the Japan Meteorological Agency). The simulated ASMR in the coupled models was closer to the Climate Prediction Center Merged Analysis of Precipitation (CMAP) than in the uncoupled models, although the amount of ASMR was underestimated in both. The analysis also found a high spread in simulated ASMR among the ensemble members (suggesting that model performance is highly dependent on initial conditions). The correlation analysis between sea surface temperature (SST) and ASMR shows that the coupled models are more strongly associated with ASMR than the uncoupled models (suggesting that air-sea interaction is well captured in the coupled models). The analysis of rainfall using various statistical measures suggests that the multi-model ensemble (MME) performed better than the individual models, and that treating the Indian and East Asian land masses separately is more useful than considering Asian monsoon rainfall as a whole.
The results of various statistical measures, such as the skill of the multi-model ensemble, the large spread among the ensemble members of individual models, the strong teleconnection (correlation analysis) with SST, the coefficient of variation, the inter-annual variability, and the analysis of Taylor diagrams, suggest that improving the coupled models, rather than the uncoupled models, is needed for the development of a better dynamical seasonal forecast system.
Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images
NASA Technical Reports Server (NTRS)
Fischer, Bernd
2004-01-01
Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. AutoBayes's schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible.
This is a major advantage over other statistical data analysis systems, which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen
2017-01-01
Introduction A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. 
The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found between UTM and all eleven autism-related assessments, with cross-validation R2 values ranging from 0.12 to 0.48. PMID:28068407
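Leave-one-out cross-validation of a classifier, as used in the study, can be sketched with numpy alone; here a simple nearest-centroid classifier on simulated two-group data stands in for the paper's multivariate analysis of urinary metal profiles.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two simulated groups with shifted means in a 10-feature space.
X = np.vstack([rng.normal(0.0, 1.0, (40, 10)),
               rng.normal(0.8, 1.0, (40, 10))])
y = np.array([0] * 40 + [1] * 40)

correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i            # hold out sample i
    centroids = [X[mask & (y == c)].mean(axis=0) for c in (0, 1)]
    dists = [np.linalg.norm(X[i] - c) for c in centroids]
    correct += int(np.argmin(dists) == y[i])  # classify the held-out sample

accuracy = correct / len(y)
print(accuracy)
```

Because each sample is classified by a model that never saw it, the accuracy estimate is statistically independent of the training data, which is the point the authors make about their cross-validated results.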
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
This volume contains geology of the Durango D detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1991-01-01
The behavior of stochastic processes whose power spectra exhibit power-law behavior was studied. The details of the analysis and the conclusions that were reached are presented. This analysis was extended to compare the detection capabilities of different measurement techniques (e.g., gravimetry and GPS for the vertical, and seismometers and GPS for the horizontal), both in general and for the specific case of the deformations produced by a dislocation in a half-space (which applies to seismic or preseismic sources). The time-domain behavior of power-law noises is also investigated.
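A common way to simulate such power-law noises is to shape the Fourier spectrum of white noise; the sketch below is a generic illustration, not the paper's procedure.

```python
import numpy as np

def power_law_noise(n, alpha, rng):
    """Return n samples whose power spectrum falls off as f**(-alpha)."""
    freqs = np.fft.rfftfreq(n, d=1.0)
    spectrum = np.fft.rfft(rng.standard_normal(n))
    scale = np.ones_like(freqs)
    scale[1:] = freqs[1:] ** (-alpha / 2.0)   # amplitude ~ f**(-alpha/2)
    return np.fft.irfft(spectrum * scale, n)

rng = np.random.default_rng(2)
x = power_law_noise(4096, alpha=2.0, rng=rng)  # alpha=2: random-walk-like
print(x.std())
```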
Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.
Counsell, Alyssa; Harlow, Lisa L
2017-05-01
With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and less than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of and areas needing improvement for reporting quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
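The effect-size-with-confidence-interval reporting that the survey found to be rare can be illustrated briefly; the sketch below computes Cohen's d with a bootstrap 95% CI on simulated groups (all numbers are illustrative, not from the surveyed articles).

```python
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference with a pooled SD."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                     / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled

rng = np.random.default_rng(3)
group1 = rng.normal(0.5, 1.0, 60)   # simulated treatment group
group2 = rng.normal(0.0, 1.0, 60)   # simulated control group

d = cohens_d(group1, group2)
# Percentile bootstrap: resample each group with replacement.
boot = np.array([cohens_d(rng.choice(group1, 60), rng.choice(group2, 60))
                 for _ in range(2000)])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
print(f"d = {d:.2f}, 95% CI [{ci_lo:.2f}, {ci_hi:.2f}]")
```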
Spatio-temporal analysis of annual rainfall in Crete, Greece
NASA Astrophysics Data System (ADS)
Varouchakis, Emmanouil A.; Corzo, Gerald A.; Karatzas, George P.; Kotsopoulou, Anastasia
2018-03-01
Analysis of rainfall data from the island of Crete, Greece was performed to identify key hydrological years and return periods, as well as to analyze the inter-annual behavior of rainfall variability during the period 1981-2014. The rainfall spatial distribution was also examined in detail to identify vulnerable areas of the island. Statistical tools and spectral analysis were applied to investigate and interpret the temporal course of the available rainfall data set. In addition, spatial analysis techniques were applied and compared to determine the rainfall spatial distribution on the island of Crete. The analysis showed that, in contrast to Regional Climate Model estimations, rainfall rates have not decreased, while return periods vary depending on seasonality and geographic location. A small but statistically significant increasing trend was detected in the inter-annual rainfall variations, as well as a significant rainfall cycle of almost every 8 years. In addition, a statistically significant correlation of the island's rainfall variability with the North Atlantic Oscillation was identified for the examined period. Furthermore, the regression kriging method, combining surface elevation as secondary information, improved the estimation of the annual rainfall spatial variability on the island of Crete by 70% compared to ordinary kriging. The rainfall spatial and temporal trends on the island of Crete have variable characteristics that depend on the geographical area and on the hydrological period.
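Testing an inter-annual trend of the sort reported above can be sketched with an ordinary least-squares regression of annual totals on year; the values below are simulated with a built-in upward trend, not the Crete series.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1981, 2015)
# Simulated annual rainfall: 600 mm baseline plus a 4 mm/yr trend.
rain = 600 + 4.0 * (years - years[0]) + rng.normal(0, 40, len(years))

res = stats.linregress(years, rain)
print(f"slope = {res.slope:.2f} mm/yr, p = {res.pvalue:.4f}")
```

A significant p-value here only tests a linear trend; the cyclical behavior the paper reports would need spectral analysis on top of this.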
Neti, Prasad V.S.V.; Howell, Roger W.
2008-01-01
Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log normal distribution function (J Nucl Med 47, 6 (2006) 1049-1058) with the aid of an autoradiographic approach. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these data. Methods The measured distributions of alpha particle tracks per cell were subjected to statistical tests with Poisson (P), log normal (LN), and Poisson – log normal (P – LN) models. Results The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL 210Po-citrate. When cells were exposed to 67 kBq/mL, the P – LN distribution function gave a better fit; however, the underlying activity distribution remained log normal. Conclusions The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:16741316
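One quick diagnostic behind such model comparisons is the dispersion (variance-to-mean) index: a pure Poisson process gives a value near 1, while a Poisson process driven by log-normally distributed activity is overdispersed. The sketch below uses simulated counts, not the autoradiographic data.

```python
import numpy as np

rng = np.random.default_rng(4)
# Per-cell activity drawn from a log-normal, observed through a Poisson
# counting process, mimicking the structure of the P - LN model.
activity = rng.lognormal(mean=1.5, sigma=0.7, size=2000)
counts_pln = rng.poisson(activity)                   # Poisson - log normal
counts_p = rng.poisson(activity.mean(), size=2000)   # pure Poisson control

# Dispersion index: ~1 for Poisson, >1 when the underlying rates vary.
d_pln = counts_pln.var() / counts_pln.mean()
d_p = counts_p.var() / counts_p.mean()
print(d_pln, d_p)
```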
A statistical study of EMIC waves observed by Cluster: 1. Wave properties
NASA Astrophysics Data System (ADS)
Allen, R. C.; Zhang, J.-C.; Kistler, L. M.; Spence, H. E.; Lin, R.-L.; Klecker, B.; Dunlop, M. W.; André, M.; Jordanova, V. K.
2015-07-01
Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters are required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In this study, we present a statistical analysis of EMIC wave properties using 10 years (2001-2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. The statistical analysis is presented in two papers. This paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.
No-reference multiscale blur detection tool for content based image retrieval
NASA Astrophysics Data System (ADS)
Ezekiel, Soundararajan; Stocker, Russell; Harrity, Kyle; Alford, Mark; Ferris, David; Blasch, Erik; Gorniak, Mark
2014-06-01
In recent years, digital cameras have been widely used for image capture. These devices are built into cell phones, laptops, tablets, webcams, etc. Image quality is an important component of digital image analysis. To assess image quality for these mobile products, a standard image is normally required as a reference image, in which case Root Mean Square Error and Peak Signal to Noise Ratio can be used to measure the quality of the images. However, these methods are not applicable if there is no reference image. In our approach, a discrete wavelet transformation is applied to the blurred image, which is decomposed into the approximate image and three detail sub-images, namely the horizontal, vertical, and diagonal images. We then focus on noise-measuring the detail images and blur-measuring the approximate image to assess the image quality, computing the noise mean and noise ratio from the detail images, and the blur mean and blur ratio from the approximate image. The Multi-scale Blur Detection (MBD) metric provides an assessment of both the noise and blur content. These values are weighted based on a linear regression against full-reference y values. From these statistics, we can make comparisons with the statistics of typical useful images to assess image quality without needing a reference image. We then test the validity of the obtained weights by R2 analysis, as well as by using them to estimate the quality of an image with a known quality measure. The results show that our method provides acceptable results for images containing low to mid noise levels and blur content.
Prototyping with Data Dictionaries for Requirements Analysis.
1985-03-01
statistical packages and software for screen layout. These items work at a higher level than another category of prototyping tool, program generators... Program generators are software packages which, when given specifications, produce source listings, usually in a high order language such as COBOL...with users and this will not happen if he must stop to develop a detailed program. [Ref. 241] Hardware as well as software should be considered in
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 4 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties for clinical physician assistants. (BT)
Intercomparison between ozone profiles measured above Spitsbergen by lidar and sonde techniques
NASA Technical Reports Server (NTRS)
Fabian, Rolf; Vondergathen, Peter; Ehlers, J.; Krueger, Bernd C.; Neuber, Roland; Beyerle, Georg
1994-01-01
This paper compares coincident ozone profile measurements by electrochemical sondes and lidar performed at Ny-Alesund/Spitsbergen. A detailed height-dependent statistical analysis of the differences between these complementary methods was performed for the overlapping altitude region (13-35 km). The data set comprises ozone profile measurements conducted between Jan. 1989 and Jan. 1991. Differences of up to 25 percent were found above 30 km altitude.
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 5 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in cardio-pulmonary, EEG, EKG, and inhalation therapy. (BT)
Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan
2017-12-27
Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, for example, by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
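One element of the Same Analysis Approach, running the identical analysis on simulated null data to confirm chance-level results, can be sketched as follows; a nearest-centroid decoder with a simple split-half validation stands in for the paper's decoding pipeline.

```python
import numpy as np

rng = np.random.default_rng(7)
accs = []
for _ in range(200):                      # many simulated null datasets
    X = rng.standard_normal((40, 20))     # data with no class signal
    y = np.array([0, 1] * 20)
    # Split-half validation: train on the first half, test on the second.
    Xtr, ytr, Xte, yte = X[:20], y[:20], X[20:], y[20:]
    cents = [Xtr[ytr == c].mean(axis=0) for c in (0, 1)]
    pred = np.array([np.argmin([np.linalg.norm(v - c) for c in cents])
                     for v in Xte])
    accs.append((pred == yte).mean())

print(np.mean(accs))   # should sit near 0.5 for a sound pipeline
```

A mean accuracy systematically above or below 0.5 on null data would flag exactly the kind of design/analysis mismatch the paper warns about.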
Climate Considerations Of The Electricity Supply Systems In Industries
NASA Astrophysics Data System (ADS)
Asset, Khabdullin; Zauresh, Khabdullina
2014-12-01
The study is focused on analysis of climate considerations of electricity supply systems in a pellet industry. The developed analysis model consists of two modules: statistical data of active power losses evaluation module and climate aspects evaluation module. The statistical data module is presented as a universal mathematical model of electrical systems and components of industrial load. It forms a basis for detailed accounting of power loss from the voltage levels. On the basis of the universal model, a set of programs is designed to perform the calculation and experimental research. It helps to obtain the statistical characteristics of the power losses and loads of the electricity supply systems and to define the nature of changes in these characteristics. Within the module, several methods and algorithms for calculating parameters of equivalent circuits of low- and high-voltage ADC and SD with a massive smooth rotor with laminated poles are developed. The climate aspects module includes an analysis of the experimental data of power supply system in pellet production. It allows identification of GHG emission reduction parameters: operation hours, type of electrical motors, values of load factor and deviation of standard value of voltage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
This volume contains geology of the Durango A detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide the following: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
The geology of the Durango B detail area, the radioactive mineral occurrences in Colorado and the geophysical data interpretation are included in this report. Seven appendices contain: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, and test line data.
Usman, Mohammad N.; Umar, Muhammad D.
2018-01-01
Background: Recent studies have revealed that pharmacists have interest in conducting research. However, lack of confidence is a major barrier. Objective: This study evaluated pharmacists’ self-perceived competence and confidence to plan and conduct health-related research. Method: This cross-sectional study was conducted during the 89th Annual National Conference of the Pharmaceutical Society of Nigeria in November 2016. An adapted questionnaire was validated and administered to 200 pharmacist delegates during the conference. Result: Overall, 127 questionnaires were included in the analysis. At least 80% of the pharmacists had previous health-related research experience. Pharmacists’ competence and confidence scores were lowest for research skills such as using software for statistical analysis, choosing and applying appropriate inferential statistical tests and methods, and outlining a detailed statistical plan to be used in data analysis. The highest competence and confidence scores were observed for conception of a research idea, literature search, and critical appraisal of literature. Pharmacists with previous research experience had higher competence and confidence scores than those with no previous research experience (p<0.05). The only predictor of moderate-to-extreme self-competence and confidence was having at least one journal article publication during the last 5 years. Conclusion: Nigerian pharmacists indicated interest in participating in health-related research. However, self-competence and confidence to plan and conduct research were low. This was particularly so for skills related to statistical analysis. Training programs and the building of a Pharmacy Practice Research Network are recommended to enhance pharmacists’ research capacity. PMID:29619141
Statistical modelling of subdiffusive dynamics in the cytoplasm of living cells: A FARIMA approach
NASA Astrophysics Data System (ADS)
Burnecki, K.; Muszkieta, M.; Sikora, G.; Weron, A.
2012-04-01
Golding and Cox (Phys. Rev. Lett., 96 (2006) 098102) tracked the motion of individual fluorescently labelled mRNA molecules inside live E. coli cells. They found that in a set of 23 trajectories from 3 different experiments, the automatically recognized motion is subdiffusive, and they published an intriguing microscopy video. Here, we extract the corresponding time series from this video by an image segmentation method and present its detailed statistical analysis. We find that this trajectory was not included in the data set already studied and has different statistical properties. It is best fitted by a fractional autoregressive integrated moving average (FARIMA) process with normal-inverse Gaussian (NIG) noise and negative memory. In contrast to earlier studies, this shows that fractional Brownian motion is not the best model for the dynamics documented in this video.
Geometric and dynamic perspectives on phase-coherent and noncoherent chaos.
Zou, Yong; Donner, Reik V; Kurths, Jürgen
2012-03-01
Statistically distinguishing between phase-coherent and noncoherent chaotic dynamics from time series is a contemporary problem in nonlinear sciences. In this work, we propose different measures based on recurrence properties of recorded trajectories, which characterize the underlying systems from both geometric and dynamic viewpoints. The potentials of the individual measures for discriminating phase-coherent and noncoherent chaotic oscillations are discussed. A detailed numerical analysis is performed for the chaotic Rössler system, which displays both types of chaos as one control parameter is varied, and the Mackey-Glass system as an example of a time-delay system with noncoherent chaos. Our results demonstrate that especially geometric measures from recurrence network analysis are well suited for tracing transitions between spiral- and screw-type chaos, a common route from phase-coherent to noncoherent chaos also found in other nonlinear oscillators. A detailed explanation of the observed behavior in terms of attractor geometry is given.
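The recurrence matrix underlying such measures can be sketched in a few lines; the signal here is a plain sine wave rather than the Rössler or Mackey-Glass systems, and the threshold eps is an arbitrary illustrative choice.

```python
import numpy as np

t = np.linspace(0, 8 * np.pi, 400)
x = np.sin(t)                         # simple periodic test signal

# Recurrence matrix: R[i, j] = 1 where states i and j are within eps.
eps = 0.1
dist = np.abs(x[:, None] - x[None, :])
R = (dist < eps).astype(int)

recurrence_rate = R.mean()            # fraction of recurrent pairs
print(recurrence_rate)
```

Recurrence network measures of the kind the paper uses are then graph statistics computed on R interpreted as an adjacency matrix (with the diagonal removed); for real chaotic systems one would also embed the scalar series into a higher-dimensional state space first.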
Guided SAR image despeckling with probabilistic non local weights
NASA Astrophysics Data System (ADS)
Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny
2017-12-01
SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction difficult. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with dynamically derived values based on the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, achieve significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.
Anderson, Craig S; Woodward, Mark; Arima, Hisatomi; Chen, Xiaoying; Lindley, Richard I; Wang, Xia; Chalmers, John
2015-12-01
The ENhanced Control of Hypertension And Thrombolysis strokE stuDy trial is a 2 × 2 quasi-factorial active-comparison, prospective, randomized, open, blinded endpoint clinical trial that is evaluating, in thrombolysis-eligible acute ischemic stroke patients, whether: (1) low-dose (0.6 mg/kg body weight) intravenous alteplase has noninferior efficacy and lower risk of symptomatic intracerebral hemorrhage compared with standard-dose (0.9 mg/kg body weight) intravenous alteplase; and (2) early intensive blood pressure lowering (systolic target 130-140 mmHg) has superior efficacy and lower risk of any intracerebral hemorrhage compared with guideline-recommended blood pressure control (systolic target <180 mmHg). The aim of this paper is to outline in detail the predetermined statistical analysis plan for the 'alteplase dose arm' of the study. All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned, with appropriate comparisons made between randomized groups. For the trial outcomes, the most appropriate statistical comparisons to be made between groups are planned and described. A statistical analysis plan was developed for the results of the alteplase dose arm of the study that is transparent, available to the public, verifiable, and predetermined before completion of data collection. We have developed a predetermined statistical analysis plan for the ENhanced Control of Hypertension And Thrombolysis strokE stuDy alteplase dose arm, which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. © 2015 The Authors. International Journal of Stroke published by John Wiley & Sons Ltd on behalf of World Stroke Organization.
Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?
Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R
2013-01-01
The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.
NASA Technical Reports Server (NTRS)
Cull, R. C.; Eltimsahy, A. H.
1983-01-01
The present investigation is concerned with the formulation of energy management strategies for stand-alone photovoltaic (PV) systems, taking into account a basic control algorithm for a possible predictive (and adaptive) controller. The control system controls the flow of energy in the system according to the amount of energy available, and predicts the appropriate control set-points based on the energy (insolation) available by using an appropriate system model. Aspects of adaptation to the conditions of the system are also considered. Attention is given to a statistical analysis technique, the analysis inputs, the analysis procedure, and details regarding the basic control algorithm.
The GEOS Ozone Data Assimilation System: Specification of Error Statistics
NASA Technical Reports Server (NTRS)
Stajner, Ivanka; Riishojgaard, Lars Peter; Rood, Richard B.
2000-01-01
A global three-dimensional ozone data assimilation system has been developed at the Data Assimilation Office of the NASA/Goddard Space Flight Center. The Total Ozone Mapping Spectrometer (TOMS) total ozone and the Solar Backscatter Ultraviolet (SBUV or SBUV/2) partial ozone profile observations are assimilated. The assimilation, into an off-line ozone transport model, is done using the global Physical-space Statistical Analysis Scheme (PSAS). This system became operational in December 1999. A detailed description of the statistical analysis scheme, and in particular of the forecast and observation error covariance models, is given. A new global anisotropic horizontal forecast error correlation model accounts for a varying distribution of observations with latitude. Correlations are largest in the zonal direction in the tropics, where data are sparse. The forecast error variance model is proportional to the ozone field. The forecast error covariance parameters were determined by maximum likelihood estimation. The error covariance models are validated using chi-squared statistics. The analyzed ozone fields in winter 1992 are validated against independent observations from ozonesondes and the Halogen Occultation Experiment (HALOE). There is better than 10% agreement between mean HALOE and analysis fields between 70 and 0.2 hPa. The global root-mean-square (RMS) difference between TOMS observed and forecast values is less than 4%. The global RMS difference between SBUV observed and analyzed ozone between 50 and 3 hPa is less than 15%.
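The chi-squared validation of error covariances mentioned above can be illustrated with a toy scalar version of a statistical-analysis update. This is only a sketch: the truth value, variances, and sample count below are invented for illustration, and the real PSAS works with full covariance models on global fields rather than scalars.

```python
import random

def analysis_update(x_f, y, var_f, var_o):
    """One scalar statistical-analysis (optimal interpolation) update:
    weight the observation by forecast/observation error variances."""
    k = var_f / (var_f + var_o)  # Kalman-type gain
    return x_f + k * (y - x_f)

def chi2_diagnostic(innovations, var_f, var_o):
    """Mean squared innovation normalized by its expected variance;
    values near 1 indicate consistent error covariance models."""
    s = var_f + var_o
    return sum(d * d for d in innovations) / (len(innovations) * s)

random.seed(1)
truth, var_f, var_o = 300.0, 4.0, 1.0  # hypothetical Dobson-unit scale
innovations = []
for _ in range(20000):
    x_f = truth + random.gauss(0.0, var_f ** 0.5)  # forecast
    y = truth + random.gauss(0.0, var_o ** 0.5)    # observation
    innovations.append(y - x_f)
    x_a = analysis_update(x_f, y, var_f, var_o)

print(round(chi2_diagnostic(innovations, var_f, var_o), 2))
```

If the assumed variances were wrong, the diagnostic would drift away from 1, which is the signal such validation looks for.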
Analysis of the Einstein sample of early-type galaxies
NASA Technical Reports Server (NTRS)
Eskridge, Paul B.; Fabbiano, Giuseppina
1993-01-01
The EINSTEIN galaxy catalog contains X-ray data for 148 early-type (E and S0) galaxies. A detailed analysis of the global properties of this sample is presented. By comparing the X-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that govern the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between X-ray luminosity (L(sub X)) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L(sub X) and L(sub B) (the probability that the null hypothesis is upheld is P less than 10^-4 from a variety of rank correlation tests). Regressions with several algorithms yield consistent results.
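A minimal sketch of one such rank-correlation test, Spearman's ρ, computed as the Pearson correlation of ranks. The luminosity values below are hypothetical, and the study's actual survival-analysis methods additionally handle X-ray upper limits, which this sketch does not.

```python
def ranks(xs):
    """Ordinal ranks (1..n); ties are not averaged in this simple sketch."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical log blue and X-ray luminosities, for illustration only
log_LB = [9.8, 10.1, 10.4, 10.7, 11.0, 11.3]
log_LX = [39.2, 39.9, 40.1, 40.8, 41.5, 41.9]
print(round(spearman_rho(log_LB, log_LX), 3))  # monotone data -> 1.0
```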
A comparison of healing rates on two pressure-relieving systems.
Russell, L; Reynolds, T; Carr, J; Evans, A; Holmes, M
The authors have previously reported the preliminary results of a randomized controlled trial comparing the relative efficacy of two pressure-relieving systems: Huntleigh Nimbus 3 and Aura Cushion, and Pegasus Cairwave Therapy System and ProActive Seating Cushion (Russell et al, 2000). Although both the mattresses and cushions were effective treatments for pressure ulcers, the Huntleigh equipment was demonstrated to be statistically more effective for heel ulcers, but no differences were demonstrated for sacral ulcers. This article gives a more detailed analysis of the 141 patients assessed using computerized image analysis of the digital images of sacral ulcers captured during the trial and specifically discusses the healing rates and other patient characteristics. Ninety-eight per cent of ulcers examined were deemed superficial (Torrance grade 2a, 2b, 3). Precision of image analysis, assessed by within- and between-batch coefficients of variation, was excellent: calibration CV 0.93-1.84%; area CV 4.61-5.72%. The healing rates on the two mattresses were not shown to be statistically different from each other.
Color image encryption using random transforms, phase retrieval, chaotic maps, and diffusion
NASA Astrophysics Data System (ADS)
Annaby, M. H.; Rushdi, M. A.; Nehary, E. A.
2018-04-01
The recent tremendous proliferation of color imaging applications has been accompanied by growing research in data encryption to secure color images against adversary attacks. While recent color image encryption techniques perform reasonably well, they still exhibit vulnerabilities and deficiencies in terms of statistical security measures due to image data redundancy and inherent weaknesses. This paper proposes two encryption algorithms that largely address these deficiencies and boost the security strength through novel integration of random fractional Fourier transforms, phase retrieval algorithms, and chaotic scrambling and diffusion. We show through detailed experiments and statistical analysis that the proposed enhancements significantly improve security measures and immunity to attacks.
Vadapalli, Sriharsha Babu; Atluri, Kaleswararao; Putcha, Madhu Sudhan; Kondreddi, Sirisha; Kumar, N. Suman; Tadi, Durga Prasad
2016-01-01
Objectives: This in vitro study was designed to compare polyvinyl-siloxane (PVS) monophase and polyether (PE) monophase materials under dry and moist conditions for properties such as surface detail reproduction, dimensional stability, and gypsum compatibility. Materials and Methods: Surface detail reproduction was evaluated using two criteria. Dimensional stability was evaluated according to American Dental Association (ADA) specification no. 19. Gypsum compatibility was assessed by two criteria. All the samples were evaluated, and the data obtained were analyzed by two-way analysis of variance (ANOVA) and Pearson's Chi-square tests. Results: When surface detail reproduction was evaluated with a modification of ADA specification no. 19, the two groups showed no statistically significant difference under either condition. When evaluated macroscopically, both groups showed a statistically significant difference. Results for dimensional stability showed that the deviation from standard was significant between the two groups, with the Aquasil group showing significantly more deviation than the Impregum group (P < 0.001). The two conditions also differed significantly, with moist conditions showing significantly more deviation than dry conditions (P < 0.001). Gypsum compatibility, evaluated with a modification of ADA specification no. 19 by grading the casts for both groups under both conditions, showed no statistically significant difference. Conclusion: Regarding dimensional stability, both Impregum and Aquasil performed better in dry conditions than in moist; Impregum performed better than Aquasil under both conditions. When tested for surface detail reproduction according to the ADA specification, the two materials performed almost equally under dry and moist conditions. When tested by macroscopic evaluation, Impregum and Aquasil performed significantly better in dry conditions than in moist conditions. In dry conditions, both materials performed almost equally. In moist conditions, Aquasil performed significantly better than Impregum. Regarding gypsum compatibility according to the ADA specification, both materials performed almost equally in dry conditions, and in moist conditions Aquasil performed better than Impregum. When tested by macroscopic evaluation, Impregum performed better than Aquasil under both conditions. PMID:27583217
Application of spatial technology in malaria research & control: some new insights.
Saxena, Rekha; Nagpal, B N; Srivastava, Aruna; Gupta, S K; Dash, A P
2009-08-01
Geographical Information System (GIS) technology has emerged as the core of spatial technology, integrating a wide range of datasets available from different sources, including Remote Sensing (RS) and the Global Positioning System (GPS). Literature published during the decade 1998-2007 has been compiled and grouped into six categories according to the usage of the technology in malaria epidemiology. Different GIS modules, such as spatial data sources, mapping and geo-processing tools, distance calculation, digital elevation models (DEM), buffer zones, and geo-statistical analysis, have been investigated in detail and illustrated with examples as per the derived results. These GIS tools have contributed immensely to understanding the epidemiological processes of malaria, and the examples drawn show that GIS is now widely used for research and decision making in malaria control. Statistical data analysis is currently the most consistent and established set of tools for analyzing spatial datasets. The desired future development of GIS lies in the utilization of geo-statistical tools, which, combined with high-quality data, have the capability to provide new insight into malaria epidemiology and the complexity of its transmission potential in endemic areas.
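The distance calculation underlying GIS buffer-zone analysis can be sketched with the standard haversine great-circle formula. The coordinates below (a village and a hypothetical mosquito breeding site) are invented for illustration, and real GIS packages use more elaborate geodesic models.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km between two lat/lon points (degrees):
    the kind of distance calculation behind GIS buffer-zone analysis."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Is the hypothetical village inside a 2 km buffer around the breeding site?
d = haversine_km(28.6100, 77.2300, 28.6250, 77.2400)
print(d < 2.0)  # -> True
```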
Analysis of the statistical properties of pulses in atmospheric corona discharge
NASA Astrophysics Data System (ADS)
Aubrecht, L.; Koller, J.; Plocek, J.; Stanék, Z.
2000-03-01
The properties of the negative corona current pulses in a single point-to-plane configuration have been extensively studied by many investigators. The amplitude and the interval of these pulses are not generally constant and depend on many variables. The repetition rate and the amplitude of the pulses fluctuate in time. Since these fluctuations are subject to a certain probability distribution, statistical processing was used for the analysis of the pulse fluctuations. The behavior of the pulses has also been investigated in a multipoint geometry configuration. The dependence of the behavior of the corona pulses on the gap length, the material and shape of the point electrode, and the number and separation of electrodes (in the multiple-point mode) has also been investigated; no detailed study of this case has been carried out until now. Attention has also been devoted to the study of pulses on the points of live materials (needles of coniferous trees). This contribution describes recent studies of the statistical properties of the pulses under various conditions.
Fatal falls in the US construction industry, 1990 to 1999.
Derr, J; Forst, L; Chen, H Y; Conroy, L
2001-10-01
The Occupational Safety and Health Administration's (OSHA's) Integrated Management Information System (IMIS) database allows for the detailed analysis of risk factors surrounding fatal occupational events. This study used IMIS data to (1) perform a risk factor analysis of fatal construction falls, and (2) assess the impact of the February 1995 29 CFR Part 1926 Subpart M OSHA fall protection regulations for construction by calculating trends in fatal fall rates. In addition, IMIS data on fatal construction falls were compared with data from other occupational fatality surveillance systems. For falls in construction, the study identified several demographic factors that may indicate increased risk. A statistically significant downward trend in fatal falls was evident in all construction and within several construction categories during the decade. Although the study failed to show a statistically significant intervention effect from the new OSHA regulations, it may have lacked the power to do so.
Tasker, Gary D.; Granato, Gregory E.
2000-01-01
Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations, and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables being used to analyze data may determine which statistical methods are appropriate for data analysis.
An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide the information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
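The log-transformation and regression-diagnostics workflow described above can be sketched in a few lines. The data here are synthetic (a hypothetical log-log relation between flow and constituent load), and a real analysis would add the scatterplot inspection and fuller diagnostics the text calls for.

```python
import random

def ols(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

random.seed(7)
# Hypothetical storm data: log10(load) grows linearly with log10(flow)
log_q = [random.uniform(0.0, 2.0) for _ in range(200)]
log_load = [0.5 + 1.2 * q + random.gauss(0.0, 0.1) for q in log_q]

a, b = ols(log_q, log_load)
# Basic diagnostic: OLS residuals should sum to (numerically) zero
resid = [yi - (a + b * xi) for xi, yi in zip(log_q, log_load)]
print(round(b, 2), round(sum(resid), 6))
```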
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahfuz, H.; Maniruzzaman, M.; Vaidya, U.
1997-04-01
Monotonic tensile and fatigue response of continuous silicon carbide fiber reinforced silicon nitride (SiC{sub f}/Si{sub 3}N{sub 4}) composites has been investigated. The monotonic tensile tests were performed at room and elevated temperatures. Fatigue tests were conducted at room temperature (RT) at a stress ratio R = 0.1 and a frequency of 5 Hz. It is observed during the monotonic tests that the composites retain only 30% of their room-temperature strength at 1,600 C, suggesting substantial chemical degradation of the matrix at that temperature. The softening of the matrix at elevated temperature also causes a reduction in tensile modulus, and the total reduction in modulus is around 45%. Fatigue data have been generated at three load levels, and the fatigue strength of the composite has been found to be considerably high: about 75% of its ultimate room-temperature strength. Extensive statistical analysis has been performed to understand the degree of scatter in the fatigue as well as in the static test data. Weibull shape factors and characteristic values have been determined for each set of tests, and their relationship with the response of the composites is discussed. A statistical fatigue life prediction method developed from the Weibull distribution is also presented. A maximum likelihood estimator with censoring techniques and data pooling schemes has been employed to determine the distribution parameters for the statistical analysis. These parameters have been used to generate the S-N diagram with the desired level of reliability. Details of the statistical analysis and a discussion of the static and fatigue behavior of the composites are presented in this paper.
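The Weibull shape factor and characteristic value can be estimated by maximum likelihood as described, here sketched for the simple uncensored two-parameter case (the paper's estimator additionally handles censoring and data pooling). The fatigue lives below are synthetic, drawn from a known Weibull distribution so the recovered parameters can be checked.

```python
import math
import random

def weibull_mle(data, lo=0.05, hi=50.0, iters=60):
    """Two-parameter Weibull MLE: solve the profile likelihood equation
    for the shape factor k by bisection, then recover the scale
    (characteristic value)."""
    logs = [math.log(t) for t in data]
    mean_log = sum(logs) / len(data)

    def g(k):
        tk = [t ** k for t in data]
        return sum(t * l for t, l in zip(tk, logs)) / sum(tk) - 1.0 / k - mean_log

    for _ in range(iters):  # g is negative below the root, positive above
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    scale = (sum(t ** k for t in data) / len(data)) ** (1.0 / k)
    return k, scale

random.seed(3)
# Synthetic fatigue lives ~ Weibull(scale=1e5 cycles, shape=2.0)
lives = [random.weibullvariate(1e5, 2.0) for _ in range(2000)]
k, lam = weibull_mle(lives)
print(round(k, 1))  # shape estimate near 2.0
```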
Computer-aided auditing of prescription drug claims.
Iyengar, Vijay S; Hermiz, Keith B; Natarajan, Ramesh
2014-09-01
We describe a methodology for identifying and ranking candidate audit targets from a database of prescription drug claims. The relevant audit targets may include various entities such as prescribers, patients and pharmacies, who exhibit certain statistical behavior indicative of potential fraud and abuse over the prescription claims during a specified period of interest. Our overall approach is consistent with related work in statistical methods for detection of fraud and abuse, but has a relative emphasis on three specific aspects: first, based on the assessment of domain experts, certain focus areas are selected and data elements pertinent to the audit analysis in each focus area are identified; second, specialized statistical models are developed to characterize the normalized baseline behavior in each focus area; and third, statistical hypothesis testing is used to identify entities that diverge significantly from their expected behavior according to the relevant baseline model. The application of this overall methodology to a prescription claims database from a large health plan is considered in detail.
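The hypothesis-testing step described above can be sketched with a simple Poisson upper-tail test: flag entities whose claim counts are improbably high under a baseline model. The baseline rate and entity names are hypothetical, and the paper's actual baseline models are specialized per focus area rather than a single Poisson.

```python
import math

def poisson_tail(k, lam):
    """P(X >= k) for X ~ Poisson(lam): an upper-tail p-value, computed
    by summing the lower tail and subtracting (fine for modest lam)."""
    p, term = 0.0, math.exp(-lam)
    for i in range(k):
        p += term
        term *= lam / (i + 1)
    return 1.0 - p

# Hypothetical focus area: monthly claims per prescriber, baseline mean 20
baseline = 20.0
counts = {"prescriber_A": 22, "prescriber_B": 41}
for name, k in counts.items():
    # Small p-values mark candidates that diverge from expected behavior
    print(name, round(poisson_tail(k, baseline), 4))
```

In practice such p-values would be ranked (and corrected for multiple testing) to produce the ordered list of audit targets.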
Statistical modeling of optical attenuation measurements in continental fog conditions
NASA Astrophysics Data System (ADS)
Khan, Muhammad Saeed; Amin, Muhammad; Awan, Muhammad Saleem; Minhas, Abid Ali; Saleem, Jawad; Khan, Rahimdad
2017-03-01
Free-space optics is an innovative technology that uses the atmosphere as a propagation medium to provide higher data rates. These links are heavily affected by the atmospheric channel, mainly because of fog and clouds that scatter and even block the modulated beam of light from reaching the receiver end, imposing severe attenuation. A comprehensive statistical study of fog effects and a deep physical understanding of the fog phenomena are very important for suggesting improvements in the reliability and efficiency of such communication systems. In this regard, six months of real-time measured fog attenuation data are considered and statistically investigated. A detailed statistical analysis of each fog event in that period is presented; the best probability density functions are selected on the basis of the Akaike information criterion, while the estimates of unknown parameters are computed by the maximum likelihood estimation technique. The results show that most fog attenuation events follow a normal mixture distribution and some follow the Weibull distribution.
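The model-selection recipe above, fit each candidate by maximum likelihood, then pick the lowest AIC, can be sketched with two simple candidates. For brevity this compares normal vs. exponential on synthetic right-skewed "attenuation" samples; the paper's candidates (normal mixture, Weibull) follow the same AIC logic.

```python
import math
import random

def aic_normal(xs):
    """AIC for a normal fit: closed-form MLE, k = 2 parameters."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n           # MLE variance
    ll = -0.5 * n * (math.log(2 * math.pi * var) + 1)  # max log-likelihood
    return 2 * 2 - 2 * ll

def aic_exponential(xs):
    """AIC for an exponential fit: MLE rate, k = 1 parameter."""
    n = len(xs)
    lam = n / sum(xs)
    ll = n * math.log(lam) - lam * sum(xs)
    return 2 * 1 - 2 * ll

random.seed(5)
# Synthetic fog-attenuation samples (dB/km), strongly right-skewed
atten = [random.expovariate(0.5) for _ in range(1000)]
print(aic_exponential(atten) < aic_normal(atten))  # AIC favors the exponential
```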
The distribution of density in supersonic turbulence
NASA Astrophysics Data System (ADS)
Squire, Jonathan; Hopkins, Philip F.
2017-11-01
We propose a model for the statistics of the mass density in supersonic turbulence, which plays a crucial role in star formation and the physics of the interstellar medium (ISM). The model is derived by considering the density to be arranged as a collection of strong shocks of width ˜ M^{-2}, where M is the turbulent Mach number. With two physically motivated parameters, the model predicts all density statistics for M>1 turbulence: the density probability distribution and its intermittency (deviation from lognormality), the density variance-Mach number relation, power spectra and structure functions. For the proposed model parameters, reasonable agreement is seen between model predictions and numerical simulations, albeit within the large uncertainties associated with current simulation results. More generally, the model could provide a useful framework for more detailed analysis of future simulations and observational data. Due to the simple physical motivations for the model in terms of shocks, it is straightforward to generalize to more complex physical processes, which will be helpful in future more detailed applications to the ISM. We see good qualitative agreement between such extensions and recent simulations of non-isothermal turbulence.
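As background to the intermittency (deviation from lognormality) the model predicts, the baseline lognormal density PDF and the standard density variance-Mach relation can be sketched numerically. The forcing parameter b = 0.4 below is an illustrative mid-range assumption, not a value from the paper, whose model goes beyond this lognormal baseline.

```python
import math

def sigma2_of_mach(mach, b=0.4):
    """Widely used density variance-Mach relation, sigma_s^2 = ln(1 + b^2 M^2);
    b depends on the turbulence forcing (illustrative value here)."""
    return math.log(1.0 + (b * mach) ** 2)

def lognormal_s_pdf(s, sigma2):
    """Baseline lognormal model for s = ln(rho/rho0): Gaussian in s with
    mean -sigma2/2 so that the mean density is conserved."""
    mean = -0.5 * sigma2
    return math.exp(-((s - mean) ** 2) / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

sigma2 = sigma2_of_mach(5.0)   # M = 5 supersonic turbulence
ds = 0.001
grid = [-12.0 + i * ds for i in range(24001)]
norm = sum(lognormal_s_pdf(s, sigma2) for s in grid) * ds
mean_rho = sum(math.exp(s) * lognormal_s_pdf(s, sigma2) for s in grid) * ds
print(round(norm, 3), round(mean_rho, 3))  # both integrals ~ 1
```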
Analysis of the Waggle Dance Motion of Honeybees for the Design of a Biomimetic Honeybee Robot
Landgraf, Tim; Rojas, Raúl; Nguyen, Hai; Kriegel, Fabian; Stettin, Katja
2011-01-01
The honeybee dance “language” is one of the most popular examples of information transfer in the animal world. Today, more than 60 years after its discovery, it still remains unknown how follower bees decode the information contained in the dance. In order to build a robotic honeybee that allows a deeper investigation of the communication process, we have recorded hundreds of videos of waggle dances. In this paper we analyze the statistics of visually captured high-precision dance trajectories of European honeybees (Apis mellifera carnica). The trajectories were produced using a novel automatic tracking system and represent the most detailed honeybee dance motion information available. Although honeybee dances seem very variable, some properties turned out to be invariant. We use these properties as a minimal set of parameters that enables us to model the honeybee dance motion. We provide a detailed statistical description of various dance properties that have not been characterized before and discuss the role of particular dance components in the communication process. PMID:21857906
Extreme event statistics in a drifting Markov chain
NASA Astrophysics Data System (ADS)
Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur
2017-07-01
We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
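The drift-free baseline for the record statistics discussed above is the classic i.i.d. result: after n draws the expected number of upper records is the harmonic number H_n. A short simulation (with invented parameters, unrelated to the atomic traces) illustrates the benchmark against which drift effects would show up as deviations.

```python
import random

def count_records(xs):
    """Number of upper records in a sequence (the first entry counts)."""
    best, n = float("-inf"), 0
    for x in xs:
        if x > best:
            best, n = x, n + 1
    return n

random.seed(11)
n, trials = 100, 5000
mean_records = sum(count_records([random.random() for _ in range(n)])
                   for _ in range(trials)) / trials
harmonic = sum(1.0 / k for k in range(1, n + 1))  # exact i.i.d. expectation
print(round(mean_records, 2), round(harmonic, 2))
```

A systematic excess of records over H_n in real trajectories is one signature of a positive drift of the kind the experiment probes.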
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merrill, D.W.; Selvin, S.; Close, E.R.
In studying geographic disease distributions, one normally compares rates of arbitrarily defined geographic subareas (e.g., census tracts), thereby sacrificing the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP). Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease rates are constant. On the transformed map, the statistical analysis of the observed distribution is greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be reliably calculated. The present report describes the first successful application of the DEMP technique to a sizeable "real-world" data set of epidemiologic interest. An improved DEMP algorithm [GUSE93, CLOS94] was applied to a data set previously analyzed with conventional techniques [SATA90, REYN91]. The results from the DEMP analysis and a conventional analysis are compared.
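The simplification DEMP buys can be sketched as follows: on the density-equalized map, every equal-area cell expects the same number of cases, so clustering reduces to a plain uniformity test. The cell count and case count below are hypothetical, and this is only the test idea, not the DEMP boundary-adjustment algorithm itself.

```python
import random

def chi2_uniform(counts):
    """Pearson chi-square statistic against a uniform expectation; on a
    density-equalized map every cell expects the same number of cases."""
    total = sum(counts)
    expected = total / len(counts)
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(2)
cells = 20
cases = [0] * cells
for _ in range(400):                 # cases scattered uniformly (no cluster)
    cases[random.randrange(cells)] += 1
stat = chi2_uniform(cases)
# Compare stat to a chi-square distribution with cells-1 = 19 degrees of
# freedom; values far in the upper tail would suggest clustering.
print(round(stat, 1))
```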
Humans make efficient use of natural image statistics when performing spatial interpolation.
D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S
2013-12-16
Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.
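The simplest of the heuristics mentioned, estimating a missing pixel by the local mean, can be sketched directly. The 3x3 patch below is invented for illustration; the study's optimal observers go well beyond this by using the learned statistical structure of natural images.

```python
def local_mean_estimate(img, r, c):
    """Estimate a missing pixel as the mean of its available 8-neighbors:
    the baseline heuristic that optimal observers are compared against."""
    vals = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) == (0, 0):
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(img) and 0 <= cc < len(img[0]):
                vals.append(img[rr][cc])
    return sum(vals) / len(vals)

# Tiny illustrative patch; the centre pixel is treated as missing
patch = [[10, 12, 11],
         [13,  0, 12],
         [11, 12, 13]]
print(local_mean_estimate(patch, 1, 1))  # -> 11.75
```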
NASA Astrophysics Data System (ADS)
Dennison, Andrew G.
Classification of the seafloor substrate can be done with a variety of methods, including visual (dives, drop cameras), mechanical (cores, grab samples), and acoustic (statistical analysis of echosounder returns) techniques. Acoustic methods offer a more powerful and efficient means of collecting useful information about the bottom type. Due to the nature of an acoustic survey, larger areas can be sampled, and combining the collected data with visual and mechanical survey methods provides greater confidence in the classification of a mapped region. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical character of a sonar backscatter mosaic depends on bottom type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, i.e., distinguish a muddy area from a rocky area, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing of high-resolution multibeam data can capture the pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. The development of a new classification method is described here, based upon the analysis of textural features in conjunction with ground truth sampling. The processing and classification results for two geologically distinct areas in nearshore regions of Lake Superior, off the Lester River, MN and the Amnicon River, WI, are presented here, using the Minnesota Supercomputer Institute's Mesabi computing cluster for initial processing. Processed data are then calibrated using ground truth samples to conduct an accuracy assessment of the surveyed areas.
From analysis of high-resolution bathymetry data collected at both survey sites it was possible to successfully calculate a series of measures that describe textural information about the lake floor. Further processing suggests that the calculated features also capture a significant amount of statistical information about the lake floor terrain. Two sources of error, an anomalous heave and a refraction error, significantly deteriorated the quality of the processed data and the resulting validation results. Ground truth samples used to validate the classification methods for both survey sites, however, yielded accuracy values ranging from 5-30 percent at the Amnicon River and 60-70 percent at the Lester River. The final results suggest that this new processing methodology adequately captures textural information about the lake floor and provides an acceptable classification in the absence of significant data quality issues.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables on federal funds for research and development (R&D) activities are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D- agency, character of work, and performer; total research- agency, performer, and field of science; basic research- agency,…
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables showing the funding levels of 92 federal agencies for research and development (R&D) are provided in this document. These tables are organized into the following sections: research, development, and R&D plant; R&D agency, character of work, and performer; total basic and applied research--agency,…
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC.
During the March through July 1981 period a total of 36 Federal agencies and their subdivisions (95 individual respondents) submitted data in response to the Annual Survey of Federal Funds for Research and Development, Volume XXX, conducted by the National Science Foundation. The detailed statistical tables presented in this report were derived…
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables on federal funds for research and development (R&D) are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total research--agency, performer, and field of science; basic research--agency, performer,…
Analysis of Yb3+/Er3+-codoped microring resonator cross-grid matrices
NASA Astrophysics Data System (ADS)
Vallés, Juan A.; Gǎlǎtuş, Ramona
2014-09-01
An analytic model of the scattering response of a highly Yb3+/Er3+-codoped phosphate glass microring resonator matrix is considered to obtain the transfer functions of an M x N cross-grid microring resonator structure. A detailed model is then used to calculate the pump and signal propagation, including a microscopic statistical formalism to describe the high-concentration-induced energy-transfer mechanisms; passive and active features are combined to realistically simulate the performance as a wavelength-selective amplifier or laser. This analysis allows the optimization of these structures for telecom or sensing applications.
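For readers unfamiliar with ring-resonator transfer functions, the standard single-ring all-pass form (a textbook building block, not the paper's full M x N cross-grid model) can be evaluated directly. The self-coupling coefficient t and round-trip amplitude a below are illustrative assumptions:

```python
import numpy as np

t, a = 0.95, 0.98                        # self-coupling and round-trip amplitude (assumed)
phi = np.linspace(-np.pi, np.pi, 2001)   # round-trip phase detuning

# Through-port field transfer function of a single all-pass microring:
# H(phi) = (t - a e^{i phi}) / (1 - t a e^{i phi})
H = (t - a * np.exp(1j * phi)) / (1 - t * a * np.exp(1j * phi))
T = np.abs(H) ** 2

print(T.min(), T[0])  # deep resonance dip at phi = 0; near unity off resonance
```

Cross-grid matrices chain many such building blocks (with add-drop couplings), which is what the paper's analytic model assembles before adding the gain dynamics.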
Performance of a Lexical and POS Tagger for Sanskrit
NASA Astrophysics Data System (ADS)
Hellwig, Oliver
Due to the phonetic, morphological, and lexical complexity of Sanskrit, the automatic analysis of this language is a real challenge in the area of natural language processing. The paper describes a series of tests that were performed to assess the accuracy of the tagging program SanskritTagger. To our knowledge, it offers the first reliable benchmark data for evaluating the quality of taggers for Sanskrit using an unrestricted dictionary and texts from different domains. Based on a detailed analysis of the test results, the paper points out possible directions for future improvements of statistical tagging procedures for Sanskrit.
Precise measurement of the half-life of the Fermi {beta} decay of {sup 26}Al{sup m}
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Rebecca J.; Thompson, Maxwell N.; Rassool, Roger P.
2011-08-15
State-of-the-art signal digitization and analysis techniques have been used to measure the half-life of the Fermi {beta} decay of {sup 26}Al{sup m}. The half-life was determined to be 6347.8 {+-} 2.5 ms. This new datum contributes to the experimental testing of the conserved-vector-current hypothesis and the required unitarity of the Cabibbo-Kobayashi-Maskawa matrix: two essential components of the standard model. Detailed discussion of the experimental techniques and data analysis and a thorough investigation of the statistical and systematic uncertainties are presented.
Teaching Statistics Online Using "Excel"
ERIC Educational Resources Information Center
Jerome, Lawrence
2011-01-01
As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…
Mórocz, István Akos; Janoos, Firdaus; van Gelderen, Peter; Manor, David; Karni, Avi; Breznitz, Zvia; von Aster, Michael; Kushnir, Tammar; Shalev, Ruth
2012-01-01
The aim of this article is to report on the importance and challenges of a time-resolved, spatio-temporal analysis of fMRI data from complex cognitive processes and associated disorders, using a study on developmental dyscalculia (DD). Participants underwent fMRI while judging the incorrectness of multiplication results, and the data were analyzed using a sequence of methods, each of which progressively provided a more detailed picture of the spatio-temporal aspects of this disorder. Healthy subjects and subjects with DD performed alike behaviorally, though they exhibited parietal disparities in traditional voxel-based group analyses. Further and more detailed differences, however, surfaced with a time-resolved examination of the neural responses during the experiment. While performing inter-group comparisons, a third group of subjects with dyslexia (DL) but no arithmetic difficulties was included to test the specificity of the analysis and strengthen the statistical base, for fifty-eight subjects overall. Surprisingly, the analysis showed a functional dissimilarity during an initial reading phase for the group of dyslexic but otherwise normal subjects with respect to controls, even though only numerical digits and no alphabetic characters were presented. Thus our results suggest that time-resolved multivariate analysis of complex experimental paradigms has the ability to yield powerful new clinical insights about abnormal brain function. Similarly, a detailed compilation of aberrations in the functional cascade may have much greater potential to delineate the core processing problems in mental disorders. PMID:22368322
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting to CONSORT and to explore associated trial-level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured against the methods section of the CONSORT checklist, and the overall CONSORT methodological items score was calculated and expressed as a percentage. We hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (mean ± SD). No RCT reported descriptions of and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score.
The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
Kotter, Elmar; Bley, Thorsten A; Saueressig, Ulrich; Fisch, Dagmar; Springer, Oliver; Winterer, Jan Torsten; Schaefer, Oliver; Langer, Mathias
2003-11-01
To evaluate the detection rate of fine details on a new thin-film transistor (TFT) grayscale monitor designed for radiologic diagnosis, compared with a type of cathode ray tube (CRT) screen used routinely for diagnostic radiology. Fifteen radiographs of a statistical phantom presenting low- and high-contrast details were obtained and read out with an Agfa ADC compact storage phosphor system. Each radiograph presented 60 high-density (high-contrast) and 60 low-density (low-contrast) test bodies. Approximately half the test bodies contained holes with different diameters. Observers were asked to report the presence or absence of a hole in each test body on a 5-point confidence scale. The total of 1800 test bodies was reviewed by 5 radiologists on the TFT monitor (20.8 inches; 1536 x 2048 pixels; maximum luminance, 650 cd/m2; contrast, 600:1) and the CRT monitor (21 inches; P45 phosphor; 2048 x 2560 pixels operated at 1728 x 2304 pixels; maximum luminance, 600 cd/m2; contrast, 300:1). The data were analyzed by receiver operating characteristic analysis. For high-contrast details, the mean area under the curve was 0.9336 for the TFT monitor and 0.9312 for the CRT monitor. For low-contrast details, the mean area under the curve was 0.9189 for the TFT monitor and 0.9224 for the CRT monitor. At P
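Observer ratings on a 5-point confidence scale are typically reduced to an area under the ROC curve; one standard route is the Mann-Whitney identity, sketched here on hypothetical ratings (not the study's actual data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 5-point confidence ratings: higher = more confident a hole is present.
present = rng.choice([2, 3, 4, 5, 5], size=60)   # test bodies with a hole
absent  = rng.choice([1, 1, 2, 3, 4], size=60)   # test bodies without a hole

# AUC equals the Mann-Whitney U statistic normalized by n1*n2:
# P(rating_present > rating_absent) + 0.5 * P(tie).
diff = present[:, None] - absent[None, :]
auc = (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size
print(round(auc, 3))
```

Dedicated ROC software additionally fits a smooth (e.g. binormal) curve and provides standard errors for comparing the two monitors, but the nonparametric AUC above is the underlying quantity.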
Online Updating of Statistical Inference in the Big Data Setting.
Schifano, Elizabeth D; Wu, Jing; Wang, Chun; Yan, Jun; Chen, Ming-Hui
2016-01-01
We present statistical methods for big data arising from online analytical processing, where large amounts of data arrive in streams and require fast analysis without storage/access to the historical data. In particular, we develop iterative estimating algorithms and statistical inferences for linear models and estimating equations that update as new data arrive. These algorithms are computationally efficient, minimally storage-intensive, and allow for possible rank deficiencies in the subset design matrices due to rare-event covariates. Within the linear model setting, the proposed online-updating framework leads to predictive residual tests that can be used to assess the goodness-of-fit of the hypothesized model. We also propose a new online-updating estimator under the estimating equation setting. Theoretical properties of the goodness-of-fit tests and proposed estimators are examined in detail. In simulation studies and real data applications, our estimator compares favorably with competing approaches under the estimating equation setting.
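The core idea of online updating for linear models can be sketched in a few lines: accumulate the sufficient statistics X'X and X'y batch by batch, so the historical rows never need to be stored. This is a generic illustration of the streaming-OLS principle, not the authors' full framework (which also handles rank deficiencies and predictive residual tests):

```python
import numpy as np

rng = np.random.default_rng(2)
p = 3
beta_true = np.array([1.0, -2.0, 0.5])

# Accumulate X'X (p x p) and X'y (p,) over arriving batches; storage is O(p^2)
# no matter how many observations stream past.
XtX = np.zeros((p, p))
Xty = np.zeros(p)
all_X, all_y = [], []                     # kept only for the sanity check below
for _ in range(50):                       # 50 data batches arriving in a stream
    X = rng.normal(size=(100, p))
    y = X @ beta_true + rng.normal(size=100)
    XtX += X.T @ X
    Xty += X.T @ y
    all_X.append(X); all_y.append(y)

beta_stream = np.linalg.solve(XtX, Xty)

# Sanity check: identical (to rounding) to the full-data OLS fit.
beta_full, *_ = np.linalg.lstsq(np.vstack(all_X), np.concatenate(all_y), rcond=None)
print(np.allclose(beta_stream, beta_full))
```

The paper's estimating-equation extension replaces X'X and X'y with accumulated score and information contributions, but the update pattern is the same.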
Preparing and Presenting Effective Research Posters
Miller, Jane E
2007-01-01
Objectives Posters are a common way to present results of a statistical analysis, program evaluation, or other project at professional conferences. Often, researchers fail to recognize the unique nature of the format, which is a hybrid of a published paper and an oral presentation. This methods note demonstrates how to design research posters to convey study objectives, methods, findings, and implications effectively to varied professional audiences. Methods A review of existing literature on research communication and poster design is used to identify and demonstrate important considerations for poster content and layout. Guidelines on how to write about statistical methods, results, and statistical significance are illustrated with samples of ineffective writing annotated to point out weaknesses, accompanied by concrete examples and explanations of improved presentation. A comparison of the content and format of papers, speeches, and posters is also provided. Findings Each component of a research poster about a quantitative analysis should be adapted to the audience and format, with complex statistical results translated into simplified charts, tables, and bulleted text to convey findings as part of a clear, focused story line. Conclusions Effective research posters should be designed around two or three key findings with accompanying handouts and narrative description to supply additional technical detail and encourage dialog with poster viewers. PMID:17355594
A statistical study of EMIC waves observed by Cluster. 1. Wave properties. EMIC Wave Properties
Allen, R. C.; Zhang, J. -C.; Kistler, L. M.; ...
2015-07-23
Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters is required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In our study, we present a statistical analysis of EMIC wave properties using 10 years (2001–2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. The statistical analysis is thus presented in two papers. Our paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.
GPUs for statistical data analysis in HEP: a performance study of GooFit on GPUs vs. RooFit on CPUs
NASA Astrophysics Data System (ADS)
Pompili, Alexis; Di Florio, Adriano; CMS Collaboration
2016-10-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented both in the ROOT/RooFit and GooFit frameworks, with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-ups with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The resulting speed-up, measured by comparing concurrent GooFit processes allowed by the CUDA Multi Process Service against a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood-ratio test statistic in different situations, in which the Wilks theorem may apply or may not apply because its regularity conditions are not satisfied.
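As a minimal illustration of the regular case in which Wilks' theorem does apply (as opposed to the boundary cases alluded to above), one can Monte Carlo the likelihood-ratio statistic for a Gaussian mean test and check its chi-squared tail. For a unit-variance Gaussian, -2 ln(lambda) for H0: mu = 0 versus free mu reduces to n times the squared sample mean:

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 50, 20000

# Under H0 (regularity conditions hold), -2 ln(lambda) = n * xbar^2 ~ chi2(1).
x = rng.normal(0.0, 1.0, size=(trials, n))
stat = n * x.mean(axis=1) ** 2

# chi2(1) 95th percentile is 1.96^2 ~ 3.841; the empirical tail should be ~5%.
tail = (stat > 3.841).mean()
print(round(tail, 3))
```

When the parameter of interest sits on a boundary (e.g. a non-negative signal yield), this agreement breaks down, and the null distribution must be built by simulation, which is exactly where GPU-accelerated toy Monte Carlo pays off.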
Photographic and photometric enhancement of Lunar Orbiter products, projects A, B and C
NASA Technical Reports Server (NTRS)
1972-01-01
A detailed discussion is presented of the framelet joining, photometric data improvement, and statistical error analysis. The Lunar Orbiter film handling system, readout system, and the digitization are described, along with the technique of joining adjacent framelets by a using a digital computer. Time and cost estimates are given. The problems and techniques involved in improving the digitized data are discussed. It was found that spectacular improvements are possible. Program documentations are included.
Bose-Einstein distribution of money in a free-market economy. II
NASA Astrophysics Data System (ADS)
Kürten, K. E.; Kusmartsev, F. V.
2011-01-01
We argue for the application of methods of statistical mechanics to the free economy (Kusmartsev F. V., Phys. Lett. A, 375 (2011) 966) and find that the most general distribution of money or income in a free-market economy has the general form of a Bose-Einstein distribution. The market is thereby described by three parameters: temperature, chemical potential, and the space dimensionality. Numerical simulations and a detailed analysis of a generic model confirm this finding.
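The quoted Bose-Einstein form can be evaluated numerically; the "temperature" and chemical potential below are arbitrary illustrative values, not parameters fitted in the paper:

```python
import numpy as np

# Bose-Einstein occupation applied to money levels m:
# n(m) = 1 / (exp((m - mu) / T) - 1), with mu below the lowest level.
T, mu = 10.0, -0.5                   # assumed "temperature" and chemical potential
m = np.linspace(0.0, 100.0, 1001)    # money levels

n = 1.0 / (np.exp((m - mu) / T) - 1.0)
p = n / n.sum()                      # normalize to a probability distribution

# Mean money per agent under this distribution
print(round(float((m * p).sum()), 2))
```

The distribution is monotonically decreasing in m, concentrating probability at low money levels, which is the qualitative feature the authors emphasize for free-market income data.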
1994-07-01
Scholz, J., 1993, "Analysis of the Track-Before-Detect Approach to Target Detection using Pixel Statistics," to appear in IEEE Transactions, 1730-1732. One approach to combining energy in different spatial cells of large surveillance arrays is track-before-detect; references to examples appear in the next section. The results obtained for the track-before-detect problem are not expected to depend strongly on model details; in particular, the structure of the tracking
Barišić, Ivan; Mitteregger, Dieter; Hirschl, Alexander M; Noehammer, Christa; Wiesinger-Mayr, Herbert
2014-10-01
The detailed analysis of antibiotic resistance mechanisms is essential for understanding the underlying evolutionary processes, for the implementation of appropriate intervention strategies, and to guarantee efficient treatment options. In the present study, 110 β-lactam-resistant clinical isolates of Enterobacteriaceae, sampled in 2011 in one of Europe's largest hospitals, the General Hospital Vienna, were screened for the presence of 31 β-lactamase genes. Twenty of those isolates were selected for whole genome sequencing (WGS). In addition, the number of β-lactamase genes was estimated using biostatistical models. The carbapenemase genes blaKPC-2, blaKPC-3, and blaVIM-4 were identified in carbapenem-resistant and intermediate-susceptible isolates, and blaOXA-72 in an extended-spectrum β-lactamase (ESBL)-positive one. Furthermore, the observed high prevalence of the acquired blaDHA-1 and blaCMY AmpC β-lactamase genes (70%) in phenotypically AmpC-positive isolates is alarming due to their capability to render isolates carbapenem-resistant upon changes in membrane permeability. The statistical analyses revealed that approximately 55% of all β-lactamase genes present in the General Hospital Vienna were detected by this study. In summary, this work gives a very detailed picture of the β-lactamases and other resistance genes disseminated in one of Europe's largest hospitals. Copyright © 2014 Elsevier B.V. All rights reserved.
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
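As a concrete example of the hidden Markov machinery mentioned above, the forward algorithm computes the likelihood of an observation sequence with a single pass of matrix-vector products; the transition and emission numbers here are illustrative only:

```python
import numpy as np

# Forward algorithm for a 2-state HMM (a standard construction; the
# probabilities below are made up for illustration).
A = np.array([[0.7, 0.3],     # state transition matrix
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],     # emission probabilities for symbols 0/1
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])     # initial state distribution

obs = [0, 1, 1, 0]

alpha = pi * B[:, obs[0]]     # initialization
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]   # predict, then weight by emission

print(alpha.sum())  # likelihood P(obs) under the model
```

Summing over the hidden states at the end gives the marginal likelihood; in practice one works in log space (or rescales alpha) to avoid underflow on long sequences.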
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by the need for a study assessing the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
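The abstract does not disclose the simulation's actual drivers or weights, so the following is only a generic sketch of a Monte Carlo index forecast with assumed driver weights and triangular score distributions (none of these numbers come from the ACSI model):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical satisfaction drivers and weights (illustrative assumptions).
weights = np.array([0.4, 0.35, 0.25])     # e.g. quality, expectations, value
trials = 100_000
drivers = rng.triangular(left=[60, 55, 50],
                         mode=[78, 74, 70],
                         right=[95, 92, 90],
                         size=(trials, 3))
index = drivers @ weights                 # simulated index score per trial

# Baseline estimate and a 90% prediction interval for future scores.
print(round(index.mean(), 1), np.percentile(index, [5, 95]).round(1))
```

Sensitivity analysis then amounts to perturbing one driver's distribution at a time and observing how the simulated index distribution shifts.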
NASA Astrophysics Data System (ADS)
Vigan, A.; Chauvin, G.; Bonavita, M.; Desidera, S.; Bonnefoy, M.; Mesa, D.; Beuzit, J.-L.; Augereau, J.-C.; Biller, B.; Boccaletti, A.; Brugaletta, E.; Buenzli, E.; Carson, J.; Covino, E.; Delorme, P.; Eggenberger, A.; Feldt, M.; Hagelberg, J.; Henning, T.; Lagrange, A.-M.; Lanzafame, A.; Ménard, F.; Messina, S.; Meyer, M.; Montagnier, G.; Mordasini, C.; Mouillet, D.; Moutou, C.; Mugnier, L.; Quanz, S. P.; Reggiani, M.; Ségransan, D.; Thalmann, C.; Waters, R.; Zurlo, A.
2014-01-01
Over the past decade, a growing number of deep imaging surveys have started to provide meaningful constraints on the population of extrasolar giant planets at large orbital separation. Primary targets for these surveys have been carefully selected based on their age, distance and spectral type, and often on their membership to young nearby associations where all stars share common kinematics, photometric and spectroscopic properties. The next step is a wider statistical analysis of the frequency and properties of low mass companions as a function of stellar mass and orbital separation. In late 2009, we initiated a coordinated European Large Program using angular differential imaging in the H band (1.66 μm) with NaCo at the VLT. Our aim is to provide a comprehensive and statistically significant study of the occurrence of extrasolar giant planets and brown dwarfs at large (5-500 AU) orbital separation around ~150 young, nearby stars, a large fraction of which have never been observed at very deep contrast. The survey has now been completed and we present the data analysis and detection limits for the observed sample, for which we reach the planetary-mass domain at separations of >~50 AU on average. We also present the results of the statistical analysis that has been performed over the 75 targets newly observed at high-contrast. We discuss the details of the statistical analysis and the physical constraints that our survey provides for the frequency and formation scenario of planetary mass companions at large separation.
Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu
2016-01-01
Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study investigates the epidemiological characteristics, conduct of the literature search, methodological quality, and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer, 61 of which were conducted using a Bayesian framework. Of these, more than half did not report assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report assessment of similarity (86.89%) and did not use the GRADE tool to assess quality of evidence (95.08%). 43 NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of these NMAs. Only 4.65% of NMAs described the details of handling multi-group trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997
NASA Astrophysics Data System (ADS)
Mori, Kaya; Chonko, James C.; Hailey, Charles J.
2005-10-01
We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
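The Monte Carlo calibration of an F-like statistic can be illustrated on a deliberately regular toy case (nested linear models, where the tabulated F-distribution happens to be exact); the same machinery is what one applies when, as in the reanalysis above, the standard distribution cannot be trusted:

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials = 30, 5000
x = np.linspace(0.0, 1.0, n)

# Simulate null data sets, fit nested models, and read the critical value off
# the empirical distribution instead of trusting the asymptotic reference.
stats = np.empty(trials)
X1 = np.column_stack([np.ones(n), x])       # "continuum + extra component" model
for k in range(trials):
    y = 1.0 + rng.normal(0.0, 1.0, n)       # null: constant continuum only
    rss0 = np.sum((y - y.mean()) ** 2)      # constant-only fit
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss1 = np.sum((y - X1 @ beta) ** 2)
    stats[k] = (rss0 - rss1) / (rss1 / (n - 2))

# Empirical 95% critical value; for this regular nested case it should be
# close to the tabulated F(1, 28) value of ~4.20.
crit = float(np.quantile(stats, 0.95))
print(round(crit, 2))
```

For an absorption-line feature, the null simulations would instead draw fake spectra from the best-fit continuum model and refit with and without the line, exactly because the tabulated F-distribution is not even asymptotically correct there.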
Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora
2018-06-15
Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum-based normalization techniques and their application in the analysis of registered peak data and, for comparison, in full-spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex Speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full-spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved to depend on cultivation time. Both whole-spectrum-based normalization techniques together with the full-spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or emphasis on below-threshold peak data. The amounts of processed data remain manageable.
Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
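A hedged sketch of the workflow the abstract describes, whole-spectrum normalization followed by mean-centred PCA, is below. This is not the authors' ms-alone/multiMS-toolbox code; the spectra are synthetic, with two built-in "metabolic patterns" and per-sample intensity scale differences that the normalization must remove:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "spectra": two patterns over 300 m/z bins, with random per-sample
# overall intensity scales (the nuisance that normalization removes).
base_a = np.abs(rng.normal(1.0, 0.2, 300))
base_b = base_a.copy()
base_b[50:60] *= 5.0                            # pattern B differs in one region
spectra = np.vstack([s * rng.uniform(0.5, 2.0)  # random overall scale per sample
                     for s in [base_a] * 10 + [base_b] * 10])
spectra += np.abs(rng.normal(0.0, 0.02, spectra.shape))

# Whole-spectrum (total ion current) normalization, then mean-centred PCA via SVD.
norm = spectra / spectra.sum(axis=1, keepdims=True)
centred = norm - norm.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
pc1 = U[:, 0] * S[0]                            # scores on the first component

# The two groups should separate cleanly on PC1.
print(pc1[:10].mean(), pc1[10:].mean())
```

Working on the full normalized spectrum rather than extracted peaks sidesteps peak-matching errors, which is the point the authors make; the cost is higher dimensionality, handled naturally by the SVD.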
Guo, Shaoyin; Hihath, Joshua; Díez-Pérez, Ismael; Tao, Nongjian
2011-11-30
We report on the measurement and statistical study of thousands of current-voltage characteristics and transition voltage spectra (TVS) of single-molecule junctions with different contact geometries that are rapidly acquired using a new break junction method at room temperature. This capability allows one to obtain current-voltage, conductance voltage, and transition voltage histograms, thus adding a new dimension to the previous conductance histogram analysis at a fixed low-bias voltage for single molecules. This method confirms the low-bias conductance values of alkanedithiols and biphenyldithiol reported in literature. However, at high biases the current shows large nonlinearity and asymmetry, and TVS allows for the determination of a critically important parameter, the tunneling barrier height or energy level alignment between the molecule and the electrodes of single-molecule junctions. The energy level alignment is found to depend on the molecule and also on the contact geometry, revealing the role of contact geometry in both the contact resistance and energy level alignment of a molecular junction. Detailed statistical analysis further reveals that, despite the dependence of the energy level alignment on contact geometry, the variation in single-molecule conductance is primarily due to contact resistance rather than variations in the energy level alignment.
DICON: interactive visual analysis of multidimensional clusters.
Cao, Nan; Gotz, David; Sun, Jimeng; Qu, Huamin
2011-12-01
Clustering as a fundamental data analysis technique has been widely used in many analytic applications. However, it is often difficult for users to understand and evaluate multidimensional clustering results, especially the quality of clusters and their semantics. For large and complex data, high-level statistical information about the clusters is often needed for users to evaluate cluster quality while a detailed display of multidimensional attributes of the data is necessary to understand the meaning of clusters. In this paper, we introduce DICON, an icon-based cluster visualization that embeds statistical information into a multi-attribute display to facilitate cluster interpretation, evaluation, and comparison. We design a treemap-like icon to represent a multidimensional cluster, and the quality of the cluster can be conveniently evaluated with the embedded statistical information. We further develop a novel layout algorithm which can generate similar icons for similar clusters, making comparisons of clusters easier. User interaction and clutter reduction are integrated into the system to help users more effectively analyze and refine clustering results for large datasets. We demonstrate the power of DICON through a user study and a case study in the healthcare domain. Our evaluation shows the benefits of the technique, especially in support of complex multidimensional cluster analysis. © 2011 IEEE
Arabi, Yaseen; Al-Hameed, Fahad; Burns, Karen E A; Mehta, Sangeeta; Alsolamy, Sami; Almaani, Mohammed; Mandourah, Yasser; Almekhlafi, Ghaleb A; Al Bshabshe, Ali; Finfer, Simon; Alshahrani, Mohammed; Khalid, Imran; Mehta, Yatin; Gaur, Atul; Hawa, Hassan; Buscher, Hergen; Arshad, Zia; Lababidi, Hani; Al Aithan, Abdulsalam; Jose, Jesna; Abdukahil, Sheryl Ann I; Afesh, Lara Y; Dbsawy, Maamoun; Al-Dawood, Abdulaziz
2018-03-15
The Pneumatic CompREssion for Preventing VENous Thromboembolism (PREVENT) trial evaluates the effect of adjunctive intermittent pneumatic compression (IPC) with pharmacologic thromboprophylaxis compared to pharmacologic thromboprophylaxis alone on venous thromboembolism (VTE) in critically ill adults. In this multicenter randomized trial, critically ill patients receiving pharmacologic thromboprophylaxis will be randomized to an IPC or a no IPC (control) group. The primary outcome is "incident" proximal lower-extremity deep vein thrombosis (DVT) within 28 days after randomization. Radiologists interpreting the lower-extremity ultrasonography will be blinded to intervention allocation, whereas the patients and treating team will be unblinded. The trial has 80% power to detect a 3% absolute risk reduction in the rate of proximal DVT from 7% to 4%. Consistent with international guidelines, we have developed a detailed plan to guide the analysis of the PREVENT trial. This plan specifies the statistical methods for the evaluation of primary and secondary outcomes, and defines covariates for adjusted analyses a priori. Application of this statistical analysis plan to the PREVENT trial will facilitate unbiased analyses of clinical data. ClinicalTrials.gov , ID: NCT02040103 . Registered on 3 November 2013; Current controlled trials, ID: ISRCTN44653506 . Registered on 30 October 2013.
Capturing Fine Details Involving Low-Cost Sensors -a Comparative Study
NASA Astrophysics Data System (ADS)
Rehany, N.; Barsi, A.; Lovas, T.
2017-11-01
Capturing the fine details on the surface of small objects is a real challenge for many conventional surveying methods. Our paper discusses the investigation of several data acquisition technologies, such as arm scanner, structured light scanner, terrestrial laser scanner, object line-scanner, DSLR camera, and mobile phone camera. A palm-sized embossed sculpture reproduction was used as a test object; it was surveyed with all the instruments. The resulting point clouds and meshes were then analyzed, using the arm scanner's dataset as reference. In addition to general statistics, the results have been evaluated based both on 3D deviation maps and 2D deviation graphs; the latter allows an even more accurate analysis of the characteristics of the different data acquisition approaches. Additionally, custom-developed local-minimum maps were created that clearly visualize the potential level of detail provided by the applied technologies. Besides the usual geometric assessment, the paper discusses the different resource needs (cost, time, expertise) of the discussed techniques. Our results proved that even amateur sensors operated by amateur users can provide high-quality datasets that enable engineering analysis. Based on the results, the paper contains an outlook on potential future investigations in this field.
NASA Astrophysics Data System (ADS)
Tong, Xiaojun; Cui, Minggen; Wang, Zhu
2009-07-01
A new compound two-dimensional chaotic function is designed by exploiting two one-dimensional chaotic functions that switch randomly, and the compound function is used as a chaotic sequence generator whose chaotic behavior is proved under Devaney's definition of chaos. The properties of the compound chaotic functions are also proved rigorously. To improve robustness against differential cryptanalysis and to produce an avalanche effect, a new feedback image encryption scheme is proposed that uses the new compound chaos, selecting one of the two one-dimensional chaotic functions at random; a new permutation and substitution method for image pixels is designed in detail, based on random control of array rows and columns driven by the compound chaos. Results from entropy analysis, differential analysis, statistical analysis, sequence randomness analysis, and cipher sensitivity analysis with respect to key and plaintext show that the compound chaotic sequence cipher can resist cryptanalytic, statistical and brute-force attacks while accelerating encryption and achieving a higher level of security. Through dynamical compound chaos and perturbation technology, the paper also addresses the low computational precision of one-dimensional chaotic functions.
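As a toy illustration of the general idea (not the authors' scheme), the sketch below switches between two one-dimensional chaotic maps, a logistic map and a tent map, using the current state to pick the next map, and XORs the resulting keystream with the plaintext. The map parameters, seed, and switching rule are all illustrative assumptions, and a float-based XOR stream cipher like this is not secure in practice:

```python
def keystream(x0, sw0, n):
    """Keystream bytes from two randomly switched 1-D chaotic maps (toy example)."""
    x, sw = x0, sw0
    out = []
    for _ in range(n):
        if sw:  # logistic map, r = 3.99 (chaotic regime)
            x = 3.99 * x * (1.0 - x)
        else:   # tent map, mu = 1.99
            x = 1.99 * x if x < 0.5 else 1.99 * (1.0 - x)
        sw = x > 0.5            # toy switching rule: state chooses the next map
        out.append(int(x * 256) & 0xFF)
    return out

def xor_cipher(data: bytes, x0: float, sw0: bool) -> bytes:
    ks = keystream(x0, sw0, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

msg = b"attack at dawn"
key = (0.3141592653, True)      # hypothetical seed and initial map choice
ct = xor_cipher(msg, *key)
pt = xor_cipher(ct, *key)       # XOR with the same keystream decrypts
```

The paper's scheme additionally adds feedback and pixel permutation/substitution stages, which this sketch omits.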
Personal and professional profile of mountain medicine physicians.
Peters, Patrick
2003-01-01
The purpose of this study was to define and describe the personal and professional profile of mountain medicine physicians including general physical training information and to include a detailed overview of the practice of mountain sports. A group of physicians participating in a specialized mountain medicine education program filled out a standardized questionnaire. The data obtained from this questionnaire were first analyzed in a descriptive way and then by statistical methods (chi2 test, t test, and analysis of variance). Detailed results have been provided for gender, age, marital status, general training frequency and methods, professional status, additional medical qualifications, memberships in professional societies and alpine clubs, mountain sports practice, and injuries sustained during the practice of mountain sports. This study has provided a detailed overview concerning the personal and professional profile of mountain medicine physicians. Course organizers as well as official commissions regulating the education in mountain medicine will be able to use this information to adapt and optimize the courses and the recommendations/requirements as detailed by the UIAA-ICAR-ISMM (Union Internationale des Associations Alpinistes, International Commission for Alpine Rescue, International Society for Mountain Medicine).
Analyzing How We Do Analysis and Consume Data, Results from the SciDAC-Data Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, P.; Aliaga, L.; Mubarak, M.
One of the main goals of the Dept. of Energy funded SciDAC-Data project is to analyze the more than 410,000 high energy physics datasets that have been collected, generated and defined over the past two decades by experiments using the Fermilab storage facilities. These datasets have been used as the input to over 5.6 million recorded analysis projects, for which detailed analytics have been gathered. The analytics and meta information for these datasets and analysis projects are being combined with knowledge of their part of the HEP analysis chains for major experiments to understand how modern computing and data delivery is being used. We present the first results of this project, which examine in detail how the CDF, D0, NOvA, MINERvA and MicroBooNE experiments have organized, classified and consumed petascale datasets to produce their physics results. The results include analysis of the correlations in dataset/file overlap, data usage patterns, data popularity, dataset dependency and temporary dataset consumption. The results provide critical insight into how workflows and data delivery schemes can be combined with different caching strategies to more efficiently perform the work required to mine these large HEP data volumes and to understand the physics analysis requirements for the next generation of HEP computing facilities. In particular we present a detailed analysis of the NOvA data organization and consumption model corresponding to their first and second oscillation results (2014-2016) and the first look at the analysis of the Tevatron Run II experiments. We present statistical distributions for the characterization of these data and data driven models describing their consumption.
Analyzing how we do Analysis and Consume Data, Results from the SciDAC-Data Project
NASA Astrophysics Data System (ADS)
Ding, P.; Aliaga, L.; Mubarak, M.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.
2017-10-01
One of the main goals of the Dept. of Energy funded SciDAC-Data project is to analyze the more than 410,000 high energy physics datasets that have been collected, generated and defined over the past two decades by experiments using the Fermilab storage facilities. These datasets have been used as the input to over 5.6 million recorded analysis projects, for which detailed analytics have been gathered. The analytics and meta information for these datasets and analysis projects are being combined with knowledge of their part of the HEP analysis chains for major experiments to understand how modern computing and data delivery is being used. We present the first results of this project, which examine in detail how the CDF, D0, NOvA, MINERvA and MicroBooNE experiments have organized, classified and consumed petascale datasets to produce their physics results. The results include analysis of the correlations in dataset/file overlap, data usage patterns, data popularity, dataset dependency and temporary dataset consumption. The results provide critical insight into how workflows and data delivery schemes can be combined with different caching strategies to more efficiently perform the work required to mine these large HEP data volumes and to understand the physics analysis requirements for the next generation of HEP computing facilities. In particular we present a detailed analysis of the NOvA data organization and consumption model corresponding to their first and second oscillation results (2014-2016) and the first look at the analysis of the Tevatron Run II experiments. We present statistical distributions for the characterization of these data and data driven models describing their consumption.
Minnesota forest statistics, 1977.
Pamela J. Jakes
1980-01-01
Presents highlights and statistics from the Fourth Minnesota Forest Inventory. Includes detailed tables of forest area, timber volume, net annual growth, timber removals, mortality, and timber products output.
Nair, C K; Patil, V M; Raghavan, V; Babu, S; Nayanar, S
2015-01-01
There are limited data from India regarding elderly non-Hodgkin's lymphoma (NHL) patients. Hence, this audit was planned to study the clinico-pathological features and treatment outcomes in elderly NHL patients. A retrospective analysis of all NHL patients above the age of 59 years treated at the authors' institute between December 2010 and December 2013 was done. Case records were reviewed for baseline details, staging details, prognostic factors, treatment delivered, response, toxicity and efficacy. SPSS version 16 (IBM, New York) was used for analysis. Descriptive statistics were computed. Kaplan-Meier survival analysis was done for estimation of progression-free survival (PFS) and overall survival (OS). Univariate analysis was done to identify factors affecting PFS and OS. Out of 141 NHL patients, 67 met the inclusion criteria. The median age was 68 years (range 60-92). The majority were B-cell NHL (86.6%). The commonest B-cell subtype was diffuse large B-cell lymphoma (55.2%). Fifty-four patients took treatment. The treatment intent was curative in 41 patients (61.2%). Among the patients receiving curative treatment, 16 could not receive treatment in accordance with NCCN guidelines due to financial issues. Two-year PFS was 55%. Two-year PFS for B-cell NHL and T-cell NHL was 55% and 50%, respectively (P = 0.982). Two-year PFS for standard and nonstandard treatment was 62% and 50%, respectively, but the difference did not reach statistical significance (P = 0.537). Two-year OS for the entire cohort was 84%. Standard treatment in accordance with guidelines can be delivered in elderly patients irrespective of age. There is a need for financial assistance for patients, so that potentially curative treatments are not denied.
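The Kaplan-Meier product-limit estimator used for the PFS and OS estimates above can be sketched in a few lines of Python; the follow-up times and event indicators in the example are hypothetical, not the study's data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: follow-up durations; events: 1 = event observed, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []  # (event time, survival probability)
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)         # events at time t
        removed = sum(1 for tt, _ in data if tt == t)   # events + censorings at t
        if d > 0:
            s *= 1.0 - d / n_at_risk                    # product-limit update
            curve.append((t, s))
        n_at_risk -= removed
        i += removed
    return curve

# hypothetical follow-up data in months (not the study's patients)
curve = kaplan_meier([5, 10, 10, 15, 20], [1, 1, 0, 1, 0])
```

Censored observations (events = 0) reduce the number at risk without stepping the survival curve down, which is the estimator's key feature.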
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Valdez, C. A.; DeHope, A. J.
Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process greatly relies on identification of compounds indicative of its clandestine or commercial production. The results of these studies can yield detailed information on method of manufacture, sophistication of the synthesis operation, starting material source, and final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, number of overall steps, and demanding reaction conditions. Using gas and liquid chromatographies combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work with CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Madhavi Z; Labbe, Nicole; Wagner, Rebekah J.
2013-01-01
This chapter details the application of LIBS in a number of environmental areas of research such as carbon sequestration and climate change. LIBS has also been shown to be useful in other high-resolution environmental applications, for example, elemental mapping and detection of metals in plant materials. LIBS has also been used in phytoremediation applications. Other biological research involves a detailed understanding of wood chemistry response to precipitation variations and also to forest fires. A cross-section of Mountain pine (Pinus pungens Lamb., Pinaceae) was scanned using a translational stage to determine the differences in the chemical features both before and after a fire event. Consequently, by monitoring the elemental composition pattern of a tree and by looking for abrupt changes, one can reconstruct the disturbance history of a tree and a forest. Lastly, we have shown that multivariate analysis of the LIBS data is necessary to standardize the analysis and correlate it with other standard laboratory techniques. LIBS combined with multivariate statistical analysis is a very powerful technology that can be transferred from laboratory to field applications with ease.
The secret lives of experiments: methods reporting in the fMRI literature.
Carp, Joshua
2012-10-15
Replication of research findings is critical to the progress of scientific understanding. Accordingly, most scientific journals require authors to report experimental procedures in sufficient detail for independent researchers to replicate their work. To what extent do research reports in the functional neuroimaging literature live up to this standard? The present study evaluated methods reporting and methodological choices across 241 recent fMRI articles. Many studies did not report critical methodological details with regard to experimental design, data acquisition, and analysis. Further, many studies were underpowered to detect any but the largest statistical effects. Finally, data collection and analysis methods were highly flexible across studies, with nearly as many unique analysis pipelines as there were studies in the sample. Because the rate of false positive results is thought to increase with the flexibility of experimental designs, the field of functional neuroimaging may be particularly vulnerable to false positives. In sum, the present study documented significant gaps in methods reporting among fMRI studies. Improved methodological descriptions in research reports would yield significant benefits for the field. Copyright © 2012 Elsevier Inc. All rights reserved.
Analysis of BSRT Profiles in the LHC at Injection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitterer, M.; Stancari, G.; Papadopoulou, S.
The beam synchrotron radiation telescope (BSRT) at the LHC records profiles of the transverse beam distribution, which can provide useful additional insight into the evolution of the transverse beam distribution. A Python class has been developed [1] that reads in the BSRT profiles, usually stored in binary format, runs different analysis tools, and generates plots of the statistical parameters and profiles as well as videos of the profiles. The detailed analysis is described in this note. The analysis is based on the data obtained at injection energy (450 GeV) during MD1217 [2] and MD1415 [3], which are also used as illustrative examples. A similar approach is taken in a MATLAB-based analysis described in [4].
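The basic statistical parameters of a measured profile (centroid and rms width) can be computed from intensity-weighted moments, as in this sketch; the synthetic Gaussian profile and its parameters are illustrative, not BSRT data:

```python
import math

def profile_stats(x, y):
    """Centroid and rms width of a beam profile from intensity-weighted moments."""
    w = sum(y)
    mean = sum(xi * yi for xi, yi in zip(x, y)) / w
    var = sum(yi * (xi - mean) ** 2 for xi, yi in zip(x, y)) / w
    return mean, math.sqrt(var)

# synthetic Gaussian profile (mu = 0.5 mm, sigma = 1.2 mm), sampled on a 0.05 mm grid
xs = [i * 0.05 - 5.0 for i in range(201)]
ys = [math.exp(-0.5 * ((xi - 0.5) / 1.2) ** 2) for xi in xs]
mu, sigma = profile_stats(xs, ys)
```

For non-Gaussian distributions the rms width and a fitted Gaussian sigma differ, which is one reason to keep the full profiles and not only a fit parameter.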
14 CFR 298.61 - Reporting of traffic statistics.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Reporting of traffic statistics. 298.61... Requirements § 298.61 Reporting of traffic statistics. (a) Each commuter air carrier and small certificated air... statistics shall be compiled in terms of each flight stage as actually performed. The detail T-100 data shall...
14 CFR 298.61 - Reporting of traffic statistics.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Reporting of traffic statistics. 298.61... Requirements § 298.61 Reporting of traffic statistics. (a) Each commuter air carrier and small certificated air... statistics shall be compiled in terms of each flight stage as actually performed. The detail T-100 data shall...
Assessing population exposure for landslide risk analysis using dasymetric cartography
NASA Astrophysics Data System (ADS)
Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.
2016-12-01
Assessing the number and locations of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistical data frequently have insufficient detail for an accurate assessment of the people potentially exposed to hazardous events, especially when these occur at the local scale, as landslides do. The present study aims to apply dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those obtained with a more common approach that uses basic census units as spatial units; these are the finest spatial disaggregation and most detailed information available for regional studies in Portugal. Using the Portuguese census data and a layer of residential building footprints as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. When the census-unit approach is used, considering the three highest landslide susceptibility classes, the number of exposed inhabitants is generally overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology appears to be a reliable first approximation to a more detailed estimate of exposed people. The approach based on dasymetric cartography increases the spatial resolution of population data over large areas and enables the use of detailed landslide susceptibility maps, which are valuable for improving the assessment of the exposed population.
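The core dasymetric step, redistributing a census unit's population over its residential building footprints in proportion to area, can be sketched as follows; the building IDs, footprint areas, and population count are hypothetical:

```python
def dasymetric_population(unit_population, buildings):
    """Redistribute a census unit's population over residential buildings,
    proportionally to footprint area (the ancillary layer).
    buildings: list of (building_id, footprint_area) pairs."""
    total_area = sum(area for _, area in buildings)
    return {bid: unit_population * area / total_area for bid, area in buildings}

# hypothetical census unit with 300 inhabitants and three building footprints (m^2)
alloc = dasymetric_population(300, [("B1", 100.0), ("B2", 200.0), ("B3", 300.0)])
```

Each building's estimated population can then be intersected with the landslide susceptibility classes instead of assuming the census unit's population is spread uniformly across its whole area.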
Development and testing of a fast conceptual river water quality model.
Keupers, Ingrid; Willems, Patrick
2017-04-15
Modern, model-based river quality management relies strongly on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too computationally demanding, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method for setting up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFRs) and Continuously Stirred Tank Reactors (CSTRs) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation (by a factor of about 10^5) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
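A minimal sketch of such a conceptual reach, one PFR for advection followed by CSTRs for dispersion, each applying first-order decay at steady state; the decay rate and residence times are illustrative values, and the calibration step described in the paper is omitted:

```python
import math

def pfr(c_in, k, tau):
    # Plug Flow Reactor: pure advection with first-order decay over residence time tau
    return c_in * math.exp(-k * tau)

def cstr(c_in, k, tau):
    # Continuously Stirred Tank Reactor, steady-state mass balance:
    # Q*c_in = Q*c_out + k*V*c_out  =>  c_out = c_in / (1 + k*tau)
    return c_in / (1.0 + k * tau)

# hypothetical reach: one PFR followed by two CSTRs
# (k = first-order decay rate [1/d], tau = residence time per reservoir [d])
k, tau = 0.1, 2.0
c = pfr(10.0, k, tau)       # inflow concentration 10 mg/L
for _ in range(2):
    c = cstr(c, k, tau)
```

Because each reservoir is a closed-form update rather than a spatially discretized PDE solve, long scenario runs become cheap, which is the point of the conceptual model.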
Systems Analysis of NASA Aviation Safety Program: Final Report
NASA Technical Reports Server (NTRS)
Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen
2013-01-01
A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe program was conducted by using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.
NASA Technical Reports Server (NTRS)
Perrier, A.; Itier, B.; Boissard, P. (Principal Investigator); Goillot, C.; Belluomo, P.; Valery, P.
1980-01-01
Consecutive night and day flights, together with measurements on the ground, were made in the region of Voves, south of Chartres. The statistical analysis of the thermal scanner data permitted the establishment of criteria for the homogeneity of surfaces. These criteria were used in defining the surface temperature values that are most representative for use in an energy balance approach to evapotranspiration (day) and heat balance (night). For a number of maize fields, the airborne thermal scanner data permitted a detailed energy analysis of different fields of the same crop to be carried out. Such a detailed analysis was not necessary for a calculation of crop evapotranspiration, which could be evaluated from the mean temperature of the crop surface. A differential day/night analysis is of interest for enhancing the contrast between types of surfaces, as well as for a better definition of the daily energy balance. It should be stressed that, for a homogeneous region, a study such as the present one could be carried out on a relatively small part of the total surface, as the results for a surface of 2.5 x 2 sq km were not significantly different from those obtained from a surface three times larger.
An investigation into pilot and system response to critical in-flight events, volume 1
NASA Technical Reports Server (NTRS)
Rockwell, T. H.; Giffin, W. C.
1981-01-01
The scope of a critical in-flight event (CIFE), with emphasis on pilot management of available resources, is described. Detailed scenarios for both full-mission simulation and written testing of pilot responses to CIFEs are developed, together with statistical relationships among pilot characteristics and observed responses. A model developed to describe pilot response to CIFEs is included, along with an analysis of professional flight crews' compliance with specified operating procedures and the relationship of that compliance with in-flight errors.
Trajectory Design for a Single-String Impactor Concept
NASA Technical Reports Server (NTRS)
Dono Perez, Andres; Burton, Roland; Stupl, Jan; Mauro, David
2017-01-01
This paper introduces a trajectory design for a secondary spacecraft concept intended to augment science return in interplanetary missions. The concept consists of a single-string probe with a kinetic impactor on board that generates an artificial plume for in-situ sampling. The trajectory design was applied to a particular case study that samples ejecta particles from the Jovian moon Europa. Results were validated using statistical analysis. Details regarding the navigation, targeting and disposal challenges related to this concept are presented herein.
Influence of the carrying vehicle in the aerospatial survey of natural radioactivity
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Martin, I. M.
1981-01-01
The importance of the choice of the carrying vehicle in aerial surveys of natural radioactivity, particularly in the location of uraniferous regions, is discussed. The results of observations depend on the exposure time, that is, the velocity and altitude the carrying vehicle can attain. Overflights of the same region using identical instrumentation but in two different types of aircraft were performed. A detailed statistical analysis of the measurements obtained during these flights demonstrates the precision of localization achievable by this method.
1993-03-30
Navy was highest in all three measures, followed by the Air Force, and the Army was the lowest. No branch accounted for a large proportion of the...the largest proportion of workload and costs were in the category ’Outside Catchment Area’. The total government pay for outside catchment area category...services. MACDILL REG HOSP MACDILL AFB had the highest amount of total government pay in the Air Force’s billable MTF. It accounted for 4.13
Load research manual. Volume 2: Fundamentals of implementing load research procedures
NASA Astrophysics Data System (ADS)
1980-11-01
This manual will assist electric utilities and state regulatory authorities in investigating customer electricity demand as part of cost-of-service studies, rate design, marketing research, system design, load forecasting, rate reform analysis, and load management research. Load research procedures are described in detail. Research programs at three utilities are compared: Carolina Power and Light Company, Long Island Lighting Company, and Southern California Edison Company. A load research bibliography and glossaries of load research and statistical terms are also included.
Scanpath-based analysis of objects conspicuity in context of human vision physiology.
Augustyniak, Piotr
2007-01-01
This paper discusses principal aspects of object conspicuity, investigated with the use of an eye tracker and interpreted against the background of human vision physiology. Proper management of object conspicuity is fundamental in several leading-edge applications in the information society, such as advertisement, web design, man-machine interfacing and ergonomics. Although some common rules of human perception have been applied for centuries in art, interest in the human perception process is motivated today by the need to capture and maintain the recipient's attention by putting selected messages in front of the others. Our research uses a visual-task methodology and a series of progressively modified natural images. The modified details were characterized by their size, color and position, while the scanpath-derived gaze points confirmed the act of perception or not. The statistical analysis yielded the probability of detail perception and its correlations with these attributes. This probability conforms to the knowledge about retinal anatomy and perception physiology, although we use only noninvasive methods.
Summary statistics in auditory perception.
McDermott, Josh H; Schemitsch, Michael; Simoncelli, Eero P
2013-04-01
Sensory signals are transduced at high resolution, but their structure must be stored in a more compact format. Here we provide evidence that the auditory system summarizes the temporal details of sounds using time-averaged statistics. We measured discrimination of 'sound textures' that were characterized by particular statistical properties, as normally result from the superposition of many acoustic features in auditory scenes. When listeners discriminated examples of different textures, performance improved with excerpt duration. In contrast, when listeners discriminated different examples of the same texture, performance declined with duration, a paradoxical result given that the information available for discrimination grows with duration. These results indicate that once these sounds are of moderate length, the brain's representation is limited to time-averaged statistics, which, for different examples of the same texture, converge to the same values with increasing duration. Such statistical representations produce good categorical discrimination, but limit the ability to discern temporal detail.
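The convergence of time-averaged statistics that explains the paradox above can be illustrated with a toy "texture" of i.i.d. samples: the means of two different long excerpts from the same distribution concentrate near the same value, whereas short excerpts scatter more. The distribution and sample sizes below are illustrative, not the paper's stimuli:

```python
import random

def texture(seed, n):
    """Toy 'texture': n i.i.d. samples from a fixed distribution."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

# two different examples of the same texture, short and long
gap_short = abs(mean(texture(1, 50)) - mean(texture(2, 50)))
gap_long = abs(mean(texture(1, 50000)) - mean(texture(2, 50000)))
# time-averaged statistics of long excerpts concentrate near the true value (0 here),
# so different examples of the same texture become harder to tell apart
```

The standard error of the mean scales as 1/sqrt(n), which is why the gap between long excerpts shrinks even though more raw information is available.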
Comparison of Australian and US Cost-Benefit Approaches to MEPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, James E.
2004-03-12
The Australian Greenhouse Office contracted with the Collaborative Labeling and Appliance Standards Program (CLASP) for LBNL to compare US and Australian approaches to analyzing costs and benefits of minimum energy performance standards (MEPS). This report compares the approaches for three types of products: household refrigerators and freezers, small electric storage water heaters, and commercial/industrial air conditioners. This report presents the findings of similarities and differences between the approaches of the two countries and suggests changes to consider in the approach taken in Australia. The purpose of the Australian program is to reduce greenhouse gas emissions, while the US program is intended to increase energy efficiency; each program is thus subject to specific constraints. The market and policy contexts are different, with the USA producing most of its own products and conducting pioneering engineering-economic studies to identify maximum energy efficiency levels that are technologically feasible and economically justified. In contrast, Australia imports a large share of its products and adopts MEPS already in place elsewhere. With these differences in circumstances, Australia's analysis approach could be expected to have less analytical detail and still result in MEPS levels that are appropriate for their policy and market context. In practice, the analysis required to meet these different objectives is quite similar. To date, Australia's cost-benefit analysis has served the goals and philosophies of the program well and been highly effective in successfully identifying MEPS that are significantly reducing greenhouse gas emissions while providing economic benefits to consumers. In some cases, however, the experience of the USA--using more extensive data sets and more detailed analysis--suggests possible improvements to Australia's cost-benefit analysis.
The principal findings of the comparison are: (1) The Technology and Market Assessments are similar; no changes are recommended. (2) The Australian approach to determining the relationship of price to energy efficiency is based on the current market, while the US approach uses prospective estimates. Both approaches may benefit from increased retrospective analysis of impacts of MEPS on appliance and equipment prices. Under some circumstances, Australia may wish to consider analyzing two separate components leading to price impacts: (a) changes in manufacturing costs and (b) markups used to convert from manufacturing costs to consumer price. (3) The Life-Cycle Cost methods are similar, but the USA has statistical surveys that permit a more detailed analysis. Australia uses average values, while the US uses full distributions. If data and resources permit, Australia may benefit from greater depth here as well. If implemented, the changes will provide more information about the benefits and costs of the program, in particular identifying who benefits and who bears net costs so that programs can be designed to offset unintended negative consequences, and may assist the government in convincing affected parties of the justification for some MEPS. However, without a detailed and statistically representative national survey, such an approach may not be practical for Australia at this time. (4) The National Benefits and Costs methods are similar prospective estimates of shipments, costs and energy savings, as well as greenhouse gas emissions. Additional sensitivity studies could further illustrate the ranges in these estimates. Consideration of lower discount rates could lead to more stringent MEPS in some cases. (5) Both the Australian and US analyses of impacts on industry, competition, and trade ultimately depend upon sufficient consultation with industry experts.
While the Australian analysis of financial impacts on manufacturers is less detailed than that of the US, the Australian treatment of impacts on market shares imported from different regions of the world is more detailed. No change is recommended. Implementing these changes would increase the depth of analysis, require additional data collection and analysis, and incur associated costs and time. The recommended changes are likely to have incremental rather than dramatic impacts on the substance and implications of the analysis as currently conducted.
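The life-cycle cost method discussed in the comparison above (purchase price plus discounted operating costs, computed either for an average user, as in Australia, or over a full distribution of users, as in the US) can be sketched in a few lines. All prices, tariffs, usage figures, and the discount rate below are invented for illustration and are not taken from the report.

```python
import random

def life_cycle_cost(price, annual_energy_kwh, tariff, discount_rate, years):
    """Net-present-value life-cycle cost: purchase price plus discounted
    operating costs over the product lifetime."""
    operating = sum(annual_energy_kwh * tariff / (1 + discount_rate) ** t
                    for t in range(1, years + 1))
    return price + operating

# Average-value analysis (the Australian approach): one representative user.
lcc_avg = life_cycle_cost(price=400.0, annual_energy_kwh=500.0,
                          tariff=0.25, discount_rate=0.05, years=10)

# Distributional analysis (the US approach): sample usage across households
# to see who benefits and who bears net costs.
random.seed(1)
lccs = [life_cycle_cost(400.0, random.gauss(500.0, 150.0), 0.25, 0.05, 10)
        for _ in range(10_000)]
share_above_avg = sum(l > lcc_avg for l in lccs) / len(lccs)
print(f"average-user LCC: {lcc_avg:.0f}")
print(f"share of households above the average-user figure: {share_above_avg:.2f}")
```

The distributional version yields the same central figure but additionally identifies the fraction of users for whom a standard is a net cost, which is the extra information the report says a representative national survey would enable.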
"Magnitude-based inference": a statistical review.
Welsh, Alan H; Knight, Emma J
2015-04-01
We consider "magnitude-based inference" and its interpretation by examining in detail its use in the problem of comparing two means. We extract from the spreadsheets, which are provided to users of the analysis (http://www.sportsci.org/), a precise description of how "magnitude-based inference" is implemented. We compare the implemented version of the method with general descriptions of it and interpret the method in familiar statistical terms. We show that "magnitude-based inference" is not a progressive improvement on modern statistics. The additional probabilities introduced are not directly related to the confidence interval but, rather, are interpretable either as P values for two different nonstandard tests (for different null hypotheses) or as approximate Bayesian calculations, which also lead to a type of test. We also discuss sample size calculations associated with "magnitude-based inference" and show that the substantial reduction in sample sizes claimed for the method (30% of the sample size obtained from standard frequentist calculations) is not justifiable so the sample size calculations should not be used. Rather than using "magnitude-based inference," a better solution is to be realistic about the limitations of the data and use either confidence intervals or a fully Bayesian analysis.
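The "two nonstandard tests" interpretation can be made concrete: the probabilities that magnitude-based inference labels "chance the effect is beneficial/harmful" are one-sided tail probabilities relative to a smallest worthwhile change, i.e. P values for shifted null hypotheses. This is an illustrative normal-approximation sketch (z = 1.645 for a 90% interval); the spreadsheet's exact implementation differs in detail, and the data below are made up.

```python
from math import erf, sqrt
from statistics import mean, stdev

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_group_summary(a, b, swc):
    """Difference of two means with a normal-approximation 90% CI, plus the
    tail probabilities that magnitude-based inference reports relative to a
    smallest worthwhile change (swc).  These probabilities are P values for
    shifted null hypotheses, not information beyond the interval itself."""
    d = mean(a) - mean(b)
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    ci90 = (d - 1.645 * se, d + 1.645 * se)
    p_beneficial = 1.0 - norm_cdf((swc - d) / se)  # tail prob: true diff > +swc
    p_harmful = norm_cdf((-swc - d) / se)          # tail prob: true diff < -swc
    return d, ci90, p_beneficial, p_harmful

a = [10.1, 12.3, 11.0, 13.2, 12.0, 10.8]
b = [9.4, 11.2, 10.1, 12.0, 11.1, 9.9]
d, ci90, p_ben, p_harm = two_group_summary(a, b, swc=0.5)
print(f"diff {d:.2f}, 90% CI ({ci90[0]:.2f}, {ci90[1]:.2f}), "
      f"P(beneficial) {p_ben:.2f}, P(harmful) {p_harm:.2f}")
```

Since both extra probabilities are deterministic functions of the same estimate and standard error that define the confidence interval, reporting the interval (or a full Bayesian posterior) already carries all the information.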
Physics-based statistical learning approach to mesoscopic model selection.
Taverniers, Søren; Haut, Terry S; Barros, Kipton; Alexander, Francis J; Lookman, Turab
2015-11-01
In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability for various complexities of the model is tested on GD "test" data independent of the data used to train the model. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
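The cross-validation idea, choosing the model complexity that predicts best on held-out data rather than the one that fits the training data best, can be sketched on a generic problem. This toy example selects a polynomial degree; it stands in for, and is far simpler than, the sGLE model selection described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "training" data: a noisy cubic, standing in for the GD trajectories.
x = np.linspace(-1.0, 1.0, 60)
y = x**3 - 0.5 * x + rng.normal(0.0, 0.1, x.size)

def cv_error(degree, k=5):
    """k-fold cross-validated mean-squared prediction error of a
    polynomial fit of the given degree."""
    idx = rng.permutation(x.size)
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)          # complement of the held-out fold
        coef = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coef, x[fold]) - y[fold]) ** 2))
    return float(np.mean(errs))

scores = {d: cv_error(d) for d in range(0, 8)}
best = min(scores, key=scores.get)
print("CV error by degree:", {d: round(s, 4) for d, s in scores.items()})
print("selected degree:", best)
```

As in the paper, the "best" complexity is defined by test-set prediction error, not training fit; a degree-7 fit has lower training error than the selected model but generalizes worse.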
Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille; Kolla, Hemanth
This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted nu- mericalmore » tests on the largest computational platform currently available for DOE science ap- plications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD- Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.« less
Topographic ERP analyses: a step-by-step tutorial review.
Murray, Micah M; Brunet, Denis; Michel, Christoph M
2008-06-01
In this tutorial review, we detail both the rationale for as well as the implementation of a set of analyses of surface-recorded event-related potentials (ERPs) that uses the reference-free spatial (i.e. topographic) information available from high-density electrode montages to render statistical information concerning modulations in response strength, latency, and topography both between and within experimental conditions. In these and other ways these topographic analysis methods allow the experimenter to glean additional information and neurophysiologic interpretability beyond what is available from canonical waveform analyses. In this tutorial we present the example of somatosensory evoked potentials (SEPs) in response to stimulation of each hand to illustrate these points. For each step of these analyses, we provide the reader with both a conceptual and mathematical description of how the analysis is carried out, what it yields, and how to interpret its statistical outcome. We show that these topographic analysis methods are intuitive and easy-to-use approaches that can remove much of the guesswork often confronting ERP researchers and also assist in identifying the information contained within high-density ERP datasets.
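Two of the reference-free topographic measures this tutorial builds on, global field power (GFP) and global map dissimilarity, have simple closed forms. A minimal sketch follows (electrode values in arbitrary units; the normalization conventions follow common usage and may differ from the tutorial's exact formulation):

```python
import numpy as np

def gfp(v):
    """Global field power: spatial standard deviation of an
    average-referenced scalp map (one value per electrode)."""
    v = v - v.mean()
    return np.sqrt(np.mean(v ** 2))

def dissimilarity(v1, v2):
    """Global map dissimilarity: GFP of the difference between two maps after
    average-referencing and GFP-normalization.  0 means identical topographies
    (regardless of strength), 2 means polarity-inverted topographies."""
    n1 = (v1 - v1.mean()) / gfp(v1)
    n2 = (v2 - v2.mean()) / gfp(v2)
    return gfp(n1 - n2)

m = np.array([1.0, -2.0, 0.5, 0.5, 0.2, -0.2])   # toy 6-electrode map
print(f"same map, tripled strength: {dissimilarity(m, 3 * m):.2f}")
print(f"inverted map:               {dissimilarity(m, -m):.2f}")
```

Because dissimilarity is strength-normalized, it isolates topographic (generator-configuration) differences from response-strength differences, which is the core of the statistical framework the tutorial describes.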
Role of reservoirs in sustained seismicity of Koyna-Warna region—a statistical analysis
NASA Astrophysics Data System (ADS)
Yadav, Amrita; Gahalaut, Kalpna; Purnachandra Rao, N.
2018-03-01
The Koyna-Warna region in western India is a globally recognized site of reservoir-triggered seismicity near the Koyna and Warna reservoirs. Several M > 5 earthquakes have been reported in the region in the last five decades, including the M6.3 Koyna earthquake, considered the largest triggered earthquake worldwide. In the present study, a detailed statistical analysis was performed on long-period earthquake catalogues (1968-2004 from MERI and 2005-2012 from CSIR-NGRI) to determine the spatio-temporal influence of the impoundment of the Koyna and Warna reservoirs on the seismicity of the region. Based on the earthquake clusters, we divided the region into three different zones and performed power spectrum and singular spectrum analysis (SSA) on them. For the time period 1983-1995, the earthquake zone near the Warna reservoir; for 1996-2004, the earthquake zone near the Koyna reservoir; and for 2005-2012, the earthquake zone near the Warna reservoir was found to be influenced by the annual water-level variations in the reservoirs, confirming the continuous role of both reservoirs in the seismicity of the Koyna-Warna region.
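The periodicity search underlying such a power-spectrum analysis can be sketched with synthetic data: if reservoir water level modulates seismicity annually, monthly event counts should show a spectral peak at a 12-month period. The catalogue below is simulated, not the MERI/CSIR-NGRI data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly event counts with an annual (12-month) modulation,
# standing in for a reservoir-region earthquake catalogue.
months = np.arange(240)                          # 20 years of monthly bins
rate = 10 + 4 * np.sin(2 * np.pi * months / 12)  # annual water-level cycle
counts = rng.poisson(rate)

# Periodogram of the demeaned series; the annual cycle should appear as a
# peak at frequency 1/12 cycles per month.
x = counts - counts.mean()
power = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(x.size, d=1.0)           # cycles per month
peak = freqs[np.argmax(power[1:]) + 1]           # skip the zero-frequency bin
print(f"dominant period: {1 / peak:.1f} months")
```

SSA, as used in the paper, pursues the same goal but decomposes the series into data-adaptive components rather than fixed Fourier modes, which is more robust for short, noisy catalogues.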
Snow parameters from Nimbus-6 electrically scanned microwave radiometer. [(ESMR-6)
NASA Technical Reports Server (NTRS)
Abrams, G.; Edgerton, A. T.
1977-01-01
Two sites in Canada were selected for detailed analysis of the ESMR-6/snow relationships. Data were analyzed for February 1976 for site 1 and January, February and March 1976 for site 2. Snowpack water equivalents were less than 4.5 inches for site 1 and, depending on the month, were between 2.9 and 14.5 inches for site 2. A statistically significant relationship was found between ESMR-6 measurements and snowpack water equivalents for the Site 2 February and March data. Associated analysis findings presented are the effects of random measurement errors, snow site physiography, and weather conditions on the ESMR-6/snow relationship.
The persistent signature of tropical cyclones in ambient seismic noise
NASA Astrophysics Data System (ADS)
Gualtieri, Lucia; Camargo, Suzana J.; Pascale, Salvatore; Pons, Flavio M. E.; Ekström, Göran
2018-02-01
The spectrum of ambient seismic noise shows strong signals associated with tropical cyclones, yet a detailed understanding of these signals and the relationship between them and the storms is currently lacking. Through the analysis of more than a decade of seismic data recorded at several stations located in and adjacent to the northwest Pacific Ocean, here we show that there is a persistent and frequency-dependent signature of tropical cyclones in ambient seismic noise that depends on characteristics of the storm and on the detailed location of the station relative to the storm. An adaptive statistical model shows that the spectral amplitude of ambient seismic noise, and notably of the short-period secondary microseisms, has a strong relationship with tropical cyclone intensity and can be employed to extract information on the tropical cyclones.
Liu, M; Wei, L; Zhang, J
2006-01-01
Missing data in clinical trials are inevitable. We highlight the ICH guidelines and CPMP points to consider on missing data. Specifically, we outline how we should consider missing data issues when designing, planning and conducting studies to minimize missing data impact. We also go beyond the coverage of the above two documents, provide a more detailed review of the basic concepts of missing data and frequently used terminologies, and examples of the typical missing data mechanism, and discuss technical details and literature for several frequently used statistical methods and associated software. Finally, we provide a case study where the principles outlined in this paper are applied to one clinical program at protocol design, data analysis plan and other stages of a clinical trial.
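One of the basic points such reviews make, that complete-case analysis is biased when missingness depends on observed covariates (missing at random, MAR) while a correctly specified imputation model is not, can be demonstrated in a few lines. All data-generating numbers here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

# Baseline score x is always observed; follow-up y is missing more often for
# patients with low x (MAR: missingness depends only on the observed x).
n = 20_000
x = rng.normal(0.0, 1.0, n)
y = 2.0 + 1.5 * x + rng.normal(0.0, 1.0, n)      # true population mean of y: 2.0
observed = rng.random(n) < 1.0 / (1.0 + np.exp(-x))  # P(observed) rises with x

complete_case = y[observed].mean()               # biased upward under MAR

# Regression imputation: model y from x on observed cases, then fill in the
# missing values from the fitted line (a single-imputation simplification of
# the multiple-imputation methods the review discusses).
b1, b0 = np.polyfit(x[observed], y[observed], 1)
y_filled = np.where(observed, y, b0 + b1 * x)
imputed = y_filled.mean()

print(f"true mean 2.00, complete-case {complete_case:.2f}, imputed {imputed:.2f}")
```

Proper multiple imputation would additionally repeat the fill-in step with draws that reflect model uncertainty and pool the results, so that standard errors are not understated.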
Park, Ji Eun; Han, Kyunghwa; Sung, Yu Sub; Chung, Mi Sun; Koo, Hyun Jung; Yoon, Hee Mang; Choi, Young Jun; Lee, Seung Soo; Kim, Kyung Won; Shin, Youngbin; An, Suah; Cho, Hyo-Min
2017-01-01
Objective To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Materials and Methods Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA), and COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Results Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies and regarding the model and assumptions of intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Conclusion Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology are necessary. PMID:29089821
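For ordinal ratings, the statistic most often under-described in the studies reviewed is weighted kappa; the weighting scheme should always be reported because it changes the result. A minimal quadratic-weighted kappa, written out directly so the weights are explicit (an illustrative implementation, not software the journal recommends):

```python
import numpy as np

def quadratic_weighted_kappa(r1, r2, categories):
    """Cohen's kappa with quadratic disagreement weights for ordinal ratings
    from two raters.  1 = perfect agreement, 0 = chance-level agreement."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    obs = np.zeros((k, k))
    for a, b in zip(r1, r2):                      # joint rating frequencies
        obs[idx[a], idx[b]] += 1.0
    obs /= obs.sum()
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance agreement
    i, j = np.indices((k, k))
    w = (i - j) ** 2 / (k - 1) ** 2               # quadratic weights: report these!
    return 1.0 - (w * obs).sum() / (w * expected).sum()

perfect = quadratic_weighted_kappa([0, 1, 2, 1, 0], [0, 1, 2, 1, 0], [0, 1, 2])
chance = quadratic_weighted_kappa([0, 0, 1, 1], [0, 1, 0, 1], [0, 1])
print(f"perfect agreement: {perfect:.2f}, chance-level: {chance:.2f}")
```

Kappa is a reliability parameter (does the rating distinguish subjects?), not an agreement parameter (how close are raters in absolute terms?); the review's point is that the two should not be used interchangeably.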
Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore
2014-04-01
Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions possibly with excess-zeros. In addition the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype by environment interaction by adding random variety effects, and finally includes repeated measures in time following a constant, linear or quadratic pattern in time possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
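A stripped-down version of such a simulation framework, zero-inflated Poisson counts on randomized plots and a Monte Carlo estimate of prospective power, might look as follows. The test statistic (a normal approximation on log-transformed counts) and all parameter values are illustrative choices, not the paper's recommended analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_trial(n_plots, mean_gm, mean_comp, p_zero=0.2):
    """One randomized field trial: zero-inflated Poisson counts of a
    non-target organism on GM and comparator plots."""
    def zip_counts(mu, n):
        return rng.poisson(mu, n) * (rng.random(n) >= p_zero)  # excess zeros
    return zip_counts(mean_gm, n_plots), zip_counts(mean_comp, n_plots)

def power(effect_ratio, n_plots, mean_comp=5.0, n_sims=2000, alpha_z=1.96):
    """Prospective power of a plot-level difference test against a
    multiplicative effect of the GM variety on mean abundance."""
    hits = 0
    for _ in range(n_sims):
        gm, comp = simulate_trial(n_plots, effect_ratio * mean_comp, mean_comp)
        a, b = np.log(gm + 1.0), np.log(comp + 1.0)
        se = np.sqrt(a.var(ddof=1) / n_plots + b.var(ddof=1) / n_plots)
        hits += abs(a.mean() - b.mean()) / se > alpha_z
    return hits / n_sims

print(f"power to detect a 50% reduction with 20 plots/arm: {power(0.5, 20):.2f}")
```

Running the same function at `effect_ratio=1.0` recovers the test's type-I error rate, which is how a simulation framework like the paper's lets difference and equivalence tests be calibrated before any field work.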
A Three Dimensional Kinematic and Kinetic Study of the Golf Swing
Nesbit, Steven M.
2005-01-01
This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 84 male and one female amateur subjects of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model were obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics. PMID:24627665
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
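The principal-component summary of calibration samples can be sketched generically: simulate a family of plausible response curves, keep the few components that carry most of the variance, and draw new plausible curves cheaply from the reduced representation. The curves below are synthetic stand-ins, not Chandra effective areas.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in for a sample of plausible effective-area curves: a nominal curve
# plus a few smooth systematic deviations (illustrative, not real calibration).
energy = np.linspace(0.3, 8.0, 200)
nominal = 500.0 * np.exp(-0.3 * energy)
sims = nominal * (1.0
                  + 0.05 * rng.normal(size=(1000, 1)) * np.sin(energy / 3)[None, :]
                  + 0.02 * rng.normal(size=(1000, 1)))

# Principal-component summary via SVD of the centered sample.
mean_curve = sims.mean(axis=0)
u, s, vt = np.linalg.svd(sims - mean_curve, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
n_keep = int(np.searchsorted(np.cumsum(explained), 0.99)) + 1

def draw_curve():
    """Draw a new plausible calibration curve from the reduced PCA model."""
    coeff = rng.normal(size=n_keep) * s[:n_keep] / np.sqrt(sims.shape[0] - 1)
    return mean_curve + coeff @ vt[:n_keep]

print(f"{n_keep} components explain 99% of the calibration variance")
```

Because only a handful of components are retained, drawing a curve inside an MCMC iteration costs a small matrix product rather than storing and sampling the full file library, which is the efficiency gain the paper describes.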
NASA Astrophysics Data System (ADS)
Titov, A. G.; Gordov, E. P.; Okladnikov, I.; Shulgina, T. M.
2011-12-01
Analysis of recent climatic and environmental changes in Siberia performed on the basis of the CLEARS (CLimate and Environment Analysis and Research System) information-computational system is presented. The system was developed using a specialized software framework for rapid development of thematic information-computational systems based on Web-GIS technologies. It comprises structured environmental datasets, a computational kernel, a specialized web portal implementing the web mapping application logic, and a graphical user interface. Functional capabilities of the system include a number of procedures for mathematical and statistical analysis, data processing and visualization. At present a number of georeferenced datasets are available for processing, including two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 and ERA Interim Reanalyses, meteorological observation data for the territory of the former USSR, and others. First, using the functionality of the computational kernel, which employs approved statistical methods, it was shown that the most reliable spatio-temporal characteristics of surface temperature and precipitation in Siberia in the second half of the 20th and beginning of the 21st centuries are provided by the ERA-40/ERA Interim Reanalysis and the APHRODITE JMA Reanalysis, respectively; these reanalyses, specifically, are statistically consistent with reliable in situ meteorological observations. Analysis of surface temperature and precipitation dynamics for the territory of Siberia performed on the basis of the developed information-computational system reveals fine spatial and temporal details in the heterogeneous patterns obtained for the region earlier. The dynamics of bioclimatic indices determining the impact of climate change on the structure and functioning of regional vegetation cover was investigated as well.
Analysis shows significant positive trends in growing season length, accompanied by statistically significant increases in the sum of growing degree-days and in total annual precipitation over the south of Western Siberia. In particular, we conclude that analysis of trends in growing season length, the sum of growing degree-days, and total precipitation during the growing season reveals a tendency toward increased vegetation ecosystem productivity across the south of Western Siberia (55°-60°N, 59°-84°E) in the past several decades. The developed system functionality, providing instruments for comparison of modeling and observational data and for reliable climatological analysis, allowed us to obtain new results characterizing regional manifestations of global change. It should be added that each analysis performed using the system also generates an archive of spatio-temporal data fields ready for subsequent use by other specialists. In particular, the archive of bioclimatic indices obtained will allow further detailed studies of the interrelations between local climate and vegetation cover changes, including changes in carbon uptake related to variations in the types and amount of vegetation and spatial shifts of vegetation zones. This work is partially supported by RFBR grants #10-07-00547 and #11-05-01190-a, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7.
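The kind of trend screening described above, a least-squares slope with a significance statistic, can be written compactly. The growing-season-length series below is simulated, and a real analysis would also need to account for serial autocorrelation, which this sketch ignores.

```python
import numpy as np

def trend_test(years, values):
    """Least-squares linear trend with a t statistic on the slope
    (a common screen for statistically significant climate trends)."""
    x = np.asarray(years, float) - np.mean(years)
    y = np.asarray(values, float)
    slope = (x @ (y - y.mean())) / (x @ x)
    resid = y - y.mean() - slope * x
    se = np.sqrt(resid @ resid / (len(x) - 2) / (x @ x))
    return slope, slope / se

# Simulated growing-season length (days), 35 years with a 0.3 day/yr trend.
years = np.arange(1976, 2011)
rng = np.random.default_rng(6)
gsl = 150.0 + 0.3 * (years - 1976) + rng.normal(0.0, 2.0, years.size)
slope, t = trend_test(years, gsl)
print(f"growing-season trend: {slope:.2f} days/yr, t = {t:.1f}")
```

A |t| well above ~2 (for series of this length) indicates a trend unlikely to arise from interannual noise alone, which is the sense of "statistically significant" used in the abstract.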
A Statistical Study of Eiscat Electron and Ion Temperature Measurements In The E-region
NASA Astrophysics Data System (ADS)
Hussey, G.; Haldoupis, C.; Schlegel, K.; Bösinger, T.
Motivated by the large EISCAT data base, which covers over 15 years of common programme operation, and previous statistical work with EISCAT data (e.g., C. Haldoupis, K. Schlegel, and G. Hussey, Auroral E-region electron density gradients measured with EISCAT, Ann. Geophysicae, 18, 1172-1181, 2000), a detailed statistical analysis of electron and ion EISCAT temperature measurements has been undertaken. This study was specifically concerned with the statistical dependence of heating events on other ambient parameters such as the electric field and electron density. The results showed previously reported dependences, such as the electron temperature being directly correlated with the ambient electric field and inversely related to the electron density. However, these correlations were found to be also dependent upon altitude. There was also evidence of the so-called "Schlegel effect" (K. Schlegel, Reduced effective recombination coefficient in the disturbed polar E-region, J. Atmos. Terr. Phys., 44, 183-185, 1982); that is, the heated electron gas leads to increases in electron density through a reduction in the recombination rate. This paper will present the statistical heating results and attempt to offer physical explanations and interpretations of the findings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-05-01
The small detail area, 18 miles by 18 miles, lying near the center of the Powder River Basin, is covered entirely by sediments of the Eocene Wasatch Formation. Historically, economic uranium deposits have been worked in the southeast corner of the area, which includes the northern extremity of the Pumpkin Buttes district. A total of 127 statistical uranium anomalies were generated for the study area, based on area-wide statistics.
SDGs and Geospatial Frameworks: Data Integration in the United States
NASA Astrophysics Data System (ADS)
Trainor, T.
2016-12-01
Responding to the need to monitor a nation's progress towards meeting the Sustainable Development Goals (SDG) outlined in the 2030 U.N. Agenda requires the integration of earth observations with statistical information. The urban agenda proposed in SDG 11 challenges the global community to find a geospatial approach to monitor and measure inclusive, safe, resilient, and sustainable cities and communities. Target 11.7 identifies public safety, accessibility to green and public spaces, and the most vulnerable populations (i.e., women and children, older persons, and persons with disabilities) as the most important priorities of this goal. A challenge for both national statistical organizations and earth observation agencies in addressing SDG 11 is the requirement for detailed statistics at a sufficient spatial resolution to provide the basis for meaningful analysis of the urban population and city environments. Using an example for the city of Pittsburgh, this presentation proposes data and methods to illustrate how earth science and statistical data can be integrated to respond to Target 11.7. Finally, a preliminary series of data initiatives are proposed for extending this method to other global cities.
Guide to Using Onionskin Analysis Code (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fugate, Michael Lynn; Morzinski, Jerome Arthur
2016-09-15
This document is a guide to using R-code written for the purpose of analyzing onionskin experiments. We expect the user to be very familiar with statistical methods and the R programming language. For more details about onionskin experiments and the statistical methods mentioned in this document, see Storlie, Fugate, et al. (2013). Engineers at LANL experiment with detonators and high explosives to assess performance. The experimental unit, called an onionskin, is a hemisphere consisting of a detonator and a booster pellet surrounded by explosive material. When the detonator explodes, a streak camera mounted above the pole of the hemisphere records when the shock wave arrives at the surface. The output from the camera is a two-dimensional image that is transformed into a curve that shows the arrival time as a function of polar angle. The statistical challenge is to characterize a baseline population of arrival time curves and to compare the baseline curves to curves from a new, so-called test series. The hope is that the new test series of curves is statistically similar to the baseline population.
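One simple way to formalize "statistically similar to the baseline population" for arrival-time curves is a pointwise tolerance band built from the baseline curves; Storlie et al. (2013) develop a more rigorous functional-data approach, so treat this as a conceptual sketch on synthetic curves only.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic baseline population: arrival time vs. polar angle for 30
# onionskins (illustrative shapes only, not real detonator data).
angle = np.linspace(0.0, 90.0, 91)               # degrees from the pole
baseline = (10.0 + 0.02 * angle + 0.0005 * angle ** 2
            + rng.normal(0.0, 0.05, (30, angle.size)))

# Pointwise 95% tolerance band from the baseline population.
lo, hi = np.percentile(baseline, [2.5, 97.5], axis=0)

def fraction_outside(test_curve):
    """Fraction of angles at which a new test-series curve leaves the
    baseline band -- a crude screen for 'statistically similar'."""
    return float(np.mean((test_curve < lo) | (test_curve > hi)))

typical = baseline.mean(axis=0)
print(f"typical curve outside band: {fraction_outside(typical):.2f}")
print(f"shifted curve outside band: {fraction_outside(typical + 0.2):.2f}")
```

A pointwise band ignores correlation along the angle axis, so it is anti-conservative for whole-curve statements; a functional (simultaneous) band of the kind the referenced paper constructs addresses exactly that.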
Two-mode mazer injected with V-type three-level atoms
NASA Astrophysics Data System (ADS)
Liang, Wen-Qing; Zhang, Zhi-Ming; Xie, Sheng-Wu
2003-12-01
The properties of the two-mode mazer operating on V-type three-level atoms are studied. The effect of the one-atom pumping on the two modes of the cavity field in a number state is asymmetric; that is, the atom emits a photon into one mode with some probability and absorbs a photon from the other mode with some other probability. This effect makes the steady-state photon distribution and the steady-state photon statistics asymmetric for the two modes. The diagram of the probability currents for the photon distribution, given by the analysis of the master equation, reveals that there is no detailed-balance solution for the master equation. The computations show that the photon statistics of one mode or both modes can be sub-Poissonian, that the two modes can be anticorrelated or correlated, that the photon statistics increase with increasing thermal photon number, and that the resonant position and strength of the photon statistics are influenced by the ratio of the coupling strengths of the two modes. These properties are also discussed physically.
A new statistical methodology predicting chip failure probability considering electromigration
NASA Astrophysics Data System (ADS)
Sun, Ted
In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and its manifestation in different materials are presented in this thesis. This new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations from the chip's temperature map, extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the same approach coupled with a temperature map, and the comparison between the results with and without the temperature map are presented in this research. The comparison confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model incorporates scaling through the traditional Black equation and considers four major use conditions. The statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature instead of a temperature map across the chip is considered. In this thesis, I start with an overall review of current design types, common flows, and the necessary verification and reliability-checking steps used in the IC design industry.
Furthermore, the important concepts about "Scripting Automation" which is used in all the integration of using diversified EDA tools in this research work are also described in detail with several examples and my completed coding works are also put in the appendix for your reference. Hopefully, this construction of my thesis will give readers a thorough understanding about my research work from the automation of EDA tools to the statistical data generation, from the nature of EM to the statistical model construction, and the comparisons among the traditional EM analysis and the statistical EM analysis approaches.
Hagemann, G; Eichbaum, G
1997-07-01
The following three film-screen combinations were compared: a) a combination of anticrossover film and UV-light-emitting screens, b) a combination of blue-light-emitting screens and film, and c) a conventional green-fluorescing screen-film combination. Radiographs were obtained of a specially designed plexiglass phantom (0.2 x 0.2 x 0.12 m3) containing bar patterns of lead and plaster (calcium sulfate) to test high- and intermediate-contrast resolution, and bar patterns of air to test low-contrast resolution. An aluminum step wedge was integrated to evaluate dose-density curves of the radiographs. The dose values for the various step thicknesses were measured as a percentage of the dose value in air at 60, 81, and 117 kV. Exposure conditions were as follows: 12-pulse generator, 0.6 mm focal spot, 4.7 mm aluminum prefilter, a grid with 40 lines/cm (12:1), and a focus-detector distance of 1.15 m. The thresholds of visible bars for the various pattern materials were assessed by seven radiologists, one technician, and the authors. The resulting contrast-detail diagram could not demonstrate any significant differences between the three tested screen-film combinations. The pairwise comparison, however, found 8 of the 18 paired differences between the conventional and the two new screen-film combinations to be statistically significant. The authors concluded that subjective visual assessment of the threshold in a contrast-detail study alone is of only limited value for grading image quality if no well-defined criteria are used (BIR report 20 [1989] 137-139). The statistical approach of paired differences of the estimated means appeared to be more appropriate.
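The pairwise comparison of estimated means described in this abstract can be illustrated with a paired t statistic on matched threshold readings. A minimal sketch; the abstract does not give the reader scores or the exact test, so the data and procedure here are hypothetical:

```python
import math

def paired_t_test(a, b):
    """Two-sided paired t statistic on matched samples a and b.

    Returns (t statistic, degrees of freedom); the p-value would
    come from the t distribution with those degrees of freedom.
    """
    assert len(a) == len(b)
    d = [x - y for x, y in zip(a, b)]        # per-observer differences
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1
```

Each pair would be one observer's threshold for two screen-film combinations; a large |t| flags a systematic difference between the two systems.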
Web-based data collection: detailed methods of a questionnaire and data gathering tool
Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R
2006-01-01
There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and describes the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes the data instrument design, data entry and management, and the data tables needed to store the results, and attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556
Helioseismology of pre-emerging active regions. III. Statistical analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnes, G.; Leka, K. D.; Braun, D. C.
The subsurface properties of active regions (ARs) prior to their appearance at the solar surface may shed light on the process of AR formation. Helioseismic holography has been applied to samples taken from two populations of regions on the Sun (pre-emergence and without emergence), each sample having over 100 members, that were selected to minimize systematic bias, as described in Paper I. Paper II showed that there are statistically significant signatures in the average helioseismic properties that precede the formation of an AR. This paper describes a more detailed analysis of the samples of pre-emergence regions and regions without emergence based on discriminant analysis. The property that is best able to distinguish the populations is found to be the surface magnetic field, even a day before the emergence time. However, after accounting for the correlations between the surface field and the quantities derived from helioseismology, there is still evidence of a helioseismic precursor to AR emergence that is present for at least a day prior to emergence, although the analysis presented cannot definitively determine the subsurface properties prior to emergence due to the small sample sizes.
Effect of the statin therapy on biochemical laboratory tests--a chemometrics study.
Durceková, Tatiana; Mocák, Ján; Boronová, Katarína; Balla, Ján
2011-01-05
Statins are the first-line choice for lowering total and LDL cholesterol and are important medications for reducing the risk of coronary artery disease. The aim of this study was therefore to assess the results of biochemical tests characterizing the condition of 172 patients before and after administration of statins. For this purpose, several chemometric tools were used, namely principal component analysis, cluster analysis, discriminant analysis, logistic regression, KNN classification, ROC analysis, descriptive statistics, and ANOVA. Mutual relations among 11 biochemical laboratory tests, patient age, and gender were investigated in detail. The results enable evaluation of the extent of the statin treatment effect in each individual case and may also help in monitoring the dynamic progression of the disease. Copyright © 2010 Elsevier B.V. All rights reserved.
Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach.
Chertkov, Michael; Chernyak, Vladimir
2017-08-17
Thermostatically controlled loads, e.g., air conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control - changing from on to off, and vice versa, depending on temperature. We considered aggregation of a large group of similar devices into a statistical ensemble, where the devices operate following the same dynamics, subject to stochastic perturbations and randomized, Poisson on/off switching policy. Using theoretical and computational tools of statistical physics, we analyzed how the ensemble relaxes to a stationary distribution and established a relationship between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous temperature) phase space. This allowed us to derive the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g., forced temporary switching off aimed at utilizing the flexibility of the ensemble to provide "demand response" services to change consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.
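The bang-bang ensemble described above can be sketched with a crude Monte Carlo simulation. Note that the paper analyzes a randomized Poisson switching policy; this toy uses deterministic switching at the comfort-band edges, and all temperatures, rates, and noise levels are illustrative assumptions, not taken from the paper:

```python
import random

def simulate_tcl_ensemble(n=400, steps=800, dt=0.1, seed=0):
    """Crude Monte Carlo of an ensemble of bang-bang air conditioners.

    Each device relaxes toward a cold target while 'on' and toward the
    ambient temperature while 'off', with small stochastic perturbations;
    switching occurs at the edges of the comfort band [t_lo, t_hi].
    """
    rng = random.Random(seed)
    t_amb, t_cold, t_lo, t_hi = 30.0, 15.0, 20.0, 24.0
    temps = [rng.uniform(t_lo, t_hi) for _ in range(n)]
    on = [rng.random() < 0.5 for _ in range(n)]
    for _ in range(steps):
        for i in range(n):
            target = t_cold if on[i] else t_amb
            temps[i] += dt * (target - temps[i]) + 0.05 * rng.gauss(0.0, 1.0)
            if temps[i] <= t_lo:
                on[i] = False      # cold enough: switch off
            elif temps[i] >= t_hi:
                on[i] = True       # too warm: switch on
    frac_on = sum(on) / n
    return temps, frac_on
```

After many steps the ensemble forgets its initial condition, and `frac_on` approximates the stationary duty cycle of the population, the kind of relaxation behavior the paper studies analytically.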
Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach
Chertkov, Michael; Chernyak, Vladimir
2017-01-17
Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most wide-spread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature - changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish a relation between the relaxation and statistics of the probability flux, associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trend and speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forceful temporary switching off aimed at utilizing flexibility of the ensemble in providing "demand response" services relieving consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.
Reis, Matthias; Kromer, Justus A; Klipp, Edda
2018-01-20
Multimodality is a phenomenon that complicates the analysis of statistical data based exclusively on mean and variance. Here, we present criteria for multimodality in hierarchic first-order reaction networks, consisting of catalytic and splitting reactions. Those networks are characterized by independent and dependent subnetworks. First, we prove the general solvability of the Chemical Master Equation (CME) for this type of reaction network and thereby extend the class of solvable CMEs. Our general solution is analytical in the sense that it allows for a detailed analysis of its statistical properties. Given Poisson/deterministic initial conditions, we then prove the independent species to be Poisson/binomially distributed, while the dependent species exhibit generalized Poisson/Khatri Type B distributions. Generalized Poisson/Khatri Type B distributions are multimodal for an appropriate choice of parameters. We illustrate our criteria for multimodality with several basic models, as well as the well-known two-stage transcription-translation network and Bateman's model from nuclear physics. For both examples, multimodality was not previously reported.
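The simplest member of the solvable class discussed above is a one-species birth-death network (constant production, first-order degradation), whose stationary CME solution is a Poisson distribution. A hedged sketch using Gillespie's stochastic simulation algorithm; the rates are arbitrary choices, not from the paper:

```python
import random

def gillespie_mrna(k=5.0, g=1.0, t_end=2000.0, seed=1):
    """Gillespie simulation of mRNA birth-death: production at rate k,
    degradation at rate g*m. The stationary copy-number distribution
    is Poisson with mean k/g; we return the time-averaged copy number.
    """
    rng = random.Random(seed)
    t, m = 0.0, 0
    time_in_state = {}
    while t < t_end:
        a_birth, a_death = k, g * m      # reaction propensities
        a_tot = a_birth + a_death
        dt = rng.expovariate(a_tot)      # waiting time to next event
        time_in_state[m] = time_in_state.get(m, 0.0) + dt
        t += dt
        m += 1 if rng.random() < a_birth / a_tot else -1
    total = sum(time_in_state.values())
    return sum(s * w for s, w in time_in_state.items()) / total
```

With k/g = 5, the time-averaged copy number converges to the Poisson mean of 5; the paper's hierarchic networks generalize exactly this kind of analytically tractable case.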
NASA Technical Reports Server (NTRS)
Shantaram, S. Pai; Gyekenyesi, John P.
1989-01-01
The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of the shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. The report also shows how to calculate the Batdorf flaw-density constants from the Weibull distribution's statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
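For complete (uncensored) samples, the maximum likelihood estimate of the Weibull shape parameter solves a one-dimensional profile equation, after which the scale parameter follows in closed form. A sketch of that standard procedure; this is not the SCARE code, and censored samples, confidence intervals, and outlier tests are omitted:

```python
import math

def weibull_mle(x, tol=1e-10):
    """MLE of Weibull shape k and scale for a complete sample,
    via bisection on the standard profile equation for k
    (the profile function f is monotonically increasing in k)."""
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def f(k):
        xk = [v ** k for v in x]
        return (sum(xi * li for xi, li in zip(xk, logs)) / sum(xk)
                - 1.0 / k - mean_log)

    lo, hi = 0.01, 100.0          # bracket assumed to contain the root
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    scale = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, scale
```

For data simulated from a Weibull distribution via inverse-transform sampling, the estimates recover the true shape and scale to within sampling error.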
Comparison of requirements and capabilities of major multipurpose software packages.
Igo, Robert P; Schnell, Audrey H
2012-01-01
The aim of this chapter is to introduce the reader to commonly used software packages and illustrate their input requirements, analysis options, strengths, and limitations. We focus on packages that perform more than one function and include a program for quality control, linkage, and association analyses. Additional inclusion criteria were (1) programs that are free to academic users and (2) programs that are currently supported, maintained, and developed. Using those criteria, we chose to review three packages: Statistical Analysis for Genetic Epidemiology (S.A.G.E.), PLINK, and Merlin. We describe the required input formats and analysis options. We do not go into detail about every program in the packages, but we give an overview of the packages' requirements and capabilities.
Alternative types of molecule-decorated atomic chains in Au–CO–Au single-molecule junctions
Balogh, Zoltán; Makk, Péter
2015-01-01
We investigate the formation and evolution of Au–CO single-molecule break junctions. The conductance histogram exhibits two distinct molecular configurations, which are further investigated by a combined statistical analysis. According to conditional histogram and correlation analysis these molecular configurations show strong anticorrelations with each other and with pure Au monoatomic junctions and atomic chains. We identify molecular precursor configurations with somewhat higher conductance, which are formed prior to single-molecule junctions. According to detailed length analysis two distinct types of molecule-affected chain-formation processes are observed, and we compare these results to former theoretical calculations considering bridge- and atop-type molecular configurations where the latter has reduced conductance due to destructive Fano interference. PMID:26199840
Cuesta-Frau, David; Varela, Manuel; Aboy, Mateo; Miró-Martínez, Pau
2009-01-01
We describe a device for dual channel body temperature monitoring. The device can operate as a real time monitor or as a data logger, and has Bluetooth capabilities to enable wireless data download to the computer used for data analysis. The proposed device is capable of sampling temperature at a rate of 1 sample per minute with a resolution of 0.01 °C. The internal memory allows for stand-alone data logging of up to 10 days. The device has a battery life of 50 hours in continuous real-time mode. In addition to describing the proposed device in detail, we report the results of a statistical analysis conducted to assess its accuracy and reproducibility. PMID:22408473
Yuan, Zhongshang; Liu, Hong; Zhang, Xiaoshuai; Li, Fangyu; Zhao, Jinghua; Zhang, Furen; Xue, Fuzhong
2013-01-01
Currently, the genetic variants identified by genome-wide association studies (GWAS) generally account for only a small proportion of the total heritability of complex disease. One crucial reason is the underutilization of gene-gene joint effects, which comprise the main effects of the genes and their co-association. Gene-gene co-association, however, is customarily and vaguely subsumed into the framework of gene-gene interaction. From a causal-graph perspective, we elucidate in detail the concept and rationale of gene-gene co-association as well as its relationship with traditional gene-gene interaction, and we propose two simple statistics based on the Fisher r-to-z transformation to detect it. Three series of simulations further highlight that gene-gene co-association refers to the extent to which the joint effects of two genes differ from their main effects, arising not only from traditional interaction under the nearly independent condition but also from correlation between the two genes. The proposed statistics are more powerful than logistic regression in various situations, are unaffected by linkage disequilibrium, and maintain an acceptable false-positive rate as long as the standard GWAS data-analysis roadmap is strictly followed. Furthermore, an application to a gene-pathway analysis of leprosy confirms in practice that the proposed concept of gene-gene co-association, as well as the corresponding statistics, is strongly in line with reality. PMID:23923021
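The Fisher r-to-z transformation underlying the proposed statistics stabilizes the variance of a sample correlation coefficient. A minimal sketch comparing two independent correlations; the paper's actual co-association statistics differ in construction, so this shows only the shared building block:

```python
import math

def fisher_z(r):
    """Fisher r-to-z transformation of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

def compare_correlations(r1, n1, r2, n2):
    """z statistic for the difference of two independent sample
    correlations; z(r) is approximately normal with variance 1/(n-3)."""
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (fisher_z(r1) - fisher_z(r2)) / se
```

A large |z|, e.g. comparing the correlation of two genes in cases versus controls, indicates that the correlation structure differs between the groups.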
Seismic risk analysis for the Babcock and Wilcox facility, Leechburg, Pennsylvania
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-10-21
The results of a detailed seismic risk analysis of the Babcock and Wilcox Plutonium Fuel Fabrication facility at Leechburg, Pennsylvania are presented. This report focuses on earthquakes; the other natural hazards, addressed in separate reports, are severe weather (strong winds and tornados) and floods. The calculational method used is based on Cornell's work (1968); it has previously been applied to safety evaluations of major projects. The historical seismic record was established after a review of the available literature, consultation with operators of local seismic arrays, and examination of appropriate seismic databases. Because of the aseismicity of the region around the site, an approach different from the conventional closest-approach-in-a-tectonic-province analysis was adopted. Earthquakes as far as 1,000 km from the site were included, as was the possibility of earthquakes at the site itself. In addition, various uncertainties in the input were explicitly considered in the analysis. The results of the risk analysis, which include a Bayesian estimate of the uncertainties, are presented as return-period accelerations. The best-estimate curve indicates that the Babcock and Wilcox facility will experience 0.05 g every 220 years and 0.10 g every 1,400 years. The bounding curves roughly represent the one-standard-deviation confidence limits about the best estimate, reflecting the uncertainty in certain of the inputs. Detailed examination of the results shows that the accelerations are very insensitive to the details of the source-region geometries and the historical earthquake statistics in each region, and that each of the source regions contributes almost equally to the cumulative risk at the site. If required for structural analysis, acceleration response spectra for the site can be constructed by scaling the mean response spectrum for alluvium in WASH 1255 by these peak accelerations.
van Rhee, Henk; Hak, Tony
2017-01-01
We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
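The overall-effect machinery mentioned above, the DerSimonian-Laird random-effects estimator with the Knapp-Hartung adjustment, can be sketched as follows. This is an illustrative reimplementation of the textbook formulas, not Meta-Essentials' own code:

```python
import math

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate with the
    Knapp-Hartung adjusted standard error.

    effects:   per-study effect sizes
    variances: per-study sampling variances
    Returns (pooled effect, Knapp-Hartung standard error).
    """
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    k = len(effects)
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)           # DL between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    sw_star = sum(w_star)
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sw_star
    # Knapp-Hartung: weighted residual variance around the pooled mean
    kh_var = (sum(wi * (e - pooled) ** 2 for wi, e in zip(w_star, effects))
              / ((k - 1) * sw_star))
    return pooled, math.sqrt(kh_var)
```

The confidence interval would then use a t distribution with k - 1 degrees of freedom rather than the normal, which is the point of the Knapp-Hartung adjustment.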
NASA Technical Reports Server (NTRS)
Davis, Frank W.; Quattrochi, Dale A.; Ridd, Merrill K.; Lam, Nina S.-N.; Walsh, Stephen J.
1991-01-01
This paper discusses some basic scientific issues and research needs in the joint processing of remotely sensed and GIS data for environmental analysis. Two general topics are treated in detail: (1) scale dependence of geographic data and the analysis of multiscale remotely sensed and GIS data, and (2) data transformations and information flow during data processing. The discussion of scale dependence focuses on the theory and applications of spatial autocorrelation, geostatistics, and fractals for characterizing and modeling spatial variation. Data transformations during processing are described within the larger framework of geographical analysis, encompassing sampling, cartography, remote sensing, and GIS. Development of better user interfaces between image processing, GIS, database management, and statistical software is needed to expedite research on these and other impediments to integrated analysis of remotely sensed and GIS data.
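Spatial autocorrelation, one of the tools discussed above for characterizing scale dependence in geographic data, is commonly quantified with Moran's I. A minimal sketch; the weight matrix in the test is a hypothetical adjacency structure:

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation statistic.

    weights[i][j] is the spatial weight between observations i and j
    (zero on the diagonal). Values near +1 indicate spatial clustering,
    near 0 spatial randomness, and negative values dispersion.
    """
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    s0 = sum(weights[i][j] for i in range(n) for j in range(n))
    return (n / s0) * (num / den)
```

In a remote-sensing context, `values` might be pixel reflectances and `weights` a neighborhood (e.g. rook-adjacency) matrix.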
Mobile Digest of Education Statistics, 2013. NCES 2014-086
ERIC Educational Resources Information Center
Snyder, Thomas D.
2014-01-01
This is the first edition of the "Mobile Digest of Education Statistics." This compact compilation of statistical information covers prekindergarten through graduate school to describe the current American education scene. The "Mobile Digest" is designed as an easy mobile reference for materials found in detail in the…
Using Facebook Data to Turn Introductory Statistics Students into Consultants
ERIC Educational Resources Information Center
Childers, Adam F.
2017-01-01
Facebook provides businesses and organizations with copious data that describe how users are interacting with their page. This data affords an excellent opportunity to turn introductory statistics students into consultants to analyze the Facebook data using descriptive and inferential statistics. This paper details a semester-long project that…
medplot: a web application for dynamic summary and analysis of longitudinal medical data based on R.
Ahlin, Črt; Stupica, Daša; Strle, Franc; Lusa, Lara
2015-01-01
In biomedical studies the patients are often evaluated numerous times and a large number of variables are recorded at each time-point. Data entry and manipulation of longitudinal data can be performed using spreadsheet programs, which usually include some data plotting and analysis capabilities and are straightforward to use, but are not designed for the analyses of complex longitudinal data. Specialized statistical software offers more flexibility and capabilities, but first time users with a biomedical background often find its use difficult. We developed medplot, an interactive web application that simplifies the exploration and analysis of longitudinal data. The application can be used to summarize, visualize and analyze data by researchers who are not familiar with statistical programs and whose knowledge of statistics is limited. The summary tools produce publication-ready tables and graphs. The analysis tools include features that are seldom available in spreadsheet software, such as correction for multiple testing, repeated measurement analyses and flexible non-linear modeling of the association of the numerical variables with the outcome. medplot is freely available and open source, it has an intuitive graphical user interface (GUI), it is accessible via the Internet and can be used within a web browser, without the need for installing and maintaining programs locally on the user's computer. This paper describes the application and gives detailed examples describing how to use the application on real data from a clinical study including patients with early Lyme borreliosis.
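One of the features noted above, correction for multiple testing, is often done with the Benjamini-Hochberg false discovery rate procedure. A sketch of the adjusted p-value computation; medplot is R-based, so this standalone Python version is illustrative only:

```python
def benjamini_hochberg(pvals):
    """Benjamini-Hochberg FDR-adjusted p-values.

    Each raw p-value is scaled by n/rank (rank in ascending order),
    then a running minimum from the largest rank down enforces
    monotonicity of the adjusted values.
    """
    n = len(pvals)
    order = sorted(range(n), key=lambda i: pvals[i])
    adjusted = [0.0] * n
    running_min = 1.0
    for rank in range(n - 1, -1, -1):
        i = order[rank]
        running_min = min(running_min, pvals[i] * n / (rank + 1))
        adjusted[i] = running_min
    return adjusted
```

An adjusted value below the chosen level (e.g. 0.05) means the hypothesis can be rejected while controlling the false discovery rate at that level.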
Hayati Rezvan, Panteha; Lee, Katherine J; Simpson, Julie A
2015-04-07
Missing data are common in medical research, which can lead to a loss in statistical power and potentially biased results if not handled appropriately. Multiple imputation (MI) is a statistical method, widely adopted in practice, for dealing with missing data. Many academic journals now emphasise the importance of reporting information regarding missing data and proposed guidelines for documenting the application of MI have been published. This review evaluated the reporting of missing data, the application of MI including the details provided regarding the imputation model, and the frequency of sensitivity analyses within the MI framework in medical research articles. A systematic review of articles published in the Lancet and New England Journal of Medicine between January 2008 and December 2013 in which MI was implemented was carried out. We identified 103 papers that used MI, with the number of papers increasing from 11 in 2008 to 26 in 2013. Nearly half of the papers specified the proportion of complete cases or the proportion with missing data by each variable. In the majority of the articles (86%) the imputed variables were specified. Of the 38 papers (37%) that stated the method of imputation, 20 used chained equations, 8 used multivariate normal imputation, and 10 used alternative methods. Very few articles (9%) detailed how they handled non-normally distributed variables during imputation. Thirty-nine papers (38%) stated the variables included in the imputation model. Less than half of the papers (46%) reported the number of imputations, and only two papers compared the distribution of imputed and observed data. Sixty-six papers presented the results from MI as a secondary analysis. Only three articles carried out a sensitivity analysis following MI to assess departures from the missing at random assumption, with details of the sensitivity analyses only provided by one article. 
This review outlined deficiencies in the documenting of missing data and the details provided about imputation. Furthermore, only a few articles performed sensitivity analyses following MI even though this is strongly recommended in guidelines. Authors are encouraged to follow the available guidelines and provide information on missing data and the imputation process.
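Results from the m imputed datasets produced by MI are combined with Rubin's rules, the pooling step whose reporting the review examines. A minimal sketch of that combination; the numbers in the test are illustrative, not from any reviewed study:

```python
def rubin_pool(estimates, variances):
    """Pool point estimates and their variances from m imputed
    datasets using Rubin's rules.

    Returns (pooled estimate, total variance), where the total
    variance combines within- and between-imputation components.
    """
    m = len(estimates)
    qbar = sum(estimates) / m
    ubar = sum(variances) / m                               # within-imputation
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)   # between-imputation
    total_var = ubar + (1 + 1 / m) * b
    return qbar, total_var
```

The between-imputation term b is what captures the extra uncertainty due to the missing data; reporting it (or the number of imputations m) is among the guideline items the review checks.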
2011-01-01
Background: Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared, and presented in a unified framework, along with an empirical evaluation of the literature. Results: We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression, and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing one haplotype against all the remaining ones, and they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can easily be extended to allow for haplotype uncertainty by weighting the haplotypes by their probabilities. Conclusions: An empirical evaluation of the published literature, and a comparison against meta-analyses that use single nucleotide polymorphisms, suggests that studies reporting meta-analyses of haplotypes include approximately half as many studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and that, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata, and it is anticipated that the methods developed in this work will be widely applied in meta-analyses of haplotype association studies. PMID:21247440
Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis
NASA Astrophysics Data System (ADS)
Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.
As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging. In practice, assumptions of underlying linearity and/or Gaussianity are often used to obtain tractable solutions to the multiple space object tracking problem. However, these assumptions are at times detrimental to tracking performance and produce statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions about the underlying probability density function are relaxed and heuristic methods of hypothesis management are avoided, by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering and hypothesis management. The goal of this paper is to detail this solution and to use it as a platform to discuss the computational limitations that hinder proper analysis of large breakup events.
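The SMC filtering step can be illustrated with a bootstrap particle filter on a toy one-dimensional model. This is not the authors' tracker; the random-walk dynamics, Gaussian noise levels, and particle count are all assumptions made for the sketch:

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=500, seed=0):
    """Minimal bootstrap (sequential importance resampling) filter for a
    1-D random-walk state observed in Gaussian noise. Returns the
    posterior-mean state estimate after each observation.
    """
    rng = random.Random(seed)
    q, r = 0.5, 1.0                 # process / measurement noise std dev
    particles = [rng.gauss(0.0, 3.0) for _ in range(n_particles)]
    means = []
    for z in observations:
        # 1. propagate each particle through the random-walk dynamics
        particles = [p + rng.gauss(0.0, q) for p in particles]
        # 2. weight by the Gaussian measurement likelihood
        w = [math.exp(-0.5 * ((z - p) / r) ** 2) for p in particles]
        total = sum(w)
        w = [wi / total for wi in w]
        # 3. systematic resampling
        u = rng.random() / n_particles
        cum, c = [], 0.0
        for wi in w:
            c += wi
            cum.append(c)
        resampled, j = [], 0
        for i in range(n_particles):
            pos = u + i / n_particles
            while j < n_particles - 1 and cum[j] < pos:
                j += 1
            resampled.append(particles[j])
        particles = resampled
        means.append(sum(particles) / n_particles)
    return means
```

The same propagate-weight-resample loop, with a far richer state and likelihood, underlies SMC approaches to multi-object tracking; the computational cost grows with both the particle count and the number of hypotheses carried.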
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aartsen, M. G.; Abraham, K.; Ackermann, M.
The IceCube Neutrino Observatory accumulated a total of 318 billion cosmic-ray-induced muon events between 2009 May and 2015 May. This data set was used for a detailed analysis of the sidereal anisotropy in the arrival directions of cosmic rays in the TeV to PeV energy range. The observed global sidereal anisotropy features large regions of relative excess and deficit, with amplitudes of the order of 10^-3 up to about 100 TeV. A decomposition of the arrival direction distribution into spherical harmonics shows that most of the power is contained in the low-multipole (ℓ ≤ 4) moments. However, higher multipole components are found to be statistically significant down to an angular scale of less than 10°, approaching the angular resolution of the detector. Above 100 TeV, a change in the morphology of the arrival direction distribution is observed, and the anisotropy is characterized by a wide relative deficit whose amplitude increases with primary energy up to at least 5 PeV, the highest energies currently accessible to IceCube. No time dependence of the large- and small-scale structures is observed in the period of six years covered by this analysis. The high-statistics data set reveals more details of the properties of the anisotropy and is potentially able to shed light on the various physical processes that are responsible for the complex angular structure and energy evolution.
NASA Astrophysics Data System (ADS)
Toliver, Paul; Ozdur, Ibrahim; Agarwal, Anjali; Woodward, T. K.
2013-05-01
In this paper, we describe a detailed performance comparison of alternative single-pixel, single-mode LIDAR architectures including (i) linear-mode APD-based direct-detection, (ii) optically-preamplified PIN receiver, (iii) PIN-based coherent-detection, and (iv) Geiger-mode single-photon-APD counting. Such a comparison is useful when considering next-generation LIDAR on a chip, which would allow one to leverage extensive waveguide-based structures and processing elements developed for telecom and apply them to small form-factor sensing applications. Models of four LIDAR transmit and receive systems are described in detail, which include not only the dominant sources of receiver noise commonly assumed in each of the four detection limits, but also additional noise terms present in realistic implementations. These receiver models are validated through the analysis of detection statistics collected from an experimental LIDAR testbed. The receiver is reconfigurable into four modes of operation, while transmit waveforms and channel characteristics are held constant. The use of a diffuse hard target highlights the importance of including speckle noise terms in the overall system analysis. All measurements are done at 1550 nm, which offers multiple system advantages including less stringent eye safety requirements and compatibility with available telecom components, optical amplification, and photonic integration. Ultimately, the experimentally-validated detection statistics can be used as part of an end-to-end system model for projecting rate, range, and resolution performance limits and tradeoffs of alternative integrated LIDAR architectures.
NASA Astrophysics Data System (ADS)
Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Anderson, T.; Ansseau, I.; Anton, G.; Archinger, M.; Arguelles, C.; Arlen, T. C.; Auffenberg, J.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; Beiser, E.; BenZvi, S.; Berghaus, P.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blumenthal, J.; Boersma, D. J.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Buzinsky, N.; Casey, J.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Clark, K.; Classen, L.; Coenders, S.; Collin, G. H.; Conrad, J. M.; Cowen, D. F.; Cruz Silva, A. H.; Daughhetee, J.; Davis, J. C.; Day, M.; de André, J. P. A. M.; De Clercq, C.; del Pino Rosendo, E.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; di Lorenzo, V.; Dujmovic, H.; Dumm, J. P.; Dunkman, M.; Eberhardt, B.; Ehrhardt, T.; Eichmann, B.; Euler, S.; Evenson, P. A.; Fahey, S.; Fazely, A. R.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Flis, S.; Fösig, C.-C.; Fuchs, T.; Gaisser, T. K.; Gaior, R.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Gier, D.; Gladstone, L.; Glagla, M.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Góra, D.; Grant, D.; Griffith, Z.; Ha, C.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansen, E.; Hansmann, B.; Hansmann, T.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfel, K.; Homeier, A.; Hoshina, K.; Huang, F.; Huber, M.; Huelsnitz, W.; Hulth, P. O.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jeong, M.; Jero, K.; Jones, B. J. P.; Jurkovic, M.; Kappes, A.; Karg, T.; Karle, A.; Katz, U.; Kauer, M.; Keivani, A.; Kelley, J. 
L.; Kemp, J.; Kheirandish, A.; Kim, M.; Kintscher, T.; Kiryluk, J.; Klein, S. R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, G.; Kroll, M.; Krückl, G.; Kunnen, J.; Kunwar, S.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lennarz, D.; Lesiak-Bzdak, M.; Leuermann, M.; Leuner, J.; Lu, L.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Mandelartz, M.; Maruyama, R.; Mase, K.; Matis, H. S.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meier, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Middell, E.; Mohrmann, L.; Montaruli, T.; Morse, R.; Nahnhauer, R.; Naumann, U.; Neer, G.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke Pollmann, A.; Olivas, A.; Omairat, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Pankova, D. V.; Paul, L.; Pepper, J. A.; Pérez de los Heros, C.; Pfendner, C.; Pieloth, D.; Pinat, E.; Posselt, J.; Price, P. B.; Przybylski, G. T.; Quinnan, M.; Raab, C.; Rädel, L.; Rameez, M.; Rawlins, K.; Reimann, R.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Richter, S.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Sabbatini, L.; Sander, H.-G.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Schatto, K.; Schimp, M.; Schlunder, P.; Schmidt, T.; Schoenen, S.; Schöneberg, S.; Schönwald, A.; Schumacher, L.; Seckel, D.; Seunarine, S.; Soldin, D.; Song, M.; Spiczak, G. M.; Spiering, C.; Stahlberg, M.; Stamatikos, M.; Stanev, T.; Stasik, A.; Steuer, A.; Stezelberger, T.; Stokstad, R. G.; Stössl, A.; Ström, R.; Strotjohann, N. L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Tatar, J.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. 
N.; Toscano, S.; Tosi, D.; Tselengidou, M.; Turcati, A.; Unger, E.; Usner, M.; Vallecorsa, S.; Vandenbroucke, J.; van Eijndhoven, N.; Vanheule, S.; van Santen, J.; Veenkamp, J.; Vehring, M.; Voge, M.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wills, L.; Wissing, H.; Wolf, M.; Wood, T. R.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zoll, M.; IceCube Collaboration
2016-08-01
The IceCube Neutrino Observatory accumulated a total of 318 billion cosmic-ray-induced muon events between 2009 May and 2015 May. This data set was used for a detailed analysis of the sidereal anisotropy in the arrival directions of cosmic rays in the TeV to PeV energy range. The observed global sidereal anisotropy features large regions of relative excess and deficit, with amplitudes of the order of 10^-3 up to about 100 TeV. A decomposition of the arrival direction distribution into spherical harmonics shows that most of the power is contained in the low-multipole (ℓ ≤ 4) moments. However, higher multipole components are found to be statistically significant down to an angular scale of less than 10°, approaching the angular resolution of the detector. Above 100 TeV, a change in the morphology of the arrival direction distribution is observed, and the anisotropy is characterized by a wide relative deficit whose amplitude increases with primary energy up to at least 5 PeV, the highest energies currently accessible to IceCube. No time dependence of the large- and small-scale structures is observed in the period of six years covered by this analysis. The high-statistics data set reveals more details of the properties of the anisotropy and is potentially able to shed light on the various physical processes that are responsible for the complex angular structure and energy evolution.
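The harmonic-decomposition idea can be illustrated in one dimension: fitting a first-harmonic (dipole) component to a binned relative-intensity profile in right ascension, a common first step in cosmic-ray anisotropy analyses. The synthetic amplitude and phase below are invented and unrelated to the IceCube data.

```python
# Hedged sketch: least-squares fit of a first harmonic to a binned
# relative-intensity profile in right ascension. Synthetic data only.
import numpy as np

ra = np.radians(np.arange(0, 360, 15.0))          # 24 RA bin centers
A_true, phi_true = 1e-3, np.radians(50.0)
rel_int = 1.0 + A_true * np.cos(ra - phi_true)    # synthetic relative intensity

# Fit I(ra) = c0 + a*cos(ra) + b*sin(ra); then A = hypot(a, b), phase = atan2(b, a)
X = np.column_stack([np.ones_like(ra), np.cos(ra), np.sin(ra)])
c0, a, b = np.linalg.lstsq(X, rel_int, rcond=None)[0]
A_fit = float(np.hypot(a, b))
phi_fit = float(np.degrees(np.arctan2(b, a)) % 360)
print(A_fit, phi_fit)
```

The full-sky analysis generalizes this to spherical harmonics, where the same linear-least-squares logic yields the a_lm coefficients.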
Ankle plantarflexion strength in rearfoot and forefoot runners: a novel cluster-analytic approach.
Liebl, Dominik; Willwacher, Steffen; Hamill, Joseph; Brüggemann, Gert-Peter
2014-06-01
The purpose of the present study was to test for differences in ankle plantarflexion strength between habitual rearfoot and forefoot runners. In order to approach this issue, we revisit the problem of classifying different footfall patterns in human runners. A dataset of 119 subjects running shod and barefoot (speed 3.5 m/s) was analyzed. The footfall patterns were clustered by a novel statistical approach, which is motivated by advances in the statistical literature on functional data analysis. We explain the novel statistical approach in detail and compare it to the classically used strike index of Cavanagh and Lafortune (1980). The two groups found by the new cluster approach are readily interpretable as a forefoot and a rearfoot footfall group. The subsequent comparison of the clustered subjects reveals that runners with a forefoot footfall pattern are capable of producing significantly higher joint moments in a maximum voluntary contraction (MVC) of their ankle plantarflexor muscle-tendon units; difference in means: 0.28 Nm/kg. This effect remains significant after controlling for an additional gender effect and for differences in training levels. Our analysis confirms the hypothesis that forefoot runners have a higher mean MVC plantarflexion strength than rearfoot runners. Furthermore, we demonstrate that our proposed stochastic cluster analysis provides a robust and useful framework for clustering foot strikes. Copyright © 2014 Elsevier B.V. All rights reserved.
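For contrast with the functional-data clustering described above, here is a minimal sketch of clustering scalar strike-index values with two-cluster k-means (Lloyd's algorithm); the strike-index values are invented, and this is not the authors' method.

```python
# Hedged sketch: a minimal two-cluster (1-D k-means) split of strike-index
# values, as a simpler alternative to the fixed thresholds of Cavanagh and
# Lafortune (rearfoot strikes land in the rear third of the foot).
import numpy as np

si = np.array([0.05, 0.10, 0.12, 0.20, 0.25, 0.68, 0.72, 0.80, 0.85])

centers = np.array([si.min(), si.max()])          # simple initialization
for _ in range(20):                               # Lloyd's algorithm, k = 2
    labels = np.argmin(np.abs(si[:, None] - centers[None, :]), axis=1)
    centers = np.array([si[labels == k].mean() for k in (0, 1)])

rearfoot = si[labels == 0]
forefoot = si[labels == 1]
print(centers, rearfoot.size, forefoot.size)
```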
Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija
2015-07-15
A microsponge drug delivery system (MDDC) was prepared by a double emulsion-solvent-diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using a quality risk management (QRM) tool: failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis, along with literature data and product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, Span 80, Tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as CQA was described in the design space using a design of experiments - one-factor response surface method. The results obtained from the statistically designed experiments enabled establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
Ocular disorders in children with spastic subtype of cerebral palsy.
Ozturk, A Taylan; Berk, A Tulin; Yaman, Aylin
2013-01-01
To document common ocular abnormalities in children with the spastic subtype of cerebral palsy (CP) and to find out whether any correlation exists between their occurrence and etiologic factors. A total of 194 patients with a diagnosis of spastic-type CP were enrolled in this retrospective study. Detailed ophthalmic examinations were performed. Demographic data and neuroradiological findings were documented. Kruskal-Wallis, Mann-Whitney U, Pearson chi-square and Student's t-tests were used in the statistical analysis. The mean age was 64.7±44.2 months at the first ophthalmic examination. Prevalences of diplegia (47.4%) and tetraplegia (36.1%) were found to be higher than the frequency of hemiplegia (16.5%) in our study population. The etiologic factor was asphyxia in 60.8% of the patients. Abnormal ocular findings were present in 78.9% of the patients. Statistically significantly poorer vision was detected in the tetraplegia group among all the spastic subtypes of CP (P=0.000). Anisometropia and significant refractive error were found in 14.4% and 70.1% of the patients, respectively. Thirty-six children (18.6%) had nystagmus and 107 children (55.2%) had strabismus. Gestational age and birth weight were statistically significantly lower in patients with esotropia than in those with exotropia (P=0.009 and P=0.024, respectively). Abnormal morphology of the optic disc was present in 152 eyes (39.2%). Severe periventricular leukomalacia (PVL) was found in 48 patients, and statistically significantly poorer vision was detected in the presence of PVL (P=0.000). Spastic diplegic or tetraplegic CP patients with positive neuroradiological findings, younger gestational age and lower birth weight ought to have detailed ophthalmic examinations as early as possible to provide the best visual rehabilitation.
Neurological Outcomes Following Suicidal Hanging: A Prospective Study of 101 Patients
Jawaid, Mohammed Turab; Amalnath, S. Deepak; Subrahmanyam, D. K. S.
2017-01-01
Context: Survivors of suicidal hanging can have variable neurological outcomes, from complete recovery to irreversible brain damage. Literature on the neurological outcomes in these patients is confined to retrospective studies and case series. Hence, this prospective study was carried out. Aims: The aim was to study the neurological outcomes of suicidal hanging. Settings and Design: This was a prospective observational study carried out from July 2014 to July 2016. Subjects and Methods: Consecutive patients admitted to the emergency and medicine wards were included in the study. Details of the clinical and radiological findings, course in hospital, and status at 1 month postdischarge were analyzed. Statistical Analysis Used: Statistical analysis was performed using IBM SPSS Advanced Statistics 20.0 (SPSS Inc., Chicago, USA). Univariate analysis was performed using the Chi-square test for significance, and odds ratios were calculated. Results: Of the 101 patients, 6 died and 4 had residual neurological deficits. Cervical spine injury was seen in 3 patients. Interestingly, 39 patients could not remember the act of hanging (retrograde amnesia). Hypotension, pulmonary edema, a Glasgow coma scale (GCS) score <8 at admission, need for mechanical ventilation, and cerebral edema on plain computed tomography were more frequent in those with amnesia than in those with normal memory, and these findings were statistically significant. Conclusions: The majority of patients recovered without any sequelae. Routine imaging of the cervical spine may not be warranted in all patients, even in those with poor GCS. Retrograde amnesia might be more common than previously believed, and further studies are needed to analyze this peculiar feature. PMID:28584409
Major Greenwood's early career and the first departments of medical statistics.
Farewell, Vern; Johnson, Tony
2014-06-15
Major Greenwood was the foremost medical statistician of the first half of the 20th century in the UK and is often credited with founding the first department of medical statistics at the Lister Institute in London in 1910. Here, we examine in detail his career prior to this appointment, including his association with Karl Pearson. We also examine the remit of the Department of Medical Statistics at the London Hospital of which he was the founding Director in 1908, some 2 years earlier than his appointment at the Lister Institute. Supporting information consisting of further details about Major Greenwood's early career, biographical articles and obituaries for him, and a list of his publications to 1910 by year, is also provided. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.
Computational statistics using the Bayesian Inference Engine
NASA Astrophysics Data System (ADS)
Weinberg, Martin D.
2013-09-01
This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.
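As a toy illustration of the MCMC machinery such a package builds on, here is a minimal random-walk Metropolis sampler targeting a standard normal; the BIE's actual tempered, parallel schemes are far more elaborate, and the step size below is an assumption.

```python
# Hedged sketch: a minimal random-walk Metropolis sampler, the simplest
# member of the MCMC family referenced above. Target: standard normal.
import numpy as np

rng = np.random.default_rng(42)

def log_post(x):                 # log of an (unnormalized) 1-D posterior
    return -0.5 * x * x

samples, x, lp = [], 0.0, log_post(0.0)
for _ in range(20000):
    prop = x + rng.normal(0, 1.0)             # symmetric proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        x, lp = prop, lp_prop
    samples.append(x)

burned = np.array(samples[5000:])             # discard burn-in
print(round(float(burned.mean()), 2), round(float(burned.std()), 2))
```

Tempered variants run several such chains at flattened versions of the posterior and swap states between them, which is what lets multimodal targets be sampled robustly.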
H I Structure and Topology of the Galaxy Revealed by the I-GALFA H I 21-cm Line Survey
NASA Astrophysics Data System (ADS)
Koo, Bon-Chul; Park, G.; Cho, W.; Gibson, S. J.; Kang, J.; Douglas, K. A.; Peek, J. E. G.; Korpela, E. J.; Heiles, C. E.
2011-05-01
The I-GALFA survey, mapping all the H I in the inner Galactic disk visible to the Arecibo 305 m telescope within 10 degrees of the Galactic plane (longitudes of 32 to 77 degrees at b = 0), completed observations in 2009 September and will soon be made publicly available. The high (3.4 arcmin) resolution and tremendous sensitivity of the survey offer a great opportunity to observe the fine details of H I both in the inner and in the far outer Galaxy. The reduced H I column density maps show that the H I structure is highly filamentary and clumpy, pervaded by shell-like structures, vertical filaments, and small clumps. By inspecting individual maps, we have found 36 shell candidates with angular sizes ranging from 0.4 to 12 degrees, half of which appear to be expanding. In order to characterize the filamentary/clumpy morphology of the H I structure, we have carried out statistical analyses of selected areas representing the spiral arms in the inner and outer Galaxy. Genus statistics, which can distinguish the "meatball" and "swiss-cheese" topologies, show that the H I topology is clump-like in most regions. A two-dimensional Fourier analysis further shows the H I structures are filamentary and mainly parallel to the plane in the outer Galaxy. We also examine the level-crossing statistics, the results of which are described in detail in an accompanying poster by Park et al.
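A genus-like statistic of the kind mentioned above can be sketched as the number of connected regions above a threshold minus the number below it: positive for "meatball" (isolated clumps) and negative for "swiss-cheese" (isolated voids) topologies. The toy map below is an assumption, not survey data.

```python
# Hedged sketch: a 2-D genus-like statistic via connected-component counts.
import numpy as np
from scipy import ndimage

field = np.zeros((20, 20))
for cy, cx in [(4, 4), (4, 14), (14, 4), (14, 14), (9, 9)]:   # 5 clumps
    field[cy-1:cy+2, cx-1:cx+2] = 1.0

thresh = 0.5
n_high = ndimage.label(field > thresh)[1]     # regions above threshold
n_low = ndimage.label(field <= thresh)[1]     # regions below threshold
genus = n_high - n_low                        # > 0: "meatball" topology
print(n_high, n_low, genus)
```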
Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie
2016-01-01
Quality of reporting for Randomized Clinical Trials (RCTs) in oncology was analyzed in several systematic reviews, but, in this setting, there is a paucity of data on the outcome definitions and consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects, for OBS and RCTs in oncology. From a list of 19 medical journals, three were retained for analysis, after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess quality of reporting for the description of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify covariates of studies potentially associated with concordance of tests between the Methods and Results parts. 826 studies were included in the review, and 698 were OBS. Variables were described in the Methods section for all OBS studies, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement for the reported statistical test between the Methods and Results parts. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the aOR (adjusted Odds Ratio) for the third group compared to the first group was aOR Grp3 = 0.52 [0.31-0.89] (P value = 0.009). Variables in OBS and the primary endpoint in RCTs are reported and described with a high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always observed. Therefore, we encourage authors and peer reviewers to verify consistency of statistical tests in oncology studies.
Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie
2016-01-01
Background Quality of reporting for Randomized Clinical Trials (RCTs) in oncology was analyzed in several systematic reviews, but, in this setting, there is a paucity of data on the outcome definitions and consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects, for OBS and RCTs in oncology. Methods From a list of 19 medical journals, three were retained for analysis, after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess quality of reporting for the description of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify covariates of studies potentially associated with concordance of tests between the Methods and Results parts. Results 826 studies were included in the review, and 698 were OBS. Variables were described in the Methods section for all OBS studies, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement for the reported statistical test between the Methods and Results parts. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the aOR (adjusted Odds Ratio) for the third group compared to the first group was aOR Grp3 = 0.52 [0.31–0.89] (P value = 0.009). Conclusion Variables in OBS and the primary endpoint in RCTs are reported and described with a high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always observed. 
Therefore, we encourage authors and peer reviewers to verify consistency of statistical tests in oncology studies. PMID:27716793
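The adjusted odds ratios reported above come from logistic regression; as a minimal sketch of the underlying quantity, the code below computes an unadjusted odds ratio with a Wald 95% CI from a 2x2 concordance table. The counts are invented, not the study's data.

```python
# Hedged sketch: unadjusted odds ratio with Wald 95% CI from a 2x2 table.
import math

# rows: group 1 vs group 3; columns: tests concordant vs discordant
a, b = 40, 60        # group 1: concordant, discordant (invented counts)
c, d = 25, 75        # group 3: concordant, discordant (invented counts)

or_hat = (a * d) / (b * c)                       # cross-product ratio
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
lo = math.exp(math.log(or_hat) - 1.96 * se_log)
hi = math.exp(math.log(or_hat) + 1.96 * se_log)
print(round(or_hat, 2), round(lo, 2), round(hi, 2))
```

Logistic regression generalizes this by adjusting the log-odds for covariates, which is how the adjusted OR (aOR) in the abstract is obtained.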
eSACP - a new Nordic initiative towards developing statistical climate services
NASA Astrophysics Data System (ADS)
Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine
2015-04-01
The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields and between practitioners in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies, which properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and include functionality to utilize the extensive and dynamically growing repositories of data, using state-of-the-art statistical techniques to quantify the uncertainty and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case more clearly on the consequences of our changing climate to policy makers and the general public. 
The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of focus areas in the project and show some examples of the expected analysis tools.
Analysis of 3d Building Models Accuracy Based on the Airborne Laser Scanning Point Clouds
NASA Astrophysics Data System (ADS)
Ostrowski, W.; Pilarska, M.; Charyton, J.; Bakuła, K.
2018-05-01
Creating 3D building models at large scale is becoming more popular and finds many applications. Nowadays, the broad term "3D building models" can be applied to several types of products: the well-known CityGML solid models (available at several Levels of Detail), which are mainly generated from Airborne Laser Scanning (ALS) data, as well as 3D mesh models that can be created from both nadir and oblique aerial images. City authorities and national mapping agencies are interested in obtaining 3D building models. Apart from the completeness of the models, the accuracy aspect is also important. The final accuracy of a building model depends on various factors (accuracy of the source data, complexity of the roof shapes, etc.). In this paper a methodology for the inspection of datasets containing 3D models is presented. The proposed approach checks every building in the dataset against ALS point clouds, testing both accuracy and level of detail. Analysis of statistical parameters of the normal heights between the reference point cloud and the tested planes, combined with segmentation of the point cloud, provides a tool that can indicate which buildings and which roof planes do not fulfill the requirements of model accuracy and detail correctness. The proposed method was tested on two datasets: a solid model and a mesh model.
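The plane-based accuracy check described above can be sketched as a least-squares plane fit to the ALS points of one roof segment, with the RMS of the residuals converted to distances along the plane normal. The synthetic points and noise level below are assumptions for illustration.

```python
# Hedged sketch: fit z = a*x + b*y + c to synthetic ALS roof points and report
# the RMS of the residual "normal heights".
import numpy as np

rng = np.random.default_rng(1)
n = 200
x, y = rng.uniform(0, 10, n), rng.uniform(0, 10, n)
z = 0.3 * x - 0.1 * y + 5.0 + rng.normal(0, 0.05, n)   # assumed 5 cm ALS noise

A = np.column_stack([x, y, np.ones(n)])
(a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
res = z - (a * x + b * y + c)
# convert vertical residuals to distances along the plane normal
rms_normal = float(np.sqrt(np.mean(res**2)) / np.sqrt(a*a + b*b + 1))
print(round(a, 2), round(b, 2), round(rms_normal, 3))
```

A roof plane whose RMS normal distance exceeds a tolerance would be flagged as failing the accuracy requirement.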
Parzefall, Thomas; Wolf, Axel; Frei, Klemens; Kaider, Alexandra; Riss, Dominik
2017-03-01
Use of reliable grading scores to measure epistaxis severity in hereditary hemorrhagic telangiectasia (HHT) is essential in clinical routine and for scientific purposes. For practical reasons, visual analog scale (VAS) scoring and the Epistaxis Severity Score (ESS) are widely used. VAS scores are purely subjective, and a potential shortcoming of the ESS is that it is based on self-reported anamnestic bleeding data. The aim of this study was to validate the level of correlation between VAS scores, the ESS, and actual bleeding events, based on detailed epistaxis diaries of patients. Records from daily epistaxis diaries maintained by 16 HHT patients over 112 consecutive days were compared with the monthly ESS and daily VAS scores in the corresponding time period. The Spearman rank correlation coefficient, analysis of variance models, and multiple R^2 measures were used for statistical analysis. Although the ESS and VAS scores generally showed a high degree of correlation with actual bleeding events, mild events were underrepresented in both scores. Our results highlight the usefulness of the ESS as a standard epistaxis score in cohorts with moderate to severe degrees of epistaxis. The use of detailed epistaxis diaries should be considered when monitoring patients and cohorts with mild forms of HHT. © 2016 ARS-AAOA, LLC.
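The Spearman rank correlation used in the study can be sketched by ranking both series (averaging tied ranks) and taking the Pearson correlation of the ranks; the VAS scores and bleeding counts below are invented.

```python
# Hedged sketch: Spearman rank correlation between a subjective score and
# recorded bleeding counts. Invented data only.
import numpy as np

vas = np.array([1.0, 2.5, 2.0, 4.0, 5.5, 7.0, 6.0, 8.5])
bleeds = np.array([0, 1, 1, 3, 4, 6, 5, 9])

def rank(x):
    """1-based ranks, averaging ranks for tied values."""
    order = np.argsort(x, kind="stable")
    r = np.empty(len(x))
    r[order] = np.arange(1, len(x) + 1)
    for v in np.unique(x):          # average tied ranks
        m = x == v
        r[m] = r[m].mean()
    return r

rv, rb = rank(vas), rank(bleeds)
rho = float(np.corrcoef(rv, rb)[0, 1])   # Pearson correlation of ranks
print(round(rho, 3))
```

Because it works on ranks, this coefficient captures the monotone agreement between score and event count without assuming a linear relationship.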
Compression technique for large statistical data bases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eggers, S.J.; Olken, F.; Shoshani, A.
1981-03-01
The compression of large statistical databases is explored, and techniques are proposed for organizing the compressed data such that the time required to access the data is logarithmic. The techniques exploit special characteristics of statistical databases, namely, variation in the space required for the natural encoding of integer attributes, a prevalence of a few repeating values or constants, and the clustering of both data of the same length and constants in long, separate series. The techniques are variations of run-length encoding, in which modified run-lengths for the series are extracted from the data stream and stored in a header, which is used to form the base level of a B-tree index into the database. The run-lengths are cumulative, and therefore the access time of the data is logarithmic in the size of the header. The details of the compression scheme and its implementation are discussed, several special cases are presented, and an analysis is given of the relative performance of the various versions.
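The cumulative run-length header described above can be sketched directly: looking up a logical record becomes a binary search over cumulative run end positions, giving access time logarithmic in the number of runs. The B-tree layer is omitted here, and the encoding details are assumptions.

```python
# Hedged sketch: run-length compression with a cumulative-position header and
# O(log n_runs) record lookup via binary search.
import bisect

def compress(values):
    runs = []                                  # (value, run_length)
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1] = (v, runs[-1][1] + 1)
        else:
            runs.append((v, 1))
    header, total = [], 0                      # cumulative end positions
    for _, length in runs:
        total += length
        header.append(total)
    return header, [v for v, _ in runs]

def lookup(header, run_values, i):
    """Value of logical record i, via binary search on cumulative run-lengths."""
    return run_values[bisect.bisect_right(header, i)]

data = [7, 7, 7, 0, 0, 0, 0, 0, 3, 3, 7, 7, 7, 7]
header, run_values = compress(data)
print(header, run_values)
assert all(lookup(header, run_values, i) == data[i] for i in range(len(data)))
```

In the paper's scheme the header entries seed the base level of a B-tree, so the same cumulative-position search scales to databases far larger than memory.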
Tephrostratigraphy of the A.D. 79 pyroclastic deposits in perivolcanic areas of Mt. Vesuvio (Italy)
NASA Astrophysics Data System (ADS)
Lirer, Lucio; Munno, Rosalba; Petrosino, Paola; Vinci, Anna
1993-11-01
Correlations between pyroclastic deposits in perivolcanic areas are often complicated by lateral and vertical textural variations linked to very localized depositional effects. In this regard, a detailed sampling of the A.D. 79 eruption products has been performed in the main archaeological sites of the perivolcanic area, with the aim of carrying out grain-size, compositional and geochemical investigations so as to identify the marker layers in different stratigraphic successions and thus reconstruct the eruptive sequence. In order to process the large number of data available, a statistical approach was considered the most suitable. Statistical processing highlighted 14 marker layers among the fall, stratified surge and pyroclastic flow deposits. Furthermore, statistical analysis made it possible to correlate pyroclastic flow and surge deposits interbedded with the fall deposits, interpreted as a lateral facies variation. Finally, the passage from magmatic to hydromagmatic activity is marked by the deposition of pyroclastic flow, surge and accretionary lapilli-bearing deposits. No transitional phase from magmatic to hydromagmatic activity has been recognized.
NASA Astrophysics Data System (ADS)
Yi, Yong; Chen, Zhengying; Wang, Liming
2018-05-01
Corona-originated discharge on DC transmission lines is the main source of the radiated electromagnetic interference (EMI) field in the vicinity of such lines. A joint time-frequency analysis technique was proposed to extract the radiated EMI current (excitation current) of DC corona from statistical measurements of the corona current. A reduced-scale experimental platform was set up to measure the statistical distributions of the current waveform parameters of aluminum conductor steel-reinforced (ACSR) lines. Based on the measured results, the peak, root-mean-square, and average values of the 0.5 MHz radiated EMI current, with 9 kHz and 200 Hz bandwidths, were calculated by the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated from the radiated EMI current, and a wire-to-plate platform was built to verify the RI computation results. The reasons for the deviation between the computations and measurements were analyzed in detail.
Caple, Jodi; Stephan, Carl N
2017-05-01
Graphic exemplars of cranial sex and ancestry are essential to forensic anthropology for standardizing casework, training analysts, and communicating group trends. To date, graphic exemplars have comprised hand-drawn sketches, or photographs of individual specimens, which risks bias/subjectivity. Here, we performed quantitative analysis of photographic data to generate new photo-realistic and objective exemplars of skull form. Standardized anterior and left lateral photographs of skulls for each sex were analyzed in the computer graphics program Psychomorph for the following groups: South African Blacks, South African Whites, American Blacks, American Whites, and Japanese. The average cranial form was calculated for each photographic view, before the color information for every individual was warped to the average form and combined to produce statistical averages. These mathematically derived exemplars-and their statistical exaggerations or extremes-retain the high-resolution detail of the original photographic dataset, making them the ideal casework and training reference standards. © 2016 American Academy of Forensic Sciences.
A Bayesian test for Hardy–Weinberg equilibrium of biallelic X-chromosomal markers
Puig, X; Ginebra, J; Graffelman, J
2017-01-01
The X chromosome is a relatively large chromosome, harboring a lot of genetic information. Much of the statistical analysis of X-chromosomal information is complicated by the fact that males only have one copy. Recently, frequentist statistical tests for Hardy–Weinberg equilibrium have been proposed specifically for dealing with markers on the X chromosome. Bayesian test procedures for Hardy–Weinberg equilibrium for the autosomes have been described, but Bayesian work on the X chromosome in this context is lacking. This paper gives the first Bayesian approach for testing Hardy–Weinberg equilibrium with biallelic markers at the X chromosome. Marginal and joint posterior distributions for the inbreeding coefficient in females and the male to female allele frequency ratio are computed, and used for statistical inference. The paper gives a detailed account of the proposed Bayesian test, and illustrates it with data from the 1000 Genomes project. In that implementation, a novel approach to tackle multiple testing from a Bayesian perspective through posterior predictive checks is used. PMID:28900292
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steenbergen, K. G., E-mail: kgsteen@gmail.com; Gaston, N.
2014-02-14
Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
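The PCA step can be illustrated with a short sketch. This is our own illustration under stated assumptions (not the authors' code): for one MD time step, the eigenvalues of the 3x3 covariance matrix of atomic coordinates quantify the cluster's spatial extent along each principal axis; tracking them over time gives a measure of shape variation.

```python
import numpy as np

def shape_eigenvalues(positions):
    """PCA of one configuration: positions is an (n_atoms, 3) array.
    Returns the covariance eigenvalues in descending order, which
    describe the geometric extent along each principal axis."""
    centered = positions - positions.mean(axis=0)
    cov = centered.T @ centered / len(positions)
    return np.sort(np.linalg.eigvalsh(cov))[::-1]

# A planar (flat) arrangement of four atoms: the smallest eigenvalue is ~0,
# flagging a two-dimensional structure.
square = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]])
vals = shape_eigenvalues(square)
```

Comparing such eigenvalue triples across time steps (or across isomers) is one simple way to realize the quantitative shape characterization the abstract describes.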
2013-01-01
Background So far, no detailed study of the Iranian pharmaceutical market has been conducted, and only a few studies have analyzed medicine consumption and expenditure in Iran. Pharmaceutical market trend analysis remains one of the most useful instruments for evaluating the efficiency of pharmaceutical systems. An increase in imports of medicines, and a simultaneous decrease in domestic production, prompted us to investigate the structure of pharmaceutical expenditure. Analyzing these statistics also provides a suitable method for assessing the outcomes of national pharmaceutical policies and regulations. Methods This is a descriptive, cross-sectional study of the Iranian pharmaceutical market over a 13-year period (1997–2010). The study used the Iranian pharmaceutical statistical datasheet published by the Iranian Ministry of Health. Systematic searches of the relevant Persian and English research literature were made. In addition, official government documents were analyzed as sources of both data and detailed statements of policy. Results Analysis of the Iranian pharmaceutical market over the 13-year period shows that the sales value of medicine consumption grew by 28.38% annually. Domestic production and imports grew by 9.3% and 42.3% annually, respectively. Conclusions The Iranian pharmaceutical market has grown substantially in comparison with developing countries and the pharmerging group, and the market is expanding quickly while a major share goes to biotechnology drugs, which implies the need for commercialization activities in novel fields such as pharmaceutical biotechnology. This market expansion has been in favor of imported medicines in sales terms, caused by the reinforcement of questionable policies by policy makers, which necessitates fundamental rearrangements. PMID:23805853
NASA Astrophysics Data System (ADS)
Bencomo, Jose Antonio Fagundez
The main goal of this study was to relate physical changes in image quality, measured by the Modulation Transfer Function (MTF), to diagnostic accuracy. One hundred and fifty Kodak Min-R screen/film combination conventional craniocaudal mammograms obtained with the Pfizer Microfocus Mammographic system were selected from the files of the Department of Radiology at M.D. Anderson Hospital and Tumor Institute. The mammograms included 88 cases with a variety of benign diagnoses and 62 cases with a variety of malignant biopsy diagnoses. The average age of the patient population was 55 years. Seventy cases presented calcifications, with 30 cases having calcifications smaller than 0.5 mm; 46 cases presented irregular-bordered masses larger than 1 cm; and 30 cases presented smooth-bordered masses, 20 of them larger than 1 cm. Four separate copies of each original image were made, each with a different change in the MTF, using a defocusing technique whereby copies of the original were obtained by light exposure through different thicknesses (spacings) of transparent film base. The mammograms were randomized and evaluated by three experienced mammographers for the degree of visibility of various anatomical breast structures and pathological lesions (masses and calcifications), subjective image quality, and mammographic interpretation. The 3,000 separate evaluations were analyzed by several statistical techniques, including Receiver Operating Characteristic curve analysis, the McNemar test for differences between proportions, and the Landis et al. agreement-weighted kappa method for ordinal categorical data. Results from the statistical analysis show: (1) There were no statistically significant differences in the diagnostic accuracy of the observers when diagnosing from mammograms with the same MTF. (2) There were no statistically significant differences in diagnostic accuracy for each observer when diagnosing from mammograms with the different MTFs used in the study.
(3) There were statistically significant differences in detail visibility between the copies and the originals; detail visibility was better in the originals. (4) Feature interpretations were not significantly different between the originals and the copies. (5) Perception of image quality did not affect image interpretation. Continuation and improvement of this research can be accomplished by using a case population more sensitive to MTF changes (i.e., asymptomatic women with minimal breast cancer), by involving more observers (including less experienced radiologists and experienced technologists), and by using a minimum of 200 benign and 200 malignant cases.
De Crop, An; Bacher, Klaus; Van Hoof, Tom; Smeets, Peter V; Smet, Barbara S; Vergauwen, Merel; Kiendys, Urszula; Duyck, Philippe; Verstraete, Koenraad; D'Herde, Katharina; Thierens, Hubert
2012-01-01
To determine the correlation between the clinical and physical image quality of chest images by using cadavers embalmed with the Thiel technique and a contrast-detail phantom. The use of human cadavers fulfilled the requirements of the institutional ethics committee. Clinical image quality was assessed by using three human cadavers embalmed with the Thiel technique, which results in excellent preservation of the flexibility and plasticity of organs and tissues. As a result, lungs can be inflated during image acquisition to simulate the pulmonary anatomy seen on a chest radiograph. Both contrast-detail phantom images and chest images of the Thiel-embalmed bodies were acquired with an amorphous silicon flat-panel detector. Tube voltage (70, 81, 90, 100, 113, 125 kVp), copper filtration (0.1, 0.2, 0.3 mm Cu), and exposure settings (200, 280, 400, 560, 800 speed class) were altered to simulate different quality levels. Four experienced radiologists assessed the image quality by using a visual grading analysis (VGA) technique based on European Quality Criteria for Chest Radiology. The phantom images were scored manually and automatically with use of dedicated software, both resulting in an inverse image quality figure (IQF). Spearman rank correlations between inverse IQFs and VGA scores were calculated. A statistically significant correlation (r = 0.80, P < .01) was observed between the VGA scores and the manually obtained inverse IQFs. Comparison of the VGA scores and the automated evaluated phantom images showed an even better correlation (r = 0.92, P < .001). The results support the value of contrast-detail phantom analysis for evaluating clinical image quality in chest radiography. © RSNA, 2011.
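The Spearman rank correlation used to compare VGA scores with inverse IQFs can be sketched in a few lines. This is a minimal illustration with made-up data, assuming no tied values (so the classic formula rho = 1 - 6*sum(d^2)/(n(n^2-1)) applies); it is not the study's analysis code.

```python
def ranks(values):
    """Rank each value 1..n by sorted order (assumes no ties)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman rank correlation for tie-free paired samples."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

vga_scores = [2.1, 3.4, 2.8, 4.0, 3.1]   # hypothetical VGA scores
inverse_iqf = [1.9, 3.0, 2.5, 3.9, 3.2]  # hypothetical inverse IQFs
rho = spearman(vga_scores, inverse_iqf)  # -> 0.9 for this toy data
```

Because it operates on ranks, Spearman's rho captures the monotonic association between the subjective and phantom-based quality measures without assuming either is normally distributed.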
Methodological Reporting of Randomized Trials in Five Leading Chinese Nursing Journals
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Background Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. Methods In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. Results In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34±0.97 (Mean ± SD). No RCT reported descriptions and changes in “trial design,” changes in “outcomes” and “implementation,” or descriptions of the similarity of interventions for “blinding.” Poor reporting was found in detailing the “settings of participants” (13.1%), “type of randomization sequence generation” (1.8%), calculation methods of “sample size” (0.4%), explanation of any interim analyses and stopping guidelines for “sample size” (0.3%), “allocation concealment mechanism” (0.3%), additional analyses in “statistical methods” (2.1%), and targeted subjects and methods of “blinding” (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of “participants,” “interventions,” and definitions of the “outcomes” and “statistical methods.” The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. 
Conclusions The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods. PMID:25415382
Wallace, Cynthia S.A.; Advised by Marsh, Stuart E.
2002-01-01
The research accomplished in this dissertation used both mathematical and statistical techniques to extract and evaluate measures of landscape temporal dynamics and spatial structure from remotely sensed data for the purpose of mapping wildlife habitat. By coupling the landscape measures gleaned from the remotely sensed data with various sets of animal sightings and population data, effective models of habitat preference were created. Measures of temporal dynamics of vegetation greenness, as measured by the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer (AVHRR) satellite, were used to effectively characterize and map season-specific habitat of the Sonoran pronghorn antelope, as well as produce preliminary models of potential yellow-billed cuckoo habitat in Arizona. Various measures that capture different aspects of the temporal dynamics of the landscape were derived from AVHRR Normalized Difference Vegetation Index composite data using three main classes of calculations: basic statistics, standardized principal components analysis, and Fourier analysis. Pronghorn habitat models based on the AVHRR measures correspond visually and statistically to GIS-based models produced using data that represent detailed knowledge of ground conditions. Measures of temporal dynamics also revealed statistically significant correlations with annual estimates of elk population in selected Arizona Game Management Units, suggesting elk respond to regional environmental changes that can be measured using satellite data. Such relationships, once verified and established, can be used to help indirectly monitor the population. Measures of landscape spatial structure derived from IKONOS high-spatial-resolution (1-m) satellite data using geostatistics effectively map details of Sonoran pronghorn antelope habitat.
Local estimates of the nugget, sill, and range variogram parameters, calculated within 25 x 25-meter image windows, describe the spatial autocorrelation of the image, permitting classification of all pixels into coherent units whose signature graphs exhibit a classic variogram shape. The variogram parameters captured in these signatures have been shown in previous studies to discriminate between different species-specific vegetation associations. The synoptic view of the landscape provided by satellite data can inform resource management efforts. The ability to characterize the spatial structure and temporal dynamics of habitat using repeatable remote sensing data allows closer monitoring of the relationship between a species and its landscape.
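The variogram computation underlying those window estimates can be sketched simply. This is our own 1-D illustration (not the dissertation's code): the empirical semivariance gamma(h) is half the mean squared difference between values separated by lag h; the nugget, sill, and range are then read off (or fitted) from gamma as a function of h.

```python
def semivariogram(values, lag):
    """Empirical semivariance at a given lag for a 1-D transect:
    gamma(h) = (1 / 2N) * sum over pairs h apart of (z_i - z_j)^2."""
    pairs = [(values[i], values[i + lag]) for i in range(len(values) - lag)]
    return sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs))

# A toy alternating transect: strong variation at lag 1, none at lag 2.
transect = [1.0, 2.0, 1.0, 2.0, 1.0, 2.0]
gamma1 = semivariogram(transect, 1)  # adjacent values differ by 1 -> 0.5
gamma2 = semivariogram(transect, 2)  # values 2 apart are equal -> 0.0
```

Applied per image window, the shape of gamma(h) across lags is what yields the local nugget (intercept), sill (plateau), and range (lag where the plateau is reached) used as classification features.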
Planck 2015 results. XVI. Isotropy and statistics of the CMB
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Akrami, Y.; Aluri, P. K.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Casaponsa, B.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Contreras, D.; Couchot, F.; Coulais, A.; Crill, B. P.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fantaye, Y.; Fergusson, J.; Fernandez-Cobos, R.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huang, Z.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kim, J.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Liu, H.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. 
F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Marinucci, D.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Pant, N.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Rotti, A.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Souradeep, T.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.
2016-09-01
We test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The "Cold Spot" is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.
NASA Technical Reports Server (NTRS)
Strub, P. Ted; James, Corinne; Thomas, Andrew C.; Abbott, Mark R.
1990-01-01
The large-scale patterns of satellite-derived surface pigment concentration off the west coast of North America are presented and compared with monthly mean surface wind fields over the California Current system (CCS) for the July 1979 to June 1986 period. The patterns are discussed in terms of both seasonal and nonseasonal variability for the indicated time period. The large-scale seasonal characteristics of the California Current are summarized. The data and methods used are described, and the problems known to affect the satellite-derived pigment concentrations and the wind data used in the study are discussed. The statistical analysis results are then presented and discussed in light of past observations and theory. Details of the CZCS data processing are described, and details of the principal estimator pattern methodology used here are given.
Neutrino and axion bounds from the globular cluster M5 (NGC 5904).
Viaux, N; Catelan, M; Stetson, P B; Raffelt, G G; Redondo, J; Valcarce, A A R; Weiss, A
2013-12-06
The red-giant branch (RGB) in globular clusters is extended to larger brightness if the degenerate helium core loses too much energy in "dark channels." Based on a large set of archival observations, we provide high-precision photometry for the Galactic globular cluster M5 (NGC 5904), allowing for a detailed comparison between the observed tip of the RGB and predictions based on contemporary stellar evolution theory. In particular, we derive 95% confidence limits of g_ae < 4.3 × 10^-13 on the axion-electron coupling and μ_ν < 4.5 × 10^-12 μ_B (Bohr magneton μ_B = e/2m_e) on a neutrino dipole moment, based on a detailed analysis of statistical and systematic uncertainties. The cluster distance is the single largest source of uncertainty and can be improved in the future.
The Application of Statistics Education Research in My Classroom
ERIC Educational Resources Information Center
Jordan, Joy
2007-01-01
A collaborative, statistics education research project (Lovett, 2001) is discussed. Some results of the project were applied in the computer lab sessions of my elementary statistics course. I detail the process of applying these research results, as well as the use of knowledge surveys. Furthermore, I give general suggestions to teachers who want…
Statistical Handbook on Consumption and Wealth in the United States.
ERIC Educational Resources Information Center
Kaul, Chandrika, Ed.; Tomaselli-Moschovitis, Valerie, Ed.
This easy-to-use statistical handbook features the most up-to-date and comprehensive data related to U.S. wealth and consumer spending patterns. More than 300 statistical tables and charts are organized into 8 detailed sections. Intended for students, teachers, and general users, the handbook contains these sections: (1) "General Economic…
Educational Statistics for Selected Health Occupations.
ERIC Educational Resources Information Center
Johnson, Donald W.; Holz, Frank M.
Detailed statistics on education are provided for a number of health occupations. Data are given as far back as 1950-1951 for medical and dental schools, while for schools of public health, the data begin in 1975-1976. Complete 1980 data are provided only for dentistry, pharmacy, and veterinary medicine. Statistical tables are included on the…
Using Microsoft Excel to Generate Usage Statistics
ERIC Educational Resources Information Center
Spellman, Rosemary
2011-01-01
At the Libraries Service Center, statistics are generated on a monthly, quarterly, and yearly basis by using four Microsoft Excel workbooks. These statistics provide information about what materials are being requested and by whom. They also give details about why certain requests may not have been filled. Utilizing Excel allows for a shallower…
Mini-Digest of Education Statistics, 2008. NCES 2009-021
ERIC Educational Resources Information Center
Snyder, Thomas D.
2009-01-01
This publication is the 14th edition of the "Mini-Digest of Education Statistics," a pocket-sized compilation of statistical information covering the broad field of American education from kindergarten through graduate school. The "Mini-Digest" is designed as an easy reference for materials found in much greater detail in the…
Binary partition tree analysis based on region evolution and its application to tree simplification.
Lu, Huihai; Woods, John C; Ghanbari, Mohammed
2007-04-01
Pyramid image representations via tree structures are recognized methods for region-based image analysis. Binary partition trees can be applied which document the merging process, with small details found at the bottom levels and larger ones close to the root. Hindsight of the merging process is stored within the tree structure and provides the change histories of an image property from the leaf to the root node. In this work, the change histories are modelled by evolvement functions and their second-order statistics are analyzed by using a knee function. Knee values show the reluctance of each merge. We have systematically formulated these findings to provide a novel framework for binary partition tree analysis, where tree simplification is demonstrated. Based on an evolvement function, for each upward path in a tree, the tree node associated with the first reluctant merge is considered as a pruning candidate. The result is a simplified version providing a reduced solution space and still complying with the definition of a binary tree. The experiments show that image details are preserved whilst the number of nodes is dramatically reduced. An image filtering tool also results, which preserves object boundaries and has applications in segmentation.
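The knee-based pruning idea can be sketched in miniature. This is a hypothetical illustration of the general principle (not the paper's knee function): along a leaf-to-root path, an evolvement function records how a region property changes at each merge, and a simple second-difference (discrete curvature) detector flags the first merge where the property changes abruptly, marking that node as a pruning candidate.

```python
def first_knee(history, threshold=1.0):
    """Return the index of the first 'reluctant' merge along an upward
    path, detected as a jump in the second difference of the change
    history, or None if every merge is smooth."""
    for i in range(1, len(history) - 1):
        curvature = history[i + 1] - 2 * history[i] + history[i - 1]
        if abs(curvature) > threshold:
            return i
    return None

# Gradual merges followed by an abrupt change in the region property:
# the subtree below the flagged node is a candidate for pruning.
history = [1.0, 1.1, 1.2, 1.3, 5.0, 5.1]
knee = first_knee(history)
```

Pruning at the first such node per upward path keeps the fine detail below smooth merges while removing nodes created by reluctant ones, which matches the simplification behavior described in the abstract.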
Multivariate analysis of longitudinal rates of change.
Bryan, Matthew; Heagerty, Patrick J
2016-12-10
Longitudinal data allow direct comparison of the change in patient outcomes associated with treatment or exposure. Frequently, several longitudinal measures are collected that either reflect a common underlying health status, or characterize processes that are influenced in a similar way by covariates such as exposure or demographic characteristics. Statistical methods that can combine multivariate response variables into common measures of covariate effects have been proposed in the literature. Current methods for characterizing the relationship between covariates and the rate of change in multivariate outcomes are limited to select models. For example, 'accelerated time' methods have been developed which assume that covariates rescale time in longitudinal models for disease progression. In this manuscript, we detail an alternative multivariate model formulation that directly structures longitudinal rates of change and that permits a common covariate effect across multiple outcomes. We detail maximum likelihood estimation for a multivariate longitudinal mixed model. We show via asymptotic calculations the potential gain in power that may be achieved with a common analysis of multiple outcomes. We apply the proposed methods to the analysis of a trivariate outcome for infant growth and compare rates of change for HIV infected and uninfected infants. Copyright © 2016 John Wiley & Sons, Ltd.
Zhang, Wenfeng; Su, Qingyong; Xu, Mi; Yan, Wei
2015-09-01
The stress relaxation curves for three different hot deformation processes in the temperature range of 750-1000 °C were studied to develop an understanding of the precipitation behavior in a nitride-strengthened martensitic heat-resistant steel (Zhang et al., Mater. Sci. Eng. A, 2015) [1]. This data article provides supporting data and detailed information on how to accurately analyze the stress relaxation data. A statistical analysis of the stress peak curves was carried out, covering the number of peaks, the intensity of the peaks, and the integral value of the peaks. In addition, the XRD energy spectrum data were analyzed in terms of lattice distortion.
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering, which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy of the unsupervised technique is found to be comparable to that of traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
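The two-part scheme, a sequential first pass feeding initial clusters into an iterative K-means refinement, can be sketched as follows. This is a minimal stand-in: the report's sequential variance analysis and generalized K-means are more elaborate, and the threshold and data here are illustrative.

```python
import numpy as np

def sequential_pass(X, thresh):
    """Stage 1 (simplified stand-in for the sequential variance analysis):
    scan samples in order, assigning each to the nearest existing cluster
    mean if within `thresh`, otherwise opening a new cluster."""
    means, counts = [X[0].astype(float)], [1]
    for x in X[1:]:
        d = [np.linalg.norm(x - m) for m in means]
        j = int(np.argmin(d))
        if d[j] < thresh:
            counts[j] += 1
            means[j] += (x - means[j]) / counts[j]  # running-mean update
        else:
            means.append(x.astype(float))
            counts.append(1)
    return np.array(means)

def kmeans_refine(X, means, iters=10):
    """Stage 2: generalized K-means iterations started from stage-1 means."""
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - means[None]) ** 2).sum(-1), axis=1)
        means = np.array([X[labels == k].mean(0) for k in range(len(means))])
    return means, labels

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(4, 0.3, (50, 2))])
means, labels = kmeans_refine(X, sequential_pass(X, thresh=2.0))
print(len(means))  # → 2, both clusters recovered without supervision
```

The point of the composite design is that stage 1 removes the need to guess the number of clusters before the K-means iterations begin.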
An analysis of quantum coherent solar photovoltaic cells
NASA Astrophysics Data System (ADS)
Kirk, A. P.
2012-02-01
A recent hypothesis (Scully et al., Proc. Natl. Acad. Sci. USA 108 (2011) 15097) suggests that it is possible to break the detailed-balance limit on power conversion efficiency set by statistical physics and increase the power output of a solar photovoltaic cell by using "noise-induced quantum coherence" to increase the current. The fundamental errors of this hypothesis are explained here. As part of this analysis, we show that the maximum photogenerated current density of a practical solar cell is a function of the incident spectrum, sunlight concentration factor, and energy bandgap alone; the presence of quantum coherence is therefore irrelevant, as it cannot lead to increased current output from a solar cell.
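The detailed-balance bound on photocurrent is straightforward to evaluate numerically. The sketch below counts one electron per above-bandgap photon and models the sun as a 6000 K blackbody with the usual solid-angle dilution factor; these values are illustrative assumptions, not numbers taken from the paper.

```python
import numpy as np
from scipy.integrate import quad

# Physical constants (SI)
q, h, c, k = 1.602e-19, 6.626e-34, 2.998e8, 1.381e-23

def jmax(eg_ev, t_sun=6000.0, conc=1.0, dilution=2.16e-5):
    """Maximum photocurrent density (A/m^2) from detailed balance:
    every above-bandgap photon yields one electron. The 6000 K blackbody
    sun and the ~2.16e-5 solid-angle dilution are illustrative."""
    def photon_flux(e_j):  # photons / (m^2 s J), hemispherical
        return (2 * np.pi / (h**3 * c**2)) * e_j**2 / (np.exp(e_j / (k * t_sun)) - 1)
    eg_j = eg_ev * q
    flux, _ = quad(photon_flux, eg_j, 100 * k * t_sun)
    return q * conc * dilution * flux

# The current depends only on spectrum, concentration, and bandgap --
# it can only fall as the bandgap rises; no coherence term appears.
print(jmax(1.1) > jmax(1.7) > 0.0)
```

Raising the concentration factor `conc` scales the current linearly, which is exactly the spectrum/concentration/bandgap dependence the abstract describes.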
NASA Technical Reports Server (NTRS)
Iverson, L. R.; Cook, E. A.; Graham, R. L.; Olson, J. S.; Frank, T.; Ke, Y.; Treworgy, C.; Risser, P. G.
1986-01-01
Several hardware, software, and data collection problems encountered were overcome. The Geographic Information System (GIS) data from other systems were converted to ERDAS format for incorporation with the image data. Statistical analysis of the relationship between spectral values and productivity is being pursued. Several project sites, including Jackson, Pope, Boulder, Smokies, and Huntington Forest, are emerging as the most intensively studied areas, primarily owing to the availability of data and time. Progress with data acquisition and quality checking, further details on experimental sites, and brief summaries of research results and future plans are discussed. Material on personnel, collaborators, facilities, site background, and the meetings and publications of the investigators is included.
Volcanic eruptions and solar activity
NASA Technical Reports Server (NTRS)
Stothers, Richard B.
1989-01-01
The historical record of large volcanic eruptions from 1500 to 1980 is subjected to detailed time series analysis. Two weak but probably statistically significant periodicities, of about 11 and 80 yr, are found: the frequency of volcanic eruptions increases (decreases) slightly around the times of solar minimum (maximum). Time series analysis of the volcanogenic acidities in a deep ice core from Greenland reveals several very long periods, ranging from about 80 to about 350 yr, which are similar to the very slow solar cycles previously detected in auroral and C-14 records. Solar flares may cause changes in atmospheric circulation patterns that abruptly alter the earth's spin; the resulting jolt probably triggers small earthquakes which affect volcanism.
Removal of Asperger's syndrome from the DSM V: community response to uncertainty.
Parsloe, Sarah M; Babrow, Austin S
2016-01-01
The May 2013 release of the new version of the Diagnostic and Statistical Manual of Mental Disorders (DSM V) subsumed Asperger's syndrome under the wider diagnostic label of autism spectrum disorder (ASD). The revision has created much uncertainty in the community affected by this condition. This study uses problematic integration theory and thematic analysis to investigate how participants in Wrong Planet, a large online community associated with autism and Asperger's syndrome, have constructed these uncertainties. The analysis illuminates uncertainties concerning both the likelihood of diagnosis and value of diagnosis, and it details specific issues within these two general areas of uncertainty. The article concludes with both conceptual and practical implications.
Multifractality and freezing phenomena in random energy landscapes: An introduction
NASA Astrophysics Data System (ADS)
Fyodorov, Yan V.
2010-10-01
We start our lectures by introducing and discussing the general notion of the multifractality spectrum for random measures on lattices, and how it can be probed using moments of that measure. Then we show that the Boltzmann-Gibbs probability distributions generated by logarithmically correlated random potentials provide a simple yet non-trivial example of disorder-induced multifractal measures. The typical values of the multifractality exponents can be extracted by calculating the free energy of the associated statistical mechanics problem. To succeed in such a calculation we introduce and discuss in some detail two analytically tractable models for logarithmically correlated potentials. The first model uses a special definition of distances between points in space and is based on the idea of multiplicative cascades, which originated in the theory of turbulent motion. It is essentially equivalent to the statistical mechanics of directed polymers on disordered trees, studied long ago by Derrida and Spohn (1988) in Ref. [12]. In this way we introduce the notion of the freezing transition, which is identified with an abrupt change in the multifractality spectrum. The second model, which allows for explicit analytical evaluation of the free energy, is the infinite-dimensional version of the problem, which can be solved by employing the replica trick. In particular, the latter version allows one to identify the freezing phenomenon with a mechanism of replica symmetry breaking (RSB) and to elucidate its physical meaning. The corresponding one-step RSB solution turns out to be marginally stable everywhere in the low-temperature phase. We finish with a short discussion of recent developments and extensions of models with logarithmic correlations, in particular in the context of extreme value statistics.
The first appendix summarizes the standard elementary information about Gaussian integrals and related subjects, and introduces the notion of the Gaussian free field characterized by logarithmic correlations. Three other appendices provide the detailed exposition of a few technical details underlying the replica analysis of the model discussed in the lectures.
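The multiplicative-cascade construction invoked in the first model, and the moment-based probe of multifractality mentioned at the start, can both be sketched on a dyadic lattice. The weight distribution and depth below are illustrative choices, not the lectures' specific model.

```python
import numpy as np

rng = np.random.default_rng(6)

def cascade(levels):
    """Dyadic multiplicative cascade: repeatedly split mass between two
    children with random, normalized log-normal weights (illustrative
    parameters), conserving total mass at every split."""
    p = np.array([1.0])
    for _ in range(levels):
        w = rng.lognormal(0, 0.5, (p.size, 2))
        w /= w.sum(axis=1, keepdims=True)   # conserve mass at each split
        p = (p[:, None] * w).ravel()
    return p

mu = cascade(12)                            # 4096 cells, sum(mu) == 1
q = 2.0
# The partition function sum(mu^q) scales as N^{-tau(q)}; a multifractal
# measure is one whose tau(q) is non-linear in q (uniform measure: tau(2)=1).
tau2 = -np.log((mu**q).sum()) / np.log(mu.size)
print(0.0 < tau2 < 1.0)  # disorder pushes tau(2) below the uniform value 1
```

Estimating `tau2` across several `q` values and depths is the standard numerical probe of the multifractality spectrum described in the lectures.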
Special report. NIOSH: new facts about violence against healthcare workers and security officers.
1996-09-01
A definitive compilation and analysis by NIOSH (National Institute for Occupational Safety and Health) of recent studies measuring violence in the workplace presents the clearest picture to date of the nature and frequency of such violence and of which employees are at greatest risk. For managers in security and health care, the NIOSH statistics are especially important. In this report, we'll review what we believe are the most significant findings from NIOSH and other sources and present details of programs designed to prevent healthcare workers from becoming victims of violence.
DOE R&D Accomplishments Database
Tartarelli, G. F.; CDF Collaboration
1996-05-01
The authors present the latest results on top physics obtained by the CDF experiment at the Fermilab Tevatron collider. The data sample used for these analyses (about 110 pb⁻¹) represents almost the entire statistics collected by CDF during four years (1992-95) of data taking. This large data set has allowed detailed studies of top production and decay properties. The results discussed here include the determination of the top quark mass, the measurement of the production cross section, the study of the kinematics of the top events, and a look at top decays.
Avoiding false discoveries in association studies.
Sabatti, Chiara
2007-01-01
We consider the problem of controlling false discoveries in association studies. We assume that the design of the study is adequate so that the "false discoveries" are potentially only because of random chance, not to confounding or other flaws. Under this premise, we review the statistical framework for hypothesis testing and correction for multiple comparisons. We consider in detail the currently accepted strategies in linkage analysis. We then examine the underlying similarities and differences between linkage and association studies and document some of the most recent methodological developments for association mapping.
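As one concrete multiple-comparison correction of the kind reviewed here, the Benjamini-Hochberg step-up procedure can be sketched as follows; the p-values are toy inputs for illustration.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: reject the k smallest
    p-values, where k is the largest i with p_(i) <= alpha * i / m.
    Returns a boolean rejection mask in the original order."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest index meeting the bound
        reject[order[: k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.42, 0.60, 0.74, 0.88, 0.95]
print(benjamini_hochberg(pvals))  # only the two smallest survive FDR control
```

Unlike a Bonferroni bound, the step-up rule adapts its threshold to the rank of each p-value, which is why it retains more power in genome-scale association scans.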
Arm structure in normal spiral galaxies, 1: Multivariate data for 492 galaxies
NASA Technical Reports Server (NTRS)
Magri, Christopher
1994-01-01
Multivariate data have been collected as part of an effort to develop a new classification system for spiral galaxies, one which is not necessarily based on subjective morphological properties. A sample of 492 moderately bright northern Sa and Sc spirals was chosen for future statistical analysis. New observations were made at 20 and 21 cm; the latter data are described in detail here. Infrared Astronomical Satellite (IRAS) fluxes were obtained from archival data. Finally, new estimates of arm pattern randomness and of local environmental harshness were compiled for most sample objects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nielsen, Erik; Blume-Kohout, Robin; Rudinger, Kenneth
PyGSTi is an implementation of Gate Set Tomography in the python programming language. Gate Set Tomography (GST) is a theory and protocol for simultaneously estimating the state preparation, gate operations, and measurement effects of a physical system of one or many quantum bits (qubits). These estimates are based entirely on the statistics of experimental measurements, and their interpretation and analysis can provide a detailed understanding of the types of errors/imperfections in the physical system. In this way, GST provides not only a means of certifying the "goodness" of qubits but also a means of debugging (i.e. improving) them.
Metronomic palliative chemotherapy in maxillary sinus tumor
Patil, Vijay M.; Noronha, Vanita; Joshi, Amit; Karpe, Ashay; Talreja, Vikas; Chandrasekharan, Arun; Dhumal, Sachin; Prabhash, Kumar
2016-01-01
Background: Metronomic chemotherapy consisting of methotrexate and celecoxib has recently shown promising results in multiple studies in head and neck cancers. However, these studies have not included patients with maxillary sinus primaries. Hence, the role of palliative metronomic chemotherapy in patients with maxillary sinus carcinoma that is not amenable to radical therapy is unknown. Methods: This was a retrospective analysis of carcinoma maxillary sinus patients who received palliative metronomic chemotherapy between August 2011 and August 2014. The demographic details, symptomatology, previous treatment details, indication for palliative chemotherapy, response to therapy, and overall survival (OS) details were extracted. SPSS version 16 was used for analysis. Descriptive statistics were calculated. Survival analysis was done by the Kaplan–Meier method. Results: Five patients had received metronomic chemotherapy. The median age was 60 years (range 37–64 years). The proportions of patients surviving at 6, 12, and 18 months were 40%, 40%, and 20%, respectively. The estimated median OS was 126 days (95% confidence interval 0–299.9 days). The estimated median survival in patients with an event-free period of <6 months after the last therapy was 45 days, whereas it was 409 days in patients with an event-free period of more than 6 months after the last therapy (P = 0.063). Conclusion: Metronomic chemotherapy in carcinoma of the maxillary sinus holds promise. It has activity similar to that seen in other head and neck cancers and needs to be evaluated further in a larger cohort of patients. PMID:27275447
Durand, Marc; Käfer, Jos; Quilliet, Catherine; Cox, Simon; Talebi, Shirin Ataei; Graner, François
2011-10-14
We propose an analytical model for the statistical mechanics of shuffled two-dimensional foams with moderate bubble size polydispersity. It predicts without any adjustable parameters the correlations between the number of sides n of the bubbles (topology) and their areas A (geometry) observed in experiments and numerical simulations of shuffled foams. Detailed statistics show that in shuffled cellular patterns n correlates better with √A (as claimed by Desch and Feltham) than with A (as claimed by Lewis and widely assumed in the literature). At the level of the whole foam, standard deviations Δn and ΔA are in proportion. Possible applications include correlations of the detailed distributions of n and A, three-dimensional foams, and biological tissues.
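The claimed distinction, n correlating better with √A than with A, is easy to probe on synthetic data. The sketch below generates a toy polydisperse foam obeying a Desch-Feltham-like linear law in √A; the generating law and its parameters are illustrative assumptions, not the authors' simulations.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.lognormal(mean=0.0, sigma=0.5, size=2000)    # moderate polydispersity
# Desch-Feltham-like toy law: n is linear in sqrt(A), plus small noise.
n = 6 + 3.0 * (np.sqrt(A) - np.sqrt(A).mean()) + rng.normal(0, 0.1, A.size)

def pearson(x, y):
    x, y = x - x.mean(), y - y.mean()
    return float((x * y).sum() / np.sqrt((x * x).sum() * (y * y).sum()))

# Under this generating law, n correlates more tightly with sqrt(A) than
# with A itself, mirroring the paper's Desch-Feltham conclusion.
print(pearson(n, np.sqrt(A)) > pearson(n, A))  # → True
```

The gap between the two correlations grows with polydispersity, which is why the distinction only becomes visible once the bubble-size spread is appreciable.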
Salvatore, Stefania; Bramness, Jørgen Gustav; Reid, Malcolm J; Thomas, Kevin Victor; Harman, Christopher; Røislien, Jo
2015-01-01
Wastewater-based epidemiology (WBE) is a new methodology for estimating the drug load in a population. Simple summary statistics and specification tests have typically been used to analyze WBE data, comparing differences between weekday and weekend loads. Such standard statistical methods may, however, overlook important nuanced information in the data. In this study, we apply functional data analysis (FDA) to WBE data and compare the results to those obtained from more traditional summary measures. We analysed temporal WBE data from 42 European cities, using sewage samples collected daily for one week in March 2013. For each city, the main temporal features of two selected drugs were extracted using functional principal component (FPC) analysis, along with simpler measures such as the area under the curve (AUC). The individual cities' scores on each of the temporal FPCs were then used as outcome variables in multiple linear regression analysis with various city and country characteristics as predictors. The results were compared to those of functional analysis of variance (FANOVA). The first three FPCs explained more than 99% of the temporal variation. The first component (FPC1) represented the level of the drug load, while the second and third temporal components represented the level and the timing of a weekend peak. AUC was highly correlated with FPC1, but other temporal characteristics were not captured by the simple summary measures. FANOVA was less flexible than the FPCA-based regression, but showed concordant results. Geographical location was the main predictor of the general level of the drug load. FDA of WBE data extracts more detailed information about drug load patterns during the week which is not identified by more traditional statistical methods. The results also suggest that regression based on FPC results is a valuable addition to FANOVA for estimating associations between temporal patterns and covariate information.
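On a dense regular grid of sampling days, FPCA reduces to an SVD of the centered city-by-day load matrix. The sketch below builds an illustrative 42-city, 7-day panel with a city-specific level plus a weekend bump (synthetic data, not the study's measurements) and extracts the temporal FPCs and per-city scores.

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy WBE panel: 42 cities x 7 daily loads, each a city-specific level
# plus a weekend bump of varying size (illustrative data only).
days = np.arange(7)
weekend = np.exp(-0.5 * ((days - 5.5) / 0.8) ** 2)   # peak near Sat/Sun
level = rng.normal(10, 2, size=(42, 1))
bump = rng.normal(3, 1, size=(42, 1))
loads = level + bump * weekend + rng.normal(0, 0.2, (42, 7))

# On a dense regular grid, FPCA reduces to PCA of the curves:
centered = loads - loads.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / (s**2).sum()   # variance explained per temporal FPC
scores = U * s                    # each city's score on each FPC

print(float(explained[:3].sum()) > 0.98)  # leading FPCs dominate, as in the study
```

The rows of `Vt` are the temporal component curves (level, weekend bump, ...), and the columns of `scores` are exactly the per-city outcome variables that the study feeds into the regression step.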
Genome-wide comparative analysis of four Indian Drosophila species.
Mohanty, Sujata; Khanna, Radhika
2017-12-01
Comparative analysis of multiple genomes of closely or distantly related Drosophila species undoubtedly creates excitement among evolutionary biologists, enabling exploration of genomic changes from an ecological and evolutionary perspective. We present herewith the de novo assembled whole genome sequences of four Drosophila species of Indian origin, D. bipectinata, D. takahashii, D. biarmipes and D. nasuta, generated using Next Generation Sequencing technology on an Illumina platform, along with their detailed assembly statistics. Comparative genomics analyses were performed, e.g. gene prediction and annotation, functional and orthogroup analysis of coding sequences, and genome-wide SNP distribution. The whole genome of Zaprionus indianus of Indian origin, published earlier by us, and the genome sequences of the 12 previously sequenced Drosophila species available in the NCBI database were included in the analysis. The present work is part of our ongoing genomics project on Indian Drosophila species.
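Among the "detailed assembly statistics" such reports provide, the N50 is the most commonly quoted; it can be computed in a few lines (the contig lengths below are toy values, not the paper's assemblies).

```python
def n50(contig_lengths):
    """N50: the length L such that contigs of length >= L cover at least
    half of the total assembly (a standard assembly statistic; the
    paper's own values are not reproduced here)."""
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if 2 * running >= total:
            return length

print(n50([100, 80, 60, 40, 20]))  # → 80
```

Larger N50 values indicate a more contiguous assembly, which is why the statistic is reported alongside genome size and contig counts.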
MNE software for processing MEG and EEG data
Gramfort, A.; Luessi, M.; Larson, E.; Engemann, D.; Strohmeier, D.; Brodbeck, C.; Parkkonen, L.; Hämäläinen, M.
2013-01-01
Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals originating from neural currents in the brain. Using these signals to characterize and locate brain activity is a challenging task, as evidenced by several decades of methodological contributions. MNE, whose name stems from its capability to compute cortically-constrained minimum-norm current estimates from M/EEG data, is a software package that provides comprehensive analysis tools and workflows including preprocessing, source estimation, time–frequency analysis, statistical analysis, and several methods to estimate functional connectivity between distributed brain regions. The present paper gives detailed information about the MNE package and describes typical use cases while also warning about potential caveats in analysis. The MNE package is a collaborative effort of multiple institutes striving to implement and share best methods and to facilitate distribution of analysis pipelines to advance reproducibility of research. Full documentation is available at http://martinos.org/mne. PMID:24161808
Statistical methods for the analysis of climate extremes
NASA Astrophysics Data System (ADS)
Naveau, Philippe; Nogaj, Marta; Ammann, Caspar; Yiou, Pascal; Cooley, Daniel; Jomelli, Vincent
2005-08-01
Currently there is increasing research activity in the area of climate extremes, because they represent a key manifestation of non-linear systems and have an enormous impact on economic and social activities. Our understanding of the mean behavior of climate and its 'normal' variability has improved significantly during the last decades. In comparison, climate extreme events have been hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws than averages. In this context, the motivation for this paper is twofold. Firstly, we recall the basic principles of Extreme Value Theory, which is used on a regular basis in finance and hydrology but does not yet enjoy the same success in climate studies. More precisely, the theoretical distributions of maxima and large peaks are recalled. The parameters of such distributions are estimated with the maximum likelihood estimation procedure, which offers the flexibility to take explanatory variables into account in our analysis. Secondly, we detail three case studies to show that this theory can provide a solid statistical foundation, especially when assessing the uncertainty associated with extreme events in a wide range of applications linked to the study of our climate. To cite this article: P. Naveau et al., C. R. Geoscience 337 (2005).
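A minimal block-maxima analysis of the kind described, fitting the GEV to annual maxima by maximum likelihood and reading off a return level, might look like this. The daily series is synthetic, and note that `scipy.stats.genextreme` parameterizes the shape as c = -ξ relative to the usual EVT convention.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
# Toy climate series: 50 years of daily values. By the extremal types
# theorem the annual maxima approach a GEV, so we fit the block maxima.
daily = rng.gumbel(loc=20.0, scale=5.0, size=(50, 365))
annual_max = daily.max(axis=1)

shape, loc, scale = genextreme.fit(annual_max)       # maximum likelihood fit
ret_level_100 = genextreme.ppf(1 - 1 / 100, shape, loc, scale)

# The 100-year return level sits well above the mean annual maximum.
print(ret_level_100 > annual_max.mean())
```

Covariate-dependent parameters, as advocated in the paper, replace the constant `loc`/`scale` with regression functions inside the same likelihood.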
“Magnitude-based Inference”: A Statistical Review
Welsh, Alan H.; Knight, Emma J.
2015-01-01
Purpose: We consider “magnitude-based inference” and its interpretation by examining in detail its use in the problem of comparing two means. Methods: We extract from the spreadsheets, which are provided to users of the analysis (http://www.sportsci.org/), a precise description of how “magnitude-based inference” is implemented. We compare the implemented version of the method with general descriptions of it and interpret the method in familiar statistical terms. Results and Conclusions: We show that “magnitude-based inference” is not a progressive improvement on modern statistics. The additional probabilities introduced are not directly related to the confidence interval but, rather, are interpretable either as P values for two different nonstandard tests (for different null hypotheses) or as approximate Bayesian calculations, which also lead to a type of test. We also discuss sample size calculations associated with “magnitude-based inference” and show that the substantial reduction in sample sizes claimed for the method (30% of the sample size obtained from standard frequentist calculations) is not justifiable, so the sample size calculations should not be used. Rather than using “magnitude-based inference,” a better solution is to be realistic about the limitations of the data and use either confidence intervals or a fully Bayesian analysis. PMID:25051387
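The standard analysis the authors recommend instead, a confidence interval for the difference of two means, can be sketched as a Welch interval; the data and group labels below are illustrative, not from the reviewed spreadsheets.

```python
import numpy as np
from scipy import stats

def welch_ci(x, y, conf=0.95):
    """Confidence interval for the difference in means (Welch), the
    standard analysis that 'magnitude-based inference' repackages."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    d = x.mean() - y.mean()
    vx, vy = x.var(ddof=1) / len(x), y.var(ddof=1) / len(y)
    se = np.sqrt(vx + vy)
    # Welch-Satterthwaite degrees of freedom
    df = (vx + vy) ** 2 / (vx**2 / (len(x) - 1) + vy**2 / (len(y) - 1))
    t = stats.t.ppf(0.5 + conf / 2, df)
    return d - t * se, d + t * se

rng = np.random.default_rng(4)
x = rng.normal(10.0, 2.0, 30)   # hypothetical treatment group
y = rng.normal(9.0, 2.0, 30)    # hypothetical control group
lo, hi = welch_ci(x, y)
print(lo < hi)  # one interval, not a trichotomous 'inference'
```

Reporting `(lo, hi)` directly conveys both magnitude and uncertainty, without the extra nonstandard probabilities the paper criticizes.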
Large-scale gene function analysis with the PANTHER classification system.
Mi, Huaiyu; Muruganujan, Anushya; Casagrande, John T; Thomas, Paul D
2013-08-01
The PANTHER (protein annotation through evolutionary relationship) classification system (http://www.pantherdb.org/) is a comprehensive system that combines gene function, ontology, pathways and statistical analysis tools that enable biologists to analyze large-scale, genome-wide data from sequencing, proteomics or gene expression experiments. The system is built with 82 complete genomes organized into gene families and subfamilies, and their evolutionary relationships are captured in phylogenetic trees, multiple sequence alignments and statistical models (hidden Markov models or HMMs). Genes are classified according to their function in several different ways: families and subfamilies are annotated with ontology terms (Gene Ontology (GO) and PANTHER protein class), and sequences are assigned to PANTHER pathways. The PANTHER website includes a suite of tools that enable users to browse and query gene functions, and to analyze large-scale experimental data with a number of statistical tests. It is widely used by bench scientists, bioinformaticians, computer scientists and systems biologists. In the 2013 release of PANTHER (v.8.0), in addition to an update of the data content, we redesigned the website interface to improve both user experience and the system's analytical capability. This protocol provides a detailed description of how to analyze genome-wide experimental data with the PANTHER classification system.
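The kind of overrepresentation test applied to such gene lists can be illustrated with a Fisher's exact test on a 2x2 contingency table. The counts below are illustrative, and PANTHER's actual reference sets, test menu, and multiple-testing corrections differ.

```python
from scipy.stats import fisher_exact

# Is an annotation term enriched in an uploaded gene list versus the
# genome-wide background? (Illustrative counts; the list is assumed to
# be a subset of the reference genome.)
list_with_term, list_size = 30, 200          # uploaded list
genome_with_term, genome_size = 400, 20000   # reference background

table = [
    [list_with_term, list_size - list_with_term],
    [genome_with_term - list_with_term,
     (genome_size - list_size) - (genome_with_term - list_with_term)],
]
odds, p = fisher_exact(table, alternative="greater")
print(p < 0.05)  # 15% vs 2% prevalence: clearly enriched in this toy case
```

In practice one such test is run per term, which is why the result must then pass through a multiple-comparison correction before being reported.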
Effect of Low-Dose MDCT and Iterative Reconstruction on Trabecular Bone Microstructure Assessment.
Kopp, Felix K; Holzapfel, Konstantin; Baum, Thomas; Nasirudin, Radin A; Mei, Kai; Garcia, Eduardo G; Burgkart, Rainer; Rummeny, Ernst J; Kirschke, Jan S; Noël, Peter B
2016-01-01
We investigated the effects of low-dose multi detector computed tomography (MDCT) in combination with statistical iterative reconstruction algorithms on trabecular bone microstructure parameters. Twelve donated vertebrae were scanned with the routine radiation exposure used in our department (standard-dose) and a low-dose protocol. Reconstructions were performed with filtered backprojection (FBP) and maximum-likelihood based statistical iterative reconstruction (SIR). Trabecular bone microstructure parameters were assessed and statistically compared for each reconstruction. Moreover, fracture loads of the vertebrae were biomechanically determined and correlated to the assessed microstructure parameters. Trabecular bone microstructure parameters based on low-dose MDCT and SIR significantly correlated with vertebral bone strength. There was no significant difference between microstructure parameters calculated on low-dose SIR and standard-dose FBP images. However, the results revealed a strong dependency on the regularization strength applied during SIR. It was observed that stronger regularization might corrupt the microstructure analysis, because the trabecular structure is a very small detail that might get lost during the regularization process. As a consequence, the introduction of SIR for trabecular bone microstructure analysis requires a specific optimization of the regularization parameters. Moreover, in comparison to other approaches, superior noise-resolution trade-offs can be found with the proposed methods.
Size of the Dynamic Bead in Polymers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agapov, Alexander L; Sokolov, Alexei P
2010-01-01
The presented analysis of neutron, mechanical, and MD simulation data available in the literature demonstrates that the dynamic bead size (the smallest subchain that still exhibits Rouse-like dynamics) in most polymers is significantly larger than the traditionally defined Kuhn segment. Moreover, our analysis emphasizes that even the static bead size (e.g., chain statistics) disagrees with the Kuhn segment length. We demonstrate that the deficiency of the Kuhn segment definition stems from the assumption of a chain being completely extended inside a single bead. The analysis suggests that representation of a real polymer chain by the bead-and-spring model with a single parameter C cannot be correct. One needs more parameters to reflect correctly the details of the chain structure in the bead-and-spring model.
Lack, N
2001-08-01
The introduction of the modified data set for quality assurance in obstetrics (formerly the perinatal survey) in Lower Saxony and Bavaria as early as 1999 created an urgent need for a correspondingly new statistical analysis of the revised data. The general outline of a new data reporting concept was originally presented by the Bavarian Commission for Perinatology and Neonatology at the Munich Perinatal Conference in November 1997. These ideas are germinal to the content and layout of the new quality report for obstetrics, currently in its nationwide harmonisation phase coordinated by the federal office for quality assurance in hospital care. A flexible, modular, database-oriented analysis tool developed in Bavaria is now in its second year of successful operation. The functionalities of this system are described in detail.
Wavelet Transform Based Higher Order Statistical Analysis of Wind and Wave Time Histories
NASA Astrophysics Data System (ADS)
Habib Huseni, Gulamhusenwala; Balaji, Ramakrishnan
2017-10-01
Wind blowing on the surface of the ocean imparts the energy to generate waves, and understanding wind-wave interactions is essential for an oceanographer. This study involves higher order spectral analyses, via continuous wavelet transform, of wind speed and significant wave height time histories extracted from the European Centre for Medium-Range Weather Forecasts database at an offshore location off the Mumbai coast. The time histories were divided by season (pre-monsoon, monsoon, post-monsoon, and winter) and the analyses were carried out on the individual data sets to assess the effect of the various seasons on the wind-wave interactions. The analysis revealed the frequency coupling of wind speeds and wave heights across the various seasons. The details of the data, the analysis technique, and the results are presented in this paper.
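A continuous wavelet transform of the sort used here can be sketched with a Morlet mother wavelet and direct convolution; this is a minimal numpy implementation on a synthetic signal, not the authors' toolchain or data.

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform with a Morlet mother wavelet,
    computed by direct convolution (a minimal sketch; production work
    would use a dedicated wavelet library)."""
    x = np.asarray(x, float)
    out = np.empty((len(scales), len(x)), dtype=complex)
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        psi /= np.sqrt(s)                       # L2 normalization
        out[i] = np.convolve(x, np.conj(psi)[::-1], mode="same")
    return out

# A 20-sample oscillation lights up near the matching Morlet scale
# (~ period * w0 / (2*pi) ≈ 19 for w0 = 6).
n = 512
x = np.sin(2 * np.pi * np.arange(n) / 20.0)
scales = np.arange(1, 40)
power = np.abs(morlet_cwt(x, scales)) ** 2
best = scales[np.argmax(power.mean(axis=1))]
print(best)
```

Cross-wavelet and higher-order spectral quantities are then built from products of such transforms of the wind and wave series at matched scales.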
Web-TCGA: an online platform for integrated analysis of molecular cancer data sets.
Deng, Mario; Brägelmann, Johannes; Schultze, Joachim L; Perner, Sven
2016-02-06
The Cancer Genome Atlas (TCGA) is a pool of molecular data sets publicly accessible and freely available to cancer researchers anywhere around the world. However, widespread use is limited, since advanced knowledge of statistics and statistical software is required. In order to improve accessibility we created Web-TCGA, a web-based, freely accessible online tool, which can also be run in a private instance, for integrated analysis of molecular cancer data sets provided by TCGA. In contrast to already available tools, Web-TCGA utilizes different methods for analysis and visualization of TCGA data, allowing users to generate global molecular profiles across different cancer entities simultaneously. In addition to global molecular profiles, Web-TCGA offers highly detailed gene- and tumor-entity-centric analysis by providing interactive tables and views. As a supplement to other available tools, such as cBioPortal (Sci Signal 6:pl1, 2013; Cancer Discov 2:401-4, 2012), Web-TCGA offers an analysis service, which does not require any installation or configuration, for molecular data sets available at the TCGA. Individual processing requests (queries) are generated by the user for mutation, methylation, expression and copy number variation (CNV) analyses. The user can focus analyses on results from single genes and cancer entities or perform a global analysis (multiple cancer entities and genes simultaneously).
Daepp, Madeleine I. G.; Hamilton, Marcus J.; West, Geoffrey B.; Bettencourt, Luís M. A.
2015-01-01
The firm is a fundamental economic unit of contemporary human societies. Studies on the general quantitative and statistical character of firms have produced mixed results regarding their lifespans and mortality. We examine a comprehensive database of more than 25 000 publicly traded North American companies, from 1950 to 2009, to derive the statistics of firm lifespans. Based on detailed survival analysis, we show that the mortality of publicly traded companies manifests an approximately constant hazard rate over long periods of observation. This regularity indicates that mortality rates are independent of a company's age. We show that the typical half-life of a publicly traded company is about a decade, regardless of business sector. Our results shed new light on the dynamics of births and deaths of publicly traded companies and identify some of the necessary ingredients of a general theory of firms. PMID:25833247
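The reported regularity, a constant hazard rate with a roughly ten-year half-life, corresponds to exponential lifetimes, for which the maximum likelihood hazard estimate is simply events divided by total exposure time. The sketch below uses synthetic, right-censored lifetimes with illustrative rates, not the study's company data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Constant-hazard (exponential) lifetimes with right censoring, as for
# firms still alive at the end of observation. Rates are illustrative.
true_rate = np.log(2) / 10.0                 # half-life of 10 years
life = rng.exponential(1 / true_rate, 5000)
observed = np.minimum(life, 60.0)            # 60-year study window
died = life <= 60.0

# MLE for a constant hazard: events divided by total exposure time.
rate_hat = died.sum() / observed.sum()
half_life = np.log(2) / rate_hat
print(round(half_life, 1))  # close to the 10-year input half-life
```

An age-independent hazard is exactly what makes the half-life a meaningful single summary: under it, survival decays geometrically regardless of when a firm was founded.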
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, J.; Jones, G.L.
1996-01-01
Recent advances in hardware and software have given the interpreter and engineer new ways to view 3D seismic data and well bore information. Recent papers have also highlighted the use of various statistics and seismic attributes. By combining new 3D rendering technologies with recent trends in seismic analysis, the interpreter can improve the structural and stratigraphic resolution of hydrocarbon reservoirs. This paper gives several examples using 3D visualization to better define both the structural and stratigraphic aspects of several different structural types from around the world. Statistics, 3D visualization techniques and rapid animation are used to show complex faulting and detailed channel systems. These systems would be difficult to map using either 2D or 3D data with conventional interpretation techniques.
Sugita, Mitsuro; Weatherbee, Andrew; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex
2016-07-01
The probability density function (PDF) of light scattering intensity can be used to characterize the scattering medium. We have recently shown that in optical coherence tomography (OCT), a PDF formalism can be sensitive to the number of scatterers in the probed scattering volume and can be represented by the K-distribution, a functional descriptor for non-Gaussian scattering statistics. Expanding on this initial finding, here we examine polystyrene microsphere phantoms with different sphere sizes and concentrations, and also human skin and fingernail in vivo. It is demonstrated that the K-distribution offers an accurate representation for the measured OCT PDFs. The behavior of the shape parameter of K-distribution that best fits the OCT scattering results is investigated in detail, and the applicability of this methodology for biological tissue characterization is demonstrated and discussed.
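The K-distribution can be understood through its compound representation: exponentially distributed speckle intensity whose local mean is itself gamma-distributed, with the gamma shape parameter tracking the effective number of scatterers. A minimal sketch of that representation follows (the shape and mean values are arbitrary choices for illustration, not the authors' analysis code):

```python
import random

def sample_k_intensity(alpha, mean_intensity, n, rng):
    """K-distributed intensities via the compound representation:
    exponential speckle whose local mean is gamma-distributed
    (shape alpha controls the effective number of scatterers)."""
    out = []
    for _ in range(n):
        local_mean = rng.gammavariate(alpha, mean_intensity / alpha)
        out.append(rng.expovariate(1.0 / local_mean))
    return out

rng = random.Random(1)
alpha = 3.0
samples = sample_k_intensity(alpha, 1.0, 100_000, rng)
m1 = sum(samples) / len(samples)
m2 = sum(x * x for x in samples) / len(samples)
# Normalized second moment of the K-distribution: 2 * (1 + 1/alpha),
# approaching 2 (fully developed speckle) as alpha grows large
print(round(m2 / m1**2, 2))
```

The normalized second moment is one simple check that fitted shape parameters behave as expected before moving to full PDF fits.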
Using the MCNP Taylor series perturbation feature (efficiently) for shielding problems
NASA Astrophysics Data System (ADS)
Favorite, Jeffrey
2017-09-01
The Taylor series or differential operator perturbation method, implemented in MCNP and invoked using the PERT card, can be used for efficient parameter studies in shielding problems. This paper shows how only two PERT cards are needed to generate an entire parameter study, including statistical uncertainty estimates (an additional three PERT cards can be used to give exact statistical uncertainties). One realistic example problem involves a detailed helium-3 neutron detector model and its efficiency as a function of the density of its high-density polyethylene moderator. The MCNP differential operator perturbation capability is extremely accurate for this problem. A second problem involves the density of the polyethylene reflector of the BeRP ball and is an example of first-order sensitivity analysis using the PERT capability. A third problem is an analytic verification of the PERT capability.
Basic biostatistics for post-graduate students
Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.
2012-01-01
Statistical methods are important to draw valid conclusions from the obtained data. This article provides background information on fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measures of central tendency and variation, and basic tests that are useful for the analysis of different types of observations. A few topics, such as the normal distribution, calculation of sample size, level of significance, the null hypothesis, indices of variability, and different tests, are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to classify the distribution of data and apply the proper test. Information is also given regarding various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academia or industry. PMID:23087501
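One of the topics the article covers, sample-size calculation, can be sketched with the standard normal-approximation formula for comparing two means. The z-values below are hardcoded for two-sided α = 0.05 and 80% power; this is a textbook illustration, not code from the article:

```python
import math

def sample_size_two_means(sigma, delta, z_alpha=1.96, z_beta=0.8416):
    """Per-group sample size for comparing two means:
    n = 2 * (z_alpha + z_beta)^2 * sigma^2 / delta^2,
    where sigma is the common SD and delta the difference to detect.
    Defaults: z_alpha = 1.96 (two-sided alpha = 0.05),
              z_beta  = 0.8416 (80% power)."""
    n = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
    return math.ceil(n)

# Detect a 5-unit difference with SD 10: 63 subjects per group
print(sample_size_two_means(sigma=10, delta=5))  # → 63
```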
Database Performance Monitoring for the Photovoltaic Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Katherine A.
The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML-formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
Hudson-Shore, Michelle
2016-12-01
The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2015 indicate that the Home Office were correct in recommending that caution should be exercised when interpreting the 2014 data as an apparent decline in animal experiments. The 2015 report shows that, as the changes to the format of the annual statistics have become more familiar and less problematic, there has been a re-emergence of the upward trend in animal research and testing in Great Britain. The 2015 statistics report an increase in animal procedures (up to 4,142,631) and in the number of animals used (up to 4,069,349). This represents 1% more than the totals in 2013, and a 7% increase on the procedures reported in 2014. This paper details an analysis of these most recent statistics, providing information on overall animal use and highlighting specific issues associated with genetically-altered animals, dogs and primates. It also reflects on areas of the new format that have previously been highlighted as being problematic, and concludes with a discussion about the use of animals in regulatory research and testing, and how there are significant missed opportunities for replacing some of the animal-based tests in this area. 2016 FRAME.
The chi-square test of independence.
McHugh, Mary L
2013-01-01
The Chi-square statistic is a non-parametric (distribution free) tool designed to analyze group differences when the dependent variable is measured at a nominal level. Like all non-parametric statistics, the Chi-square is robust with respect to the distribution of the data. Specifically, it does not require equality of variances among the study groups or homoscedasticity in the data. It permits evaluation of both dichotomous independent variables, and of multiple group studies. Unlike many other non-parametric and some parametric statistics, the calculations needed to compute the Chi-square provide considerable information about how each of the groups performed in the study. This richness of detail allows the researcher to understand the results and thus to derive more detailed information from this statistic than from many others. The Chi-square is a significance statistic, and should be followed with a strength statistic. Cramer's V is the most common strength test used when a significant Chi-square result has been obtained. Advantages of the Chi-square include its robustness with respect to distribution of the data, its ease of computation, the detailed information that can be derived from the test, its use in studies for which parametric assumptions cannot be met, and its flexibility in handling data from both two group and multiple group studies. Limitations include its sample size requirements, difficulty of interpretation when there are large numbers of categories (20 or more) in the independent or dependent variables, and the tendency of Cramer's V to produce relatively low correlation measures, even for highly significant results.
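The two statistics the abstract pairs, Pearson's chi-square followed by Cramer's V as the strength measure, can be computed directly. A small self-contained sketch (the contingency table is hypothetical, not data from the article):

```python
import math

def chi_square_statistic(table):
    """Pearson chi-square for an r x c contingency table of counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

def cramers_v(table):
    """Cramer's V: strength of association to report alongside a
    significant chi-square result, as the abstract recommends."""
    chi2 = chi_square_statistic(table)
    n = sum(sum(row) for row in table)
    k = min(len(table), len(table[0])) - 1
    return math.sqrt(chi2 / (n * k))

# Hypothetical 2 x 2 table: outcome (rows) by group (columns)
table = [[30, 10],
         [20, 40]]
print(round(chi_square_statistic(table), 2))  # → 16.67
print(round(cramers_v(table), 2))             # → 0.41
```

The chi-square value would then be compared against the chi-square distribution with (r-1)(c-1) degrees of freedom for significance.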
NASA Astrophysics Data System (ADS)
Rich, Grayson Currie
The COHERENT Collaboration has produced the first-ever observation, with a significance of 6.7σ, of a process consistent with coherent, elastic neutrino-nucleus scattering (CEνNS), first predicted and described by D.Z. Freedman in 1974. The physics of the CEνNS process is presented along with its relationship to future measurements in the arenas of nuclear physics, fundamental particle physics, and astroparticle physics, where the newly-observed interaction presents a viable tool for investigations into numerous outstanding questions about the nature of the universe. To enable the CEνNS observation with a 14.6-kg CsI[Na] detector, new measurements of the response of CsI[Na] to low-energy nuclear recoils, which is the only mechanism by which CEνNS is detectable, were carried out at Triangle Universities Nuclear Laboratory; these measurements are detailed and an effective nuclear-recoil quenching factor of 8.78 ± 1.66% is established for CsI[Na] in the recoil-energy range of 5-30 keV, based on new and literature data. Following separate analyses of the CEνNS-search data by groups at the University of Chicago and the Moscow Engineering and Physics Institute, information from simulations, calculations, and ancillary measurements was used to inform statistical analyses of the collected data. Based on input from the Chicago analysis, the number of CEνNS events expected from the Standard Model is 173 ± 48; interpretation as a simple counting experiment finds 136 ± 31 CEνNS counts in the data, while a two-dimensional, profile likelihood fit yields 134 ± 22 CEνNS counts. Details of the simulations, calculations, and supporting measurements are discussed, in addition to the statistical procedures. Finally, potential improvements to the CsI[Na]-based CEνNS measurement are presented along with future possibilities for the COHERENT Collaboration, including new CEνNS detectors and measurement of the neutrino-induced neutron spallation process.
NASA Technical Reports Server (NTRS)
Bommier, V.; Leroy, J. L.; Sahal-Brechot, S.
1985-01-01
The Hanle effect method for magnetic field vector diagnostics has now provided results on the magnetic field strength and direction in quiescent prominences, from linear polarization measurements in the He I D3 line, performed at the Pic-du-Midi and at Sacramento Peak. However, there is an inescapable ambiguity in the field vector determination: each polarization measurement provides two field vector solutions symmetrical with respect to the line-of-sight. A statistical analysis capable of solving this ambiguity was applied to the large sample of prominences observed at the Pic-du-Midi (Leroy et al., 1984); the same method of analysis applied to the prominences observed at Sacramento Peak (Athay et al., 1983) provides results in agreement on the most probable magnetic structure of prominences; these results are detailed. The statistical results were confirmed on favorable individual cases: for 15 prominences observed at Pic-du-Midi, the two field vectors point to the same side of the prominence, and the alpha angles are large enough with respect to the measurement and interpretation inaccuracies, so that the field polarity is derived without any ambiguity.
Federal, State and Local Transportation Financial Statistics : Fiscal Years 1982-1992
DOT National Transportation Integrated Search
1995-07-01
The Federal, State and Local Transportation Financial Statistics report is the latest in a series that identifies and details transportation-related revenues and expenditures by mode and government jurisdiction for fiscal years 1982 through 1992. The...
Minnesota forest statistics, 1990.
Patrick D. Miles; Chung M. Chen
1992-01-01
The fifth inventory of Minnesota's forests reports 51.0 million acres of land, of which 16.7 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
[Big data approaches in psychiatry: examples in depression research].
Bzdok, D; Karrer, T M; Habel, U; Schneider, F
2017-11-29
The exploration and treatment of depression are complicated by heterogeneous etiological mechanisms and various comorbidities. With the growing trend towards big data in psychiatry, research and therapy can increasingly target the individual patient. This novel objective requires special methods of analysis. The possibilities and challenges of applying big data approaches to depression are examined in closer detail. Examples are given to illustrate the possibilities of big data approaches in depression research. Modern machine learning methods are compared with traditional statistical methods in terms of their potential in applications to depression. Big data approaches are particularly suited to the analysis of detailed observational data, the prediction of single data points or several clinical variables, and the identification of endophenotypes. A current challenge lies in the transfer of results into the clinical treatment of patients with depression. Big data approaches enable biological subtypes of depression to be identified and predictions to be made in individual patients. They have enormous potential for prevention, early diagnosis, treatment choice, and prognosis of depression, as well as for treatment development.
Revealing the structural nature of the Cd isotopes
NASA Astrophysics Data System (ADS)
Garrett, P. E.; Diaz Varela, A.; Green, K. L.; Jamieson, D. S.; Jigmeddorj, B.; Wood, J. L.; Yates, S. W.
2015-10-01
The even-even Cd isotopes have provided fertile ground for the investigation of collectivity in nuclei. Soon after the development of the Bohr model, the stable Cd isotopes were identified as nearly harmonic vibrators based on their excitation energy patterns. The measurements of enhanced B(E2) values appeared to support this interpretation. Shape co-existing rotational-like intruder bands were discovered, and mixing between the configurations was invoked to explain the deviation of the decay pattern of multiphonon vibrational states. Very recently, a detailed analysis of the low-lying levels of 110Cd, combining results of the (n,n'γ) reaction and high-statistics β decay, provided strong evidence that the mixing between configurations is weak, except for the ground-state band and "Kπ = 0+" intruder band. The analysis of the levels in 110Cd has now been extended to 3 MeV, and combined with data for 112Cd and previous Coulomb excitation data for 114Cd, enables a detailed map of the E2 collectivity in these nuclei, demanding a complete re-interpretation of the structure of the stable Cd isotopes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guntur, Srinivas; Jonkman, Jason; Sievers, Ryan
This paper presents validation and code-to-code verification of the latest version of the U.S. Department of Energy, National Renewable Energy Laboratory wind turbine aeroelastic engineering simulation tool, FAST v8. A set of 1,141 test cases, for which experimental data from a Siemens 2.3 MW machine have been made available and were in accordance with the International Electrotechnical Commission 61400-13 guidelines, were identified. These conditions were simulated using FAST as well as the Siemens in-house aeroelastic code, BHawC. This paper presents a detailed analysis comparing results from FAST with those from BHawC as well as experimental measurements, using statistics including the means and the standard deviations along with the power spectral densities of select turbine parameters and loads. Results indicate a good agreement among the predictions using FAST, BHawC, and experimental measurements. These agreements are discussed in detail, along with some comments regarding the differences seen in these comparisons relative to the inherent uncertainties in such a model-based analysis.
Forensic genetic analysis of bio-geographical ancestry.
Phillips, Chris
2015-09-01
With the great strides made in the last ten years in the understanding of human population variation and the detailed characterization of the genome, it is now possible to identify sets of ancestry informative markers suitable for relatively small-scale PCR-based assays and use them to analyze the ancestry of an individual from forensic DNA. This review outlines some of the current understanding of past human population structure and how it may have influenced the complex distribution of contemporary human diversity. A simplified description of human diversity can provide a suitable basis for choosing the best ancestry-informative markers, which is important given the constraints of multiplex sizes in forensic DNA tests. It is also important to decide the level of geographic resolution that is realistic to ensure the balance between informativeness and an over-simplification of complex human diversity patterns. A detailed comparison is made of the most informative ancestry markers suitable for forensic use and assessments are made of the data analysis regimes that can provide statistical inferences of a DNA donor's bio-geographical ancestry. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
From Constraints to Resolution Rules Part II : chains, braids, confluence and T&E
NASA Astrophysics Data System (ADS)
Berthier, Denis
In this Part II, we apply the general theory developed in Part I to a detailed analysis of the Constraint Satisfaction Problem (CSP). We show how specific types of resolution rules can be defined. In particular, we introduce the general notions of a chain and a braid. As in Part I, these notions are illustrated in detail with the Sudoku example - a problem known to be NP-complete and which is therefore typical of a broad class of hard problems. For Sudoku, we also show how far one can go in "approximating" a CSP with a resolution theory and we give an empirical statistical analysis of how the various puzzles, corresponding to different sets of entries, can be classified along a natural scale of complexity. For any CSP, we also prove the confluence property of some Resolution Theories based on braids and we show how it can be used to define different resolution strategies. Finally, we prove that, in any CSP, braids have the same solving capacity as Trial-and-Error (T&E) with no guessing, and we comment on this result in the Sudoku case.
Probabilistic atlas and geometric variability estimation to drive tissue segmentation.
Xu, Hao; Thirion, Bertrand; Allassonnière, Stéphanie
2014-09-10
Computerized anatomical atlases play an important role in medical image analysis. While an atlas usually refers to a standard or mean image, also called a template, that presumably represents a given population well, a template alone is not enough to characterize the observed population in detail. A template image should be learned jointly with the geometric variability of the shapes represented in the observations; these two quantities together form the atlas of the corresponding population. The geometric variability is modeled as deformations of the template image so that it fits the observations. In this paper, we provide a detailed analysis of a new generative statistical model based on dense deformable templates that represents several tissue types observed in medical images. Our atlas contains both an estimate of the probability maps of each tissue (called a class) and the deformation metric. We use a stochastic algorithm to estimate the probabilistic atlas from a dataset. This atlas is then used in an atlas-based segmentation method to segment new images. Experiments are shown on brain T1 MRI datasets. Copyright © 2014 John Wiley & Sons, Ltd.
An analysis of numerical convergence in discrete velocity gas dynamics for internal flows
NASA Astrophysics Data System (ADS)
Sekaran, Aarthi; Varghese, Philip; Goldstein, David
2018-07-01
The Discrete Velocity Method (DVM) for solving the Boltzmann equation has significant advantages in the modeling of non-equilibrium and near-equilibrium flows as compared to other methods, in terms of reduced statistical noise, faster solutions, and the ability to handle transient flows. However, the DVM's performance for rarefied flow in complex, small-scale geometries, for instance in microelectromechanical (MEMS) devices, has yet to be studied in detail. The present study focuses on the performance of the DVM for locally large Knudsen number flows of argon around sharp corners and other sources of discontinuities in the distribution function. Our analysis details the nature of the solution for some benchmark cases and introduces the concept of solution convergence for the transport terms in the discrete velocity Boltzmann equation. The limiting effects of the velocity space discretization are also investigated and the constraints on obtaining a robust, consistent solution are derived. We propose techniques to maintain solution convergence and demonstrate the implementation of a specific strategy and its effect on the fidelity of the solution for some benchmark cases.
NASA Astrophysics Data System (ADS)
Vitali, Lina; Righini, Gaia; Piersanti, Antonio; Cremona, Giuseppe; Pace, Giandomenico; Ciancarella, Luisella
2017-12-01
Air backward trajectory calculations are commonly used in a variety of atmospheric analyses, in particular for source attribution evaluation. The accuracy of backward trajectory analysis is mainly determined by the quality and the spatial and temporal resolution of the underlying meteorological data set, especially in cases of complex terrain. This work describes a new tool for the calculation and the statistical elaboration of backward trajectories. To take advantage of the high-resolution meteorological database of the Italian national air quality model MINNI, a dedicated set of procedures was implemented under the name of M-TraCE (MINNI module for Trajectories Calculation and statistical Elaboration) to calculate and process the backward trajectories of air masses reaching a site of interest. Some outcomes from the application of the developed methodology to the Italian Network of Special Purpose Monitoring Stations are shown to assess its strengths for the meteorological characterization of air quality monitoring stations. M-TraCE has demonstrated its capabilities to provide a detailed statistical assessment of transport patterns and the region of influence of the site under investigation, which is fundamental for correctly interpreting pollutant measurements and ascertaining the official classification of the monitoring site based on meta-data information. Moreover, M-TraCE has shown its usefulness in supporting other assessments, i.e., spatial representativeness of a monitoring site, focussing specifically on the analysis of the effects due to meteorological variables.
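At its core, any backward-trajectory calculation integrates the wind field with a negative time step. A deliberately simplified 2-D explicit-Euler sketch follows; M-TraCE itself works on MINNI's full 3-D meteorological fields, and the uniform wind function here is a stand-in:

```python
def backward_trajectory(x0, y0, wind, dt, steps):
    """Trace an air parcel backward in time with explicit Euler:
    x(t - dt) = x(t) - u(x, y) * dt.  `wind` returns (u, v) in m/s;
    positions are in meters, dt in seconds."""
    path = [(x0, y0)]
    x, y = x0, y0
    for _ in range(steps):
        u, v = wind(x, y)
        x, y = x - u * dt, y - v * dt
        path.append((x, y))
    return path

# Uniform 10 m/s westerly wind: after 24 hourly steps backward,
# the parcel is located 864 km to the west of the receptor site
path = backward_trajectory(0.0, 0.0, lambda x, y: (10.0, 0.0),
                           dt=3600.0, steps=24)
print(path[-1])  # → (-864000.0, 0.0)
```

Operational tools replace the Euler step with higher-order schemes and interpolate the wind in space and time, which is where the resolution of the meteorological data set enters.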
NASA Astrophysics Data System (ADS)
Athiyamaan, V.; Mohan Ganesh, G.
2017-11-01
Self-compacting concrete is one of the special concretes that can flow and consolidate under its own weight and completely fill the formwork even in the presence of dense reinforcement, whilst maintaining its homogeneity throughout the formwork without any requirement for vibration. Researchers all over the world are developing high performance concrete by adding various fibers and admixtures in different proportions. Fibers such as glass, steel, carbon, polypropylene, and aramid improve concrete properties such as tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability [6]. The work includes a fundamental study of fiber-reinforced self-compacting concrete with admixtures: its rheological and mechanical properties, and an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organized into seven basic chapters: introduction; study of material properties and review of self-compacting concrete; overview of fiber-reinforced self-compacting concrete containing admixtures; review of design and analysis of experiments as a statistical approach; summary of existing works on FRSCC and statistical modeling; literature review; and conclusion. It is essential to know the recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber-reinforced concrete, and SCC in order to carry out effective research on fiber-reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain complete knowledge of polymer-based self-compacting fiber-reinforced concrete.
Trade Studies of Space Launch Architectures using Modular Probabilistic Risk Analysis
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Go, Susie
2006-01-01
A top-down risk assessment in the early phases of space exploration architecture development can provide understanding and intuition of the potential risks associated with new designs and technologies. In this approach, risk analysts draw from their past experience and the heritage of similar existing systems as a source for reliability data. This top-down approach captures the complex interactions of the risk driving parts of the integrated system without requiring detailed knowledge of the parts themselves, which is often unavailable in the early design stages. Traditional probabilistic risk analysis (PRA) technologies, however, suffer several drawbacks that limit their timely application to complex technology development programs. The most restrictive of these is a dependence on static planning scenarios, expressed through fault and event trees. Fault trees incorporating comprehensive mission scenarios are routinely constructed for complex space systems, and several commercial software products are available for evaluating fault statistics. These static representations cannot capture the dynamic behavior of system failures without substantial modification of the initial tree. Consequently, the development of dynamic models using fault tree analysis has been an active area of research in recent years. This paper discusses the implementation and demonstration of dynamic, modular scenario modeling for integration of subsystem fault evaluation modules using the Space Architecture Failure Evaluation (SAFE) tool. SAFE is a C++ code that was originally developed to support NASA s Space Launch Initiative. It provides a flexible framework for system architecture definition and trade studies. SAFE supports extensible modeling of dynamic, time-dependent risk drivers of the system and functions at the level of fidelity for which design and failure data exists. The approach is scalable, allowing inclusion of additional information as detailed data becomes available. 
The tool performs a Monte Carlo analysis to provide statistical estimates. Example results of an architecture system reliability study are summarized for an exploration system concept using heritage data from liquid-fueled expendable Saturn V/Apollo launch vehicles.
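The Monte Carlo reliability estimation mentioned above can be sketched in a few lines. The subsystem structure and reliability numbers below are hypothetical placeholders, not SAFE's models or Saturn V/Apollo heritage data:

```python
import random

def mission_success(rng, p_engine, p_guidance, n_engines, engines_needed):
    """One Monte Carlo trial: guidance must work, and at least
    `engines_needed` of `n_engines` engines must survive (engine-out
    capability, a simple example of a dynamic failure criterion)."""
    if rng.random() > p_guidance:
        return False
    working = sum(rng.random() < p_engine for _ in range(n_engines))
    return working >= engines_needed

rng = random.Random(0)
trials = 100_000
successes = sum(
    mission_success(rng, p_engine=0.99, p_guidance=0.995,
                    n_engines=5, engines_needed=4)
    for _ in range(trials)
)
print(round(successes / trials, 3))  # system reliability estimate
```

The statistical uncertainty of such an estimate shrinks as the square root of the number of trials, which is why tools like SAFE run large trial counts.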
NASA Astrophysics Data System (ADS)
Conefrey, Theresa Catherine
Government-sponsored and private research initiatives continue to document the underrepresentation of women in the sciences. Despite policy initiatives, women's attrition rates at each stage of their scientific careers remain higher than those of their male colleagues. In order to improve retention rates, more information is needed about why many women drop out or do not succeed as well as they could. While broad sociological studies and statistical surveys offer a valuable overview of institutional practices, in-depth qualitative analyses are needed to complement these large-scale studies. The present study goes behind statistical generalizations about the situation of women in science to explore the actual experience of scientific socialization and professionalization. Beginning with one reason often cited by women who have dropped out of science, "a bad lab experience," I explore through detailed observation in a naturalistic setting what this phrase might actually mean. Using ethnographic and discourse analytic methods, I present a detailed analysis of the discourse patterns in a life sciences laboratory group at a large research university. I show how language accomplishes the work of indexing and constituting social constraints, of maintaining or undermining the hierarchical power dynamics of the laboratory, of shaping members' presentation of self, and of modeling social and professional skills required to "do science." Despite the widespread conviction among scientists that "the mind has no sex," my study details how gender marks many routine interactions in the lab, including an emphasis on competition, a reinforcement of sex-role stereotypes, and a conversational style that is in several respects more compatible with men's than women's forms of talk.
Villa-Uriol, M. C.; Berti, G.; Hose, D. R.; Marzo, A.; Chiarini, A.; Penrose, J.; Pozo, J.; Schmidt, J. G.; Singh, P.; Lycett, R.; Larrabide, I.; Frangi, A. F.
2011-01-01
Cerebral aneurysms are a multi-factorial disease with severe consequences. A core part of the European project @neurIST was the physical characterization of aneurysms to find candidate risk factors associated with aneurysm rupture. The project investigated measures based on morphological, haemodynamic and aneurysm wall structure analyses for more than 300 cases of ruptured and unruptured aneurysms, extracting descriptors suitable for statistical studies. This paper deals with the unique challenges associated with this task, and the implemented solutions. The consistency of results required by the subsequent statistical analyses, given the heterogeneous image data sources and multiple human operators, was met by a highly automated toolchain combined with training. A testimonial of the successful automation is the positive evaluation of the toolchain by over 260 clinicians during various hands-on workshops. The specification of the analyses required thorough investigations of modelling and processing choices, discussed in a detailed analysis protocol. Finally, an abstract data model governing the management of the simulation-related data provides a framework for data provenance and supports future use of data and toolchain. This is achieved by enabling the easy modification of the modelling approaches and solution details through abstract problem descriptions, removing the need of repetition of manual processing work. PMID:22670202
NASA Astrophysics Data System (ADS)
Yoo, Donghoon; Lee, Joohyun; Lee, Byeongchan; Kwon, Suyong; Koo, Junemo
2018-02-01
The Transient Hot-Wire Method (THWM) was developed to measure the absolute thermal conductivity of gases, liquids, melts, and solids with low uncertainty, and the majority of nanofluid researchers have used THWM to measure the thermal conductivity of test fluids. Several reasons have been suggested for the discrepancies in these types of measurements, including nanofluid generation, nanofluid stability, and measurement challenges. Details of the transient hot-wire method, such as the test cell size, the temperature coefficient of resistance (TCR), and the sampling number, are investigated here to improve the accuracy and consistency of measurements made by different researchers. It was observed that smaller test apparatuses are better because they delay the onset of natural convection. TCR values of a coated platinum wire were measured and statistically analyzed to reduce the uncertainty in thermal conductivity measurements. For validation, the thermal conductivity of ethylene glycol (EG) and water was measured and analyzed in the temperature range between 280 and 310 K. Furthermore, a detailed statistical analysis was conducted for these measurements, and the results confirmed the minimum number of samples required to achieve the desired resolution and precision. It is further proposed that researchers fully report the information related to their measurements so that the measurements can be validated and future inconsistent nanofluid data avoided.
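The minimum-sample-count question raised in this abstract can be sketched as a standard precision calculation: keep adding samples until the confidence-interval half-width for the mean falls below a target. This is a generic illustration, not the authors' procedure; the standard deviation and margin values below are hypothetical.

```python
import math
from scipy import stats

def min_samples(sd, margin, confidence=0.95):
    """Smallest sample count n so that the half-width of the
    confidence interval for the mean, t_crit * sd / sqrt(n),
    is at most `margin`. Iterates because the t critical value
    itself depends on n."""
    n = 2
    while True:
        t_crit = stats.t.ppf(0.5 + confidence / 2.0, n - 1)
        if t_crit * sd / math.sqrt(n) <= margin:
            return n
        n += 1

# Hypothetical numbers: repeatability sd of 2% of reading,
# target half-width of 1%:
n_required = min_samples(sd=2.0, margin=1.0)
```

Halving the allowed margin roughly quadruples the required sample count, which is why reporting the sampling number matters for reproducibility.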
C-statistic fitting routines: User's manual and reference guide
NASA Technical Reports Server (NTRS)
Nousek, John A.; Farwana, Vida
1991-01-01
The computer program discussed here reads several input files and provides a best-fit set of values for the functions supplied by the user, using either the C-statistic or the chi-squared (χ²) statistic. The program consists of one main routine and several functions and subroutines. Detailed descriptions of each function and subroutine are presented. A brief description of the C-statistic and the reason for its application is also presented.
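The C-statistic referred to here is the Cash statistic for Poisson-distributed counts, C = 2 Σ (m_i − d_i ln m_i), minimized in place of χ² when counts are low. A minimal sketch of such a fit (not the manual's actual routines; the constant-rate model and data are illustrative):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cash_statistic(model, data):
    """Cash (C) statistic for Poisson-distributed counts:
    C = 2 * sum(m_i - d_i * ln m_i). Remains valid at low counts,
    where the chi-squared approximation breaks down."""
    model = np.asarray(model, dtype=float)
    data = np.asarray(data, dtype=float)
    return 2.0 * np.sum(model - data * np.log(model))

# Toy example: fit a constant-rate model to simulated Poisson counts.
rng = np.random.default_rng(0)
data = rng.poisson(lam=5.0, size=100)

result = minimize_scalar(
    lambda rate: cash_statistic(np.full(data.size, rate), data),
    bounds=(0.1, 50.0), method="bounded")
print(result.x, data.mean())
```

For a constant-rate model the C-statistic minimum coincides with the sample mean, which makes a convenient sanity check on the fitting machinery.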
Army Air Forces Statistical Digest, 1946. First Annual Number
1947-06-01
In accordance with AAF Letter 5-5 dated 28 April 1947, the Army Air Forces Statistical Digest has been designated as the official AAF statistical yearbook... more detailed exposition. Since the Digest is designed primarily as a reference manual, the reaction of users to its contents is important in the... distributed by this Headquarters (Statistical Control Division, Office of the Air Comptroller) is hereby designated as the official AAF statistical yearbook
Advanced Gear Alloys for Ultra High Strength Applications
NASA Technical Reports Server (NTRS)
Shen, Tony; Krantz, Timothy; Sebastian, Jason
2011-01-01
Single tooth bending fatigue (STBF) test data of UHS Ferrium C61 and C64 alloys are presented in comparison with historical test data of conventional gear steels (9310 and Pyrowear 53) with comparable statistical analysis methods. Pitting and scoring tests of C61 and C64 are works in progress. Boeing statistical analysis of STBF test data for the four gear steels (C61, C64, 9310 and Pyrowear 53) indicates that the UHS grades exhibit increases in fatigue strength in the low cycle fatigue (LCF) regime. In the high cycle fatigue (HCF) regime, the UHS steels exhibit better mean fatigue strength endurance limit behavior (particularly as compared to Pyrowear 53). However, due to considerable scatter in the UHS test data, the anticipated overall benefits of the UHS grades in bending fatigue have not been fully demonstrated. Based on all the test data and on Boeing's analysis, C61 has been selected by Boeing as the gear steel for the final ERDS demonstrator test gearboxes. In terms of potential follow-up work, detailed physics-based, micromechanical analysis and modeling of the fatigue data would allow for a better understanding of the causes of the experimental scatter, and of the transition from high-stress LCF (surface-dominated) to low-stress HCF (subsurface-dominated) fatigue failure. Additional STBF test data and failure analysis work, particularly in the HCF regime and around the endurance limit stress, could allow for better statistical confidence and could reduce the observed effects of experimental test scatter. Finally, the need for further optimization of the residual compressive stress profiles of the UHS steels (resulting from carburization and peening) is noted, particularly for the case of the higher hardness C64 material.
Code of Federal Regulations, 2010 CFR
2010-01-01
... include Government-controlled corporations. Bureau of Labor Statistics (BLS) means the Bureau of Labor Statistics of the Department of Labor. Commonwealth of the Northern Mariana Islands (CNMI) means the....207. Detailed Expenditure Category (DEC) means the lowest level of expenditure shown in tabulated...
Inventory of Electric Utility Power Plants in the United States
2002-01-01
Final issue of this report. Provides detailed statistics on existing generating units operated by electric utilities as of December 31, 2000, and certain summary statistics about new generators planned for operation by electric utilities during the next 5 years.
Forest statistics for Minnesota's Prairie Unit.
Sue M. Roussopoulos
1992-01-01
The fifth inventory of Minnesota's Prairie Unit reports 19.2 million acres of land, of which 660 thousand acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
Michigan forest statistics, 1993.
Earl C. Leatherberry; John S. Spencer, Jr.
1996-01-01
The fifth forest inventory of Michigan reports 36.4 million acres of land, of which 19.3 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and biomass.
A Study of Bicycle and Passenger Car Collisions Based on Insurance Claims Data
Isaksson-Hellman, Irene
2012-01-01
In Sweden, bicycle crashes are under-reported in the official statistics, which are based on police reports. Statistics from hospital reports show that cyclists constitute the highest percentage of severely injured road users compared to other road user groups. However, hospital reports lack detailed information about the crash. To get a more comprehensive view, additional data are needed to accurately reflect the casualty situation for cyclists. An analysis based on 438 cases of bicycle and passenger car collisions is presented, using data collected from insurance claims. The most frequent crash situations are described, including where and when collisions occur and the age and gender of the involved cyclists and drivers. Information on environmental circumstances such as road status, weather and light conditions, speed limits, and traffic environment is also included. Based on the various crash events, a total of 32 different scenarios were categorized, and it was found that more than 75% were different kinds of intersection-related situations. From the data, it was concluded that factors such as estimated impact speed and age significantly influence injury severity. The insurance claims data complement the official statistics and provide a more comprehensive view of bicycle and passenger car collisions by considering all levels of crash and injury severity. The detailed descriptions of the crash situations also provide an opportunity to find countermeasures to prevent or mitigate collisions. The results provide a useful basis for, and facilitate, the work of reducing the number of bicycle and passenger car collisions with serious consequences. PMID:23169111
Research on the Hotel Image Based on the Detail Service
NASA Astrophysics Data System (ADS)
Li, Ban; Shenghua, Zheng; He, Yi
Detail service management, initially developed as a marketing program to enhance customer loyalty, has now become an important part of customer relation strategy. This paper analyzes the critical factors of detail service and their influence on the hotel image. We establish a theoretical model of the factors influencing hotel image and propose corresponding hypotheses, which we then test with statistical methods. This paper provides a foundation for further study of detail service design and planning issues.
ERIC Educational Resources Information Center
Hood, Michelle; Creed, Peter A.; Neumann, David L.
2012-01-01
We tested a model of the relationship between attitudes toward statistics and achievement based on Eccles' Expectancy Value Model (1983). Participants (n = 149; 83% female) were second-year Australian university students in a psychology statistics course (mean age = 23.36 years, SD = 7.94 years). We obtained demographic details, past performance,…
Indicators of School Crime and Safety: 2009. NCES 2010-012/NCJ 228478
ERIC Educational Resources Information Center
Dinkes, Rachel; Kemp, Jana; Baum, Katrina
2009-01-01
A joint effort by the Bureau of Justice Statistics and National Center for Education Statistics, this annual report examines crime occurring in school as well as on the way to and from school. It provides the most current detailed statistical information to inform the Nation on the nature of crime in schools. This report presents data on crime at…
NASA Technical Reports Server (NTRS)
Bauman, William H., III
2010-01-01
The 45th Weather Squadron (45 WS) Launch Weather Officers use the 12-km resolution North American Mesoscale (NAM) model (MesoNAM) text and graphical product forecasts extensively to support launch weather operations. However, the actual performance of the model at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) has not been measured objectively. In order to have tangible evidence of model performance, the 45 WS tasked the Applied Meteorology Unit to conduct a detailed statistical analysis of model output compared to observed values. The model products are provided to the 45 WS by ACTA, Inc. and include hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The objective analysis compared the MesoNAM forecast winds, temperature and dew point, as well as the changes in these parameters over time, to the observed values from the sensors in the KSC/CCAFS wind tower network. Objective statistics will give the forecasters knowledge of the model's strengths and weaknesses, which will result in improved forecasts for operations.
NASA Technical Reports Server (NTRS)
Watson, Leela R.
2011-01-01
The 45th Weather Squadron Launch Weather Officers use the 12-km resolution North American Mesoscale model (MesoNAM) forecasts to support launch weather operations. In Phase I, the performance of the model at KSC/CCAFS was measured objectively by conducting a detailed statistical analysis of model output compared to observed values. The objective analysis compared the MesoNAM forecast winds, temperature, and dew point to the observed values from the sensors in the KSC/CCAFS wind tower network. In Phase II, the AMU modified the current tool by adding an additional 15 months of model output to the database and recalculating the verification statistics. The bias, standard deviation of the bias, root mean square error, and a hypothesis test for bias were calculated to verify the performance of the model. The results indicated that accuracy decreased as the forecast progressed, that there was a diurnal signal in temperature with a cool bias during the late night and a warm bias during the afternoon, and that there was a diurnal signal in dew point temperature with a low bias during the afternoon and a high bias during the late night.
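The verification statistics named above (bias, standard deviation of the bias, RMSE, and a hypothesis test for bias) can be sketched generically; the forecast and observation numbers below are invented for illustration, not AMU data:

```python
import numpy as np

def verification_stats(forecast, observed):
    """Standard forecast verification statistics computed from
    forecast-minus-observed errors."""
    err = np.asarray(forecast, float) - np.asarray(observed, float)
    bias = err.mean()                       # mean error
    sd_bias = err.std(ddof=1)               # sample std. dev. of the error
    rmse = np.sqrt(np.mean(err ** 2))       # root mean square error
    return bias, sd_bias, rmse

fcst = np.array([20.1, 21.3, 19.8, 22.0])   # e.g. temperature forecasts
obs = np.array([20.0, 21.0, 20.5, 21.5])    # matching observations
bias, sd_bias, rmse = verification_stats(fcst, obs)

# One-sample t statistic for the hypothesis test of zero bias:
t_stat = bias / (sd_bias / np.sqrt(len(fcst)))
```

The same three numbers, stratified by forecast hour and time of day, are what expose the diurnal bias signals the abstract describes.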
Mathias, C G; Morrison, J H
1988-10-01
The overall incidence rates, numbers, and proportions of occupational skin diseases recorded in the Bureau of Labor Statistics Annual Survey of Occupational Injuries and Illnesses, from 1973 through 1984, were reviewed, and a detailed analysis of occupational skin diseases recorded in the 1984 Annual Survey was performed. Overall incidence rates and numbers of cases declined from 1973 through 1983, but increased slightly in 1984. The major industrial divisions of agriculture and manufacturing have consistently had the highest rates and numbers of cases, respectively; skin diseases have accounted for almost two thirds of all occupational illnesses within agriculture. In the 1984 Annual Survey, 11 industries were ranked in the "Top 15" for both incidence rates and numbers of cases, at the two-digit Standard Industrial Classification level. At the four-digit level for manufacturing, four industries were also ranked in the "Top 15" for both indexes. This analysis has identified industries toward which research efforts should be directed to characterize those occupational activities or exposures most responsible for these higher risks.
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Gyekenyesi, John P.
1988-01-01
The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution, as well as for calculating the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
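The maximum likelihood step for the two-parameter Weibull with a complete sample can be sketched as follows (this is not the SCARE code itself, and the simulated data are illustrative). The shape estimate solves the standard profile-likelihood equation; the scale estimate then follows in closed form:

```python
import numpy as np
from scipy.optimize import brentq

def weibull_mle(x):
    """Maximum likelihood estimates of the two-parameter Weibull
    shape m and scale sigma0 for a complete (uncensored) sample.
    The profile equation for the shape is scale-invariant, so the
    data are normalized by their maximum to avoid overflow in z**m."""
    x = np.asarray(x, float)
    z = x / x.max()
    lz = np.log(z)

    def g(m):
        zm = z ** m
        return np.sum(zm * lz) / np.sum(zm) - 1.0 / m - lz.mean()

    m = brentq(g, 0.01, 100.0)                 # shape parameter
    sigma0 = np.mean(x ** m) ** (1.0 / m)      # scale parameter
    return m, sigma0

rng = np.random.default_rng(1)
sample = rng.weibull(3.0, size=2000) * 400.0   # true shape 3, scale 400
m_hat, s_hat = weibull_mle(sample)
```

Censored samples and the confidence intervals described in the abstract require additional machinery beyond this sketch.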
Petroleum supply monthly, June 1999, with data for April 1999
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Data presented in the Petroleum Supply Monthly (PSM) describe the supply and disposition of petroleum products in the US and major US geographic regions. The data series describe production, imports and exports, inter-Petroleum Administration for Defense (PAD) District movements, and inventories by the primary suppliers of petroleum products in the US (50 States and the District of Columbia). The reporting universe includes those petroleum sectors in primary supply. Included are: petroleum refiners, motor gasoline blenders, operators of natural gas processing plants and fractionators, inter-PAD transporters, importers, and major inventory holders of petroleum products and crude oil. When aggregated, the data reported by these sectors approximately represent the consumption of petroleum products in the US. Data presented in the PSM are divided into two sections: Summary Statistics and Detailed Statistics. The tables and figures in the Summary Statistics section of the PSM present a time series of selected petroleum data on a US level. The Detailed Statistics tables of the PSM present statistics for the most current month available as well as year-to-date. 16 figs., 66 tabs.
Petroleum supply monthly, February 1999, with data for December 1998
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Data presented in the Petroleum Supply Monthly (PSM) describe the supply and disposition of petroleum products in the United States and major US geographic regions. The data series describe production, imports and exports, inter-Petroleum Administration for Defense (PAD) District movements, and inventories by the primary suppliers of petroleum products in the United States (50 States and the District of Columbia). The reporting universe includes those petroleum sectors in primary supply. Included are: petroleum refiners, motor gasoline blenders, operators of natural gas processing plants and fractionators, inter-PAD transporters, importers, and major inventory holders of petroleum products and crude oil. When aggregated, the data reported by these sectors approximately represent the consumption of petroleum products in the United States. Data presented in the PSM are divided into two sections: Summary Statistics and Detailed Statistics. The tables and figures in the Summary Statistics section of the PSM present a time series of selected petroleum data on a US level. The Detailed Statistics tables of the PSM present statistics for the most current month available as well as year-to-date. 16 figs., 66 tabs.
Suurmond, Robert; van Rhee, Henk; Hak, Tony
2017-12-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
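The overall-effect computation described above (DerSimonian-Laird τ² with the Knapp-Hartung adjustment for the confidence interval) can be sketched as follows; the effect sizes and variances are invented for illustration, and the code is not Meta-Essentials itself:

```python
import numpy as np
from scipy import stats

def dl_knapp_hartung(effects, variances, level=0.95):
    """Random-effects meta-analysis: DerSimonian-Laird estimate of the
    between-study variance tau^2, with the Knapp-Hartung adjustment
    (scaled variance, t distribution) for the CI of the overall effect."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    k = len(y)

    # DerSimonian-Laird tau^2 from Cochran's Q with fixed-effect weights.
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)

    # Random-effects pooled estimate.
    w_star = 1.0 / (v + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)

    # Knapp-Hartung: scale the variance by q and use t with k-1 dof.
    q = np.sum(w_star * (y - mu) ** 2) / (k - 1)
    se = np.sqrt(q / np.sum(w_star))
    t_crit = stats.t.ppf(0.5 + level / 2.0, k - 1)
    return mu, (mu - t_crit * se, mu + t_crit * se), tau2

effects = [0.3, 0.5, 0.2, 0.6, 0.4]          # hypothetical study effects
variances = [0.04, 0.05, 0.03, 0.06, 0.04]   # hypothetical within-study variances
mu, ci, tau2 = dl_knapp_hartung(effects, variances)
```

The Knapp-Hartung interval is typically wider than the conventional Wald interval for small numbers of studies, which is why the tool adopts it by default.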
User's Manual for RESRAD-OFFSITE Version 2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; Gnanapragasam, E.; Biwer, B. M.
2007-09-05
The RESRAD-OFFSITE code is an extension of the RESRAD (onsite) code, which has been widely used for calculating doses and risks from exposure to radioactively contaminated soils. The development of RESRAD-OFFSITE started more than 10 years ago, but new models and methodologies have been developed, tested, and incorporated since then. Some of the new models have been benchmarked against other independently developed (international) models. The databases used have also expanded to include all the radionuclides (more than 830) contained in the International Commission on Radiological Protection (ICRP) 38 database. This manual provides detailed information on the design and application of the RESRAD-OFFSITE code. It describes in detail the new models used in the code, such as the three-dimensional dispersion groundwater flow and radionuclide transport model, the Gaussian plume model for atmospheric dispersion, and the deposition model used to estimate the accumulation of radionuclides in offsite locations and in foods. Potential exposure pathways and exposure scenarios that can be modeled by the RESRAD-OFFSITE code are also discussed. A user's guide is included in Appendix A of this manual. The default parameter values and parameter distributions are presented in Appendix B, along with a discussion of the statistical distributions for probabilistic analysis. A detailed discussion of how to reduce run time, especially when conducting probabilistic (uncertainty) analysis, is presented in Appendix C of this manual.
Summary Statistics for Homemade ?Play Dough? -- Data Acquired at LLNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kallman, J S; Morales, K E; Whipple, R E
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough™-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU_D at 100 kVp to a low of about 1200 LMHU_D at 300 kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z_eff, to be near 10. LLNL prepared about 50 mL of the homemade 'Play Dough' in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images.
(A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first-order statistics'; those of the gradient image, 'second-order statistics.'
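The first- and second-order statistics described here (mean, standard deviation, and histogram entropy of the image and of its one-voxel horizontal gradient) can be sketched generically; the synthetic image below stands in for the real CT data:

```python
import numpy as np

def first_order_stats(image, bins=256):
    """Mean, standard deviation, and Shannon entropy (in bits) of the
    voxel-value histogram of an image."""
    vals = np.asarray(image, float).ravel()
    hist, _ = np.histogram(vals, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins before the log
    entropy = -np.sum(p * np.log2(p))
    return vals.mean(), vals.std(), entropy

def gradient_image(image):
    """Absolute difference between the image and itself offset by one
    voxel along the rows, as described in the text."""
    img = np.asarray(image, float)
    return np.abs(img[:, 1:] - img[:, :-1])

# Synthetic stand-in for the central-core LAC voxels (illustrative units):
rng = np.random.default_rng(2)
core = rng.normal(2700.0, 300.0, size=(64, 64))

mean1, sd1, ent1 = first_order_stats(core)                  # first order
mean2, sd2, ent2 = first_order_stats(gradient_image(core))  # second order
```

Applying the same routine to the gradient image is what turns first-order statistics into the second-order statistics of the text.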
Micro-foundations for macroeconomics: New set-up based on statistical physics
NASA Astrophysics Data System (ADS)
Yoshikawa, Hiroshi
2016-12-01
Modern macroeconomics is built on "micro foundations": optimization by micro agents such as consumers and firms is explicitly analyzed in the model. Toward this goal, the standard model presumes a "representative" consumer/firm, and analyzes its behavior in detail. However, the macroeconomy consists of some 10^7 consumers and 10^6 firms. For the purpose of analyzing such a macro system, it is meaningless to pursue micro behavior in detail. In this respect, there is no essential difference between economics and physics. The methods of statistical physics can be usefully applied to the macroeconomy, and provide Keynesian economics with correct micro-foundations.
Entry, Descent, and Landing Operations Analysis for the Mars Phoenix Lander
NASA Technical Reports Server (NTRS)
Prince, Jill L.; Desai, Prasun N.; Queen, Eric M.; Grover, Myron R.
2008-01-01
The Mars Phoenix lander was launched August 4, 2007 and remained in cruise for ten months before landing in the northern plains of Mars in May 2008. The one-month Entry, Descent, and Landing (EDL) operations phase prior to entry consisted of daily analyses, meetings, and decisions necessary to determine if trajectory correction maneuvers and environmental parameter updates to the spacecraft were required. An overview of the Phoenix EDL trajectory simulation and analysis that was performed during the EDL approach and operations phase is described in detail. The evolution of the Monte Carlo statistics and footprint ellipse during the final approach phase is also provided. The EDL operations effort accurately delivered the Phoenix lander to the desired landing region on May 25, 2008.
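A landing footprint ellipse like the one tracked during operations can be derived from Monte Carlo landing-point samples under a bivariate-Gaussian assumption; this is a generic sketch with invented dispersion numbers, not the Phoenix simulation:

```python
import numpy as np

def footprint_ellipse(lon, lat, p=0.99):
    """Semi-axes and orientation of the p-probability landing ellipse
    from Monte Carlo samples, assuming an approximately bivariate
    Gaussian dispersion of the landing points."""
    pts = np.column_stack([lon, lat])
    cov = np.cov(pts, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)     # ascending eigenvalues
    # Chi-squared quantile with 2 dof scales 1-sigma axes to probability p.
    k2 = -2.0 * np.log(1.0 - p)
    axes = np.sqrt(k2 * eigval)              # (semi-minor, semi-major)
    angle = np.arctan2(eigvec[1, -1], eigvec[0, -1])
    return axes, angle

# Illustrative dispersion (degrees), not Phoenix trajectory output:
rng = np.random.default_rng(4)
lon = rng.normal(234.0, 0.20, 5000)
lat = rng.normal(68.0, 0.05, 5000)
axes, angle = footprint_ellipse(lon, lat)
```

Re-running this on each day's Monte Carlo output is what lets the footprint's evolution be tracked through the approach phase.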
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Eric J
The ResStock analysis tool is helping states, municipalities, utilities, and manufacturers identify which home upgrades save the most energy and money. Across the country there's a vast diversity in the age, size, construction practices, installed equipment, appliances, and resident behavior of the housing stock, not to mention the range of climates. These variations have hindered the accuracy of predicting savings for existing homes. Researchers at the National Renewable Energy Laboratory (NREL) developed ResStock, a versatile tool that takes a new approach to large-scale residential energy analysis by combining large public and private data sources, statistical sampling, detailed subhourly building simulations, and high-performance computing. This combination achieves unprecedented granularity and, most importantly, accuracy in modeling the diversity of the single-family housing stock.
A statistical approach to develop a detailed soot growth model using PAH characteristics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raj, Abhijeet; Celnik, Matthew; Shirley, Raphael
A detailed PAH growth model is developed, which is solved using a kinetic Monte Carlo algorithm. The model describes the structure and growth of planar PAH molecules, and is referred to as the kinetic Monte Carlo-aromatic site (KMC-ARS) model. A detailed PAH growth mechanism based on reactions at radical sites available in the literature, and additional reactions obtained from quantum chemistry calculations, is used to model the PAH growth processes. New rates for the reactions involved in the cyclodehydrogenation process for the formation of 6-member rings on PAHs are calculated in this work based on density functional theory simulations. The KMC-ARS model is validated by comparing experimentally observed ensembles of PAHs with the computed ensembles for a C2H2 and a C6H6 flame at different heights above the burner. The motivation for this model is the development of a detailed soot particle population balance model which describes the evolution of an ensemble of soot particles based on their PAH structure. However, at present incorporating such a detailed model into a population balance is computationally unfeasible. Therefore, a simpler model, referred to as the site-counting model, has been developed, which replaces the structural information of the PAH molecules with their functional groups augmented by statistical closure expressions. This closure is obtained from the KMC-ARS model, which is used to develop correlations and statistics in different flame environments describing such PAH structural information. These correlations and statistics are implemented in the site-counting model, and results from the site-counting model and the KMC-ARS model are in good agreement. Additionally, the effect of steric hindrance in large PAH structures is investigated and correlations for sites unavailable for reaction are presented. (author)
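The kinetic Monte Carlo machinery underlying such a model can be illustrated with a minimal Gillespie-type step: choose an event with probability proportional to its rate, and advance the clock by an exponentially distributed waiting time. The two competing rates below are hypothetical, not the paper's PAH chemistry:

```python
import numpy as np

def kmc_step(rates, rng):
    """One kinetic Monte Carlo (Gillespie) step: pick an event with
    probability proportional to its rate and draw an exponential
    waiting time with mean 1/total_rate."""
    rates = np.asarray(rates, float)
    total = rates.sum()
    event = rng.choice(len(rates), p=rates / total)
    dt = rng.exponential(1.0 / total)
    return event, dt

# Toy surrogate for site-resolved growth: two competing "reactions"
# with hypothetical rate constants.
rng = np.random.default_rng(3)
rates = [2.0, 1.0]
counts = np.zeros(2, dtype=int)
t = 0.0
for _ in range(3000):
    ev, dt = kmc_step(rates, rng)
    counts[ev] += 1
    t += dt
```

In a site-resolved model the rate list is rebuilt after every step as reactions change the available sites; the fixed rates here are only to keep the sketch short.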
Statistical analysis of early failures in electromigration
NASA Astrophysics Data System (ADS)
Gall, M.; Capasso, C.; Jawarani, D.; Hernandez, R.; Kawasaki, H.; Ho, P. S.
2001-07-01
The detection of early failures in electromigration (EM) and the complicated statistical nature of this important reliability phenomenon have been difficult issues to treat in the past. A satisfactory experimental approach for the detection and the statistical analysis of early failures has not yet been established. This is mainly due to the rare occurrence of early failures and difficulties in testing of large sample populations. Furthermore, experimental data on the EM behavior as a function of varying number of failure links are scarce. In this study, a technique utilizing large interconnect arrays in conjunction with the well-known Wheatstone Bridge is presented. Three types of structures with a varying number of Ti/TiN/Al(Cu)/TiN-based interconnects were used, starting from a small unit of five lines in parallel. A serial arrangement of this unit enabled testing of interconnect arrays encompassing 480 possible failure links. In addition, a Wheatstone Bridge-type wiring using four large arrays in each device enabled simultaneous testing of 1920 interconnects. In conjunction with a statistical deconvolution to the single interconnect level, the results indicate that the electromigration failure mechanism studied here follows perfect lognormal behavior down to the four sigma level. The statistical deconvolution procedure is described in detail. Over a temperature range from 155 to 200 °C, a total of more than 75 000 interconnects were tested. None of the samples have shown an indication of early, or alternate, failure mechanisms. The activation energy of the EM mechanism studied here, namely the Cu incubation time, was determined to be Q=1.08±0.05 eV. We surmise that interface diffusion of Cu along the Al(Cu) sidewalls and along the top and bottom refractory layers, coupled with grain boundary diffusion within the interconnects, constitutes the Cu incubation mechanism.
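The statistical deconvolution to the single-interconnect level rests on the weakest-link relation for n independent, identical failure links. A sketch with illustrative lognormal parameters (not the paper's fitted values):

```python
import numpy as np
from scipy import stats

# Weakest-link relation between one failure link and an array of n
# independent, identical links: F_array(t) = 1 - (1 - F_single(t))**n.
# Lognormal parameters are illustrative: median 100 h, shape 0.3.
mu, sigma, n = np.log(100.0), 0.3, 480

t = np.linspace(20.0, 50.0, 100)            # hours
f_single = stats.lognorm.cdf(t, s=sigma, scale=np.exp(mu))
f_array = 1.0 - (1.0 - f_single) ** n

# Statistical deconvolution back to the single-interconnect level:
f_recovered = 1.0 - (1.0 - f_array) ** (1.0 / n)
```

The forward relation shows why arrays are so sensitive to the early-failure tail: with 480 links, a single-link failure probability of about 1% already drives the array failure probability above 99%, which is what makes large-array testing an efficient probe of rare early failures.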
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ade, P. A. R.; Aghanim, N.; Akrami, Y.
In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The "Cold Spot" is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.
Quantitative knowledge acquisition for expert systems
NASA Technical Reports Server (NTRS)
Belkin, Brenda L.; Stengel, Robert F.
1991-01-01
A common problem in the design of expert systems is the definition of rules from data obtained in system operation or simulation. While it is relatively easy to collect data and to log the comments of human operators engaged in experiments, generalizing such information to a set of rules has not previously been a direct task. A statistical method is presented for generating rule bases from numerical data, motivated by an example based on aircraft navigation with multiple sensors. The specific objective is to design an expert system that selects a satisfactory suite of measurements from a dissimilar, redundant set, given an arbitrary navigation geometry and possible sensor failures. The systematic development of a Navigation Sensor Management (NSM) expert system from Kalman filter covariance data is described. The method invokes two statistical techniques: Analysis of Variance (ANOVA) and the ID3 algorithm. The ANOVA technique indicates whether variations of problem parameters give statistically different covariance results, and the ID3 algorithm identifies the relationships between the problem parameters using probabilistic knowledge extracted from a simulation example set. Both are detailed.
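The ID3 step selects, at each node, the attribute with the largest information gain (entropy reduction). A minimal sketch with hypothetical sensor-selection examples, not the paper's Kalman-filter data:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, attr, label="label"):
    """ID3 splitting criterion: entropy reduction from partitioning
    the examples on one attribute."""
    n = len(examples)
    base = entropy([e[label] for e in examples])
    remainder = 0.0
    for value in {e[attr] for e in examples}:
        subset = [e[label] for e in examples if e[attr] == value]
        remainder += len(subset) / n * entropy(subset)
    return base - remainder

# Hypothetical examples (attribute names invented for illustration):
examples = [
    {"geometry": "A", "failed": "no",  "label": "use"},
    {"geometry": "A", "failed": "yes", "label": "drop"},
    {"geometry": "B", "failed": "no",  "label": "use"},
    {"geometry": "B", "failed": "yes", "label": "drop"},
]
# "failed" perfectly predicts the label; "geometry" carries no information.
print(information_gain(examples, "failed"), information_gain(examples, "geometry"))
# prints 1.0 0.0
```

ID3 would split on "failed" first, and recursing on each subset yields the decision-tree rule base the abstract describes.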
Karl Pearson and eugenics: personal opinions and scientific rigor.
Delzell, Darcie A P; Poliak, Cathy D
2013-09-01
The influence of personal opinions and biases on scientific conclusions is a threat to the advancement of knowledge. Expertise and experience do not render one immune to this temptation. In this work, one of the founding fathers of statistics, Karl Pearson, is used as an illustration of how even the most talented among us can produce misleading results when inferences are made without caution or reference to potential bias and other analysis limitations. A study performed by Pearson on British Jewish schoolchildren is examined in light of ethical and professional statistical practice. The methodology used and inferences made by Pearson and his coauthor are sometimes questionable and offer insight into how Pearson's support of eugenics and his own British nationalism could have potentially influenced his often careless and far-fetched inferences. A short background into Pearson's work and beliefs is provided, along with an in-depth examination of the authors' overall experimental design and statistical practices. In addition, portions of the study regarding intelligence and tuberculosis are discussed in more detail, along with historical reactions to their work.
Statistics of the cosmic Mach number from numerical simulations of a cold dark matter universe
NASA Technical Reports Server (NTRS)
Suto, Yasushi; Cen, Renyue; Ostriker, Jeremiah P.
1992-01-01
Results are presented of an analysis of the cosmic Mach number, M, the ratio of the streaming velocity, v, to the random velocity dispersion, sigma, of galaxies in a given patch of the universe, which was performed on the basis of hydrodynamical simulations of the cold dark matter scenario. Galaxy formation is modeled by application of detailed physical processes rather than by the ad hoc assumption of 'bias' between dark matter and galaxy fluctuations. The correlation between M and sigma is found to be very weak for both components. No evidence is found for a physical 'velocity bias' in the quantities which appear in the definition of M. Standard cold-dark-matter-dominated universes are in conflict, at a statistically significant level, with the available observations, in that they predict a Mach number considerably lower than is observed.
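The statistic itself is compact: M = v/sigma, the bulk streaming velocity of a patch divided by its internal velocity dispersion. A minimal sketch of the computation, on synthetic velocities rather than simulation output:

```python
import numpy as np

def cosmic_mach_number(velocities):
    """M = |mean velocity| / rms dispersion about the mean, for one patch.

    velocities: (n_galaxies, 3) array of peculiar velocities.
    """
    v_bulk = velocities.mean(axis=0)                      # streaming velocity v
    residuals = velocities - v_bulk
    sigma = np.sqrt((residuals ** 2).sum(axis=1).mean())  # dispersion sigma
    return np.linalg.norm(v_bulk) / sigma
```

A low-M universe is one where small-scale (random) motions dominate the large-scale (coherent) flow, which is why M probes the shape of the power spectrum.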
Orton, Dennis J.; Doucette, Alan A.
2013-01-01
Identification of biomarkers capable of differentiating between pathophysiological states of an individual is a laudable goal in the field of proteomics. Protein biomarker discovery generally employs high throughput sample characterization by mass spectrometry (MS), being capable of identifying and quantifying thousands of proteins per sample. While MS-based technologies have rapidly matured, the identification of truly informative biomarkers remains elusive, with only a handful of clinically applicable tests stemming from proteomic workflows. This underlying lack of progress is attributed in large part to erroneous experimental design, biased sample handling, as well as improper statistical analysis of the resulting data. This review will discuss in detail the importance of experimental design and provide some insight into the overall workflow required for biomarker identification experiments. Proper balance between the degree of biological vs. technical replication is required for confident biomarker identification. PMID:28250400
Giofrè, David; Cumming, Geoff; Fresc, Luca; Boedker, Ingrid; Tressoldi, Patrizio
2017-01-01
From January 2014, Psychological Science introduced new submission guidelines that encouraged the use of effect sizes, estimation, and meta-analysis (the "new statistics"), required extra detail of methods, and offered badges for use of open science practices. We investigated the use of these practices in empirical articles published by Psychological Science and, for comparison, by the Journal of Experimental Psychology: General, during the period of January 2013 to December 2015. The use of null hypothesis significance testing (NHST) was extremely high at all times and in both journals. In Psychological Science, the use of confidence intervals increased markedly overall, from 28% of articles in 2013 to 70% in 2015, as did the availability of open data (3 to 39%) and open materials (7 to 31%). The other journal showed smaller or much smaller changes. Our findings suggest that journal-specific submission guidelines may encourage desirable changes in authors' practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
House, L.L.; Querfeld, C.W.; Rees, D.E.
1982-04-15
Coronal magnetic fields influence the intensity and linear polarization of light scattered by coronal Fe XIV ions. Interpreting polarization measurements of Fe XIV 5303 A coronal emission requires a detailed understanding of the dependence of the emitted Stokes vector on coronal magnetic field direction, electron density, and temperature, and on height of origin. The required dependence is included in the solutions of statistical equilibrium for the ion, which are solved explicitly for 34 magnetic sublevels in both the ground and four excited terms. The full solutions are reduced to equivalent simple analytic forms which clearly show the required dependence on coronal conditions. The analytic forms of the reduced solutions are suitable for routine analysis of 5303 green line polarimetric data obtained at Pic du Midi and from the Solar Maximum Mission Coronagraph/Polarimeter.
Universality classes of fluctuation dynamics in hierarchical complex systems
NASA Astrophysics Data System (ADS)
Macêdo, A. M. S.; González, Iván R. Roa; Salazar, D. S. P.; Vasconcelos, G. L.
2017-03-01
A unified approach is proposed to describe the statistics of the short-time dynamics of multiscale complex systems. The probability density function of the relevant time series (signal) is represented as a statistical superposition of a large time-scale distribution weighted by the distribution of certain internal variables that characterize the slowly changing background. The dynamics of the background is formulated as a hierarchical stochastic model whose form is derived from simple physical constraints, which in turn restrict the dynamics to only two possible classes. The probability distributions of both the signal and the background have simple representations in terms of Meijer G functions. The two universality classes for the background dynamics manifest themselves in the signal distribution as two types of tails: power law and stretched exponential, respectively. A detailed analysis of empirical data from classical turbulence and financial markets shows excellent agreement with the theory.
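The superposition mechanism can be illustrated numerically: drawing the local variance of a Gaussian signal from a slowly varying background distribution produces heavy, non-Gaussian tails in the compound signal. A toy sketch (a lognormal background is assumed purely for illustration; the paper's background distributions are expressed via Meijer G functions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Background: slowly fluctuating local variance drawn from a lognormal.
local_var = rng.lognormal(mean=0.0, sigma=1.0, size=n)

# Signal: Gaussian at each instant, conditioned on the background variance.
signal = rng.normal(0.0, np.sqrt(local_var))

# Positive excess kurtosis indicates heavier-than-Gaussian tails.
x = signal - signal.mean()
kurtosis = (x ** 4).mean() / (x ** 2).mean() ** 2 - 3.0
```

Even though every conditional distribution is Gaussian, the marginal signal distribution is not: mixing over the background variance is exactly what generates the power-law or stretched-exponential tails the paper classifies.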
An analysis of population and social change in London wards in the 1980s.
Congdon, P
1989-01-01
"This paper discusses the estimation and projection of small area populations in London, [England] and considers trends in intercensal social and demographic indices which can be calculated using these estimates. Information available annually on vital statistics and electorates is combined with detailed data from the Census Small Area Statistics to derive demographic component based population estimates for London's electoral wards over five year periods. The availability of age disaggregated population estimates permits derivation of small area social indicators for intercensal years, for example, of unemployment and mortality. Trends in spatial inequality of such indicators during the 1980s are analysed and point to continuing wide differentials. A typology of population and social indicators gives an indication of the small area distribution of the recent population turnaround in inner London, and of its association with other social processes such as gentrification and ethnic concentration." excerpt
Low, Diana H P; Motakis, Efthymios
2013-10-01
Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.
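The workflow deltaGseg describes — denoise a replicated free-energy series, then hierarchically cluster to detect statistically distinct subpopulations — can be sketched generically (this is not the package's code; a moving average stands in for its wavelet denoising, and the trace is synthetic):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)

# Toy "free energy" trace: two macrostates, at -40 and -25, plus noise.
series = np.concatenate([rng.normal(-40, 1, 500), rng.normal(-25, 1, 500)])

# Denoise with a simple moving average (deltaGseg itself uses wavelets).
kernel = np.ones(25) / 25
smooth = np.convolve(series, kernel, mode="valid")

# Segment the denoised trace and hierarchically cluster the segment means.
seg_means = smooth[: len(smooth) // 50 * 50].reshape(-1, 50).mean(axis=1)
tree = linkage(seg_means.reshape(-1, 1), method="average")
states = fcluster(tree, t=2, criterion="maxclust")

n_states = len(set(states))
```

Cutting the dendrogram at different levels is what lets the method report how many statistically distinct macrostates the replicated series support.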
Forest Statistics for Minnesota's Northern Pine Unit.
Pat Murray
1991-01-01
The fifth inventory of Minnesota's Northern Pine Unit reports 11.1 million acres of land, of which 6.3 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
Forest statistics for Minnesota's Aspen-Birch Unit.
Neal P. Kingsley
1991-01-01
The fifth inventory of Minnesota's Aspen-Birch Unit reports 8.7 million acres of land, of which 7.4 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.
NASA Technical Reports Server (NTRS)
Rockwell, T. H.; Griffin, W. C.
1981-01-01
Critical in-flight events (CIFE) that threaten the aircraft were studied. The scope of the CIFE was described and defined with emphasis on characterizing event development, detection, and assessment; pilot information requirements, sources, acquisition, and interpretation; pilot response options and decision processes; and decision implementation and event outcome. Detailed scenarios were developed for use in simulators and paper-and-pencil testing, for developing relationships between pilot performance and background information as well as for an analysis of pilot reaction, decision, and feedback processes. Statistical relationships among pilot characteristics and observed responses to CIFEs were developed.
Nonlinear analysis and performance evaluation of the Annular Suspension and Pointing System (ASPS)
NASA Technical Reports Server (NTRS)
Joshi, S. M.
1978-01-01
The Annular Suspension and Pointing System (ASPS) can provide highly accurate fine pointing for a variety of solar-, stellar-, and Earth-viewing scientific instruments during space shuttle orbital missions. In this report, a detailed nonlinear mathematical model is developed for the ASPS/Space Shuttle system. The equations are augmented with nonlinear models of components such as magnetic actuators and gimbal torquers. Control systems and payload attitude state estimators are designed in order to obtain satisfactory pointing performance, and statistical pointing performance is predicted in the presence of measurement noise and disturbances.
The fossil record of evolution: Data on diversification and extinction
NASA Technical Reports Server (NTRS)
Sepkoski, J. John, Jr.
1990-01-01
The two principal efforts include: (1) a compilation of a synoptic, mesoscale data base on times of origination and extinction of animal genera in the oceans over the last 600 million years of geologic time; and (2) an analysis of statistical patterns in these data that relate to the diversification of complex life and to the occurrence of mass extinctions, especially those that might be associated with extraterrestrial phenomena. The data base is unique in its taxonomic scope and detail and in its temporal resolution. It is a valuable resource for investigating evolutionary expansions and extinctions of complex life.
Resonance decay dynamics and their effects on pT spectra of pions in heavy-ion collisions
NASA Astrophysics Data System (ADS)
Lo, Pok Man
2018-03-01
The influence of resonance decay dynamics on the momentum spectra of pions in heavy-ion collisions is examined. Taking the decay processes ω →3 π and ρ →2 π as examples, I demonstrate how the resonance width and details of decay dynamics (via the decay matrix element) can modify the physical observables. The latter effect is commonly neglected in statistical models. To remedy the situation, a theoretical framework for incorporating hadron dynamics into the analysis is formulated, which can be straightforwardly extended to describe general N -body decays.
NASA Astrophysics Data System (ADS)
Zhou, Hui
Implementing a target responsibility system for offices and departments is a natural outcome of higher education reform, and the statistical processing of student information is an important part of student performance review. On the basis of an analysis of student evaluation, a student information management database application system is designed in this paper using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.
Memristor emulator causes dissimilarity on a coupled memristive systems
NASA Astrophysics Data System (ADS)
Sabarathinam, S.; Prasad, Awadhesh
2018-04-01
The memristor is known as the basic fourth passive solid-state circuit element. It is gaining increasing attention for creating the next generation of electronic devices and is commonly used as a fundamental chaotic circuit element, although often with arbitrary (typically piecewise linear or cubic) flux-charge characteristics. In the present work, the effects of a memristor emulator are studied in a coupled memristive chaotic oscillator for the first time. We confirm that the emulator allows synchronization between the oscillators and causes dissimilarity between the systems as the coupling strength and the coefficient of the memristor emulator increase. A detailed statistical analysis was performed to confirm this phenomenon.
NASA Astrophysics Data System (ADS)
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Fazliev, Alexander
2017-04-01
A description and the first results of the Russian Science Foundation project "Virtual computational information environment for analysis, evaluation and prediction of the impacts of global climate change on the environment and climate of a selected region" are presented. The project is aimed at development of an Internet-accessible computation and information environment providing specialists unskilled in numerical modelling and software design, decision-makers, and stakeholders with reliable and easy-to-use tools for in-depth statistical analysis of climatic characteristics, and instruments for detailed analysis, assessment, and prediction of impacts of global climate change on the environment and climate of the targeted region. In the framework of the project, approaches to "cloud" processing and analysis of large geospatial datasets will be developed on the technical platform of the Russian leading institution involved in research of climate change and its consequences. Anticipated results will create a pathway for development and deployment of a thematic international virtual research laboratory focused on interdisciplinary environmental studies. The VRE under development will comprise the best features and functionality of the earlier developed information and computing system CLIMATE (http://climate.scert.ru/), which is widely used in Northern Eurasia environment studies. The Project includes several major directions of research listed below. 1. Preparation of geo-referenced data sets, describing the dynamics of the current and possible future climate and environmental changes in detail. 2. Improvement of methods of analysis of climate change. 3. Enhancing the functionality of the VRE prototype in order to create a convenient and reliable tool for the study of regional social, economic and political consequences of climate change. 4. Using the output of the first three tasks, compilation of the VRE prototype, its validation, preparation of an applicable detailed description of climate change in Western Siberia, and dissemination of the Project results. Results of the first stage of the Project implementation are presented. This work is supported by the Russian Science Foundation grant No. 16-19-10257.
Uvarov, Vladimir; Popov, Inna; Shapur, Nandakishore; Abdin, Tamer; Gofrit, Ofer N; Pode, Dov; Duvdevani, Mordechai
2011-12-01
Urinary calculi have been recognized as one of the most painful medical disorders. Reliable knowledge of the phase composition of the stones is very important to elucidate an underlying etiology of the stone disease. We report here the results of quantitative X-ray diffraction phase analysis performed on 278 kidney stones from 275 patients treated at the Department of Urology of Hadassah Hebrew University Hospital (Jerusalem, Israel). Quantification of biominerals in multicomponent samples was performed using the normalized reference intensity ratio method. According to the observed phase compositions, all the tested stones were classified into five chemical groups: oxalates (43.2%), phosphates (7.7%), urates (10.3%), cystines (2.9%), and stones composed of a mixture of different minerals (35.9%). A detailed analysis of each allocated chemical group is presented along with the crystallite size calculations for all the observed crystalline phases. The obtained results have been compared with published data originating from different geographical regions. Morphology and spatial distribution of the phases identified in the kidney stones were studied with scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDS). This type of detailed study of phase composition and structural characteristics of the kidney stones was performed in Israel for the first time.
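The normalized reference intensity ratio (RIR) method reduces to a simple formula: each phase's weight fraction is its RIR-scaled intensity divided by the sum over all identified phases, w_i = (I_i/RIR_i) / Σ_j (I_j/RIR_j). A sketch with made-up intensities and RIR values:

```python
import numpy as np

def rir_weight_fractions(intensities, rir_values):
    """Normalized reference intensity ratio quantification:
    w_i = (I_i / RIR_i) / sum_j (I_j / RIR_j).
    Assumes all phases are crystalline and have been identified."""
    scaled = np.asarray(intensities, float) / np.asarray(rir_values, float)
    return scaled / scaled.sum()

# Illustrative two-phase stone; the numbers below are invented, not
# measurements from this study.
w = rir_weight_fractions([1000.0, 250.0], [3.4, 1.0])
```

The normalization is what lets the fractions sum to one without an internal standard, which is why the method suits routine multi-sample screening.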
Surface radiant flux densities inferred from LAC and GAC AVHRR data
NASA Astrophysics Data System (ADS)
Berger, F.; Klaes, D.
To infer surface radiant flux densities from current (NOAA-AVHRR, ERS-1/2 ATSR) and future meteorological (Envisat AATSR, MSG, METOP) satellite data, the complex, modular analysis scheme SESAT (Strahlungs- und Energieflüsse aus Satellitendaten) was developed (Berger, 2001). This scheme allows the determination of cloud types, optical and microphysical cloud properties, as well as surface and TOA radiant flux densities. After testing of SESAT in Central Europe and the Baltic Sea catchment (more than 400 scenes, including a detailed validation against various surface measurements), it was applied to a large number of NOAA-16 AVHRR overpasses covering the globe. For the analysis, two different spatial resolutions, local area coverage (LAC) and global area coverage (GAC), were considered. Therefore, all inferred results, such as cloud cover, cloud properties, and radiant properties, could be intercompared. Specific emphasis is placed on the surface radiant flux densities (all radiative balance components), with results presented for different regions, such as South America, Southern Africa, North America, Europe, and Indonesia. Applying SESAT, energy flux densities such as latent and sensible heat flux densities could also be determined. A statistical analysis of all results, including a detailed discussion of the two spatial resolutions, will close this study.
Multivariate Analysis of Longitudinal Rates of Change
Bryan, Matthew; Heagerty, Patrick J.
2016-01-01
Longitudinal data allow direct comparison of the change in patient outcomes associated with treatment or exposure. Frequently, several longitudinal measures are collected that either reflect a common underlying health status, or characterize processes that are influenced in a similar way by covariates such as exposure or demographic characteristics. Statistical methods that can combine multivariate response variables into common measures of covariate effects have been proposed by Roy and Lin [1]; Proust-Lima, Letenneur and Jacqmin-Gadda [2]; and Gray and Brookmeyer [3] among others. Current methods for characterizing the relationship between covariates and the rate of change in multivariate outcomes are limited to select models. For example, Gray and Brookmeyer [3] introduce an “accelerated time” method which assumes that covariates rescale time in longitudinal models for disease progression. In this manuscript we detail an alternative multivariate model formulation that directly structures longitudinal rates of change, and that permits a common covariate effect across multiple outcomes. We detail maximum likelihood estimation for a multivariate longitudinal mixed model. We show via asymptotic calculations the potential gain in power that may be achieved with a common analysis of multiple outcomes. We apply the proposed methods to the analysis of a trivariate outcome for infant growth and compare rates of change for HIV infected and uninfected infants. PMID:27417129
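The idea of a common covariate effect on rates of change can be illustrated with a simulation: three outcomes whose slopes differ between exposure groups by the same amount beta, with beta recovered by averaging per-outcome slope contrasts. This is a crude stand-in for the paper's maximum-likelihood multivariate mixed model; all names and numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
n, times = 200, np.arange(5.0)
group = rng.integers(0, 2, n)     # exposure indicator per subject
beta = 0.5                        # common effect on the rate of change

# Three outcomes sharing the same group effect on slope, different baselines.
Y = []
for intercept in (10.0, 20.0, 30.0):
    slopes = 1.0 + beta * group + rng.normal(0, 0.1, n)
    Y.append(intercept + slopes[:, None] * times + rng.normal(0, 0.5, (n, 5)))

# Per-subject least-squares slopes, then the group contrast per outcome,
# averaged across outcomes to exploit the shared effect.
slope_hats = [np.polyfit(times, y.T, 1)[0] for y in Y]
beta_hat = np.mean([s[group == 1].mean() - s[group == 0].mean()
                    for s in slope_hats])
```

Averaging the contrast across outcomes is the intuition behind the power gain the paper derives: each outcome contributes an independent noisy estimate of the same beta.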
Bhatia, Tripta
2018-07-01
Accurate quantitative analysis of image data requires that we distinguish, to the extent possible, between fluorescence intensity (the true signal) and the noise inherent in its measurement. We image multilamellar membrane tubes and beads that grow from defects in the fluid lamellar phase of the lipid 1,2-dioleoyl-sn-glycero-3-phosphocholine dissolved in water and water-glycerol mixtures, using a fluorescence confocal polarizing microscope. We quantify image noise and determine the noise statistics. Understanding the nature of image noise also helps in optimizing image processing to detect sub-optical features, which would otherwise remain hidden. We use an image-processing technique, "optimum smoothening", to improve the signal-to-noise ratio (SNR) of features of interest without smearing their structural details. A high SNR yields the positional accuracy needed to resolve features of interest with widths below the optical resolution. Using optimum smoothening, the smallest and largest core diameters detected are of width [Formula: see text] and [Formula: see text] nm, respectively, as discussed in this paper. The image-processing and analysis techniques and the noise modeling discussed in this paper can be used for detailed morphological analysis of features down to sub-optical length scales obtained by any kind of fluorescence intensity imaging in raster mode.
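The principle behind smoothing-based SNR improvement can be shown on a synthetic one-dimensional intensity profile; a plain Gaussian filter stands in here for the paper's "optimum smoothening" (illustrative only, with a known ground truth so the SNR gain can be measured directly):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(3)

# Synthetic intensity profile: a broad Gaussian peak buried in noise.
x = np.linspace(-10, 10, 2001)
truth = np.exp(-x ** 2 / 2)
noisy = truth + rng.normal(0, 0.3, x.size)

# Kernel much narrower than the feature: noise drops, structure survives.
smoothed = gaussian_filter1d(noisy, sigma=10)

def snr(estimate):
    return truth.max() / np.std(estimate - truth)

improvement = snr(smoothed) / snr(noisy)
```

The trade-off the abstract names is visible in the kernel choice: widening `sigma` past the feature width would start to smear the structural details rather than just suppress noise.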
Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krause, E.; et al.
We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood $\Delta \chi^2 \le 0.045$ with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc $h^{-1}$) and galaxy-galaxy lensing (12 Mpc $h^{-1}$) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.
ViPAR: a software platform for the Virtual Pooling and Analysis of Research Data.
Carter, Kim W; Francis, Richard W; Carter, K W; Francis, R W; Bresnahan, M; Gissler, M; Grønborg, T K; Gross, R; Gunnes, N; Hammond, G; Hornig, M; Hultman, C M; Huttunen, J; Langridge, A; Leonard, H; Newman, S; Parner, E T; Petersson, G; Reichenberg, A; Sandin, S; Schendel, D E; Schalkwyk, L; Sourander, A; Steadman, C; Stoltenberg, C; Suominen, A; Surén, P; Susser, E; Sylvester Vethanayagam, A; Yusof, Z
2016-04-01
Research studies exploring the determinants of disease require sufficient statistical power to detect meaningful effects. Sample size is often increased through centralized pooling of disparately located datasets, though ethical, privacy and data ownership issues can often hamper this process. Methods that facilitate the sharing of research data that are sympathetic with these issues and which allow flexible and detailed statistical analyses are therefore in critical need. We have created a software platform for the Virtual Pooling and Analysis of Research data (ViPAR), which employs free and open source methods to provide researchers with a web-based platform to analyse datasets housed in disparate locations. Database federation permits controlled access to remotely located datasets from a central location. The Secure Shell protocol allows data to be securely exchanged between devices over an insecure network. ViPAR combines these free technologies into a solution that facilitates 'virtual pooling' where data can be temporarily pooled into computer memory and made available for analysis without the need for permanent central storage. Within the ViPAR infrastructure, remote sites manage their own harmonized research dataset in a database hosted at their site, while a central server hosts the data federation component and a secure analysis portal. When an analysis is initiated, requested data are retrieved from each remote site and virtually pooled at the central site. The data are then analysed by statistical software and, on completion, results of the analysis are returned to the user and the virtually pooled data are removed from memory. ViPAR is a secure, flexible and powerful analysis platform built on open source technology that is currently in use by large international consortia, and is made publicly available at [http://bioinformatics.childhealthresearch.org.au/software/vipar/]. © The Author 2015. 
Published by Oxford University Press on behalf of the International Epidemiological Association.
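The "virtual pooling" pattern the ViPAR abstract describes — fetch harmonized per-site data, combine in memory, analyse, return only the result — can be sketched as follows. The retrieval function and toy analysis below are illustrative stand-ins; ViPAR's actual transport uses SSH and database federation:

```python
import pandas as pd

def fetch_site_data(site):
    """Stand-in for secure remote retrieval of one site's harmonized
    dataset (ViPAR uses SSH plus database federation for this step)."""
    return pd.DataFrame({"site": site["name"],
                         "exposed": site["exposed"],
                         "outcome": site["outcome"]})

def pooled_analysis(sites, analysis):
    """Virtually pool: combine in memory, analyse, return only the result."""
    pooled = pd.concat([fetch_site_data(s) for s in sites], ignore_index=True)
    result = analysis(pooled)
    del pooled                # pooled data is never stored centrally
    return result

# Two toy "sites" with a shared schema; numbers are invented.
sites = [
    {"name": "A", "exposed": [0, 0, 1, 1], "outcome": [1.0, 1.2, 2.1, 1.9]},
    {"name": "B", "exposed": [0, 1, 1, 0], "outcome": [0.9, 2.0, 2.2, 1.1]},
]
effect = pooled_analysis(
    sites, lambda df: df.groupby("exposed")["outcome"].mean().diff().iloc[-1])
```

Because only `effect` leaves the function, the pattern keeps row-level data transient, which is the property that makes this approach compatible with the privacy and data-ownership constraints the abstract mentions.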
Clinical use and misuse of automated semen analysis.
Sherins, R J
1991-01-01
During the past six years, there has been an explosion of technology which allows automated machine-vision analysis of sperm. Computer-assisted semen analysis (CASA) clearly provides an opportunity for objective, systematic assessment of sperm motion. But there are many caveats in using this type of equipment. CASA requires a disciplined and standardized approach to semen collection, specimen preparation, machine settings, calibration, and avoidance of sampling bias. Potential sources of error can be minimized. Unfortunately, the rapid commercialization of this technology preceded detailed statistical analysis of such data to allow equally rapid comparisons of data between different CASA machines and among different laboratories. Thus, it is now imperative that we standardize use of this technology and obtain more detailed biological insights into sperm motion parameters in semen and after capacitation before we empirically employ CASA for studies of fertility prediction. In the basic science arena, CASA technology will likely evolve to provide new algorithms for accurate sperm motion analysis and give us an opportunity to address the biophysics of sperm movement. In the clinical arena, CASA instruments provide the opportunity to share and compare sperm motion data among laboratories by virtue of their objectivity, assuming standardized conditions of utilization. Identification of men with specific sperm motion disorders is certain, but the biological relevance of motility dysfunction to actual fertilization remains uncertain and surely the subject for further study.
Understanding Student Learning: The Need for Education Data
ERIC Educational Resources Information Center
Slaven, Chip
2015-01-01
Schools have long collected information about students, from basic emergency contact details to daily attendance statistics. But only recently have schools used education technology to collect solid, reliable information (or data) about how students learn--as well as details about their strengths, challenges, and individual traits that impact…
Statistical Report: Academic Year 2014-15. Student Exchange Program
ERIC Educational Resources Information Center
Western Interstate Commission for Higher Education, 2015
2015-01-01
This report covers fall 2014 enrollments for WUE [Western Undergraduate Exchange], WRGP [Western Regional Graduate Program], and PSEP [Professional Student Exchange Program]. It details the funds that flow between students' home states and the enrolling PSEP institutions that receive them. This newly expanded format gives detailed enrollment for…
An Efficient Objective Analysis System for Parallel Computers
NASA Technical Reports Server (NTRS)
Stobie, J.
1999-01-01
A new atmospheric objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 1 x 1 lat-lon grid with 18 levels of heights and winds and 10 levels of moisture) using 120,000 observations in 17 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly with the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first-guess error statistics to produce the best estimate of the atmospheric state. Static tests with a 2 x 2.5 resolution version of this system showed its analysis increments are comparable to those of the latest NASA operational system, including maintenance of mass-wind balance. Results from several months of cycling tests in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) as the current operational system.
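Optimal interpolation has a compact core: the analysis equals the first guess corrected by the innovation, weighted by a gain built from background (B) and observation (R) error covariances, x_a = x_b + B Hᵀ (H B Hᵀ + R)⁻¹ (y − H x_b). A minimal sketch on a toy grid (the matrices below are invented for illustration, not the system's actual error statistics):

```python
import numpy as np

def optimal_interpolation(x_b, B, H, R, y):
    """One optimal-interpolation step: blend first guess x_b with
    observations y using background (B) and observation (R) error
    covariances. K is the gain matrix."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b)

# Toy 1-D grid of 3 points, one observation at the middle point.
x_b = np.array([10.0, 10.0, 10.0])        # first guess
B = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])           # correlated background errors
H = np.array([[0.0, 1.0, 0.0]])           # observation operator: point 2
R = np.array([[1.0]])                     # observation error variance
y = np.array([12.0])

x_a = optimal_interpolation(x_b, B, H, R, y)
```

Note how the off-diagonal terms of B spread the single observation's influence to neighbouring grid points, which is exactly the mechanism that lets detailed error statistics shape the analysis increments.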
Forest Statistics for Minnesota's Central Hardwood Unit.
Earl C. Leatherberry
1991-01-01
In 1990, the fifth inventory of Minnesota's Central Hardwood Unit found 11.9 million acres of land, of which 2.4 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removal, mortality, and ownership.