Science.gov

Sample records for addition statistical analyses

  1. Insights into Corona Formation through Statistical Analyses

    NASA Technical Reports Server (NTRS)

    Glaze, L. S.; Stofan, E. R.; Smrekar, S. E.; Baloga, S. M.

    2002-01-01

    Statistical analysis of an expanded database of coronae on Venus indicates that the populations of Type 1 (with fracture annuli) and 2 (without fracture annuli) corona diameters are statistically indistinguishable, and therefore we have no basis for assuming different formation mechanisms. Analysis of the topography and diameters of coronae shows that coronae that are depressions, rimmed depressions, and domes tend to be significantly smaller than those that are plateaus, rimmed plateaus, or domes with surrounding rims. This is consistent with the model of Smrekar and Stofan and inconsistent with predictions of the spreading drop model of Koch and Manga. The diameter range for domes, the initial stage of corona formation, provides a broad constraint on the buoyancy of corona-forming plumes. Coronae are only slightly more likely to be topographically raised than depressions, with Type 1 coronae most frequently occurring as rimmed depressions and Type 2 coronae most frequently occurring with flat interiors and raised rims. Most Type 1 coronae are located along chasmata systems or fracture belts, while Type 2 coronae are found predominantly as isolated features in the plains. Coronae at hotspot rises tend to be significantly larger than coronae in other settings, consistent with a hotter upper mantle at hotspot rises and their active state.
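
    The abstract does not name the test used to compare the Type 1 and Type 2 diameter populations; a two-sample Kolmogorov-Smirnov test is one standard way to make such a comparison. The sketch below is illustrative only, with placeholder diameter values rather than data from the corona database.

```python
# Illustrative sketch only: the abstract does not name the test used, and the
# diameter values below are placeholders, not data from the Venus corona database.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
type1_diam_km = rng.lognormal(mean=5.3, sigma=0.5, size=300)   # hypothetical Type 1 diameters
type2_diam_km = rng.lognormal(mean=5.3, sigma=0.5, size=100)   # hypothetical Type 2 diameters

# Two-sample Kolmogorov-Smirnov test: a large p-value means the two diameter
# distributions cannot be distinguished at the chosen significance level.
stat, p = stats.ks_2samp(type1_diam_km, type2_diam_km)
print(f"KS statistic = {stat:.3f}, p-value = {p:.3f}")
```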

  2. Reporting statistical analyses in peer review journal articles.

    PubMed

    Stephens, Richard; Grant, Maria J

    2015-06-01

    As a regular referee for the Health Information and Libraries Journal, Richard Stephens, winner of the 2014 Wellcome Trust Science Writing Prize, has been impressed by the science on offer in the Health Information and Libraries Journal. But he has also been struck by how often similar problems with statistical analysis reporting come up during the review process. Acknowledging that statistics can be scary, he advocates that they should simply be viewed as a means of communicating ideas. In this editorial, he provides some straightforward guidelines on reporting statistical analyses in peer review journal articles, highlights pitfalls to avoid and illustrates best practice to aim for. PMID:25943969

  3. Confounded Statistical Analyses Hinder Interpretation of the NELP Report

    ERIC Educational Resources Information Center

    Paris, Scott G.; Luo, Serena Wenshu

    2010-01-01

    The National Early Literacy Panel (2008) report identified early predictors of reading achievement as good targets for instruction, and many of those skills are related to decoding. In this article, the authors suggest that the developmental trajectories of rapidly developing skills pose problems for traditional statistical analyses. Rapidly…

  4. Statistical analyses in trials for the comprehensive understanding of organogenesis and histogenesis in humans and mice.

    PubMed

    Otani, Hiroki; Udagawa, Jun; Naito, Kanta

    2016-06-01

    Statistical analyses based on the quantitative data from real multicellular organisms are useful as inductive-type studies to analyse complex morphogenetic events in addition to deductive-type analyses using mathematical models. Here, we introduce several of our trials for the statistical analysis of organogenesis and histogenesis of human and mouse embryos and foetuses. Multidimensional scaling has been applied to prove the existence and examine the mode of interkinetic nuclear migration, a regulatory mechanism of stem cell proliferation/differentiation in epithelial tubular tissues. Several statistical methods were used on morphometric data from human foetuses to establish the multidimensional standard growth curve and to describe the relation among the developing organs and body parts. Although the results are still limited, we show that these analyses are not only useful to understand the normal and abnormal morphogenesis in humans and mice but also to provide clues that could correlate aspects of prenatal developmental events with postnatal diseases. PMID:26935132

  5. The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth

    ERIC Educational Resources Information Center

    Steyvers, Mark; Tenenbaum, Joshua B.

    2005-01-01

    We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…

  6. Weak additivity principle for current statistics in d dimensions.

    PubMed

    Pérez-Espigares, C; Garrido, P L; Hurtado, P I

    2016-04-01

    The additivity principle (AP) allows one to compute the current distribution in many one-dimensional nonequilibrium systems. Here we extend this conjecture to general d-dimensional driven diffusive systems, and validate its predictions against both numerical simulations of rare events and microscopic exact calculations of three paradigmatic models of diffusive transport in d=2. Crucially, the existence of a structured current vector field at the fluctuating level, coupled to the local mobility, turns out to be essential to understand current statistics in d>1. We prove that, when compared to the straightforward extension of the AP to high d, the so-called weak AP always yields a better minimizer of the macroscopic fluctuation theory action for current statistics. PMID:27176236

  7. Weak additivity principle for current statistics in d dimensions

    NASA Astrophysics Data System (ADS)

    Pérez-Espigares, C.; Garrido, P. L.; Hurtado, P. I.

    2016-04-01

    The additivity principle (AP) allows one to compute the current distribution in many one-dimensional nonequilibrium systems. Here we extend this conjecture to general d-dimensional driven diffusive systems, and validate its predictions against both numerical simulations of rare events and microscopic exact calculations of three paradigmatic models of diffusive transport in d=2. Crucially, the existence of a structured current vector field at the fluctuating level, coupled to the local mobility, turns out to be essential to understand current statistics in d>1. We prove that, when compared to the straightforward extension of the AP to high d, the so-called weak AP always yields a better minimizer of the macroscopic fluctuation theory action for current statistics.

  8. Review of Statistical Methods for Analysing Healthcare Resources and Costs

    PubMed Central

    Mihaylova, Borislava; Briggs, Andrew; O'Hagan, Anthony; Thompson, Simon G

    2011-01-01

    We review statistical methods for analysing healthcare resource use and costs, their ability to address skewness, excess zeros, multimodality and heavy right tails, and their ease for general use. We aim to provide guidance on analysing resource use and costs focusing on randomised trials, although methods often have wider applicability. Twelve broad categories of methods were identified: (I) methods based on the normal distribution, (II) methods following transformation of data, (III) single-distribution generalized linear models (GLMs), (IV) parametric models based on skewed distributions outside the GLM family, (V) models based on mixtures of parametric distributions, (VI) two (or multi)-part and Tobit models, (VII) survival methods, (VIII) non-parametric methods, (IX) methods based on truncation or trimming of data, (X) data components models, (XI) methods based on averaging across models, and (XII) Markov chain methods. Based on this review, our recommendations are that, first, simple methods are preferred in large samples where the near-normality of sample means is assured. Second, in somewhat smaller samples, relatively simple methods, able to deal with one or two of above data characteristics, may be preferable but checking sensitivity to assumptions is necessary. Finally, some more complex methods hold promise, but are relatively untried; their implementation requires substantial expertise and they are not currently recommended for wider applied work. Copyright © 2010 John Wiley & Sons, Ltd. PMID:20799344
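
    As a minimal sketch of one of the twelve categories listed above, category (III), a single-distribution generalized linear model for right-skewed cost data can be fit with a gamma family and log link. The data below are synthetic and the covariates are assumed for illustration; this is not the review's own analysis code.

```python
# A minimal sketch of category (III) above: a single-distribution generalized
# linear model for right-skewed cost data (gamma family, log link).
# The data are synthetic; this is not the review's own analysis code.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
treatment = rng.integers(0, 2, size=n)                # hypothetical trial arm indicator
age = rng.normal(65, 10, size=n)                      # hypothetical covariate
mean_cost = np.exp(7.0 + 0.2 * treatment + 0.01 * age)
costs = rng.gamma(shape=2.0, scale=mean_cost / 2.0)   # skewed, strictly positive costs

X = sm.add_constant(np.column_stack([treatment, age]))
model = sm.GLM(costs, X, family=sm.families.Gamma(link=sm.families.links.Log()))
result = model.fit()
print(result.summary())
```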

  9. A weighted U statistic for association analyses considering genetic heterogeneity.

    PubMed

    Wei, Changshuai; Elston, Robert C; Lu, Qing

    2016-07-20

    Converging evidence suggests that common complex diseases with the same or similar clinical manifestations could have different underlying genetic etiologies. While current research interests have shifted toward uncovering rare variants and structural variations predisposing to human diseases, the impact of heterogeneity in genetic studies of complex diseases has been largely overlooked. Most of the existing statistical methods assume the disease under investigation has a homogeneous genetic effect and could, therefore, have low power if the disease undergoes heterogeneous pathophysiological and etiological processes. In this paper, we propose a heterogeneity-weighted U (HWU) method for association analyses considering genetic heterogeneity. HWU can be applied to various types of phenotypes (e.g., binary and continuous) and is computationally efficient for high-dimensional genetic data. Through simulations, we showed the advantage of HWU when the underlying genetic etiology of a disease was heterogeneous, as well as the robustness of HWU against different model assumptions (e.g., phenotype distributions). Using HWU, we conducted a genome-wide analysis of nicotine dependence from the Study of Addiction: Genetics and Environments dataset. The genome-wide analysis of nearly one million genetic markers took 7h, identifying heterogeneous effects of two new genes (i.e., CYP3A5 and IKBKB) on nicotine dependence. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26833871

  10. Weighted Statistical Binning: Enabling Statistically Consistent Genome-Scale Phylogenetic Analyses

    PubMed Central

    Bayzid, Md Shamsuzzoha; Mirarab, Siavash; Boussau, Bastien; Warnow, Tandy

    2015-01-01

    Because biological processes can result in different loci having different evolutionary histories, species tree estimation requires multiple loci from across multiple genomes. While many processes can result in discord between gene trees and species trees, incomplete lineage sorting (ILS), modeled by the multi-species coalescent, is considered to be a dominant cause for gene tree heterogeneity. Coalescent-based methods have been developed to estimate species trees, many of which operate by combining estimated gene trees, and so are called "summary methods". Because summary methods are generally fast (and much faster than more complicated coalescent-based methods that co-estimate gene trees and species trees), they have become very popular techniques for estimating species trees from multiple loci. However, recent studies have established that summary methods can have reduced accuracy in the presence of gene tree estimation error, and also that many biological datasets have substantial gene tree estimation error, so that summary methods may not be highly accurate in biologically realistic conditions. Mirarab et al. (Science 2014) presented the "statistical binning" technique to improve gene tree estimation in multi-locus analyses, and showed that it improved the accuracy of MP-EST, one of the most popular coalescent-based summary methods. Statistical binning, which uses a simple heuristic to evaluate "combinability" and then uses the larger sets of genes to re-calculate gene trees, has good empirical performance, but using statistical binning within a phylogenomic pipeline does not have the desirable property of being statistically consistent. We show that weighting the re-calculated gene trees by the bin sizes makes statistical binning statistically consistent under the multispecies coalescent, and maintains the good empirical performance. Thus, "weighted statistical binning" enables highly accurate genome-scale species tree estimation, and is also statistically consistent.

  11. Statistical tests of additional plate boundaries from plate motion inversions

    NASA Technical Reports Server (NTRS)

    Stein, S.; Gordon, R. G.

    1984-01-01

    The application of the F-ratio test, a standard statistical technique, to the results of relative plate motion inversions has been investigated. The method tests whether the improvement in fit of the model to the data resulting from the addition of another plate to the model is greater than that expected purely by chance. This approach appears to be useful in determining whether additional plate boundaries are justified. Previous results have been confirmed favoring separate North American and South American plates with a boundary located between 30 N and the equator. Using Chase's global relative motion data, it is shown that in addition to separate West African and Somalian plates, separate West Indian and Australian plates, with a best-fitting boundary between 70 E and 90 E, can be resolved. These results are generally consistent with the observation that the Indian plate's internal deformation extends somewhat westward of the Ninetyeast Ridge. The relative motion pole is similar to Minster and Jordan's and predicts the NW-SE compression observed in earthquake mechanisms near the Ninetyeast Ridge.
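
    The sketch below illustrates the F-ratio test for nested models described above: splitting one plate into two adds one Euler vector (three parameters), and the test asks whether the resulting drop in weighted misfit exceeds what chance alone would produce. The misfit values and data counts are placeholders, not the paper's inversion results.

```python
# Sketch of the F-ratio test for an added plate boundary.  Splitting one plate
# into two adds three parameters (one extra Euler vector), so we test whether
# the drop in weighted misfit is larger than expected by chance.
# The chi-square values and counts below are placeholders, not the paper's data.
from scipy import stats

chi2_one_plate = 310.0    # misfit of model without the extra boundary (hypothetical)
chi2_two_plates = 270.0   # misfit after adding the boundary (hypothetical)
n_data = 250              # number of rates, azimuths and slip vectors (hypothetical)
p_full = 36               # parameters in the larger model (hypothetical)
extra_params = 3          # one additional Euler vector

dof_full = n_data - p_full
F = ((chi2_one_plate - chi2_two_plates) / extra_params) / (chi2_two_plates / dof_full)
p_value = stats.f.sf(F, extra_params, dof_full)
print(f"F = {F:.2f}, p = {p_value:.4f}")
# A small p-value indicates the improvement in fit justifies the additional plate.
```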

  12. Using a Five-Step Procedure for Inferential Statistical Analyses

    ERIC Educational Resources Information Center

    Kamin, Lawrence F.

    2010-01-01

    Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…

  13. Testing of hypotheses about altitude decompression sickness by statistical analyses

    NASA Technical Reports Server (NTRS)

    Van Liew, H. D.; Burkard, M. E.; Conkin, J.; Powell, M. R. (Principal Investigator)

    1996-01-01

    This communication extends a statistical analysis of forced-descent decompression sickness at altitude in exercising subjects (J Appl Physiol 1994; 76:2726-2734) with a data subset having an additional explanatory variable, rate of ascent. The original explanatory variables for risk-function analysis were environmental pressure of the altitude, duration of exposure, and duration of pure-O2 breathing before exposure; the best fit was consistent with the idea that instantaneous risk increases linearly as altitude exposure continues. Use of the new explanatory variable improved the fit of the smaller data subset, as indicated by log likelihood. Also, with ascent rate accounted for, replacement of the term for linear accrual of instantaneous risk by a term for rise and then decay made a highly significant improvement upon the original model (log likelihood increased by 37 log units). The authors conclude that a more representative data set and removal of the variability attributable to ascent rate allowed the rise-and-decay mechanism, which is expected from theory and observations, to become manifest.
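
    The model comparison described above can be sketched as a maximum-likelihood contest between two instantaneous-risk (hazard) shapes: linear accrual, r(t) = a*t, versus rise-and-decay, r(t) = a*t*exp(-b*t). The sketch below uses simulated binary DCS outcomes and omits the covariates (altitude, pre-oxygenation, ascent rate) used in the published analysis, so it is a simplified illustration only.

```python
# Sketch of the model comparison described above: instantaneous risk that
# accrues linearly, r(t) = a*t, versus rise-and-decay, r(t) = a*t*exp(-b*t).
# P(DCS by time T) = 1 - exp(-integral of r).  Data and parameter values are
# simulated placeholders; the published analysis used additional covariates.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
T = rng.uniform(0.5, 6.0, size=400)                       # exposure durations (h), hypothetical
true_cum = 0.3 * (1 - (1 + 1.0 * T) * np.exp(-1.0 * T))   # rise-and-decay cumulative risk
dcs = rng.random(400) < 1 - np.exp(-true_cum)             # simulated DCS outcomes

def negloglik(params, model):
    a, b = np.exp(params)                                 # keep parameters positive
    if model == "linear":
        cum = a * T**2 / 2.0                              # integral of a*t from 0 to T
    else:
        cum = a * (1 - (1 + b * T) * np.exp(-b * T)) / b**2   # integral of a*t*exp(-b*t)
    p = np.clip(1 - np.exp(-cum), 1e-12, 1 - 1e-12)
    return -np.sum(dcs * np.log(p) + (~dcs) * np.log(1 - p))

fit_lin = minimize(negloglik, [np.log(0.1), 0.0], args=("linear",))
fit_rd = minimize(negloglik, [np.log(0.1), 0.0], args=("risedecay",))
print("log-likelihood gain of rise-and-decay:", fit_lin.fun - fit_rd.fun)
```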

  14. Cucumis monosomic alien addition lines: morphological, cytological, and genotypic analyses.

    PubMed

    Chen, Jin-Feng; Luo, Xiang-Dong; Qian, Chun-Tao; Jahn, Molly M; Staub, Jack E; Zhuang, Fei-Yun; Lou, Qun-Feng; Ren, Gang

    2004-05-01

    Cucumis hystrix Chakr. (HH, 2n=24), a wild relative of the cultivated cucumber, possesses several potentially valuable disease-resistance and abiotic stress-tolerance traits for cucumber (C. sativus L., CC, 2n=14) improvement. Numerous attempts have been made to transfer desirable traits since the successful interspecific hybridization between C. hystrix and C. sativus, one of which resulted in the production of an allotriploid (HCC, 2n=26: one genome of C. hystrix and two of C. sativus). When this genotype was treated with colchicine to induce polyploidy, two monosomic alien addition lines (MAALs) (plant nos. 87 and 517: 14 CC+1 H, 2n=15) were recovered among 252 viable plants. Each of these plants was morphologically distinct from allotriploids and cultivated cucumbers. Cytogenetic and molecular marker analyses were performed to confirm the genetic constitution and further characterize these two MAALs. Chromosome counts made from at least 30 meristematic cells from each plant confirmed 15 nuclear chromosomes. In pollen mother cells of plant nos. 87 and 517, seven bivalents and one univalent were observed at diakinesis and metaphase I; the frequency of trivalent formation was low (about 4-5%). At anaphase I and II, stochastic and asymmetric division led to the formation of two gamete classes: n=7 and n=8; however, pollen fertility was relatively high. Pollen stainability in plant no. 87 was 86.7% and in plant no. 517 was 93.2%. Random amplified polymorphic DNA analysis was performed using 100 random 10-base primers. Genotypes obtained with eight primers (A-9, A-11, AH-13, AI-19, AJ-18, AJ-20, E-19, and N-20) showed a band common to the two MAAL plants and C. hystrix that was absent in C. sativus, confirming that the alien chromosomes present in the MAALs were derived from C. hystrix. Morphological differences and differences in banding patterns were also observed between plant nos. 87 and 517 after amplification with primers AI-5, AJ-13, N-12, and N-20

  15. Study design, methodology and statistical analyses in the clinical development of sparfloxacin.

    PubMed

    Genevois, E; Lelouer, V; Vercken, J B; Caillon, R

    1996-05-01

    Many publications in the past 10 years have emphasised the difficulties of evaluating anti-infective drugs and the need for well-designed clinical trials in this therapeutic field. The clinical development of sparfloxacin in Europe, involving more than 4000 patients in ten countries, provided the opportunity to implement a methodology for evaluation and statistical analyses which would take into account actual requirements and past insufficiencies. This methodology focused on a rigorous and accurate patient classification for evaluability, subgroups of particular interest, efficacy assessment based on automation (algorithm) and individual case review by expert panel committees. In addition, the statistical analyses did not use significance testing but rather confidence intervals to determine whether sparfloxacin was therapeutically equivalent to the reference comparator antibacterial agents. PMID:8737126
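
    The confidence-interval approach to therapeutic equivalence mentioned above can be sketched as follows: compute a two-sided 95% CI for the difference in clinical success rates and check that it lies entirely within a prespecified equivalence margin. The counts and the 10-percentage-point margin below are hypothetical, not values from the sparfloxacin trials.

```python
# Sketch of the confidence-interval approach to therapeutic equivalence:
# instead of a significance test, compute a 95% CI for the difference in
# success rates and check that it lies within a prespecified margin
# (here +/-10 percentage points).  Counts are hypothetical, not trial data.
import numpy as np
from scipy import stats

success_spx, n_spx = 172, 200      # sparfloxacin arm (hypothetical counts)
success_ref, n_ref = 168, 200      # reference comparator arm (hypothetical counts)

p1, p2 = success_spx / n_spx, success_ref / n_ref
diff = p1 - p2
se = np.sqrt(p1 * (1 - p1) / n_spx + p2 * (1 - p2) / n_ref)
z = stats.norm.ppf(0.975)
ci = (diff - z * se, diff + z * se)

margin = 0.10
print(f"difference = {diff:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
print("equivalent within margin" if ci[0] > -margin and ci[1] < margin else "equivalence not shown")
```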

  16. SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit

    PubMed Central

    Chu, Annie; Cui, Jenny; Dinov, Ivo D.

    2011-01-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most

  17. Statistical and Time Analyses of Gamma-Ray Bursts

    NASA Astrophysics Data System (ADS)

    Marani, Gabriela Fabiana

    1998-10-01

    If Gamma-Ray Bursts (GRBs) are cosmological in origin, different signatures are expected to be found in the available BATSE catalog and time series. Cross-correlation with extragalactic objects and gravitational lensing by intervening objects are some examples of these particular features. A new statistical method to perform 2-D angular cross-correlations between GRB positions and different catalogs of extragalactic objects has been implemented. For the best located 74 3B GRBs, a 2.5-σ level association with rich, nearby Abell clusters is found. If all the clusters are taken into account, a 3.5-σ association is found for the best located 27 3B bursts. Another 2.5-σ excess is found between soft GRBs and the whole Abell catalog. On the other hand, excesses of the order of ~ 3.5-4-σ are found between the best located 10 to 380 bursts and the nearest, intrinsically brightest radio quiet quasars. However, the correlations appear to be due to statistical fluctuations and the excesses are only suggestive of a physical association. A fully automated search among 1,235 4B bursts has been carried out to look for statistical similarities in the GRB time series. No two GRBs are found to be identical to within the statistical limits of the data, regardless of their position on the sky. GRBs that appear statistically similar in one energy channel are either dissimilar in other energy channels or too dim for a significant statistical comparison. One consequence of the search for statistical similarities is that no gravitational lensing of GRBs by foreground galaxies has been found. Given this null result, and assuming no GRB rate density or luminosity evolution, and that QSO images are distorted by the same galaxy field, a conservative upper limit to the redshift is derived, zmax ~ 4.22 at a 2-σ confidence level, for an Einstein-de Sitter universe and bursts with P > 1 photon cm-2 s-1. The gravitational lensing rates have been computed for different Friedmann

  18. Statistical Analyses of Hydrophobic Interactions: A Mini-Review.

    PubMed

    Pratt, Lawrence R; Chaudhari, Mangesh I; Rempe, Susan B

    2016-07-14

    This review focuses on the striking recent progress in solving for hydrophobic interactions between small inert molecules. We discuss several new understandings. First, the inverse temperature phenomenology of hydrophobic interactions, i.e., strengthening of hydrophobic bonds with increasing temperature, is decisively exhibited by hydrophobic interactions between atomic-scale hard sphere solutes in water. Second, inclusion of attractive interactions associated with atomic-size hydrophobic reference cases leads to substantial, nontrivial corrections to reference results for purely repulsive solutes. Hydrophobic bonds are weakened by adding solute dispersion forces to treatment of reference cases. The classic statistical mechanical theory for those corrections is not accurate in this application, but molecular quasi-chemical theory shows promise. Finally, because of the masking roles of excluded volume and attractive interactions, comparisons that do not discriminate the different possibilities face an interpretive danger. PMID:27258151

  19. Scripts for TRUMP data analyses. Part II (HLA-related data): statistical analyses specific for hematopoietic stem cell transplantation.

    PubMed

    Kanda, Junya

    2016-01-01

    The Transplant Registry Unified Management Program (TRUMP) made it possible for members of the Japan Society for Hematopoietic Cell Transplantation (JSHCT) to analyze large sets of national registry data on autologous and allogeneic hematopoietic stem cell transplantation. However, as the processes used to collect transplantation information are complex and differed over time, the background of these processes should be understood when using TRUMP data. Previously, information on the HLA locus of patients and donors had been collected using a questionnaire-based free-description method, resulting in some input errors. To correct minor but significant errors and provide accurate HLA matching data, the use of a Stata or EZR/R script offered by the JSHCT is strongly recommended when analyzing HLA data in the TRUMP dataset. The HLA mismatch direction, mismatch counting method, and different impacts of HLA mismatches by stem cell source are other important factors in the analysis of HLA data. Additionally, researchers should understand the statistical analyses specific for hematopoietic stem cell transplantation, such as competing risk, landmark analysis, and time-dependent analysis, to correctly analyze transplant data. The data center of the JSHCT can be contacted if statistical assistance is required. PMID:26588927

  20. Deep stratospheric intrusions: a statistical assessment with model guided analyses

    NASA Astrophysics Data System (ADS)

    Elbern, H.; Kowol, J.; Sládkovic, R.; Ebel, A.

    A statistical assessment of deep intrusions of stratospheric air based on records of two adjacent mountain stations of the northern Alps at different altitudes is presented. Ten years of recordings of beryllium activity, ozone concentrations, and relative humidity at the Zugspitze summit (2962 m a.s.l.), as well as ozone and relative humidity at the Wank summit (1776 m a.s.l., 15 km distance) were analyzed. 195 stratospheric intrusion events could unambiguously be identified for the Zugspitze, whereas 85 intrusion events were found for the Wank. No event could be reliably identified at the valley floor station at Garmisch-Partenkirchen (740 m a.s.l.). There is a pronounced seasonal cycle of the frequency of events showing highest activity during fall, winter, and spring, whereas low activity is found during summer. By assessing average events it was possible to infer the monthly mean enrichment rate of the lower tropospheric ozone concentration by deep stratospheric intrusions. It was found that at least 3% of the ozone burden is replaced every month on an annual average. Three events of moderate strength were taken to be further analyzed by mesoscale meteorological model simulations with subsequent trajectory studies. In two cases the intrusion of stratospheric air was induced by tropopause foldings. In the third case a cut-off low with an associated fold was responsible for the increased exchange. All three cases revealed that the ingress of stratospheric air observed at the mountain station is a non-local process induced more than 2000 km away. Transport over these distances took about 2-4 days. Along the pathways through the tropopause the air parcels are shown to subside from the tip of the folds at 400-500 hPa down to about 700 hPa to reach the Zugspitze measurement station.

  1. Multivariate statistical analyses of groundwater surrounding Forty mile wash

    SciTech Connect

    Woocay, A.; Walton, J.C.

    2007-07-01

    Groundwater chemistry data from 211 sampling locations in the vicinity of Yucca Mountain, Nevada are analyzed using multivariate statistical methods in order to better understand groundwater chemical evolution, ascertain potential flow paths and determine hydrochemical facies. Correspondence analysis of the major ion chemistry is used to define relationships between and among major ions and sampling locations. A k-means cluster analysis is used to determine hydrochemical facies based on correspondence analysis dimensions. The derived dimensions and hydrochemical facies are presented as bi-plots and overlaid on a digital elevation model of the region giving a visual picture of potential interactions and flow paths. A distinct signature of the groundwater chemistry along the extended flow path of Fortymile Wash can be observed along with some potential interaction at possible fault lines near Highway I-95. The signature from Fortymile Wash is believed to represent the relict of water that infiltrated during past pluvial periods when the amount of runoff in the wash was significantly larger than during the current drier period. This hypothesis appears to be supported by hydrogen-2 and oxygen-18 data which indicate that younger groundwater is found in the upper part of the wash near Yucca Mountain and older groundwater is found in the lower region of the wash near Amargosa River. The range of the hydrogen-2 data corresponds to precipitation in a period of relatively cold climate and has a similar spatial signature to the oxygen-18 data. If the hypothesis that current groundwater chemistry primarily reflects past focused infiltration of surface runoff rather than regional groundwater migration is correct, then saturated zone transport from Yucca Mountain may be much slower than is currently anticipated. (authors)
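
    The workflow described above can be sketched as correspondence analysis of a sites-by-major-ions table (computed here via SVD of the standardized residual matrix), followed by k-means clustering of the site scores to define hydrochemical facies. The ion table below is random placeholder data, not the Yucca Mountain dataset, and the number of facies is an assumption.

```python
# Sketch of the workflow described above: correspondence analysis of a
# sites-by-major-ions table (via SVD of the standardized residual matrix),
# followed by k-means clustering of the site scores to define hydrochemical
# facies.  The ion table is random placeholder data, not the Yucca Mountain set.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
N = rng.uniform(1, 100, size=(211, 8))        # 211 sites x 8 major ions (hypothetical, meq/L)

P = N / N.sum()                               # correspondence matrix
r = P.sum(axis=1, keepdims=True)              # row masses
c = P.sum(axis=0, keepdims=True)              # column masses
S = (P - r @ c) / np.sqrt(r @ c)              # standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)
row_scores = (U * sv) / np.sqrt(r)            # principal coordinates of the sites

facies = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(row_scores[:, :2])
print("sites per hydrochemical facies:", np.bincount(facies))
```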

  2. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    ERIC Educational Resources Information Center

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…

  3. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    SciTech Connect

    Udey, Ruth Norma

    2013-01-01

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  4. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science

    PubMed Central

    Veldkamp, Coosje L. S.; Nuijten, Michèle B.; Dominguez-Alvarez, Linda; van Assen, Marcel A. L. M.; Wicherts, Jelte M.

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this ‘co-piloting’ currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors. PMID:25493918
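
    The automated consistency check described above amounts to recomputing the p-value implied by a reported test statistic and its degrees of freedom, then flagging disagreements with the reported p-value. The sketch below shows the idea for a t-test report; the "reported" numbers are invented for illustration and this is not the authors' procedure verbatim.

```python
# Sketch of the kind of automated consistency check described above:
# recompute the p-value implied by a reported test statistic and its degrees
# of freedom, and flag it if it disagrees with the reported p-value.
# The "reported" numbers here are invented for illustration.
from scipy import stats

def check_t_report(t_value, df, reported_p, two_sided=True, tol=0.01):
    recomputed = stats.t.sf(abs(t_value), df) * (2 if two_sided else 1)
    consistent = abs(recomputed - reported_p) <= tol
    return recomputed, consistent

# e.g. an article reporting "t(28) = 2.20, p = .04"
recomputed, ok = check_t_report(t_value=2.20, df=28, reported_p=0.04)
print(f"recomputed p = {recomputed:.3f}, consistent: {ok}")
```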

  5. Statistical Analyses of White-Light Flares: Two Main Results about Flare Behaviour

    NASA Astrophysics Data System (ADS)

    Dal, Hasan Ali

    2012-08-01

    We present two main results, based on models and the statistical analyses of 1672 U-band flares. We also discuss the behaviour of white-light flares. In addition, the parameters of the flares detected from two years of observations on CR Dra are presented. By comparing with flare parameters obtained from other UV Ceti-type stars, we examine the behaviour of the optical flare processes along with the spectral types. Moreover, we aimed, using large white-light flare data, to analyse the flare time-scales with respect to some results obtained from X-ray observations. Using SPSS V17.0 and GraphPad Prism V5.02 software, the flares detected from CR Dra were modelled with the OPEA function, and analysed with the t-Test method to compare similar flare events in other stars. In addition, using some regression calculations in order to derive the best histograms, the time-scales of white-light flares were analysed. Firstly, CR Dra flares have revealed that white-light flares behave in a similar way as their counterparts observed in X-rays. As can be seen in X-ray observations, the electron density seems to be a dominant parameter in white-light flare process, too. Secondly, the distributions of the flare time-scales demonstrate that the number of observed flares reaches a maximum value in some particular ratios, which are 0.5, or its multiples, and especially positive integers. The thermal processes might be dominant for these white-light flares, while non-thermal processes might be dominant in the others. To obtain better results for the behaviour of the white-light flare process along with the spectral types, many more stars in a wide spectral range, from spectral type dK5e to dM6e, must be observed in white-light flare patrols.

  6. SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit

    PubMed Central

    Chu, Annie; Cui, Jenny; Dinov, Ivo D.

    2011-01-01

    The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b). It has been proven that these resources can successfully improve students' learning (Dinov et al. 2008b). Being first published online in 2005, SOCR Analyses is a somewhat new component and it concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. As we have already implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, one-way and two-way ANOVA. Tests for sample comparisons include t-test in the parametric category. Some examples of SOCR Analyses' non-parametric tests are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test and Fligner-Killeen test. Hypothesis testing models include contingency table, Friedman's test and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for normal distribution. In this article, we present the design framework, computational implementation and the utilization of SOCR Analyses. PMID:21666874

  7. Transformation (normalization) of slope gradient and surface curvatures, automated for statistical analyses from DEMs

    NASA Astrophysics Data System (ADS)

    Csillik, O.; Evans, I. S.; Drăguţ, L.

    2015-03-01

    Automated procedures are developed to alleviate long tails in frequency distributions of morphometric variables. They minimize the skewness of slope gradient frequency distributions, and modify the kurtosis of profile and plan curvature distributions toward that of the Gaussian (normal) model. Box-Cox (for slope) and arctangent (for curvature) transformations are tested on nine digital elevation models (DEMs) of varying origin and resolution, and different landscapes, and shown to be effective. Resulting histograms are illustrated and show considerable improvements over those for previously recommended slope transformations (sine, square root of sine, and logarithm of tangent). Unlike previous approaches, the proposed method evaluates the frequency distribution of slope gradient values in a given area and applies the most appropriate transform if required. Sensitivity of the arctangent transformation is tested, showing that Gaussian-kurtosis transformations are acceptable also in terms of histogram shape. Cube root transformations of curvatures produced bimodal histograms. The transforms are applicable to morphometric variables and many others with skewed or long-tailed distributions. By avoiding long tails and outliers, they permit parametric statistics such as correlation, regression and principal component analyses to be applied, with greater confidence that requirements for linearity, additivity and even scatter of residuals (constancy of error variance) are likely to be met. It is suggested that such transformations should be routinely applied in all parametric analyses of long-tailed variables. Our Box-Cox and curvature automated transformations are based on a Python script, implemented as an easy-to-use script tool in ArcGIS.
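
    The two transformations discussed above can be sketched on synthetic values rather than a real DEM: a Box-Cox transform of slope gradient with the exponent chosen to minimize skewness, and an arctangent transform of curvature with the scale factor chosen to bring kurtosis toward the Gaussian value of 3. This is an illustrative reimplementation under those assumptions, not the authors' ArcGIS/Python script.

```python
# Sketch of the two transformations discussed above, on synthetic values rather
# than a real DEM: a Box-Cox transform of slope gradient with lambda chosen to
# minimize skewness, and an arctangent transform of curvature with the scale
# factor chosen to bring kurtosis close to the Gaussian value of 3.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(4)
slope_deg = rng.gamma(shape=1.5, scale=6.0, size=10000) + 0.01   # long-tailed, strictly positive
curvature = rng.standard_t(df=3, size=10000) * 0.02              # heavy-tailed, signed

# Box-Cox: pick lambda that minimizes the absolute skewness of the transformed slopes.
def abs_skew(lmbda):
    return abs(stats.skew(stats.boxcox(slope_deg, lmbda=lmbda)))
lam = optimize.minimize_scalar(abs_skew, bounds=(-2, 2), method="bounded").x
slope_t = stats.boxcox(slope_deg, lmbda=lam)

# Arctangent: pick the scale k that brings the (Pearson) kurtosis of atan(k*curvature) to 3.
def kurt_gap(k):
    return abs(stats.kurtosis(np.arctan(k * curvature), fisher=False) - 3.0)
k = optimize.minimize_scalar(kurt_gap, bounds=(1, 1000), method="bounded").x
curv_t = np.arctan(k * curvature)

print(f"lambda = {lam:.2f}, skewness after = {stats.skew(slope_t):.3f}")
print(f"k = {k:.1f}, kurtosis after = {stats.kurtosis(curv_t, fisher=False):.3f}")
```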

  8. Modelling and analysing track cycling Omnium performances using statistical and machine learning techniques.

    PubMed

    Ofoghi, Bahadorreza; Zeleznikow, John; Dwyer, Dan; Macmahon, Clare

    2013-01-01

    This article describes the utilisation of an unsupervised machine learning technique and statistical approaches (e.g., the Kolmogorov-Smirnov test) that assist cycling experts in the crucial decision-making processes for athlete selection, training, and strategic planning in the track cycling Omnium. The Omnium is a multi-event competition that will be included in the summer Olympic Games for the first time in 2012. Presently, selectors and cycling coaches make decisions based on experience and intuition. They rarely have access to objective data. We analysed both the old five-event (first raced internationally in 2007) and new six-event (first raced internationally in 2011) Omniums and found that the addition of the elimination race component to the Omnium has, contrary to expectations, not favoured track endurance riders. We analysed the Omnium data and also determined the inter-relationships between different individual events as well as between those events and the final standings of riders. In further analysis, we found that there is no maximum ranking (poorest performance) in each individual event that riders can afford whilst still winning a medal. We also found the required times for riders to finish the timed components that are necessary for medal winning. The results of this study consider the scoring system of the Omnium and inform decision-making toward successful participation in future major Omnium competitions. PMID:23320948

  9. Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu

    2013-01-01

    This report describes statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties that are characterized are the ultimate strength, modulus, and Poisson's ratio by using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.

  10. Multifractal and statistical analyses of heat release fluctuations in a spark ignition engine.

    PubMed

    Sen, Asok K; Litak, Grzegorz; Kaminski, Tomasz; Wendeker, Mirosław

    2008-09-01

    Using multifractal and statistical analyses, we have investigated the complex dynamics of cycle-to-cycle heat release variations in a spark ignition engine. Three different values of the spark advance angle (Delta beta) are examined. The multifractal complexity is characterized by the singularity spectrum of the heat release time series in terms of the Holder exponent. The broadness of the singularity spectrum gives a measure of the degree of multifractality or complexity of the time series. The broader the spectrum, the richer and more complex is the structure with a higher degree of multifractality. Using this broadness measure, the complexity in heat release variations is compared for the three spark advance angles (SAAs). Our results reveal that the heat release data are most complex for Delta beta=30 degrees followed in order by Delta beta=15 degrees and 5 degrees. In other words, the complexity increases with increasing SAA. In addition, we found that for all the SAAs considered, the heat release fluctuations behave like an antipersistent or a negatively correlated process, becoming more antipersistent with decreasing SAA. We have also performed a statistical analysis of the heat release variations by calculating the kurtosis of their probability density functions (pdfs). It is found that for the smallest SAA considered, Delta beta=5 degrees, the pdf is nearly Gaussian with a kurtosis of 3.42. As the value of the SAA increases, the pdf deviates from a Gaussian distribution and tends to be more peaked with larger values of kurtosis. In particular, the kurtosis has values of 3.94 and 6.69, for Delta beta=15 degrees and 30 degrees, respectively. A non-Gaussian density function with kurtosis in excess of 3 is indicative of intermittency. A larger value of kurtosis implies a higher degree of intermittency. PMID:19045453
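
    The kurtosis calculation used above to characterise intermittency is straightforward to reproduce; note that the Pearson convention (Gaussian = 3) is the one quoted in the abstract. The heat-release series in the sketch below is synthetic, not engine data.

```python
# Sketch of the kurtosis calculation used above to characterise intermittency.
# scipy's default is excess (Fisher) kurtosis, so fisher=False is needed to get
# the Pearson convention in which a Gaussian has kurtosis 3.  The heat-release
# series here is synthetic, not engine data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
heat_release = rng.standard_t(df=5, size=2000) * 15 + 350   # hypothetical cycle-to-cycle values (J)

kurt = stats.kurtosis(heat_release, fisher=False)
print(f"kurtosis = {kurt:.2f}  (values well above 3 indicate intermittency)")
```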

  11. Statistical Reform: Evidence-Based Practice, Meta-Analyses, and Single Subject Designs

    ERIC Educational Resources Information Center

    Jenson, William R.; Clark, Elaine; Kircher, John C.; Kristjansson, Sean D.

    2007-01-01

    Evidence-based practice approaches to interventions has come of age and promises to provide a new standard of excellence for school psychologists. This article describes several definitions of evidence-based practice and the problems associated with traditional statistical analyses that rely on rejection of the null hypothesis for the…

  12. Statistical analyses on the pattern of food consumption and digestive-tract cancers in Japan.

    PubMed

    Hara, N; Sakata, K; Nagai, M; Fujita, Y; Hashimoto, T; Yanagawa, H

    1984-01-01

    The relationships between areal differences in mortality from six digestive-tract cancers and consumption of selected foods in 46 of the 47 Japanese prefectures (Okinawa being excluded) were analyzed. Statistical analyses disclosed that the groups of foods positively associated with cancer death were as follows: for esophageal cancer, pork, oil, popular-grade sake, and green tea; for stomach cancer, fresh fish, salted or dried fish, salt, and special-grade sake; for colon cancer, bread, milk, butter, margarine, ketchup, beer, and salted or dried fish; for rectal cancer, fresh fish, salted or dried fish, salt, and popular-grade sake; for cancer of the biliary passages, pork, popular-grade sake, and green tea; and for pancreatic cancer, oil, mayonnaise, fresh fish, and salted or dried fish. These results are based on statistical analyses. Further epidemiological analyses are required to find a biological causal relationship. PMID:6545578

  13. Radiation Induced Chromatin Conformation Changes Analysed by Fluorescent Localization Microscopy, Statistical Physics, and Graph Theory

    PubMed Central

    Müller, Patrick; Hillebrandt, Sabina; Krufczik, Matthias; Bach, Margund; Kaufmann, Rainer; Hausmann, Michael; Heermann, Dieter W.

    2015-01-01

    It has been well established that the architecture of chromatin in cell nuclei is not random but functionally correlated. Chromatin damage caused by ionizing radiation raises complex repair machineries. This is accompanied by local chromatin rearrangements and structural changes which may for instance improve the accessibility of damaged sites for repair protein complexes. Using stably transfected HeLa cells expressing either green fluorescent protein (GFP) labelled histone H2B or yellow fluorescent protein (YFP) labelled histone H2A, we investigated the positioning of individual histone proteins in cell nuclei by means of high resolution localization microscopy (Spectral Position Determination Microscopy = SPDM). The cells were exposed to ionizing radiation of different doses and aliquots were fixed after different repair times for SPDM imaging. In addition to the repair dependent histone protein pattern, the positioning of antibodies specific for heterochromatin and euchromatin was separately recorded by SPDM. The present paper aims to provide a quantitative description of structural changes of chromatin after irradiation and during repair. It introduces a novel approach to analyse SPDM images by means of statistical physics and graph theory. The method is based on the calculation of the radial distribution functions as well as edge length distributions for graphs defined by a triangulation of the marker positions. The obtained results show that through the cell nucleus the different chromatin re-arrangements as detected by the fluorescent nucleosomal pattern average themselves. In contrast heterochromatic regions alone indicate a relaxation after radiation exposure and re-condensation during repair whereas euchromatin seemed to be unaffected or behave contrarily. SPDM in combination with the analysis techniques applied allows the systematic elucidation of chromatin re-arrangements after irradiation and during repair, if selected sub-regions of nuclei are

  14. Detailed statistical contact angle analyses; "slow moving" drops on inclining silicon-oxide surfaces.

    PubMed

    Schmitt, M; Groß, K; Grub, J; Heib, F

    2015-06-01

    Contact angle determination by sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid which are correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist which are often not comprehensible or reproducible. Therefore one of the most important things in this area is to build standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques to analyse dynamic contact angle measurements (sessile drop) in detail which are applicable for axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point) but also the dependent analysis will be firstly explained in detail. These approaches lead to contact angle data and different access on specific contact angles which are independent from "user-skills" and subjectivity of the operator. As example the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is dynamically measured by sessile drop technique when inclining the sample plate. The triple points, the inclination angles, the downhill (advancing motion) and the uphill angles (receding motion) obtained by high-precision drop shape analysis are independently and dependently statistically analysed. Due to the small covered distance for the dependent analysis (<0.4mm) and the dominance of counted events with small velocity the measurements are less influenced by motion dynamics and the procedure can be called "slow moving" analysis. The presented procedures as performed are especially sensitive to the range which reaches from the static to the "slow moving" dynamic contact angle determination. They are characterised by
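
    The sigmoid-fit idea mentioned above can be sketched as fitting a logistic function to the advancing (downhill) contact angle as a function of plate inclination by non-linear least squares. The function form, parameter names and measurements below are assumptions for illustration, not the authors' exact fit procedure.

```python
# Sketch of the sigmoid-fit idea mentioned above: fit a logistic function to the
# downhill (advancing) contact angle as a function of plate inclination using
# non-linear least squares.  The measurements are simulated placeholders.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(alpha, theta_lo, theta_hi, alpha0, width):
    return theta_lo + (theta_hi - theta_lo) / (1.0 + np.exp(-(alpha - alpha0) / width))

rng = np.random.default_rng(6)
inclination = np.linspace(0, 40, 80)                                # degrees
true = sigmoid(inclination, 62.0, 78.0, 15.0, 4.0)
measured_angle = true + rng.normal(0, 0.5, size=inclination.size)   # simulated advancing angles

popt, _ = curve_fit(sigmoid, inclination, measured_angle, p0=[60, 80, 15, 5])
print("fitted [theta_lo, theta_hi, alpha0, width]:", np.round(popt, 2))
```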

  15. Statistical Analyses of Second Indoor Bio-Release Field Evaluation Study at Idaho National Laboratory

    SciTech Connect

    Amidan, Brett G.; Pulsipher, Brent A.; Matzke, Brett D.

    2009-12-17

    In September 2008 a large-scale testing operation (referred to as the INL-2 test) was performed within a two-story building (PBF-632) at the Idaho National Laboratory (INL). The report “Operational Observations on the INL-2 Experiment” defines the seven objectives for this test and discusses the results and conclusions. This is further discussed in the introduction of this report. The INL-2 test consisted of five tests (events) in which a floor (level) of the building was contaminated with the harmless biological warfare agent simulant Bg and samples were taken in most, if not all, of the rooms on the contaminated floor. After the sampling, the building was decontaminated, and the next test performed. Judgmental samples and probabilistic samples were determined and taken during each test. Vacuum, wipe, and swab samples were taken within each room. The purpose of this report is to study an additional four topics that were not within the scope of the original report. These topics are: 1) assess the quantitative assumptions about the data being normally or log-normally distributed; 2) evaluate differences and quantify the sample to sample variability within a room and across the rooms; 3) perform geostatistical types of analyses to study spatial correlations; and 4) quantify the differences observed between surface types and sampling methods for each scenario and study the consistency across the scenarios. The following four paragraphs summarize the results of each of the four additional analyses. All samples after decontamination came back negative. Because of this, it was not appropriate to determine if these clearance samples were normally distributed. As Table 1 shows, the characterization data consists of values between and inclusive of 0 and 100 CFU/cm2 (100 was the value assigned when the number is too numerous to count). The 100 values are generally much bigger than the rest of the data, causing the data to be right skewed. There are also a significant

  16. Testing for Additivity at Select Mixture Groups of Interest Based on Statistical Equivalence Testing Methods

    SciTech Connect

    Stork, LeAnna M.; Gennings, Chris; Carchman, Richard; Carter, Jr., Walter H.; Pounds, Joel G.; Mumtaz, Moiz

    2006-12-01

    Several assumptions, defined and undefined, are used in the toxicity assessment of chemical mixtures. In scientific practice mixture components in the low-dose region, particularly subthreshold doses, are often assumed to behave additively (i.e., zero interaction) based on heuristic arguments. This assumption has important implications in the practice of risk assessment, but has not been experimentally tested. We have developed methodology to test for additivity in the sense of Berenbaum (Advances in Cancer Research, 1981), based on the statistical equivalence testing literature where the null hypothesis of interaction is rejected for the alternative hypothesis of additivity when data support the claim. The implication of this approach is that conclusions of additivity are made with a false positive rate controlled by the experimenter. The claim of additivity is based on prespecified additivity margins, which are chosen using expert biological judgment such that small deviations from additivity, which are not considered to be biologically important, are not statistically significant. This approach is in contrast to the usual hypothesis-testing framework that assumes additivity in the null hypothesis and rejects when there is significant evidence of interaction. In this scenario, failure to reject may be due to lack of statistical power making the claim of additivity problematic. The proposed method is illustrated in a mixture of five organophosphorus pesticides that were experimentally evaluated alone and at relevant mixing ratios. Motor activity was assessed in adult male rats following acute exposure. Four low-dose mixture groups were evaluated. Evidence of additivity is found in three of the four low-dose mixture groups. The proposed method tests for additivity of the whole mixture and does not take into account subset interactions (e.g., synergistic, antagonistic) that may have occurred and cancelled each other out.
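
    The equivalence-testing logic described above can be sketched with two one-sided tests: additivity is claimed only when the confidence interval for the departure of the observed mixture response from the additivity prediction lies inside prespecified margins. All numbers below are placeholders, not the pesticide data, and the one-sample t formulation is a simplification of the paper's method.

```python
# Sketch of the equivalence-testing logic described above (two one-sided tests):
# additivity is claimed only when the confidence interval for the departure of
# the observed mixture response from the additivity prediction lies inside
# prespecified margins.  All numbers are placeholders, not the pesticide data.
import numpy as np
from scipy import stats

observed_mean, observed_se, n = 98.0, 3.0, 12     # motor activity in a mixture group (hypothetical)
additivity_prediction = 100.0                     # response predicted under dose additivity (hypothetical)
margin = 10.0                                     # additivity margin from biological judgment (hypothetical)

departure = observed_mean - additivity_prediction
df = n - 1
t_lower = (departure - (-margin)) / observed_se   # H0: departure <= -margin
t_upper = (departure - margin) / observed_se      # H0: departure >= +margin
p_tost = max(stats.t.sf(t_lower, df), stats.t.cdf(t_upper, df))
print(f"TOST p-value = {p_tost:.3f}  (p < 0.05 supports additivity within the margins)")
```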

  17. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    SciTech Connect

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
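
    Three of the scatterplot tests listed above are easy to illustrate on a single input/output pair from a synthetic Monte Carlo sample: a linear correlation, a rank correlation, and a Kruskal-Wallis test for trends in central tendency across bins of the input. The sketch below is not the authors' code or data.

```python
# Sketch of three of the scatterplot tests listed above, applied to one
# input/output pair from a synthetic Monte Carlo sample: linear correlation,
# rank correlation, and a Kruskal-Wallis test for trends in central tendency
# across bins of the input.  Not the authors' code or data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.uniform(0, 1, 300)                       # sampled input variable (hypothetical)
y = np.sqrt(x) + rng.normal(0, 0.15, 300)        # model output with a monotonic, nonlinear trend

pearson_r, p_lin = stats.pearsonr(x, y)          # (1) linear relationship
spearman_r, p_mono = stats.spearmanr(x, y)       # (2) monotonic relationship
bins = np.digitize(x, np.quantile(x, [0.2, 0.4, 0.6, 0.8]))
groups = [y[bins == b] for b in range(5)]
kw_stat, p_kw = stats.kruskal(*groups)           # (3) trend in central tendency across x-bins

print(f"Pearson r = {pearson_r:.2f} (p={p_lin:.1e}), "
      f"Spearman rho = {spearman_r:.2f} (p={p_mono:.1e}), "
      f"Kruskal-Wallis p = {p_kw:.1e}")
```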

  18. A guide to statistical analysis in microbial ecology: a community-focused, living review of multivariate data analyses.

    PubMed

    Buttigieg, Pier Luigi; Ramette, Alban

    2014-12-01

    The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynamic, web-based resource providing accessible descriptions of numerous multivariate techniques relevant to microbial ecologists. A combination of interactive elements allows users to discover and navigate between methods relevant to their needs and examine how they have been used by others in the field. We have designed GUSTA ME to become a community-led and -curated service, which we hope will provide a common reference and forum to discuss and disseminate analytical techniques relevant to the microbial ecology community. PMID:25314312

  19. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    NASA Astrophysics Data System (ADS)

    Clerc, F.; Njiki-Menga, G.-H.; Witschger, O.

    2013-04-01

    Most of the measurement strategies that are suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring, in real time, airborne particle concentrations (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature. These range from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and an appropriate and robust method is still being sought. In this context, this exploratory study investigates a statistical method to analyse time resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data from a workplace study that investigated the potential for exposure via inhalation from cleanout operations by sandpapering of a reactor producing nanocomposite thin films have been used. In this workplace study, the background issue has been addressed through the near-field and far-field approaches, and several size-integrated and time-resolved devices have been used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other was measuring in parallel in the far field. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. The probability distributions issuing from time resolved data obtained at the source can be compared with the probability distributions issuing from the time resolved data obtained far-field, leading in a

  20. A Weighted U Statistic for Genetic Association Analyses of Sequencing Data

    PubMed Central

    Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J.; Lu, Qing

    2014-01-01

    With advancements in next generation sequencing technology, a massive amount of sequencing data are generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. The association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a weighted U statistic, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a non-parametric U statistic, WU-SEQ makes no assumption of the underlying disease model and phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used SKAT method when the underlying assumptions were violated (e.g., the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL 4 and very low density lipoprotein cholesterol. PMID:25331574
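
    The following is an illustrative, generic weighted U statistic in the spirit described above, summing pairwise phenotype similarities weighted by genotype similarity; it is not the published WU-SEQ implementation, and the kernels and data are hypothetical. In practice, significance would be assessed by permutation of the phenotype.

```python
# Illustrative generic weighted U statistic (not the published WU-SEQ code):
# pairwise phenotype similarity weighted by genotype similarity.
import numpy as np

rng = np.random.default_rng(1)
n, p_variants = 100, 50
G = rng.binomial(2, 0.02, size=(n, p_variants))   # rare-variant genotype matrix
y = rng.standard_t(df=3, size=n)                  # heavy-tailed phenotype

ranks = np.argsort(np.argsort(y)) / (n - 1)       # rank-based phenotype score in [0, 1]
K = G @ G.T                                       # simple genotype similarity kernel

U = 0.0
for i in range(n):
    for j in range(i + 1, n):
        # pairs with similar phenotypes and similar genotypes contribute most
        U += K[i, j] * (1.0 - abs(ranks[i] - ranks[j]))

print(f"Weighted U statistic: {U:.1f}")
```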

  1. Statistically significant deviations from additivity: What do they mean in assessing toxicity of mixtures?

    PubMed

    Liu, Yang; Vijver, Martina G; Qiu, Hao; Baas, Jan; Peijnenburg, Willie J G M

    2015-12-01

    There is increasing attention from scientists and policy makers to the joint effects of multiple metals on organisms when present in a mixture. Using root elongation of lettuce (Lactuca sativa L.) as a toxicity endpoint, the combined effects of binary mixtures of Cu, Cd, and Ni were studied. The statistical MixTox model was used to search for deviations from the reference models, i.e. concentration addition (CA) and independent action (IA). The deviations were subsequently interpreted as 'interactions'. A comprehensive experiment was designed to test the reproducibility of the 'interactions'. The results showed that the toxicity of binary metal mixtures was equally well predicted by both reference models. We found statistically significant 'interactions' in four of the five datasets. However, the patterns of 'interactions' were found to be inconsistent or even contradictory across the different independent experiments. It is recommended that a statistically significant 'interaction' be treated with care, as it is not necessarily biologically relevant. Searching for a statistically significant interaction can be the starting point for further measurements and modeling to advance the understanding of the underlying mechanisms and non-additive interactions occurring inside the organisms. PMID:26188643
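
    The two reference models named above can be sketched for a hypothetical binary mixture as follows; the log-logistic dose-response parameters and concentrations are invented for illustration and do not reproduce the MixTox fitting or its deviation terms.

```python
# Sketch of concentration addition (CA) and independent action (IA)
# predictions for a hypothetical binary metal mixture.
import numpy as np
from scipy.optimize import brentq

def effect(c, ec50, slope):
    """Log-logistic single-metal effect (fraction of inhibition)."""
    return 1.0 / (1.0 + (ec50 / c) ** slope)

ec50 = {"Cu": 1.0, "Cd": 0.5}     # hypothetical EC50s (mg/L)
slope = {"Cu": 2.0, "Cd": 1.5}
conc = {"Cu": 0.4, "Cd": 0.2}     # hypothetical mixture concentrations

# Independent action: E = 1 - prod(1 - E_i)
e_ia = 1.0 - np.prod([1.0 - effect(conc[m], ec50[m], slope[m]) for m in conc])

# Concentration addition: solve sum_i c_i / EC_x,i = 1 for the mixture effect x,
# where EC_x,i = ec50_i * (x / (1 - x)) ** (1 / slope_i) for a log-logistic curve
def ca_residual(x):
    return sum(conc[m] / (ec50[m] * (x / (1.0 - x)) ** (1.0 / slope[m]))
               for m in conc) - 1.0

e_ca = brentq(ca_residual, 1e-6, 1 - 1e-6)
print(f"IA predicted effect: {e_ia:.2f}, CA predicted effect: {e_ca:.2f}")
```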

  2. Multivariate Statistical Analyses Demonstrate Unique Host Immune Responses to Single and Dual Lentiviral Infection

    PubMed Central

    Chiaromonte, Francesca; Terwee, Julie; VandeWoude, Sue; Bjornstad, Ottar; Poss, Mary

    2009-01-01

    Background Feline immunodeficiency virus (FIV) and human immunodeficiency virus (HIV) are recently identified lentiviruses that cause progressive immune decline and ultimately death in infected cats and humans. It is of great interest to understand how to prevent immune system collapse caused by these lentiviruses. We recently described that disease caused by a virulent FIV strain in cats can be attenuated if animals are first infected with a feline immunodeficiency virus derived from a wild cougar. The detailed temporal tracking of cat immunological parameters in response to two viral infections resulted in high-dimensional datasets containing variables that exhibit strong co-variation. Initial analyses of these complex data using univariate statistical techniques did not account for interactions among immunological response variables and therefore potentially obscured significant effects between infection state and immunological parameters. Methodology and Principal Findings Here, we apply a suite of multivariate statistical tools, including Principal Component Analysis, MANOVA and Linear Discriminant Analysis, to temporal immunological data resulting from FIV superinfection in domestic cats. We investigated the co-variation among immunological responses, the differences in immune parameters among four groups of five cats each (uninfected, single and dual infected animals), and the “immune profiles” that discriminate among them over the first four weeks following superinfection. Dual infected cats mount an immune response by 24 days post superinfection that is characterized by elevated levels of CD8 and CD25 cells and increased expression of IL4, IFNγ, and FAS. This profile discriminates dual infected cats from cats infected with FIV alone, which show high IL-10 and lower numbers of CD8 and CD25 cells. Conclusions Multivariate statistical analyses demonstrate both the dynamic nature of the immune response to FIV single and dual infection and the
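
    A minimal sketch of the PCA-then-LDA part of such a workflow, on a hypothetical matrix of immunological parameters for four groups of five animals, might look like this; the group labels, parameter columns, and effect sizes are invented.

```python
# Sketch of a PCA -> LDA workflow on a hypothetical immune-parameter matrix
# (20 animals x 12 parameters); labels and effect sizes are invented.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
groups = np.repeat(["uninfected", "single_A", "single_B", "dual"], 5)
X = rng.normal(size=(20, 12))            # e.g. CD8, CD25, cytokine expression, ...
X[groups == "dual", :3] += 1.5           # hypothetical shift in the dual-infected group

scores = PCA(n_components=5).fit_transform(X)       # summarise co-variation
lda = LinearDiscriminantAnalysis().fit(scores, groups)
print("LDA training accuracy:", lda.score(scores, groups))
```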

  3. On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis

    PubMed Central

    Li, Bing; Chun, Hyonho; Zhao, Hongyu

    2014-01-01

    We introduce a nonparametric method for estimating non-gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use a one-dimensional kernel regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis. PMID:26401064

  4. Systematic Mapping and Statistical Analyses of Valley Landform and Vegetation Asymmetries Across Hydroclimatic Gradients

    NASA Astrophysics Data System (ADS)

    Poulos, M. J.; Pierce, J. L.; McNamara, J. P.; Flores, A. N.; Benner, S. G.

    2015-12-01

    Terrain aspect alters the spatial distribution of insolation across topography, driving eco-pedo-hydro-geomorphic feedbacks that can alter landform evolution and result in valley asymmetries for a suite of land surface characteristics (e.g. slope length and steepness, vegetation, soil properties, and drainage development). Asymmetric valleys serve as natural laboratories for studying how landscapes respond to climate perturbation. In the semi-arid montane granodioritic terrain of the Idaho batholith, Northern Rocky Mountains, USA, prior work indicates that reduced insolation on northern (pole-facing) aspects prolongs snowpack persistence and is associated with thicker, finer-grained soils that retain more water, prolong the growing season, support coniferous forest rather than sagebrush steppe ecosystems, stabilize slopes at steeper angles, and produce sparser drainage networks. We hypothesize that the primary drivers of valley asymmetry development are changes in the pedon-scale water balance that coalesce to alter catchment-scale runoff and drainage development, and ultimately cause the divide between north- and south-facing land surfaces to migrate northward. We explore this conceptual framework by coupling land surface analyses with statistical modeling to assess relationships and the relative importance of land surface characteristics. Throughout the Idaho batholith, we systematically mapped and tabulated various statistical measures of landforms, land cover, and hydroclimate within discrete valley segments (n=~10,000). We developed a random-forest-based statistical model to predict valley slope asymmetry based upon numerous measures (n>300) of landscape asymmetries. Preliminary results suggest that drainages are tightly coupled with hillslopes throughout the region, with drainage-network slope being one of the strongest predictors of land-surface-averaged slope asymmetry. When slope-related statistics are excluded, due to possible autocorrelation, valley
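
    A hypothetical sketch of the random-forest step, predicting valley slope asymmetry from a few other asymmetry measures and reporting variable importances, is given below; the predictors and data are invented and do not represent the >300 measures used in the study.

```python
# Hypothetical sketch: random forest predicting valley slope asymmetry from a
# few invented asymmetry measures, with variable importances.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n = 1000
drainage_slope_asym = rng.normal(size=n)
vegetation_asym = rng.normal(size=n)
insolation_asym = rng.normal(size=n)
slope_asym = 0.7 * drainage_slope_asym + 0.2 * vegetation_asym + rng.normal(0, 0.3, n)

X = np.column_stack([drainage_slope_asym, vegetation_asym, insolation_asym])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, slope_asym)
for name, imp in zip(["drainage_slope", "vegetation", "insolation"],
                     rf.feature_importances_):
    print(f"{name}: importance {imp:.2f}")
```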

  5. How does the Danish Groundwater Monitoring Programme support statistical consistent nitrate trend analyses in groundwater?

    NASA Astrophysics Data System (ADS)

    Hansen, Birgitte; Thorling, Lærke; Sørensen, Brian; Dalgaard, Tommy; Erlandsen, Mogens

    2013-04-01

    The overall aim of performing nitrate trend analyses in oxic groundwater is to document the effect of regulation of Danish agriculture on N pollution. The design of the Danish Groundwater Monitoring Programme is presented and discussed in relation to the performance of statistically consistent nitrate trend analyses. Three types of data are crucial. Firstly, long and continuous time-series from the national groundwater monitoring network enable a statistically systematic analysis of distribution, trends and trend reversals in the groundwater nitrate concentration. Secondly, knowledge about the N surplus in Danish agriculture since 1950 from Statistics Denmark is used as an indicator of the potential loss of N. Thirdly, groundwater recharge age determinations are performed in order to allow linking of the first two datasets. Recent results published in Hansen et al. (2011 & 2012) will be presented. Since the 1980s, regulations implemented by Danish farmers have succeeded in optimizing the N (nitrogen) management at farm level. As a result, the upward agricultural N surplus trend has been reversed, and the N surplus has been reduced by 30-55% from 1980 to 2007 depending on region. The reduction in the N surplus served to reduce the losses of N from agriculture, with documented positive effects on nature and the environment in Denmark. In groundwater, the upward trend in nitrate concentrations was reversed around 1980, and a larger number of downward nitrate trends were seen in the youngest groundwater compared with the oldest groundwater. However, on average, approximately 48% of the oxic monitored groundwater has nitrate concentrations above the groundwater and drinking water standards of 50 mg/l. Furthermore, trend analyses show that 33% of all the monitored groundwater has upward nitrate trends, while only 18% of the youngest groundwater has upward nitrate trends according to data sampled from 1988-2009. A regional analysis shows a correlation between a high level of N
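
    As an illustration of one way such trends could be tested, the sketch below applies a nonparametric Mann-Kendall-style trend test to a single synthetic nitrate time series; the choice of test and the data are assumptions for illustration and are not necessarily those used in the cited studies.

```python
# Minimal Mann-Kendall-style trend test on one synthetic nitrate series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
years = np.arange(1988, 2010)
nitrate = 55 - 0.4 * (years - 1988) + rng.normal(0, 3, len(years))  # mg/L

s = sum(np.sign(nitrate[j] - nitrate[i])
        for i in range(len(nitrate)) for j in range(i + 1, len(nitrate)))
n = len(nitrate)
var_s = n * (n - 1) * (2 * n + 5) / 18.0        # variance of S, ignoring ties
z = (s - np.sign(s)) / np.sqrt(var_s)           # continuity-corrected statistic
p = 2 * (1 - stats.norm.cdf(abs(z)))
print(f"Mann-Kendall S={s:.0f}, z={z:.2f}, p={p:.3f}")
```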

  6. Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.

    PubMed

    Schmitt, M; Grub, J; Heib, F

    2015-06-01

    Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid. Especially on horizontal surfaces, even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we have developed approaches based on sigmoid fitting and on independent and dependent statistical analyses, which are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data that are independent of user skill and operator subjectivity, which is also urgently needed for the evaluation of dynamic contact angle measurements. We will show in this contribution that the slightly modified procedures are also applicable to find specific angles for experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are dynamically measured by the sessile drop technique while the volume of the liquid is increased/decreased. The triple points, the time, and the contact angles during the advancing and the receding of the drop, obtained by high-precision drop shape analysis, are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis due to the small covered distance and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawing of the liquid, are identifiable, which confirms the flatness and the chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. PMID:25524007

  7. ADDITIONAL STRESS AND FRACTURE MECHANICS ANALYSES OF PRESSURIZED WATER REACTOR PRESSURE VESSEL NOZZLES

    SciTech Connect

    Walter, Matthew; Yin, Shengjun; Stevens, Gary; Sommerville, Daniel; Palm, Nathan; Heinecke, Carol

    2012-01-01

    In past years, the authors have undertaken various studies of nozzles in both boiling water reactors (BWRs) and pressurized water reactors (PWRs) located in the reactor pressure vessel (RPV) adjacent to the core beltline region. Those studies described stress and fracture mechanics analyses performed to assess various RPV nozzle geometries, which were selected based on their proximity to the core beltline region, i.e., those nozzle configurations that are located close enough to the core region such that they may receive sufficient fluence prior to end-of-life (EOL) to require evaluation of embrittlement as part of the RPV analyses associated with pressure-temperature (P-T) limits. In this paper, additional stress and fracture analyses are summarized that were performed for additional PWR nozzles with the following objectives: To expand the population of PWR nozzle configurations evaluated, which was limited in the previous work to just two nozzles (one inlet and one outlet nozzle). To model and understand differences in stress results obtained for an internal pressure load case using a two-dimensional (2-D) axi-symmetric finite element model (FEM) vs. a three-dimensional (3-D) FEM for these PWR nozzles. In particular, the ovalization (stress concentration) effect of two intersecting cylinders, which is typical of RPV nozzle configurations, was investigated. To investigate the applicability of previously recommended linear elastic fracture mechanics (LEFM) hand solutions for calculating the Mode I stress intensity factor for a postulated nozzle corner crack for pressure loading for these PWR nozzles. These analyses were performed to further expand earlier work completed to support potential revision and refinement of Title 10 to the U.S. Code of Federal Regulations (CFR), Part 50, Appendix G, Fracture Toughness Requirements, and are intended to supplement similar evaluation of nozzles presented at the 2008, 2009, and 2011 Pressure Vessels and Piping (PVP

  8. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    SciTech Connect

    Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  9. Review of Statistical Analyses Resulting from Performance of HLDWD- DWPF-005

    SciTech Connect

    Beck, R.S.

    1997-09-29

    The Engineering Department at the Defense Waste Processing Facility (DWPF) has reviewed two reports from the Statistical Consulting Section (SCS) involving the statistical analysis of test results for analysis of small sample inserts (references 1 & 2). The test results cover two proposed analytical methods, a room temperature hydrofluoric acid preparation (Cold Chem) and a sodium peroxide/sodium hydroxide fusion modified for insert samples (Modified Fusion). The reports support implementation of the proposed small sample containers and analytical methods at DWPF. Hydragard sampler valve performance was typical of previous results (reference 3). Using an element from each major feed stream, lithium from the frit and iron from the sludge, the sampler was determined to deliver a uniform mixture in either sample container. The lithium to iron ratios were equivalent for the standard 15 ml vial and the 3 ml insert. The proposed methods provide equivalent analyses as compared to the current methods. The biases associated with the proposed methods on a vitrified basis are less than 5% for major elements. The sum of oxides for the proposed method compares favorably with the sum of oxides for the conventional methods. However, the average sum of oxides for the Cold Chem method was 94.3%, which is below the minimum required recovery of 95%. Both proposed methods, Cold Chem and Modified Fusion, will be required at first to provide an accurate analysis which will routinely meet the 95% and 105% average sum of oxides limits for the Product Composition Control System (PCCS). Issues to be resolved during phased implementation are as follows: (1) Determine the calcine/vitrification factor for radioactive feed; (2) Evaluate the covariance matrix change against process operating ranges to determine optimum sample size; (3) Evaluate sources for low sum of oxides; and (4) Improve remote operability of production versions of equipment and instruments for installation in 221-S. The specifics of

  10. Evaluation of Leymus chinensis quality using near-infrared reflectance spectroscopy with three different statistical analyses

    PubMed Central

    Chen, Jishan; Zhu, Ruifen; Xu, Ruixuan; Zhang, Wenjun; Shen, Yue

    2015-01-01

    Due to a boom in the dairy industry in Northeast China, the hay industry has been developing rapidly. Thus, it is very important to evaluate hay quality with a rapid and accurate method. In this research, a novel technique that combines near infrared spectroscopy (NIRs) with three different statistical analyses (MLR, PCR and PLS) was used to predict the chemical quality of sheepgrass (Leymus chinensis) in Heilongjiang Province, China, including the concentrations of crude protein (CP), acid detergent fiber (ADF), and neutral detergent fiber (NDF). Firstly, linear partial least squares regression (PLS) was performed on the spectra and the predictions were compared to those from laboratory-based recorded spectra. The MLR evaluation method for CP has the potential to be used for industry requirements, as it needs less sophisticated and cheaper instrumentation using only a few wavelengths. Results show that, in terms of CP, ADF and NDF, (i) the prediction accuracy using PLS was obviously improved compared to the PCR algorithm, and comparable or even better than results generated using the MLR algorithm; (ii) the predictions were worse compared to laboratory-based spectra with the MLR algorithm, and poor predictions (R2 = 0.62, RPD = 0.9) were obtained using MLR in terms of NDF; (iii) a satisfactory accuracy was obtained with the PLS method, with R2 and RPD of 0.91 and 3.2 for CP, 0.89 and 3.1 for ADF, and 0.88 and 3.0 for NDF, respectively. Our results highlight that the combined NIRs-PLS method could be applied as a valuable technique to rapidly and accurately evaluate the quality of sheepgrass hay. PMID:26644973
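
    A sketch of the PLS calibration step, regressing a laboratory-measured constituent such as CP on NIR spectra and reporting cross-validated R2 and RPD, is given below; the spectra and reference values are synthetic, not the sheepgrass data of the study.

```python
# Sketch of a PLS calibration: regress a lab-measured constituent (e.g. CP)
# on NIR spectra; spectra and reference values are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 120, 700
spectra = rng.normal(size=(n_samples, n_wavelengths))
cp = 10 + 2.0 * spectra[:, 100] + 1.5 * spectra[:, 450] + rng.normal(0, 0.5, n_samples)

pls = PLSRegression(n_components=8)
cp_pred = cross_val_predict(pls, spectra, cp, cv=10).ravel()

ss_res = np.sum((cp - cp_pred) ** 2)
ss_tot = np.sum((cp - cp.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
rpd = cp.std() / np.sqrt(ss_res / n_samples)   # ratio of performance to deviation
print(f"Cross-validated R2 = {r2:.2f}, RPD = {rpd:.1f}")
```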

  11. Compositional provinces of Mars from statistical analyses of TES, GRS, OMEGA and CRISM data

    NASA Astrophysics Data System (ADS)

    Rogers, A. Deanne; Hamilton, Victoria E.

    2015-01-01

    We identified 10 distinct classes of mineral assemblage on Mars through statistical analyses of mineral abundances derived from Mars Global Surveyor Thermal Emission Spectrometer (TES) data at a spatial resolution of 8 pixels per degree. Two classes are new regions in Sinus Meridiani and northern Hellas basin. Except for crystalline hematite abundance, Sinus Meridiani exhibits compositional characteristics similar to Meridiani Planum; these two regions may share part of a common history. The northern margin of Hellas basin lacks olivine and high-Ca pyroxene compared to terrains just outside the Hellas outer ring; this may reflect a difference in crustal compositions and/or aqueous alteration. Hesperian highland volcanic terrains are largely mapped into one class. These terrains exhibit low-to-intermediate potassium and thorium concentrations (from Gamma Ray Spectrometer (GRS) data) compared to older highland terrains, indicating differences in the complexity of processes affecting mantle melts between these different-aged terrains. A previously reported, locally observed trend toward decreasing proportions of low-calcium pyroxene relative to total pyroxene with time is also apparent over the larger scales of our study. Spatial trends in olivine and pyroxene abundance are consistent with those observed in near-infrared data sets. Generally, regions that are distinct in TES data also exhibit distinct elemental characteristics in GRS data, suggesting that surficial coatings are not the primary control on TES mineralogical variations, but rather reflect regional differences in igneous and large-scale sedimentary/glacial processes. Distinct compositions measured over large, low-dust regions from multiple data sets indicate that global homogenization of unconsolidated surface materials has not occurred.

  12. Statistical analyses of nuclear waste level measurements to estimate retained gas volumes

    NASA Astrophysics Data System (ADS)

    Whitney, Paul D.; Chen, Guang

    1999-01-01

    The Hanford site is home to 177 large, underground nuclear waste storage tanks. Numerous safety and environmental concerns surround these tanks and their contents. One such concern is the propensity for the waste in these tanks to generate and retain flammable gases. The surface level of the waste in these tanks is routinely monitored to assess whether the tanks are leaking. For some of the tanks, the waste surface level measurements synchronously fluctuated with atmospheric pressure changes. The current best explanation for these synchronous fluctuations is that the waste contains gas-phase material that changes volume in response to the atmospheric pressure changes. This paper describes: (1) The exploratory data analysis that led to the discovery of the phenomena; (2) A physical model based on the ideal gas law that explains the phenomena. Additionally, the model allows one to obtain estimates of the retained gas volume in the tank waste; (3) A statistical procedure for detecting retained gas based on the physical model and tank surface level measurements; and (4) A Kalman filter model for analyzing the dynamics of retained gas. It is also shown how the filter can be used to detect abrupt changes in the system.
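
    The ideal-gas-law argument can be sketched as follows: at fixed temperature and amount of gas, dV = -V dP / P, so regressing surface level on barometric pressure and converting the slope yields an estimate of the retained gas volume. The tank cross-section, pressure record, and level data below are hypothetical.

```python
# Sketch of an ideal-gas-law estimate of retained gas volume from level vs
# barometric pressure; all geometry and data values are hypothetical.
import numpy as np

rng = np.random.default_rng(6)
area_m2 = 410.0                                   # hypothetical tank cross-section
pressure_kpa = 101.3 + rng.normal(0, 1.5, 200)    # barometric pressure record
true_gas_m3 = 120.0
# dLevel = dV / A and dV = -V dP / P  =>  level responds linearly to pressure
level_cm = 500.0 - (true_gas_m3 / area_m2) * 100 * (pressure_kpa - 101.3) / 101.3 \
           + rng.normal(0, 0.02, 200)

slope_cm_per_kpa = np.polyfit(pressure_kpa, level_cm, 1)[0]
gas_volume_m3 = -area_m2 * (slope_cm_per_kpa / 100) * np.mean(pressure_kpa)
print(f"Estimated retained gas volume: {gas_volume_m3:.0f} m^3")
```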

  13. Characterization of Microstructural Changes in Coarse Ferritic-Pearlitic Stainless Steel Through the Statistical Fluctuation and Fractal Analyses of Barkhausen Noise

    NASA Astrophysics Data System (ADS)

    Padovese, L. R.; da Silva, F. E.; Moura, E. P.; Gonçalves, L. L.

    2010-02-01

    This work aims to identify the changes in the microstructure of ferritic-pearlitic stainless steel, through the statistical fluctuation and fractal analyses of Barkhausen noise. The samples studied were obtained from pipes of steam pressure vessels, and presented coarse ferritic-pearlitic phases before degradation. Due to temperature effects, two different microstructures were obtained from pearlite that has partially and completely transformed to spheroidite. The statistical fluctuations of the Barkhausen signals are obtained by means of Hurst and detrended-fluctuation analyses, and the fractal analyses are carried out by applying the minimal cover technique to the signals. The curves obtained for the statistical fluctuations and fractal analyses, as functions of the time window, were processed by using pattern classification techniques such as principal-component analysis and Karhunen-Loève expansion. Approximately a 100% success rate has been reached for the classification of the different microstructures, and this indicates that the proposed analyses can be an effective additional tool in the study of microstructural characterization.
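
    As an illustration of the fluctuation-analysis ingredient, the sketch below computes a detrended-fluctuation-analysis (DFA) exponent for a synthetic signal; the Barkhausen data, window choices, and the subsequent pattern-classification steps of the paper are not reproduced.

```python
# Minimal detrended fluctuation analysis (DFA) on a synthetic signal.
import numpy as np

def dfa_exponent(signal, windows):
    profile = np.cumsum(signal - np.mean(signal))
    fluctuations = []
    for w in windows:
        n_seg = len(profile) // w
        f2 = []
        for k in range(n_seg):
            seg = profile[k * w:(k + 1) * w]
            t = np.arange(w)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    # DFA exponent = slope of log F(w) versus log w
    return np.polyfit(np.log(windows), np.log(fluctuations), 1)[0]

rng = np.random.default_rng(7)
noise = rng.normal(size=8192)             # white noise: expected exponent ~0.5
print(f"DFA exponent: {dfa_exponent(noise, [16, 32, 64, 128, 256, 512]):.2f}")
```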

  14. Identifying Frequent Users of an Urban Emergency Medical Service Using Descriptive Statistics and Regression Analyses

    PubMed Central

    Norman, Chenelle; Mello, Michael; Choi, Bryan

    2016-01-01

    This retrospective cohort study provides a descriptive analysis of a population that frequently uses an urban emergency medical service (EMS) and identifies factors that contribute to use among all frequent users. For purposes of this study we divided frequent users into the following groups: low-frequent users (4 EMS transports in 2012), medium-frequent users (5 to 6 EMS transports in 2012), high-frequent users (7 to 10 EMS transports in 2012) and super-frequent users (11 or more EMS transports in 2012). Overall, we identified 539 individuals as frequent users. For all groups of EMS frequent users (i.e. low, medium, high and super), one or more hospital admissions, receiving a referral for follow-up care upon discharge, and having no insurance were found to be significantly associated with frequent EMS use (P<0.05). Within the diagnostic categories, 41.61% of super-frequent users had a diagnosis of “primarily substance abuse/misuse” and among low-frequent users a majority, 53.33%, were identified as having a “reoccurring (medical) diagnosis.” Lastly, relative risk ratios for the highest group of users, super-frequent users, were 3.34 (95% CI [1.90–5.87]) for obtaining at least one referral for follow-up care, 13.67 (95% CI [5.60–33.34]) for having four or more hospital admissions and 5.95 (95% CI [1.80–19.63]) for having a diagnosis of primarily substance abuse/misuse. Findings from this study demonstrate that among low- and medium-frequent users a majority of patients are using EMS for reoccurring medical conditions. This could potentially be avoided with better care management. In addition, this study adds to the current literature that illustrates a strong correlation between substance abuse/misuse and high/super-frequent EMS use. For the subgroup analysis among individuals 65 years of age and older, we did not find any of the independent variables included in our model to be significantly associated with frequent EMS use. PMID:26823929

  15. Using Additional Analyses to Clarify the Functions of Problem Behavior: An Analysis of Two Cases

    ERIC Educational Resources Information Center

    Payne, Steven W.; Dozier, Claudia L.; Neidert, Pamela L.; Jowett, Erica S.; Newquist, Matthew H.

    2014-01-01

    Functional analyses (FA) have proven useful for identifying contingencies that influence problem behavior. Research has shown that some problem behavior may only occur in specific contexts or be influenced by multiple or idiosyncratic variables. When these contexts or sources of influence are not assessed in an FA, further assessment may be…

  16. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  17. A proposed statistical framework for the management of subgroup analyses for large clinical trials.

    PubMed

    Luo, Xiaolong; Chen, Peng; Wu, Alan Chengqing; Pan, Guohua; Li, Mingyu; Chen, Guang; Dong, Qian; Cline, Gary A; Dornseif, Bruce E; Jin, Zhezhen

    2015-11-01

    Planned and unplanned subgroup analyses of large clinical trials are frequently performed and the results are sometimes difficult to interpret. The source of a nominally significant finding may be a true signal, variation of the clinical trial outcome, or the observed data structure. Quantitative assessment is critical to the interpretation of the totality of the clinical data. In this article we provide a general framework to manage subgroup analyses and to interpret the findings through a set of supplementary analyses to the planned main (primary and secondary) analyses, as an alternative to the commonly used multiple comparison framework. The proposed approach collectively and coherently utilizes several quantitative methods and enhances the credibility and interpretability of subgroup analyses. A case study is used to illustrate the application of the proposed method. PMID:26388115

  18. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 2. Robustness of Techniques

    SciTech Connect

    Helton, J.C.; Kleijnen, J.P.C.

    1999-03-24

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.

  19. Statistical modelling of measurement errors in gas chromatographic analyses of blood alcohol content.

    PubMed

    Moroni, Rossana; Blomstedt, Paul; Wilhelm, Lars; Reinikainen, Tapani; Sippola, Erkki; Corander, Jukka

    2010-10-10

    Headspace gas chromatographic measurements of ethanol content in blood specimens from suspect drunk drivers are routinely carried out in forensic laboratories. In the widely established standard statistical framework, measurement errors in such data are represented by Gaussian distributions for the population of blood specimens at any given level of ethanol content. It is known that the variance of measurement errors increases as a function of the level of ethanol content, and the standard statistical approach addresses this issue by replacing the unknown population variances by estimates derived from a large sample using a linear regression model. Appropriate statistical analysis of the systematic and random components in the measurement errors is necessary in order to guarantee legally sound security corrections reported to the police authority. Here we address this issue by developing a novel statistical approach that takes into account any potential non-linearity in the relationship between the level of ethanol content and the variability of measurement errors. Our method is based on standard non-parametric kernel techniques for density estimation using a large database of laboratory measurements for blood specimens. Furthermore, we also address the issue of systematic errors in the measurement process by a statistical model that incorporates the sign of the error term in the security correction calculations. Analysis of a set of certified reference material (CRM) blood samples demonstrates the importance of explicitly handling the direction of the systematic errors in establishing the statistical uncertainty about the true level of ethanol content. Use of our statistical framework to aid quality control in the laboratory is also discussed. PMID:20494532
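
    One simple way to let the error variability depend non-linearly on the ethanol level is a kernel-weighted estimate of the error spread, sketched below on synthetic data; this only illustrates the idea and is not the method or database of the paper.

```python
# Kernel-weighted (Nadaraya-Watson style) estimate of measurement-error spread
# as a function of ethanol level; the data are synthetic.
import numpy as np

rng = np.random.default_rng(8)
level = rng.uniform(0.2, 3.0, 2000)                   # ethanol content (per mille)
error = rng.normal(0, 0.01 + 0.02 * level ** 1.3)     # non-linear heteroscedasticity

def kernel_sd(x0, bandwidth=0.2):
    w = np.exp(-0.5 * ((level - x0) / bandwidth) ** 2)   # Gaussian kernel weights
    return np.sqrt(np.sum(w * error ** 2) / np.sum(w))   # locally weighted error SD

for x0 in (0.5, 1.5, 2.5):
    print(f"estimated error SD at level {x0:.1f}: {kernel_sd(x0):.3f}")
```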

  20. The use and misuse of statistical analyses. [in geophysics and space physics

    NASA Technical Reports Server (NTRS)

    Reiff, P. H.

    1983-01-01

    The statistical techniques most often used in space physics include Fourier analysis, linear correlation, auto- and cross-correlation, power spectral density, and superposed epoch analysis. Tests are presented which can evaluate the significance of the results obtained through each of these. Data presented without some form of error analysis are frequently useless, since they offer no way of assessing whether a bump on a spectrum or on a superposed epoch analysis is real or merely a statistical fluctuation. Among many of the published linear correlations, for instance, the uncertainty in the intercept and slope is not given, so that the significance of the fitted parameters cannot be assessed.
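
    As a minimal example of reporting the fit uncertainties the editorial asks for, the sketch below prints the slope and intercept standard errors of a linear correlation on synthetic data.

```python
# Reporting slope and intercept uncertainties for a linear correlation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
x = rng.uniform(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 3.0, 50)

res = stats.linregress(x, y)
print(f"slope = {res.slope:.2f} +/- {res.stderr:.2f}, "
      f"intercept = {res.intercept:.2f} +/- {res.intercept_stderr:.2f}, "
      f"r = {res.rvalue:.2f}, p = {res.pvalue:.1e}")
```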

  1. Applying statistical causal analyses to agricultural conservation: A case study examining P loss impacts

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Estimating the effect of agricultural conservation practices on reducing nutrient loss using observational data can be confounded by differing crop types and differing management practices. As we may not have the full knowledge of these confounding factors, conventional statistical methods are ofte...

  2. Metagenomic analyses of the late Pleistocene permafrost - additional tools for reconstruction of environmental conditions

    NASA Astrophysics Data System (ADS)

    Rivkina, Elizaveta; Petrovskaya, Lada; Vishnivetskaya, Tatiana; Krivushin, Kirill; Shmakova, Lyubov; Tutukina, Maria; Meyers, Arthur; Kondrashov, Fyodor

    2016-04-01

    A comparative analysis of the metagenomes from two 30 000-year-old permafrost samples, one of lake-alluvial origin and the other from late Pleistocene Ice Complex sediments, revealed significant differences within microbial communities. The late Pleistocene Ice Complex sediments (which have been characterized by the absence of methane with lower values of redox potential and Fe2+ content) showed a low abundance of methanogenic archaea and enzymes from both the carbon and nitrogen cycles, but a higher abundance of enzymes associated with the sulfur cycle. The metagenomic and geochemical analyses described in the paper provide evidence that the formation of the sampled late Pleistocene Ice Complex sediments likely took place under much more aerobic conditions than lake-alluvial sediments.

  3. Time series expression analyses using RNA-seq: a statistical approach.

    PubMed

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
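
    An illustrative AR(1) time-lagged regression for a single gene's time course is sketched below: expression at time t is regressed on expression at t-1. The data are synthetic, and the sketch omits the count-model and multi-gene machinery of the cited approaches.

```python
# Illustrative AR(1) time-lagged regression for one gene's time course.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
timepoints = 12
expr = np.empty(timepoints)
expr[0] = 8.0                                   # log2 expression at first time point
for t in range(1, timepoints):
    expr[t] = 0.6 * expr[t - 1] + 3.5 + rng.normal(0, 0.3)   # AR(1) dynamics

res = stats.linregress(expr[:-1], expr[1:])     # regress e_t on e_{t-1}
print(f"estimated AR(1) coefficient: {res.slope:.2f} (true 0.6), p = {res.pvalue:.3f}")
```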

  4. Statistic analyses of the color experience according to the age of the observer.

    PubMed

    Hunjet, Anica; Parac-Osterman, Durdica; Vucaj, Edita

    2013-04-01

    The psychological experience of color reflects the communication between the environment and color, and it depends on the light source, the viewing angle, and in particular on the observer and his or her health condition. Hering's theory, or the theory of opponent processes, supposes that the cones situated in the retina of the eye are not sensitive to the three chromatic domains (areas, fields, zones) (red, green and purple-blue) independently, but produce a signal based on the principle of opposed pairs of colors. A reason for this theory is the fact that certain disorders of color vision, which include blindness to certain colors, cause blindness to pairs of opponent colors. This paper presents a demonstration of the experience of blue and yellow tone according to the age of the observer. To test for statistically significant differences in the color experience according to the color of the background, the following statistical tests were used: the Mann-Whitney U test, Kruskal-Wallis ANOVA and the median test. It was shown that the differences are statistically significant in persons older than 35 years. PMID:23837226

  5. Bayesian statistical approaches to compositional analyses of transgenic crops 2. Application and validation of informative prior distributions.

    PubMed

    Harrison, Jay M; Breeze, Matthew L; Berman, Kristina H; Harrigan, George G

    2013-03-01

    Bayesian approaches to evaluation of crop composition data allow simpler interpretations than traditional statistical significance tests. An important advantage of Bayesian approaches is that they allow formal incorporation of previously generated data through prior distributions in the analysis steps. This manuscript describes key steps to ensure meaningful and transparent selection and application of informative prior distributions. These include (i) review of previous data in the scientific literature to form the prior distributions, (ii) proper statistical model specification and documentation, (iii) graphical analyses to evaluate the fit of the statistical model to new study data, and (iv) sensitivity analyses to evaluate the robustness of results to the choice of prior distribution. The validity of the prior distribution for any crop component is critical to acceptance of Bayesian approaches to compositional analyses and would be essential for studies conducted in a regulatory setting. Selection and validation of prior distributions for three soybean isoflavones (daidzein, genistein, and glycitein) and two oligosaccharides (raffinose and stachyose) are illustrated in a comparative assessment of data obtained on GM and non-GM soybean seed harvested from replicated field sites at multiple locations in the US during the 2009 growing season. PMID:23261475
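
    Steps (i) and (ii) can be sketched with a conjugate normal model: an informative prior formed from literature means is updated with new study data. The component values, variances, and sample sizes below are hypothetical illustration numbers only, not the soybean data of the assessment.

```python
# Sketch of an informative prior (from literature means) updated with new
# study data using a conjugate normal model; all numbers are hypothetical.
import numpy as np

# (i) prior from previously published component means (hypothetical, mg/kg)
literature_means = np.array([520.0, 610.0, 480.0, 570.0, 550.0])
prior_mean, prior_var = literature_means.mean(), literature_means.var(ddof=1)

# (ii) new study data (hypothetical) with an assumed known sampling variance
new_data = np.array([590.0, 605.0, 577.0, 612.0, 588.0, 601.0])
sigma2 = 30.0 ** 2

n = len(new_data)
post_var = 1.0 / (1.0 / prior_var + n / sigma2)
post_mean = post_var * (prior_mean / prior_var + n * new_data.mean() / sigma2)
print(f"posterior mean {post_mean:.0f} mg/kg, 95% CI +/- {1.96 * np.sqrt(post_var):.0f}")
```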

  6. Additional Development and Systems Analyses of Pneumatic Technology for High Speed Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Englar, Robert J.; Willie, F. Scott; Lee, Warren J.

    1999-01-01

    In the Task I portion of this NASA research grant, configuration development and experimental investigations have been conducted on a series of pneumatic high-lift and control surface devices applied to a generic High Speed Civil Transport (HSCT) model configuration to determine their potential for improved aerodynamic performance, plus stability and control of higher performance aircraft. These investigations were intended to optimize pneumatic lift and drag performance; provide adequate control and longitudinal stability; reduce separation flowfields at high angle of attack; increase takeoff/climbout lift-to-drag ratios; and reduce system complexity and weight. Experimental aerodynamic evaluations were performed on a semi-span HSCT generic model with improved fuselage fineness ratio and with interchangeable plain flaps, blown flaps, pneumatic Circulation Control Wing (CCW) high-lift configurations, plain and blown canards, a novel Circulation Control (CC) cylinder blown canard, and a clean cruise wing for reference. Conventional tail power was also investigated for longitudinal trim capability. Also evaluated was unsteady pulsed blowing of the wing high-lift system to determine if reduced pulsed mass flow rates and blowing requirements could be made to yield the same lift as that resulting from steady-state blowing. Depending on the pulsing frequency applied, reduced mass flow rates were indeed found able to provide lift augmentation at lesser blowing values than for the steady conditions. Significant improvements in the aerodynamic characteristics leading to improved performance and stability/control were identified, and the various components were compared to evaluate the pneumatic potential of each. Aerodynamic results were provided to the Georgia Tech Aerospace System Design Lab. to conduct the companion system analyses and feasibility study (Task 2) of theses concepts applied to an operational advanced HSCT aircraft. Results and conclusions from these

  7. Calibration of back-analysed model parameters for landslides using classification statistics

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Henderson, Laura

    2016-04-01

    Back-analyses are useful for characterizing the geomorphological and mechanical processes and parameters involved in the initiation and propagation of landslides. These processes and parameters can in turn be used for improving forecasts of scenarios and hazard assessments in areas or sites which have similar settings to the back-analysed cases. The selection of the modeled landslide that produces the best agreement with the actual observations requires running a number of simulations by varying the type of model and the sets of input parameters. The comparison of the simulated and observed parameters is normally performed by visual comparison of geomorphological or dynamic variables (e.g., geometry of scarp and final deposit, maximum velocities and depths). Over the past six years, a method developed by NGI has been used by some researchers for a more objective selection of back-analysed input model parameters. That method includes an adaptation of the equations for calculation of classifiers, and a comparative evaluation of classifiers of the selected parameter sets in the Receiver Operating Characteristic (ROC) space. This contribution presents an update of the methodology. The proposed procedure allows comparisons between two or more "clouds" of classifiers. Each cloud represents the performance of a model over a range of input parameters (e.g., samples of probability distributions). Considering the fact that each cloud does not necessarily produce a full ROC curve, two new normalised ROC-space parameters are introduced for characterizing the performance of each cloud. The first parameter is representative of the cloud position relative to the point of perfect classification. The second parameter characterizes the position of the cloud relative to the theoretically perfect ROC curve and the no-discrimination line. The methodology is illustrated with back-analyses of slope stability and landslide runout of selected case studies. This research activity has been
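
    The flavour of the ROC-space comparison can be sketched as follows: each parameter set yields one (false positive rate, true positive rate) point, a set of parameter sets forms a cloud, and the cloud is summarised by its distance to the perfect-classification corner. This is only an illustration; the two normalised parameters introduced in the contribution are not reproduced here, and the rasters are synthetic.

```python
# Sketch: place back-analysis runs in ROC space and summarise a "cloud" of
# classifiers by its mean distance to the perfect-classification corner (0, 1).
import numpy as np

def roc_point(predicted, observed):
    """FPR and TPR for one simulated footprint vs the observed footprint."""
    tp = np.sum(predicted & observed)
    fp = np.sum(predicted & ~observed)
    return fp / np.sum(~observed), tp / np.sum(observed)

rng = np.random.default_rng(11)
observed = rng.random((100, 100)) < 0.2            # observed runout cells (synthetic)
cloud = []
for _ in range(20):                                # 20 parameter sets form one cloud
    predicted = rng.random((100, 100)) < 0.25
    cloud.append(roc_point(predicted, observed))

dists = [np.hypot(fpr - 0.0, tpr - 1.0) for fpr, tpr in cloud]
print(f"mean distance of cloud to perfect classification: {np.mean(dists):.2f}")
```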

  8. An Embedded Statistical Method for Coupling Molecular Dynamics and Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Saether, E.; Glaessgen, E.H.; Yamakov, V.

    2008-01-01

    The coupling of molecular dynamics (MD) simulations with finite element methods (FEM) yields computationally efficient models that link fundamental material processes at the atomistic level with continuum field responses at higher length scales. The theoretical challenge involves developing a seamless connection along an interface between two inherently different simulation frameworks. Various specialized methods have been developed to solve particular classes of problems. Many of these methods link the kinematics of individual MD atoms with FEM nodes at their common interface, necessarily requiring that the finite element mesh be refined to atomic resolution. Some of these coupling approaches also require simulations to be carried out at 0 K and restrict modeling to two-dimensional material domains due to difficulties in simulating full three-dimensional material processes. In the present work, a new approach to MD-FEM coupling is developed based on a restatement of the standard boundary value problem used to define a coupled domain. The method replaces a direct linkage of individual MD atoms and finite element (FE) nodes with a statistical averaging of atomistic displacements in local atomic volumes associated with each FE node in an interface region. The FEM and MD computational systems are effectively independent and communicate only through an iterative update of their boundary conditions. With the use of statistical averages of the atomistic quantities to couple the two computational schemes, the developed approach is referred to as an embedded statistical coupling method (ESCM). ESCM provides an enhanced coupling methodology that is inherently applicable to three-dimensional domains, avoids discretization of the continuum model to atomic scale resolution, and permits finite temperature states to be applied.
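
    The core averaging idea can be sketched simply: instead of tying individual atoms to finite element nodes, the displacements of all atoms within a local volume around each interface node are averaged and passed to the FEM as boundary data. The geometry, averaging radius, and displacements below are hypothetical.

```python
# Sketch of statistical averaging of atomistic displacements in local volumes
# around interface FE nodes; geometry and displacements are hypothetical.
import numpy as np

rng = np.random.default_rng(12)
atom_pos = rng.uniform(0, 100.0, size=(5000, 3))     # MD atom positions (angstrom)
atom_disp = rng.normal(0, 0.05, size=(5000, 3))      # instantaneous displacements
node_pos = np.array([[25.0, 50.0, 50.0],
                     [75.0, 50.0, 50.0]])            # interface FE nodes
radius = 10.0                                        # local averaging volume

for k, node in enumerate(node_pos):
    mask = np.linalg.norm(atom_pos - node, axis=1) < radius
    mean_disp = atom_disp[mask].mean(axis=0)         # boundary value handed to the FEM
    print(f"node {k}: {mask.sum()} atoms, averaged displacement {mean_disp.round(4)}")
```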

  9. JEAB Research Over Time: Species Used, Experimental Designs, Statistical Analyses, and Sex of Subjects.

    PubMed

    Zimmermann, Zachary J; Watkins, Erin E; Poling, Alan

    2015-10-01

    We examined the species used as subjects in every article published in the Journal of the Experimental Analysis of Behavior (JEAB) from 1958 through 2013. We also determined the sex of subjects in every article with human subjects (N = 524) and in an equal number of randomly selected articles with nonhuman subjects, as well as the general type of experimental designs used. Finally, the percentage of articles reporting an inferential statistic was determined at 5-year intervals. In all, 35,317 subjects were studied in 3,084 articles; pigeons ranked first and humans second in number used. Within-subject experimental designs were more popular than between-subjects designs regardless of whether human or nonhuman subjects were studied but were used in a higher percentage of articles with nonhumans (75.4 %) than in articles with humans (68.2 %). The percentage of articles reporting an inferential statistic has increased over time, and more than half of the articles published in 2005 and 2010 reported one. Researchers who publish in JEAB frequently depart from Skinner's preferred research strategy, but it is not clear whether such departures are harmful. Finally, the sex of subjects was not reported in a sizable percentage of articles with both human and nonhuman subjects. This is an unfortunate oversight. PMID:27606171

  10. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  11. An experiment in software reliability: Additional analyses using data from automated replications

    NASA Technical Reports Server (NTRS)

    Dunham, Janet R.; Lauterbach, Linda A.

    1988-01-01

    A study undertaken to collect software error data of laboratory quality for use in the development of credible methods for predicting the reliability of software used in life-critical applications is summarized. The software error data reported were acquired through automated repetitive run testing of three independent implementations of a launch interceptor condition module of a radar tracking problem. The results are based on 100 test applications to accumulate a sufficient sample size for error rate estimation. The data collected are used to confirm the results of two Boeing studies reported in NASA-CR-165836, Software Reliability: Repetitive Run Experimentation and Modeling, and NASA-CR-172378, Software Reliability: Additional Investigations into Modeling With Replicated Experiments, respectively. That is, the results confirm the log-linear pattern of software error rates and reject the hypothesis of equal error rates per individual fault. This rejection casts doubt on the assumption that the program's failure rate is a constant multiple of the number of residual bugs, an assumption which underlies some of the current models of software reliability. The data also raise new questions concerning the phenomenon of interacting faults.

  12. Reprocessing the Southern Hemisphere ADditional OZonesondes (SHADOZ) Database for Long-Term Trend Analyses

    NASA Astrophysics Data System (ADS)

    Witte, J. C.; Thompson, A. M.; Coetzee, G.; Fujiwara, M.; Johnson, B. J.; Sterling, C. W.; Cullis, P.; Ashburn, C. E.; Jordan, A. F.

    2015-12-01

    SHADOZ is a large archive of tropical balloon-borne ozonesonde data at NASA/Goddard Space Flight Center, with data from 14 tropical and subtropical stations provided by collaborators in Europe, Asia, Latin America and Africa. The SHADOZ time series began in 1998, using electrochemical concentration cell (ECC) ozonesondes. Like many long-term sounding stations, SHADOZ is characterized by variations in operating procedures, launch protocols, and data processing such that biases within a data record and among sites appear. In addition, over time, the radiosonde and ozonesonde instruments and data processing protocols have changed, adding to the measurement uncertainties at individual stations and limiting the reliability of ozone profile trends and continuous satellite validation. Currently, the ozonesonde community is engaged in reprocessing ECC data, with an emphasis on homogenization of the records to compensate for the variations in instrumentation and technique. The goals are to improve the information and integrity of each measurement record and to support calculation of more reliable trends. We illustrate the reprocessing activity of SHADOZ with selected stations. We will (1) show reprocessing steps based on the recent WMO report that provides post-processing guidelines for ozonesondes; (2) characterize uncertainties in various parts of the ECC conditioning process; and (3) compare original and reprocessed data to co-located ground and satellite measurements of column ozone.

  13. Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples

    SciTech Connect

    Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S; Wu, K J

    2007-10-24

    Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes that will lead to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharides, pure proteins and protein mixture samples, all of LDA, PLSDA, and SIMCA were found to produce excellent classification given a sufficient number of compound variables calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. The decision tree analysis was found to be the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data that can be followed by either LDA or PLSDA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples
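
    A sketch of comparing some of the listed classifiers by cross-validation on a synthetic high-dimensional dataset is shown below; SIMCA and PLS-DA have no standard scikit-learn implementations and are omitted, and the data are not ToF-SIMS spectra.

```python
# Sketch: compare a PCA + LDA pipeline and a decision tree by cross-validation
# on a synthetic high-dimensional classification dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=150, n_features=200, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

models = {
    "PCA + LDA": make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis()),
    "Decision tree": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy {scores.mean():.2f}")
```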

  14. Basic statistical analyses of candidate nickel-hydrogen cells for the Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Maloney, Thomas M.; Frate, David T.

    1993-01-01

    Nickel-Hydrogen (Ni/H2) secondary batteries will be implemented as a power source for the Space Station Freedom as well as for other NASA missions. Consequently, characterization tests of Ni/H2 cells from Eagle-Picher, Whittaker-Yardney, and Hughes were completed at the NASA Lewis Research Center. Watt-hour efficiencies of each Ni/H2 cell were measured for regulated charge and discharge cycles as a function of temperature, charge rate, discharge rate, and state of charge. Temperatures ranged from -5 C to 30 C, charge rates ranged from C/10 to 1C, discharge rates ranged from C/10 to 2C, and states of charge ranged from 20 percent to 100 percent. Results from regression analyses and analyses of mean watt-hour efficiencies demonstrated that overall performance was best at temperatures between 10 C and 20 C while the discharge rate correlated most strongly with watt-hour efficiency. In general, the cell with back-to-back electrode arrangement, single stack, 26 percent KOH, and serrated zircar separator and the cell with a recirculating electrode arrangement, unit stack, 31 percent KOH, zircar separators performed best.

  15. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 1: Review and Comparison of Techniques

    SciTech Connect

    Kleijnen, J.P.C.; Helton, J.C.

    1999-03-24

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analyses include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples.
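    The sequence of increasingly complex pattern tests can be illustrated with SciPy on a single synthetic input/output scatterplot; the variable names, bin counts and sample size below are illustrative assumptions, not the two-phase flow analysis itself.

        # Sketch: tests for increasingly complex patterns in one (x, y) scatterplot,
        # mirroring the correlation / rank-correlation / Kruskal-Wallis / chi-square sequence.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 1, 300)                    # sampled input variable
        y = np.sin(3 * x) + rng.normal(0, 0.3, 300)   # model output with a nonlinear trend

        r_lin, p_lin = stats.pearsonr(x, y)           # (i) linear relationship
        rho, p_rho = stats.spearmanr(x, y)            # (ii) monotonic relationship
        print("Pearson r=%.2f p=%.3g, Spearman rho=%.2f p=%.3g" % (r_lin, p_lin, rho, p_rho))

        # (iii) Trend in central tendency: Kruskal-Wallis across bins ("classes") of x.
        bins = np.quantile(x, [0.2, 0.4, 0.6, 0.8])
        classes = np.digitize(x, bins)
        groups = [y[classes == k] for k in range(5)]
        H, p_kw = stats.kruskal(*groups)
        print("Kruskal-Wallis H=%.1f p=%.3g" % (H, p_kw))

        # (v) Deviation from randomness: chi-square test on a 5x5 grid of (x, y) counts.
        counts, _, _ = np.histogram2d(x, y, bins=5)
        chi2, p_chi, dof, _ = stats.chi2_contingency(counts)
        print("chi-square=%.1f dof=%d p=%.3g" % (chi2, dof, p_chi))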

  16. Statistical Improvements in Functional Magnetic Resonance Imaging Analyses Produced by Censoring High-Motion Data Points

    PubMed Central

    Siegel, Joshua S.; Power, Jonathan D.; Dubis, Joseph W.; Vogel, Alecia C.; Church, Jessica A.; Schlaggar, Bradley L.; Petersen, Steven E.

    2013-01-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring (“motion scrubbing”). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. PMID:23861343
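    A minimal sketch of the censoring idea, assuming synthetic realignment estimates and a synthetic design matrix: framewise displacement (FD) is computed from the realignment parameters and volumes exceeding an illustrative FD threshold are simply dropped before ordinary least-squares GLM estimation.

        # Sketch: motion censoring prior to GLM estimation.
        # realign: (T, 6) rigid-body estimates (3 translations in mm, 3 rotations in rad);
        # design:  (T, K) task design matrix; bold: (T,) voxel time course. All synthetic here.
        import numpy as np

        rng = np.random.default_rng(2)
        T, K = 200, 3
        realign = np.cumsum(rng.normal(0, 0.05, (T, 6)), axis=0)
        design = rng.normal(0, 1, (T, K))
        bold = design @ np.array([1.0, 0.5, 0.0]) + rng.normal(0, 1, T)

        # Framewise displacement: sum of absolute backward differences,
        # with rotations converted to mm on a 50 mm sphere (a common convention).
        diffs = np.abs(np.diff(realign, axis=0, prepend=realign[:1]))
        diffs[:, 3:] *= 50.0
        fd = diffs.sum(axis=1)

        keep = fd <= 0.5                      # censor volumes with FD > 0.5 mm (illustrative)
        print("censored volumes:", int((~keep).sum()), "of", T)

        # GLM estimated only on retained volumes (ordinary least squares).
        beta, *_ = np.linalg.lstsq(design[keep], bold[keep], rcond=None)
        print("parameter estimates:", beta.round(2))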

  17. Lightning NOx Statistics Derived by NASA Lightning Nitrogen Oxides Model (LNOM) Data Analyses

    NASA Technical Reports Server (NTRS)

    Koshak, William; Peterson, Harold

    2013-01-01

    What is the LNOM? The NASA Marshall Space Flight Center (MSFC) Lightning Nitrogen Oxides Model (LNOM) [Koshak et al., 2009, 2010, 2011; Koshak and Peterson 2011, 2013] analyzes VHF Lightning Mapping Array (LMA) and National Lightning Detection Network™ (NLDN) data to estimate the lightning nitrogen oxides (LNOx) produced by individual flashes. Figure 1 provides an overview of LNOM functionality. Benefits of LNOM: (1) Does away with unrealistic "vertical stick" lightning channel models for estimating LNOx; (2) Uses ground-based VHF data that maps out the true channel in space and time to < 100 m accuracy; (3) Therefore, true channel segment height (ambient air density) is used to compute LNOx; (4) True channel length is used! (typically tens of kilometers since channel has many branches and "wiggles"); (5) Distinction between ground and cloud flashes is made; (6) For ground flashes, the actual peak current from NLDN is used to compute NOx from the lightning return stroke; (7) NOx is computed for several other lightning discharge processes (based on Cooray et al., 2009 theory): (a) Hot core of stepped leaders and dart leaders, (b) Corona sheath of stepped leader, (c) K-change, (d) Continuing Currents, and (e) M-components; and (8) LNOM statistics (see later) can be used to parameterize LNOx production for regional air quality models (like CMAQ), and for global chemical transport models (like GEOS-Chem).

  18. Using Innovative Statistical Analyses to Assess Soil Degradation due to Land Use Change

    NASA Astrophysics Data System (ADS)

    Khaledian, Yones; Kiani, Farshad; Ebrahimi, Soheila; Brevik, Eric C.; Aitkenhead-Peterson, Jacqueline

    2016-04-01

    Soil erosion and overall loss of soil fertility is a serious issue for loess soils of the Golestan province, northern Iran. The assessment of soil degradation at large watershed scales is urgently required. This research investigated the role of land use change and its effect on soil degradation in cultivated, pasture and urban lands, when compared to native forest in terms of declines in soil fertility. Some novel statistical methods including partial least squares (PLS), principal component regression (PCR), and ordinary least squares regression (OLS) were used to predict soil cation-exchange capacity (CEC) using soil characteristics. PCA identified five primary components of soil quality. The PLS model was used to predict soil CEC from the soil characteristics including bulk density (BD), electrical conductivity (EC), pH, calcium carbonate equivalent (CCE), soil particle density (DS), mean weight diameter (MWD), soil porosity (F), organic carbon (OC), Labile carbon (LC), mineral carbon, saturation percentage (SP), soil particle size (clay, silt and sand), exchangeable cations (Ca2+, Mg2+, K+, Na+), and soil microbial respiration (SMR) collected in the Ziarat watershed. In order to evaluate the best fit, two other methods, PCR and OLS, were also examined. An exponential semivariogram using PLS predictions revealed stronger spatial dependence among CEC [r2 = 0.80, and RMSE= 1.99] than the other methods, PCR [r2 = 0.84, and RMSE= 2.45] and OLS [r2 = 0.84, and RMSE= 2.45]. Therefore, the PLS method provided the best model for the data. In stepwise regression analysis, MWD and LC were selected as influential variables in all soils, whereas the other influential parameters were different in various land uses. This study quantified reductions in numerous soil quality parameters resulting from extensive land-use changes and urbanization in the Ziarat watershed in Northern Iran.
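    A rough sketch of the PLS/PCR/OLS comparison with scikit-learn, using synthetic predictors in place of the measured soil characteristics; the component counts and cross-validation setup are assumptions for illustration only.

        # Sketch: comparing PLS, PCR and OLS predictions of a soil property (a synthetic
        # stand-in for CEC) by cross-validated R^2 and RMSE. Component counts are illustrative.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import r2_score, mean_squared_error

        rng = np.random.default_rng(3)
        X = rng.normal(size=(120, 20))                 # 20 soil characteristics, 120 samples
        cec = X[:, :5] @ rng.uniform(0.5, 1.5, 5) + rng.normal(0, 1, 120)

        models = {
            "PLS": make_pipeline(StandardScaler(), PLSRegression(n_components=5)),
            "PCR": make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression()),
            "OLS": make_pipeline(StandardScaler(), LinearRegression()),
        }
        for name, model in models.items():
            pred = cross_val_predict(model, X, cec, cv=5).ravel()
            rmse = mean_squared_error(cec, pred) ** 0.5
            print("%s: R2=%.2f RMSE=%.2f" % (name, r2_score(cec, pred), rmse))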

  19. The distribution of megablocks in the Ries crater, Germany: Remote sensing, field investigation, and statistical analyses

    NASA Astrophysics Data System (ADS)

    Sturm, Sebastian; Kenkmann, Thomas; Willmes, Malte; PöSges, Gisela; Hiesinger, Harald

    2015-01-01

    The Ries crater is a well-preserved, complex impact crater that has been extensively used in the study of impact crater formation processes across the solar system. However, its geologic structure, especially the megablock zone, still poses questions regarding crater formation mechanics. The megablock zone, located between the inner crystalline ring and outer, morphologic crater rim, consists of allochthonous crystalline and sedimentary blocks, Bunte Breccia deposits, patches of suevite, and parautochthonous sedimentary blocks that slumped into the crater during crater modification. Our remote sensing detection method in combination with a shallow drilling campaign and geoelectric measurements at two selected megablocks proved successful in finding new megablock structures (>25 m mean diameter) within the upper approximately 1.5 m of the subsurface in the megablock zone. We analyzed 1777 megablocks of the megablock zone, 81 of which are new discoveries. In our statistical analysis, we also included 2318 ejecta blocks >25 m beyond the crater rim. Parautochthonous megablocks show an increase in total area and size toward the final crater rim. The sizes of allochthonous megablocks generally decrease with increasing radial range, but inside the megablock zone, the coverage with postimpact sediments obscures this trend. The size-frequency distribution of all megablocks obeys a power-law distribution with an exponent between approximately -1.7 and -2.3. We estimated a total volume of 95 km3 of Bunte Breccia and 47 km3 of megablocks. Ejecta volume calculations and a palinspastic restoration of the extension within the megablock zone indicate that the transient cavity diameter was probably 14-15 km.
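    The power-law size-frequency fit can be sketched as a least-squares line through the cumulative distribution in log-log space; the diameters below are synthetic draws, not the Ries megablock catalogue, and a maximum-likelihood estimator would normally be preferred for a formal exponent estimate.

        # Sketch: estimating a size-frequency power-law exponent from block diameters
        # via a least-squares fit to the cumulative distribution in log-log space.
        # Diameters are synthetic; the Ries study reports exponents near -1.7 to -2.3.
        import numpy as np

        rng = np.random.default_rng(4)
        # Synthetic diameters (m) from a Pareto-type distribution, truncated at 25 m.
        diameters = 25.0 * (1.0 + rng.pareto(1.8, size=1500))

        d_sorted = np.sort(diameters)
        n_ge = np.arange(len(d_sorted), 0, -1)   # number of blocks with diameter >= d

        slope, intercept = np.polyfit(np.log10(d_sorted), np.log10(n_ge), 1)
        print("cumulative size-frequency exponent: %.2f" % slope)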

  20. Statistical evaluation of light water reactor piping damping data for use in PRA (Probabilistic Risk Assessment) analyses

    NASA Astrophysics Data System (ADS)

    Ware, A. G.

    This paper presents the results of studies used to quantify, on a statistical basis, one of the parameters (piping system damping) input to probabilistic risk assessment (PRA) analyses of nuclear structures. Damping data were selected from tests in which the piping had been vibrated at levels representative of at least moderate severity seismic or hydrodynamic transients. These data, representing 27 light water reactor type piping systems, formed the basis for the statistical damping study. Most of these systems were actual nuclear power plant systems, and the lowest mode was < 8 Hz in over 80% of the systems. Damping was treated as independent of frequency (or mode number). The statistical analysis showed that a lognormal probability fit provided a suitable approximation of the raw data. For the cases in which all data were considered (which allowed duplicate tests for each system to be included so that the overall data were biased by those systems with the most data), mean lognormal damping values ranged from 2.68% to 3.55% of critical. When duplicate tests were eliminated, the means ranged from 3.12% to 3.72% of critical. For the final cases, which considered only the lowest mode at its highest excitation level, mean lognormal damping values ranged from 3.28% to 6.50% of critical.
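    A minimal sketch of the lognormal fit described above, using SciPy on synthetic damping values (percent of critical); the sample size and parameters are placeholders.

        # Sketch: lognormal fit to piping damping data (percent of critical).
        # Values are synthetic placeholders, not the test data described above.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        damping = rng.lognormal(mean=np.log(3.3), sigma=0.4, size=27)  # % of critical

        # Fit with the location fixed at zero, i.e. a two-parameter lognormal.
        shape, loc, scale = stats.lognorm.fit(damping, floc=0)
        median = scale                        # exp(mu)
        mean = scale * np.exp(shape**2 / 2)   # lognormal mean
        print("fitted median %.2f%%, mean %.2f%% of critical" % (median, mean))

        # Goodness-of-fit check (Kolmogorov-Smirnov against the fitted distribution).
        ks = stats.kstest(damping, "lognorm", args=(shape, loc, scale))
        print("KS statistic %.3f, p=%.3f" % (ks.statistic, ks.pvalue))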

  1. Statistical inference for the additive hazards model under outcome-dependent sampling

    PubMed Central

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P.; Zhou, Haibo

    2015-01-01

    Cost-effective study design and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against a simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the cancer risk associated with radon exposure. PMID:26379363

  2. Statistical Analyses for Probabilistic Assessments of the Reactor Pressure Vessel Structural Integrity: Building a Master Curve on an Extract of the 'Euro' Fracture Toughness Dataset, Controlling Statistical Uncertainty for Both Mono-Temperature and Multi-Temperature Tests

    SciTech Connect

    Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick; Turato, Silvia; Meister, Eric

    2006-07-01

    Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricite de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable which significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCCM and ASME Code lower-bound K{sub IC} based on the indexing parameter RT{sub NDT}. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K{sub JC} data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways. Therefore we focused exclusively on the basic french reactor vessel metal of types A508 Class 3 and A 533 grade B Class 1, taking the sampling

  3. Statistical approaches to analyse patient-reported outcomes as response variables: an application to health-related quality of life.

    PubMed

    Arostegui, Inmaculada; Núñez-Antón, Vicente; Quintana, José M

    2012-04-01

    Patient-reported outcomes (PRO) are used as primary endpoints in medical research and their statistical analysis is an important methodological issue. Theoretical assumptions of the selected methodology and interpretation of its results are issues to take into account when selecting an appropriate statistical technique to analyse data. We present eight methods of analysis of a popular PRO tool under different assumptions that lead to different interpretations of the results. All methods were applied to responses obtained from two of the health dimensions of the SF-36 Health Survey. The proposed methods are: multiple linear regression (MLR), with least squares and bootstrap estimation, tobit regression, ordinal logistic and probit regressions, beta-binomial regression (BBR), binomial-logit-normal regression (BLNR) and coarsening. Selection of an appropriate model depends not only on its distributional assumptions but also on the continuous or ordinal features of the response and the fact that they are constrained to a bounded interval. The BBR approach renders satisfactory results in a broad range of situations. MLR is not recommended, especially with skewed outcomes. Ordinal methods are only appropriate for outcomes with a small number of categories. Tobit regression is an acceptable option under normality assumptions and in the presence of a moderate ceiling or floor effect. The BLNR and coarsening proposals are also acceptable, but only under certain distributional assumptions that are difficult to test a priori. Interpretation of the results is more convenient when using the BBR, BLNR and ordinal logistic regression approaches. PMID:20858689
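    Of the listed approaches, the bootstrap variant of multiple linear regression is the simplest to sketch; the snippet below uses a simulated bounded score and two hypothetical predictors, and is not the SF-36 analysis itself.

        # Sketch: bootstrap estimation of multiple linear regression coefficients for a
        # bounded PRO score (0-100). Data and predictor names are hypothetical.
        import numpy as np

        rng = np.random.default_rng(13)
        n = 200
        age = rng.uniform(40, 80, n)
        comorbidity = rng.integers(0, 4, n)
        score = np.clip(90 - 0.3 * age - 5 * comorbidity + rng.normal(0, 12, n), 0, 100)

        X = np.column_stack([np.ones(n), age, comorbidity])
        beta_hat = np.linalg.lstsq(X, score, rcond=None)[0]

        # Case-resampling bootstrap for the coefficient confidence intervals.
        boot = np.empty((2000, 3))
        for b in range(2000):
            idx = rng.integers(0, n, n)
            boot[b] = np.linalg.lstsq(X[idx], score[idx], rcond=None)[0]
        ci = np.percentile(boot, [2.5, 97.5], axis=0)
        print("coefficients:", beta_hat.round(2))
        print("95% bootstrap CIs:\n", ci.round(2))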

  4. Analysing recurrent hospitalisations in heart failure: a review of statistical methodology, with application to CHARM-Preserved

    PubMed Central

    Rogers, Jennifer K.; Pocock, Stuart J.; McMurray, John J.V.; Granger, Christopher B; Michelson, Eric L; Östergren, Jan; Pfeffer, Marc A; Solomon, Scott; Swedberg, Karl; Yusuf, Salim

    2015-01-01

    Background Heart failure is characterised by recurrent hospitalisations, but often only the first event is considered in clinical trial reports. In chronic diseases, such as heart failure, analysing all events gives a more complete picture of treatment benefit. We describe methods of analysing repeat hospitalisations, and illustrate their value in one major trial. Methods and Results The Candesartan in Heart failure Assessment of Reduction in Mortality and morbidity (CHARM)-Preserved study compared candesartan with placebo in 3023 patients with heart failure and preserved systolic function. The heart failure hospitalisation rates were 12.5 and 8.9 per 100 patient years in the placebo and candesartan groups respectively. The repeat hospitalisations were analysed using the Andersen-Gill, Poisson and Negative Binomial methods. Death was incorporated into analyses by treating it as an additional event. The win ratio method and a method that jointly models hospitalisations and mortality were also considered. Using repeat events gave larger treatment benefits than time to first event analysis. The Negative Binomial method for the composite of recurrent heart failure hospitalisations and cardiovascular death gave a rate ratio of 0.75 (95% CI 0.62-0.91, P=0.003), and the hazard ratio for time to first heart failure hospitalisation or cardiovascular death was 0.86 (95% CI 0.74-1.00, P=0.050). Conclusions In patients with preserved ejection fraction, candesartan reduces the rate of admissions for worsening heart failure, to a greater extent than from analysing only first hospitalisations. Recurrent events should be routinely incorporated into the analysis of future clinical trials in heart failure. PMID:24453096
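    A negative binomial rate comparison of the kind described can be sketched with statsmodels; the counts, follow-up times and fixed dispersion below are simulation assumptions, whereas the trial analysis would estimate the dispersion and use the real event histories.

        # Sketch: negative binomial analysis of recurrent event counts by treatment arm.
        # Counts and follow-up are simulated; the dispersion (alpha) is held fixed here
        # for brevity, whereas in practice it would be estimated from the data.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n = 1500
        treat = rng.integers(0, 2, n)                        # 0 = placebo, 1 = active
        followup = rng.uniform(1.0, 3.0, n)                  # years of follow-up
        rate = np.where(treat == 1, 0.09, 0.12)              # events per patient-year
        frailty = rng.gamma(2.0, 0.5, n)                     # patient-level overdispersion
        events = rng.poisson(rate * followup * frailty)

        X = sm.add_constant(treat)
        fit = sm.GLM(events, X, family=sm.families.NegativeBinomial(alpha=1.0),
                     exposure=followup).fit()
        rr = np.exp(fit.params[1])
        lo, hi = np.exp(fit.conf_int()[1])
        print("rate ratio (active vs placebo): %.2f (95%% CI %.2f-%.2f)" % (rr, lo, hi))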

  5. Statistical Analyses of High-Resolution Aircraft and Satellite Observations of Sea Ice: Applications for Improving Model Simulations

    NASA Astrophysics Data System (ADS)

    Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.

    2012-12-01

    Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of the sea ice are now available at meter-scale resolution or better that provide new details on the properties and morphology of the ice pack across basin scales. For example the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite including laser and radar altimeters and digital cameras that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent

  6. Application of both a physical theory and statistical procedure in the analyses of an in vivo study of aerosol deposition

    SciTech Connect

    Cheng, K.H.; Swift, D.L.; Yang, Y.H.

    1995-12-01

    Regional deposition of inhaled aerosols in the respiratory tract is a significant factor in assessing the biological effects from exposure to a variety of environmental particles. Understanding the deposition efficiency of inhaled aerosol particles in the nasal and oral airways can help evaluate doses to the extrathoracic region as well as to the lung. Dose extrapolation from laboratory animals to humans has been questioned due to significant physiological and anatomical variations. Although human studies are considered ideal for obtaining in vivo toxicity information important in risk assessment, the number of subjects in the study is often small compared to epidemiological and animal studies. This study measured in vivo the nasal airway dimensions and the extrathoracic deposition of ultrafine aerosols in 10 normal adult males. Variability among individuals was significant. The nasal geometry of each individual was characterized at a resolution of 3 mm using magnetic resonance imaging (MRI) and acoustic rhinometry (AR). The turbulent diffusion theory was used to describe the nonlinear nature of extrathoracic aerosol deposition. To determine what dimensional features of the nasal airway were responsible for the marked differences in particle deposition, the MIXed-effects NonLINear Regression (MIXNLIN) procedure was used to account for the random effect of repeated measurements on the same subject. Using both turbulent diffusion theory and MIXNLIN, the ultrafine particle deposition is correlated with nasal dimensions measured by the surface area, minimum cross-sectional area, and complexity of the airway shape. The combination of MRI and AR is useful for characterizing both detailed nasal dimensions and temporal changes in nasal patency. We conclude that a suitable statistical procedure incorporated with existing physical theories must be used in data analyses for experimental studies of aerosol deposition that involve a relatively small number of human subjects.

  7. Quantifying Trace Amounts of Aggregates in Biopharmaceuticals Using Analytical Ultracentrifugation Sedimentation Velocity: Bayesian Analyses and F Statistics.

    PubMed

    Wafer, Lucas; Kloczewiak, Marek; Luo, Yin

    2016-07-01

    Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique. PMID:27184576

  8. Statistical correlations and risk analyses techniques for a diving dual phase bubble model and data bank using massively parallel supercomputers.

    PubMed

    Wienke, B R; O'Leary, T R

    2008-05-01

    Linking model and data, we detail the LANL diving reduced gradient bubble model (RGBM), dynamical principles, and correlation with data in the LANL Data Bank. Table, profile, and meter risks are obtained from likelihood analysis and quoted for air, nitrox, helitrox no-decompression time limits, repetitive dive tables, and selected mixed gas and repetitive profiles. Application analyses include the EXPLORER decompression meter algorithm, NAUI tables, University of Wisconsin Seafood Diver tables, comparative NAUI, PADI, Oceanic NDLs and repetitive dives, comparative nitrogen and helium mixed gas risks, USS Perry deep rebreather (RB) exploration dive, world record open circuit (OC) dive, and Woodville Karst Plain Project (WKPP) extreme cave exploration profiles. The algorithm has seen extensive and utilitarian application in mixed gas diving, both in recreational and technical sectors, and forms the basis for released tables and decompression meters used by scientific, commercial, and research divers. The LANL Data Bank is described, and the methods used to deduce risk are detailed. Risk functions for dissolved gas and bubbles are summarized. Parameters that can be used to estimate profile risk are tallied. To fit data, a modified Levenberg-Marquardt routine is employed with an L2 error norm. Appendices sketch the numerical methods, and list reports from field testing for (real) mixed gas diving. A Monte Carlo-like sampling scheme for fast numerical analysis of the data is also detailed, as a coupled variance reduction technique and additional check on the canonical approach to estimating diving risk. The method suggests alternatives to the canonical approach. This work represents a first-time correlation effort linking a dynamical bubble model with deep stop data. Supercomputing resources are requisite to connect model and data in application. PMID:18371945

  9. Statistical Analyses of d18O in Meteoric Waters From the Western US and East Asia: Implications for Paleoaltimetry

    NASA Astrophysics Data System (ADS)

    Lechler, A. R.; Niemi, N. A.

    2008-12-01

    Questions on the timing of Tibetan Plateau uplift and its associated influence on the development of the Indian and Asian monsoons are best addressed through accurate determinations of regional paleoelevation. Previous determinations of paleoaltimetry utilized the stable isotopic composition of paleo-meteoric waters as recorded in various proxies (authigenic minerals, fossils, etc.), in combination with empirically and model determined elevation isotopic lapse rates. However, the applicability of these lapse rates, derived principally from orogenic settings, to high continental plateaus remains uncertain. Our research aims to gain a better understanding of the potential controls on the δ18O composition of meteoric waters over continental plateaus through a principal component analysis (PCA) of modern waters from eastern Asia and the western US. In particular, we investigate how various environmental parameters (elevation, latitude, longitude, MAP, and MAT) influence the δ18O composition of these waters. First, these analyses reveal that elevation and latitude are the primary controls on isotopic composition in all regions investigated, as expected. Second, PCA results yield elevation lapse rates from orogenic settings (i.e. Sierra Nevada, Himalaya) of ~ -3‰/km, in strong agreement with both empirical and Rayleigh distillation model derived lapse rates. The Great Plains of the US, although not an orogenic setting, represents a monotonic topographic rise, and is also characterized by a ~ -3‰/km lapse rate. In high, arid plateau regions (Basin and Range, Tibet), however, elevation lapse rates are ~ -1.5‰/km, half that of orogenic settings. An empirically derived lapse rate from small source area springs collected over a 2 km elevation change from a single mountain range in the Basin and Range yields an identical rate. One clue as to the source of this lowered lapse rate is eastern China, which also displays an elevation lapse rate of ~ -1.5‰/km, despite

  10. Thermodynamic studies of the interaction of alpha-chymotrypsin with water. II. Statistical analyses of the enthalpy-entropy compensation effect.

    PubMed

    Lüscher, M; Rüegg, M; Schindler, P

    1978-09-26

    Differential enthalpies (deltaH) and entropies (deltaS) of the interaction of water with a high and low temperature conformer of alpha-chymotrypsin were determined previously by multitemperature sorption measurements. The changes in (deltaH) and (deltaS) with water content of the protein were found to show a pronounced compensation pattern. It is known that van 't Hoff data may exhibit enthalpy-entropy compensation, which is entirely due to statistical error propagation. To discriminate between artifactual and significant compensation, the experimental results are analyzed by statistical methods. The results of two different statistical analyses show that a linear, chemically caused compensation effect can be established for the interaction of water with both chymotrypsin conformers. The compensation temperature beta = deltaH/deltaS was found to be 433 +/- 22 K. The compensation effect is detectable only in the water content range above the monolayer volume (upsilonm), computed by the Brunauer, Emmett and Teller equation. This result is discussed in terms of a monolayer hydration mechanism, formulated on the basis of previous thermodynamic results: The interaction of the first water monolayer with the charged and polar surface area of the dry protein, largely stabilizes its tertiary structure. Further water addition then occurs to a practically invariable protein surface. According to this mechanism (which ensures a maximum of conformational stability with a minimum of hydration water), large conformational changes can be expected to occur mainly in the monolayer water content range. This expectation is confirmed by extra-thermodynamic data (infrared and X-ray measurements). The thermodynamic quantities of the sorption process are thus governed by conformational effects below upsilonm. Above the monolayer water content range, however, the water binding process per se strongly predominates. The deltaH/deltaS compensation effect established for this water content
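    The compensation temperature beta = deltaH/deltaS can be estimated as the slope of a deltaH versus deltaS regression; the paired values below are synthetic, and a full analysis would also have to rule out the statistically artifactual compensation discussed above.

        # Sketch: compensation temperature beta estimated as the slope of a deltaH vs
        # deltaS regression. The paired values are synthetic stand-ins for sorption data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        delta_s = rng.uniform(-150.0, -50.0, 20)              # J mol^-1 K^-1
        delta_h = 433.0 * delta_s + rng.normal(0, 800.0, 20)  # J mol^-1, beta ~ 433 K

        fit = stats.linregress(delta_s, delta_h)
        print("compensation temperature beta = %.0f +/- %.0f K" % (fit.slope, fit.stderr))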

  11. Biochemical analyses of the antioxidative activity and chemical ingredients in eight different Allium alien monosomic addition lines.

    PubMed

    Yaguchi, Shigenori; Matsumoto, Misato; Date, Rie; Harada, Kazuki; Maeda, Toshimichi; Yamauchi, Naoki; Shigyo, Masayoshi

    2013-01-01

    We measured the antioxidant contents and antioxidative activities in eight Allium fistulosum-shallot monosomic addition lines (MAL; FF+1A-FF+8A). The high antioxidative activity lines (FF+2A and FF+6A) showed high polyphenol accumulation. These additional chromosomes (2A and 6A) would therefore have anonymous genes related to the upregulation of polyphenol production, the antioxidative activities consequently being increased in these MALs. PMID:24317054

  12. The Relationship between Visual Analysis and Five Statistical Analyses in a Simple AB Single-Case Research Design

    ERIC Educational Resources Information Center

    Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi

    2006-01-01

    This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…

  13. Adjusting the Adjusted X[superscript 2]/df Ratio Statistic for Dichotomous Item Response Theory Analyses: Does the Model Fit?

    ERIC Educational Resources Information Center

    Tay, Louis; Drasgow, Fritz

    2012-01-01

    Two Monte Carlo simulation studies investigated the effectiveness of the mean adjusted X[superscript 2]/df statistic proposed by Drasgow and colleagues and, because of problems with the method, a new approach for assessing the goodness of fit of an item response theory model was developed. It has been previously recommended that mean adjusted…

  14. RT-PCR and statistical analyses of adeABC expression in clinical isolates of Acinetobacter calcoaceticus-Acinetobacter baumannii complex.

    PubMed

    Ruzin, Alexey; Immermann, Frederick W; Bradford, Patricia A

    2010-06-01

    The relationship between expression of adeABC and minimal inhibitory concentration (MIC) of tigecycline was investigated by RT-PCR and statistical analyses in a population of 106 clinical isolates (MIC range, 0.0313-16 microg/ml) of Acinetobacter calcoaceticus-Acinetobacter baumannii complex. There was a statistically significant linear relationship (p < 0.0001) between log-transformed expression values and log-transformed MIC values, indicating that overexpression of AdeABC efflux pump is a prevalent mechanism for decreased susceptibility to tigecycline in A. calcoaceticus-A. baumannii complex. PMID:20438348
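    The log-log regression reported above can be sketched in a few lines; the expression and MIC values below are simulated stand-ins for the 106 isolates.

        # Sketch: testing a linear relationship between log-transformed pump expression
        # and log-transformed tigecycline MIC. Values are simulated, not the isolate data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)
        log2_mic = rng.choice(np.arange(-5, 5), size=106).astype(float)   # log2(MIC)
        log_expr = 0.4 * log2_mic + rng.normal(0, 0.8, 106)               # log relative expression

        fit = stats.linregress(log2_mic, log_expr)
        print("slope=%.2f, r=%.2f, p=%.2g" % (fit.slope, fit.rvalue, fit.pvalue))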

  15. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    NASA Astrophysics Data System (ADS)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have for many years been regarded as one of the most promising innovations in aerostructures, particularly for composite materials (fibre-reinforced resins). Layered materials still suffer from drastic reductions of the maximum allowable stress values during the design phase, as well as from costly and recurrent inspections during the life cycle, which prevent their structural and economic potential from being fully exploited in today's aircraft. These penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding). In order to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate such flaws (an SHM system) can be considered, once its effectiveness and reliability have been statistically demonstrated via a rigorous definition and evaluation of the Probability of Detection (POD) function. This paper presents an experimental approach and a statistical procedure for evaluating the detection threshold of a guided-wave-based SHM system aimed at delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented towards characterizing the statistical distribution of the measurements and damage metrics, as well as the detection capability of the system. Purely numerical simulations cannot replace the experimental tests required for POD assessment, where the noise in the system response is crucial. Results of the experiments are presented and analysed.

  16. Statistical analyses of variability/reproducibility of environmentally-assisted cyclic crack growth rate data relative to ΔK control modes

    NASA Astrophysics Data System (ADS)

    Tsuji, Hirokazu; Yokoyama, Norio; Nakajima, Hajime; Kondo, Tatsuo

    1993-06-01

    Statistical analyses were conducted by using the cyclic crack growth rate data for pressure vessel steels stored in the JAERI Material Performance Database (JMPD), and comparisons of the variability and/or reproducibility were made between data obtained by ΔK-increasing and ΔK-constant type tests. Based on the results of the statistical analyses, it was concluded that ΔK-constant type tests are generally superior to the commonly used ΔK-increasing type tests from the viewpoint of variability and/or reproducibility of the data. Such a tendency was more pronounced in the tests conducted in simulated LWR primary coolants than in those conducted in air.
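    A variability comparison of this kind can be sketched as an equality-of-variance test on log-transformed growth rates; the two samples below are synthetic, and the test choice (Levene) is an assumption for illustration, not necessarily the JAERI procedure.

        # Sketch: comparing the scatter of crack growth rates between Delta-K-increasing
        # and Delta-K-constant tests using Levene's test on log-transformed rates.
        # The rate values are synthetic placeholders.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(12)
        log_rate_increasing = rng.normal(-7.0, 0.35, 40)   # log10(da/dN), higher scatter
        log_rate_constant = rng.normal(-7.0, 0.15, 40)     # log10(da/dN), lower scatter

        stat, p = stats.levene(log_rate_increasing, log_rate_constant)
        print("Levene statistic=%.2f, p=%.3g" % (stat, p))
        print("std increasing=%.2f, std constant=%.2f"
              % (log_rate_increasing.std(ddof=1), log_rate_constant.std(ddof=1)))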

  17. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    EPA Science Inventory

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  18. Statistical analyses of the results of 25 years of beach litter surveys on the south-eastern North Sea coast.

    PubMed

    Schulz, Marcus; Clemens, Thomas; Förster, Harald; Harder, Thorsten; Fleet, David; Gaus, Silvia; Grave, Christel; Flegel, Imme; Schrey, Eckart; Hartwig, Eike

    2015-08-01

    In the North Sea, the amount of litter present in the marine environment represents a severe environmental problem. In order to assess the magnitude of the problem and measure changes in abundance, the results of two beach litter monitoring programmes were compared and analysed for long-term trends applying multivariate techniques. Total beach litter pollution was persistently high. Spatial differences in litter abundance made it difficult to identify long-term trends: At times, more than 8000 litter items year(-1) were recorded on a 100 m long survey site on the island of Scharhörn, while the survey site on the beach on the island of Amrum revealed abundances lower by two orders of magnitude. Beach litter was dominated by plastic with mean proportions of 52%-91% of total beach litter. Non-parametric time series analyses detected many significant trends, which, however, did not show any systematic spatial patterns. Cluster analyses partly led to groupings of beaches according to their exposure to sources of litter, wind and currents. Surveys at short intervals of one to two weeks were found to give higher annual sums of beach litter than the quarterly surveys of the OSPAR method. Surveys at regular intervals of four weeks to five months would make monitoring results more reliable. PMID:26026589
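    A plain (non-seasonal) Kendall trend test gives the flavour of the non-parametric time series analyses mentioned above; the annual counts below are synthetic, and dedicated packages such as pymannkendall provide the seasonal Mann-Kendall form better suited to such survey data.

        # Sketch: non-parametric trend detection in annual beach litter counts using
        # Kendall's tau against time (a simplified stand-in for seasonal Mann-Kendall).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(9)
        years = np.arange(1990, 2015)
        litter = 6000 + 40 * (years - 1990) + rng.normal(0, 600, len(years))  # items/year

        tau, p = stats.kendalltau(years, litter)
        print("Kendall tau=%.2f, p=%.3f -> %s trend" %
              (tau, p, "significant" if p < 0.05 else "no significant"))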

  19. Combined Statistical Analyses of Peptide Intensities and Peptide Occurrences Improves Identification of Significant Peptides from MS-based Proteomics Data

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; McCue, Lee Ann; Waters, Katrina M.; Matzke, Melissa M.; Jacobs, Jon M.; Metz, Thomas O.; Varnum, Susan M.; Pounds, Joel G.

    2010-11-01

    Liquid chromatography-mass spectrometry-based (LC-MS) proteomics uses peak intensities of proteolytic peptides to infer the differential abundance of peptides/proteins. However, substantial run-to-run variability in peptide intensities and observations (presence/absence) of peptides makes data analysis quite challenging. The missing abundance values in LC-MS proteomics data are difficult to address with traditional imputation-based approaches because the mechanisms by which data are missing are unknown a priori. Data can be missing due to random mechanisms such as experimental error, or non-random mechanisms such as a true biological effect. We present a statistical approach that uses a test of independence known as a G-test to test the null hypothesis of independence between the number of missing values and the experimental groups. We pair the G-test results evaluating independence of missing data (IMD) with a standard analysis of variance (ANOVA) that uses only means and variances computed from the observed data. Each peptide is therefore represented by two statistical confidence metrics, one for qualitative differential observation and one for quantitative differential intensity. We use two simulated and two real LC-MS datasets to demonstrate the robustness and sensitivity of the ANOVA-IMD approach for assigning confidence to peptides with significant differential abundance among experimental groups.
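    The paired qualitative/quantitative idea can be sketched for a single peptide: a G-test (log-likelihood ratio test of independence) on presence/absence counts and a one-way ANOVA on the observed intensities. All values below are simulated and the group labels are placeholders.

        # Sketch: the paired qualitative/quantitative tests for one peptide.
        # A G-test checks whether missingness depends on group; a one-way ANOVA
        # compares the intensities that were actually observed. Data are simulated.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        groups = np.repeat(["control", "treated"], 10)
        intensity = np.where(groups == "treated",
                             rng.normal(21.0, 0.5, 20), rng.normal(20.0, 0.5, 20))
        observed = rng.uniform(size=20) > np.where(groups == "treated", 0.1, 0.5)

        # Qualitative test: G-test of independence between missingness and group.
        table = np.array([[np.sum(observed[groups == g]),
                           np.sum(~observed[groups == g])] for g in ("control", "treated")])
        g_stat, p_imd, _, _ = stats.chi2_contingency(table, lambda_="log-likelihood")

        # Quantitative test: ANOVA on the observed intensities only.
        f_stat, p_anova = stats.f_oneway(intensity[(groups == "control") & observed],
                                         intensity[(groups == "treated") & observed])
        print("IMD (G-test) p=%.3f, ANOVA p=%.3g" % (p_imd, p_anova))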

  20. Literature review of some selected types of results and statistical analyses of total-ozone data. [for the ozonosphere

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1976-01-01

    The depletion of ozone in the stratosphere is examined, and causes for the depletion are cited. Ground station and satellite measurements of ozone, which are taken on a worldwide basis, are discussed. Instruments used in ozone measurement are discussed, such as the Dobson spectrophotometer, which is credited with providing the longest and most extensive series of observations for ground based observation of stratospheric ozone. Other ground based instruments used to measure ozone are also discussed. The statistical differences of ground based measurements of ozone from these different instruments are compared to each other, and to satellite measurements. Mathematical methods (i.e., trend analysis or linear regression analysis) of analyzing the variability of ozone concentration with respect to time and latitude are described. Various time series models which can be employed in accounting for ozone concentration variability are examined.

  1. Source apportionment of groundwater pollutants in Apulian agricultural sites using multivariate statistical analyses: case study of Foggia province

    PubMed Central

    2012-01-01

    Background Ground waters are an important resource of water supply for human health and activities. Groundwater uses and applications are often related to its composition, which is increasingly influenced by human activities. In fact the water quality of groundwater is affected by many factors including precipitation, surface runoff, groundwater flow, and the characteristics of the catchment area. During the years 2004-2007 the Agricultural and Food Authority of Apulia Region implemented the project “Expansion of regional agro-meteorological network” in order to assess, monitor and manage regional groundwater quality. The total wells monitored during this activity amounted to 473, and the water samples analyzed were 1021. This resulted in a huge and complex data matrix comprised of a large number of physical-chemical parameters, which is often difficult to interpret and from which it is difficult to draw meaningful conclusions. The application of different multivariate statistical techniques such as Cluster Analysis (CA), Principal Component Analysis (PCA), Absolute Principal Component Scores (APCS) for interpretation of the complex databases offers a better understanding of water quality in the study region. Results From the results obtained by Principal Component and Cluster Analysis applied to the data set of the Foggia province it is evident that some sampling sites investigated show dissimilarities, mostly due to the location of the site, the land use and management techniques and groundwater overuse. By the APCS method it has been possible to identify three pollutant sources: Agricultural pollution 1 due to fertilizer applications, Agricultural pollution 2 due to microelements for agriculture and groundwater overuse and a third source that can be identified as soil runoff and rock tracer mining. Conclusions Multivariate statistical methods represent a valid tool to understand the complex nature of groundwater quality issues, determine priorities in the use of ground waters as irrigation water

  2. Statistical Analyses of Satellite Cloud Object Data From CERES. Part 4; Boundary-layer Cloud Objects During 1998 El Nino

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Wong, Takmeng; Wielicki, Bruce A.; Parker, Lindsay

    2006-01-01

    Three boundary-layer cloud object types, stratus, stratocumulus and cumulus, that occurred over the Pacific Ocean during January-August 1998, are identified from the CERES (Clouds and the Earth's Radiant Energy System) single scanner footprint (SSF) data from the TRMM (Tropical Rainfall Measuring Mission) satellite. This study emphasizes the differences and similarities in the characteristics of each cloud-object type between the tropical and subtropical regions and among different size categories and among small geographic areas. Both the frequencies of occurrence and statistical distributions of cloud physical properties are analyzed. In terms of frequencies of occurrence, stratocumulus clouds dominate the entire boundary layer cloud population in all regions and among all size categories. Stratus clouds are more prevalent in the subtropics and near the coastal regions, while cumulus clouds are relatively prevalent over open ocean and the equatorial regions, particularly within the small size categories. The largest size category of stratus cloud objects occurs more frequently in the subtropics than in the tropics and has much larger average size than its cumulus and stratocumulus counterparts. Each of the three cloud object types exhibits small differences in statistical distributions of cloud optical depth, liquid water path, TOA albedo and perhaps cloud-top height, but large differences in those of cloud-top temperature and OLR between the tropics and subtropics. Differences in the sea surface temperature (SST) distributions between the tropics and subtropics influence some of the cloud macrophysical properties, but cloud microphysical properties and albedo for each cloud object type are likely determined by (local) boundary-layer dynamics and structures. Systematic variations of cloud optical depth, TOA albedo, cloud-top height, OLR and SST with cloud object sizes are pronounced for the stratocumulus and stratus types, which are related to systematic

  3. Hydrogeochemical Processes of Groundwater Using Multivariate Statistical Analyses and Inverse Geochemical Modeling in Samrak Park of Nakdong River Basin, Korea

    NASA Astrophysics Data System (ADS)

    Chung, Sang Yong

    2015-04-01

    Multivariate statistical methods and inverse geochemical modelling were used to assess the hydrogeochemical processes of groundwater in the Nakdong River basin. The study area is located in a part of the Nakdong River basin, in the Busan Metropolitan City, Korea. Quaternary deposits form the Samrak Park region and are underlain by intrusive rocks of the Bulkuksa group and sedimentary rocks of the Yucheon group of the Cretaceous Period. The Samrak Park region contains two aquifer systems, an unconfined aquifer and a confined aquifer. The unconfined aquifer consists of upper sand, and the confined aquifer is composed of clay, lower sand, gravel and weathered rock. Porosity and hydraulic conductivity of the area are 37 to 59% and 1.7 to 200 m/day, respectively. Depth of the wells ranges from 9 to 77 m. In Piper's trilinear diagram, the CaCl2 type characterized the unconfined aquifer and the NaCl type was dominant for the confined aquifer. By hierarchical cluster analysis (HCA), Group 1 and Group 2 are fully composed of unconfined aquifer and confined aquifer samples, respectively. In factor analysis (FA), Factor 1 is described by the strong loadings of EC, Na, K, Ca, Mg, Cl, HCO3, SO4 and Si, and Factor 2 represents the strong loadings of pH and Al. Based on the Gibbs diagram, the unconfined and confined aquifer samples are scattered discretely in the rock and evaporation areas. The principal hydrogeochemical processes occurring in the confined and unconfined aquifers are ion exchange due to the phenomena of freshening under natural recharge and water-rock interactions followed by evaporation and dissolution. The saturation index of minerals such as Ca-montmorillonite, dolomite and calcite indicates oversaturation, while albite, gypsum and halite are undersaturated. Inverse geochemical modeling using the PHREEQC code demonstrated that relatively few phases were required to derive the differences in groundwater chemistry along the flow path in the area. It also suggested that dissolution of carbonate and ion exchange

  4. Extended Statistical Short-Range Guidance for Peak Wind Speed Analyses at the Shuttle Landing Facility: Phase II Results

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.

    2003-01-01

    This report describes the results from Phase II of the AMU's Short-Range Statistical Forecasting task for peak winds at the Shuttle Landing Facility (SLF). The peak wind speeds are an important forecast element for the Space Shuttle and Expendable Launch Vehicle programs. The 45th Weather Squadron and the Spaceflight Meteorology Group indicate that peak winds are challenging to forecast. The Applied Meteorology Unit was tasked to develop tools that aid in short-range forecasts of peak winds at tower sites of operational interest. A seven year record of wind tower data was used in the analysis. Hourly and directional climatologies by tower and month were developed to determine the seasonal behavior of the average and peak winds. Probability density functions (PDF) of peak wind speed were calculated to determine the distribution of peak speed with average speed. These provide forecasters with a means of determining the probability of meeting or exceeding a certain peak wind given an observed or forecast average speed. A PC-based Graphical User Interface (GUI) tool was created to display the data quickly.
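    The conditional peak-wind probabilities described above can be sketched as empirical exceedance frequencies binned by average speed; the wind values, threshold and bin edges below are synthetic assumptions rather than SLF tower data.

        # Sketch: empirical probability that the peak wind meets or exceeds a threshold,
        # conditioned on the observed average speed (binned). Winds are simulated in knots.
        import numpy as np

        rng = np.random.default_rng(11)
        avg = rng.gamma(4.0, 2.5, 5000)                       # hourly average speed (kt)
        gust_factor = 1.0 + rng.lognormal(-0.9, 0.35, 5000)
        peak = avg * gust_factor                              # hourly peak speed (kt)

        threshold = 25.0                                      # illustrative peak-wind limit (kt)
        bins = np.arange(0, 35, 5)
        idx = np.digitize(avg, bins)
        for k in range(1, len(bins)):
            sel = idx == k
            if sel.sum() > 20:
                prob = np.mean(peak[sel] >= threshold)
                print("avg %2d-%2d kt: P(peak >= %d kt) = %.2f"
                      % (bins[k - 1], bins[k], threshold, prob))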

  5. Water quality variation in the highly disturbed Huai River Basin, China from 1994 to 2005 by multi-statistical analyses.

    PubMed

    Zhai, Xiaoyan; Xia, Jun; Zhang, Yongyong

    2014-10-15

    Water quality deterioration is a prominent issue threatening water security throughout the world. Huai River Basin, as the sixth largest basin in China, is facing the most severe water pollution and high disturbance. Statistical detection of water quality trends and identification of human interferences are significant for sustainable water quality management. Three key water quality elements (ammonium nitrogen: NH3-N, permanganate index: CODMn and dissolved oxygen: DO) at 18 monitoring stations were selected to analyze their spatio-temporal variations in the highly disturbed Huai River Basin using seasonal Mann-Kendall test and Moran's I method. Relationship between surrounding water environment and anthropogenic activities (point source emission, land use) was investigated by regression analysis. The results indicated that water environment was significantly improved on the whole from 1994 to 2005. CODMn and NH3-N concentrations decreased at half of the stations, and DO concentration increased significantly at 39% (7/18) stations. The high pollution cluster centers for both NH3-N and CODMn were in the middle stream of Shaying River and Guo River in the 2000s. Water quality of Huai River Basin was mainly influenced by point source pollution emission, flows regulated by dams, water temperature and land use variations and so on. This study was expected to provide insights into water quality evolution and foundations for water quality management in Huai River Basin, and scientific references for the implementation of water pollution prevention in China. PMID:25108800

  6. How to Tell the Truth with Statistics: The Case for Accountable Data Analyses in Team-based Science

    PubMed Central

    Gelfond, Jonathan A. L.; Klugman, Craig M.; Welty, Leah J.; Heitman, Elizabeth; Louden, Christopher; Pollock, Brad H.

    2015-01-01

    Data analysis is essential to translational medicine, epidemiology, and the scientific process. Although recent advances in promoting reproducibility and reporting standards have made some improvements, the data analysis process remains insufficiently documented and susceptible to avoidable errors, bias, and even fraud. Comprehensively accounting for the full analytical process requires not only records of the statistical methodology used, but also records of communications among the research team. In this regard, the data analysis process can benefit from the principle of accountability that is inherent in other disciplines such as clinical practice. We propose a novel framework for capturing the analytical narrative called the Accountable Data Analysis Process (ADAP), which allows the entire research team to participate in the analysis in a supervised and transparent way. The framework is analogous to an electronic health record in which the dataset is the “patient” and actions related to the dataset are recorded in a project management system. We discuss the design, advantages, and challenges in implementing this type of system in the context of academic health centers, where team based science increasingly demands accountability. PMID:26290897

  7. Testing for Additivity in Chemical Mixtures Using a Fixed-Ratio Ray Design and Statistical Equivalence Testing Methods

    EPA Science Inventory

    Fixed-ratio ray designs have been used for detecting and characterizing interactions of large numbers of chemicals in combination. Single chemical dose-response data are used to predict an “additivity curve” along an environmentally relevant ray. A “mixture curve” is estimated fr...

  8. Statistical analyses in Swedish randomised trials on mammography screening and in other randomised trials on cancer screening: a systematic review

    PubMed Central

    Boniol, Mathieu; Smans, Michel; Sullivan, Richard; Boyle, Peter

    2015-01-01

    Objectives We compared calculations of relative risks of cancer death in Swedish mammography trials and in other cancer screening trials. Participants Men and women from 30 to 74 years of age. Setting Randomised trials on cancer screening. Design For each trial, we identified the intervention period, when screening was offered to screening groups and not to control groups, and the post-intervention period, when screening (or absence of screening) was the same in screening and control groups. We then examined which cancer deaths had been used for the computation of relative risk of cancer death. Main outcome measures Relative risk of cancer death. Results In 17 non-breast screening trials, deaths due to cancers diagnosed during the intervention and post-intervention periods were used for relative risk calculations. In the five Swedish trials, relative risk calculations used deaths due to breast cancers found during intervention periods, but deaths due to breast cancer found at first screening of control groups were added to these groups. After reallocation of the added breast cancer deaths to post-intervention periods of control groups, relative risks of 0.86 (0.76; 0.97) were obtained for cancers found during intervention periods and 0.83 (0.71; 0.97) for cancers found during post-intervention periods, indicating constant reduction in the risk of breast cancer death during follow-up, irrespective of screening. Conclusions The use of unconventional statistical methods in Swedish trials has led to overestimation of risk reduction in breast cancer death attributable to mammography screening. The constant risk reduction observed in screening groups was probably due to the trial design that optimised awareness and medical management of women allocated to screening groups. PMID:26152677

  9. Combining the Power of Statistical Analyses and Community Interviews to Identify Adoption Barriers for Stormwater Best-Management Practices

    NASA Astrophysics Data System (ADS)

    Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.

    2015-12-01

    Urban stormwater is an on-going management concern in municipalities of all sizes. In both combined and separated sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of the BMP, and the change in water quality directly from the runoff of these infrastructures. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time-consuming, relying on significant sources of funds, which a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL are being used to investigate residents' understanding of water quality and best-management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
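
    The study's power analysis was run in R; as a rough, hedged sketch of the same kind of calculation (illustrative effect size, not the study's data or code), the number of samples needed to detect a small standardized improvement can be estimated in Python with statsmodels:

        from statsmodels.stats.power import TTestIndPower

        effect_size = 0.2   # assumed small improvement (Cohen's d); purely illustrative
        n_per_group = TTestIndPower().solve_power(effect_size=effect_size,
                                                  alpha=0.05, power=0.8)
        print(f"about {n_per_group:.0f} weekly samples per period")  # several hundred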

  10. CT-Based Attenuation Correction in Brain SPECT/CT Can Improve the Lesion Detectability of Voxel-Based Statistical Analyses

    PubMed Central

    Kato, Hiroki; Shimosegawa, Eku; Fujino, Koichi; Hatazawa, Jun

    2016-01-01

    Background Integrated SPECT/CT enables non-uniform attenuation correction (AC) using built-in CT instead of the conventional uniform AC. The effect of CT-based AC on voxel-based statistical analyses of brain SPECT findings has not yet been clarified. Here, we assessed differences in the detectability of regional cerebral blood flow (CBF) reduction using SPECT voxel-based statistical analyses based on the two types of AC methods. Subjects and Methods N-isopropyl-p-[123I]iodoamphetamine (IMP) CBF SPECT images were acquired for all the subjects and were reconstructed using 3D-OSEM with two different AC methods: Chang’s method (Chang’s AC) and the CT-based AC method. A normal database was constructed for the analysis using SPECT findings obtained for 25 healthy normal volunteers. Voxel-based Z-statistics were also calculated for SPECT findings obtained for 15 patients with chronic cerebral infarctions and 10 normal subjects. We assumed that an analysis with a higher specificity would likely produce a lower mean absolute Z-score for normal brain tissue, and that a more sensitive voxel-based statistical analysis would likely produce a higher absolute Z-score in old infarct lesions, where the CBF was severely decreased. Results The inter-subject variation in the voxel values in the normal database was lower using CT-based AC, compared with Chang’s AC, for most of the brain regions. The absolute Z-score indicating a SPECT count reduction in infarct lesions was also significantly higher in the images reconstructed using CT-based AC, compared with Chang’s AC (P = 0.003). The mean absolute value of the Z-score in the 10 intact brains was significantly lower in the images reconstructed using CT-based AC than in those reconstructed using Chang’s AC (P = 0.005). Conclusions Non-uniform CT-based AC by integrated SPECT/CT significantly improved the sensitivity and specificity of the voxel-based statistical analyses for regional SPECT count reductions, compared with
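
    The voxel-based Z-statistics described here compare each patient voxel with a normal database. A minimal sketch of that step (toy arrays, not the IMP SPECT data or the authors' software):

        import numpy as np

        rng = np.random.default_rng(0)
        normal_db = rng.random((25, 64, 64, 64))   # 25 normal volunteers (toy volumes)
        patient = rng.random((64, 64, 64))

        mu = normal_db.mean(axis=0)
        sd = normal_db.std(axis=0, ddof=1)
        z = (patient - mu) / sd                    # strongly negative Z = count reduction
        print("mean |Z| over the volume:", float(np.abs(z).mean()))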

  11. Synthesis of Aza-m-Xylylene diradicals with large singlet-triplet energy gap and statistical analyses of their EPR spectra

    SciTech Connect

    Olankitwanit, Arnon; Pink, Maren; Rajca, Suchada; Rajca, Andrzej

    2014-10-08

    We describe synthesis and characterization of a derivative of aza-m-xylylene, diradical 2, that is persistent in solution at room temperature with the half-life measured in minutes (~80–250 s) and in which the triplet ground state is below the lowest singlet state by >10 kcal mol⁻¹. The triplet ground states and ΔEST of 2 in glassy solvent matrix are determined by a new approach based on statistical analyses of their EPR spectra. Characterization and analysis of the analogous diradical 1 are carried out for comparison. Statistical analyses of their EPR spectra reliably provide improved lower bounds for ΔEST (from >0.4 to >0.6 kcal mol⁻¹) and are compatible with a wide range of relative contents of diradical vs monoradical, including samples in which the diradical and monoradical are minor and major components, respectively. This demonstrates a new powerful method for the determination of the triplet ground states and ΔEST applicable to moderately pure diradicals in matrices.

  12. Unlocking Data for Statistical Analyses and Data Mining: Generic Case Extraction of Clinical Items from i2b2 and tranSMART.

    PubMed

    Firnkorn, Daniel; Merker, Sebastian; Ganzinger, Matthias; Muley, Thomas; Knaup, Petra

    2016-01-01

    In medical science, modern IT concepts are increasingly important to gather new findings out of complex diseases. Data Warehouses (DWH) as central data repository systems play a key role by providing standardized, high-quality and secure medical data for effective analyses. However, DWHs in medicine must fulfil various requirements concerning data privacy and the ability to describe the complexity of (rare) disease phenomena. Here, i2b2 and tranSMART are free alternatives representing DWH solutions especially developed for medical informatics purposes. But different functionalities are not yet provided in a sufficient way. In fact, data import and export are still a major problem because of the diversity of schemas, parameter definitions and data quality, which are described differently in each clinic. Further, statistical analyses inside i2b2 and tranSMART are possible, but restricted to the implemented functions. Thus, data export is needed to provide a data basis which can be directly included within statistics software like SPSS and SAS or data mining tools like Weka and RapidMiner. The standard export tools of i2b2 and tranSMART are more or less creating a database dump of key-value pairs which cannot be used immediately by the mentioned tools. They need an instance-based or a case-based representation of each patient. To overcome this limitation, we developed a concept called Generic Case Extractor (GCE) which pivots the key-value pairs of each clinical fact into a row-oriented format for each patient, sufficient to enable analyses in a broader context. Therefore, complex pivotisation routines were necessary to ensure temporal consistency, especially in terms of different data sets and the occurrence of identical but repeated parameters like follow-up data. GCE is embedded inside a comprehensive software platform for systems medicine. PMID:27577447
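
    The pivot GCE performs, from key-value facts to one case-based row per patient, can be sketched generically with pandas (a toy illustration under assumed column names, not the GCE implementation, and without its temporal-consistency handling for repeated follow-up parameters):

        import pandas as pd

        facts = pd.DataFrame({
            "patient_id": [1, 1, 2, 2],
            "concept": ["age", "stage", "age", "stage"],
            "value": [63, "IIIa", 58, "IV"],
        })
        # One row per patient, one column per clinical concept.
        cases = facts.pivot(index="patient_id", columns="concept", values="value")
        print(cases)   # ready for export to SPSS/SAS/Weka-style tools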

  13. Delineation and evaluation of hydrologic-landscape regions in the United States using geographic information system tools and multivariate statistical analyses.

    USGS Publications Warehouse

    Wolock, D.M.; Winter, T.C.; McMahon, G.

    2004-01-01

    Hydrologic-landscape regions in the United States were delineated by using geographic information system (GIS) tools combined with principal components and cluster analyses. The GIS and statistical analyses were applied to land-surface form, geologic texture (permeability of the soil and bedrock), and climate variables that describe the physical and climatic setting of 43,931 small (approximately 200 km2) watersheds in the United States. (The term "watersheds" is defined in this paper as the drainage areas of tributary streams, headwater streams, and stream segments lying between two confluences.) The analyses grouped the watersheds into 20 noncontiguous regions based on similarities in land-surface form, geologic texture, and climate characteristics. The percentage of explained variance (R-squared value) in an analysis of variance was used to compare the hydrologic-landscape regions to 19 square geometric regions and the 21 U.S. Environmental Protection Agency level-II ecoregions. Hydrologic-landscape regions generally were better than ecoregions at delineating regions of distinct land-surface form and geologic texture. Hydrologic-landscape regions and ecoregions were equally effective at defining regions in terms of climate, land cover, and water-quality characteristics. For about half of the landscape, climate, and water-quality characteristics, the R-squared values of square geometric regions were as high as hydrologic-landscape regions or ecoregions.
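
    The general workflow (standardize the watershed attributes, reduce with principal components, then cluster) can be sketched as follows; the arrays are random placeholders, not the 43,931-watershed dataset, and the paper's exact component and cluster choices are not reproduced here.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        X = rng.random((1000, 8))     # watersheds x (land-surface form, texture, climate)
        scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(X))
        regions = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(scores)
        print(np.bincount(regions))   # watershed counts per candidate region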

  14. Three-way, Statistically-based Comparison of Convective Clouds and Precipitation in Satellite Observations, GEOS-5 Retrospective Analyses, and Cloud Resolving Model Simulations.

    NASA Astrophysics Data System (ADS)

    Bacmeister, J.; Tao, W.; Suarez, M.; Rienecker, M.

    2008-12-01

    High-resolution satellite datasets such as those from the CloudSat and TRMM instruments provide an unprecedented view of global cloud and precipitation fields. We will combine these observations with new global 0.5x0.66 retrospective analyses from the Goddard Earth Observing System version 5 (GEOS-5) to examine convective events and their parameterized counterparts in atmospheric models. We show, for example, that simple predictors of convective cloud depth based on analyzed profiles of temperature and humidity do a good job of describing the statistics of cloud depth-scales obtained from CloudSat radiances. At the same time, comparisons with parameterized convection in the global model indicate possibly significant differences in character between parameterized and observed convection. Cloud resolving model simulations are used to examine uncertainties arising from satellite sampling, and to examine relationships with other important, unobserved variables such as convective mass flux

  15. Statistical properties of coastal long waves analysed through sea-level time-gradient functions: exemplary analysis of the Siracusa, Italy, tide-gauge data

    NASA Astrophysics Data System (ADS)

    Bressan, L.; Tinti, S.

    2015-09-01

    This study presents a new method to analyse the properties of the sea-level signal recorded by coastal tide gauges in the long wave range that is in a window between wind/storm waves and tides and is typical of several phenomena like local seiches, coastal shelf resonances and tsunamis. The method consists of computing four specific functions based on the time gradient (slope) of the recorded sea level oscillations, namely the instantaneous slope (IS), and three more functions based on IS, that are the sea level (SL), the background slope (BS) and the control function (CF). These functions are examined through a traditional spectral FFT analysis and also through a statistical analysis showing that they can be characterised by probability distribution functions (PDFs) such as the Student's t distribution (IS and SL) and the Beta distribution (CF). As an example, the method has been applied to data from the tide-gauge station of Siracusa, Italy.

  16. How different statistical analyses of grain size data can be used for facies determination and palaeoenvironmental reconstructions - an example from the Black Sea coast of Georgia

    NASA Astrophysics Data System (ADS)

    Riedesel, Svenja; Opitz, Stephan; Kelterbaum, Daniel; Laermanns, Hannes; Seeliger, Martin; Rölkens, Julian; Elashvili, Mikaheil; Brueckner, Helmut

    2016-04-01

    Granulometric analyses enable precise and significant statements about sediment transport processes and depositional environments. Bivariate statistics of graphical parameters (mean, sorting, skewness, kurtosis) of grain-size distributions open the opportunity to analyse grain sizes in the context of sediment transport patterns and to differentiate between several depositional environments. While such approaches may be limited to unimodal grain-size distributions, the statistical method of the end-member modelling algorithm (EMMA) was created to solve the explicit mixing problem of multimodal grain-size distributions. EMMA enables the extraction of robust end-members from the original dataset. A comparison of extracted end-members with the grain-size distributions of recent surface samples allows assumptions about transport processes and depositional environments. Bivariate statistics of graphical grain-size parameters and EMMA were performed on a 9 m long sediment record, taken from a beach ridge sequence at the coastal area of western Georgia. Whereas biplots of calculated parameters give valid information on modern environments, this method fails for the reconstruction of palaeoenvironments. However, by applying EMMA it is possible to extract four robust end-members and combine them with grain-size distributions of modern surface samples. Results gained from EMMA indicate a threefold division of the sediment core (Units 1, 2 and 3, from bottom to top). End-members (EM) 1 and 2 show multimodal grain-size distributions, quite similar to the distributions of modern low-energy fluvial deposits. Such comparable distributions do not indicate exactly the same transport system in the present and the past, but give a hint on the energy level and the flow velocity of the transport medium. Whereas EM 1 and 2 represent most of the relative EM amount in Unit 2, EM 3 and 4 dominate Units 1 and 3. They are represented by unimodal distributions, only differing by the position of their peak, which is
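
    EMMA itself is an eigenspace-based unmixing algorithm; purely to convey the idea of decomposing multimodal grain-size distributions into a few end-members, a non-negative matrix factorization on toy data (not the Georgia core, and not the EMMA algorithm proper) can be sketched as:

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        X = rng.random((100, 60))                    # samples x grain-size classes
        X = X / X.sum(axis=1, keepdims=True)         # each distribution sums to 1
        model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
        loadings = model.fit_transform(X)            # end-member contributions per sample
        end_members = model.components_              # end-member grain-size distributions
        print(loadings.shape, end_members.shape)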

  17. Phylogenetic Analyses and Characterization of RNase X25 from Drosophila melanogaster Suggest a Conserved Housekeeping Role and Additional Functions for RNase T2 Enzymes in Protostomes

    PubMed Central

    Ambrosio, Linda; Bailey, Ryan; Ding, Jian; MacIntosh, Gustavo C.

    2014-01-01

    Ribonucleases belonging to the RNase T2 family are enzymes associated with the secretory pathway that are almost absolutely conserved in all eukaryotes. Studies in plants and vertebrates suggest they have an important housekeeping function in rRNA recycling. However, little is known about this family of enzymes in protostomes. We characterized RNase X25, the only RNase T2 enzyme in Drosophila melanogaster. We found that RNase X25 is the major contributor of ribonuclease activity in flies as detected by in gel assays, and has an acidic pH preference. Gene expression analyses showed that the RNase X25 transcript is present in all adult tissues and developmental stages. RNase X25 expression is elevated in response to nutritional stresses; consistent with the hypothesis that this enzyme has a housekeeping role in recycling RNA. A correlation between induction of RNase X25 expression and autophagy was observed. Moreover, induction of gene expression was triggered by oxidative stress suggesting that RNase X25 may have additional roles in stress responses. Phylogenetic analyses of this family in protostomes showed that RNase T2 genes have undergone duplication events followed by divergence in several phyla, including the loss of catalytic residues, and suggest that RNase T2 proteins have acquired novel functions. Among those, it is likely that a role in host immunosuppression evolved independently in several groups, including parasitic Platyhelminthes and parasitoid wasps. The presence of only one RNase T2 gene in the D. melanogaster genome, without any other evident secretory RNase activity detected, makes this organism an ideal system to study the cellular functions of RNase T2 proteins associated with RNA recycling and maintenance of cellular homeostasis. On the other hand, the discovery of gene duplications in several protostome genomes also presents interesting new avenues to study additional biological functions of this ancient family of proteins. PMID:25133712

  18. Novel Flow Cytometry Analyses of Boar Sperm Viability: Can the Addition of Whole Sperm-Rich Fraction Seminal Plasma to Frozen-Thawed Boar Sperm Affect It?

    PubMed Central

    Díaz, Rommy; Boguen, Rodrigo; Martins, Simone Maria Massami Kitamura; Ravagnani, Gisele Mouro; Leal, Diego Feitosa; Oliveira, Melissa de Lima; Muro, Bruno Bracco Donatelli; Parra, Beatriz Martins; Meirelles, Flávio Vieira; Papa, Frederico Ozanan; Dell’Aqua, José Antônio; Alvarenga, Marco Antônio; Moretti, Aníbal de Sant’Anna; Sepúlveda, Néstor

    2016-01-01

    Boar semen cryopreservation remains a challenge due to the extent of cold shock damage. Thus, many alternatives have emerged to improve the quality of frozen-thawed boar sperm. The use of seminal plasma arising from the boar sperm-rich fraction (SP-SRF) has shown good efficacy; however, the majority of current sperm evaluation techniques include only single or dual sperm parameter analyses, which overrate real sperm viability. Within this context, this work was performed to introduce a sperm flow cytometry fourfold stain technique for simultaneous evaluation of plasma and acrosomal membrane integrity and mitochondrial membrane potential. We then used the sperm flow cytometry fourfold stain technique to study the effect of SP-SRF on frozen-thawed boar sperm and further evaluated the effect of this treatment on sperm movement, tyrosine phosphorylation and fertility rate (FR). The sperm fourfold stain technique is accurate (R2 = 0.9356, p > 0.01) for simultaneous evaluation of plasma and acrosomal membrane integrity and mitochondrial membrane potential (IPIAH cells). Centrifugation pre-cryopreservation was not deleterious (p > 0.05) for any analyzed variables. Addition of SP-SRF after cryopreservation was able to improve total and progressive motility (p < 0.05) when boar semen was cryopreserved without SP-SRF; however, it was not able to decrease tyrosine phosphorylation (p > 0.05) or improve IPIAH cells (p > 0.05). FR was not (p > 0.05) statistically increased by the addition of seminal plasma, though females inseminated with frozen-thawed boar semen plus SP-SRF did perform better than those inseminated with sperm lacking seminal plasma. Thus, we conclude that the sperm fourfold stain can be used to simultaneously evaluate plasma and acrosomal membrane integrity and mitochondrial membrane potential, and that the addition of SP-SRF to thawed boar semen cryopreserved in the absence of SP-SRF improves its total and progressive motility. PMID:27529819

  19. The Effect of Ignoring Statistical Interactions in Regression Analyses Conducted in Epidemiologic Studies: An Example with Survival Analysis Using Cox Proportional Hazards Regression Model

    PubMed Central

    Vatcheva, KP; Lee, M; McCormick, JB; Rahbar, MH

    2016-01-01

    Objective To demonstrate the adverse impact of ignoring statistical interactions in regression models used in epidemiologic studies. Study design and setting Based on different scenarios involving known values for the coefficient of the interaction term in Cox regression models, we generated 1000 samples of size 600 each. The simulated samples and a real-life data set from the Cameron County Hispanic Cohort were used to evaluate the effect of ignoring statistical interactions in these models. Results Compared to correctly specified Cox regression models with interaction terms, misspecified models without interaction terms resulted in up to 8.95-fold bias in estimated regression coefficients. In contrast, when data were generated from a perfectly additive Cox proportional hazards regression model, the inclusion of the interaction between the two covariates resulted in only 2% estimated bias in the main-effect regression coefficient estimates and did not alter the main finding of no significant interaction. Conclusions When the effects are synergistic, the failure to account for an interaction effect could lead to bias and misinterpretation of the results, and in some instances to incorrect policy decisions. Best practices in regression analysis must include the identification of interactions, including for the analysis of data from epidemiologic studies.
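
    A minimal simulation of the point being made, fitting a Cox model with and without the interaction term on synthetic data (assuming the lifelines package; not the authors' code or the Cameron County data), might look like:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 600
        x1, x2 = rng.integers(0, 2, n), rng.integers(0, 2, n)
        hazard = np.exp(0.4 * x1 + 0.3 * x2 + 0.8 * x1 * x2)   # true model is interactive
        df = pd.DataFrame({"time": rng.exponential(1 / hazard), "event": 1,
                           "x1": x1, "x2": x2, "x1_x2": x1 * x2})

        main_only = CoxPHFitter().fit(df[["time", "event", "x1", "x2"]], "time", "event")
        with_int = CoxPHFitter().fit(df, "time", "event")
        print(main_only.params_)   # biased main effects when the interaction is ignored
        print(with_int.params_)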

  20. A preliminary study of the statistical analyses and sampling strategies associated with the integration of remote sensing capabilities into the current agricultural crop forecasting system

    NASA Technical Reports Server (NTRS)

    Sand, F.; Christie, R.

    1975-01-01

    Extending the crop survey application of remote sensing from small experimental regions to state and national levels requires that a sample of agricultural fields be chosen for remote sensing of crop acreage, and that a statistical estimate be formulated with measurable characteristics. The critical requirements for the success of the application are reviewed in this report. The problem of sampling in the presence of cloud cover is discussed. Integration of remotely sensed information about crops into current agricultural crop forecasting systems is treated on the basis of the USDA multiple frame survey concepts, with an assumed addition of a new frame derived from remote sensing. Evolution of a crop forecasting system which utilizes LANDSAT and future remote sensing systems is projected for the 1975-1990 time frame.

  1. Summary of statistical and trend analyses of selected water-quality data collected near the Big Thicket National Preserve, southeast Texas

    USGS Publications Warehouse

    Wells, F.C.; Bourdon, K.C.

    1985-01-01

    Statistical and trend analyses of selected water-quality data collected at three streamflow stations in the lower Neches River basin, Texas, are summarized in order to document baseline water-quality conditions in stream segments that flow through the Big Thicket National Preserve in southeast Texas. Dissolved solids concentrations in the streams are small, less than 132 milligrams per liter in 50 percent of the samples analyzed from each of the sites. Dissolved oxygen concentrations in the Neches River at Evadale (08041000) are generally large, exceeding 8.0 milligrams per liter in more than 50 percent of the samples analyzed. Total nitrogen and total phosphorus concentrations in samples from this site have not exceeded 1.8 and 0.20 milligrams per liter, respectively. Trend tests for dissolved solids and major ions indicate that small down-trends in total alkalinity, dissolved calcium, and hardness occurred in the Neches River at Evadale (08041000) and Pine-Island Bayou near Sour Lake (08041700). Small uptrends in dissolved sulfate were detected at all three stations in the study area. (USGS)
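
    The trend tests are not named in this record; as a hedged sketch of a common rank-based trend test of this general kind (Kendall's tau on a synthetic series, not the Neches River data, and simpler than the seasonal procedures often used for water quality):

        import numpy as np
        from scipy.stats import kendalltau

        rng = np.random.default_rng(1)
        years = np.arange(1968, 1984)
        alkalinity = 120 - 0.4 * (years - years[0]) + rng.normal(0, 2, years.size)
        tau, p = kendalltau(years, alkalinity)
        print(f"Kendall tau = {tau:.2f}, p = {p:.3f}")   # negative tau suggests a downtrend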

  2. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    SciTech Connect

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. Highlights: a new method of clinker characterization; combination of the electron probe technique with cluster analysis; simultaneous assessment of phase abundance, composition and bulk chemistry; experimental validation performed on industrial clinkers.

  3. Statistical properties of coastal long waves analysed through sea-level time-gradient functions: exemplary analysis of the Siracusa, Italy, tide-gauge data

    NASA Astrophysics Data System (ADS)

    Bressan, L.; Tinti, S.

    2016-01-01

    This study presents a new method to analyse the properties of the sea-level signal recorded by coastal tide gauges in the long wave range that is in a window between wind/storm waves and tides and is typical of several phenomena like local seiches, coastal shelf resonances and tsunamis. The method consists of computing four specific functions based on the time gradient (slope) of the recorded sea level oscillations, namely the instantaneous slope (IS) as well as three more functions based on IS, namely the reconstructed sea level (RSL), the background slope (BS) and the control function (CF). These functions are examined through a traditional spectral fast Fourier transform (FFT) analysis and also through a statistical analysis, showing that they can be characterised by probability distribution functions (PDFs) such as the Student's t distribution (IS and RSL) and the beta distribution (CF). As an example, the method has been applied to data from the tide-gauge station of Siracusa, Italy.
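
    A hedged sketch of the first of these functions, the instantaneous slope, and of fitting a Student's t distribution to it (synthetic tide-like signal, not the Siracusa record or the authors' processing chain):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        t = np.arange(0, 24 * 3600, 60.0)               # one day at 1-min sampling
        sea_level = (0.3 * np.sin(2 * np.pi * t / (12.42 * 3600))
                     + 0.02 * rng.normal(size=t.size))
        IS = np.gradient(sea_level, t)                  # instantaneous slope
        df_t, loc, scale = stats.t.fit(IS)              # Student's t parameters
        print(f"fitted t distribution: df = {df_t:.1f}")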

  4. The use of mass spectrometry for analysing metabolite biomarkers in epidemiology: methodological and statistical considerations for application to large numbers of biological samples.

    PubMed

    Lind, Mads V; Savolainen, Otto I; Ross, Alastair B

    2016-08-01

    Data quality is critical for epidemiology, and as scientific understanding expands, the range of data available for epidemiological studies and the types of tools used for measurement have also expanded. It is essential for the epidemiologist to have a grasp of the issues involved with different measurement tools. One tool that is increasingly being used for measuring biomarkers in epidemiological cohorts is mass spectrometry (MS), because of the high specificity and sensitivity of MS-based methods and the expanding range of biomarkers that can be measured. Further, the ability of MS to quantify many biomarkers simultaneously is advantageous compared with single-biomarker methods. However, as with all methods used to measure biomarkers, there are a number of pitfalls to consider which may have an impact on results when used in epidemiology. In this review we discuss the use of MS for biomarker analyses, focusing on metabolites and their application and potential issues related to large-scale epidemiology studies, the use of MS "omics" approaches for biomarker discovery and how MS-based results can be used for increasing biological knowledge gained from epidemiological studies. Better understanding of the possibilities and possible problems related to MS-based measurements will help the epidemiologist in their discussions with analytical chemists and lead to the use of the most appropriate statistical tools for these data. PMID:27230258

  5. Experimental and statistical analyses to characterize in-vehicle fine particulate matter behavior inside public transit buses operating on B20-grade biodiesel fuel

    NASA Astrophysics Data System (ADS)

    Vijayan, Abhilash; Kumar, Ashok

    2010-11-01

    This paper presents results from an in-vehicle air quality study of public transit buses in Toledo, Ohio, involving continuous monitoring, and experimental and statistical analyses to understand in-vehicle particulate matter (PM) behavior inside buses operating on B20-grade biodiesel fuel. The study also focused on evaluating the effects of vehicle's fuel type, operating periods, operation status, passenger counts, traffic conditions, and the seasonal and meteorological variation on particulates with aerodynamic diameter less than 1 micron (PM1.0). The study found that the average PM1.0 mass concentrations in B20-grade biodiesel-fueled bus compartments were approximately 15 μg m⁻³, while PM2.5 and PM10 concentration averages were approximately 19 μg m⁻³ and 37 μg m⁻³, respectively. It was also observed that average hourly concentration trends of PM1.0 and PM2.5 followed a "μ-shaped" pattern during transit hours. Experimental analyses revealed that the in-vehicle PM1.0 mass concentrations were higher inside diesel-fueled buses (10.0-71.0 μg m⁻³ with a mean of 31.8 μg m⁻³) as compared to biodiesel buses (3.3-33.5 μg m⁻³ with a mean of 15.3 μg m⁻³) when the windows were kept open. Vehicle idling conditions and open door status were found to facilitate smaller particle concentrations inside the cabin, while closed door facilitated larger particle concentrations suggesting that smaller particles were originating outside the vehicle and larger particles were formed within the cabin, potentially from passenger activity. The study also found that PM1.0 mass concentrations at the back of bus compartment (5.7-39.1 μg m⁻³ with a mean of 28.3 μg m⁻³) were higher than the concentrations in the front (5.7-25.9 μg m⁻³ with a mean of 21.9 μg m⁻³), and the mass concentrations inside the bus compartment were generally 30-70% lower than the just-outside concentrations. Further, bus route, window position, and time of day were found to affect the in

  6. Application of multivariate statistical analyses in the interpretation of geochemical behaviour of uranium in phosphatic rocks in the Red Sea, Nile Valley and Western Desert, Egypt.

    PubMed

    El-Arabi, Abd El-Gabar M; Khalifa, Ibrahim H

    2002-01-01

    Factor and cluster analyses as well as the Pearson correlation coefficient have been applied to geochemical data obtained from phosphorite and phosphatic rocks of the Duwi Formation exposed at the Red Sea coast, Nile Valley and Western Desert. Sixty-six samples from a total of 71 collected samples were analysed for SiO2, TiO2, Al2O3, Fe2O3, CaO, MgO, Na2O, K2O, P2O5, Sr, U and Pb by XRF, and their mineral constituents were determined by the use of XRD techniques. In addition, the natural radioactivity of the phosphatic samples due to their uranium, thorium and potassium contents was measured by gamma-spectrometry. The uranium content in the phosphate rocks with P2O5 > 15% (average of 106.6 ppm) is higher than in rocks with P2O5 < 15% (average of 35.5 ppm). Uranium distribution is essentially controlled by the variations of P2O5 and CaO, whereas it is not related to changes in SiO2, TiO2, Al2O3, Fe2O3, MgO, Na2O and K2O concentrations. Factor analysis and the Pearson correlation coefficient revealed that uranium behaves geochemically in different ways in the phosphatic sediments and phosphorites in the Red Sea, Nile Valley and Western Desert. In the Red Sea and Western Desert phosphorites, uranium occurs mainly in the oxidized U6+ state, where it seems to be fixed by the phosphate ion, forming secondary uranium phosphate minerals such as phosphuranylite. In the Nile Valley phosphorites, ionic substitution of Ca2+ by U4+ is the main controlling factor in the concentration of uranium in phosphate rocks. Moreover, fixation of U6+ by the phosphate ion and adsorption of uranium on phosphate minerals play subordinate roles. PMID:12066979

  7. Impact of enzalutamide on quality of life in men with metastatic castration-resistant prostate cancer after chemotherapy: additional analyses from the AFFIRM randomized clinical trial

    PubMed Central

    Cella, D.; Ivanescu, C.; Holmstrom, S.; Bui, C. N.; Spalding, J.; Fizazi, K.

    2015-01-01

    Background To present longitudinal changes in Functional Assessment of Cancer Therapy-Prostate (FACT-P) scores during 25-week treatment with enzalutamide or placebo in men with progressive metastatic castration-resistant prostate cancer (mCRPC) after chemotherapy in the AFFIRM trial. Patients and methods Patients were randomly assigned to enzalutamide 160 mg/day or placebo. FACT-P was completed before randomization, at weeks 13, 17, 21, and 25, and every 12 weeks thereafter while on study treatment. Longitudinal changes in FACT-P scores from baseline to 25 weeks were analyzed using a mixed effects model for repeated measures (MMRM), with a pattern mixture model (PMM) applied as secondary analysis to address non-ignorable missing data. Cumulative distribution function (CDF) plots were generated and different methodological approaches and models for handling missing data were applied. Due to the exploratory nature of the analyses, adjustments for multiple comparisons were not made. AFFIRM is registered with ClinicalTrials.gov, number NCT00974311. Results The intention-to-treat FACT-P population included 938 patients (enzalutamide, n = 674; placebo n = 264) with evaluable FACT-P assessments at baseline and ≥1 post-baseline assessment. After 25 weeks, the mean FACT-P total score decreased by 1.52 points with enzalutamide compared with 13.73 points with placebo (P < 0.001). In addition, significant treatment differences at week 25 favoring enzalutamide were evident for all FACT-P subscales and indices, whether analyzed by MMRM or PMM. CDF plots revealed differences favoring enzalutamide compared with placebo across the full range of possible response levels for FACT-P total and all disease- and symptom-specific subscales/indices. Conclusion In men with progressive mCRPC after docetaxel-based chemotherapy, enzalutamide is superior to placebo in health-related quality-of-life outcomes, regardless of analysis model or threshold selected for meaningful response. Clinical
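
    A full MMRM models an unstructured within-patient covariance; as a hedged, simplified stand-in (a random-intercept mixed model on simulated scores, not the AFFIRM data or the trial's analysis code), the repeated-measures modelling step can be sketched with statsmodels:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n, weeks = 200, [13, 17, 21, 25]
        df = pd.DataFrame({
            "subject": np.repeat(np.arange(n), len(weeks)),
            "week": np.tile(weeks, n),
            "arm": np.repeat(rng.integers(0, 2, n), len(weeks)),
        })
        df["score"] = 120 - 0.5 * df["week"] + 8 * df["arm"] + rng.normal(0, 10, len(df))

        fit = smf.mixedlm("score ~ arm * week", df, groups=df["subject"]).fit()
        print(fit.summary())    # arm terms approximate the treatment difference over time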

  8. Conducting ANOVA Trend Analyses Using Polynomial Contrasts.

    ERIC Educational Resources Information Center

    Laija, Wilda

    When analysis of variance (ANOVA) or linear regression is used, results may only indicate statistical significance. This statistical significance tells the researcher very little about the data being analyzed. Additional analyses need to be used to extract all the possible information obtained from a study. While a priori and post hoc comparisons…

  9. Multielement chemical and statistical analyses from a uranium hydrogeochemical and stream-sediment survey in and near the Elkhorn Mountains, Jefferson County, Montana; Part I, Surface water

    USGS Publications Warehouse

    Suits, V.J.; Wenrich, K.J.

    1982-01-01

    Fifty-two surface-water samples, collected from an area south of Helena, Jefferson County, were analyzed for 51 chemical species. Of these variables, 35 showed detectable variation over the area, and 29 were utilized in a correlation analysis. Two populations are distinguished in the collected samples and are especially evident in the plot of Ca versus U. Samples separated on the basis of U versus Ca proved to represent drainage areas of two differing lithologies. One group was from waters that drain the Boulder batholith, the other from those that drain the Elkhorn Mountains volcanic rocks. These two groups of samples, in general, proved to have parallel but different linear trends between U and other elements. Therefore, the two groups of samples were treated separately in the statistical analyses. Over the area that drains the Boulder batholith, U concentrations in water ranged from 0.37 to 13.0 μg/l, with a mean of 1.9 μg/l. The samples from streams draining volcanic areas ranged from 0.04 to 1.5 μg/l, with a mean of 0.42 μg/l. The highest U values (12 and 13 μg/l) occur along Badger Creek, Rawhide Creek, Little Buffalo Gulch, and an unnamed tributary to Clancy Creek. Conductivity, hardness, Ba, Ca, Cl, K, Mg, Na and Sr are significantly correlated with U at or better than the 95 percent confidence limit in both populations. For water draining the Boulder batholith, uranium correlates significantly with alkalinity, pH, bicarbonate, Li, Mo, NO2+NO3, PO4, SiO2, SO4, F, and inorganic carbon. These correlations are similar to those found in a previous study of water samples in north-central New Mexico (Wenrich-Verbeek, 1977b). Uranium in water from the volcanic terrane does not show correlations with any of the above constituents, but does correlate well with V. This relationship with V is absent within the Boulder batholith samples.

  10. Statistical Analyses Comparing Prismatic Magnetite Crystals in ALH84001 Carbonate Globules with those from the Terrestrial Magnetotactic Bacteria Strain MV-1

    NASA Technical Reports Server (NTRS)

    Thomas-Keprta, Kathie L.; Clemett, Simon J.; Bazylinski, Dennis A.; Kirschvink, Joseph L.; McKay, David S.; Wentworth, Susan J.; Vali, H.; Gibson, Everett K.

    2000-01-01

    Here we use rigorous mathematical modeling to compare ALH84001 prismatic magnetites with those produced by terrestrial magnetotactic bacteria, MV-1. We find that this subset of the Martian magnetites appears to be statistically indistinguishable from those of MV-1.

  11. Monitoring the quality consistency of Weibizhi tablets by micellar electrokinetic chromatography fingerprints combined with multivariate statistical analyses, the simple quantified ratio fingerprint method, and the fingerprint-efficacy relationship.

    PubMed

    Liu, Yingchun; Sun, Guoxiang; Wang, Yan; Yang, Lanping; Yang, Fangliang

    2015-06-01

    Micellar electrokinetic chromatography fingerprinting combined with quantification was successfully developed and applied to monitor the quality consistency of Weibizhi tablets, which is a classical compound preparation used to treat gastric ulcers. A background electrolyte composed of 57 mmol/L sodium borate, 21 mmol/L sodium dodecylsulfate and 100 mmol/L sodium hydroxide was used to separate compounds. To optimize capillary electrophoresis conditions, multivariate statistical analyses were applied. First, the most important factors influencing sample electrophoretic behavior were identified as the background electrolyte concentrations. Then, a Box-Behnken design response surface strategy using the resolution index RF as an integrated response was set up to correlate factors with response. RF reflects the effective signal amount, resolution, and signal homogenization in an electropherogram; thus, it was regarded as an excellent indicator. In fingerprint assessments, the simple quantified ratio fingerprint method was established for comprehensive quality discrimination of traditional Chinese medicines/herbal medicines from qualitative and quantitative perspectives, by which the quality of 27 samples from the same manufacturer was well differentiated. In addition, the fingerprint-efficacy relationship between fingerprints and antioxidant activities was established using partial least squares regression, which provided important medicinal efficacy information for quality control. The present study offered an efficient means for monitoring Weibizhi tablet quality consistency. PMID:25867134
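
    The fingerprint-efficacy step is a partial least squares regression of antioxidant activity on fingerprint peaks; a minimal sketch with random placeholder data (not the 27 Weibizhi batches or the authors' software) is:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        peaks = rng.random((27, 40))                      # batches x fingerprint peak areas
        activity = peaks[:, :5].sum(axis=1) + rng.normal(0, 0.1, 27)

        pls = PLSRegression(n_components=3).fit(peaks, activity)
        print("R^2 =", pls.score(peaks, activity))        # peak weights: pls.coef_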

  12. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  13. A FORTRAN 77 Program and User's Guide for the Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    SciTech Connect

    Helton, Jon C.; Shortencarier, Maichael J.

    1999-08-01

    A description and user's guide are given for a computer program, PATTRN, developed at Sandia National Laboratories for use in sensitivity analyses of complex models. This program is intended for use in the analysis of input-output relationships in Monte Carlo analyses when the input has been selected using random or Latin hypercube sampling. Procedures incorporated into the program are based upon attempts to detect increasingly complex patterns in scatterplots and involve the detection of linear relationships, monotonic relationships, trends in measures of central tendency, trends in measures of variability, and deviations from randomness. The program was designed to be easy to use and portable.

  14. Additive-dominance genetic model analyses for late-maturity alpha-amylase activity in a bread wheat factorial crossing population.

    PubMed

    Rasul, Golam; Glover, Karl D; Krishnan, Padmanaban G; Wu, Jixiang; Berzonsky, William A; Ibrahim, Amir M H

    2015-12-01

    Elevated levels of late-maturity α-amylase activity (LMAA) can result in low falling number scores, reduced grain quality, and a downgrade of wheat (Triticum aestivum L.) class. A mating population was developed by crossing parents with different levels of LMAA. The F2 and F3 hybrids and their parents were evaluated for LMAA, and data were analyzed using the R software package 'qgtools' integrated with an additive-dominance genetic model and a mixed linear model approach. Simulated results showed high testing powers for additive and additive × environment variances, and comparatively low powers for dominance and dominance × environment variances. All variance components and their proportions of the phenotypic variance for the parents and hybrids were significant except for the dominance × environment variance. The estimated narrow-sense heritability and broad-sense heritability for LMAA were 14 and 54%, respectively. Highly significant negative additive effects for parents suggest that the spring wheat cultivars 'Lancer' and 'Chester' can serve as good general combiners, and that 'Kinsman' and 'Seri-82' had negative specific combining ability in some hybrids despite their own significant positive additive effects, suggesting they can also be used as parents to reduce LMAA levels. Seri-82 showed a very good general combining ability effect when used as a male parent, indicating the importance of reciprocal effects. Highly significant negative dominance effects and high-parent heterosis for hybrids demonstrated that the specific hybrid combinations Chester × Kinsman, 'Lerma52' × Lancer, Lerma52 × 'LoSprout' and 'Janz' × Seri-82 could be generated to produce cultivars with significantly reduced LMAA levels. PMID:26403988
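
    Under the simplest partition of the phenotypic variance (ignoring the genotype-by-environment terms the paper also fits), the reported heritabilities follow directly from the variance components; the numbers below are chosen only to reproduce the 14% and 54% figures and are not the fitted estimates:

        # Illustrative variance components (not the estimates from 'qgtools').
        V_A, V_D, V_E = 14.0, 40.0, 46.0      # additive, dominance, residual
        V_P = V_A + V_D + V_E                 # phenotypic variance
        h2_narrow = V_A / V_P                 # narrow-sense heritability -> 0.14
        H2_broad = (V_A + V_D) / V_P          # broad-sense heritability  -> 0.54
        print(h2_narrow, H2_broad)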

  15. Suite versus composite statistics

    USGS Publications Warehouse

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

    Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution in which, in addition, each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
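
    A toy numeric sketch of one reading of the distinction (suite measures computed from the individual samples' moments, composite measures from the pooled observations; invented normal samples, not granulometric data): the means agree while the composite standard deviation picks up the extra between-sample spread.

        import numpy as np

        rng = np.random.default_rng(0)
        samples = [rng.normal(mu, 1.0, 500) for mu in (0.0, 0.5, 1.0)]  # three equal-size samples
        composite = np.concatenate(samples)

        suite_mean = np.mean([s.mean() for s in samples])
        suite_sd = np.mean([s.std(ddof=1) for s in samples])
        print(suite_mean, composite.mean())          # first moments essentially equal
        print(suite_sd, composite.std(ddof=1))       # composite SD is the larger of the two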

  16. Halogen-free ionic liquid as an additive in zinc(II)-selective electrode: surface analyses as correlated to the membrane activity.

    PubMed

    Al-Asousi, Maryam F; Shoukry, Adel F; Bu-Olayan, Abdul Hadi

    2012-05-30

    Two conventional Zn(II) polyvinyl chloride (PVC) membrane electrodes have been prepared and characterized. They were based on dibenzo-24-crown-8 (DBC) as a neutral carrier, dioctyl phthalate (DOP) as a plasticizer, and potassium tetrakis(p-chlorophenyl)borate (KTpClPB) or the halogen-free ionic liquid tetraoctylammonium dodecylbenzene sulfonate [TOA][DBS] as an additive. The use of the ionic liquid has been found to enhance the selectivity of the sensor. For each electrode, the surfaces of two membranes were investigated using X-ray photoelectron spectroscopy, ion-scattering spectroscopy and atomic force microscopy. One of the two membranes was conditioned by soaking it for 24 h in a 1.0×10⁻³ M Zn(NO3)2 solution and the second was soaked in bi-distilled water for the same interval (24 h). Comparing the two surfaces indicated the following: (a) the high selectivity in the case of using [TOA][DBS] as an additive is due to the extra mediation caused by the ionic liquid and (b) the working mechanism of the electrode is based on phase equilibrium at the surface of the membrane associated with ion transport through the bulk of the membrane. PMID:22608433

  17. Statistical Analyses of Satellite Cloud Object Data from CERES. Part II; Tropical Convective Cloud Objects During 1998 El Nino and Validation of the Fixed Anvil Temperature Hypothesis

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Wong, Takmeng; Wielicki, Bruce a.; Parker, Lindsay; Lin, Bing; Eitzen, Zachary A.; Branson, Mark

    2006-01-01

    Characteristics of tropical deep convective cloud objects observed over the tropical Pacific during January-August 1998 are examined using the Tropical Rainfall Measuring Mission/Clouds and the Earth's Radiant Energy System single scanner footprint (SSF) data. These characteristics include the frequencies of occurrence and statistical distributions of cloud physical properties. Their variations with cloud-object size, sea surface temperature (SST), and satellite precessing cycle are analyzed in detail. A cloud object is defined as a contiguous patch of the Earth composed of satellite footprints within a single dominant cloud-system type. It is found that statistical distributions of cloud physical properties are significantly different among three size categories of cloud objects with equivalent diameters of 100 - 150 km (small), 150 - 300 km (medium), and > 300 km (large), respectively, except for the distributions of ice particle size. The distributions for the larger-size category of cloud objects are more skewed towards high SSTs, high cloud tops, low cloud-top temperature, large ice water path, high cloud optical depth, low outgoing longwave (LW) radiation, and high albedo than the smaller-size category. As SST varied from one satellite precessing cycle to another, the changes in macrophysical properties of cloud objects over the entire tropical Pacific were small for the large-size category of cloud objects, relative to those of the small- and medium-size categories. This result suggests that the fixed anvil temperature hypothesis of Hartmann and Larson may be valid for the large-size category. Combined with the result that a higher percentage of the large-size category of cloud objects occurs during higher SST subperiods, this implies that macrophysical properties of cloud objects would be less sensitive to further warming of the climate. On the other hand, when cloud objects are classified according to SSTs where large-scale dynamics plays important roles

  18. Estimability and simple dynamical analyses of range (range-rate range-difference) observations to artificial satellites. [laser range observations to LAGEOS using non-Bayesian statistics

    NASA Technical Reports Server (NTRS)

    Vangelder, B. H. W.

    1978-01-01

    Non-Bayesian statistics were used in simulation studies centered around laser range observations to LAGEOS. The capabilities of satellite laser ranging, especially in connection with relative station positioning, are evaluated. The satellite measurement system under investigation may fall short in precise determinations of the earth's orientation (precession and nutation) and the earth's rotation, as opposed to systems such as very long baseline interferometry (VLBI) and lunar laser ranging (LLR). Relative station positioning, determination of (differential) polar motion, positioning of stations with respect to the earth's center of mass and determination of the earth's gravity field should be easily realized by satellite laser ranging (SLR). The last two features should be considered as best (or solely) determinable by SLR in contrast to VLBI and LLR.

  19. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    PubMed

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculation of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied to brain, head & neck and prostate cancer patients, who received primary and boost phases, in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. PMID:27265044
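
    The per-voxel arithmetic behind such calculations rests on the standard linear-quadratic BED, BED = n·d·(1 + d/(α/β)). A minimal sketch for a two-phase (primary plus boost) plan with toy dose grids follows; the simple phase-by-phase sum shown here is only one possible formula (the abstract distinguishes true and approximate forms), and the doses and α/β value are assumptions:

        import numpy as np

        alpha_beta = 3.0                                  # assumed alpha/beta ratio (Gy)
        d_primary = np.full((4, 4), 2.0)                  # dose per fraction (Gy), 30 fractions
        d_boost = np.full((4, 4), 3.0)                    # dose per fraction (Gy), 5 fractions

        bed = (30 * d_primary * (1 + d_primary / alpha_beta)
               + 5 * d_boost * (1 + d_boost / alpha_beta))
        print(bed[0, 0])                                  # per-voxel BED in Gy for this toy plan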

  20. Rapid Detection and Statistical Differentiation of KPC Gene Variants in Gram-Negative Pathogens by Use of High-Resolution Melting and ScreenClust Analyses

    PubMed Central

    Roth, Amanda L.

    2013-01-01

    In the United States, the production of the Klebsiella pneumoniae carbapenemase (KPC) is an important mechanism of carbapenem resistance in Gram-negative pathogens. Infections with KPC-producing organisms are associated with increased morbidity and mortality; therefore, the rapid detection of KPC-producing pathogens is critical in patient care and infection control. We developed a real-time PCR assay complemented with traditional high-resolution melting (HRM) analysis, as well as statistically based genotyping, using the Rotor-Gene ScreenClust HRM software to both detect the presence of blaKPC and differentiate between KPC-2-like and KPC-3-like alleles. A total of 166 clinical isolates of Enterobacteriaceae, Pseudomonas aeruginosa, and Acinetobacter baumannii with various β-lactamase susceptibility patterns were tested in the validation of this assay; 66 of these organisms were known to produce the KPC β-lactamase. The real-time PCR assay was able to detect the presence of blaKPC in all 66 of these clinical isolates (100% sensitivity and specificity). HRM analysis demonstrated that 26 had KPC-2-like melting peak temperatures, while 40 had KPC-3-like melting peak temperatures. Sequencing of 21 amplified products confirmed the melting peak results, with 9 isolates carrying blaKPC-2 and 12 isolates carrying blaKPC-3. This PCR/HRM assay can identify KPC-producing Gram-negative pathogens in as little as 3 h after isolation of pure colonies and does not require post-PCR sample manipulation for HRM analysis, and ScreenClust analysis easily distinguishes blaKPC-2-like and blaKPC-3-like alleles. Therefore, this assay is a rapid method to identify the presence of blaKPC enzymes in Gram-negative pathogens that can be easily integrated into busy clinical microbiology laboratories. PMID:23077125

  1. The Enigma of the Dichotomic Pressure Response of GluN1-4a/b Splice Variants of NMDA Receptor: Experimental and Statistical Analyses

    PubMed Central

    Bliznyuk, Alice; Gradwohl, Gideon; Hollmann, Michael; Grossman, Yoram

    2016-01-01

    Professional deep-water divers, exposed to hyperbaric pressure (HP) above 1.1 MPa, develop High Pressure Neurological Syndrome (HPNS), which is associated with central nervous system (CNS) hyperexcitability. It was previously reported that HP augments N-methyl-D-aspartate receptor (NMDAR) synaptic response, increases neuronal excitability and potentially causes irreversible neuronal damage. Our laboratory has reported differential current responses under HP conditions in NMDAR subtypes that contain either GluN1-1a or GluN1-1b splice variants co-expressed in Xenopus laevis oocytes with all four GluN2 subunits. Recently, we reported that the increase in ionic currents measured under HP conditions is also dependent on which of the eight splice variants of GluN1 is co-expressed with the GluN2 subunit. We now report that the NMDAR subtype that contains GluN1-4a/b splice variants exhibited “dichotomic” (either increased or decreased) responses at HP. The distribution of the results is not normal; thus, an analysis of variance (ANOVA) test and clustering analysis were employed for statistical verification of the grouping. Furthermore, the calculated constants of the alpha function distribution analysis for the two groups were similar, suggesting that the mechanism underlying the switch between an increase or a decrease of the current at HP is a single process, the nature of which is still unknown. This dichotomic response of the GluN1-4a/b splice variant may be used as a model for studying reduced response in NMDAR at HP. Successful reversal of other NMDAR subtypes' responses (i.e., current reduction) may allow the elimination of the reversible malfunctioning short-term effects (HPNS), or even deleterious long-term effects induced by increased NMDAR function during HP exposure. PMID:27375428

  3. The taxonicity of schizotypy: does the same taxonic class structure emerge from analyses of different attributes of schizotypy and from fundamentally different statistical methods?

    PubMed

    Linscott, Richard J

    2013-12-15

    Findings on the population distribution of schizotypy consistently point toward an underlying class structure. However, past research is methodologically homogeneous, chiefly involving analysis of attribute-specific indicators and coherent cut kinetic methods such as maximum covariance (MAXCOV) analysis. Two questions are examined. Are different or overlapping classes identified from different attributes of the schizophrenia phenotype? Do fundamentally different approaches to analysis yield consistent results? Participants (n=1074) completed the Schizotypal Personality Questionnaire (SPQ). Following item screening, MAXCOV analyses were conducted iteratively on attribute-specific item sets (cognitive-perceptual, interpersonal, and disorganized) and a general item set. Latent variable modeling (factor analysis, latent class analysis, and factor-mixture modeling) was used to examine the consistency of the MAXCOV results using items retained in the general set following MAXCOV analysis. Attribute-specific and general item sets gave taxonic MAXCOV curves and base rates of 8.4-10.4% and 3.6%, respectively. Classes were not independent. No latent variable model emerged as uniquely superior but five models distinguished a small high-scoring class populated by members of the MAXCOV general class. Different attributes distinguished overlapping yet nonredundant taxa, and a general schizotypy taxon identified with MAXCOV was also identified in latent variable modeling. PMID:23932839
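    The MAXCOV idea referred to above can be sketched in a few lines: order cases on one indicator and track the covariance of two other indicators across successive windows, with a taxonic latent structure producing a peaked curve. The simulated indicators, base rate, and window count below are illustrative assumptions, not the SPQ data or the full coherent cut kinetics procedure.

```python
# Minimal MAXCOV-style sketch (simulated data): covariance of two indicator scores
# computed within successive ordered windows of a third "input" indicator.
import numpy as np

rng = np.random.default_rng(1)
n, base_rate = 2000, 0.10
taxon = rng.random(n) < base_rate          # latent class membership
shift = 1.5
x = rng.normal(0, 1, n) + shift * taxon    # input indicator
y = rng.normal(0, 1, n) + shift * taxon    # output indicator 1
z = rng.normal(0, 1, n) + shift * taxon    # output indicator 2

order = np.argsort(x)
windows = np.array_split(order, 10)        # 10 ordered windows along x
curve = [np.cov(y[w], z[w])[0, 1] for w in windows]
print("MAXCOV curve (cov(y, z) per window):")
print(np.round(curve, 3))                  # a taxonic structure gives a peaked curve
```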

  4. An Automated System of Knickpoint Definition and Extraction from a Digital Elevation Model (DEM): Implications for Efficient Large-Scale Mapping and Statistical Analyses of Knickpoint Distributions in Fluvial Networks

    NASA Astrophysics Data System (ADS)

    Neely, A. B.; Bookhagen, B.; Burbank, D. W.

    2014-12-01

    Knickpoints, or convexities in a stream's longitudinal profile, often delineate boundaries in stream networks that separate reaches eroding at different rates resulting from sharp temporal or spatial changes in uplift rate, contributing drainage area, precipitation, or bedrock lithology. We explicitly defined the geometry of a knickpoint in a manner that can be identified using an algorithm which operates in accordance with the stream power incision model, using a chi-plot analysis approach. This method allows for comparison between the real stream profile extracted from a DEM, and a linear best-fit line profile in chi-elevation space, representing a steady state theoretical stream functioning in accordance to uniform temporal and spatial conditions listed above. Assessing where the stream of interest is "under-steepened" and "over-steepened" with respect to a theoretical linear profile reveals knickpoints as certain points of slope inflection, extractable by our algorithm. We tested our algorithm on a 1m resolution LiDAR DEM of Santa Cruz Island (SCI), a tectonically active island 25km south of Santa Barbara, CA with an estimated uplift rate between 0.5 and 1.2mm/yr calculated from uplifted paleoshorelines. We have identified 1025 knickpoints using our algorithm and compared the position of these knickpoints to a similarly-sized dataset of knickpoints manually selected from distance-elevation longitudinal stream profiles for the same region. Our algorithm reduced mapping time by 99.3% and agreed with knickpoint positions from the manually selected knickpoint map for 85% of the 1025 knickpoints. Discrepancies can arise from inconsistencies in manual knickpoint selection that are not present in an automated computation. Additionally, the algorithm measures useful characteristics for each knickpoint allowing for quick statistical analyses. Histograms of knickpoint elevation and chi coordinate have a 3 peaked distribution, possibly expressing 3 levels of uplifted
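    A minimal sketch of the chi-plot logic described above is given below, using an entirely synthetic channel: the chi coordinate is integrated upstream, a linear reference profile is fitted in chi-elevation space, and sign changes of the residual flag candidate knickpoints. The concavity theta, reference area A0, and profile geometry are assumptions; the actual algorithm applied to the Santa Cruz Island DEM is certainly more elaborate.

```python
# Illustrative sketch (not the authors' algorithm): flag candidate knickpoints as
# inflections where a stream profile crosses its best-fit line in chi-elevation space.
import numpy as np

def chi_coordinate(area, ds, theta=0.45, A0=1.0):
    """Integrate (A0/A)**theta upstream along the channel (chi-plot transform)."""
    return np.cumsum((A0 / area) ** theta * ds)

# Hypothetical channel: distance steps, drainage area decreasing upstream, elevation.
ds = np.full(200, 50.0)                          # 50 m steps
area = np.linspace(5e6, 1e5, 200)                # m^2, decreasing upstream
chi = chi_coordinate(area, ds)
elev = 2.0 * chi + 15 * np.sin(chi / chi.max() * 6 * np.pi)   # wavy synthetic profile

# Residuals from the linear (steady-state) reference profile in chi-elevation space.
slope, intercept = np.polyfit(chi, elev, 1)
resid = elev - (slope * chi + intercept)

# Sign changes of the residual mark where the profile switches between
# "over-steepened" and "under-steepened" -> candidate knickpoints.
knick_idx = np.where(np.diff(np.sign(resid)) != 0)[0]
print(f"{knick_idx.size} candidate knickpoints at chi =", np.round(chi[knick_idx], 1))
```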

  5. Cosmic statistics of statistics

    NASA Astrophysics Data System (ADS)

    Szapudi, István; Colombi, Stéphane; Bernardeau, Francis

    1999-12-01

    The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. The principal results concerning the cumulants ξ, Q3 and Q4 is that

  6. Analysing the Temperature Effect on the Competitiveness of the Amine Addition versus the Amidation Reaction in the Epoxidized Oil/Amine System by MCR-ALS of FTIR Data

    PubMed Central

    del Río, Vanessa; Callao, M. Pilar; Larrechi, M. Soledad

    2011-01-01

    The evaluation of the temperature effect on the competitiveness between the amine addition and the amidation reaction in a model cure acid-catalysed reaction between the epoxidized methyl oleate (EMO), obtained from high oleic sunflower oil, and aniline is reported. The study was carried out analysing the kinetic profiles of the chemical species involved in the system, which were obtained applying multivariate curve resolution-alternating least squares (MCR-ALS) to the Fourier transform infrared spectra data obtained from the reaction monitoring at two different temperatures (60°C and 30°C). At both experimental temperatures, two mechanisms were postulated: non-autocatalytic and autocatalytic. The different behaviour was discussed considering not only the influence of the temperature on the amidation reaction kinetic, but also the presence of the homopolymerization of the EMO reagent. PMID:21765830
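    A bare-bones MCR-ALS loop of the kind referred to above can be sketched as alternating non-negative least-squares updates of concentration profiles and pure spectra. The simulated two-component kinetics and band shapes below are placeholders, not the EMO/aniline FTIR data, and published MCR-ALS implementations add constraints (closure, unimodality, initial estimates from purest variables) that are omitted here.

```python
# Bare-bones MCR-ALS sketch (simulated spectra): resolve concentration profiles C and
# pure spectra S from a data matrix D ~ C @ S.T, enforcing non-negativity by clipping.
import numpy as np

def mcr_als(D, C0, n_iter=100):
    C = C0.copy()
    for _ in range(n_iter):
        S = np.linalg.lstsq(C, D, rcond=None)[0].T    # spectra given concentrations
        S = np.clip(S, 0, None)
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T  # concentrations given spectra
        C = np.clip(C, 0, None)
    return C, S

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 60)
C_true = np.column_stack([np.exp(-3 * t), 1 - np.exp(-3 * t)])      # A -> B kinetics
wn = np.linspace(0, 1, 200)
S_true = np.column_stack([np.exp(-((wn - 0.3) / 0.05) ** 2),
                          np.exp(-((wn - 0.7) / 0.05) ** 2)])       # two synthetic bands
D = C_true @ S_true.T + rng.normal(0, 0.01, (60, 200))

C_hat, S_hat = mcr_als(D, C0=rng.random((60, 2)))
print("reconstruction error:", np.linalg.norm(D - C_hat @ S_hat.T))
```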

  7. Three-dimensional geological modelling and multivariate statistical analysis of water chemistry data to analyse and visualise aquifer structure and groundwater composition in the Wairau Plain, Marlborough District, New Zealand

    NASA Astrophysics Data System (ADS)

    Raiber, Matthias; White, Paul A.; Daughney, Christopher J.; Tschritter, Constanze; Davidson, Peter; Bainbridge, Sophie E.

    2012-05-01

    Concerns regarding groundwater contamination with nitrate and the long-term sustainability of groundwater resources have prompted the development of a multi-layered three-dimensional (3D) geological model to characterise the aquifer geometry of the Wairau Plain, Marlborough District, New Zealand. The 3D geological model which consists of eight litho-stratigraphic units has been subsequently used to synthesise hydrogeological and hydrogeochemical data for different aquifers in an approach that aims to demonstrate how integration of water chemistry data within the physical framework of a 3D geological model can help to better understand and conceptualise groundwater systems in complex geological settings. Multivariate statistical techniques (e.g. Principal Component Analysis and Hierarchical Cluster Analysis) were applied to groundwater chemistry data to identify hydrochemical facies which are characteristic of distinct evolutionary pathways and a common hydrologic history of groundwaters. Principal Component Analysis on hydrochemical data demonstrated that natural water-rock interactions, redox potential and human agricultural impact are the key controls of groundwater quality in the Wairau Plain. Hierarchical Cluster Analysis revealed distinct hydrochemical water quality groups in the Wairau Plain groundwater system. Visualisation of the results of the multivariate statistical analyses and distribution of groundwater nitrate concentrations in the context of aquifer lithology highlighted the link between groundwater chemistry and the lithology of host aquifers. The methodology followed in this study can be applied in a variety of hydrogeological settings to synthesise geological, hydrogeological and hydrochemical data and present them in a format readily understood by a wide range of stakeholders. This enables a more efficient communication of the results of scientific studies to the wider community.
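    The multivariate workflow mentioned above (standardise, Principal Component Analysis, Ward-linkage hierarchical clustering) can be illustrated on synthetic major-ion data as follows; the ion list, concentrations, and the two assumed facies are invented for the example and are not the Wairau Plain measurements.

```python
# Schematic example (synthetic data, not the Wairau Plain dataset): PCA and Ward
# hierarchical clustering of standardised major-ion groundwater chemistry.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
ions = ["Na", "Ca", "Mg", "Cl", "HCO3", "SO4", "NO3"]   # assumed analytes (mg/L)
# Two hypothetical hydrochemical facies: dilute recharge water vs. evolved/impacted water.
X = np.vstack([rng.normal([10, 30, 5, 15, 120, 10, 1], 3, (40, 7)),
               rng.normal([40, 60, 15, 60, 200, 40, 12], 5, (40, 7))])

Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Xs)
print("variables:", ions)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))

Z = linkage(Xs, method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")
print("samples per cluster:", np.bincount(groups)[1:])
```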

  8. In situ sulfur isotopes (δ(34)S and δ(33)S) analyses in sulfides and elemental sulfur using high sensitivity cones combined with the addition of nitrogen by laser ablation MC-ICP-MS.

    PubMed

    Fu, Jiali; Hu, Zhaochu; Zhang, Wen; Yang, Lu; Liu, Yongsheng; Li, Ming; Zong, Keqing; Gao, Shan; Hu, Shenghong

    2016-03-10

    The sulfur isotope is an important geochemical tracer in diverse fields of geosciences. In this study, the effects of three different cone combinations with the addition of N2 on the performance of in situ S isotope analyses were investigated in detail. The signal intensities of S isotopes were improved by a factor of 2.3 and 3.6 using the X skimmer cone combined with the standard sample cone or the Jet sample cone, respectively, compared with the standard arrangement (H skimmer cone combined with the standard sample cone). This signal enhancement is important for the improvement of the precision and accuracy of in situ S isotope analysis at high spatial resolution. Different cone combinations have a significant effect on the mass bias and mass bias stability for S isotopes. Poor precisions of S isotope ratios were obtained using the Jet and X cones combination at their corresponding optimum makeup gas flow when using Ar plasma only. The addition of 4-8 ml min(-1) nitrogen to the central gas flow in laser ablation MC-ICP-MS was found to significantly enlarge the mass bias stability zone at their corresponding optimum makeup gas flow in these three different cone combinations. The polyatomic interferences of OO, SH, OOH were also significantly reduced, and the interference free plateaus of sulfur isotopes became broader and flatter in the nitrogen mode (N2 = 4 ml min(-1)). However, the signal intensity of S was not increased by the addition of nitrogen in this study. The laser fluence and ablation mode had significant effects on sulfur isotope fractionation during the analysis of sulfides and elemental sulfur by laser ablation MC-ICP-MS. The matrix effect among different sulfides and elemental sulfur was observed, but could be significantly reduced by line scan ablation in preference to single spot ablation under the optimized fluence. It is recommended that the d90 values of the particles in pressed powder pellets for accurate and precise S isotope analysis

  9. Addition of docetaxel or bisphosphonates to standard of care in men with localised or metastatic, hormone-sensitive prostate cancer: a systematic review and meta-analyses of aggregate data

    PubMed Central

    Vale, Claire L; Burdett, Sarah; Rydzewska, Larysa H M; Albiges, Laurence; Clarke, Noel W; Fisher, David; Fizazi, Karim; Gravis, Gwenaelle; James, Nicholas D; Mason, Malcolm D; Parmar, Mahesh K B; Sweeney, Christopher J; Sydes, Matthew R; Tombal, Bertrand; Tierney, Jayne F

    2016-01-01

    docetaxel for men with locally advanced disease (M0). Survival results from three (GETUG-12, RTOG 0521, STAMPEDE) of these trials (2121 [53%] of 3978 men) showed no evidence of a benefit from the addition of docetaxel (HR 0·87 [95% CI 0·69–1·09]; p=0·218), whereas failure-free survival data from four (GETUG-12, RTOG 0521, STAMPEDE, TAX 3501) of these trials (2348 [59%] of 3978 men) showed that docetaxel improved failure-free survival (0·70 [0·61–0·81]; p<0·0001), which translates into a reduced absolute 4-year failure rate of 8% (5–10). We identified seven eligible randomised controlled trials of bisphosphonates for men with M1 disease. Survival results from three of these trials (2740 [88%] of 3109 men) showed that addition of bisphosphonates improved survival (0·88 [0·79–0·98]; p=0·025), which translates to 5% (1–8) absolute improvement, but this result was influenced by the positive result of one trial of sodium clodronate, and we found no evidence of a benefit from the addition of zoledronic acid (0·94 [0·83–1·07]; p=0·323), which translates to an absolute improvement in survival of 2% (−3 to 7). Of 17 trials of bisphosphonates for men with M0 disease, survival results from four trials (4079 [66%] of 6220 men) showed no evidence of benefit from the addition of bisphosphonates (1·03 [0·89–1·18]; p=0·724) or zoledronic acid (0·98 [0·82–1·16]; p=0·782). Failure-free survival definitions were too inconsistent for formal meta-analyses for the bisphosphonate trials. Interpretation The addition of docetaxel to standard of care should be considered standard care for men with M1 hormone-sensitive prostate cancer who are starting treatment for the first time. More evidence on the effects of docetaxel on survival is needed in the M0 disease setting. No evidence exists to suggest that zoledronic acid improves survival in men with
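    For readers unfamiliar with how such pooled hazard ratios are obtained, a toy fixed-effect (inverse-variance) meta-analysis on the log scale is sketched below; the per-trial hazard ratios and confidence limits are made up and are not the GETUG-12, RTOG 0521, or STAMPEDE results.

```python
# Toy inverse-variance (fixed-effect) meta-analysis of hazard ratios; the numbers
# below are invented for illustration and are NOT the trial results in the abstract.
import numpy as np
from scipy import stats

hr = np.array([0.80, 0.92, 0.88])                 # hypothetical per-trial hazard ratios
ci_upper = np.array([1.05, 1.10, 1.02])           # hypothetical upper 95% CI limits

log_hr = np.log(hr)
se = (np.log(ci_upper) - log_hr) / 1.96           # back out standard errors from the CIs
w = 1 / se**2                                     # inverse-variance weights
pooled = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
z = pooled / pooled_se
p = 2 * stats.norm.sf(abs(z))

lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
print(f"pooled HR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), p = {p:.3f}")
```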

  10. Descriptive statistics.

    PubMed

    Shi, Runhua; McLarty, Jerry W

    2009-10-01

    In this article, we introduced basic concepts of statistics, types of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as Poisson distributions for rare events and exponential distributions, F distributions, and logistic distributions. More information can be found in many statistics books and publications. PMID:19891281
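    A short, hedged example of the descriptive statistics discussed in this primer, computed on made-up values with one deliberate outlier to contrast the mean and the median:

```python
# Quick illustration of common descriptive statistics on invented values.
import numpy as np
from scipy import stats

x = np.array([3.1, 4.7, 5.0, 5.2, 5.9, 6.3, 6.4, 7.8, 9.5, 21.0])  # note one outlier
print("n      :", x.size)
print("mean   :", x.mean())
print("median :", np.median(x))             # robust to the outlier, unlike the mean
print("SD     :", x.std(ddof=1))            # sample standard deviation
print("IQR    :", np.percentile(x, 75) - np.percentile(x, 25))
print("skew   :", stats.skew(x))
```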

  11. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.

  12. Statistical Software.

    ERIC Educational Resources Information Center

    Callamaras, Peter

    1983-01-01

    This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)

  13. Bayesian Statistics.

    ERIC Educational Resources Information Center

    Meyer, Donald L.

    Bayesian statistical methodology and its possible uses in the behavioral sciences are discussed in relation to the solution of problems in both the use and teaching of fundamental statistical methods, including confidence intervals, significance tests, and sampling. The Bayesian model explains these statistical methods and offers a consistent…

  14. Titanic: A Statistical Exploration.

    ERIC Educational Resources Information Center

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
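    The classroom activity described above boils down to a chi-square test of independence on a two-way table; the sketch below uses hypothetical survival counts (not the actual Titanic figures) purely to show the mechanics.

```python
# Chi-square test of independence on a 2x2 table; counts are hypothetical,
# invented only to illustrate the calculation, not the real Titanic data.
import numpy as np
from scipy.stats import chi2_contingency

#                 survived   died
table = np.array([[ 339,      161],     # hypothetical counts, group A
                  [ 161,     1040]])    # hypothetical counts, group B

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
print("expected counts under independence:\n", np.round(expected, 1))
```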

  15. Illustrating the practice of statistics

    SciTech Connect

    Hamada, Christina A; Hamada, Michael S

    2009-01-01

    The practice of statistics involves analyzing data and planning data collection schemes to answer scientific questions. Issues often arise with the data that must be dealt with and can lead to new procedures. In analyzing data, these issues can sometimes be addressed through the statistical models that are developed. Simulation can also be helpful in evaluating a new procedure. Moreover, simulation coupled with optimization can be used to plan a data collection scheme. The practice of statistics as just described is much more than just using a statistical package. In analyzing the data, it involves understanding the scientific problem and incorporating the scientist's knowledge. In modeling the data, it involves understanding how the data were collected and accounting for limitations of the data where possible. Moreover, the modeling is likely to be iterative by considering a series of models and evaluating the fit of these models. Designing a data collection scheme involves understanding the scientist's goal and staying within his/her budget in terms of time and the available resources. Consequently, a practicing statistician is faced with such tasks and requires skills and tools to do them quickly. We have written this article for students to provide a glimpse of the practice of statistics. To illustrate the practice of statistics, we consider a problem motivated by some precipitation data that our relative, Masaru Hamada, collected some years ago. We describe his rain gauge observational study in Section 2. We describe modeling and an initial analysis of the precipitation data in Section 3. In Section 4, we consider alternative analyses that address potential issues with the precipitation data. In Section 5, we consider the impact of incorporating additional information. We design a data collection scheme to illustrate the use of simulation and optimization in Section 6. We conclude this article in Section 7 with a discussion.

  16. Sociopolitical Analyses.

    ERIC Educational Resources Information Center

    Van Galen, Jane, Ed.; And Others

    1992-01-01

    This theme issue of the serial "Educational Foundations" contains four articles devoted to the topic of "Sociopolitical Analyses." In "An Interview with Peter L. McLaren," Mary Leach presented the views of Peter L. McLaren on topics of local and national discourses, values, and the politics of difference. Landon E. Beyer's "Educational Studies and…

  17. Information geometry of Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Matsuzoe, Hiroshi

    2015-01-01

    A survey of the geometry of Bayesian statistics is given. From the viewpoint of differential geometry, a prior distribution in Bayesian statistics is regarded as a volume element on a statistical model. In this paper, properties of Bayesian estimators are studied by applying equiaffine structures of statistical manifolds. In addition, the geometry of anomalous statistics is also studied. Deformed expectations and deformed independences are important in anomalous statistics. After summarizing the geometry of such deformed structures, a generalization of the maximum likelihood method is given. A suitable weight on a parameter space is important in Bayesian statistics, whereas a suitable weight on a sample space is important in anomalous statistics.

  18. Statistics: A Brief Overview

    PubMed Central

    Winters, Ryan; Winters, Andrew; Amedee, Ronald G.

    2010-01-01

    The Accreditation Council for Graduate Medical Education sets forth a number of required educational topics that must be addressed in residency and fellowship programs. We sought to provide a primer on some of the important basic statistical concepts to consider when examining the medical literature. It is not essential to understand the exact workings and methodology of every statistical test encountered, but it is necessary to understand selected concepts such as parametric and nonparametric tests, correlation, and numerical versus categorical data. This working knowledge will allow you to spot obvious irregularities in statistical analyses that you encounter. PMID:21603381

  19. Food additives

    MedlinePlus

    Food additives are substances that become part of a food product when they are added during the processing or making of that food. "Direct" food additives are often added during processing to: Add nutrients ...

  20. Statistical databases

    SciTech Connect

    Kogalovskii, M.R.

    1995-03-01

    This paper presents a review of problems related to statistical database systems, which are wide-spread in various fields of activity. Statistical databases (SDB) are referred to as databases that consist of data and are used for statistical analysis. Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored data compression techniques, and statistical data representation means. Also examined is whether the present Database Management Systems (DBMS) satisfy the SDB requirements. Some actual research directions in SDB systems are considered.

  1. Morbidity statistics

    PubMed Central

    Smith, Alwyn

    1969-01-01

    This paper is based on an analysis of questionnaires sent to the health ministries of Member States of WHO asking for information about the extent, nature, and scope of morbidity statistical information. It is clear that most countries collect some statistics of morbidity and many countries collect extensive data. However, few countries relate their collection to the needs of health administrators for information, and many countries collect statistics principally for publication in annual volumes which may appear anything up to 3 years after the year to which they refer. The desiderata of morbidity statistics may be summarized as reliability, representativeness, and relevance to current health problems. PMID:5306722

  2. Environmental restoration and statistics: Issues and needs

    SciTech Connect

    Gilbert, R.O.

    1991-10-01

    Statisticians have a vital role to play in environmental restoration (ER) activities. One facet of that role is to point out where additional work is needed to develop statistical sampling plans and data analyses that meet the needs of ER. This paper is an attempt to show where statistics fits into the ER process. The statistician, as a member of the ER planning team, works collaboratively with the team to develop the site characterization sampling design, so that data of the quality and quantity required by the specified data quality objectives (DQOs) are obtained. At the same time, the statistician works with the rest of the planning team to design and implement, when appropriate, the observational approach to streamline the ER process and reduce costs. The statistician will also provide the expertise needed to select or develop appropriate tools for statistical analysis that are suited for problems that are common to waste-site data. These data problems include highly heterogeneous waste forms, large variability in concentrations over space, correlated data, data that do not have a normal (Gaussian) distribution, and measurements below detection limits. Other problems include environmental transport and risk models that yield highly uncertain predictions, and the need to effectively communicate to the public highly technical information, such as sampling plans, site characterization data, statistical analysis results, and risk estimates. Even though some statistical analysis methods are available "off the shelf" for use in ER, these problems require the development of additional statistical tools, as discussed in this paper. 29 refs.

  3. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  4. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  5. Statistics for Learning Genetics

    ERIC Educational Resources Information Center

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in,…

  6. Lehrer in der Bundesrepublik Deutschland. Eine Kritische Analyse Statistischer Daten uber das Lehrpersonal an Allgemeinbildenden Schulen. (Education in the Federal Republic of Germany. A Statistical Study of Teachers in Schools of General Education.)

    ERIC Educational Resources Information Center

    Kohler, Helmut

    The purpose of this study was to analyze the available statistics concerning teachers in schools of general education in the Federal Republic of Germany. An analysis of the demographic structure of the pool of full-time teachers showed that in 1971 30 percent of the teachers were under age 30, and 50 percent were under age 35. It was expected that…

  7. Football goal distributions and extremal statistics

    NASA Astrophysics Data System (ADS)

    Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.

    2002-12-01

    We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
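    The distribution-fitting comparison described above can be sketched as follows on simulated scores; the negative binomial parameters and the method-of-moments fit are illustrative assumptions, not the procedure used on the 169-country dataset.

```python
# Sketch (simulated scores, not the football dataset): fitting Poisson and negative
# binomial distributions to per-match goal counts and comparing their log-likelihoods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
goals = rng.negative_binomial(n=3, p=3 / (3 + 1.4), size=2000)   # mean ~1.4 goals

# Poisson fit: the MLE of the rate is simply the sample mean.
lam = goals.mean()
ll_pois = stats.poisson.logpmf(goals, lam).sum()

# Negative binomial fit: crude method-of-moments estimates of (n, p).
m, v = goals.mean(), goals.var()
p_hat = m / v
n_hat = m * p_hat / (1 - p_hat)
ll_nb = stats.nbinom.logpmf(goals, n_hat, p_hat).sum()

print(f"log-likelihood: Poisson {ll_pois:.1f}  vs  negative binomial {ll_nb:.1f}")
```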

  8. SEER Statistics

    Cancer.gov

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  9. Cancer Statistics

    MedlinePlus

    ... cancer statistics across the world. U.S. Cancer Mortality Trends The best indicator of progress against cancer is ... the number of cancer survivors has increased. These trends show that progress is being made against the ...

  10. Statistical Physics

    NASA Astrophysics Data System (ADS)

    Hermann, Claudine

    Statistical Physics bridges the properties of a macroscopic system and the microscopic behavior of its constituting particles, otherwise impossible due to the giant magnitude of Avogadro's number. Numerous systems of today's key technologies - such as semiconductors or lasers - are macroscopic quantum objects; only statistical physics allows for understanding their fundamentals. Therefore, this graduate text also focuses on particular applications such as the properties of electrons in solids with applications, and radiation thermodynamics and the greenhouse effect.

  11. Food additives.

    PubMed

    Berglund, F

    1978-01-01

    The use of additives to food fulfils many purposes, as shown by the index issued by the Codex Committee on Food Additives: Acids, bases and salts; Preservatives, Antioxidants and antioxidant synergists; Anticaking agents; Colours; Emulfifiers; Thickening agents; Flour-treatment agents; Extraction solvents; Carrier solvents; Flavours (synthetic); Flavour enhancers; Non-nutritive sweeteners; Processing aids; Enzyme preparations. Many additives occur naturally in foods, but this does not exclude toxicity at higher levels. Some food additives are nutrients, or even essential nutritents, e.g. NaCl. Examples are known of food additives causing toxicity in man even when used according to regulations, e.g. cobalt in beer. In other instances, poisoning has been due to carry-over, e.g. by nitrate in cheese whey - when used for artificial feed for infants. Poisonings also occur as the result of the permitted substance being added at too high levels, by accident or carelessness, e.g. nitrite in fish. Finally, there are examples of hypersensitivity to food additives, e.g. to tartrazine and other food colours. The toxicological evaluation, based on animal feeding studies, may be complicated by impurities, e.g. orthotoluene-sulfonamide in saccharin; by transformation or disappearance of the additive in food processing in storage, e.g. bisulfite in raisins; by reaction products with food constituents, e.g. formation of ethylurethane from diethyl pyrocarbonate; by metabolic transformation products, e.g. formation in the gut of cyclohexylamine from cyclamate. Metabolic end products may differ in experimental animals and in man: guanylic acid and inosinic acid are metabolized to allantoin in the rat but to uric acid in man. The magnitude of the safety margin in man of the Acceptable Daily Intake (ADI) is not identical to the "safety factor" used when calculating the ADI. The symptoms of Chinese Restaurant Syndrome, although not hazardous, furthermore illustrate that the whole ADI

  12. SNS shielding analyses overview

    SciTech Connect

    Popova, Irina; Gallmeier, Franz; Iverson, Erik B; Lu, Wei; Remec, Igor

    2015-01-01

    This paper gives an overview of on-going shielding analyses for the Spallation Neutron Source. Presently, most of the shielding work is concentrated on the beam lines and instrument enclosures to prepare for commissioning, safe operation and an adequate radiation background in the future. There is also on-going work for the accelerator facility. This includes radiation-protection analyses for radiation monitor placement, designing shielding for additional facilities to test accelerator structures, redesigning some parts of the facility, and designing test facilities for component testing of the main accelerator structure. Neutronics analyses are required as well to support spent structure management, including waste characterisation analyses, choice of a proper transport/storage package, and shielding enhancement for the package if required.

  13. Statistical optics

    NASA Astrophysics Data System (ADS)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  14. X-Ray Map Analyser: A new ArcGIS® based tool for the quantitative statistical data handling of X-ray maps (Geo- and material-science applications)

    NASA Astrophysics Data System (ADS)

    Ortolano, Gaetano; Zappalà, Luigi; Mazzoleni, Paolo

    2014-11-01

    A new semi-automated image processing procedure based on multivariate statistical analysis of X-ray maps of petrological and material science interest has been developed to generate high contrast pseudo-coloured images highlighting the element distribution between and within detected mineral phases. This new tool package, developed in Python and integrated with ArcGis®, generates in only a few minutes several graphical outputs useful for classifying chemically homogeneous zones as well as extracting quantitative information through the statistical data handling of X-ray maps. The code, largely based on the use of functions implemented in ArcGis® 9.3 equipped with Spatial Analyst and Data Management licences, has been suitably integrated with original cyclic functions that hugely reduce the time taken to complete lengthy procedures. In particular these tools, after the acquisition of any kind of multispectral images allow fast and powerful data processing for efficient illustration and documentation of key compositional and microtextural relationships in rocks and materials.

  15. Statistics Revelations

    ERIC Educational Resources Information Center

    Chicot, Katie; Holmes, Hilary

    2012-01-01

    The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…

  16. Statistical Fun

    ERIC Educational Resources Information Center

    Catley, Alan

    2007-01-01

    Following the announcement last year that there will be no more math coursework assessment at General Certificate of Secondary Education (GCSE), teachers will in the future be able to devote more time to preparing learners for formal examinations. One of the key things that the author has learned when teaching statistics is that it makes for far…

  17. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $ 0.023 per pound of aluminum produced is projected for a 200 kA pot.

  18. Phosphazene additives

    SciTech Connect

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  19. Lidar Analyses

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1995-01-01

    A brief description of enhancements made to the NASA MSFC coherent lidar model is provided. Notable improvements are the addition of routines to automatically determine the 3 dB misalignment loss angle and the backscatter value at which the probability of a good estimate (for a maximum likelihood estimator) falls to 50%. The ability to automatically generate energy/aperture parametrization (EAP) plots which include the effects of angular misalignment has been added. These EAP plots make it very easy to see that for any practical system where there is some degree of misalignment then there is an optimum telescope diameter for which the laser pulse energy required to achieve a particular sensitivity is minimized. Increasing the telescope diameter above this will result in a reduction of sensitivity. These parameterizations also clearly show that the alignment tolerances at shorter wavelengths are much stricter than those at longer wavelengths. A brief outline of the NASA MSFC AEOLUS program is given and a summary of the lidar designs considered during the program is presented. A discussion of some of the design trades is performed both in the text and in a conference publication attached as an appendix.

  20. Statistical Modelling of Compound Floods

    NASA Astrophysics Data System (ADS)

    Bevacqua, Emanuele; Maraun, Douglas; Vrac, Mathieu; Widmann, Martin; Manning, Colin

    2016-04-01

    of interest. This is based on real data for river discharge (Y_RIVER) and sea level (Y_SEA) from the River Têt in the south of France. The impact of the compound flood is the water level in the area between the river and sea stations, which we define here as h = αY_RIVER + (1 − α)Y_SEA. Here we show the sensitivity of the system to a change in the two physical parameters. Through variations in α we can study the system in one or two dimensions, which allows for the assessment of the risk associated with either of the two variables alone or with a combination of them. Varying instead the second parameter, i.e. the dependence between the variables Y_RIVER and Y_SEA, we show how an apparently weak dependence can increase the risk of flooding significantly with respect to the independent case. The model can be applied to future climate by inserting predictors into the statistical model as additional conditioning variables. By conditioning the simulation of the statistical model on predictors obtained for future projections from climate models, both the change of the risk and the characteristics of compound floods for the future can be analysed.
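    A conceptual sketch of such a compound-flood model is given below: dependence between the two drivers is imposed with a Gaussian copula, marginal distributions are assumed, and the 99th percentile of h = αY_RIVER + (1 − α)Y_SEA is compared across dependence strengths. The copula choice, the Gumbel margins, and all parameter values are assumptions for illustration, not the Têt River model.

```python
# Conceptual sketch (synthetic data): simulate dependent river discharge and sea level
# with a Gaussian copula and evaluate the impact variable h = a*Y_river + (1-a)*Y_sea.
import numpy as np
from scipy import stats

def simulate(rho, a, n=100_000, seed=5):
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)                               # Gaussian copula -> uniforms
    y_river = stats.gumbel_r.ppf(u[:, 0], loc=2.0, scale=0.8)   # assumed margins
    y_sea = stats.gumbel_r.ppf(u[:, 1], loc=1.0, scale=0.3)
    h = a * y_river + (1 - a) * y_sea
    return np.quantile(h, 0.99)                         # 1-in-100 level of the impact h

for rho in (0.0, 0.4, 0.8):
    print(f"rho = {rho:.1f}: 99th percentile of h = {simulate(rho, a=0.5):.2f}")
```

    Even this toy version shows the qualitative point made in the abstract: increasing the dependence between the drivers raises the extreme quantile of h relative to the independent case.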

  1. Spacelab Charcoal Analyses

    NASA Technical Reports Server (NTRS)

    Slivon, L. E.; Hernon-Kenny, L. A.; Katona, V. R.; Dejarme, L. E.

    1995-01-01

    This report describes analytical methods and results obtained from chemical analysis of 31 charcoal samples in five sets. Each set was obtained from a single scrubber used to filter ambient air on board a Spacelab mission. Analysis of the charcoal samples was conducted by thermal desorption followed by gas chromatography/mass spectrometry (GC/MS). All samples were analyzed using identical methods. The method used for these analyses was able to detect compounds independent of their polarity or volatility. In addition to the charcoal samples, analyses of three Environmental Control and Life Support System (ECLSS) water samples were conducted specifically for trimethylamine.

  2. Candidate Assembly Statistical Evaluation

    1998-07-15

    The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.

  3. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  4. [Statistical materials].

    PubMed

    1986-01-01

    Official population data for the USSR are presented for 1985 and 1986. Part 1 (pp. 65-72) contains data on capitals of union republics and cities with over one million inhabitants, including population estimates for 1986 and vital statistics for 1985. Part 2 (p. 72) presents population estimates by sex and union republic, 1986. Part 3 (pp. 73-6) presents data on population growth, including birth, death, and natural increase rates, 1984-1985; seasonal distribution of births and deaths; birth order; age-specific birth rates in urban and rural areas and by union republic; marriages; age at marriage; and divorces. PMID:12178831

  5. Information Omitted From Analyses.

    PubMed

    2015-08-01

    In the Original Article titled “Higher- Order Genetic and Environmental Structure of Prevalent Forms of Child and Adolescent Psychopathology” published in the February 2011 issue of JAMA Psychiatry (then Archives of General Psychiatry) (2011;68[2]:181-189), there were 2 errors. Although the article stated that the dimensions of psychopathology were measured using parent informants for inattention, hyperactivity-impulsivity, and oppositional defiant disorder, and a combination of parent and youth informants for conduct disorder, major depression, generalized anxiety disorder, separation anxiety disorder, social phobia, specific phobia, agoraphobia, and obsessive-compulsive disorder, all dimensional scores used in the reported analyses were actually based on parent reports of symptoms; youth reports were not used. In addition, whereas the article stated that each symptom dimension was residualized on age, sex, age-squared, and age by sex, the dimensions actually were only residualized on age, sex, and age-squared. All analyses were repeated using parent informants for inattention, hyperactivity-impulsivity, and oppositional defiant disorder, and a combination of parent and youth informants for conduct disorder,major depression, generalized anxiety disorder, separation anxiety disorder, social phobia, specific phobia, agoraphobia, and obsessive-compulsive disorder; these dimensional scores were residualized on age, age-squared, sex, sex by age, and sex by age-squared. The results of the new analyses were qualitatively the same as those reported in the article, with no substantial changes in conclusions. The only notable small difference was that major depression and generalized anxiety disorder dimensions had small but significant loadings on the internalizing factor in addition to their substantial loadings on the general factor in the analyses of both genetic and non-shared covariances in the selected models in the new analyses. Corrections were made to the

  6. Statistical analyses of plume composition and deposited radionuclide mixture ratios

    SciTech Connect

    Kraus, Terrence D.; Sallaberry, Cedric Jean-Marie; Eckert-Gallup, Aubrey Celia; Brito, Roxanne; Hunt, Brian D.; Osborn, Douglas M.

    2014-01-01

    A proposed method is considered to classify the regions in the close neighborhood of selected measurements according to the ratio of two radionuclides measured from either a radioactive plume or a deposited radionuclide mixture. The subsequent associated locations are then considered in the area of interest with a representative ratio class. This method allows for a more comprehensive and meaningful understanding of the data sampled following a radiological incident.

  7. STATISTICAL ANALYSES ON THERMAL ASPECTS OF SOLAR FLARES

    SciTech Connect

    Li, Y. P.; Gan, W. Q.; Feng, L.

    2012-03-10

    The frequency distribution of flare energies provides a crucial diagnostic to calculate the overall energy residing in flares and to estimate the role of flares in coronal heating. It often takes a power law as its functional form. We have analyzed various variables, including the thermal energies E_th of 1843 flares at their peak time. They were recorded by both the Geostationary Operational Environmental Satellites and the Reuven Ramaty High-Energy Solar Spectroscopic Imager during the time period from 2002 to 2009 and are classified as flares greater than C1.0. The relationship between different flare parameters is investigated. It is found that fitting the frequency distribution of E_th to a power law results in an index of -2.38. We also investigate the corrected thermal energy E_cth, which represents the flare total thermal energy including the energy loss in the rising phase. Its corresponding power-law slope is -2.35. Compilation of the frequency distributions of the thermal energies from nanoflares, microflares, and flares in the present work and from other authors shows that power-law indices below -2.0 have covered the range from 10^24 to 10^32 erg. Whether this frequency distribution can provide sufficient energy to coronal heating in active regions and the quiet Sun is discussed.
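    As a hedged illustration of the power-law fitting step, the snippet below draws synthetic flare-like energies above a cutoff and recovers the index with a standard maximum-likelihood estimator; the sample size, cutoff, and estimator choice are assumptions and not the GOES/RHESSI analysis itself.

```python
# Illustration (synthetic energies): maximum-likelihood estimate of a power-law index
# for a flare-energy-like distribution, in the spirit of the frequency analysis above.
import numpy as np

rng = np.random.default_rng(6)
alpha_true, E_min = 2.38, 1e27
# Draw energies from dN/dE ~ E**(-alpha) above E_min via inverse-transform sampling.
u = rng.random(5000)
E = E_min * (1 - u) ** (-1 / (alpha_true - 1))

# MLE for a continuous power law with a known lower cutoff.
alpha_hat = 1 + E.size / np.sum(np.log(E / E_min))
err = (alpha_hat - 1) / np.sqrt(E.size)
print(f"estimated index: -{alpha_hat:.2f} +/- {err:.2f}")
```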

  8. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  9. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  10. Students' attitudes towards learning statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Hamid, Mohd Rashid Ab; Zakaria, Roslinazairimah

    2015-05-01

    A positive attitude towards learning is vital in order to master the core content of the subject matter under study. This is no exception when learning statistics, especially at the university level. Therefore, this study investigates students' attitudes towards learning statistics. Six variables or constructs have been identified: affect, cognitive competence, value, difficulty, interest, and effort. The instrument used for the study is a questionnaire adopted and adapted from the reliable Survey of Attitudes Towards Statistics (SATS©). The study was conducted on engineering undergraduate students at a university on the East Coast of Malaysia. The respondents consist of students from different faculties who were taking the applied statistics course. The results are analysed in terms of descriptive analysis, contributing to a descriptive understanding of students' attitudes towards the teaching and learning process of statistics.

  11. Statistical Studies of Supernova Environments

    NASA Astrophysics Data System (ADS)

    Anderson, Joseph P.; James, Phil A.; Habergham, Stacey M.; Galbany, Lluís; Kuncarayakti, Hanindyo

    2015-05-01

    Mapping the diversity of SNe to progenitor properties is key to our understanding of stellar evolution and explosive stellar death. Investigations of the immediate environments of SNe allow statistical constraints to be made on progenitor properties such as mass and metallicity. Here, we review the progress that has been made in this field. Pixel statistics using tracers of e.g. star formation within galaxies show intriguing differences in the explosion sites of, in particular SNe types II and Ibc (SNe II and SNe Ibc respectively), suggesting statistical differences in population ages. Of particular interest is that SNe Ic are significantly more associated with host galaxy Hα emission than SNe Ib, implying shorter lifetimes for the former. In addition, such studies have shown (unexpectedly) that the interacting SNe IIn do not explode in regions containing the most massive stars, which suggests that at least a significant fraction of their progenitors arise from the lower end of the core-collapse SN mass range. Host H ii region spectroscopy has been obtained for a significant number of core-collapse events, however definitive conclusions on differences between distinct SN types have to-date been elusive. Single stellar evolution models predict that the relative fraction of SNe Ibc to SNe II should increase with increasing metallicity, due to the dependence of mass-loss rates on progenitor metallicity. We present a meta-analysis of all current host H ii region oxygen abundances for CC SNe. It is concluded that the SN II to SN Ibc ratio shows little variation with oxygen abundance, with only a suggestion that the ratio increases in the lowest bin. Radial distributions of different SNe are discussed, where a central excess of SNe Ibc has been observed within disturbed galaxy systems, which is difficult to ascribe to metallicity or selection effects. Environment studies are also being undertaken for SNe Ia, where constraints can be made on the shortest delay times of

  12. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  13. Statistics for Learning Genetics

    NASA Astrophysics Data System (ADS)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as genetics syllabi used by instructors do not help the issue. It was found that the text books, often times, either did not give effective explanations for students, or completely left out certain topics. The omission of certain statistical/mathematical oriented topics was seen to be also true with the genetics syllabi reviewed for this study. Nonetheless

  14. Cosmetic Plastic Surgery Statistics

    MedlinePlus

    2014 Cosmetic Plastic Surgery Statistics Cosmetic Procedure Trends 2014 Plastic Surgery Statistics Report Please credit the AMERICAN SOCIETY OF PLASTIC SURGEONS when citing statistical data or using ...

  15. Statistical Prediction in Proprietary Rehabilitation.

    ERIC Educational Resources Information Center

    Johnson, Kurt L.; And Others

    1987-01-01

    Applied statistical methods to predict case expenditures for low back pain rehabilitation cases in proprietary rehabilitation. Extracted predictor variables from case records of 175 workers compensation claimants with some degree of permanent disability due to back injury. Performed several multiple regression analyses resulting in a formula that…

  16. Misuse of statistics in surgical literature.

    PubMed

    Thiese, Matthew S; Ronna, Brenden; Robbins, Riann B

    2016-08-01

    Statistical analyses are a key part of biomedical research. Traditionally surgical research has relied upon a few statistical methods for evaluation and interpretation of data to improve clinical practice. As research methods have increased in both rigor and complexity, statistical analyses and interpretation have fallen behind. Some evidence suggests that surgical research studies are being designed and analyzed improperly given the specific study question. The goal of this article is to discuss the complexities of surgical research analyses and interpretation, and provide some resources to aid in these processes. PMID:27621909

  17. Misuse of statistics in surgical literature

    PubMed Central

    Ronna, Brenden; Robbins, Riann B.

    2016-01-01

    Statistical analyses are a key part of biomedical research. Traditionally surgical research has relied upon a few statistical methods for evaluation and interpretation of data to improve clinical practice. As research methods have increased in both rigor and complexity, statistical analyses and interpretation have fallen behind. Some evidence suggests that surgical research studies are being designed and analyzed improperly given the specific study question. The goal of this article is to discuss the complexities of surgical research analyses and interpretation, and provide some resources to aid in these processes.

  18. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    PubMed

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The Megastat add-on program, which will soon be supported by MS Excel 2016, would eliminate some of the limitations of using statistical formulas within MS Excel. PMID:27135620
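
    By way of contrast with the spreadsheet workflow described above, the hypothetical pandas sketch below shows how a dedicated analysis environment copes with blank cells without manual cell selection. The data frame and values are invented for illustration; this is not code from the article.

```python
# Hypothetical illustration of handling blank cells outside a spreadsheet; not from the article.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "group": ["A", "A", "B", "B", "B"],
    "value": [4.2, np.nan, 3.8, 5.1, np.nan],   # NaN marks the blank cells
})

print(df["value"].mean())                   # missing values are skipped by default
print(df.dropna(subset=["value"]))          # or drop incomplete rows explicitly
print(df.groupby("group")["value"].mean())  # grouped summaries without manual cell selection
```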

  19. Statistical reporting in the “Clujul Medical” journal

    PubMed Central

    LEUCUȚA, DANIEL-CORNELIU; DRUGAN, TUDOR; ACHIMAȘ, ANDREI

    2015-01-01

    Background and aim Medical research needs statistical analyses to understand the reality of variable phenomena. There are numerous studies showing poor statistical reporting in many journals with different rankings, in different countries. Our aim was to assess the reporting of statistical analyses in original papers published in the Clujul Medical journal in the year 2014. Methods All original articles published in Clujul Medical in the year 2014 were assessed, mainly using the Statistical Analyses and Methods in the Published Literature guidelines. Results The most important issues found were under-reporting of assumption checking, of differences between groups or measures of association, and of confidence intervals for the primary outcomes, as well as errors in the choice of statistical test or descriptive statistic for several analyses. These results are similar to those of other studies assessing different journals worldwide. Conclusion Statistical reporting in Clujul Medical, as in other journals, has to be improved. PMID:26733746

  20. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, such as STATA, S-PLUS, R, SPSS, SAS and Systat, we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  1. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  2. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  3. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This booklet of pocket statistics includes the 1996 NASA Major Launch Record, NASA Procurement, Financial, and Workforce data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  4. Statistics: are we related?

    PubMed

    Scott, M; Flaherty, D; Currall, J

    2013-03-01

    This short addition to our series on clinical statistics concerns relationships, and answering questions such as "are blood pressure and weight related?" In a later article, we will answer the more interesting question about how they might be related. This article follows on logically from the previous one dealing with categorical data, the major difference being here that we will consider two continuous variables, which naturally leads to the use of a Pearson correlation or occasionally to a Spearman rank correlation coefficient. PMID:23458641
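
    As a minimal sketch of the kind of question posed above ("are blood pressure and weight related?"), the Python example below computes both a Pearson correlation and a Spearman rank correlation. The data are simulated for illustration; the article itself does not supply code.

```python
# Illustrative only: simulated weight/blood-pressure data, not values from the article.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
weight = rng.normal(75, 12, size=50)                        # kg
blood_pressure = 90 + 0.5 * weight + rng.normal(0, 8, 50)   # mmHg, loosely related to weight

pearson_r, pearson_p = stats.pearsonr(weight, blood_pressure)
spearman_rho, spearman_p = stats.spearmanr(weight, blood_pressure)

print(f"Pearson r = {pearson_r:.2f} (p = {pearson_p:.3g})")
print(f"Spearman rho = {spearman_rho:.2f} (p = {spearman_p:.3g})")
```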

  5. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  6. Causal mediation analyses with rank preserving models.

    PubMed

    Have, Thomas R Ten; Joffe, Marshall M; Lynch, Kevin G; Brown, Gregory K; Maisto, Stephen A; Beck, Aaron T

    2007-09-01

    We present a linear rank preserving model (RPM) approach for analyzing mediation of a randomized baseline intervention's effect on a univariate follow-up outcome. Unlike standard mediation analyses, our approach does not assume that the mediating factor is also randomly assigned to individuals in addition to the randomized baseline intervention (i.e., sequential ignorability), but does make several structural interaction assumptions that currently are untestable. The G-estimation procedure for the proposed RPM represents an extension of the work on direct effects of randomized intervention effects for survival outcomes by Robins and Greenland (1994, Journal of the American Statistical Association 89, 737-749) and on intervention non-adherence by Ten Have et al. (2004, Journal of the American Statistical Association 99, 8-16). Simulations show good estimation and confidence interval performance by the proposed RPM approach under unmeasured confounding relative to the standard mediation approach, but poor performance under departures from the structural interaction assumptions. The trade-off between these assumptions is evaluated in the context of two suicide/depression intervention studies. PMID:17825022

  7. Predict! Teaching Statistics Using Informational Statistical Inference

    ERIC Educational Resources Information Center

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  8. Statistics Poker: Reinforcing Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Leech, Nancy L.

    2008-01-01

    Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…

  9. Integrated Genomic Analyses of Ovarian Carcinoma

    PubMed Central

    2011-01-01

    Summary The Cancer Genome Atlas (TCGA) project has analyzed mRNA expression, miRNA expression, promoter methylation, and DNA copy number in 489 high-grade serous ovarian adenocarcinomas (HGS-OvCa) and the DNA sequences of exons from coding genes in 316 of these tumors. These results show that HGS-OvCa is characterized by TP53 mutations in almost all tumors (96%); low prevalence but statistically recurrent somatic mutations in 9 additional genes including NF1, BRCA1, BRCA2, RB1, and CDK12; 113 significant focal DNA copy number aberrations; and promoter methylation events involving 168 genes. Analyses delineated four ovarian cancer transcriptional subtypes, three miRNA subtypes, four promoter methylation subtypes, a transcriptional signature associated with survival duration and shed new light on the impact on survival of tumors with BRCA1/2 and CCNE1 aberrations. Pathway analyses suggested that homologous recombination is defective in about half of tumors, and that Notch and FOXM1 signaling are involved in serous ovarian cancer pathophysiology. PMID:21720365

  10. Understanding British addiction statistics.

    PubMed

    Johnson, B D

    1975-01-01

    The statistical data issued by the Home Office and Department of Health and Social Security are quite detailed and generally valid measures of hard core addiction in Great Britain (Judson, 1973). Since 1968, the main basis of these high quality British statistics is the routine reports filed by Drug Treatment Centres. The well-trained, experienced staff of these clinics make knowledgeable decisions about a client's addiction, efficiently regulate dosage, and otherwise exert some degree of control over addicts (Judson, 1973; Johnson, 1974). The co-operation of police, courts, prison physicians, and general practitioners is also valuable in collecting data on drug addiction and convictions. Information presented in the tables above indicates that a rising problem of heroin addiction between 1962 and 1967 was arrested by the introduction of the treatment clinics in 1968. Further, legally maintained heroin addiction has been reduced by almost one-third since 1968, since many heroin addicts have been transferred to injectable methadone. The decline in heroin prescribing and the relatively steady number of narcotics addicts has apparently occurred in the face of a continuing, and perhaps increasing, demand for heroin and other opiates. With a few minor exceptions, analysis of the various tables suggests that the official statistics are internally consistent. There are apparently few "hidden" addicts, since few unknown addicts die of overdoses or are arrested by police (Lewis, 1973), although Blumberg (1974) indicates that some unknown users may exist. In addition, many opiate users not officially notified are known by clinic doctors as friends of addicts receiving prescriptions (Judson, 1973; Home Office, 1974). In brief, official British drug statistics seem to be generally valid and demonstrate that heroin and perhaps methadone addiction has been well contained by the treatment clinics. PMID:1039283

  11. Measuring statistical evidence using relative belief.

    PubMed

    Evans, Michael

    2016-01-01

    A fundamental concern of a theory of statistical inference is how one should measure statistical evidence. Certainly the words "statistical evidence," or perhaps just "evidence," are much used in statistical contexts. It is fair to say, however, that the precise characterization of this concept is somewhat elusive. Our goal here is to provide a definition of how to measure statistical evidence for any particular statistical problem. Since evidence is what causes beliefs to change, it is proposed to measure evidence by the amount beliefs change from a priori to a posteriori. As such, our definition involves prior beliefs and this raises issues of subjectivity versus objectivity in statistical analyses. This is dealt with through a principle requiring the falsifiability of any ingredients to a statistical analysis. These concerns lead to checking for prior-data conflict and measuring the a priori bias in a prior. PMID:26925207
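
    The proposal to measure evidence by the change from prior to posterior belief can be illustrated with a relative belief ratio, the posterior density divided by the prior density at a hypothesized parameter value. The Beta-Bernoulli sketch below is a minimal illustration under assumed conjugate updating; it is not code from the paper.

```python
# Minimal sketch: relative belief ratio for a Bernoulli success probability theta,
# assuming a conjugate Beta prior (illustration only, not from the paper).
from scipy import stats

a0, b0 = 1.0, 1.0                        # Beta(1, 1) prior
successes, failures = 7, 3               # observed data (invented)
a1, b1 = a0 + successes, b0 + failures   # posterior is Beta(a1, b1)

theta0 = 0.5                             # hypothesized value
rb = stats.beta.pdf(theta0, a1, b1) / stats.beta.pdf(theta0, a0, b0)

# RB > 1: beliefs in theta0 increased (evidence in favour); RB < 1: evidence against.
print(f"Relative belief ratio at theta = {theta0}: {rb:.2f}")
```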

  12. Measuring statistical evidence using relative belief

    PubMed Central

    Evans, Michael

    2016-01-01

    A fundamental concern of a theory of statistical inference is how one should measure statistical evidence. Certainly the words “statistical evidence,” or perhaps just “evidence,” are much used in statistical contexts. It is fair to say, however, that the precise characterization of this concept is somewhat elusive. Our goal here is to provide a definition of how to measure statistical evidence for any particular statistical problem. Since evidence is what causes beliefs to change, it is proposed to measure evidence by the amount beliefs change from a priori to a posteriori. As such, our definition involves prior beliefs and this raises issues of subjectivity versus objectivity in statistical analyses. This is dealt with through a principle requiring the falsifiability of any ingredients to a statistical analysis. These concerns lead to checking for prior-data conflict and measuring the a priori bias in a prior. PMID:26925207

  13. Mad Libs Statistics: A "Happy" Activity

    ERIC Educational Resources Information Center

    Trumpower, David

    2010-01-01

    This article describes a fun activity that can be used to help students make links between statistical analyses and their real-world implications. Although an illustrative example is provided using analysis of variance, the activity may be adapted for use with other statistical techniques.

  14. 47 CFR 1.363 - Introduction of statistical data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Introduction of statistical data. 1.363 Section... Proceedings Evidence § 1.363 Introduction of statistical data. (a) All statistical studies, offered in... analyses, and experiments, and those parts of other studies involving statistical methodology shall...

  15. 47 CFR 1.363 - Introduction of statistical data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Introduction of statistical data. 1.363 Section... Proceedings Evidence § 1.363 Introduction of statistical data. (a) All statistical studies, offered in... analyses, and experiments, and those parts of other studies involving statistical methodology shall...

  16. Neuroendocrine Tumor: Statistics

    MedlinePlus

    Neuroendocrine Tumor - Statistics. Approved by the Cancer.Net Editorial Board, 04/ ... the body. It is important to remember that statistics on how many people survive this type of ...

  17. Performance of statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Davis, R. F.; Hines, D. E.

    1973-01-01

    Statistical energy analysis (SEA) methods have been developed for high frequency modal analyses on random vibration environments. These SEA methods are evaluated by comparing analytical predictions to test results. Simple test methods are developed for establishing SEA parameter values. Techniques are presented, based on the comparison of the predictions with test values, for estimating SEA accuracy as a function of frequency for a general structure.
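
    As a rough illustration of the power-balance bookkeeping that underlies SEA (not the report's own analysis), the sketch below solves the steady-state energies of two coupled subsystems from assumed damping and coupling loss factors.

```python
# Hedged sketch of a two-subsystem SEA power balance (all parameter values assumed).
import numpy as np

omega = 2 * np.pi * 1000.0    # analysis band centre frequency, rad/s
eta1, eta2 = 0.02, 0.01       # damping loss factors (assumed)
eta12, eta21 = 0.005, 0.003   # coupling loss factors (assumed)
P_in = np.array([1.0, 0.0])   # input power, W: only subsystem 1 is driven

# omega * [[eta1 + eta12, -eta21], [-eta12, eta2 + eta21]] @ [E1, E2] = P_in
A = omega * np.array([[eta1 + eta12, -eta21],
                      [-eta12,       eta2 + eta21]])
E1, E2 = np.linalg.solve(A, P_in)
print(f"Subsystem energies: E1 = {E1:.3e} J, E2 = {E2:.3e} J")
```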

  18. Statistics for People Who (Think They) Hate Statistics. Third Edition

    ERIC Educational Resources Information Center

    Salkind, Neil J.

    2007-01-01

    This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…

  19. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  20. Introductory Statistics and Fish Management.

    ERIC Educational Resources Information Center

    Jardine, Dick

    2002-01-01

    Describes how fisheries research and management data (available on a website) have been incorporated into an Introductory Statistics course. In addition to the motivation gained from seeing the practical relevance of the course, some students have participated in the data collection and analysis for the New Hampshire Fish and Game Department. (MM)

  1. Statistics of lattice animals

    NASA Astrophysics Data System (ADS)

    Hsu, Hsiao-Ping; Nadler, Walder; Grassberger, Peter

    2005-07-01

    The scaling behavior of randomly branched polymers in a good solvent is studied in two to nine dimensions, modeled by lattice animals on simple hypercubic lattices. For the simulations, we use a biased sequential sampling algorithm with re-sampling, similar to the pruned-enriched Rosenbluth method (PERM) used extensively for linear polymers. We obtain high statistics of animals with up to several thousand sites in all dimensions 2⩽d⩽9. The partition sum (number of different animals) and gyration radii are estimated. In all dimensions we verify the Parisi-Sourlas prediction, and we verify all exactly known critical exponents in dimensions 2, 3, 4, and ⩾8. In addition, we present the hitherto most precise estimates for growth constants in d⩾3. For clusters with one site attached to an attractive surface, we verify the superuniversality of the cross-over exponent at the adsorption transition predicted by Janssen and Lyssy.

  2. Chemists, Access, Statistics

    NASA Astrophysics Data System (ADS)

    Holmes, Jon L.

    2000-06-01

    IP-number access. Current subscriptions can be upgraded to IP-number access at little additional cost. We are pleased to be able to offer to institutions and libraries this convenient mode of access to subscriber only resources at JCE Online. JCE Online Usage Statistics We are continually amazed by the activity at JCE Online. So far, the year 2000 has shown a marked increase. Given the phenomenal overall growth of the Internet, perhaps our surprise is not warranted. However, during the months of January and February 2000, over 38,000 visitors requested over 275,000 pages. This is a monthly increase of over 33% from the October-December 1999 levels. It is good to know that people are visiting, but we would very much like to know what you would most like to see at JCE Online. Please send your suggestions to JCEOnline@chem.wisc.edu. For those who are interested, JCE Online year-to-date statistics are available. Biographical Snapshots of Famous Chemists: Mission Statement Feature Editor: Barbara Burke Chemistry Department, California State Polytechnic University-Pomona, Pomona, CA 91768 phone: 909/869-3664 fax: 909/869-4616 email: baburke@csupomona.edu The primary goal of this JCE Internet column is to provide information about chemists who have made important contributions to chemistry. For each chemist, there is a short biographical "snapshot" that provides basic information about the person's chemical work, gender, ethnicity, and cultural background. Each snapshot includes links to related websites and to a biobibliographic database. The database provides references for the individual and can be searched through key words listed at the end of each snapshot. All students, not just science majors, need to understand science as it really is: an exciting, challenging, human, and creative way of learning about our natural world. Investigating the life experiences of chemists can provide a means for students to gain a more realistic view of chemistry. In addition students

  3. How to spot a statistical problem: advice for a non-statistical reviewer.

    PubMed

    Greenwood, Darren C; Freeman, Jennifer V

    2015-01-01

    Statistical analyses presented in general medical journals are becoming increasingly sophisticated. BMC Medicine relies on subject reviewers to indicate when a statistical review is required. We consider this policy and provide guidance on when to recommend a manuscript for statistical evaluation. Indicators for statistical review include insufficient detail in methods or results, some common statistical issues and interpretation not based on the presented evidence. Reviewers are required to ensure that the manuscript is methodologically sound and clearly written. Within that context, they are expected to provide constructive feedback and opinion on the statistical design, analysis, presentation and interpretation. If reviewers lack the appropriate background to positively confirm the appropriateness of any of the manuscript's statistical aspects, they are encouraged to recommend it for expert statistical review. PMID:26521808

  4. Time series analyses of global change data.

    PubMed

    Lane, L J; Nichols, M H; Osborn, H B

    1994-01-01

    The hypothesis that statistical analyses of historical time series data can be used to separate the influences of natural variations from anthropogenic sources on global climate change is tested. Point, regional, national, and global temperature data are analyzed. Trend analyses for the period 1901-1987 suggest mean annual temperatures increased (in degrees C per century) globally at the rate of about 0.5, in the USA at about 0.3, in the south-western USA desert region at about 1.2, and at the Walnut Gulch Experimental Watershed in south-eastern Arizona at about 0.8. However, the rates of temperature change are not constant but vary within the 87-year period. Serial correlation and spectral density analysis of the temperature time series showed weak periodicities at various frequencies. The only common periodicity among the temperature series is an apparent cycle of about 43 years. The temperature time series were correlated with the Wolf sunspot index, atmospheric CO(2) concentrations interpolated from the Siple ice core data, and atmospheric CO(2) concentration data from Mauna Loa measurements. Correlation analysis of temperature data with concurrent data on atmospheric CO(2) concentrations and the Wolf sunspot index support previously reported significant correlation over the 1901-1987 period. Correlation analysis between temperature, atmospheric CO(2) concentration, and the Wolf sunspot index for the shorter period, 1958-1987, when continuous Mauna Loa CO(2) data are available, suggest significant correlation between global warming and atmospheric CO(2) concentrations but no significant correlation between global warming and the Wolf sunspot index. This may be because the Wolf sunspot index apparently increased from 1901 until about 1960 and then decreased thereafter, while global warming apparently continued to increase through 1987. Correlation of sunspot activity with global warming may be spurious but additional analyses are required to test this hypothesis
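
    The trend and correlation analyses described above follow a standard pattern that can be sketched briefly. The series below are simulated stand-ins; they are not the historical temperature, CO2, or sunspot records analysed in the study.

```python
# Illustrative sketch of a linear trend fit and a cross-correlation, using simulated
# series (not the temperature, CO2, or sunspot data analysed in the study).
import numpy as np
from scipy import stats

years = np.arange(1901, 1988)
rng = np.random.default_rng(1)
temperature = 0.005 * (years - 1901) + rng.normal(0, 0.15, years.size)  # ~0.5 C/century trend
co2 = 295 + 0.3 * (years - 1901) + rng.normal(0, 1.0, years.size)       # rising CO2, ppm

slope, intercept, r, p, se = stats.linregress(years, temperature)
print(f"Trend: {slope * 100:.2f} C per century (p = {p:.3g})")

r_tc, p_tc = stats.pearsonr(temperature, co2)
print(f"Temperature-CO2 correlation: r = {r_tc:.2f} (p = {p_tc:.3g})")
```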

  5. DATA AND ANALYSES

    EPA Science Inventory

    In order to promote transparency and clarity of the analyses performed in support of EPA's Supplemental Guidance for Assessing Susceptibility from Early-Life Exposure to Carcinogens, the data and the analyses are now available on this web site. The data is presented in two diffe...

  6. The Economic Cost of Homosexuality: Multilevel Analyses

    ERIC Educational Resources Information Center

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  7. Analyses of coal samples collected 1975-1977

    SciTech Connect

    Henderson, J.A. Jr.; Oman, C.S.; Coleman, S.L.

    1981-01-01

    In late 1975 the Virginia Division of Mineral Resources began a sampling program of coal beds in Virginia in cooperation with the US Geological Survey and the US Bureau of Mines. A total of 134 samples were collected from coal beds of Pennsylvanian Age in five of the seven counties in the southwest Virginia coal field. Channel samples were collected at each of the sampling sites. In addition, supplemental samples of the roof- and floor-rock and major partings were collected at many sample sites, but were not analyzed. The samples are from most of the major coal beds in southwest Virginia, and are from fresh exposures in active surface and underground mines. Chemical analyses were made by the US Bureau of Mines and the US Geological Survey. The US Bureau of Mines analyses include the proximate and ultimate analyses, forms of sulfur, heat value, fusibility of ash, and the free-swelling index. The US Geological Survey analyses include the major-, minor-, and trace-element concentrations in both ash and whole coal. Statistical tables contain arithmetic and geometric means, observed range, and the standard deviation for samples collected in Virginia and are compared with samples in the National Coal Resources Data System for Tennessee, Kentucky and West Virginia.

  8. Uterine Cancer Statistics

    MedlinePlus

    Uterine Cancer Statistics. Language: English; Español (Spanish). Statistics for other kinds of cancer (breast, cervical, colorectal, skin, vaginal and vulvar) are also available. ...

  9. Mathematical and statistical analysis

    NASA Technical Reports Server (NTRS)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  10. Investigation of the freely available easy-to-use software 'EZR' for medical statistics.

    PubMed

    Kanda, Y

    2013-03-01

    Although there are many commercially available statistical software packages, only a few implement a competing risk analysis or a proportional hazards regression model with time-dependent covariates, which are necessary in studies on hematopoietic SCT. In addition, most packages are not clinician friendly, as they require that commands be written based on statistical languages. This report describes the statistical software 'EZR' (Easy R), which is based on R and R commander. EZR enables the application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing risk analyses and the use of time-dependent covariates, receiver operating characteristics analyses, meta-analyses, sample size calculation and so on, by point-and-click access. EZR is freely available on our website (http://www.jichi.ac.jp/saitama-sct/SaitamaHP.files/statmed.html) and runs on both Windows (Microsoft Corporation, USA) and Mac OS X (Apple, USA). This report provides instructions for the installation and operation of EZR. PMID:23208313
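
    EZR itself is R-based and point-and-click, but the flavour of the survival analyses it automates can be sketched in a few lines of code. The Python/lifelines example below, with invented durations, is only an illustration of a Kaplan-Meier fit and is unrelated to EZR's implementation.

```python
# Hedged illustration of a Kaplan-Meier fit with toy data (EZR is R-based; this
# Python/lifelines sketch only mirrors the kind of analysis it provides point-and-click).
from lifelines import KaplanMeierFitter

durations = [5, 8, 12, 14, 20, 21, 27, 30, 35, 40]   # months to event or censoring (toy data)
observed  = [1, 1, 0,  1,  1,  0,  1,  0,  1,  0]    # 1 = event observed, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
print(kmf.survival_function_)        # estimated survival probabilities over time
print(kmf.median_survival_time_)     # estimated median survival
```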

  11. Minnesota Health Statistics 1988.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Health, St. Paul.

    This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…

  12. Ethics in Statistics

    ERIC Educational Resources Information Center

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  13. Avoiding Statistical Mistakes

    ERIC Educational Resources Information Center

    Strasser, Nora

    2007-01-01

    Avoiding statistical mistakes is important for educators at all levels. Basic concepts will help you to avoid making mistakes using statistics and to look at data with a critical eye. Statistical data is used at educational institutions for many purposes. It can be used to support budget requests, changes in educational philosophy, changes to…

  14. Statistical quality management

    NASA Astrophysics Data System (ADS)

    Vanderlaan, Paul

    1992-10-01

    Some aspects of statistical quality management are discussed. Quality has to be defined as a concrete, measurable quantity. The concepts of Total Quality Management (TQM), Statistical Process Control (SPC), and inspection are explained. In most cases SPC is better than inspection. It can be concluded that statistics has great possibilities in the field of TQM.

  15. Consumption patterns and perception analyses of hangwa.

    PubMed

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-03-01

    Hangwa is a traditional food that, in line with current consumption trends, needs marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009, and the data from 231 persons were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'for present' (39.8%) and the main reasons for buying it were 'traditional image' (33.3%) and 'taste' (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were 'a sanitary process', 'a rigorous quality mark' and 'taste', which were related to the quality of the products. In addition, those with high importance but low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of price'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote its consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price. PMID:24471065
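
    The paired comparison of importance and performance scores reported above can be sketched as follows. The ratings are invented and the attribute list is abbreviated; the study itself used SPSS on its survey data.

```python
# Illustrative paired-samples t-test of importance vs. performance ratings
# (invented scores for a handful of attributes; not the study's data).
import numpy as np
from scipy import stats

importance  = np.array([4.6, 4.4, 4.7, 4.2, 4.5, 4.3])   # e.g., taste, price, hygiene, ...
performance = np.array([3.9, 3.5, 4.1, 3.2, 3.8, 3.4])

t, p = stats.ttest_rel(importance, performance)
print(f"Paired t = {t:.2f}, p = {p:.3g}")
print("Mean gap (importance - performance):", np.mean(importance - performance))
```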

  16. A Generative Statistical Algorithm for Automatic Detection of Complex Postures

    PubMed Central

    Amit, Yali; Biron, David

    2015-01-01

    This paper presents a method for automated detection of complex (non-self-avoiding) postures of the nematode Caenorhabditis elegans and its application to analyses of locomotion defects. Our approach is based on progressively detailed statistical models that enable detection of the head and the body even in cases of severe coilers, where data from traditional trackers is limited. We restrict the input available to the algorithm to a single digitized frame, such that manual initialization is not required and the detection problem becomes embarrassingly parallel. Consequently, the proposed algorithm does not propagate detection errors and naturally integrates in a “big data” workflow used for large-scale analyses. Using this framework, we analyzed the dynamics of postures and locomotion of wild-type animals and mutants that exhibit severe coiling phenotypes. Our approach can readily be extended to additional automated tracking tasks such as tracking pairs of animals (e.g., for mating assays) or different species. PMID:26439258

  17. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science. PMID:27231259

  18. Synthetic aperture sonar image statistics

    NASA Astrophysics Data System (ADS)

    Johnson, Shawn F.

    Synthetic Aperture Sonar (SAS) systems are capable of producing photograph quality seafloor imagery using a lower frequency than other systems of comparable resolution. However, as with other high-resolution sonar systems, SAS imagery is often characterized by heavy-tailed amplitude distributions which may adversely affect target detection systems. The constant cross-range resolution with respect to range that results from the synthetic aperture formation process provides a unique opportunity to improve our understanding of system and environment interactions, which is essential for accurate performance prediction. This research focused on the impact of multipath contamination and the impact of resolution on image statistics, accomplished through analyses of data collected during at-sea experiments, analytical modeling, and development of numerical simulations. Multipath contamination was shown to have an appreciable impact on image statistics at ranges greater than the water depth and when the levels of the contributing multipath are within 10 dB of the direct path, reducing the image amplitude distribution tails while also degrading image clarity. Image statistics were shown to depend strongly upon both system resolution and orientation to seafloor features such as sand ripples. This work contributes to improving detection systems by aiding understanding of the influences of background (i.e. non-target) image statistics.

  19. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by...

  20. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 8 2014-10-01 2014-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by...

  1. 49 CFR 1180.7 - Market analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Market analyses. 1180.7 Section 1180.7..., TRACKAGE RIGHTS, AND LEASE PROCEDURES General Acquisition Procedures § 1180.7 Market analyses. (a) For... identify and address relevant markets and issues, and provide additional information as requested by...

  2. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  3. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  4. Statistical criteria for characterizing irradiance time series.

    SciTech Connect

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
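
    The specific criteria examined in the report are not reproduced here, but the general pattern, comparing an observed and a simulated irradiance series on summary statistics such as the mean, spread, and one-step ramp-rate variability, can be sketched as follows with simulated data.

```python
# Generic sketch of comparing time-series summary statistics; both series are
# simulated for illustration and the statistics are not the report's criteria.
import numpy as np

rng = np.random.default_rng(2)
clear_sky = 800 * np.sin(np.linspace(0, np.pi, 480))                     # idealised daily shape
observed  = np.clip(clear_sky + rng.normal(0, 60, 480), 0, None)         # W/m^2
simulated = np.clip(clear_sky + rng.normal(0, 80, 480), 0, None)

def summary(series):
    ramps = np.diff(series)          # one-step ramp rates
    return {"mean": series.mean(), "std": series.std(), "ramp_std": ramps.std()}

print("observed :", summary(observed))
print("simulated:", summary(simulated))
```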

  5. Statistically determined nickel cadmium performance relationships

    NASA Technical Reports Server (NTRS)

    Gross, Sidney

    1987-01-01

    A statistical analysis was performed on sealed nickel cadmium cell manufacturing data and cell matching data. The cells subjected to the analysis were 30 Ah sealed Ni/Cd cells, made by General Electric. A total of 213 data parameters was investigated, including such information as plate thickness, amount of electrolyte added, weight of active material, positive and negative capacity, and charge-discharge behavior. Statistical analyses were made to determine possible correlations between test events. The data show many departures from normal distribution. Product consistency from one lot to another is an important attribute for aerospace applications. It is clear from these examples that there are some significant differences between lots. Statistical analyses are seen to be an excellent way to spot those differences. Also, it is now proven beyond doubt that battery testing is one of the leading causes of statistics.

  6. Exploring Correlation Coefficients with Golf Statistics

    ERIC Educational Resources Information Center

    Quinn, Robert J

    2006-01-01

    This article explores the relationships between several pairs of statistics kept on professional golfers on the PGA tour. Specifically, two measures related to the player's ability to drive the ball are compared as are two measures related to the player's ability to putt. An additional analysis is made between one statistic related to putting and…

  7. Florida Library Directory with Statistics, 1998.

    ERIC Educational Resources Information Center

    Florida Dept. of State, Tallahassee. Div. of Library and Information Services.

    This 49th annual Florida Library directory with statistics edition includes listings for over 1,000 libraries of all types in Florida, with contact names, phone numbers, addresses, and e-mail and web addresses. In addition, there is a section of library statistics, showing data on the use, resources, and financial condition of Florida's libraries.…

  8. XMM-Newton publication statistics

    NASA Astrophysics Data System (ADS)

    Ness, J.-U.; Parmar, A. N.; Valencic, L. A.; Smith, R.; Loiseau, N.; Salama, A.; Ehle, M.; Schartel, N.

    2014-02-01

    We assessed the scientific productivity of XMM-Newton by examining XMM-Newton publications and data usage statistics. We analyse 3272 refereed papers, published until the end of 2012, that directly use XMM-Newton data. The SAO/NASA Astrophysics Data System (ADS) was used to provide additional information on each paper including the number of citations. For each paper, the XMM-Newton observation identifiers and instruments used to provide the scientific results were determined. The identifiers were used to access the XMM-Newton Science Archive (XSA) to provide detailed information on the observations themselves and on the original proposals. The information obtained from these sources was then combined to allow the scientific productivity of the mission to be assessed. Since around three years after the launch of XMM-Newton there have been around 300 refereed papers per year that directly use XMM-Newton data. After more than 13 years in operation, this rate shows no evidence that it is decreasing. Since 2002, around 100 scientists per year become lead authors for the first time on a refereed paper which directly uses XMM-Newton data. Each refereed XMM-Newton paper receives around four citations per year in the first few years with a long-term citation rate of three citations per year, more than five years after publication. About half of the articles citing XMM-Newton articles are not primarily X-ray observational papers. The distribution of elapsed time between observations taken under the Guest Observer programme and first article peaks at 2 years with a possible second peak at 3.25 years. Observations taken under the Target of Opportunity programme are published significantly faster, after one year on average. The fraction of science time taken until the end of 2009 that has been used in at least one article is ~90%. Most observations were used more than once, yielding on average a factor of two in usage on available observing time per year. About 20 % of

  9. Statistical Physics of Colloidal Dispersions.

    NASA Astrophysics Data System (ADS)

    Canessa, E.

    changes of the depletion attraction with free polymer concentration. Chapter IV deals with the contributions of pairwise additive and volume dependent forces to the free energy of charge stabilized colloidal dispersions. To a first approximation the extra volume dependent contributions due to the chemical equilibrium and counterion-macroion coupling are treated in a one-component plasma approach. Added salt is treated as an ionized gas within the Debye-Huckel theory of electrolytes. In order to set this approach on a quantitative basis the existence of an equilibrium lattice with a small shear modulus is examined. Structural phase transitions in these systems are also analysed theoretically as a function of added electrolyte.

  10. Strengthen forensic entomology in court--the need for data exploration and the validation of a generalised additive mixed model.

    PubMed

    Baqué, Michèle; Amendt, Jens

    2013-01-01

    Developmental data of juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models do not provide sufficiently reliable age estimates and may even lead to an erroneous determination of the PMI(min). In keeping with the Daubert standard and the need for improvements in forensic science, new statistical tools such as smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces the background and application of these statistical techniques by analysing a model which describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data--to assure their quality and to show the importance of checking them carefully prior to conducting the statistical tests--and the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets by using, for the first time, generalised additive mixed models. PMID:22370995
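
    The central point, that immature blow flies grow non-linearly so a straight-line fit misstates their age, can be illustrated by comparing a linear fit with a smoothing spline on simulated growth data. This is only a sketch of the idea; it is not the generalised additive mixed model the authors validate, and the growth-curve parameters are assumed.

```python
# Sketch only: linear fit vs. smoothing spline on simulated larval-length data,
# illustrating why non-linear growth breaks straight-line age estimates
# (this is not the authors' generalised additive mixed model).
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
age_h = np.linspace(0, 120, 60)                                           # hours since hatching
length = 18 / (1 + np.exp(-(age_h - 60) / 15)) + rng.normal(0, 0.4, 60)   # sigmoid growth, mm

slope, intercept = np.polyfit(age_h, length, 1)                # straight-line model
spline = UnivariateSpline(age_h, length, s=len(age_h) * 0.2)   # smooth non-linear model

rss_linear = np.sum((length - (slope * age_h + intercept)) ** 2)
rss_spline = np.sum((length - spline(age_h)) ** 2)
print(f"Residual sum of squares - linear: {rss_linear:.1f}, spline: {rss_spline:.1f}")
```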

  11. Quantum Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Schieve, William C.; Horwitz, Lawrence P.

    2009-04-01

    1. Foundations of quantum statistical mechanics; 2. Elementary examples; 3. Quantum statistical master equation; 4. Quantum kinetic equations; 5. Quantum irreversibility; 6. Entropy and dissipation: the microscopic theory; 7. Global equilibrium: thermostatics and the microcanonical ensemble; 8. Bose-Einstein ideal gas condensation; 9. Scaling, renormalization and the Ising model; 10. Relativistic covariant statistical mechanics of many particles; 11. Quantum optics and damping; 12. Entanglements; 13. Quantum measurement and irreversibility; 14. Quantum Langevin equation: quantum Brownian motion; 15. Linear response: fluctuation and dissipation theorems; 16. Time dependent quantum Green's functions; 17. Decay scattering; 18. Quantum statistical mechanics, extended; 19. Quantum transport with tunneling and reservoir ballistic transport; 20. Black hole thermodynamics; Appendix; Index.

  12. Statistical distribution sampling

    NASA Technical Reports Server (NTRS)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  13. Microcomputer statistics packages for biomedical scientists.

    PubMed

    Ludbrook, J

    1995-12-01

    1. There are hundreds of commercially available microcomputer statistics packages, ranging from the very cheap and elementary to the very expensive and complex, and from the very general to the very specialized. This review covers only those that appear to be popular with biomedical investigators who deal with fairly small sets of data but may wish to use relatively complex analytical techniques. 2. It is highly desirable, if not essential, that biomedical investigators who use microcomputer statistics packages have access to a spreadsheet program. These provide sample statistics and simple statistical analyses but, more importantly, they are often the best way of entering data into the files of the statistics packages proper. 3. A vital component of any statistics package is its manual. This should be easy to follow, but at the same time it must provide full documentation of, and references to, precisely how the various statistical tests are performed. 4. Some packages are elementary and offer only a narrow range of test procedures (mini-packages). Some are designed to be used as statistical libraries and programming tools for professional statisticians. Between these extremes are the general purpose packages (mid-range, maxi- and supermaxi-packages) that constitute the main body of this review. 5. All the packages reviewed have some shortcomings or flaws. It is argued that the ideal package for biomedical investigators should have the following features: (i) it should provide a wide range of test procedures for analysing continuous, rank-ordered, and categorical data; (ii) the way in which these tests are carried out should be clearly stated in the manual; and (iii) lastly, although not unimportantly, the package should be easy to use. 6. It is recommended that biomedical investigators purchase a package that provides many more statistical routines than they use in their everyday practice. Provided the manual is a good one and the package itself has no serious

  14. NOAA's National Snow Analyses

    NASA Astrophysics Data System (ADS)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
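
    The Newtonian nudging step described above relaxes the modelled state toward observations by a gain-weighted correction. The sketch below is schematic, with assumed values, and is not NOHRSC code.

```python
# Schematic Newtonian nudging of a modelled snow water equivalent (SWE) field toward
# point observations (assumed values; not NOHRSC's implementation).
import numpy as np

swe_model = np.array([120.0, 95.0, 60.0, 30.0])       # modelled SWE at four grid cells, mm
swe_obs   = np.array([110.0, np.nan, 70.0, np.nan])   # observations (NaN = no observation)
gain = 0.5                                            # nudging weight in (0, 1]

has_obs = ~np.isnan(swe_obs)
swe_analysis = swe_model.copy()
swe_analysis[has_obs] += gain * (swe_obs[has_obs] - swe_model[has_obs])
print(swe_analysis)   # cells with observations are pulled toward them
```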

  15. 24 CFR 81.65 - Other information and analyses.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Other information and analyses. 81... information and analyses. When deemed appropriate and requested in writing, on a case by-case basis, by the... conduct additional analyses concerning any such report. A GSE shall submit additional reports or...

  16. Wavelet Analyses and Applications

    ERIC Educational Resources Information Center

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…

  17. Apollo 14 microbial analyses

    NASA Technical Reports Server (NTRS)

    Taylor, G. R.

    1972-01-01

    Extensive microbiological analyses that were performed on the Apollo 14 prime and backup crewmembers and ancillary personnel are discussed. The crewmembers were subjected to four separate and quite different environments during the 137-day monitoring period. The relation between each of these environments and observed changes in the microflora of each astronaut are presented.

  18. Explorations in Statistics: Regression

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2011-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.…

  19. Multidimensional Visual Statistical Learning

    ERIC Educational Resources Information Center

    Turk-Browne, Nicholas B.; Isola, Phillip J.; Scholl, Brian J.; Treat, Teresa A.

    2008-01-01

    Recent studies of visual statistical learning (VSL) have demonstrated that statistical regularities in sequences of visual stimuli can be automatically extracted, even without intent or awareness. Despite much work on this topic, however, several fundamental questions remain about the nature of VSL. In particular, previous experiments have not…

  20. On Statistical Testing.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…

  1. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  2. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…

  3. Applied Statistics with SPSS

    ERIC Educational Resources Information Center

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  4. Overhead Image Statistics

    SciTech Connect

    Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A

    2008-01-01

    Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationship are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics for different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
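
    The power-spectrum comparison described above is commonly summarised with a radially averaged spectrum. The sketch below computes such a spectrum for a synthetic image; the binning scheme and function name are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def radial_power_spectrum(image, n_bins=64):
    """Radially averaged power spectrum of a 2-D image (illustrative sketch)."""
    img = image - image.mean()                    # remove the DC component
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    ny, nx = img.shape
    yy, xx = np.indices((ny, nx))
    radius = np.hypot(yy - ny / 2, xx - nx / 2)   # radial spatial frequency
    bins = np.linspace(0.0, radius.max(), n_bins + 1)
    which = np.clip(np.digitize(radius.ravel(), bins) - 1, 0, n_bins - 1)
    sums = np.bincount(which, weights=power.ravel(), minlength=n_bins)
    counts = np.bincount(which, minlength=n_bins)
    spectrum = sums / np.maximum(counts, 1)       # mean power in each radial bin
    centres = 0.5 * (bins[:-1] + bins[1:])
    return centres, spectrum

# Example: spectrum of a white-noise "image"; real categories would be compared
# by the shape of these curves.
rng = np.random.default_rng(0)
freq_axis, noise_spectrum = radial_power_spectrum(rng.normal(size=(256, 256)))
```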

  5. Understanding Undergraduate Statistical Anxiety

    ERIC Educational Resources Information Center

    McKim, Courtney

    2014-01-01

    The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…

  6. Statistics and Measurements

    PubMed Central

    Croarkin, M. Carroll

    2001-01-01

    For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST.

  7. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  8. Application Statistics 1987.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto.

    Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…

  9. Introduction to Statistical Physics

    NASA Astrophysics Data System (ADS)

    Casquilho, João Paulo; Ivo Cortez Teixeira, Paulo

    2014-12-01

    Preface; 1. Random walks; 2. Review of thermodynamics; 3. The postulates of statistical physics. Thermodynamic equilibrium; 4. Statistical thermodynamics – developments and applications; 5. The classical ideal gas; 6. The quantum ideal gas; 7. Magnetism; 8. The Ising model; 9. Liquid crystals; 10. Phase transitions and critical phenomena; 11. Irreversible processes; Appendixes; Index.

  10. Reform in Statistical Education

    ERIC Educational Resources Information Center

    Huck, Schuyler W.

    2007-01-01

    Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…

  11. Statistical Mapping by Computer.

    ERIC Educational Resources Information Center

    Utano, Jack J.

    The function of a statistical map is to provide readers with a visual impression of the data so that they may be able to identify any geographic characteristics of the displayed phenomena. The increasingly important role played by the computer in the production of statistical maps is manifested by the varied examples of computer maps in recent…

  12. DISABILITY STATISTICS CENTER

    EPA Science Inventory

    The purpose of the Disability Statistics Center is to produce and disseminate statistical information on disability and the status of people with disabilities in American society and to establish and monitor indicators of how conditions are changing over time to meet their health...

  13. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. PMID:26466186
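
    As a quick companion to the diagnostic-test measures named in the review, the sketch below computes sensitivity, specificity, accuracy, and likelihood ratios from a 2x2 table; the counts are made-up numbers, not data from the article.

```python
# Minimal sketch of the diagnostic-test quantities named in the review.
# The 2x2 counts below are made-up numbers, not data from the article.
tp, fp, fn, tn = 90, 30, 10, 170   # true/false positives and negatives

sensitivity = tp / (tp + fn)                    # P(test positive | disease)
specificity = tn / (tn + fp)                    # P(test negative | no disease)
accuracy    = (tp + tn) / (tp + fp + fn + tn)
lr_positive = sensitivity / (1 - specificity)   # positive likelihood ratio
lr_negative = (1 - sensitivity) / specificity   # negative likelihood ratio

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"accuracy={accuracy:.2f} LR+={lr_positive:.2f} LR-={lr_negative:.2f}")
```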

  14. The uncertain hockey stick: a statistical perspective on the reconstruction of past temperatures

    NASA Astrophysics Data System (ADS)

    Nychka, Douglas

    2007-03-01

    A reconstruction of past temperatures based on proxies is inherently a statistical process and a deliberate statistical model for the reconstruction can also provide companion measures of uncertainty. This view is often missed in the heat of debating the merits of different analyses and interpretations of paleoclimate data. Although statistical error is acknowledged to be just one component of the total uncertainty in a reconstruction, it can provide a valuable yardstick for comparing different reconstructions or drawing inferences about features. In this talk we suggest a framework where the reconstruction is expressed as a conditional distribution of the temperatures given the proxies. Random draws from this distribution provide an ensemble of reconstructions where the spread among ensemble members is a valid statistical measure of uncertainty. This approach is illustrated for Northern Hemisphere temperatures and the multi-proxy data used by Mann, Bradley and Hughes (1999). Here we explore the scope of the statistical assumptions needed to carry through a rigorous analysis and use Monte Carlo sampling to determine the uncertainty in maxima or other complicated statistics in the reconstructed series. The principles behind this simple example for the Northern Hemisphere can be extended to regional reconstructions, incorporation of additional types of proxies and the use of statistics from numerical models.
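
    The ensemble idea described above can be sketched as follows: given a conditional (posterior) mean and spread for each year, random draws provide an ensemble whose spread quantifies uncertainty in derived statistics such as the series maximum. The toy reconstruction below ignores temporal correlation and uses made-up numbers; it is not the MBH99 analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "reconstruction": a conditional mean temperature anomaly per year plus a
# (made-up) standard error; both arrays are illustrative, not MBH99 output.
years = np.arange(1000, 2000)
cond_mean = 0.0005 * (years - 1000) - 0.3 + 0.1 * np.sin(years / 60.0)
cond_sd = np.full_like(cond_mean, 0.15)

# Draw an ensemble of reconstructions from the conditional distribution and
# propagate it through a complicated statistic (here, the series maximum).
n_draws = 2000
ensemble = rng.normal(cond_mean, cond_sd, size=(n_draws, years.size))
maxima = ensemble.max(axis=1)
lo, hi = np.percentile(maxima, [2.5, 97.5])
print(f"95% interval for the reconstructed maximum: [{lo:.2f}, {hi:.2f}] deg C")
```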

  15. Tsallis statistics and neurodegenerative disorders

    NASA Astrophysics Data System (ADS)

    Iliopoulos, Aggelos C.; Tsolaki, Magdalini; Aifantis, Elias C.

    2016-08-01

    In this paper, we perform statistical analysis of time series deriving from four neurodegenerative disorders, namely epilepsy, amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), and Huntington's disease (HD). The time series are concerned with electroencephalograms (EEGs) of healthy and epileptic states, as well as gait dynamics (in particular stride intervals) of ALS, PD and HD patients. We study data concerning one subject for each neurodegenerative disorder and one healthy control. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of the Tsallis q-triplet, namely {qstat, qsen, qrel}. The deviation of the Tsallis q-triplet from unity indicates non-Gaussian statistics and long-range dependencies for all time series considered. In addition, the results reveal the efficiency of Tsallis statistics in capturing differences in brain dynamics between healthy and epileptic states, as well as differences of ALS, PD, and HD patients from healthy control subjects. The results indicate that estimations of Tsallis q-indices could be used as possible biomarkers, along with others, for improving classification and prediction of epileptic seizures, as well as for studying the complex gait dynamics of various diseases, providing new insights into severity, medications and fall risk, improving therapeutic interventions.
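
    For reference, the q-triplet rests on the q-exponential and q-Gaussian of Tsallis statistics; the standard textbook forms are recalled below (standard definitions, not equations reproduced from this paper).

```latex
% q-exponential underlying Tsallis non-extensive statistics; [z]_+ = max(z, 0).
% For q -> 1 it reduces to the ordinary exponential.
\exp_q(x) = \bigl[\, 1 + (1-q)\,x \,\bigr]_{+}^{\tfrac{1}{1-q}}, \qquad \exp_1(x) = e^{x}

% q-Gaussian, whose fitted index gives q_stat; C_q is the normalisation
% constant and \beta sets the width of the distribution.
p_q(x) = \frac{\sqrt{\beta}}{C_q}\, \exp_q\!\bigl(-\beta x^{2}\bigr)
```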

  16. Statistics in medicine.

    PubMed

    Januszyk, Michael; Gurtner, Geoffrey C

    2011-01-01

    The scope of biomedical research has expanded rapidly during the past several decades, and statistical analysis has become increasingly necessary to understand the meaning of large and diverse quantities of raw data. As such, a familiarity with this lexicon is essential for critical appraisal of medical literature. This article attempts to provide a practical overview of medical statistics, with an emphasis on the selection, application, and interpretation of specific tests. This includes a brief review of statistical theory and its nomenclature, particularly with regard to the classification of variables. A discussion of descriptive methods for data presentation is then provided, followed by an overview of statistical inference and significance analysis, and detailed treatment of specific statistical tests and guidelines for their interpretation. PMID:21200241

  17. Statistical Treatment of Looking-Time Data

    ERIC Educational Resources Information Center

    Csibra, Gergely; Hernik, Mikolaj; Mascaro, Olivier; Tatone, Denis; Lengyel, Máté

    2016-01-01

    Looking times (LTs) are frequently measured in empirical research on infant cognition. We analyzed the statistical distribution of LTs across participants to develop recommendations for their treatment in infancy research. Our analyses focused on a common within-subject experimental design, in which longer looking to novel or unexpected stimuli is…

  18. The Variability Debate: More Statistics, More Linguistics

    ERIC Educational Resources Information Center

    Drai, Dan; Grodzinsky, Yosef

    2006-01-01

    We respond to critical comments and consider alternative statistical and syntactic analyses of our target paper which analyzed comprehension scores of Broca's aphasic patients from multiple sentence types in many languages, and showed that Movement but not Complexity or Mood are factors in the receptive deficit of these patients. Specifically, we…

  19. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects. PMID:24772784

  20. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. R.; St. Clair, T. L.; Burks, H. D.; Stoakley, D. M.

    1987-01-01

    A method has been found for enhancing the melt flow of thermoplastic polyimides during processing. A high molecular weight 422 copoly(amic acid) or copolyimide was fused with approximately 0.05 to 5 pct by weight of a low molecular weight amic acid or imide additive, and this melt was studied by capillary rheometry. Excellent flow and improved composite properties on graphite resulted from the addition of a PMDA-aniline additive to LARC-TPI. Solution viscosity studies imply that amic acid additives temporarily lower molecular weight and, hence, enlarge the processing window. Thus, compositions containing the additive have a lower melt viscosity for a longer time than those unmodified.

  1. Guidelines for Meta-Analyses of Counseling Psychology Research

    ERIC Educational Resources Information Center

    Quintana, Stephen M.; Minami, Takuya

    2006-01-01

    This article conceptually describes the steps in conducting quantitative meta-analyses of counseling psychology research with minimal reliance on statistical formulas. The authors identify sources that describe necessary statistical formula for various meta-analytic calculations and describe recent developments in meta-analytic techniques. The…

  2. Publication bias in dermatology systematic reviews and meta-analyses.

    PubMed

    Atakpo, Paul; Vassar, Matt

    2016-05-01

    Systematic reviews and meta-analyses in dermatology provide high-level evidence for clinicians and policy makers that influence clinical decision making and treatment guidelines. One methodological problem with systematic reviews is the under representation of unpublished studies. This problem is due in part to publication bias. Omission of statistically non-significant data from meta-analyses may result in overestimation of treatment effect sizes which may lead to clinical consequences. Our goal was to assess whether systematic reviewers in dermatology evaluate and report publication bias. Further, we wanted to conduct our own evaluation of publication bias on meta-analyses that failed to do so. Our study considered systematic reviews and meta-analyses from ten dermatology journals from 2006 to 2016. A PubMed search was conducted, and all full-text articles that met our inclusion criteria were retrieved and coded by the primary author. 293 articles were included in our analysis. Additionally, we formally evaluated publication bias in meta-analyses that failed to do so using trim and fill and cumulative meta-analysis by precision methods. Publication bias was mentioned in 107 articles (36.5%) and was formally evaluated in 64 articles (21.8%). Visual inspection of a funnel plot was the most common method of evaluating publication bias. Publication bias was present in 45 articles (15.3%), not present in 57 articles (19.5%) and not determined in 191 articles (65.2%). Using the trim and fill method, 7 meta-analyses (33.33%) showed evidence of publication bias. Although the trim and fill method only found evidence of publication bias in 7 meta-analyses, the cumulative meta-analysis by precision method found evidence of publication bias in 15 meta-analyses (71.4%). Many of the reviews in our study did not mention or evaluate publication bias. Further, of the 42 articles that stated following PRISMA reporting guidelines, 19 (45.2%) evaluated for publication bias. In
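
    The authors evaluated publication bias with trim-and-fill and cumulative meta-analysis by precision. As a simpler illustration of how funnel-plot asymmetry can be quantified, the sketch below runs an Egger-style regression on made-up effect sizes; it is not the procedure or the data used in the study.

```python
import numpy as np
import statsmodels.api as sm

# Made-up study effect sizes (log odds ratios) and standard errors; these are
# illustrative numbers, not data from the dermatology meta-analyses reviewed.
effects = np.array([0.10, 0.25, 0.40, 0.35, 0.60, 0.55, 0.80])
se      = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35])

# Egger-style asymmetry test: regress the standardized effect (z = effect/SE)
# on precision (1/SE); an intercept far from zero suggests funnel asymmetry.
z = effects / se
precision = 1.0 / se
fit = sm.OLS(z, sm.add_constant(precision)).fit()
print(f"Egger intercept = {fit.params[0]:.2f} (p = {fit.pvalues[0]:.3f})")
```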

  3. Penalized likelihood phenotyping: unifying voxelwise analyses and multi-voxel pattern analyses in neuroimaging: penalized likelihood phenotyping.

    PubMed

    Adluru, Nagesh; Hanlon, Bret M; Lutz, Antoine; Lainhart, Janet E; Alexander, Andrew L; Davidson, Richard J

    2013-04-01

    Neuroimage phenotyping for psychiatric and neurological disorders is performed using voxelwise analyses also known as voxel based analyses or morphometry (VBM). A typical voxelwise analysis treats measurements at each voxel (e.g., fractional anisotropy, gray matter probability) as outcome measures to study the effects of possible explanatory variables (e.g., age, group) in a linear regression setting. Furthermore, each voxel is treated independently until the stage of correction for multiple comparisons. Recently, multi-voxel pattern analyses (MVPA), such as classification, have arisen as an alternative to VBM. The main advantage of MVPA over VBM is that the former employ multivariate methods which can account for interactions among voxels in identifying significant patterns. They also provide ways for computer-aided diagnosis and prognosis at individual subject level. However, compared to VBM, the results of MVPA are often more difficult to interpret and prone to arbitrary conclusions. In this paper, first we use penalized likelihood modeling to provide a unified framework for understanding both VBM and MVPA. We then utilize statistical learning theory to provide practical methods for interpreting the results of MVPA beyond commonly used performance metrics, such as leave-one-out-cross validation accuracy and area under the receiver operating characteristic (ROC) curve. Additionally, we demonstrate that there are challenges in MVPA when trying to obtain image phenotyping information in the form of statistical parametric maps (SPMs), which are commonly obtained from VBM, and provide a bootstrap strategy as a potential solution for generating SPMs using MVPA. This technique also allows us to maximize the use of available training data. We illustrate the empirical performance of the proposed framework using two different neuroimaging studies that pose different levels of challenge for classification using MVPA. PMID:23397550
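
    A minimal sketch of the MVPA side of this framework is a penalized (here ridge) logistic regression on voxel features, scored with leave-one-out cross-validation accuracy. The synthetic data, feature counts, and scikit-learn settings below are assumptions for illustration, not the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(2)

# Synthetic "voxel" features: 40 subjects x 500 voxels, with a weak group
# effect injected into the first 20 voxels (illustrative data only).
n_subjects, n_voxels = 40, 500
labels = np.repeat([0, 1], n_subjects // 2)
features = rng.normal(size=(n_subjects, n_voxels))
features[labels == 1, :20] += 0.8

# Penalized-likelihood classifier (ridge-penalized logistic regression),
# evaluated with leave-one-out cross-validation as in typical MVPA reports.
clf = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
accuracy = cross_val_score(clf, features, labels, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {accuracy:.2f}")
```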

  4. Statistical Perspectives on Stratospheric Transport

    NASA Technical Reports Server (NTRS)

    Sparling, L. C.

    1999-01-01

    Long-lived tropospheric source gases, such as nitrous oxide, enter the stratosphere through the tropical tropopause, are transported throughout the stratosphere by the Brewer-Dobson circulation, and are photochemically destroyed in the upper stratosphere. These chemical constituents, or "tracers" can be used to track mixing and transport by the stratospheric winds. Much of our understanding about the stratospheric circulation is based on large scale gradients and other spatial features in tracer fields constructed from satellite measurements. The point of view presented in this paper is different, but complementary, in that transport is described in terms of tracer probability distribution functions (PDFs). The PDF is computed from the measurements, and is proportional to the area occupied by tracer values in a given range. The flavor of this paper is tutorial, and the ideas are illustrated with several examples of transport-related phenomena, annotated with remarks that summarize the main point or suggest new directions. One example shows how the multimodal shape of the PDF gives information about the different branches of the circulation. Another example shows how the statistics of fluctuations from the most probable tracer value give insight into mixing between different regions of the atmosphere. Also included is an analysis of the time-dependence of the PDF during the onset and decline of the winter circulation, and a study of how "bursts" in the circulation are reflected in transient periods of rapid evolution of the PDF. The dependence of the statistics on location and time are also shown to be important for practical problems related to statistical robustness and satellite sampling. The examples illustrate how physically-based statistical analysis can shed some light on aspects of stratospheric transport that may not be obvious or quantifiable with other types of analyses. An important motivation for the work presented here is the need for synthesis of the

  5. Statistics at a glance.

    PubMed

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it was the start of pursuing an understanding of statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in readily understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of this seminar. It is a summary and a transcription of the best pages I have detected. PMID:21302664

  6. Latest statistics on cardiovascular disease in Australia.

    PubMed

    Waters, Anne-Marie; Trinh, Lany; Chau, Theresa; Bourchier, Michael; Moon, Lynelle

    2013-06-01

    The results presented herein summarize the most up-to-date cardiovascular statistics available at this time in Australia. The analysis presented here is based on and extends results published in two Australian Institute of Health and Welfare (AIHW) reports, namely Cardiovascular disease: Australian facts 2011 and the cardiovascular disease (CVD) section of Australia's Health 2012. Despite significant improvements in the cardiovascular health of Australians in recent decades, CVD continues to impose a heavy burden on Australians in terms of illness, disability and premature death. Direct health care expenditure for CVD exceeds that for any other disease group. The most recent national data have been analysed to describe patterns and trends in CVD hospitalization and death rates, with additional analysis by Indigenous status, remoteness and socioeconomic group. The incidence of and case-fatality from major coronary events has also been examined. Although CVD death rates have declined steadily in Australia since the late 1960s, CVD still accounts for a larger proportion of deaths (33% in 2009) than any other disease group. Worryingly, the rate at which the coronary heart disease death rate has been falling in recent years has slowed in younger (35-54 years) age groups. Between 1998-99 and 2009-10, the overall rate of hospitalizations for CVD fell by 13%, with declines observed for most major CVDs. In conclusion, CVD disease remains a significant health problem in Australia despite decreasing death and hospitalization rates. PMID:23517328

  7. Statistics of football dynamics

    NASA Astrophysics Data System (ADS)

    Mendes, R. S.; Malacarne, L. C.; Anteneodo, C.

    2007-06-01

    We investigate the dynamics of football matches. Our goal is to characterize statistically the temporal sequence of ball movements in this collective sport game, searching for traits of complex behavior. Data were collected over a variety of matches in South American, European and World championships throughout 2005 and 2006. We show that the statistics of ball touches presents power-law tails and can be described by q-gamma distributions. To explain such behavior we propose a model that provides information on the characteristics of football dynamics. Furthermore, we discuss the statistics of duration of out-of-play intervals, not directly related to the previous scenario.

  8. A statistical anomaly indicates symbiotic origins of eukaryotic membranes

    PubMed Central

    Bansal, Suneyna; Mittal, Aditya

    2015-01-01

    Compositional analyses of nucleic acids and proteins have shed light on possible origins of living cells. In this work, rigorous compositional analyses of ∼5000 plasma membrane lipid constituents of 273 species in the three life domains (archaea, eubacteria, and eukaryotes) revealed a remarkable statistical paradox, indicating symbiotic origins of eukaryotic cells involving eubacteria. For lipids common to plasma membranes of the three domains, the number of carbon atoms in eubacteria was found to be similar to that in eukaryotes. However, mutually exclusive subsets of same data show exactly the opposite—the number of carbon atoms in lipids of eukaryotes was higher than in eubacteria. This statistical paradox, called Simpson's paradox, was absent for lipids in archaea and for lipids not common to plasma membranes of the three domains. This indicates the presence of interaction(s) and/or association(s) in lipids forming plasma membranes of eubacteria and eukaryotes but not for those in archaea. Further inspection of membrane lipid structures affecting physicochemical properties of plasma membranes provides the first evidence (to our knowledge) on the symbiotic origins of eukaryotic cells based on the “third front” (i.e., lipids) in addition to the growing compositional data from nucleic acids and proteins. PMID:25631820
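
    Simpson's paradox of the kind reported here, where a trend present within each subset reverses on pooling, can be reproduced with a tiny numerical example. The carbon-count values below are invented purely to illustrate the reversal; they are not the study's lipid data.

```python
import numpy as np

# Made-up carbon-atom counts for two lipid subsets in two domains; the
# numbers only illustrate Simpson's paradox, they are not the study's data.
# Within each subset, eukaryotes have MORE carbons on average than eubacteria ...
eubacteria = {"subset_A": np.array([16, 16, 18]),       # mean ~16.7
              "subset_B": np.array([30, 32, 34])}       # mean 32.0
eukaryotes = {"subset_A": np.array([18, 18, 20] * 5),   # mean ~18.7
              "subset_B": np.array([34, 36, 38])}       # mean 36.0

for subset in ("subset_A", "subset_B"):
    print(subset, eubacteria[subset].mean(), "<", eukaryotes[subset].mean())

# ... yet pooling the subsets reverses the ordering, because the two groups
# sample the subsets in very different proportions.
pooled_eubacteria = np.concatenate(list(eubacteria.values()))
pooled_eukaryotes = np.concatenate(list(eukaryotes.values()))
print("pooled:", pooled_eubacteria.mean(), ">", pooled_eukaryotes.mean())
```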

  9. Additive usage levels.

    PubMed

    Langlais, R

    1996-01-01

    With the adoption of the European Parliament and Council Directives on sweeteners, colours and miscellaneous additives, the Commission is now embarking on the project of coordinating the activities of the European Union Member States in the collection of the data that are to make up the report on food additive intake requested by the European Parliament. This presentation looks at the inventory of available sources on additive use levels and concludes that for the time being national legislation is still the best source of information considering that the directives have yet to be transposed into national legislation. Furthermore, this presentation covers the correlation of the food categories as found in the additives directives with those used by national consumption surveys and finds that in a number of instances this correlation still leaves a lot to be desired. The intake of additives via food ingestion and the intake of substances which are chemically identical to additives but which occur naturally in fruits and vegetables is found in a number of cases to be higher than the intake of additives added during the manufacture of foodstuffs. While the difficulties in contributing to the compilation of food additive intake data are recognized, industry as a whole, i.e. the food manufacturing and food additive manufacturing industries, is confident that in a concerted effort, use data on food additives by industry can be made available. Lastly, the paper points out that several years will still go by before the additives directives are transposed into national legislation and the food industry is able to make use of the new food legislative environment; food additive use data from the food industry will thus have to be reviewed at the beginning of the next century. PMID:8792135

  10. Consumption Patterns and Perception Analyses of Hangwa

    PubMed Central

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-01-01

    Hangwa is a traditional food, corresponding to the current consumption trend, in need of marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers’ consumption patterns and perception of Hangwa to increase consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009 and the data from 231 persons were analyzed in this study. Statistical, descriptive, paired samples t-test, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly ‘for present’ (39.8%) and the main reasons for buying it were ‘traditional image’ (33.3%) and ‘taste’ (22.5%). When importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant with a high importance and a high performance were ‘a sanitary process’, ‘a rigorous quality mark’ and ‘taste’, which were related with quality of the products. In addition, those with a high importance but a low performance were ‘popularization through advertisement’, ‘promotion through mass media’, ‘conversion of thought on traditional foods’, ‘a reasonable price’ and ‘a wide range of price’. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote its consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price. PMID:24471065

  11. An additional middle cuneiform?

    PubMed Central

    Brookes-Fazakerley, S.D.; Jackson, G.E.; Platt, S.R.

    2015-01-01

    Additional cuneiform bones of the foot have been described in reference to the medial bipartite cuneiform or as small accessory ossicles. An additional middle cuneiform has not been previously documented. We present the case of a patient with an additional ossicle that has the appearance and location of an additional middle cuneiform. Recognizing such an anatomical anomaly is essential for ruling out second metatarsal base or middle cuneiform fractures and for the preoperative planning of arthrodesis or open reduction and internal fixation procedures in this anatomical location. PMID:26224890

  12. Playing at Statistical Mechanics

    ERIC Educational Resources Information Center

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  13. Cooperative Learning in Statistics.

    ERIC Educational Resources Information Center

    Keeler, Carolyn M.; And Others

    1994-01-01

    Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)

  14. Understanding Solar Flare Statistics

    NASA Astrophysics Data System (ADS)

    Wheatland, M. S.

    2005-12-01

    A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
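
    The flare size distribution discussed above is a power law, and for a constant-rate Poisson process the waiting times are exponential. The sketch below simulates both and recovers the power-law index by maximum likelihood (the Hill estimator); all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate flare "sizes" from a power-law distribution p(s) ~ s^(-alpha) above
# a threshold s_min, via inverse-transform sampling (illustrative values only).
alpha, s_min, n = 1.8, 1.0, 5000
sizes = s_min * (1.0 - rng.random(n)) ** (-1.0 / (alpha - 1.0))

# Maximum-likelihood estimate of the power-law index (Hill estimator).
alpha_hat = 1.0 + n / np.sum(np.log(sizes / s_min))
print(f"true alpha = {alpha}, estimated alpha = {alpha_hat:.2f}")

# Waiting times of a constant-rate Poisson process are exponential; time
# variation of the flaring rate makes the observed distribution heavier-tailed.
rate_per_day = 2.0
waiting_times = rng.exponential(1.0 / rate_per_day, size=n)
```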

  15. Plague Maps and Statistics

    MedlinePlus

    Plague in the United States: Plague was first introduced ... per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, ...

  16. Elements of Statistics

    NASA Astrophysics Data System (ADS)

    Grégoire, G.

    2016-05-01

    This chapter is devoted to two objectives. The first one is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the given courses. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from courses given in the Astrostatistical School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, given the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors and Gaussian mixture models.

  17. Tuberculosis Data and Statistics

    MedlinePlus

    Data and Statistics ... United States publication. PDF [6 MB]. Interactive TB Data Tool: Online Tuberculosis Information System (OTIS). OTIS is ...

  18. Statistics of the sagas

    NASA Astrophysics Data System (ADS)

    Richfield, Jon; bookfeller

    2016-07-01

    In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.

  19. Brain Tumor Statistics

    MedlinePlus

    ... facts and statistics here include brain and central nervous system tumors (including spinal cord, pituitary and pineal gland ... U.S. living with a primary brain and central nervous system tumor. This year, nearly 17,000 people will ...

  20. Purposeful Statistical Investigations

    ERIC Educational Resources Information Center

    Day, Lorraine

    2014-01-01

    Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations link mathematics to students' lives and provide engaging and meaningful contexts for mathematical inquiry.

  1. XS: An Analysis and Synthesis System for Linear Regression Constructed by Integrating a Graphical Statistical System, a Relational Database System and an Expert System Shell

    PubMed Central

    Johannes, R.S.; Brown, C. Hendricks; Onstad, Lynn E.

    1989-01-01

    This paper introduces an analysis and synthesis system (XS) which aids users in performing statistical analyses. In any large study, the dataset itself grows and changes dramatically over its life-course. Important datasets are often analyzed by many people over extended periods of time. Effective analysis of these large datasets depends in large part on integrating past inferences and analytical decisions into current analyses. XS provides statistical expertise to answer current problems, but it also makes the results of past analyses available for potential integration and consistency checking. In addition, XS permits the integration of knowledge outside the confines of the dataset with statistical results and user input in order to make analytical decisions.

  2. Statistical process control

    SciTech Connect

    Oakland, J.S.

    1986-01-01

    Addressing the increasing importance for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and control of conformance and consistency during production are given. Offers clear guidance to those who wish to understand and implement modern SPC techniques.

  3. Statistical Physics of Particles

    NASA Astrophysics Data System (ADS)

    Kardar, Mehran

    2006-06-01

    Statistical physics has its origins in attempts to describe the thermal properties of matter in terms of its constituent particles, and has played a fundamental role in the development of quantum mechanics. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook introduces the central concepts and tools of statistical physics. It contains a chapter on probability and related issues such as the central limit theorem and information theory, and covers interacting particles, with an extensive description of the van der Waals equation and its derivation by mean field approximation. It also contains an integrated set of problems, with solutions to selected problems at the end of the book. It will be invaluable for graduate and advanced undergraduate courses in statistical physics. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873420. Based on lecture notes from a course on Statistical Mechanics taught by the author at MIT; contains 89 exercises, with solutions to selected problems; contains chapters on probability and interacting particles; ideal for graduate courses in Statistical Mechanics.

  4. Statistical Physics of Fields

    NASA Astrophysics Data System (ADS)

    Kardar, Mehran

    2006-06-01

    While many scientists are familiar with fractals, fewer are familiar with the concepts of scale-invariance and universality which underlie the ubiquity of their shapes. These properties may emerge from the collective behaviour of simple fundamental constituents, and are studied using statistical field theories. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook demonstrates how such theories are formulated and studied. Perturbation theory, exact solutions, renormalization groups, and other tools are employed to demonstrate the emergence of scale invariance and universality, and the non-equilibrium dynamics of interfaces and directed paths in random media are discussed. Ideal for advanced graduate courses in statistical physics, it contains an integrated set of problems, with solutions to selected problems at the end of the book. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873413. Based on lecture notes from a course on Statistical Mechanics taught by the author at MIT; contains 65 exercises, with solutions to selected problems; features a thorough introduction to the methods of statistical field theory; ideal for graduate courses in Statistical Physics.

  5. Statistical Analysis of Single-Trial Granger Causality Spectra

    PubMed Central

    Brovelli, Andrea

    2012-01-01

    Granger causality analysis is becoming central for the analysis of interactions between neural populations and oscillatory networks. However, it is currently unclear whether single-trial estimates of Granger causality spectra can be used reliably to assess directional influence. We addressed this issue by combining single-trial Granger causality spectra with statistical inference based on general linear models. The approach was assessed on synthetic and neurophysiological data. Synthetic bivariate data was generated using two autoregressive processes with unidirectional coupling. We simulated two hypothetical experimental conditions: the first mimicked a constant and unidirectional coupling, whereas the second modelled a linear increase in coupling across trials. The statistical analysis of single-trial Granger causality spectra, based on t-tests and linear regression, successfully recovered the underlying pattern of directional influence. In addition, we characterised the minimum number of trials and coupling strengths required for significant detection of directionality. Finally, we demonstrated the relevance for neurophysiology by analysing two local field potentials (LFPs) simultaneously recorded from the prefrontal and premotor cortices of a macaque monkey performing a conditional visuomotor task. Our results suggest that the combination of single-trial Granger causality spectra and statistical inference provides a valuable tool for the analysis of large-scale cortical networks and brain connectivity. PMID:22649482
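
    The statistical-inference step of the approach, t-tests and linear regression applied across single-trial causality estimates, can be sketched as below. The single-trial Granger causality values are simulated for illustration; the spectral estimation itself is not shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Made-up single-trial Granger causality values in one frequency band for two
# conditions (purely illustrative; the spectral estimation step is omitted).
n_trials = 60
gc_condition_1 = rng.gamma(shape=2.0, scale=0.05, size=n_trials)   # constant coupling
gc_condition_2 = (rng.gamma(shape=2.0, scale=0.05, size=n_trials)
                  + 0.002 * np.arange(n_trials))                   # coupling grows over trials

# t-test: is the mean causality larger in condition 2 than in condition 1?
t_stat, p_two_sided = stats.ttest_ind(gc_condition_2, gc_condition_1)

# Linear regression: does causality increase linearly across trials?
slope, intercept, r_value, p_slope, stderr = stats.linregress(
    np.arange(n_trials), gc_condition_2)
print(f"t = {t_stat:.2f} (p = {p_two_sided:.3f}); slope p = {p_slope:.3g}")
```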

  6. Primarily Statistics: Developing an Introductory Statistics Course for Pre-Service Elementary Teachers

    ERIC Educational Resources Information Center

    Green, Jennifer L.; Blankenship, Erin E.

    2013-01-01

    We developed an introductory statistics course for pre-service elementary teachers. In this paper, we describe the goals and structure of the course, as well as the assessments we implemented. Additionally, we use example course work to demonstrate pre-service teachers' progress both in learning statistics and as novice teachers. Overall, the…

  7. Carbamate deposit control additives

    SciTech Connect

    Honnen, L.R.; Lewis, R.A.

    1980-11-25

    Deposit control additives for internal combustion engines are provided which maintain cleanliness of intake systems without contributing to combustion chamber deposits. The additives are poly(oxyalkylene) carbamates comprising a hydrocarbyloxy-terminated poly(oxyalkylene) chain of 2-5 carbon oxyalkylene units bonded through an oxycarbonyl group to a nitrogen atom of ethylenediamine.

  8. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    Model calculations and analyses have been carried out to compare with several sets of data (dose, induced radioactivity in various experiment samples and spacecraft components, fission foil measurements, and LET spectra) from passive radiation dosimetry on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The calculations and data comparisons are used to estimate the accuracy of current models and methods for predicting the ionizing radiation environment in low earth orbit. The emphasis is on checking the accuracy of trapped proton flux and anisotropy models.

  9. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
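
    One common way to satisfy a 95%/95% criterion of this kind, though not necessarily the method recommended in this report, is non-parametric order statistics via Wilks' formula, which fixes the number of best-estimate runs needed. A minimal sketch:

```python
# Minimum number of runs n such that the sample maximum is a one-sided 95%/95%
# tolerance bound: require 1 - 0.95**n >= 0.95 (Wilks' formula, first order).
coverage, confidence = 0.95, 0.95

n = 1
while 1.0 - coverage ** n < confidence:
    n += 1
print(f"first-order one-sided 95/95 tolerance limit needs n = {n} runs")  # n = 59
```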

  10. Review of statistical methods used in enhanced-oil-recovery research and performance prediction. [131 references

    SciTech Connect

    Selvidge, J.E.

    1982-06-01

    Recent literature in the field of enhanced oil recovery (EOR) was surveyed to determine the extent to which researchers in EOR take advantage of statistical techniques in analyzing their data. In addition to determining the current level of reliance on statistical tools, another objective of this study is to promote by example the greater use of these tools. To serve this objective, the discussion of the techniques highlights the observed trend toward the use of increasingly more sophisticated methods and points out the strengths and pitfalls of different approaches. Several examples are also given of opportunities for extending EOR research findings by additional statistical manipulation. The search of the EOR literature, conducted mainly through computerized data bases, yielded nearly 200 articles containing mathematical analysis of the research. Of these, 21 were found to include examples of statistical approaches to data analysis and are discussed in detail in this review. The use of statistical techniques, as might be expected from their general purpose nature, extends across nearly all types of EOR research covering thermal methods of recovery, miscible processes, and micellar polymer floods. Data come from field tests, the laboratory, and computer simulation. The statistical methods range from simple comparisons of mean values to multiple non-linear regression equations and to probabilistic decision functions. The methods are applied to both engineering and economic data. The results of the survey are grouped by statistical technique and include brief descriptions of each of the 21 relevant papers. Complete abstracts of the papers are included in the bibliography. Brief bibliographic information (without abstracts) is also given for the articles identified in the initial search as containing mathematical analyses using other than statistical methods.

  11. NASA Pocket Statistics: 1997 Edition

    NASA Technical Reports Server (NTRS)

    1997-01-01

    POCKET STATISTICS is published by the NATIONAL AERONAUTICS AND SPACE ADMINISTRATION (NASA). Included in each edition is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, Aeronautics and Space Transportation and NASA Procurement, Financial and Workforce data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. All Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  12. Nonstationary statistical theory for multipactor

    SciTech Connect

    Anza, S.; Vicente, C.; Gil, J.

    2010-06-15

    This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.

  13. Statistical properties of convex clustering

    PubMed Central

    Tan, Kean Ming; Witten, Daniela

    2016-01-01

    In this manuscript, we study the statistical properties of convex clustering. We establish that convex clustering is closely related to single linkage hierarchical clustering and k-means clustering. In addition, we derive the range of the tuning parameter for convex clustering that yields a non-trivial solution. We also provide an unbiased estimator of the degrees of freedom, and provide a finite sample bound for the prediction error for convex clustering. We compare convex clustering to some traditional clustering methods in simulation studies.
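
    For reference, the convex clustering problem studied in this manuscript is usually written as a fused-penalty least-squares problem; the standard formulation below uses my notation, not necessarily the authors'.

```latex
% Convex clustering: one centroid u_i per observation x_i, fused by a pairwise
% penalty with weights w_ij. At gamma = 0 each observation keeps its own
% centroid; for sufficiently large gamma all centroids fuse into one cluster,
% which is why the non-trivial range of the tuning parameter matters.
\min_{u_1,\dots,u_n \in \mathbb{R}^{p}} \;
\frac{1}{2}\sum_{i=1}^{n} \lVert x_i - u_i \rVert_2^{2}
\; + \; \gamma \sum_{i<j} w_{ij}\, \lVert u_i - u_j \rVert_2
```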

  14. Broadband rotor noise analyses

    NASA Technical Reports Server (NTRS)

    George, A. R.; Chou, S. T.

    1984-01-01

    The various mechanisms which generate broadband noise on a range of rotors studied include load fluctuations due to inflow turbulence, due to turbulent boundary layers passing the blades' trailing edges, and due to tip vortex formation. Existing analyses are used and extensions to them are developed to make more accurate predictions of rotor noise spectra and to determine which mechanisms are important in which circumstances. Calculations based on the various prediction methods were compared with existing experiments. The present analyses are adequate to predict the spectra from a wide variety of experiments on fans, full scale and model scale helicopter rotors, wind turbines, and propellers to within about 5 to 10 dB. Better knowledge of the inflow turbulence improves the accuracy of the predictions. Results indicate that inflow turbulence noise depends strongly on ambient conditions and dominates at low frequencies. Trailing edge noise and tip vortex noise are important at higher frequencies if inflow turbulence is weak. Boundary layer trailing edge noise, important for large-sized rotors, increases slowly with angle of attack but not as rapidly as tip vortex noise.

  15. Broadband rotor noise analyses

    NASA Astrophysics Data System (ADS)

    George, A. R.; Chou, S. T.

    1984-04-01

    The various mechanisms which generate broadband noise on a range of rotors studied include load fluctuations due to inflow turbulence, due to turbulent boundary layers passing the blades' trailing edges, and due to tip vortex formation. Existing analyses are used and extensions to them are developed to make more accurate predictions of rotor noise spectra and to determine which mechanisms are important in which circumstances. Calculations based on the various prediction methods were compared with existing experiments. The present analyses are adequate to predict the spectra from a wide variety of experiments on fans, full scale and model scale helicopter rotors, wind turbines, and propellers to within about 5 to 10 dB. Better knowledge of the inflow turbulence improves the accuracy of the predictions. Results indicate that inflow turbulence noise depends strongly on ambient conditions and dominates at low frequencies. Trailing edge noise and tip vortex noise are important at higher frequencies if inflow turbulence is weak. Boundary layer trailing edge noise, important for large-sized rotors, increases slowly with angle of attack but not as rapidly as tip vortex noise.

  16. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  17. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  18. Using scientifically and statistically sufficient statistics in comparing image segmentations.

    PubMed

    Chi, Yueh-Yun; Muller, Keith E

    2010-01-01

    Automatic computer segmentation in three dimensions creates an opportunity to reduce the cost of three-dimensional treatment planning of radiotherapy for cancer treatment. Comparisons between human and computer accuracy in segmenting kidneys in CT scans generate distance values far larger in number than the number of CT scans. Such high dimension, low sample size (HDLSS) data present a grand challenge to statisticians: how do we find good estimates and make credible inference? We recommend discovering and using scientifically and statistically sufficient statistics as an additional strategy for overcoming the curse of dimensionality. First, we reduced the three-dimensional array of distances for each image comparison to a histogram to be modeled individually. Second, we used non-parametric kernel density estimation to explore distributional patterns and assess multi-modality. Third, a systematic exploratory search for parametric distributions and truncated variations led to choosing a Gaussian form as approximating the distribution of a cube root transformation of distance. Fourth, representing each histogram by an individually estimated distribution eliminated the HDLSS problem by reducing on average 26,000 distances per histogram to just 2 parameter estimates. In the fifth and final step we used classical statistical methods to demonstrate that the two human observers disagreed significantly less with each other than with the computer segmentation. Nevertheless, the size of all disagreements was clinically unimportant relative to the size of a kidney. The hierarchical modeling approach to object-oriented data created response variables deemed sufficient by both the scientists and statisticians. We believe the same strategy provides a useful addition to the imaging toolkit and will succeed with many other high-throughput technologies in genetics, metabolomics and chemical analysis. PMID:24967000
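
    The five-step workflow above lends itself to a short numerical sketch. The snippet below is only an illustration under assumed data (gamma-distributed stand-ins for the roughly 26,000 distances of one image comparison): it applies the cube-root transformation, checks the shape with a kernel density estimate, and then summarizes the whole histogram by two Gaussian parameter estimates.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    distances = rng.gamma(shape=2.0, scale=1.5, size=26_000)  # stand-in for one comparison

    transformed = np.cbrt(distances)  # cube-root transformation of distance

    # Exploratory step: non-parametric kernel density estimate of the transformed values.
    kde = stats.gaussian_kde(transformed)
    grid = np.linspace(transformed.min(), transformed.max(), 200)
    density = kde(grid)

    # Sufficient-statistic step: ~26,000 distances reduced to 2 parameter estimates.
    mu_hat, sigma_hat = transformed.mean(), transformed.std(ddof=1)
    print(f"mean={mu_hat:.3f}, sd={sigma_hat:.3f}, density peak={density.max():.3f}")
    ```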

  19. Statistical mechanics of economics I

    NASA Astrophysics Data System (ADS)

    Kusmartsev, F. V.

    2011-02-01

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied these relations to a study of the evolution of the economy in the USA between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by a Bose-Einstein distribution whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates past crises and predicts the future one.
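
    As a rough illustration of fitting a Bose-Einstein-type curve to binned income data, the sketch below uses scipy's curve_fit on synthetic data. The functional form n(x) = a / (exp((x - mu)/T) - 1) and all parameter names and values are assumptions for demonstration, not the paper's fitted results.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def bose_einstein(x, a, mu, T):
        # Bose-Einstein-type occupation curve with an overall scale a.
        return a / (np.exp((x - mu) / T) - 1.0)

    # Synthetic income bins (thousands of dollars) and relative frequencies.
    income = np.linspace(10, 200, 20)
    clean = bose_einstein(income, a=5.0, mu=5.0, T=40.0)
    freq = clean * (1 + 0.05 * np.random.default_rng(2).normal(size=income.size))

    params, _ = curve_fit(bose_einstein, income, freq, p0=(1.0, 0.0, 50.0),
                          bounds=([0.0, -50.0, 1.0], [100.0, 9.0, 500.0]))
    print("fitted a, mu, T:", np.round(params, 2))
    ```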

  20. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    PubMed Central

    West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817
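
    A hedged illustration of the core pitfall described above: ignoring sampling weights when one group has been oversampled. The data and weights below are synthetic and the variance formula is a crude with-replacement approximation; this is not the SESTAT analysis itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000
    # Suppose engineers were oversampled: half the sample, but only 20% of the population.
    is_engineer = rng.random(n) < 0.5
    salary = np.where(is_engineer, rng.normal(95, 10, n), rng.normal(60, 10, n))
    weight = np.where(is_engineer, 0.2 / 0.5, 0.8 / 0.5)  # population share / sample share

    unweighted_mean = salary.mean()
    weighted_mean = np.average(salary, weights=weight)

    # Crude design-based standard error for the weighted mean (with-replacement approximation).
    resid = weight * (salary - weighted_mean)
    se = np.sqrt(np.sum(resid**2)) / np.sum(weight)
    print(f"unweighted={unweighted_mean:.1f}, weighted={weighted_mean:.1f} (SE~{se:.2f})")
    ```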

  1. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia).

    PubMed

    Caneva, G; Bartoli, F; Savo, V; Futagami, Y; Strona, G

    2016-01-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective. PMID:27597658

  2. Combining Statistical Tools and Ecological Assessments in the Study of Biodeterioration Patterns of Stone Temples in Angkor (Cambodia)

    PubMed Central

    Caneva, G.; Bartoli, F.; Savo, V.; Futagami, Y.; Strona, G.

    2016-01-01

    Biodeterioration is a major problem for the conservation of cultural heritage materials. We provide a new and original approach to analyzing changes in patterns of colonization (Biodeterioration patterns, BPs) by biological agents responsible for the deterioration of outdoor stone materials. Here we analyzed BPs of four Khmer temples in Angkor (Cambodia) exposed to variable environmental conditions, using qualitative ecological assessments and statistical approaches. The statistical analyses supported the findings obtained with the qualitative approach. Both approaches provided additional information not otherwise available using one single method. Our results indicate that studies on biodeterioration can benefit from integrating diverse methods so that conservation efforts might become more precise and effective. PMID:27597658

  3. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  4. Statistical Physics of Fracture

    SciTech Connect

    Alava, Mikko; Nukala, Phani K; Zapperi, Stefano

    2006-05-01

    Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subjected to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.

  5. Statistical Downscaling: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Walton, D.; Hall, A. D.; Sun, F.

    2013-12-01

    In this study, we examine ways to improve statistical downscaling of general circulation model (GCM) output. Why do we downscale GCM output? GCMs have low resolution, so they cannot represent local dynamics and topographic effects that cause spatial heterogeneity in the regional climate change signal. Statistical downscaling recovers fine-scale information by utilizing relationships between the large-scale and fine-scale signals to bridge this gap. In theory, the downscaled climate change signal is more credible and accurate than its GCM counterpart, but in practice, there may be little improvement. Here, we tackle the practical problems that arise in statistical downscaling, using temperature change over the Los Angeles region as a test case. This region is an ideal place to apply downscaling since its complex topography and shoreline are poorly simulated by GCMs. By comparing two popular statistical downscaling methods and one dynamical downscaling method, we identify issues with statistically downscaled climate change signals and develop ways to fix them. We focus on scale mismatch, domain of influence, and other problems - many of which users may be unaware of - and discuss practical solutions.

  6. Smog control fuel additives

    SciTech Connect

    Lundby, W.

    1993-06-29

    A method is described of controlling, reducing or eliminating, ozone and related smog resulting from photochemical reactions between ozone and automotive or industrial gases comprising the addition of iodine or compounds of iodine to hydrocarbon-base fuels prior to or during combustion in an amount of about 1 part iodine per 240 to 10,000,000 parts fuel, by weight, to be accomplished by: (a) the addition of these inhibitors during or after the refining or manufacturing process of liquid fuels; (b) the production of these inhibitors for addition into fuel tanks, such as automotive or industrial tanks; or (c) the addition of these inhibitors into combustion chambers of equipment utilizing solid fuels for the purpose of reducing ozone.

  7. Food Additives and Hyperkinesis

    ERIC Educational Resources Information Center

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  8. Additional Types of Neuropathy

    MedlinePlus

    Charcot's Joint, also called neuropathic arthropathy, ... can stop bone destruction and aid healing. Cranial neuropathy affects the 12 pairs of nerves ...

  9. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    PubMed

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, as well as normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis. PMID:26585142
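
    The decision rules in the chapter can be illustrated with a few generic scipy calls (the data below are simulated and purely illustrative): a t test for roughly normal data, a Mann-Whitney test or a log transform for skewed data, and an odds ratio with Fisher's exact test for a case-control 2 × 2 table.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    control = rng.normal(10, 2, 12)
    treated = rng.normal(13, 2, 12)
    print("t test:", stats.ttest_ind(control, treated))

    skewed_a = rng.lognormal(0.0, 1.0, 12)
    skewed_b = rng.lognormal(0.5, 1.0, 12)
    print("Mann-Whitney:", stats.mannwhitneyu(skewed_a, skewed_b))
    # A log transform may instead normalize lognormal data, permitting a t test:
    print("t test on logs:", stats.ttest_ind(np.log(skewed_a), np.log(skewed_b)))

    # Odds ratio for a case-control 2x2 table: [[exposed cases, unexposed cases],
    #                                           [exposed controls, unexposed controls]]
    table = np.array([[30, 70], [15, 85]])
    odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
    print("odds ratio:", odds_ratio, "Fisher exact:", stats.fisher_exact(table))
    ```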

  10. Statistical properties of a quantum cellular automaton

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Inokuchi, Shuichi; Mizoguchi, Yoshihiro; Konno, Norio

    2005-09-01

    We study a quantum cellular automaton (QCA) whose time evolution is defined using the global transition function of a classical cellular automaton (CA). In order to investigate natural transformations from CAs to QCAs, the present QCA includes the CA with Wolfram’s rules 150 and 105 as special cases. We first compute the time evolution of the QCA and examine its statistical properties. As a basic statistical value, the probability of finding an active cell averaged over spatial-temporal space is introduced, and the difference between the CA and QCA is considered. In addition, it is shown that statistical properties in QCAs are related to the classical trajectory in configuration space.
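
    On the classical side, the averaged probability of finding an active cell is easy to reproduce; the sketch below evolves Wolfram's rule 150 (next state = left XOR centre XOR right) from a random initial condition, which is an assumption made only for illustration. The quantum evolution studied in the paper requires propagating amplitudes over configurations and is not attempted here.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_cells, n_steps = 64, 200
    state = rng.integers(0, 2, n_cells)

    densities = []
    for _ in range(n_steps):
        densities.append(state.mean())
        # Rule 150 with periodic boundary conditions.
        state = np.roll(state, 1) ^ state ^ np.roll(state, -1)

    print("average active-cell probability:", np.mean(densities))
    ```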

  11. Beyond statistical descriptions of variability

    NASA Astrophysics Data System (ADS)

    Graham, Matthew; Catalina Real-time Transient Survey Team

    2016-01-01

    The first generation of large synoptic survey archives, such as CRTS, PTF and Pan-STARRs, are now (or soon will be) available to the community, enabling unprecedented systematic searches and studies of variable astrophysical phenomena. These range from moving objects in the Solar System to extreme quasars in the distant universe. However, much of the analyses of these data sets conducted so far have aimed at providing statistical descriptions of the variability. Whilst such parameterizations are useful for feeding classification algorithms, they are not effective at describing the underlying type of variability in the sources or the physical mechanism(s) for it. In this talk, we will discuss new approaches, such as wavelet variance, random matrix theory and echo state networks, that can provide insight into the science of variability rather than just statistically characterizing it. We will pay particular attention to sources exhibiting stochastic variation and how much information about the host system can be determined from their time series. For example, characteristic restframe timescales have been identified in quasars, potentially related to the size of coherent noise fields in the accretion disk. Finally, we will also consider the potential limitations of the next generation surveys, such as LSST and SKA.

  12. EEG analyses with SOBI.

    SciTech Connect

    Glickman, Matthew R.; Tang, Akaysha

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
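
    SOBI itself jointly diagonalizes lagged covariance matrices over many lags; the sketch below shows the simpler single-lag relative (AMUSE-style whitening followed by diagonalization of one symmetrized lagged covariance) on synthetic two-channel data. It is meant only to convey the second-order idea, not to reproduce the project's analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    t = np.arange(0, 10, 0.01)
    sources = np.vstack([np.sin(2 * np.pi * 1.5 * t),            # slow rhythm
                         np.sign(np.sin(2 * np.pi * 4.0 * t))])  # faster square wave
    mixing = rng.normal(size=(2, 2))
    X = mixing @ sources + 0.05 * rng.normal(size=(2, t.size))

    # Step 1: whiten the zero-mean data.
    Xc = X - X.mean(axis=1, keepdims=True)
    cov = Xc @ Xc.T / Xc.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)
    whitener = eigvecs @ np.diag(eigvals**-0.5) @ eigvecs.T
    Z = whitener @ Xc

    # Step 2: diagonalize a symmetrized lagged covariance of the whitened data.
    lag = 5
    C_lag = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)
    C_sym = 0.5 * (C_lag + C_lag.T)
    _, rotation = np.linalg.eigh(C_sym)
    recovered = rotation.T @ Z   # estimated sources, up to order, scale, and sign
    print(recovered.shape)
    ```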

  13. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    This report covers work performed by Science Applications International Corporation (SAIC) under contract NAS8-39386 from the NASA Marshall Space Flight Center entitled LDEF Satellite Radiation Analyses. The basic objective of the study was to evaluate the accuracy of present models and computational methods for defining the ionizing radiation environment for spacecraft in Low Earth Orbit (LEO) by making comparisons with radiation measurements made on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The emphasis of the work here is on predictions and comparisons with LDEF measurements of induced radioactivity and Linear Energy Transfer (LET) measurements. These model/data comparisons have been used to evaluate the accuracy of current models for predicting the flux and directionality of trapped protons for LEO missions.

  14. Network Class Superposition Analyses

    PubMed Central

    Pearson, Carl A. B.; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., for the yeast cell cycle process [1]), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix , which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with . We show how to generate Derrida plots based on . We show that -based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on . We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for , for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141

  15. Equations for estimating selected streamflow statistics in Rhode Island

    USGS Publications Warehouse

    Bent, Gardner C.; Steeves, Peter A.; Waite, Andrew M.

    2014-01-01

    Regional regression equations were developed for estimating selected natural—unaffected by alteration—streamflows of specific flow durations and low-flow frequency statistics for ungaged stream sites in Rhode Island. Selected at-site streamflow statistics are provided for 41 long-term streamgages, 21 short-term streamgages, and 135 partial-record stations in Rhode Island, eastern Connecticut, and southeastern and south-central Massachusetts. The regression equations for estimating selected streamflow statistics and the at-site statistics estimated for each of the 197 sites may be used by Federal, State, and local water managers in addressing water issues in and near Rhode Island. Multiple and simple linear regression equations were developed to estimate the 99-, 98-, 95-, 90-, 85-, 80-, 75-, 70-, 60-, 50-, 40-, 30-, 25-, 20-, 15-, 10-, 5-, 2-, and 1-percent flow durations and the 7Q2 (7-day, 2-year) and 7Q10 (7-day, 10-year) low-flow-frequency statistics. An additional 49 selected statistics, for which regression equations were not developed, also were estimated for the long- and short-term streamgages and partial-record stations for flow durations between the 99.99 and 0.01 percent and for the mean annual, mean monthly, and median monthly streamflows. A total of 70 selected streamflow statistics were estimated for 41 long-term streamgages, 21 short-term streamgages, and 135 partial-record stations in and near Rhode Island. Estimates of the long-term streamflow statistics for the 21 short-term streamgages and 135 partial-record stations were developed by the Maintenance of Variance Extension, type 1 (MOVE.1), record-extension technique. The equations used to estimate selected streamflow statistics were developed by relating the 19 flow-duration and 2 low-flow-frequency statistics to 31 different basin characteristics (physical, land-cover, and climatic) at the 41 long-term and 19 of 21 short-term streamgages (a total of 60 streamgages) in and near Rhode Island
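
    The MOVE.1 record-extension step can be sketched in a few lines: using the period of concurrent record at a long-term index gage (x) and a short-record site (y), extended estimates at y preserve the mean and variance of y rather than regressing toward the mean. The flows below are simulated placeholders, not Rhode Island data.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    # Concurrent log-flows at the index gage (x) and the short-record site (y).
    x_concurrent = rng.normal(2.0, 0.5, 30)
    y_concurrent = 0.8 * x_concurrent + rng.normal(0.1, 0.1, 30)

    xbar, ybar = x_concurrent.mean(), y_concurrent.mean()
    sx, sy = x_concurrent.std(ddof=1), y_concurrent.std(ddof=1)
    sign = np.sign(np.corrcoef(x_concurrent, y_concurrent)[0, 1])

    # Extend the short record using the full (longer) record at the index gage.
    x_long = rng.normal(2.0, 0.5, 300)
    y_extended = ybar + sign * (sy / sx) * (x_long - xbar)
    print("estimated mean and sd at the short-record site:",
          y_extended.mean().round(3), y_extended.std(ddof=1).round(3))
    ```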

  16. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered when dealing with analysis of space-flight data is examined. Because of the small amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on considerations needed to choose the most appropriate test for a given type of analysis.

  17. Thermodynamics and Statistical Mechanics of Macromolecular Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, Michael

    2014-04-01

    Preface and outline; 1. Introduction; 2. Statistical mechanics: a modern review; 3. The complexity of minimalistic lattice models for protein folding; 4. Monte Carlo and chain growth methods for molecular simulations; 5. First insights to freezing and collapse of flexible polymers; 6. Crystallization of elastic polymers; 7. Structural phases of semiflexible polymers; 8. Generic tertiary folding properties of proteins in mesoscopic scales; 9. Protein folding channels and kinetics of two-state folding; 10. Inducing generic secondary structures by constraints; 11. Statistical analyses of aggregation processes; 12. Hierarchical nature of phase transitions; 13. Adsorption of polymers at solid substrates; 14. Hybrid protein-substrate interfaces; 15. Concluding remarks and outlook; Notes; References; Index.

  18. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastic, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using the metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist such as Coordinate Measurement Machines (CMM), Laser Scanners, Structured Light Scanning Systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a process for real-time dimensional inspection technique and digital quality record of the additive manufacturing process using Infrared camera imaging and processing techniques.

  19. Phenylethynyl Containing Reactive Additives

    NASA Technical Reports Server (NTRS)

    Connell, John W. (Inventor); Smith, Joseph G., Jr. (Inventor); Hergenrother, Paul M. (Inventor)

    2002-01-01

    Phenylethynyl containing reactive additives were prepared from aromatic diamine, containing phenylethynyl groups and various ratios of phthalic anhydride and 4-phenylethynylphthalic anhydride in glacial acetic acid to form the imide in one step or in N-methyl-2-pyrrolidinone to form the amide acid intermediate. The reactive additives were mixed in various amounts (10% to 90%) with oligomers containing either terminal or pendent phenylethynyl groups (or both) to reduce the melt viscosity and thereby enhance processability. Upon thermal cure, the additives react and become chemically incorporated into the matrix and effect an increase in crosslink density relative to that of the host resin. This resultant increase in crosslink density has advantageous consequences on the cured resin properties such as higher glass transition temperature and higher modulus as compared to that of the host resin.

  20. [What meta-analyses teach us: pros and cons].

    PubMed

    Biondi-Zoccai, Giuseppe; D'Ascenzo, Fabrizio; Frati, Giacomo; Abbate, Antonio

    2015-09-01

    The exponential increase in publications focusing on important clinical issues represents a major challenge for patients, physicians, and decision-makers, despite the braggadocio of many experts. Meta-analysis, when conducted within the context of a systematic review, offers an efficient and potent tool to summarize the clinical evidence accrued on a specific clinical question. Despite their many strengths, which include statistical precision, external validity, and the opportunity to analyze subgroups and moderators, meta-analyses also have many limitations. In addition, they are criticized as being potentially an exercise in "mega-silliness", as mixing "apples and oranges", as unable to improve the quality of primary studies (in keeping with the saying "garbage in, garbage out"), and as focusing on an "average patient" who is only hypothetical. Yet, it is evident that meta-analyses will continue to play a key role in informing decision making whenever the best approach is not self-evident. Thus, it is mandatory to know their main features in order to use them critically and constructively, without being dominated or scared. PMID:26418385

  1. Statistical Background Needed to Read Professional Pharmacy Journals.

    ERIC Educational Resources Information Center

    Moore, Randy; And Others

    1978-01-01

    An examination of professional pharmacy literature was undertaken to determine types of statistical terminology and analyses presented and compare these with the results of a survey to determine the statistical backgrounds of graduates of schools that grant the Doctor of Pharmacy and/or Master of Science in Hospital Pharmacy. (JMD)

  2. A Taste of Asia with Statistics and Technology

    ERIC Educational Resources Information Center

    Reid, Josh; Carmichael, Colin

    2015-01-01

    Josh Reid and Colin Carmichael describe how some Year 6 children have developed their understanding of mathematics by studying Asian countries. The statistical analyses undertaken by these children appear to have strengthened their understanding of statistical concepts and at the same time provided them with tools for understanding complex…

  3. Modern Statistical Graphs that Provide Insight in Research Results

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Modern statistical graphics offer insight in assessing the results of many common statistical analyses. These ideas, however, are rarely employed in agronomic research articles. This work presents several commonly used graphs, offers one or more alternatives for each, and provides the reasons for pr...

  4. The Empirical Nature and Statistical Treatment of Missing Data

    ERIC Educational Resources Information Center

    Tannenbaum, Christyn E.

    2009-01-01

    Introduction. Missing data is a common problem in research and can produce severely misleading analyses, including biased estimates of statistical parameters, and erroneous conclusions. In its 1999 report, the APA Task Force on Statistical Inference encouraged authors to report complications such as missing data and discouraged the use of…

  5. Additives in plastics.

    PubMed Central

    Deanin, R D

    1975-01-01

    The polymers used in plastics are generally harmless. However, they are rarely used in pure form. In almost all commercial plastics, they are "compounded" with monomeric ingredients to improve their processing and end-use performance. In order of total volume used, these monomeric additives may be classified as follows: reinforcing fibers, fillers, and coupling agents; plasticizers; colorants; stabilizers (halogen stabilizers, antioxidants, ultraviolet absorbers, and biological preservatives); processing aids (lubricants, others, and flow controls); flame retardants, peroxides; and antistats. Some information is already available, and much more is needed, on potential toxicity and safe handling of these additives during processing and manufacture of plastics products. PMID:1175566

  6. Analogies for Understanding Statistics

    ERIC Educational Resources Information Center

    Hocquette, Jean-Francois

    2004-01-01

    This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…

  7. Statistical methods in microbiology.

    PubMed Central

    Ilstrup, D M

    1990-01-01

    Statistical methodology is viewed by the average laboratory scientist, or physician, sometimes with fear and trepidation, occasionally with loathing, and seldom with fondness. Statistics may never be loved by the medical community, but it does not have to be hated by them. It is true that statistical science is sometimes highly mathematical, always philosophical, and occasionally obtuse, but for the majority of medical studies it can be made palatable. The goal of this article has been to outline a finite set of methods of analysis that investigators should choose based on the nature of the variable being studied and the design of the experiment. The reader is encouraged to seek the advice of a professional statistician when there is any doubt about the appropriate method of analysis. A statistician can also help the investigator with problems that have nothing to do with statistical tests, such as quality control, choice of response variable and comparison groups, randomization, and blinding of assessment of response variables. PMID:2200604

  8. Statistical Energy Analysis Program

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.

    1985-01-01

    Statistical Energy Analysis (SEA) is a powerful tool for estimating high-frequency vibration spectra of complex structural systems and has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.

  9. Statistical Significance Testing.

    ERIC Educational Resources Information Center

    McLean, James E., Ed.; Kaufman, Alan S., Ed.

    1998-01-01

    The controversy about the use or misuse of statistical significance testing has become the major methodological issue in educational research. This special issue contains three articles that explore the controversy, three commentaries on these articles, an overall response, and three rejoinders by the first three authors. They are: (1)…

  10. Education Statistics Quarterly, 2003.

    ERIC Educational Resources Information Center

    Marenus, Barbara; Burns, Shelley; Fowler, William; Greene, Wilma; Knepper, Paula; Kolstad, Andrew; McMillen Seastrom, Marilyn; Scott, Leslie

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  11. Spitball Scatterplots in Statistics

    ERIC Educational Resources Information Center

    Wagaman, John C.

    2012-01-01

    This paper describes an active learning idea that I have used in my applied statistics class as a first lesson in correlation and regression. Students propel spitballs from various standing distances from the target and use the recorded data to determine if the spitball accuracy is associated with standing distance and review the algebra of lines…

  12. Lack of Statistical Significance

    ERIC Educational Resources Information Center

    Kehle, Thomas J.; Bray, Melissa A.; Chafouleas, Sandra M.; Kawano, Takuji

    2007-01-01

    Criticism has been leveled against the use of statistical significance testing (SST) in many disciplines. However, the field of school psychology has been largely devoid of critiques of SST. Inspection of the primary journals in school psychology indicated numerous examples of SST with nonrandom samples and/or samples of convenience. In this…

  13. Juvenile Court Statistics - 1972.

    ERIC Educational Resources Information Center

    Office of Youth Development (DHEW), Washington, DC.

    This report is a statistical study of juvenile court cases in 1972. The data demonstrates how the court is frequently utilized in dealing with juvenile delinquency by the police as well as by other community agencies and parents. Excluded from this report are the ordinary traffic cases handled by juvenile court. The data indicate that: (1) in…

  14. Library Research and Statistics.

    ERIC Educational Resources Information Center

    Lynch, Mary Jo; St. Lifer, Evan; Halstead, Kent; Fox, Bette-Lee; Miller, Marilyn L.; Shontz, Marilyn L.

    2001-01-01

    These nine articles discuss research and statistics on libraries and librarianship, including libraries in the United States, Canada, and Mexico; acquisition expenditures in public, academic, special, and government libraries; price indexes; state rankings of public library data; library buildings; expenditures in school library media centers; and…

  15. Foundations of Statistical Seismology

    NASA Astrophysics Data System (ADS)

    Vere-Jones, David

    2010-06-01

    A brief account is given of the principles of stochastic modelling in seismology, with special regard to the role and development of stochastic models for seismicity. Stochastic models are seen as arising in a hierarchy of roles in seismology, as in other scientific disciplines. At their simplest, they provide a convenient descriptive tool for summarizing data patterns; in engineering and other applications, they provide a practical way of bridging the gap between the detailed modelling of a complex system, and the need to fit models to limited data; at the most fundamental level they arise as a basic component in the modelling of earthquake phenomena, analogous to that of stochastic models in statistical mechanics or turbulence theory. As an emerging subdiscipline, statistical seismology includes elements of all of these. The scope for the development of stochastic models depends crucially on the quantity and quality of the available data. The availability of extensive, high-quality catalogues and other relevant data lies behind the recent explosion of interest in statistical seismology. At just such a stage, it seems important to review the underlying principles on which statistical modelling is based, and that is the main purpose of the present paper.

  16. Graduate Statistics: Student Attitudes

    ERIC Educational Resources Information Center

    Kennedy, Robert L.; Broadston, Pamela M.

    2004-01-01

    This study investigated the attitudes toward statistics of graduate students who used a computer program as part of the instruction, which allowed for an individualized, self-paced, student-centered, activity-based course. The twelve sections involved in this study were offered in the spring and fall 2001, spring and fall 2002, spring and fall…

  17. Geopositional Statistical Methods

    NASA Technical Reports Server (NTRS)

    Ross, Kenton

    2006-01-01

    RMSE based methods distort circular error estimates (up to 50% overestimation). The empirical approach is the only statistically unbiased estimator offered. Ager modification to Shultz approach is nearly unbiased, but cumbersome. All methods hover around 20% uncertainty (@ 95% confidence) for low geopositional bias error estimates. This requires careful consideration in assessment of higher accuracy products.

  18. Statistical Reasoning over Lunch

    ERIC Educational Resources Information Center

    Selmer, Sarah J.; Bolyard, Johnna J.; Rye, James A.

    2011-01-01

    Students in the 21st century are exposed daily to a staggering amount of numerically infused media. In this era of abundant numeric data, students must be able to engage in sound statistical reasoning when making life decisions after exposure to varied information. The context of nutrition can be used to engage upper elementary and middle school…

  19. Fractional statistics and confinement

    NASA Astrophysics Data System (ADS)

    Gaete, P.; Wotzasek, C.

    2005-02-01

    It is shown that a pointlike composite having charge and magnetic moment displays a confining potential for the static interaction while simultaneously obeying fractional statistics in a pure gauge theory in three dimensions, without a Chern-Simons term. This result is distinct from the Maxwell-Chern-Simons theory that shows a screening nature for the potential.

  20. Learning Statistical Concepts

    ERIC Educational Resources Information Center

    Akram, Muhammad; Siddiqui, Asim Jamal; Yasmeen, Farah

    2004-01-01

    In order to learn the concept of statistical techniques one needs to run real experiments that generate reliable data. In practice, the data from some well-defined process or system is very costly and time consuming. It is difficult to run real experiments during the teaching period in the university. To overcome these difficulties, statisticians…

  1. A statistical package for computing time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Brownlow, J.

    1978-01-01

    The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
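
    A compact modern analogue of the processing chain described above (trend removal, digital filtering, then time- and frequency-domain characterization), written in Python with a synthetic signal; it is not the FORTRAN SPA program itself, and the filter settings are arbitrary.

    ```python
    import numpy as np
    from scipy import signal

    fs = 200.0                                   # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    data = 0.5 * t + np.sin(2 * np.pi * 12 * t) + 0.3 * np.random.default_rng(8).normal(size=t.size)

    detrended = signal.detrend(data)             # linear trend removal
    b, a = signal.butter(4, 30, btype="low", fs=fs)
    filtered = signal.filtfilt(b, a, detrended)  # zero-phase low-pass filter

    # Time-domain statistical characterization.
    print("mean, std, rms:", filtered.mean(), filtered.std(), np.sqrt(np.mean(filtered**2)))

    # Frequency-domain characterization: power spectral density via Welch's method.
    freqs, psd = signal.welch(filtered, fs=fs, nperseg=512)
    print("peak frequency (Hz):", freqs[np.argmax(psd)])
    ```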

  2. What can we do about exploratory analyses in clinical trials?

    PubMed

    Moyé, Lem

    2015-11-01

    The research community has alternately embraced and then repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generating. Since the majority of evaluations conducted in clinical trials with their rich data sets are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is not based in statistical theory, but pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we introduce an examination of the characteristics of exploratory analyses from a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from estimation theory weaknesses. The problem appears to be a difficulty in what is actually reported as the p-value. The use of Bayes Theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that would lead to the readmission of exploratory analyses to a position of persuasive power in clinical trials. PMID:26390962
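
    One well-known Bayesian calibration of p-values, the -e·p·ln(p) bound of Sellke, Bayarri and Berger, illustrates the general point the abstract makes about Bayes Theorem tempering reported evidence; it is shown here only as an example and is not necessarily the specific correction the author proposes.

    ```python
    import numpy as np

    def min_posterior_prob_null(p, prior_null=0.5):
        """Lower bound on P(H0 | data) implied by a p-value (valid for p < 1/e)."""
        bayes_factor = -np.e * p * np.log(p)      # bound on the Bayes factor in favour of H0
        prior_odds = prior_null / (1 - prior_null)
        posterior_odds = prior_odds * bayes_factor
        return posterior_odds / (1 + posterior_odds)

    for p in (0.05, 0.01, 0.001):
        print(f"p = {p}: P(H0 | data) >= {min_posterior_prob_null(p):.3f}")
    ```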

  3. Biobased lubricant additives

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Fully biobased lubricants are those formulated using all biobased ingredients, i.e. biobased base oils and biobased additives. Such formulations provide the maximum environmental, safety, and economic benefits expected from a biobased product. Currently, there are a number of biobased base oils that...

  4. Multifunctional fuel additives

    SciTech Connect

    Baillargeon, D.J.; Cardis, A.B.; Heck, D.B.

    1991-03-26

    This paper discusses a composition comprising a major amount of a liquid hydrocarbyl fuel and a minor low-temperature flow properties improving amount of an additive product of the reaction of a suitable diol and product of a benzophenone tetracarboxylic dianhydride and a long-chain hydrocarbyl aminoalcohol.

  5. ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)

    EPA Science Inventory

    The availability of geographically indexed health and population data, with advances in computing, geographical information systems and statistical methodology, have opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...

  6. Teaching Statistics in Biology: Using Inquiry-Based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    ERIC Educational Resources Information Center

    Metz, Anneke M.

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly…

  7. TRMM-Based Merged Precipitation Analyses

    NASA Technical Reports Server (NTRS)

    Adler, Robert; Huffman, George; Bolvin, David; Nelkin, Eric; Curtis, Scott

    1999-01-01

    This paper describes results of using Tropical Rainfall Measuring Mission (TRMM) information as the key calibration tool in a merged analysis on a 1X1 latitude/longitude monthly scale based on multiple satellite sources and raingauge analyses. The TRMM-based product is compared with surface-based validation data sets and the community-based 20-year Global Precipitation Climatology Project (GPCP)monthly analyses. The TRMM-based merged analysis uses the TRMM information to calibrate the estimates from SSM/I and geosynchronous IR observations and merges those estimates together with the TRMM and gauge information to produce accurate rainfall estimates with the increased sampling provided by the combined satellite information. This TRMM merged analysis uses the combined instrument (Precipitation Radar [PR] and TRMM Microwave Imager [TMI]) retrieval of Haddad as the TRMM estimate with which to calibrate the other satellite estimates. This TRMM Combined instrument (TCI) estimate is shown to produce very similar absolute values to the other main TRMM products. The TRMM and other satellites merged analysis compares favorably to the atoll data set of Morrissey for the months of 1998 with a very small positive bias of 2%. However, comparison with the preliminary results from the TRMM ground validation radar information at Kwajalein atoll in the western Pacific Ocean shows a 26% positive bias. Therefore, absolute magnitudes from TRMM and/or the ground validation need to be treated with care at this point. A month by month comparison of the TRMM merged analysis and the GPCP analysis indicates very similar patterns, but with subtle differences in magnitude. Focusing on the Pacific Ocean ITCZ one can see the TRMM-based estimates having higher peak values and lower values in the ITCZ periphery. These attributes also show up in the statistics, where GPCP>TRMM at low values (below 10 mm/d) and TRMM>GPCP at high values (greater than 15 mm/d). Integrated over the 37N-37S belt for all

  8. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
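
    The paper's central claim is easy to check numerically: the product of seven independent, arbitrarily distributed positive factors has a logarithm that is close to Gaussian, so the product itself is close to lognormal. The factor distributions below are arbitrary placeholders, not the paper's inputs.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    n_draws = 100_000
    factors = [
        rng.uniform(1.0, 10.0, n_draws),        # e.g. a star-formation-rate-like factor
        rng.uniform(0.2, 1.0, n_draws),
        rng.uniform(0.5, 3.0, n_draws),
        rng.uniform(0.01, 0.5, n_draws),
        rng.uniform(0.01, 0.5, n_draws),
        rng.uniform(0.01, 0.5, n_draws),
        rng.uniform(100.0, 10_000.0, n_draws),  # e.g. a lifetime-like factor
    ]
    N = np.prod(factors, axis=0)

    # If N is approximately lognormal, quantiles of log N should track a Gaussian fit.
    log_N = np.log(N)
    mu, sigma = log_N.mean(), log_N.std()
    for q in (0.05, 0.25, 0.5, 0.75, 0.95):
        print(q, round(np.quantile(log_N, q), 3), round(stats.norm.ppf(q, mu, sigma), 3))
    ```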

  9. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    In this paper is provided the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log- normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy may be shown to be inversely proportional to the cubic root of NHab. This distance is denoted by new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in

  10. Indian Ocean analyses

    NASA Technical Reports Server (NTRS)

    Meyers, Gary

    1992-01-01

    The background and goals of Indian Ocean thermal sampling are discussed from the perspective of a national project which has research goals relevant to variation of climate in Australia. The critical areas of SST variation are identified. The first goal of thermal sampling at this stage is to develop a climatology of thermal structure in the areas and a description of the annual variation of major currents. The sampling strategy is reviewed. Dense XBT sampling is required to achieve accurate, monthly maps of isotherm-depth because of the high level of noise in the measurements caused by aliasing of small scale variation. In the Indian Ocean ship routes dictate where adequate sampling can be achieved. An efficient sampling rate on available routes is determined based on objective analysis. The statistical structure required for objective analysis is described and compared at 95 locations in the tropical Pacific and 107 in the tropical Indian Oceans. XBT data management and quality control methods at CSIRO are reviewed. Results on the mean and annual variation of temperature and baroclinic structure in the South Equatorial Current and Pacific/Indian Ocean Throughflow are presented for the region between northwest Australia and Java-Timor. The mean relative geostrophic transport (0/400 db) of Throughflow is approximately 5 x 106 m3/sec. A nearly equal volume transport is associated with the reference velocity at 400 db. The Throughflow feeds the South Equatorial Current, which has maximum westward flow in August/September, at the end of the southeasterly Monsoon season. A strong semiannual oscillation in the South Java Current is documented. The results are in good agreement with the Semtner and Chervin (1988) ocean general circulation model. The talk concludes with comments on data inadequacies (insufficient coverage, timeliness) particular to the Indian Ocean and suggestions on the future role that can be played by Data Centers, particularly with regard to quality

  11. Molecular ecological network analyses

    PubMed Central

    2012-01-01

    Background: Understanding the interaction among different species within a community and their responses to environmental changes is a central goal in ecology. However, defining the network structure in a microbial community is very challenging due to their extremely high diversity and as-yet uncultivated status. Although recent advances in metagenomic technologies, such as high-throughput sequencing and functional gene arrays, provide revolutionary tools for analyzing microbial community structure, it is still difficult to examine network interactions in a microbial community based on high-throughput metagenomics data. Results: Here, we describe a novel mathematical and bioinformatics framework to construct ecological association networks named molecular ecological networks (MENs) through Random Matrix Theory (RMT)-based methods. Compared to other network construction methods, this approach is remarkable in that the network is automatically defined and robust to noise, thus providing excellent solutions to several common issues associated with high-throughput metagenomics data. We applied it to determine the network structure of microbial communities subjected to long-term experimental warming based on pyrosequencing data of 16S rRNA genes. We showed that the constructed MENs under both warming and unwarming conditions exhibited topological features of scale free, small world and modularity, which were consistent with previously described molecular ecological networks. Eigengene analysis indicated that the eigengenes represented the module profiles relatively well. In consistency with many other studies, several major environmental traits including temperature and soil pH were found to be important in determining network interactions in the microbial communities examined. To facilitate its application by the scientific community, all these methods and statistical tools have been integrated into a comprehensive Molecular Ecological Network Analysis Pipeline (MENAP
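
    As a much-simplified sketch of the association-network idea, the snippet below builds a correlation network from synthetic abundance-like data and extracts modules with networkx. A fixed correlation cutoff stands in for the RMT-based threshold selection that the MEN approach actually uses, and the data are invented for illustration.

    ```python
    import numpy as np
    import networkx as nx
    from scipy.stats import spearmanr
    from networkx.algorithms.community import greedy_modularity_communities

    rng = np.random.default_rng(10)
    n_samples, n_otus = 20, 15
    # Synthetic abundance-like data driven by three latent environmental factors.
    latent = rng.normal(size=(n_samples, 3))
    loadings = rng.normal(size=(3, n_otus))
    abundance = np.exp(0.5 * (latent @ loadings) + 0.3 * rng.normal(size=(n_samples, n_otus)))

    corr, _ = spearmanr(abundance)   # OTU-by-OTU rank correlation matrix
    threshold = 0.6                  # placeholder for an RMT-derived cutoff

    G = nx.Graph()
    G.add_nodes_from(range(n_otus))
    for i in range(n_otus):
        for j in range(i + 1, n_otus):
            if abs(corr[i, j]) >= threshold:
                G.add_edge(i, j, weight=corr[i, j])

    if G.number_of_edges():
        modules = list(greedy_modularity_communities(G))
        print(G.number_of_edges(), "edges,", len(modules), "modules")
    else:
        print("no edges above the threshold")
    ```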

  12. Boron addition to alloys

    SciTech Connect

    Coad, B. C.

    1985-08-20

    A process for addition of boron to an alloy which involves forming a melt of the alloy, adding boric oxide and a reactive metal, selected from the group consisting of aluminum, titanium, zirconium and mixtures thereof, to the melt, maintaining the resulting reactive mixture in the molten state and reacting the boric oxide with the reactive metal to convert at least a portion of the boric oxide to boron, which dissolves in the resulting melt, and to convert at least a portion of the reactive metal to the reactive metal oxide, which oxide remains with the resulting melt, and pouring the resulting melt into a gas stream to form a first atomized powder, which is subsequently remelted with further addition of boric oxide, re-atomized, and thus reprocessed to convert essentially all the reactive metal to metal oxide to produce a powdered alloy containing specified amounts of boron.

  13. Tackifier for addition polyimides

    NASA Technical Reports Server (NTRS)

    Butler, J. M.; St. Clair, T. L.

    1980-01-01

    A modification to the addition polyimide, LaRC-160, was prepared to improve tack and drape and increase prepreg out-time. The essentially solventless, high viscosity laminating resin is synthesized from low cost liquid monomers. The modified version takes advantage of a reactive, liquid plasticizer which is used in place of solvent and helps solve a major problem of maintaining good prepreg tack and drape, or the ability of the prepreg to adhere to adjacent plies and conform to a desired shape during the lay-up process. This alternate solventless approach allows both longer life of the polymer prepreg and the processing of low void laminates. This approach appears to be applicable to all addition polyimide systems.

  14. Vinyl capped addition polyimides

    NASA Technical Reports Server (NTRS)

    Vannucci, Raymond D. (Inventor); Malarik, Diane C. (Inventor); Delvigs, Peter (Inventor)

    1991-01-01

    Polyimide resins (PMR) are generally useful where high strength and temperature capabilities are required (at temperatures up to about 700 F). Polyimide resins are particularly useful in applications such as jet engine compressor components, for example, blades, vanes, air seals, air splitters, and engine casing parts. Aromatic vinyl capped addition polyimides are obtained by reacting a diamine, an ester of tetracarboxylic acid, and an aromatic vinyl compound. Low void materials with improved oxidative stability when exposed to 700 F air may be fabricated as fiber reinforced high molecular weight capped polyimide composites. The aromatic vinyl capped polyimides are provided with a more aromatic nature and are more thermally stable than highly aliphatic, norbornenyl-type end-capped polyimides employed in PMR resins. The substitution of aromatic vinyl end-caps for norbornenyl end-caps in addition polyimides results in polymers with improved oxidative stability.

  15. [Biologically active food additives].

    PubMed

    Velichko, M A; Shevchenko, V P

    1998-07-01

    More than half of the 40 projects for the development of medical science to the year 2000 are connected with biologically active food additives, which have been called "the food of the XXI century" and serve as non-pharmacological means against many diseases. Most of these additives--nutraceuticals and parapharmaceuticals--are intended to enrich the food rations of sick or healthy people. The ecologically safest and most effective are combined domestic adaptogens with immuno-modulating and antioxidant action that give an anabolic and stimulating effect: "leveton", "phytoton" and "adapton". The MKTs-229 tablets are a residue-discharging agent. For atherosclerosis and general adiposity, "tsar tablets" and "aiconol (ikhtien)", based on cod-liver oil, or "splat", made from seaweed (algae), are recommended. All these preparations have been clinically tested and have received hygiene certificates from the Institute of Dietology of the Russian Academy of Medical Science. PMID:9752776

  16. Electrophilic addition of astatine

    SciTech Connect

    Norseev, Yu.V.; Vasaros, L.; Nhan, D.D.; Huan, N.K.

    1988-03-01

    It has been shown for the first time that astatine is capable of undergoing addition reactions to unsaturated hydrocarbons. A new compound of astatine, viz., ethylene astatohydrin, has been obtained, and its retention numbers on squalane, Apiezon, and tricresyl phosphate have been found. The influence of various factors on the formation of ethylene astatohydrin has been studied. It has been concluded on the basis of the results obtained that the univalent cation of astatine in an acidic medium is protonated hypoastatous acid.

  17. Hydrocarbon fuel additive

    SciTech Connect

    Ambrogio, S.

    1989-02-28

    This patent describes the method of fuel storage or combustion, wherein the fuel supply contains small amounts of water, the step of adding to the fuel supply an additive comprising a blend of a hydrophilic agent chosen from the group of ethylene glycol, n-butyl alcohol, and cellosolve in the range of 22-37% by weight; ethoxylated nonylphenol in the range of 26-35% by weight; nonylphenol polyethylene glycol ether in the range of 32-43% by weight.

  18. Statistical Inference at Work: Statistical Process Control as an Example

    ERIC Educational Resources Information Center

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
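
    As a concrete point of comparison, the core of SPC is the control chart rather than a hypothesis test. Below is a minimal Python sketch of an individuals chart with 3-sigma limits estimated from an in-control baseline; the data and limit convention are illustrative and are not taken from the article.

```python
import numpy as np

def shewhart_limits(baseline):
    """Centre line and 3-sigma control limits from an in-control baseline sample."""
    centre = baseline.mean()
    sigma = baseline.std(ddof=1)
    return centre - 3 * sigma, centre, centre + 3 * sigma

def out_of_control(measurements, lcl, ucl):
    """Indices of points falling outside the control limits (possible special cause)."""
    return [i for i, x in enumerate(measurements) if x < lcl or x > ucl]

rng = np.random.default_rng(1)
baseline = rng.normal(10.0, 0.2, size=50)    # historical in-control measurements
lcl, centre, ucl = shewhart_limits(baseline)

new_batch = rng.normal(10.0, 0.2, size=20)
new_batch[7] += 1.5                          # injected shift to illustrate a signal
print(out_of_control(new_batch, lcl, ucl))   # expected to flag index 7
```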

  19. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position, t, along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online. PMID:24729671
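
    In symbols, the model described above can be written as follows (a sketch of the stated form, with g the link function and the unknown surface F expanded in tensor-product B-splines; the basis sizes K_x and K_t are notational placeholders):

```latex
g\bigl(\mathrm{E}[Y_i \mid X_i]\bigr) \;=\; \theta_0 + \int_{\mathcal{T}} F\{X_i(t),\,t\}\,dt,
\qquad
F(x,t) \;\approx\; \sum_{j=1}^{K_x}\sum_{k=1}^{K_t} \theta_{jk}\, B^{x}_{j}(x)\, B^{t}_{k}(t),
```

    where the coefficients θ_jk are estimated under roughness penalties, as stated in the abstract.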

  20. 3-D Cavern Enlargement Analyses

    SciTech Connect

    EHGARTNER, BRIAN L.; SOBOLIK, STEVEN R.

    2002-03-01

    Three-dimensional finite element analyses simulate the mechanical response of enlarging existing caverns at the Strategic Petroleum Reserve (SPR). The caverns are located in Gulf Coast salt domes and are enlarged by leaching during oil drawdowns as fresh water is injected to displace the crude oil from the caverns. The current criteria adopted by the SPR limit cavern usage to 5 drawdowns (leaches). As a base case, 5 leaches were modeled over a 25 year period to roughly double the volume of a 19-cavern field. Thirteen additional leaches were then simulated until caverns approached coalescence. The cavern field approximated the geometries and geologic properties found at the West Hackberry site. This enabled comparison of data collected over nearly 20 years with analysis predictions. The analyses closely predicted the measured surface subsidence and cavern closure rates as inferred from historic well head pressures. This provided the necessary assurance that the model displacements, strains, and stresses are accurate. However, the cavern field has not yet experienced the large scale drawdowns being simulated. Should they occur in the future, code predictions should be validated with actual field behavior at that time. The simulations were performed using JAS3D, a three-dimensional finite element analysis code for nonlinear quasi-static solids. The results examine the impacts of leaching and cavern workovers, where internal cavern pressures are reduced, on surface subsidence, well integrity, and cavern stability. The results suggest that the current limit of 5 oil drawdowns may be extended, with some mitigative action required on the wells and, later, on surface structures due to subsidence strains. The predicted stress state in the salt shows damage starting to occur after 15 drawdowns, with significant failure occurring at the 16th drawdown, well beyond the current limit of 5 drawdowns.

  1. Fermions from classical statistics

    SciTech Connect

    Wetterich, C.

    2010-12-15

    We describe fermions in terms of a classical statistical ensemble. The states τ of this ensemble are characterized by a sequence of values one or zero or a corresponding set of two-level observables. Every classical probability distribution can be associated to a quantum state for fermions. If the time evolution of the classical probabilities p_τ amounts to a rotation of the wave function q_τ(t) = ±√p_τ(t), we infer the unitary time evolution of a quantum system of fermions according to a Schroedinger equation. We establish how such classical statistical ensembles can be mapped to Grassmann functional integrals. Quantum field theories for fermions arise for a suitable time evolution of classical probabilities for generalized Ising models.

  2. Statistics in disease ecology

    PubMed Central

    Waller, Lance A.

    2008-01-01

    The three papers included in this special issue represent a set of presentations in an invited session on disease ecology at the 2005 Spring Meeting of the Eastern North American Region of the International Biometric Society. The papers each address statistical estimation and inference for particular components of different disease processes and, taken together, illustrate the breadth of statistical issues arising in the study of the ecology and public health impact of disease. As an introduction, we provide a very brief overview of the area of “disease ecology”, a variety of synonyms addressing different aspects of disease ecology, and present a schematic structure illustrating general components of the underlying disease process, data collection issues, and different disciplinary perspectives ranging from microbiology to public health surveillance. PMID:19081740

  3. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
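
    A much-reduced version of the comparison with a random predictor can be phrased as a binomial tail probability: a random predictor that keeps its alarm raised for a fixed fraction of the time hits each event independently with that probability. The Python sketch below implements only this simplified check, not the full sensitivity/specificity framework of the paper; the numbers are illustrative.

```python
from scipy.stats import binom

def better_than_chance(hits, n_events, alarm_fraction):
    """P-value for beating a random predictor with the same alarm rate.

    A random predictor that keeps its alarm raised for a fraction
    `alarm_fraction` of the time hits each of the `n_events` events
    independently with that probability, so the p-value is the binomial
    tail probability of observing `hits` or more hits.
    """
    return binom.sf(hits - 1, n_events, alarm_fraction)

# e.g. 14 of 20 events preceded by an alarm while alarms were active 30% of the time
print(better_than_chance(hits=14, n_events=20, alarm_fraction=0.30))
```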

  4. Statistics in clinical research: Important considerations

    PubMed Central

    Barkan, Howard

    2015-01-01

    Statistical analysis is one of the foundations of evidence-based clinical practice, a key in conducting new clinical research and in evaluating and applying prior research. In this paper, we review the choice of statistical procedures, analyses of the associations among variables and techniques used when the clinical processes being examined are still in process. We discuss methods for building predictive models in clinical situations, and ways to assess the stability of these models and other quantitative conclusions. Techniques for comparing independent events are distinguished from those used with events in a causal chain or otherwise linked. Attention then turns to study design, to the determination of the sample size needed to make a given comparison, and to statistically negative studies. PMID:25566715

  5. 1979 DOE statistical symposium

    SciTech Connect

    Gardiner, D.A.; Truett, T.

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.

  6. Statistical Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Verde, L.

    2010-03-01

    The advent of large data sets in cosmology has meant that in the past 10 or 20 years our knowledge and understanding of the Universe has changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data where a host of useful information is enclosed, but is encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged to a standard cosmological model (the LCDM model), we should keep in mind that this model is described by 10 or more physical parameters and if we want to study deviations from it, the number of parameters is even larger. Dealing with such a high dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets both for testing for possible disagreements (which could indicate new physics) and for improving parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing the experiment, how much one would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to be able to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics as there are several good books which are reported in the references. The reader should refer to those.

  7. Quantum U-statistics

    SciTech Connect

    Guta, Madalin; Butucea, Cristina

    2010-10-15

    The notion of a U-statistic for an n-tuple of identical quantum systems is introduced in analogy to the classical (commutative) case: given a self-adjoint 'kernel' K acting on (C^d)^⊗r with r < n, the associated U-statistic converges in moments to a linear combination of Hermite polynomials in canonical variables of a canonical commutation relation algebra defined through the quantum central limit theorem. In the special cases of nondegenerate kernels and kernels of order 2, it is shown that the convergence holds in the stronger distribution sense. Two types of applications in quantum statistics are described: testing beyond the two simple hypotheses scenario and quantum metrology with interacting Hamiltonians.

  8. Statistics in fusion experiments

    NASA Astrophysics Data System (ADS)

    McNeill, D. H.

    1997-11-01

    Since the reasons for the variability in data from plasma experiments are often unknown or uncontrollable, statistical methods must be applied. Reliable interpretation and public accountability require full data sets. Two examples of data misrepresentation at PPPL are analyzed: (1) Te > 100 eV on the S-1 spheromak (M. Yamada, Nucl. Fusion 25, 1327 (1985); reports to DoE; etc.). The reported high values (statistical artifacts of Thomson scattering measurements) were selected from a mass of data with an average of 40 eV or less. "Correlated" spectroscopic data were meaningless. (2) Extrapolation to Q >= 0.5 for DT in TFTR (D. Meade et al., IAEA Baltimore (1990), V. 1, p. 9; H. P. Furth, Statements to U.S. Congress (1989)). The DD yield used there was the highest through 1990 (>= 50% above average) and the DT to DD power ratio used was about twice any published value. Average DD yields and published yield ratios scale to Q < 0.15 for DT, in accord with the observed performance over the last 3 1/2 years. Press reports of outlier data from TFTR have obscured the fact that the DT behavior follows from trivial scaling of the DD data. Good practice in future fusion research would have confidence intervals and other descriptive statistics accompanying reported numerical values (cf. JAMA).

  9. Fast statistical alignment.

    PubMed

    Bradley, Robert K; Roberts, Adam; Smoot, Michael; Juvekar, Sudeep; Do, Jaeyoung; Dewey, Colin; Holmes, Ian; Pachter, Lior

    2009-05-01

    We describe a new program for the alignment of multiple biological sequences that is both statistically motivated and fast enough for problem sizes that arise in practice. Our Fast Statistical Alignment program is based on pair hidden Markov models which approximate an insertion/deletion process on a tree and uses a sequence annealing algorithm to combine the posterior probabilities estimated from these models into a multiple alignment. FSA uses its explicit statistical model to produce multiple alignments which are accompanied by estimates of the alignment accuracy and uncertainty for every column and character of the alignment--previously available only with alignment programs which use computationally-expensive Markov Chain Monte Carlo approaches--yet can align thousands of long sequences. Moreover, FSA utilizes an unsupervised query-specific learning procedure for parameter estimation which leads to improved accuracy on benchmark reference alignments in comparison to existing programs. The centroid alignment approach taken by FSA, in combination with its learning procedure, drastically reduces the amount of false-positive alignment on biological data in comparison to that given by other methods. The FSA program and a companion visualization tool for exploring uncertainty in alignments can be used via a web interface at http://orangutan.math.berkeley.edu/fsa/, and the source code is available at http://fsa.sourceforge.net/. PMID:19478997

  10. Statistical Treatment of Looking-Time Data

    PubMed Central

    2016-01-01

    Looking times (LTs) are frequently measured in empirical research on infant cognition. We analyzed the statistical distribution of LTs across participants to develop recommendations for their treatment in infancy research. Our analyses focused on a common within-subject experimental design, in which longer looking to novel or unexpected stimuli is predicted. We analyzed data from 2 sources: an in-house set of LTs that included data from individual participants (47 experiments, 1,584 observations), and a representative set of published articles reporting group-level LT statistics (149 experiments from 33 articles). We established that LTs are log-normally distributed across participants, and therefore, should always be log-transformed before parametric statistical analyses. We estimated the typical size of significant effects in LT studies, which allowed us to make recommendations about setting sample sizes. We show how our estimate of the distribution of effect sizes of LT studies can be used to design experiments to be analyzed by Bayesian statistics, where the experimenter is required to determine in advance the predicted effect size rather than the sample size. We demonstrate the robustness of this method in both sets of LT experiments. PMID:26845505
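
    The practical consequence of the log-normality result is that parametric tests should be run on log-transformed LTs. A minimal Python sketch for the within-subject design described above is given below; the looking times are invented for illustration.

```python
import numpy as np
from scipy.stats import ttest_rel

# looking times (s) per infant in the two within-subject conditions (toy values)
lt_expected   = np.array([6.1, 4.8, 9.5, 5.2, 7.9, 3.6, 8.4, 5.9])
lt_unexpected = np.array([8.3, 6.0, 12.1, 5.0, 10.2, 4.9, 9.8, 7.4])

# log-transform before the parametric test, per the paper's recommendation
t_stat, p_value = ttest_rel(np.log(lt_unexpected), np.log(lt_expected))
print(t_stat, p_value)
```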

  11. Statistical treatment of looking-time data.

    PubMed

    Csibra, Gergely; Hernik, Mikołaj; Mascaro, Olivier; Tatone, Denis; Lengyel, Máté

    2016-04-01

    Looking times (LTs) are frequently measured in empirical research on infant cognition. We analyzed the statistical distribution of LTs across participants to develop recommendations for their treatment in infancy research. Our analyses focused on a common within-subject experimental design, in which longer looking to novel or unexpected stimuli is predicted. We analyzed data from 2 sources: an in-house set of LTs that included data from individual participants (47 experiments, 1,584 observations), and a representative set of published articles reporting group-level LT statistics (149 experiments from 33 articles). We established that LTs are log-normally distributed across participants, and therefore, should always be log-transformed before parametric statistical analyses. We estimated the typical size of significant effects in LT studies, which allowed us to make recommendations about setting sample sizes. We show how our estimate of the distribution of effect sizes of LT studies can be used to design experiments to be analyzed by Bayesian statistics, where the experimenter is required to determine in advance the predicted effect size rather than the sample size. We demonstrate the robustness of this method in both sets of LT experiments. (PsycINFO Database Record PMID:26845505

  12. Malaria Diagnosis Using Automated Analysers: A Boon for Hematopathologists in Endemic Areas

    PubMed Central

    Narang, Vikram; Sood, Neena; Garg, Bhavna; Gupta, Vikram Kumar

    2015-01-01

    Background Haematological abnormalities are common in acute febrile tropical illnesses. Malaria is a major health problem in the tropics. In endemic areas, especially in the post-monsoon season, it is not practical to manually screen all peripheral blood films (PBF) for malarial parasite. Automated analysers offer rapid, sensitive and cost-effective screening of all samples. Aim The study was done to evaluate the usefulness of automated cell counters by analysing the histograms, scattergrams and flags generated in malaria-positive and malaria-negative cases. Other haematological parameters that could help to identify malaria parasite in the peripheral blood smear were also compared. Materials and Methods The blood samples were analysed using a Beckman Coulter LH-750. Abnormal scattergrams and additional peaks in WBC histograms were examined and compared with normal controls. Haematological abnormalities were also evaluated. Statistical Analysis Statistical analysis was done using the software Epi-Info version 7.1.4, freely available from the CDC website. The Fisher exact test was applied to calculate the p-value, and a value < 0.05 was considered significant. Final identification of malarial parasite species was done independently by peripheral blood smear examination by two pathologists. Results Of the 200 cases evaluated, abnormal scattergrams were observed in all cases of malaria, while abnormal WBC histogram peaks, demonstrating a peak at the threshold of the histogram, were noted in 96% of cases. The difference between the number of slides positive for an abnormal WBC scattergram and for abnormal WBC histogram peaks was statistically highly significant (p=0.007), so the abnormal WBC scattergram is the better indicator of malarial parasite presence. Of the haematological parameters, thrombocytopenia (92% of cases) emerged as the strongest predictor of malaria. Conclusion It is recommended for haematopathologists to review the haematological data and the scatter plots
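
    The flagging comparison reported above is the kind of contingency-table problem the Fisher exact test addresses. A minimal Python sketch using scipy is shown below; the counts are illustrative placeholders, not the study's data.

```python
from scipy.stats import fisher_exact

# rows: abnormal WBC scattergram / abnormal WBC histogram peak
# columns: malaria cases flagged, malaria cases missed (illustrative counts)
table = [[100, 0],
         [92, 8]]
odds_ratio, p_value = fisher_exact(table)
print(p_value)   # p < 0.05 taken as significant, following the study's convention
```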

  13. Statistical palaeomagnetic field modelling and symmetry considerations

    NASA Astrophysics Data System (ADS)

    Hulot, G.; Bouligand, C.

    2005-06-01

    In the present paper, we address symmetry issues in the context of the so-called giant Gaussian process (GGP) modelling approach, currently used to statistically analyse the present and past magnetic field of the Earth at times of stable polarity. We first recall the principle of GGP modelling, and for the first time derive the complete and exact constraints a GGP model should satisfy if it is to have statistically spherical, axisymmetric or equatorially symmetric properties. We note that, as often correctly claimed by the authors, many simplifying assumptions used so far to ease GGP modelling amount to making symmetry assumptions, but not always exactly so, because previous studies did not recognize that symmetry assumptions do not systematically require a lack of cross-correlations between Gauss coefficients. We further note that GGP models obtained so far for the field over the past 5 Myr clearly reveal some spherical symmetry breaking properties in both the mean and the fluctuating field (as defined by the covariance matrix of the model) and some equatorial symmetry breaking properties in the mean field. Non-zonal terms found in the mean field of some models and mismatches between variances defining the fluctuating field (in models, however, not defined in a consistent way) would further suggest that axial symmetry also is broken. The meaning of this is discussed. Spherical symmetry breaking trivially testifies to the influence of the rotation of the Earth on the geodynamo (a long-recognized fact). Axial symmetry breaking, if confirmed, could hardly be attributed to anything else but some influence of the core-mantle boundary (CMB) conditions on the geodynamo (also a well-known fact). By contrast, equatorial symmetry breaking (in particular the persistence of an axial mean quadrupole) may not trivially be considered as evidence of some influence of CMB conditions. To establish this, one would need to better investigate whether or not this axial quadrupole has

  14. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9%, improvement p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  15. Local analyses of Planck maps with Minkowski functionals

    NASA Astrophysics Data System (ADS)

    Novaes, C. P.; Bernui, A.; Marques, G. A.; Ferreira, I. S.

    2016-09-01

    Minkowski functionals (MF) are excellent tools to investigate the statistical properties of the cosmic microwave background (CMB) maps. Among their notable advantages is the possibility of using them efficiently in patches of the CMB sphere, which allows studies in masked skies, including analyses of small sky regions. Possible deviations from Gaussianity are then investigated by comparison with MF obtained from a set of Gaussian isotropic simulated CMB maps to which the same cut-sky masks are applied. These analyses are sensitive enough to detect contaminations of small intensity like primary and secondary CMB anisotropies. Our methodology uses the MF, widely employed to study non-Gaussianities in CMB data, and asserts Gaussian deviations only when all of them point to an exceptional χ2 value, at more than 2.2σ confidence level, in a given sky patch. Following this rigorous procedure, we find 13 regions in the foreground-cleaned Planck maps that evince such high levels of non-Gaussian deviations. According to our results, these non-Gaussian contributions show signatures that can be associated with the presence of hot or cold spots in such regions. Moreover, some of these non-Gaussian deviation signals suggest the presence of foreground residuals in those regions located near the Galactic plane. Additionally, we confirm that most of the regions revealed in our analyses, but not all, have been recently reported in studies done by the Planck collaboration. Furthermore, we also investigate whether these non-Gaussian deviations can possibly be sourced by systematics, like inhomogeneous noise and beam effects in the released Planck data, or perhaps by residual Galactic foregrounds.
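
    The comparison step can be sketched as a χ2 statistic for the observed MF vector against the mean and covariance of a Gaussian-simulation ensemble, with the significance threshold read off the simulations themselves. The Python sketch below is a simplified stand-in for the procedure described (the paper combines several functionals and a 2.2σ criterion); the function names and the quantile value are assumptions.

```python
import numpy as np

def chi2_against_ensemble(v_obs, v_sims):
    """Chi-square of an observed Minkowski-functional vector against an ensemble.

    v_obs  : (k,) functionals measured on the data patch
    v_sims : (n_sim, k) same functionals measured on Gaussian isotropic simulations
    """
    diff = v_obs - v_sims.mean(axis=0)
    cov = np.cov(v_sims, rowvar=False)
    return float(diff @ np.linalg.solve(cov, diff))

def is_anomalous(v_obs, v_sims, quantile=0.986):
    """Flag a patch whose chi-square exceeds the chosen quantile of the
    leave-one-out simulation chi-square distribution (0.986 is roughly a
    one-sided 2.2-sigma level; the paper's exact criterion may differ)."""
    chi2_obs = chi2_against_ensemble(v_obs, v_sims)
    chi2_sims = np.array([chi2_against_ensemble(v, np.delete(v_sims, i, axis=0))
                          for i, v in enumerate(v_sims)])
    return chi2_obs > np.quantile(chi2_sims, quantile)
```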

  16. SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS

    NASA Technical Reports Server (NTRS)

    Brownlow, J. D.

    1994-01-01

    The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval
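
    The two kinds of estimate listed above are easy to reproduce in outline. The Python sketch below computes the basic time-domain statistics and a Hamming-windowed power-spectrum estimate for a toy series; it is an illustration of the same quantities, not the SPA program itself.

```python
import numpy as np
from scipy.signal import welch

def time_domain_stats(x):
    """Basic time-domain estimates of the kind listed above."""
    return {
        "mean": x.mean(),
        "variance": x.var(ddof=1),
        "std": x.std(ddof=1),
        "mean_square": np.mean(x**2),
        "rms": np.sqrt(np.mean(x**2)),
        "min": x.min(),
        "max": x.max(),
        "n": x.size,
    }

# toy series: 5 Hz sinusoid plus noise, sampled at 100 Hz
fs = 100.0
t = np.arange(0, 20, 1 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

stats = time_domain_stats(x)
# frequency-domain estimate: Welch power spectrum smoothed with a Hamming window
f, pxx = welch(x, fs=fs, window="hamming", nperseg=256)
print(stats["rms"], f[np.argmax(pxx)])   # dominant frequency should be ~5 Hz
```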

  17. R.A. Fisher's contributions to genetical statistics.

    PubMed

    Thompson, E A

    1990-12-01

    R. A. Fisher (1890-1962) was a professor of genetics, and many of his statistical innovations found expression in the development of methodology in statistical genetics. However, whereas his contributions in mathematical statistics are easily identified, in population genetics he shares his preeminence with Sewall Wright (1889-1988) and J. B. S. Haldane (1892-1965). This paper traces some of Fisher's major contributions to the foundations of statistical genetics, and his interactions with Wright and with Haldane which contributed to the development of the subject. With modern technology, both statistical methodology and genetic data are changing. Nonetheless much of Fisher's work remains relevant, and may even serve as a foundation for future research in the statistical analysis of DNA data. For Fisher's work reflects his view of the role of statistics in scientific inference, expressed in 1949: There is no wide or urgent demand for people who will define methods of proof in set theory in the name of improving mathematical statistics. There is a widespread and urgent demand for mathematicians who understand that branch of mathematics known as theoretical statistics, but who are capable also of recognising situations in the real world to which such mathematics is applicable. In recognising features of the real world to which his models and analyses should be applicable, Fisher laid a lasting foundation for statistical inference in genetic analyses. PMID:2085639

  18. A framework to create customised LHC analyses within CheckMATE

    NASA Astrophysics Data System (ADS)

    Kim, Jong Soo; Schmeier, Daniel; Tattersall, Jamie; Rolbiecki, Krzysztof

    2015-11-01

    CheckMATE  is a framework that allows the user to conveniently test simulated BSM physics events against current LHC data in order to derive exclusion limits. For this purpose, the data runs through a detector simulation and is then processed by a user chosen selection of experimental analyses. These analyses are all defined by signal regions that can be compared to the experimental data with a multitude of statistical tools. Due to the large and continuously growing number of experimental analyses available, users may quickly find themselves in the situation that the study they are particularly interested in has not (yet) been implemented officially into the CheckMATE  framework. However, the code includes a rather simple framework to allow users to add new analyses on their own. This document serves as a guide to this. In addition, CheckMATE  serves as a powerful tool for testing and implementing new search strategies. To aid this process, many tools are included to allow a rapid prototyping of new analyses.

  19. Reexamination of Statistical Methods for Comparative Anatomy: Examples of Its Application and Comparisons with Other Parametric and Nonparametric Statistics

    PubMed Central

    Aversi-Ferreira, Roqueline A. G. M. F.; Nishijo, Hisao; Aversi-Ferreira, Tales Alexandre

    2015-01-01

    Various statistical methods have been published for comparative anatomy. However, few studies compared parametric and nonparametric statistical methods. Moreover, some previous studies using statistical method for comparative anatomy (SMCA) proposed the formula for comparison of groups of anatomical structures (multiple structures) among different species. The present paper described the usage of SMCA and compared the results by SMCA with those by parametric test (t-test) and nonparametric analyses (cladistics) of anatomical data. In conclusion, the SMCA can offer a more exact and precise way to compare single and multiple anatomical structures across different species, which requires analyses of nominal features in comparative anatomy. PMID:26413553

  20. Reexamination of Statistical Methods for Comparative Anatomy: Examples of Its Application and Comparisons with Other Parametric and Nonparametric Statistics.

    PubMed

    Aversi-Ferreira, Roqueline A G M F; Nishijo, Hisao; Aversi-Ferreira, Tales Alexandre

    2015-01-01

    Various statistical methods have been published for comparative anatomy. However, few studies compared parametric and nonparametric statistical methods. Moreover, some previous studies using statistical method for comparative anatomy (SMCA) proposed the formula for comparison of groups of anatomical structures (multiple structures) among different species. The present paper described the usage of SMCA and compared the results by SMCA with those by parametric test (t-test) and nonparametric analyses (cladistics) of anatomical data. In conclusion, the SMCA can offer a more exact and precise way to compare single and multiple anatomical structures across different species, which requires analyses of nominal features in comparative anatomy. PMID:26413553

  1. Siloxane containing addition polyimides

    NASA Technical Reports Server (NTRS)

    Maudgal, S.; St. Clair, T. L.

    1984-01-01

    Addition polyimide oligomers have been synthesized from bis(gamma-aminopropyl) tetramethyldisiloxane and 3, 3', 4, 4'-benzophenonetetracarboxylic dianhydride using a variety of latent crosslinking groups as endcappers. The prepolymers were isolated and characterized for solubility (in amide, chlorinated and ether solvents), melt flow and cure properties. The most promising systems, maleimide and acetylene terminated prepolymers, were selected for detailed study. Graphite cloth reinforced composites were prepared and properties compared with those of graphite/Kerimid 601, a commercially available bismaleimide. Mixtures of the maleimide terminated system with Kerimid 601, in varying proportions, were also studied.

  2. Oil additive process

    SciTech Connect

    Bishop, H.

    1988-10-18

    This patent describes a method of making an additive comprising: (a) adding 2 parts by volume of 3% sodium hypochlorite to 45 parts by volume of diesel oil fuel to form a sulphur free fuel, (b) removing all water and foreign matter formed by the sodium hypochlorite, (c) blending 30 parts by volume of 24% lead naphthanate with 15 parts by volume of the sulphur free fuel, 15 parts by volume of light-weight material oil to form a blended mixture, and (d) heating the blended mixture slowly and uniformly to 152F.

  3. Experimental Mathematics and Computational Statistics

    SciTech Connect

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include applications of experimental mathematics in statistics as well as statistical methods applied to computational mathematics.

  4. Truth, Damn Truth, and Statistics

    ERIC Educational Resources Information Center

    Velleman, Paul F.

    2008-01-01

    Statisticians and Statistics teachers often have to push back against the popular impression that Statistics teaches how to lie with data. Those who believe incorrectly that Statistics is solely a branch of Mathematics (and thus algorithmic), often see the use of judgment in Statistics as evidence that we do indeed manipulate our results. In the…

  5. Florida Library Directory with Statistics, 1997.

    ERIC Educational Resources Information Center

    Florida Dept. of State, Tallahassee. Div. of Library and Information Services.

    This 48th annual edition includes listings for over 1,000 libraries of all types in Florida, with contact names, phone numbers, addresses, and e-mail and web addresses. In addition, there is a section of library statistics, showing data on the use, resources, and financial condition of Florida's libraries. The first section consists of listings…

  6. Machine learning, statistical learning and the future of biological research in psychiatry.

    PubMed

    Iniesta, R; Stahl, D; McGuffin, P

    2016-09-01

    Psychiatric research has entered the age of 'Big Data'. Datasets now routinely involve thousands of heterogeneous variables, including clinical, neuroimaging, genomic, proteomic, transcriptomic and other 'omic' measures. The analysis of these datasets is challenging, especially when the number of measurements exceeds the number of individuals, and may be further complicated by missing data for some subjects and variables that are highly correlated. Statistical learning-based models are a natural extension of classical statistical approaches but provide more effective methods to analyse very large datasets. In addition, the predictive capability of such models promises to be useful in developing decision support systems. That is, methods that can be introduced to clinical settings and guide, for example, diagnosis classification or personalized treatment. In this review, we aim to outline the potential benefits of statistical learning methods in clinical research. We first introduce the concept of Big Data in different environments. We then describe how modern statistical learning models can be used in practice on Big Datasets to extract relevant information. Finally, we discuss the strengths of using statistical learning in psychiatric studies, from both research and practical clinical points of view. PMID:27406289

  7. FAME: Software for analysing rock microstructures

    NASA Astrophysics Data System (ADS)

    Hammes, Daniel M.; Peternell, Mark

    2016-05-01

    Determination of rock microstructures leads to a better understanding of the formation and deformation of polycrystalline solids. Here, we present FAME (Fabric Analyser based Microstructure Evaluation), an easy-to-use MATLAB®-based software for processing datasets recorded by an automated fabric analyser microscope. FAME is provided as a MATLAB®-independent Windows® executable with an intuitive graphical user interface. Raw data from the fabric analyser microscope can be automatically loaded, filtered and cropped before analysis. Accurate and efficient rock microstructure analysis is based on an advanced user-controlled grain labelling algorithm. The preview and testing environments simplify the determination of appropriate analysis parameters. Various statistical and plotting tools allow a graphical visualisation of the results such as grain size, shape, c-axis orientation and misorientation. The FAME2elle algorithm exports fabric analyser data to an elle (modelling software)-supported format. FAME supports batch processing for multiple thin section analysis or large datasets that are generated for example during 2D in-situ deformation experiments. The use and versatility of FAME is demonstrated on quartz and deuterium ice samples.

  8. Integrated genomic analyses of ovarian carcinoma.

    PubMed

    2011-06-30

    A catalogue of molecular aberrations that cause ovarian cancer is critical for developing and deploying therapies that will improve patients' lives. The Cancer Genome Atlas project has analysed messenger RNA expression, microRNA expression, promoter methylation and DNA copy number in 489 high-grade serous ovarian adenocarcinomas and the DNA sequences of exons from coding genes in 316 of these tumours. Here we report that high-grade serous ovarian cancer is characterized by TP53 mutations in almost all tumours (96%); low prevalence but statistically recurrent somatic mutations in nine further genes including NF1, BRCA1, BRCA2, RB1 and CDK12; 113 significant focal DNA copy number aberrations; and promoter methylation events involving 168 genes. Analyses delineated four ovarian cancer transcriptional subtypes, three microRNA subtypes, four promoter methylation subtypes and a transcriptional signature associated with survival duration, and shed new light on the impact that tumours with BRCA1/2 (BRCA1 or BRCA2) and CCNE1 aberrations have on survival. Pathway analyses suggested that homologous recombination is defective in about half of the tumours analysed, and that NOTCH and FOXM1 signalling are involved in serous ovarian cancer pathophysiology. PMID:21720365

  9. A statistical development of entropy for the introductory physics course

    NASA Astrophysics Data System (ADS)

    Schoepf, David C.

    2002-02-01

    Many introductory physics texts introduce the statistical basis for the definition of entropy in addition to the Clausius definition, ΔS=q/T. We use a model based on equally spaced energy levels to present a way that the statistical definition of entropy can be developed at the introductory level. In addition to motivating the statistical definition of entropy, we also develop statistical arguments to answer the following questions: (i) Why does a system approach a state of maximum number of microstates? (ii) What is the equilibrium distribution of particles? (iii) What is the statistical basis of temperature? (iv) What is the statistical basis for the direction of spontaneous energy transfer? Finally, a correspondence between the statistical and the classical Clausius definitions of entropy is made.
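
    For reference, the standard relations behind the four questions above can be written compactly (this is the usual textbook notation, not taken verbatim from the article):

```latex
S = k_{B}\ln W, \qquad
\frac{n_{j}}{N} = \frac{e^{-\varepsilon_{j}/k_{B}T}}{\sum_{i} e^{-\varepsilon_{i}/k_{B}T}}, \qquad
\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N},
```

    where W is the number of microstates, the middle expression is the equilibrium (Boltzmann) distribution over the equally spaced levels ε_j, and the last relation gives the statistical basis of temperature; for a reversible transfer of heat q this reduces to the Clausius form ΔS = q/T.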

  10. MMOD: an R library for the calculation of population differentiation statistics.

    PubMed

    Winter, David J

    2012-11-01

    MMOD is a library for the R programming language that allows the calculation of the population differentiation measures D(est), G″(ST) and φ'(ST). R provides a powerful environment in which to conduct and record population genetic analyses but, at present, no R libraries provide functions for the calculation of these statistics from standard population genetic files. In addition to the calculation of differentiation measures, mmod can produce parametric bootstrap and jackknife samples of data sets for further analysis. By integrating with and complementing the existing libraries adegenet and pegas, mmod extends the power of R as a population genetic platform. PMID:22883857

  11. Statistical load data processing

    NASA Technical Reports Server (NTRS)

    Vandijk, G. M.

    1972-01-01

    A recorder system has been installed on two operational fighter aircraft. Signal values from a c.g.-acceleration transducer and a strain-gage installation at the wing root were sampled and recorded in digital format on the recorder system. To analyse such load-time histories for fatigue evaluation purposes, a number of counting methods are available in which level crossings, peaks, or ranges are counted. Ten different existing counting principles are defined. The load-time histories are analysed to evaluate these counting methods. For some of the described counting methods, the counting results might be affected by arbitrarily chosen parameters such as the magnitude of load ranges that will be neglected and other secondary counting restrictions. Such influences might invalidate the final counting results entirely. The evaluation shows that, for this type of load-time history, a sensible value of the parameters involved can be found for most counting methods.
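
    Two of the counting principles mentioned (upward level crossings and peak counting, with a crude range restriction) are illustrated in the Python sketch below; the load history and thresholds are invented for illustration.

```python
import numpy as np

def upward_level_crossings(load, level):
    """Count upward crossings of a given load level."""
    above = load >= level
    return int(np.sum(~above[:-1] & above[1:]))

def count_peaks(load, min_rise=0.0):
    """Count local maxima, ignoring peaks that exceed the smaller of their two
    neighbouring samples by less than min_rise (a crude secondary counting
    restriction of the 'neglect small ranges' kind)."""
    peaks = 0
    for i in range(1, len(load) - 1):
        if load[i] > load[i - 1] and load[i] >= load[i + 1]:
            if load[i] - min(load[i - 1], load[i + 1]) >= min_rise:
                peaks += 1
    return peaks

history = np.array([0.0, 1.2, 0.4, 2.5, 1.9, 2.1, 0.2, 1.8, -0.5, 0.9])
print(upward_level_crossings(history, level=1.0), count_peaks(history, min_rise=0.3))
```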

  12. Ontologies and tag-statistics

    NASA Astrophysics Data System (ADS)

    Tibély, Gergely; Pollner, Péter; Vicsek, Tamás; Palla, Gergely

    2012-05-01

    Due to the increasing popularity of collaborative tagging systems, the research on tagged networks, hypergraphs, ontologies, folksonomies and other related concepts is becoming an important interdisciplinary area with great potential and relevance for practical applications. In most collaborative tagging systems the tagging by the users is completely ‘flat’, while in some cases they are allowed to define a shallow hierarchy for their own tags. However, usually no overall hierarchical organization of the tags is given, and one of the interesting challenges of this area is to provide an algorithm generating the ontology of the tags from the available data. In contrast, there are also other types of tagged networks available for research, where the tags are already organized into a directed acyclic graph (DAG), encapsulating the ‘is a sub-category of’ type of hierarchy between each other. In this paper, we study how this DAG affects the statistical distribution of tags on the nodes marked by the tags in various real networks. The motivation for this research was the fact that understanding the tagging based on a known hierarchy can help in revealing the hidden hierarchy of tags in collaborative tagging systems. We analyse the relation between the tag-frequency and the position of the tag in the DAG in two large sub-networks of the English Wikipedia and a protein-protein interaction network. We also study the tag co-occurrence statistics by introducing a two-dimensional (2D) tag-distance distribution preserving both the difference in the levels and the absolute distance in the DAG for the co-occurring pairs of tags. Our most interesting finding is that the local relevance of tags in the DAG (i.e. their rank or significance as characterized by, e.g., the length of the branches starting from them) is much more important than their global distance from the root. Furthermore, we also introduce a simple tagging model based on random walks on the DAG, capable of

  13. Informal Statistics Help Desk

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, R. J.; Feiveson, A. H.

    2015-01-01

    Back by popular demand, the JSC Biostatistics Lab is offering an opportunity for informal conversation about challenges you may have encountered with issues of experimental design, analysis, data visualization or related topics. Get answers to common questions about sample size, repeated measures, violation of distributional assumptions, missing data, multiple testing, time-to-event data, when to trust the results of your analyses (reproducibility issues) and more.

  14. Statistics of the galaxy distribution

    NASA Astrophysics Data System (ADS)

    Lachièze-Rey, Marc

    1989-09-01

    The universe appears from recent observational results to be a highly structured but also highly disordered medium. This accounts for the difficulties with a conventional statistical approach. Since the statistics of disordered media is an increasingly well-studied field in physics, it is tempting to try to adapt its methods for the study of the universe (the use of correlation functions also resulted from the adaptation of techniques from a very different field to astrophysics). This is already the case for fractal analysis, which, mainly developed in microscopic statistics, is increasingly used in astrophysics. I suggest a new approach, also derived from the study of disordered media, both from the study of percolation clusters and from the dynamics of so-called “cluster aggregation” gelification models. This approach is briefly presented. Its main interest lies in two points. First, it suggests an analysis able to characterize features of unconventional statistics (those that seem to be present in the galaxy distribution and which conventional indicators are unable to take into account). Second, it appears a priori very convenient for a synthetic approach, since it can be related to the other indicators used up to now: the link with the void probability function is very straightforward. The connection with fractals can be said to be contained in the method, since the objects defined during this analysis are themselves fractal: different kinds of fractal dimensions are very easy to extract from the analysis. The link with the percolation studies is also very natural since the method is adapted from the study of percolation clusters. It is also expected that the information concerning the topology is contained in this approach; this seems natural since the method is very sensitive to the topology of the distribution and possesses some common characteristics with the topology analysis already developed by Gott et al. (1986). The quantitative relations remain

  15. Performance Boosting Additive

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Mainstream Engineering Corporation was awarded Phase I and Phase II contracts from Goddard Space Flight Center's Small Business Innovation Research (SBIR) program in early 1990. With support from the SBIR program, Mainstream Engineering Corporation has developed a unique low cost additive, QwikBoost (TM), that increases the performance of air conditioners, heat pumps, refrigerators, and freezers. Because of the energy and environmental benefits of QwikBoost, Mainstream received the Tibbetts Award at a White House Ceremony on October 16, 1997. QwikBoost was introduced at the 1998 International Air Conditioning, Heating, and Refrigeration Exposition. QwikBoost is packaged in a handy 3-ounce can (pressurized with R-134a) and will be available for automotive air conditioning systems in summer 1998.

  16. Sewage sludge additive

    NASA Technical Reports Server (NTRS)

    Kalvinskas, J. J.; Mueller, W. A.; Ingham, J. D. (Inventor)

    1980-01-01

    The additive is for a raw sewage treatment process of the type where settling tanks are used for the purpose of permitting the suspended matter in the raw sewage to be settled as well as to permit adsorption of the dissolved contaminants in the water of the sewage. The sludge, which settles down to the bottom of the settling tank is extracted, pyrolyzed and activated to form activated carbon and ash which is mixed with the sewage prior to its introduction into the settling tank. The sludge does not provide all of the activated carbon and ash required for adequate treatment of the raw sewage. It is necessary to add carbon to the process and instead of expensive commercial carbon, coal is used to provide the carbon supplement.

  17. Perspectives on Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Bourell, David L.

    2016-07-01

    Additive manufacturing (AM) has skyrocketed in visibility commercially and in the public sector. This article describes the development of this field from early layered manufacturing approaches of photosculpture, topography, and material deposition. Certain precursors to modern AM processes are also briefly described. The growth of the field over the last 30 years is presented. Included is the standard delineation of AM technologies into seven broad categories. The economics of AM part generation is considered, and the impacts of the economics on application sectors are described. On the basis of current trends, the future outlook will include a convergence of AM fabricators, mass-produced AM fabricators, enabling of topology optimization designs, and specialization in the AM legal arena. Long-term developments with huge impact are organ printing and volume-based printing.

  18. New addition curing polyimides

    NASA Technical Reports Server (NTRS)

    Frimer, Aryeh A.; Cavano, Paul

    1991-01-01

    In an attempt to improve the thermal-oxidative stability (TOS) of PMR-type polymers, the use of 1,4-phenylenebis(phenylmaleic anhydride), PPMA, was evaluated. Two series of nadic end-capped addition curing polyimides were prepared by imidizing PPMA with either 4,4'-methylene dianiline or p-phenylenediamine. The first resulted in improved solubility and increased resin flow while the latter yielded a compression molded neat resin sample with a Tg of 408 C, close to 70 C higher than PMR-15. The performance of these materials in long term weight loss studies was below that of PMR-15, independent of post-cure conditions. These results can be rationalized in terms of the thermal lability of the pendant phenyl groups and the incomplete imidization of the sterically congested PPMA. The preparation of model compounds as well as future research directions are discussed.

  19. Who Needs Statistics? | Poster

    Cancer.gov

    You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help. This small group of experts provides a wide array of statistical and mathematical consulting services to the scientific community at NCI at Frederick and NCI-Bethesda.

  20. International petroleum statistics report

    SciTech Connect

    1995-10-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.

  1. Statistics of atmospheric correlations.

    PubMed

    Santhanam, M S; Patra, P K

    2001-07-01

    For a large class of quantum systems, the statistical properties of their spectrum show remarkable agreement with random matrix predictions. Recent advances show that the scope of random matrix theory is much wider. In this work, we show that the random matrix approach can be beneficially applied to a completely different classical domain, namely, to the empirical correlation matrices obtained from the analysis of the basic atmospheric parameters that characterize the state of the atmosphere. We show that the spectrum of atmospheric correlation matrices satisfies the random matrix prescription. In particular, the eigenmodes of the atmospheric empirical correlation matrices that have physical significance are marked by deviations from the eigenvector distribution. PMID:11461326
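
    As a rough illustration of the kind of spectral comparison described above (not the authors' pipeline), the following Python sketch builds an empirical correlation matrix from synthetic multivariate data and checks its eigenvalues against the Marchenko-Pastur band that random matrix theory predicts for uncorrelated data; the sample sizes and the data themselves are invented placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for an atmospheric data set:
    # T time samples of N grid-point values of some field.
    T, N = 2000, 100
    data = rng.standard_normal((T, N))

    # Empirical correlation matrix and its eigenvalue spectrum.
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)

    # Marchenko-Pastur bounds for purely random data with ratio q = N / T.
    q = N / T
    lam_min = (1 - np.sqrt(q)) ** 2
    lam_max = (1 + np.sqrt(q)) ** 2

    # Eigenvalues outside the band are candidates for physically meaningful modes.
    outliers = eigvals[(eigvals < lam_min) | (eigvals > lam_max)]
    print(f"MP band: [{lam_min:.3f}, {lam_max:.3f}]; {outliers.size} of {N} eigenvalues outside")
    ```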

  2. Hybrid perturbation methods based on statistical time series models

    NASA Astrophysics Data System (ADS)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical, or semianalytical theory, generates an initial approximation that contains some inaccuracies: in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, and mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
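
    As a minimal sketch of the prediction step described above (not the authors' implementation), the additive Holt-Winters model available in statsmodels can be fitted to a placeholder residual series, i.e. the difference between a reference orbit and a low-order analytical approximation; the series below is synthetic.

    ```python
    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(1)

    # Placeholder residual series: reference orbit minus low-order analytical
    # approximation, sampled at a fixed step (quasi-periodic signal plus drift).
    t = np.arange(600)
    residuals = 0.5 * np.sin(2 * np.pi * t / 15) + 0.01 * t + 0.05 * rng.standard_normal(t.size)
    train, horizon = residuals[:500], 100

    # Additive Holt-Winters model of the dynamics missing from the analytical part.
    fit = ExponentialSmoothing(train, trend="add", seasonal="add", seasonal_periods=15).fit()

    # Forecast the residual over the propagation horizon; adding it back to the
    # analytical approximation gives the hybrid prediction.
    predicted_residual = fit.forecast(horizon)
    print(predicted_residual[:5])
    ```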

  3. Statistical properties of thermal Sunyaev-Zel'dovich maps

    NASA Astrophysics Data System (ADS)

    Munshi, Dipak; Joudaki, Shahab; Smidt, Joseph; Coles, Peter; Kay, Scott T.

    2013-02-01

    On small angular scales, i.e. at high angular frequencies, beyond the damping tail of the primary power spectrum, the dominant contribution to the power spectrum of cosmic microwave background temperature fluctuations is the thermal Sunyaev-Zel'dovich (tSZ) effect. We investigate various important statistical properties of the Sunyaev-Zel'dovich maps, using well-motivated models for dark matter clustering to construct statistical descriptions of the tSZ effect to all orders enabling us to determine the entire probability distribution function (PDF). Any generic deterministic biasing scheme can be incorporated in our analysis and the effects of projection, biasing and the underlying density distribution can be analysed separately and transparently in this approach. We introduce the cumulant correlators as tools to analyse tSZ catalogues and relate them to corresponding statistical descriptors of the underlying density distribution. The statistics of hot spots in frequency-cleaned tSZ maps are also developed in a self-consistent way to an arbitrary order, to obtain results complementary to those found using the halo model. We also consider different beam sizes to check the extent to which the PDF can be extracted from various observational configurations. The formalism is presented with two specific models for underlying matter clustering, the hierarchical ansatz and the lognormal distribution. We find both models to be in very good agreement with the simulation results, though the extension of the hierarchical model has an edge over the lognormal model. In addition to testing against simulations made using semi-analytical techniques, we have also used the maps made using Millennium Gas Simulations to prove that the PDF and bias can indeed be predicted with very high accuracy using these models. The presence of significant non-gravitational effects such as preheating, however, cannot be modelled using an analytical approach which is based on the modelling of

  4. Dissociable behavioural outcomes of visual statistical learning

    PubMed Central

    Turk-Browne, Nicholas B.; Seitz, Aaron R.

    2016-01-01

    Statistical learning refers to the extraction of probabilistic relationships between stimuli and is increasingly used as a method to understand learning processes. However, numerous cognitive processes are sensitive to the statistical relationships between stimuli and any one measure of learning may conflate these processes; to date little research has focused on differentiating these processes. To understand how multiple processes underlie statistical learning, here we compared, within the same study, operational measures of learning from different tasks that may be differentially sensitive to these processes. In Experiment 1, participants were visually exposed to temporal regularities embedded in a stream of shapes. Their task was to periodically detect whether a shape, whose contrast was staircased to a threshold level, was present or absent. Afterwards, they completed a search task, where statistically predictable shapes were found more quickly. We used the search task to label shape pairs as “learned” or “non-learned”, and then used these labels to analyse the detection task. We found a dissociation between learning on the search task and the detection task where only non-learned pairs showed learning effects in the detection task. This finding was replicated in further experiments with recognition memory (Experiment 2) and associative learning tasks (Experiment 3). Taken together, these findings are consistent with the view that statistical learning may comprise a family of processes that can produce dissociable effects on different aspects of behaviour.

  5. NeuroVault.org: a web-based repository for collecting and sharing unthresholded statistical maps of the human brain

    PubMed Central

    Gorgolewski, Krzysztof J.; Varoquaux, Gael; Rivera, Gabriel; Schwarz, Yannick; Ghosh, Satrajit S.; Maumet, Camille; Sochat, Vanessa V.; Nichols, Thomas E.; Poldrack, Russell A.; Poline, Jean-Baptiste; Yarkoni, Tal; Margulies, Daniel S.

    2015-01-01

    Here we present NeuroVault—a web based repository that allows researchers to store, share, visualize, and decode statistical maps of the human brain. NeuroVault is easy to use and employs modern web technologies to provide informative visualization of data without the need to install additional software. In addition, it leverages the power of the Neurosynth database to provide cognitive decoding of deposited maps. The data are exposed through a public REST API enabling other services and tools to take advantage of it. NeuroVault is a new resource for researchers interested in conducting meta- and coactivation analyses. PMID:25914639
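
    The REST API mentioned above can in principle be queried with a few lines of Python; the endpoint path and JSON field names below are assumptions and should be checked against the current documentation at https://neurovault.org/api/.

    ```python
    import requests

    # Assumed endpoint and field names; verify against the NeuroVault API
    # documentation before relying on this.
    resp = requests.get("https://neurovault.org/api/collections/", params={"format": "json"})
    resp.raise_for_status()
    for collection in resp.json().get("results", [])[:5]:
        print(collection.get("id"), collection.get("name"))
    ```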

  6. Overweight and Obesity Statistics

    MedlinePlus

  7. Relationship between Graduate Students' Statistics Self-Efficacy, Statistics Anxiety, Attitude toward Statistics, and Social Support

    ERIC Educational Resources Information Center

    Perepiczka, Michelle; Chandler, Nichelle; Becerra, Michael

    2011-01-01

    Statistics plays an integral role in graduate programs. However, numerous intra- and interpersonal factors may lead to successful completion of needed coursework in this area. The authors examined the extent of the relationship between self-efficacy to learn statistics and statistics anxiety, attitude towards statistics, and social support of 166…

  8. Statistical summary of daily values data and trend analysis of dissolved-solids data at National Stream Quality Accounting Network (NASQAN) stations

    USGS Publications Warehouse

    Wells, F.C.; Schertz, T.L.

    1983-01-01

    This report provides a statistical summary of the available continuous and once-daily discharge, specific-conductance, dissolved-oxygen, water-temperature, and pH data collected at NASQAN stations during the 1973-81 water years and documents the period of record on which the statistical calculations were based. In addition, dissolved-solids data are examined by regression analyses to determine the relation between dissolved solids and specific conductance and to determine if long-term trends can be detected in dissolved-solids concentrations. Statistical summaries, regression equations expressing the relation between dissolved solids and specific conductance, and graphical presentations of trend analyses of dissolved solids are presented for 515 NASQAN stations in the United States, Canada, Guam, and Puerto Rico.
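
    A minimal sketch of the kind of relation the report tabulates, using invented numbers rather than NASQAN data, regresses dissolved-solids concentration on specific conductance:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Synthetic placeholder data for one hypothetical station:
    # specific conductance (microsiemens/cm) and dissolved solids (mg/L).
    conductance = rng.uniform(100, 1500, size=60)
    dissolved_solids = 0.62 * conductance + rng.normal(0, 25, size=60)

    # Least-squares relation of the form DS = m * SC + b.
    fit = stats.linregress(conductance, dissolved_solids)
    print(f"DS = {fit.slope:.3f} * SC + {fit.intercept:.1f}  (r^2 = {fit.rvalue ** 2:.3f})")
    ```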

  9. [Comment on] Statistical discrimination

    NASA Astrophysics Data System (ADS)

    Chinn, Douglas

    In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.

  10. International petroleum statistics report

    SciTech Connect

    1997-05-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  11. Statistical clumped isotope signatures

    PubMed Central

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168
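
    A small worked example with made-up heavy-isotope fractions illustrates the statistical anti-clumping effect described above: pairing atoms drawn from two isotopically different pools always yields fewer doubly substituted molecules than the stochastic reference computed from the bulk composition.

    ```python
    # Illustrative (made-up) heavy-isotope fractions in two pools of otherwise
    # indistinguishable atoms that combine into a two-atom molecule.
    p1, p2 = 0.010, 0.020

    # Actual probability of a doubly substituted molecule: one atom from each pool.
    p_double_actual = p1 * p2

    # Stochastic reference computed from the bulk (average) composition.
    p_bulk = 0.5 * (p1 + p2)
    p_double_reference = p_bulk ** 2

    # Apparent clumped anomaly relative to the stochastic reference, in permil.
    delta = (p_double_actual / p_double_reference - 1) * 1000
    print(f"apparent clumped anomaly: {delta:.0f} permil")  # negative: anti-clumping
    ```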

  12. Statistical clumped isotope signatures.

    PubMed

    Röckmann, T; Popa, M E; Krol, M C; Hofmann, M E G

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168

  13. Fragile entanglement statistics

    NASA Astrophysics Data System (ADS)

    Brody, Dorje C.; Hughston, Lane P.; Meier, David M.

    2015-10-01

    If X and Y are independent, Y and Z are independent, and so are X and Z, one might be tempted to conclude that X, Y, and Z are independent. But it has long been known in classical probability theory that, intuitive as it may seem, this is not true in general. In quantum mechanics one can ask whether analogous statistics can emerge for configurations of particles in certain types of entangled states. The explicit construction of such states, along with the specification of suitable sets of observables that have the purported statistical properties, is not entirely straightforward. We show that an example of such a configuration arises in the case of an N-particle GHZ state, and we are able to identify a family of observables with the property that the associated measurement outcomes are independent for any choice of 2, 3, ..., N-1 of the particles, even though the measurement outcomes for all N particles are not independent. Although such states are highly entangled, the entanglement turns out to be 'fragile', i.e. the associated density matrix has the property that if one traces out the freedom associated with even a single particle, the resulting reduced density matrix is separable.
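
    The classical probability fact quoted at the start of this abstract has a standard textbook illustration, not taken from the paper: two fair ±1 coin flips X and Y together with their product Z = XY are pairwise independent but not mutually independent. A quick numerical check:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    x = rng.choice([-1, 1], size=n)
    y = rng.choice([-1, 1], size=n)
    z = x * y  # independent of x alone and of y alone, but fixed by the pair (x, y)

    def joint(a, b):
        """Empirical P(a == 1 and b == 1)."""
        return np.mean((a == 1) & (b == 1))

    # Each pair factorizes: all three joint probabilities are close to 0.25.
    print(joint(x, y), joint(x, z), joint(y, z))

    # The triple does not: P(X = 1, Y = 1, Z = -1) is exactly 0, not 0.125.
    print(np.mean((x == 1) & (y == 1) & (z == -1)))
    ```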

  14. Pooling Morphometric Estimates: A Statistical Equivalence Approach.

    PubMed

    Pardoe, Heath R; Cutter, Gary R; Alter, Rachel; Hiess, Rebecca Kucharsky; Semmelroch, Mira; Parker, Donna; Farquharson, Shawna; Jackson, Graeme D; Kuzniecky, Ruben

    2016-01-01

    Changes in hardware or image-processing settings are a common issue for large multicenter studies. To pool MRI data acquired under these changed conditions, it is necessary to demonstrate that the changes do not affect MRI-based measurements. In these circumstances, classical inference testing is inappropriate because it is designed to detect differences, not prove similarity. We used a method known as statistical equivalence testing to address this limitation. Equivalence testing was carried out on 3 datasets: (1) cortical thickness and automated hippocampal volume estimates obtained from healthy individuals imaged using different multichannel head coils; (2) manual hippocampal volumetry obtained using two readers; and (3) corpus callosum area estimates obtained using an automated method with manual cleanup carried out by two readers. Equivalence testing was carried out using the "two one-sided tests" (TOST) approach. Power analyses of the TOST were used to estimate sample sizes required for well-powered equivalence testing analyses. Mean and standard deviation estimates from the automated hippocampal volume dataset were used to carry out an example power analysis. Cortical thickness values were found to be equivalent over 61% of the cortex when different head coils were used (q < .05, false discovery rate correction). Automated hippocampal volume estimates obtained using the same two coils were statistically equivalent (TOST P = 4.28 × 10^-15). Manual hippocampal volume estimates obtained using two readers were not statistically equivalent (TOST P = .97). The use of different readers to carry out limited correction of automated corpus callosum segmentations yielded equivalent area estimates (TOST P = 1.28 × 10^-14). Power analysis of simulated and automated hippocampal volume data demonstrated that the equivalence margin affects the number of subjects required for well-powered equivalence tests. We have presented a statistical method for determining if
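
    The logic of the two one-sided tests (TOST) procedure is easy to sketch; the paired volumes and the equivalence margin in the following fragment are invented, and this is not the paper's analysis pipeline.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    # Invented paired volumes (mm^3) for the same subjects under two conditions,
    # and an arbitrary equivalence margin.
    vol_a = rng.normal(4200, 300, size=30)
    vol_b = vol_a + rng.normal(0, 40, size=30)
    margin = 100.0

    diff = vol_b - vol_a
    n = diff.size
    se = diff.std(ddof=1) / np.sqrt(n)

    # Two one-sided t tests: mean difference above -margin AND below +margin.
    p_lower = 1 - stats.t.cdf((diff.mean() + margin) / se, df=n - 1)  # H0: mean <= -margin
    p_upper = stats.t.cdf((diff.mean() - margin) / se, df=n - 1)      # H0: mean >= +margin
    p_tost = max(p_lower, p_upper)

    print(f"TOST p = {p_tost:.2g}; equivalent at the chosen margin: {p_tost < 0.05}")
    ```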

  15. Eigenfunction statistics on quantum graphs

    SciTech Connect

    Gnutzmann, S.; Keating, J.P.; Piotet, F.

    2010-12-15

    We investigate the spatial statistics of the energy eigenfunctions on large quantum graphs. It has previously been conjectured that these should be described by a Gaussian Random Wave Model, by analogy with quantum chaotic systems, for which such a model was proposed by Berry in 1977. The autocorrelation functions we calculate for an individual quantum graph exhibit a universal component, which completely determines a Gaussian Random Wave Model, and a system-dependent deviation. This deviation depends on the graph only through its underlying classical dynamics. Classical criteria for quantum universality to be met asymptotically in the large graph limit (i.e. for the non-universal deviation to vanish) are then extracted. We use an exact field theoretic expression in terms of a variant of a supersymmetric σ model. A saddle-point analysis of this expression leads to the estimates. In particular, intensity correlations are used to discuss the possible equidistribution of the energy eigenfunctions in the large graph limit. When equidistribution is asymptotically realized, our theory predicts a rate of convergence that is a significant refinement of previous estimates. The universal and system-dependent components of intensity correlation functions are recovered by means of an exact trace formula which we analyse in the diagonal approximation, drawing in this way a parallel between the field theory and semiclassics. Our results provide the first instance where an asymptotic Gaussian Random Wave Model has been established microscopically for eigenfunctions in a system with no disorder.

  16. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    SciTech Connect

    Pevey, Ronald E.

    2005-09-15

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
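
    One common formulation of the final comparison the paper is concerned with, using invented numbers for the bias, its standard deviation, and the calculated k-effective (and not necessarily the paper's exact procedure), is:

    ```python
    # Invented benchmarking and calculation outcomes.
    bias = -0.005        # mean (calculated - expected) k-eff over benchmark cases
    sigma_bias = 0.003   # standard deviation of that bias
    admin_margin = 0.05  # administrative margin of subcriticality
    k_calc = 0.92        # calculated k-effective of the configuration
    sigma_calc = 0.001   # Monte Carlo statistical standard deviation
    n_sigma = 2          # standard deviations added before comparison to the USL

    # One common form of the upper subcritical limit (USL).
    usl = 1.0 + bias - sigma_bias - admin_margin

    k_compare = k_calc + n_sigma * sigma_calc
    print(f"USL = {usl:.4f}, k + {n_sigma} sigma = {k_compare:.4f}, "
          f"acceptable = {k_compare <= usl}")
    ```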

  17. Additions and deletions to the known cerambycidae (Coleoptera) of Bolivia

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An additional 137 species and two tribes are added to the known cerambycid fauna of Bolivia while 12 species are deleted. Comments and statistics regarding the growth of knowledge on the Bolivian Cerambycid fauna and species endemicity are included....

  18. Predictability of the recent slowdown and subsequent recovery of large-scale surface warming using statistical methods

    NASA Astrophysics Data System (ADS)

    Mann, Michael E.; Steinman, Byron A.; Miller, Sonya K.; Frankcombe, Leela M.; England, Matthew H.; Cheung, Anson H.

    2016-04-01

    The temporary slowdown in large-scale surface warming during the early 2000s has been attributed to both external and internal sources of climate variability. Using semiempirical estimates of the internal low-frequency variability component in Northern Hemisphere, Atlantic, and Pacific surface temperatures in concert with statistical hindcast experiments, we investigate whether the slowdown and its recent recovery were predictable. We conclude that the internal variability of the North Pacific, which played a critical role in the slowdown, does not appear to have been predictable using statistical forecast methods. An additional minor contribution from the North Atlantic, by contrast, appears to exhibit some predictability. While our analyses focus on combining semiempirical estimates of internal climatic variability with statistical hindcast experiments, possible implications for initialized model predictions are also discussed.

  19. Statistical Literacy: Developing a Youth and Adult Education Statistical Project

    ERIC Educational Resources Information Center

    Conti, Keli Cristina; Lucchesi de Carvalho, Dione

    2014-01-01

    This article focuses on the notion of literacy--general and statistical--in the analysis of data from a fieldwork research project carried out as part of a master's degree that investigated the teaching and learning of statistics in adult education mathematics classes. We describe the statistical context of the project that involved the…

  20. Understanding Statistics and Statistics Education: A Chinese Perspective

    ERIC Educational Resources Information Center

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  1. Statistics Anxiety and Business Statistics: The International Student

    ERIC Educational Resources Information Center

    Bell, James A.

    2008-01-01

    Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…

  2. Wide Wide World of Statistics: International Statistics on the Internet.

    ERIC Educational Resources Information Center

    Foudy, Geraldine

    2000-01-01

    Explains how to find statistics on the Internet, especially international statistics. Discusses advantages over print sources, including convenience, currency of information, cost effectiveness, and value-added formatting; sources of international statistics; United Nations agencies; search engines and power searching; and evaluating sources. (LRW)

  3. Statistical Modeling for Radiation Hardness Assurance

    NASA Technical Reports Server (NTRS)

    Ladbury, Raymond L.

    2014-01-01

    We cover the models and statistics associated with single event effects (and total ionizing dose), why we need them, and how to use them: which models are used, what errors exist in real test data, and what the models allow us to say about the device under test (DUT). In addition, we cover how to use other sources of data, such as historical, heritage, and similar-part data, and how to apply experience, physics, and expert opinion to the analysis. Also included are the concepts of Bayesian statistics, data fitting, and bounding rates.
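
    As one concrete instance of the bounding-rates idea mentioned above (a standard Poisson upper bound, not necessarily the procedure used in the presentation), an upper limit on a single-event-effect cross section when few or no events are observed in a test can be computed as follows; the fluence and event count are invented.

    ```python
    from scipy import stats

    # Invented test outcome: observed event count and total fluence (ions/cm^2).
    events_observed = 0
    fluence = 1.0e7

    # 95% upper confidence bound on the Poisson mean ("rule of three" when zero
    # events are seen), converted to a cross-section bound.
    mu_upper = stats.chi2.ppf(0.95, 2 * (events_observed + 1)) / 2
    sigma_upper = mu_upper / fluence  # cm^2

    print(f"95% upper bound on SEE cross section: {sigma_upper:.2e} cm^2")
    ```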

  4. Improving extreme value statistics.

    PubMed

    Shekhawat, Ashivni

    2014-11-01

    The rate of convergence in extreme value statistics is nonuniversal and can be arbitrarily slow. Further, the relative error can be unbounded in the tail of the approximation, leading to difficulty in extrapolating the extreme value fit beyond the available data. We introduce the T method, and show that by using simple nonlinear transformations the extreme value approximation can be rendered rapidly convergent in the bulk, and asymptotic in the tail, thus fixing both issues. The transformations are often parametrized by just one parameter, which can be estimated numerically. The classical extreme value method is shown to be a special case of the proposed method. We demonstrate that vastly improved results can be obtained with almost no extra cost. PMID:25493780
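
    For orientation, the classical extreme value method that the paper improves on can be sketched with a block-maxima fit of the generalized extreme value distribution in scipy; the data below are synthetic and the T method itself is not reproduced here.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)

    # Synthetic placeholder data split into blocks; keep each block's maximum,
    # as in the classical block-maxima approach the paper takes as its baseline.
    block_maxima = rng.standard_normal((200, 50)).max(axis=1)

    # Classical fit of the generalized extreme value (GEV) distribution.
    shape, loc, scale = stats.genextreme.fit(block_maxima)

    # Extrapolated high quantile of block maxima from the fitted GEV.
    q999 = stats.genextreme.ppf(0.999, shape, loc=loc, scale=scale)
    print(f"shape={shape:.3f}, loc={loc:.3f}, scale={scale:.3f}, 99.9% quantile={q999:.2f}")
    ```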

  5. Statistical test of anarchy

    NASA Astrophysics Data System (ADS)

    de Gouvêa, André; Murayama, Hitoshi

    2003-10-01

    “Anarchy” is the hypothesis that there is no fundamental distinction among the three flavors of neutrinos. It describes the mixing angles as random variables, drawn from well-defined probability distributions dictated by the group Haar measure. We perform a Kolmogorov-Smirnov (KS) statistical test to verify whether anarchy is consistent with all neutrino data, including the new result presented by KamLAND. We find a KS probability for Nature's choice of mixing angles equal to 64%, quite consistent with the anarchical hypothesis. In turn, assuming that anarchy is indeed correct, we compute lower bounds on |Ue3|2, the remaining unknown “angle” of the leptonic mixing matrix.
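
    A toy version of the Kolmogorov-Smirnov check described above can be run with scipy; purely for illustration, the reference distribution is taken to be uniform on [0, 1] (standing in for the Haar-measure prediction) and the three "observations" are made up.

    ```python
    import numpy as np
    from scipy import stats

    # Made-up "observations" of a transformed mixing-angle variable; for this
    # sketch the null hypothesis predicts a uniform distribution on [0, 1]
    # (a stand-in for the actual Haar-measure prediction).
    sample = np.array([0.31, 0.55, 0.93])

    statistic, p_value = stats.kstest(sample, "uniform")
    print(f"KS statistic = {statistic:.3f}, p = {p_value:.3f}")
    ```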

  6. Statistical physics "Beyond equilibrium"

    SciTech Connect

    Ecke, Robert E

    2009-01-01

    The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.

  7. Why Tsallis statistics?

    NASA Astrophysics Data System (ADS)

    Baranger, Michel

    2002-03-01

    It is a remarkable fact that the traditional teaching of thermodynamics, as reflected in the textbooks and including the long developments about ensembles and thermodynamic functions, is almost entirely about systems in equilibrium. The time variable does not enter. There is one exception, however. The single most important item, the flagship of the thermodynamic navy, the second law, is about the irreversibility of the time evolution of systems out of equilibrium. This is a bizarre situation, to say the least; a glaring case of the drunk man looking for his key under the lamp-post, when he knows that he lost it in the dark part of the street. The moment has come for us to go looking in the dark part, the behavior of systems as a function of time. We have been given a powerful new flashlight, chaos theory. We should use it. There, on the formerly dark pavement, we can find Tsallis statistics.

  8. Fast approximate motif statistics.

    PubMed

    Nicodème, P

    2001-01-01

    We present in this article a fast approximate method for computing the statistics of a number of non-self-overlapping matches of motifs in a random text in the nonuniform Bernoulli model. This method is well suited for protein motifs where the probability of self-overlap of motifs is small. For 96% of the PROSITE motifs, the expectations of occurrences of the motifs in a 7-million-amino-acids random database are computed by the approximate method with less than 1% error when compared with the exact method. Processing of the whole PROSITE takes about 30 seconds with the approximate method. We apply this new method to a comparison of the C. elegans and S. cerevisiae proteomes. PMID:11535175
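
    A much simpler calculation than the paper's, ignoring self-overlap corrections entirely, already gives the flavour of the expectation being computed: for a motif written as a list of allowed residues per position, the expected number of matches in a random Bernoulli text is the number of starting positions times the product of per-position match probabilities. The frequencies and motif below are invented.

    ```python
    # Invented amino-acid background frequencies (nonuniform Bernoulli model);
    # a real application would estimate these from the database itself.
    background = {"A": 0.08, "C": 0.02, "D": 0.05, "G": 0.07, "K": 0.06, "L": 0.10}

    # Toy motif: each position lists the residues it accepts (PROSITE-like classes).
    motif = [{"A", "G"}, {"C"}, {"K", "L"}, {"D"}]

    text_length = 7_000_000  # size of the random database, in residues

    # Per-position match probability is the summed frequency of accepted residues.
    p_match = 1.0
    for position in motif:
        p_match *= sum(background.get(residue, 0.0) for residue in position)

    expected_matches = (text_length - len(motif) + 1) * p_match
    print(f"expected occurrences: {expected_matches:.1f}")
    ```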

  9. Statistical design controversy

    SciTech Connect

    Evans, L.S.; Hendrey, G.R.; Thompson, K.H.

    1985-02-01

    This article is a response to criticisms that Evans, Hendrey, and Thompson received alleging that their article was biased because of omissions and misrepresentations. The authors contend that their conclusion that experimental designs having only one plot per treatment "were, from the outset, not capable of differentiating between treatment effects and field-position effects" remains valid and is supported by decades of agronomic research. Several commenters (Irving, Troiano, and McCune) treated the article as a review of all studies of acidic rain effects on soybeans. It was not. The article was written out of concern about the comparisons being made among studies that purport to evaluate effects of acid deposition on field-grown crops, comparisons that implicitly assume all of the studies are of equal scientific value. They are not. Only experimental approaches that are well focused and designed with appropriate agronomic and statistical procedures should be used for credible regional and national assessments of crop inventories. 12 references.

  10. Statistical Thermodynamics of Biomembranes

    PubMed Central

    Devireddy, Ram V.

    2010-01-01

    The major issues involved in the statistical thermodynamic treatment of phospholipid membranes at the atomistic level are summarized: thermodynamic ensembles, initial configuration (or the physical system being modeled), force field representation, and the representation of long-range interactions. This is followed by a description of the various ways that the simulated ensembles can be analyzed: area of the lipid, mass density profiles, radial distribution functions (RDFs), water orientation profile, deuterium order parameter, free energy profiles, and void (pore) formation, with particular focus on the results obtained from our recent molecular dynamics (MD) simulations of phospholipids interacting with dimethylsulfoxide (Me2SO), a commonly used cryoprotective agent (CPA). PMID:19460363

  11. Statistical crack mechanics

    SciTech Connect

    Dienes, J.K.

    1983-01-01

    An alternative to the use of plasticity theory to characterize the inelastic behavior of solids is to represent the flaws by statistical methods. We have taken such an approach to study fragmentation because it offers a number of advantages. Foremost among these is that, by considering the effects of flaws, it becomes possible to address the underlying physics directly. For example, we have been able to explain why rocks exhibit large strain-rate effects (a consequence of the finite growth rate of cracks), why a spherical explosive imbedded in oil shale produces a cavity with a nearly square section (opening of bedding cracks) and why propellants may detonate following low-speed impact (a consequence of frictional hot spots).

  12. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi(sup 2) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and with matching these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression, density estimation and smoothing, general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to

  13. Statistical properties of randomization in clinical trials.

    PubMed

    Lachin, J M

    1988-12-01

    This is the first of five articles on the properties of different randomization procedures used in clinical trials. This paper presents definitions and discussions of the statistical properties of randomization procedures as they relate to both the design of a clinical trial and the statistical analysis of trial results. The subsequent papers consider, respectively, the properties of simple (complete), permuted-block (i.e., blocked), and urn (adaptive biased-coin) randomization. The properties described herein are the probabilities of treatment imbalances and the potential effects on the power of statistical tests; the permutational basis for statistical tests; and the potential for experimental biases in the assessment of treatment effects due either to the predictability of the random allocations (selection bias) or the susceptibility of the randomization procedure to covariate imbalances (accidental bias). For most randomization procedures, the probabilities of overall treatment imbalances are readily computed, even when a stratified randomization is used. This is important because treatment imbalance may affect statistical power. It is shown, however, that treatment imbalance must be substantial before power is more than trivially affected. The differences between a population versus a permutation model as a basis for a statistical test are reviewed. It is argued that a population model can only be invoked in clinical trials as an untestable assumption, rather than being formally based on sampling at random from a population. On the other hand, a permutational analysis based on the randomization actually employed requires no assumptions regarding the origin of the samples of patients studied. The large sample permutational distribution of the family of linear rank tests is described as a basis for easily conducting a variety of permutation tests. Subgroup (stratified) analyses, analyses when some data are missing, and regression model analyses are also
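
    A minimal sketch of the permutational reasoning described above, with invented outcomes and a simple difference-in-means statistic (the paper itself discusses the family of linear rank tests), is:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Invented outcomes and the treatment labels actually assigned (1 = treatment).
    outcomes = np.array([5.1, 6.3, 4.8, 7.0, 5.9, 6.8, 4.5, 5.5, 6.1, 7.2])
    labels = np.array([0, 1, 0, 1, 0, 1, 0, 0, 1, 1])

    def diff_in_means(y, g):
        return y[g == 1].mean() - y[g == 0].mean()

    observed = diff_in_means(outcomes, labels)

    # Re-randomization distribution: permute the labels as the randomization could
    # have assigned them and recompute the statistic each time.
    perm_stats = np.array([
        diff_in_means(outcomes, rng.permutation(labels)) for _ in range(10_000)
    ])
    p_value = np.mean(np.abs(perm_stats) >= abs(observed))
    print(f"observed difference = {observed:.2f}, permutation p = {p_value:.3f}")
    ```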

  14. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).

    PubMed

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497

  15. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)

    PubMed Central

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497

  16. Statistical cautions when estimating DEBtox parameters.

    PubMed

    Billoir, Elise; Delignette-Muller, Marie Laure; Péry, Alexandre R R; Geffard, Olivier; Charles, Sandrine

    2008-09-01

    DEBtox (Dynamic Energy Budget in toxicology) models have been designed to analyse various results from classic tests in ecotoxicology. They consist of a set of mechanistic models describing how organisms manage their energy, when they are exposed to a contaminant. Until now, such a biology-based modeling approach has not been used within the regulatory context. However, these methods have been promoted and discussed in recent guidance documents on the statistical analysis of ecotoxicity data. Indeed, they help us to understand the underlying mechanisms. In this paper, we focused on the 21 day Daphnia magna reproduction test. We first aimed to clarify and detail the model building process leading to DEBtox models. Equations were rederived step by step, and for some of them we obtained results different from the published ones. Then, we statistically evaluated the estimation process quality when using a least squares approach. Using both experimental and simulated data, our analyses highlighted several statistical issues related to the fitting of DEBtox models on OECD-type reproduction data. In this case, particular attention had to be paid to parameter estimates and the interpretation of their confidence intervals. PMID:18571678

  17. Bringing Statistics Up to Speed with Data in Analysis of Lymphocyte Motility

    PubMed Central

    Letendre, Kenneth; Donnadieu, Emmanuel

    2015-01-01

    Two-photon (2P) microscopy provides immunologists with 3D video of the movement of lymphocytes in vivo. Motility parameters extracted from these videos allow detailed analysis of lymphocyte motility in lymph nodes and peripheral tissues. However, standard parametric statistical analyses such as the Student’s t-test are often used incorrectly, and fail to take into account confounds introduced by the experimental methods, potentially leading to erroneous conclusions about T cell motility. Here, we compare the motility of WT T cells versus PKCθ-/-, CARMA1-/-, CCR7-/-, and PTX-treated T cells. We show that the fluorescent dyes used to label T cells have significant effects on T cell motility, and we demonstrate the use of factorial ANOVA as a statistical tool that can control for these effects. In addition, researchers often choose between the use of “cell-based” parameters by averaging multiple steps of a single cell over time (e.g. cell mean speed), or “step-based” parameters, in which all steps of a cell population (e.g. instantaneous speed) are grouped without regard for the cell track. Using mixed model ANOVA, we show that we can maintain cell-based analyses without losing the statistical power of step-based data. We find that as we use additional levels of statistical control, we can more accurately estimate the speed of T cells as they move in lymph nodes as well as measure the impact of individual signaling molecules on T cell motility. As there is increasing interest in using computational modeling to understand T cell behavior in vivo, these quantitative measures not only give us a better determination of actual T cell movement, they may prove crucial for models to generate accurate predictions about T cell behavior. PMID:25973755

  18. Bringing statistics up to speed with data in analysis of lymphocyte motility.

    PubMed

    Letendre, Kenneth; Donnadieu, Emmanuel; Moses, Melanie E; Cannon, Judy L

    2015-01-01

    Two-photon (2P) microscopy provides immunologists with 3D video of the movement of lymphocytes in vivo. Motility parameters extracted from these videos allow detailed analysis of lymphocyte motility in lymph nodes and peripheral tissues. However, standard parametric statistical analyses such as the Student's t-test are often used incorrectly, and fail to take into account confounds introduced by the experimental methods, potentially leading to erroneous conclusions about T cell motility. Here, we compare the motility of WT T cells versus PKCθ-/-, CARMA1-/-, CCR7-/-, and PTX-treated T cells. We show that the fluorescent dyes used to label T cells have significant effects on T cell motility, and we demonstrate the use of factorial ANOVA as a statistical tool that can control for these effects. In addition, researchers often choose between the use of "cell-based" parameters by averaging multiple steps of a single cell over time (e.g. cell mean speed), or "step-based" parameters, in which all steps of a cell population (e.g. instantaneous speed) are grouped without regard for the cell track. Using mixed model ANOVA, we show that we can maintain cell-based analyses without losing the statistical power of step-based data. We find that as we use additional levels of statistical control, we can more accurately estimate the speed of T cells as they move in lymph nodes as well as measure the impact of individual signaling molecules on T cell motility. As there is increasing interest in using computational modeling to understand T cell behavior in vivo, these quantitative measures not only give us a better determination of actual T cell movement, they may prove crucial for models to generate accurate predictions about T cell behavior. PMID:25973755
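
    A rough sketch of the two analysis strategies discussed above, using invented speeds with genotype and labelling dye as crossed factors (the factor names and numbers are placeholders, not the authors' data), can be written with statsmodels: a factorial ANOVA on cell means and a mixed model on step data with cell identity as a random effect.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(6)

    # Invented data: per-step speeds of tracked cells, with genotype and labelling
    # dye as crossed factors and each cell's identity recorded.
    rows = []
    for genotype in ["WT", "KO"]:
        for dye in ["CFSE", "CMTMR"]:
            for cell in range(10):
                cell_mean = rng.normal(10.0 if genotype == "WT" else 8.0, 1.5)
                for _ in range(20):
                    rows.append({"speed": cell_mean + rng.normal(0, 2),
                                 "genotype": genotype, "dye": dye,
                                 "cell_id": f"{genotype}-{dye}-{cell}"})
    df = pd.DataFrame(rows)

    # Factorial ANOVA on cell means: the dye factor is modelled explicitly so its
    # effect cannot masquerade as a genotype effect.
    cell_means = df.groupby(["cell_id", "genotype", "dye"], as_index=False)["speed"].mean()
    factorial = smf.ols("speed ~ genotype * dye", data=cell_means).fit()
    print(anova_lm(factorial, typ=2))

    # Mixed-model alternative on step data: cell identity enters as a random
    # effect, keeping cell-based structure while using every step.
    mixed = smf.mixedlm("speed ~ genotype * dye", data=df, groups=df["cell_id"]).fit()
    print(mixed.summary())
    ```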

  19. HPV-Associated Cancers Statistics

    MedlinePlus

  20. Key Statistics for Thyroid Cancer

    MedlinePlus

    How common is thyroid cancer? Statistics on survival rates for thyroid cancer are discussed.

  1. Muscular Dystrophy: Data and Statistics

    MedlinePlus

  2. Heart Disease and Stroke Statistics

    MedlinePlus

    For questions about these statistics, please contact the American Heart Association National Center, Office of Science & Medicine at statistics@heart.org.

  3. Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis

    PubMed Central

    Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.

    2006-01-01

    In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
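
    The core idea, using one modality's voxel values as regressors for another's in a voxel-wise analysis, can be sketched in a few lines of Python; the arrays below are random stand-ins for coregistered volumes, and this is not the BPM toolbox itself, which is MATLAB/SPM based.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Random stand-ins for coregistered volumes from n subjects: a primary modality
    # and a second modality used as a voxel-wise regressor.
    n_subjects, shape = 20, (8, 8, 8)
    primary = rng.standard_normal((n_subjects, *shape))
    covariate = rng.standard_normal((n_subjects, *shape))

    y = primary.reshape(n_subjects, -1)
    x = covariate.reshape(n_subjects, -1)

    # Voxel-wise correlation of the primary modality with the covariate modality.
    xc = x - x.mean(axis=0)
    yc = y - y.mean(axis=0)
    corr = (xc * yc).sum(axis=0) / np.sqrt((xc ** 2).sum(axis=0) * (yc ** 2).sum(axis=0))

    corr_map = corr.reshape(shape)  # a statistical map, one value per voxel
    print(corr_map[4, 4, 4])
    ```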

  4. NOx analyser interference from alkenes

    NASA Astrophysics Data System (ADS)

    Bloss, W. J.; Alam, M. S.; Lee, J. D.; Vazquez, M.; Munoz, A.; Rodenas, M.

    2012-04-01

    Nitrogen oxides (NO and NO2, collectively NOx) are critical intermediates in atmospheric chemistry. NOx abundance controls the levels of the primary atmospheric oxidants OH, NO3 and O3, and regulates the ozone production which results from the degradation of volatile organic compounds. NOx are also atmospheric pollutants in their own right, and NO2 is commonly included in air quality objectives and regulations. In addition to their role in controlling ozone formation, NOx levels affect the production of other pollutants such as the lachrymator PAN, and the nitrate component of secondary aerosol particles. Consequently, accurate measurement of nitrogen oxides in the atmosphere is of major importance for understanding our atmosphere. The most widely employed approach for the measurement of NOx is chemiluminescent detection of NO2* from the NO + O3 reaction, combined with NO2 reduction by either a heated catalyst or photoconvertor. The reaction between alkenes and ozone is also chemiluminescent; therefore alkenes may contribute to the measured NOx signal, depending upon the instrumental background subtraction cycle employed. This interference has been noted previously, and indeed the effect has been used to measure both alkenes and ozone in the atmosphere. Here we report the results of a systematic investigation of the response of a selection of NOx analysers, ranging from systems used for routine air quality monitoring to atmospheric research instrumentation, to a series of alkenes ranging from ethene to the biogenic monoterpenes, as a function of conditions (co-reactants, humidity). Experiments were performed in the European Photoreactor (EUPHORE) to ensure common calibration, a common sample for the monitors, and to unequivocally confirm the alkene (via FTIR) and NO2 (via DOAS) levels present. The instrument responses ranged from negligible levels up to 10 % depending upon the alkene present and conditions used. Such interferences may be of substantial importance

  5. Statistical methods in physical mapping

    SciTech Connect

    Nelson, D.O.

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.
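
    As a flavour of the progress predictions discussed in Chapters 2 and 3, the classical Lander-Waterman coverage formulas (a standard planning tool, not necessarily the dissertation's model) give the expected coverage and number of contig islands for an idealized clone-mapping project; all numbers below are illustrative and the minimum detectable overlap is ignored.

    ```python
    import math

    # Illustrative numbers for an idealized clone-mapping project.
    region_length = 60_000_000  # size of the target region, in bases
    clone_length = 40_000       # average clone insert size, in bases
    n_clones = 3000             # number of clones characterized

    coverage = n_clones * clone_length / region_length  # redundancy c
    frac_covered = 1 - math.exp(-coverage)              # expected fraction of region covered
    n_islands = n_clones * math.exp(-coverage)          # expected number of islands (contigs)

    print(f"coverage c = {coverage:.2f}, fraction covered = {frac_covered:.3f}, "
          f"expected islands = {n_islands:.0f}")
    ```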

  6. Global atmospheric circulation statistics, 1000-1 mb

    NASA Technical Reports Server (NTRS)

    Randel, William J.

    1992-01-01

    The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.

  7. Shielding design and analyses of KIPT neutron source facility.

    SciTech Connect

    Zhong, Z.; Gohar, Y.

    2011-01-01

    Argonne National Laboratory (ANL) of the USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of a neutron source facility. An accelerator-driven subcritical (ADS) facility, driven by an electron accelerator, is used for generating the neutron source. The facility will be utilized for performing basic and applied nuclear research, producing medical isotopes, and training young nuclear specialists. The Monte Carlo code MCNPX has been utilized as the major design tool because of its capability to transport electrons, photons, and neutrons at high energies. However, the ADS shielding calculations with MCNPX need enormous computational resources, and the small neutron yield per electron makes sampling difficult for the Monte Carlo calculations. The high energy electrons (E > 100 MeV) generate very high energy neutrons, and these neutrons dominate the total radiation dose outside the shield. The radiation dose caused by high energy neutrons is approximately 3-4 orders of magnitude higher than that of the photons. However, the high energy neutron fraction within the total generated neutrons is very small, which increases the sampling difficulty and the required computational time. To solve these difficulties, the user subroutines of MCNPX are utilized to generate a neutron source file, which records the neutrons generated from the photonuclear reactions caused by electrons. This neutron source file is utilized many times in the following MCNPX calculations for weight-window (importance function) generation and radiation dose calculations. In addition, the neutron source file can be sampled multiple times to improve the statistics of the calculated results. In this way the expensive electron transport calculations can be performed once, with good statistics, for the different ADS shielding problems. This paper presents the method of generating and utilizing the neutron source file by MCNPX for the ADS shielding calculation.

  8. Statistical Seismology and Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Tiampo, K. F.; González, P. J.; Kazemian, J.

    2014-12-01

    While seismicity triggered or induced by natural resource production such as mining or water impoundment in large dams has long been recognized, the recent increase in the unconventional production of oil and gas has been linked to a rapid rise in seismicity in many places, including central North America (Ellsworth et al., 2012; Ellsworth, 2013). Worldwide, induced events of M~5 have occurred and, although rare, have resulted in both damage and public concern (Horton, 2012; Keranen et al., 2013). In addition, over the past twenty years, the increase in both number and coverage of seismic stations has resulted in an unprecedented ability to precisely record the magnitude and location of large numbers of small magnitude events. The increase in the number and type of seismic sequences available for detailed study has revealed differences in their statistics that were previously difficult to quantify. For example, seismic swarms that produce significant numbers of foreshocks as well as aftershocks have been observed in different tectonic settings, including California, Iceland, and the East Pacific Rise (McGuire et al., 2005; Shearer, 2012; Kazemian et al., 2014). Similarly, smaller events have been observed prior to larger induced events in several instances of energy production. The field of statistical seismology has long focused on the question of triggering and the mechanisms responsible (Stein et al., 1992; Hill et al., 1993; Steacy et al., 2005; Parsons, 2005; Main et al., 2006). For example, in most cases the associated stress perturbations are much smaller than the earthquake stress drop, suggesting an inherent sensitivity to relatively small stress changes (Nalbant et al., 2005). Induced seismicity provides the opportunity to investigate triggering and, in particular, the differences between long- and short-range triggering. Here we investigate the statistics of induced seismicity sequences from around the world, including central North America and Spain, and

  9. Gait patterns for crime fighting: statistical evaluation

    NASA Astrophysics Data System (ADS)

    Sulovská, Kateřina; Bělašková, Silvie; Adámek, Milan

    2013-10-01

    Criminality has been omnipresent throughout human history. Modern technology brings novel opportunities for identifying a perpetrator. One of these opportunities is the analysis of video recordings, which may be taken during the crime itself or before/after the crime. Video analysis can be classed as an identification analysis, that is, identification of a person via external characteristics. Analysis of bipedal locomotion focuses on human movement on the basis of anatomical and physiological features. Nowadays, human gait is tested by many laboratories to learn whether identification via bipedal locomotion is possible. The aim of our study is to use 2D components extracted from 3D data from the VICON Mocap system for in-depth statistical analyses. This paper introduces recent results of a fundamental study focused on various gait patterns under different conditions. The study contains data from 12 participants. Curves obtained from these measurements were sorted, averaged, and statistically tested to estimate the stability and distinctiveness of this biometric. Results show satisfactory distinctness for some of the chosen points, while others do not show significant differences. However, the results presented in this paper represent the initial phase of deeper and more exacting analyses of gait patterns under different conditions.
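
    To make the averaging and testing step concrete, the hedged sketch below averages synthetic gait curves sample-by-sample and runs a pointwise two-sample t-test between two conditions; it uses invented data, not the authors' VICON recordings or their exact statistical procedure.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      # Synthetic stand-in for 2D joint-angle curves: 12 trials x 101 normalized
      # time points per condition (hypothetical; the study used VICON data).
      t = np.linspace(0, 1, 101)
      condition_a = 30 * np.sin(2 * np.pi * t) + rng.normal(0, 2, (12, 101))
      condition_b = 32 * np.sin(2 * np.pi * t + 0.1) + rng.normal(0, 2, (12, 101))

      # Average curve per condition, then a pointwise two-sample t-test
      mean_a, mean_b = condition_a.mean(axis=0), condition_b.mean(axis=0)
      t_stat, p_val = stats.ttest_ind(condition_a, condition_b, axis=0)

      distinct = p_val < 0.05
      print(f"{distinct.sum()} of {t.size} time points differ at p < 0.05")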

  10. Carbon additions and grain defect formation in directionally solidified nickel-base superalloys

    NASA Astrophysics Data System (ADS)

    Tin, Sammy

    Over the past fifty years, technological advances leading up to the development of modern high-performance turbine engines for aircraft and power generation applications have coincided with significant engineering accomplishments in the area of Ni-base superalloy metallurgy. As the levels of refractory alloying additions to these Ni-base superalloys increase to enhance high-temperature mechanical properties, grain defect formation, particularly the development of freckle chains, during directional solidification has become an increasingly important problem. In this dissertation, the effect of carbon additions on the solidification characteristics of single crystal Ni-base superalloys has been investigated over a wide range of composition. Using statistically designed experiments, carbon additions of 0.1 to 0.125 wt. % were shown to be beneficial in stabilizing against the formation of grain defects due to thermosolutal convective instabilities. Detailed analyses were performed on the single crystal castings to identify the underlying mechanisms by which the carbon additions improve the solidification characteristics. In addition to forming Ta-rich MC carbides during solidification, the carbon additions were also revealed to influence the segregation behavior of the constituent elements in a manner that was beneficial in suppressing the formation of freckle defects during solidification. Using a segregation mapping technique, less segregation of rhenium, tungsten and tantalum was measured in the carbon containing alloys. Carbide formation during solidification was studied using differential thermal analysis. The influence of carbon additions on the solidification characteristics of the experimental single crystal alloys was assessed using a dimensionless Rayleigh analysis. Based on these analyses, the physical presence of carbides during the initial stages of solidification was also shown to inhibit the formation of freckle defects. In this investigation, carbon

  11. Statistics Anxiety and Instructor Immediacy

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2010-01-01

    The purpose of this study was to investigate the relationship between instructor immediacy and statistics anxiety. It was predicted that students receiving immediacy would report lower levels of statistics anxiety. Using a pretest-posttest-control group design, immediacy was measured using the Instructor Immediacy scale. Statistics anxiety was…

  12. Statistics: It's in the Numbers!

    ERIC Educational Resources Information Center

    Deal, Mary M.; Deal, Walter F., III

    2007-01-01

    Mathematics and statistics play important roles in peoples' lives today. A day hardly passes that they are not bombarded with many different kinds of statistics. As consumers they see statistical information as they surf the web, watch television, listen to their satellite radios, or even read the nutrition facts panel on a cereal box in the…

  13. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J.

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
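
    In the inverse-method spirit described above, formation parameters are obtained by minimizing the misfit between measured log responses and a forward tool-response model. The sketch below uses a deliberately simplified, hypothetical two-parameter model (porosity and water saturation) with scipy's least-squares solver; it is not the authors' formulation.

      import numpy as np
      from scipy.optimize import least_squares

      def forward(params, rho_matrix=2.65, rho_fluid=1.0):
          """Hypothetical forward model: two log responses as simple mixing
          laws in porosity (phi) and water saturation (sw)."""
          phi, sw = params
          density = phi * rho_fluid + (1.0 - phi) * rho_matrix   # bulk density (g/cc)
          neutron = phi * (0.6 + 0.3 * sw)                       # toy neutron response
          return np.array([density, neutron])

      measured = np.array([2.31, 0.17])        # hypothetical log readings

      fit = least_squares(lambda p: forward(p) - measured,
                          x0=[0.2, 0.5], bounds=([0.0, 0.0], [0.4, 1.0]))
      phi_hat, sw_hat = fit.x
      print(f"estimated porosity = {phi_hat:.3f}, water saturation = {sw_hat:.2f}")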

  14. Invention Activities Support Statistical Reasoning

    ERIC Educational Resources Information Center

    Smith, Carmen Petrick; Kenlan, Kris

    2016-01-01

    Students' experiences with statistics and data analysis in middle school are often limited to little more than making and interpreting graphs. Although students may develop fluency in statistical procedures and vocabulary, they frequently lack the skills necessary to apply statistical reasoning in situations other than clear-cut textbook examples.…

  15. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  16. Explorations in Statistics: the Bootstrap

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…

  17. Representative Ensembles in Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Yukalov, V. I.

    The notion of representative statistical ensembles, correctly representing statistical systems, is strictly formulated. This notion allows for a proper description of statistical systems, avoiding inconsistencies in theory. As an illustration, a Bose-condensed system is considered. It is shown that a self-consistent treatment of the latter, using a representative ensemble, always yields a conserving and gapless theory.

  18. Use of Statistics by Librarians.

    ERIC Educational Resources Information Center

    Christensen, John O.

    1988-01-01

    Description of common errors found in the statistical methodologies of research carried out by librarians, focuses on sampling and generalizability. The discussion covers the need to either adapt library research to the statistical abilities of librarians or to educate librarians in the proper use of statistics. (15 references) (CLB)

  19. Using R-Project for Free Statistical Analysis in Extension Research

    ERIC Educational Resources Information Center

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  20. Early aftershock statistics

    NASA Astrophysics Data System (ADS)

    Narteau, C.; Shebalin, P.; Holschneider, M.; Schorlemmer, D.

    2009-04-01

    In the Limited Power Law model (LPL) we consider that after a triggering event - the so-called mainshock - rocks subject to sufficiently large differential stress can fail spontaneously by static fatigue. Then, earlier aftershocks occur in zones of highest stress and the c-value, i.e. the delay before the onset of the power-law aftershock decay rate, depends on the amplitude of the stress perturbation in the aftershock zone. If we assume that this stress perturbation is proportional to the absolute level of stress in the area, the model also predicts that shorter delays occur in zones of higher stress. Here, we present two analyses that support such a prediction. In these analyses, we use only aftershocks of 2.5 < M < 4.5 earthquakes to avoid well-known artifacts resulting from overlapping records. First, we analyze the c-value across different types of faulting in southern California to compare with the differential shear stress predicted by a Mohr-Coulomb failure criterion. As expected, we find that the c-value is on average shorter for thrust earthquakes (high stress) than for normal ones (low stress), taking intermediate values for strike-slip earthquakes (intermediate stress). Second, we test the hypothesis that large earthquakes occur in zones where the level of stress is abnormally high. Instead of the c-value we use the < t >-value, the geometric average of early aftershock times. Once again, we observe that M > 5 earthquakes occur where and when the < t >-value is small. This effect is even stronger for M > 6 earthquakes.
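
    The < t >-value referred to above is simply the geometric average of early aftershock times; the sketch below computes it for two synthetic Omori-type catalogs with different c-values (invented numbers, not the southern California data).

      import numpy as np

      rng = np.random.default_rng(1)

      def t_value(aftershock_times):
          """Geometric average of early aftershock times (the <t>-value):
          the exponential of the mean log time since the mainshock."""
          times = np.asarray(aftershock_times, dtype=float)
          return np.exp(np.mean(np.log(times)))

      def omori_sample(c, n, t_max=30.0):
          """Draw aftershock times (days) from an Omori-type rate 1/(t + c)
          on [0, t_max] by inverting the cumulative distribution (p = 1)."""
          u = rng.uniform(0.0, 1.0, n)
          return (c + t_max) ** u * c ** (1.0 - u) - c

      high_stress = omori_sample(c=0.01, n=500)   # short delay before power-law decay
      low_stress = omori_sample(c=0.30, n=500)    # longer delay

      print(f"<t> (high stress) = {t_value(high_stress):.3f} days")
      print(f"<t> (low stress)  = {t_value(low_stress):.3f} days")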

  1. Statistical modelling of ultrasonic sensors in process industries—new prospects for conventional devices

    NASA Astrophysics Data System (ADS)

    Schäfer, Robert; Hauptmann, Peter

    2007-05-01

    The acoustic impedance of different fluids was measured indirectly by analysing the reverberation of a pulsed ultrasonic transducer with a design as applied in contemporary process instrumentation. The set of observed fluids comprises air, methanol, ethanol, toluene, water and glycerine, substances with extremely different acoustic properties. The fluid impedance was precisely gauged using statistical linear models. The model complexity as well as the required calibration effort was minimized by including a priori knowledge about the electrical characteristic of the transducer near resonance. The predictive power of different linear model estimates, namely ordinary least squares, principal component regression and partial least squares, was compared. A compensation method was proposed to correct the impact of temperature variation on the analysed data. The additionally measured transit time permits the precise estimation of the density of arbitrary homogeneous fluids of varying temperature using traditional transducer designs. Thereby the functional range of conventional ultrasonic devices in the process industries can be extended without the need for design modifications.
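
    The comparison of ordinary least squares, principal component regression and partial least squares can be sketched with scikit-learn; the synthetic "reverberation features" and impedance values below are placeholders for the real transducer data.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)

      # Synthetic stand-in: 60 samples of 50 correlated reverberation features
      # and a target acoustic impedance (MRayl); values are hypothetical.
      latent = rng.normal(size=(60, 3))
      X = latent @ rng.normal(size=(3, 50)) + 0.05 * rng.normal(size=(60, 50))
      impedance = 1.5 + latent @ np.array([0.8, -0.3, 0.1]) + 0.02 * rng.normal(size=60)

      models = {
          "OLS": LinearRegression(),
          "PCR": make_pipeline(PCA(n_components=3), LinearRegression()),
          "PLS": PLSRegression(n_components=3),
      }

      for name, model in models.items():
          # the default regressor score in scikit-learn is R^2
          r2 = cross_val_score(model, X, impedance, cv=5)
          print(f"{name}: mean cross-validated R^2 = {r2.mean():.3f}")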

  2. Regression Analyses for ABAB Designs in Educational Research.

    ERIC Educational Resources Information Center

    Beasley, T. Mark

    1996-01-01

    Too many practitioners interpret ABAB research based on visual inspection rather than statistical analysis. Based on an experiment using cooperative learning to mainstream autistic students, hypothetical data for one student from an ABAB reversal design are used to illustrate the techniques and importance of regression analyses. Discussion focuses…
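
    One simple way to move from visual inspection to regression, sketched below with hypothetical session scores, is to code intervention-phase membership (and a time trend) as predictors in an ordinary least-squares model; this is an illustrative dummy-variable approach, not necessarily the specific technique used in the article.

      import numpy as np
      import statsmodels.api as sm

      # Hypothetical session scores for one student in an ABAB design:
      # A1 (baseline), B1 (intervention), A2 (withdrawal), B2 (reintervention).
      scores = np.array([3, 4, 3, 2, 7, 8, 9, 8, 4, 3, 4, 3, 8, 9, 9, 10], dtype=float)
      phase = np.repeat(["A1", "B1", "A2", "B2"], 4)

      intervention = np.char.startswith(phase, "B").astype(float)   # 1 in B phases
      time = np.arange(scores.size, dtype=float)                    # session number

      X = sm.add_constant(np.column_stack([intervention, time]))
      model = sm.OLS(scores, X).fit()
      print(model.summary(xname=["const", "intervention", "time"]))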

  3. Statistical mechanics and the ontological interpretation

    NASA Astrophysics Data System (ADS)

    Bohm, D.; Hiley, B. J.

    1996-06-01

    To complete our ontological interpretation of quantum theory we have to include a treatment of quantum statistical mechanics. The basic concepts in the ontological approach are the particle and the wave function. The density matrix cannot play a fundamental role here. Therefore quantum statistical mechanics will require a further statistical distribution over wave functions in addition to the distribution of particles that have a specified wave function. Ultimately the wave function of the universe will be required, but we show that if the universe is not in thermodynamic equilibrium then it can be treated in terms of weakly interacting large scale constituents that are very nearly independent of each other. In this way we obtain the same results as those of the usual approach within the framework of the ontological interpretation.

  4. Proof of the Spin-Statistics Theorem

    NASA Astrophysics Data System (ADS)

    Santamato, Enrico; De Martini, Francesco

    2015-07-01

    The traditional standard quantum mechanics theory is unable to solve the spin-statistics problem, i.e. to justify the utterly important "Pauli Exclusion Principle". A complete and straightforward solution of the spin-statistics problem is presented on the basis of the "conformal quantum geometrodynamics" theory. This theory provides a Weyl-gauge invariant formulation of the standard quantum mechanics and reproduces successfully all relevant quantum processes including the formulation of Dirac's or Schrödinger's equation, of Heisenberg's uncertainty relations and of the nonlocal EPR correlations. When the conformal quantum geometrodynamics is applied to a system made of many identical particles with spin, an additional constant property of all elementary particles enters naturally into play: the "intrinsic helicity". This property, not considered in the Standard Quantum Mechanics, determines the correct spin-statistics connection observed in Nature.

  5. Paranoid personality has a dimensional latent structure: taxometric analyses of community and clinical samples.

    PubMed

    Edens, John F; Marcus, David K; Morey, Leslie C

    2009-08-01

    Although paranoid personality is one of the most commonly diagnosed personality disorders and is associated with numerous negative life consequences, relatively little is known about the structural properties of this condition. This study examines whether paranoid personality traits represent a latent dimension or a discrete class (i.e., taxon). In Study 1, the authors conducted taxometric analyses of paranoid personality disorder criteria in a sample of 731 patients participating in the Collaborative Longitudinal Study of Personality Disorders project (Gunderson et al., 2000) who had been administered a semistructured diagnostic interview for personality disorders according to criteria of the Diagnostic and Statistical Manual of Mental Disorders (4th ed.; American Psychiatric Association, 1994). In Study 2, the authors conducted parallel analyses of the Paranoia scale of the Personality Assessment Inventory (PAI; L. C. Morey, 2007), using data from the PAI community and clinical normative databases. Analyses across both self-report and interview-based indicators offered compelling support for a dimensional structure. Additionally, analyses of external correlates in these data sets suggested that dimensional models demonstrated stronger validity coefficients with criterion measures than did dichotomous models. PMID:19685951

  6. Topics in statistical mechanics

    SciTech Connect

    Elser, V.

    1984-05-01

    This thesis deals with four independent topics in statistical mechanics: (1) the dimer problem is solved exactly for a hexagonal lattice with general boundary using a known generating function from the theory of partitions. It is shown that the leading term in the entropy depends on the shape of the boundary; (2) continuum models of percolation and self-avoiding walks are introduced with the property that their series expansions are sums over linear graphs with intrinsic combinatorial weights and explicit dimension dependence; (3) a constrained SOS model is used to describe the edge of a simple cubic crystal. Low and high temperature results are derived as well as the detailed behavior near the crystal facet; (4) the microscopic model of the lambda-transition involving atomic permutation cycles is reexamined. In particular, a new derivation of the two-component field theory model of the critical behavior is presented. Results for a lattice model originally proposed by Kikuchi are extended with a high temperature series expansion and Monte Carlo simulation. 30 references.

  7. International petroleum statistics report

    SciTech Connect

    1996-05-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.

  8. Statistical properties of exoplanets

    NASA Astrophysics Data System (ADS)

    Udry, Stéphane

    Since the detection a decade ago of the planetary companion of 51 Peg, more than 165 extrasolar planets have been unveiled by radial-velocity measurements. They present a wide variety of characteristics such as large masses with small orbital separations, high eccentricities, period resonances in multi-planet systems, etc. Meaningful features of the statistical distributions of the orbital parameters or parent stellar properties have emerged. We discuss them in the context of the constraints they provide for planet-formation models and in comparison to Neptune-mass planets in short-period orbits recently detected by radial-velocity surveys, thanks to new instrumental developments and adequate observing strategy. We expect continued improvement in velocity precision and anticipate the detection of Neptune-mass planets in longer-period orbits and even lower-mass planets in short-period orbits, giving us new information on the mass distribution function of exoplanets. Finally, the role of radial-velocity follow-up measurements of transit candidates is emphasized.

  9. International petroleum statistics report

    SciTech Connect

    1995-07-27

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, and exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  10. International petroleum statistics report

    SciTech Connect

    1997-07-01

    The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.

  11. International petroleum statistics report

    SciTech Connect

    1995-11-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  12. International petroleum statistics report

    SciTech Connect

    1996-10-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  13. Statistical Mechanics of Zooplankton

    PubMed Central

    Hinow, Peter; Nihongi, Ai; Strickler, J. Rudi

    2015-01-01

    Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar “microscopic” quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the “ecological temperature” of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean’s swimming behavior. PMID:26270537
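
    Since the "ecological temperature" above is defined through average squared velocities, it can be computed directly from tracked positions; the sketch below uses synthetic random-walk trajectories rather than the authors' Daphnia pulicaria recordings.

      import numpy as np

      rng = np.random.default_rng(42)

      def ecological_temperature(positions, dt):
          """Average squared speed over all individuals and frames, following
          the definition of 'ecological temperature' given above.

          positions -- array of shape (n_individuals, n_frames, n_dims), in mm
          dt        -- time between frames, in seconds
          """
          velocities = np.diff(positions, axis=1) / dt        # mm/s
          return np.mean(np.sum(velocities ** 2, axis=-1))    # mean squared speed

      # Two synthetic populations: the 'infested under light' group moves faster.
      frames, dt = 200, 0.1
      control = np.cumsum(rng.normal(0, 0.5, (20, frames, 3)), axis=1)
      infested = np.cumsum(rng.normal(0, 0.9, (20, frames, 3)), axis=1)

      print(f"T_eco control  = {ecological_temperature(control, dt):.1f} mm^2/s^2")
      print(f"T_eco infested = {ecological_temperature(infested, dt):.1f} mm^2/s^2")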

  14. Statistical mechanics of nucleosomes

    NASA Astrophysics Data System (ADS)

    Chereji, Razvan V.

    Eukaryotic cells contain long DNA molecules (about two meters for a human cell) which are tightly packed inside the micrometric nuclei. Nucleosomes are the basic packaging unit of the DNA which allows this millionfold compactification. A longstanding puzzle is to understand the principles which allow cells to both organize their genomes into chromatin fibers in the crowded space of their nuclei, and also to keep the DNA accessible to many factors and enzymes. With the nucleosomes covering about three quarters of the DNA, their positions are essential because these influence which genes can be regulated by the transcription factors and which cannot. We study physical models which predict the genome-wide organization of the nucleosomes and also the relevant energies which dictate this organization. In the last five years, the study of chromatin has seen many important advances. In particular, in the field of nucleosome positioning, new techniques for identifying nucleosomes and the competing DNA-binding factors appeared, such as chemical mapping with hydroxyl radicals and ChIP-exo, among others; the resolution of the nucleosome maps increased through paired-end sequencing; and the price of sequencing an entire genome decreased. We present a rigorous statistical mechanics model which is able to explain the recent experimental results by taking into account nucleosome unwrapping, competition between different DNA-binding proteins, and both the interaction between histones and DNA, and between neighboring histones. We show a series of predictions of our new model, all in agreement with the experimental observations.

  15. Statistical Mechanics of Zooplankton.

    PubMed

    Hinow, Peter; Nihongi, Ai; Strickler, J Rudi

    2015-01-01

    Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar "microscopic" quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the "ecological temperature" of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean's swimming behavior. PMID:26270537

  16. Statistical genetics in traditionally cultivated crops.

    PubMed

    Artoisenet, Pierre; Minsart, Laure-Anne

    2014-11-01

    Traditional farming systems have attracted a lot of attention over the past decades as they have been recognized to supply an important component in the maintenance of genetic diversity worldwide. A broad spectrum of traditionally managed crops has been studied to investigate how reproductive properties in combination with husbandry characteristics shape the genetic structure of the crops over time. However, traditional farms typically involve populations of small size whose genetic evolution is overwhelmed with statistical fluctuations inherent to the stochastic nature of the crossings. Hence there is generally no one-to-one mapping between crop properties and measured genotype data, and claims regarding crop properties on the basis of the observed genetic structure must be stated within a confidence level to be estimated by means of a dedicated statistical analysis. In this paper, we propose a comprehensive framework to carry out such statistical analyses. We illustrate the capabilities of our approach by applying it to crops of C. lanatus var. lanatus oleaginous type cultivated in Côte d'Ivoire. While some properties such as the effective field size largely evade the constraints from experimental data, others such as the mating system turn out to be characterized with a higher statistical significance. We discuss the importance of our approach for studies on traditionally cultivated crops in general. PMID:24992232

  17. Statistics of Statisticians: Critical Mass of Statistics and Operational Research Groups

    NASA Astrophysics Data System (ADS)

    Kenna, Ralph; Berche, Bertrand

    Using a recently developed model, inspired by mean field theory in statistical physics, and data from the UK's Research Assessment Exercise, we analyse the relationship between the qualities of statistics and operational research groups and the quantities of researchers in them. Similar to other academic disciplines, we provide evidence for a linear dependency of quality on quantity up to an upper critical mass, which is interpreted as the average maximum number of colleagues with whom a researcher can communicate meaningfully within a research group. The model also predicts a lower critical mass, which research groups should strive to achieve to avoid extinction. For statistics and operational research, the lower critical mass is estimated to be 9 ± 3. The upper critical mass, beyond which research quality does not significantly depend on group size, is 17 ± 6.
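
    The quality-quantity relationship described, linear up to an upper critical mass and flat beyond it, can be caricatured as a two-segment "broken-stick" fit; the sketch below uses synthetic group data and a simple grid search for the breakpoint, not the authors' mean-field model or the RAE data.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic research groups: quality rises linearly with size up to a
      # breakpoint (the upper critical mass) and is flat beyond it.
      size = rng.integers(2, 40, 80)
      quality = np.where(size < 17, 1.0 + 0.15 * size, 1.0 + 0.15 * 17) + rng.normal(0, 0.3, 80)

      def broken_stick_sse(breakpoint):
          """Sum of squared errors of a continuous two-segment fit whose
          second segment is constrained to be flat."""
          x = np.minimum(size, breakpoint)             # slope only below the breakpoint
          A = np.column_stack([np.ones_like(x, dtype=float), x])
          coef, *_ = np.linalg.lstsq(A, quality, rcond=None)
          residuals = quality - A @ coef
          return np.sum(residuals ** 2), coef

      candidates = range(5, 35)
      sse, fits = zip(*(broken_stick_sse(b) for b in candidates))
      best = int(np.argmin(sse))
      print(f"estimated upper critical mass ~ {list(candidates)[best]}, "
            f"slope below it = {fits[best][1]:.3f}")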

  18. Automated variance reduction for Monte Carlo shielding analyses with MCNP

    NASA Astrophysics Data System (ADS)

    Radulescu, Georgeta

    Variance reduction techniques are employed in Monte Carlo analyses to increase the number of particles in the phase space of interest and thereby lower the variance of the statistical estimation. Variance reduction parameters are required to perform Monte Carlo calculations. It is well known that adjoint solutions, even approximate ones, are excellent biasing functions that can significantly increase the efficiency of a Monte Carlo calculation. In this study, an automated method of generating Monte Carlo variance reduction parameters, and of implementing the source energy biasing and the weight window technique in MCNP shielding calculations, has been developed. The method is based on the approach used in the SAS4 module of the SCALE code system, which derives the biasing parameters from an adjoint one-dimensional Discrete Ordinates calculation. Unlike SAS4, which determines the radial and axial dose rates of a spent fuel cask in separate calculations, the present method provides energy and spatial biasing parameters for the entire system that optimize the simulation of particle transport towards all external surfaces of a spent fuel cask. The energy and spatial biasing parameters are synthesized from the adjoint fluxes of three one-dimensional Discrete Ordinates adjoint calculations. Additionally, the present method accommodates multiple source regions, such as the photon sources in light-water reactor spent nuclear fuel assemblies, in one calculation. With this automated method, detailed and accurate dose rate maps for photons, neutrons, and secondary photons outside spent fuel casks or other containers can be efficiently determined with minimal effort.
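
    The synthesis step described above rests on the standard CADIS-type relation that weight-window bounds are set inversely proportional to the adjoint (importance) flux; the sketch below applies that relation to a handful of made-up adjoint-flux values, not to actual Discrete Ordinates output.

      import numpy as np

      def weight_window_bounds(adjoint_flux, source_adjoint_flux, ratio=5.0):
          """Weight-window bounds set inversely proportional to the adjoint flux
          (CADIS-style). The normalization uses the adjoint flux in the source
          region so that a unit-weight source particle starts inside its window.
          All input values here are hypothetical."""
          adjoint_flux = np.asarray(adjoint_flux, dtype=float)
          target_weight = source_adjoint_flux / adjoint_flux        # weight ~ 1 / importance
          lower = target_weight / ((ratio + 1.0) / 2.0)             # window lower bound
          return lower, ratio * lower

      # Hypothetical adjoint fluxes for cells between the source and a dose point:
      # importance grows toward the detector, so the windows shrink.
      phi_adjoint = np.array([1e-2, 1e-1, 1.0, 10.0, 100.0])
      lower, upper = weight_window_bounds(phi_adjoint, source_adjoint_flux=1e-2)
      for cell, (lo, hi) in enumerate(zip(lower, upper)):
          print(f"cell {cell}: weight window [{lo:.2e}, {hi:.2e}]")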

  19. Sensitivity of Assimilated Tropical Tropospheric Ozone to the Meteorological Analyses

    NASA Technical Reports Server (NTRS)

    Hayashi, Hiroo; Stajner, Ivanka; Pawson, Steven; Thompson, Anne M.

    2002-01-01

    Tropical tropospheric ozone fields from two different experiments performed with an off-line ozone assimilation system developed in NASA's Data Assimilation Office (DAO) are examined. Assimilated ozone fields from the two experiments are compared with the collocated ozone profiles from the Southern Hemispheric Additional Ozonesondes (SHADOZ) network. Results are presented for 1998. The ozone assimilation system includes a chemistry-transport model, which uses analyzed winds from the Goddard Earth Observing System (GEOS) Data Assimilation System (DAS). The two experiments use wind fields from different versions of GEOS DAS: an operational version of the GEOS-2 system and a prototype of the GEOS-4 system. While both versions of the DAS utilize the Physical-space Statistical Analysis System and use comparable observations, they use entirely different general circulation models and data insertion techniques. The shape of the annual-mean vertical profile of the assimilated ozone fields is sensitive to the meteorological analyses, with the GEOS-4-based ozone being closest to the observations. This indicates that the resolved transport in GEOS-4 is more realistic than in GEOS-2. Remaining uncertainties include quantification of the representation of sub-grid-scale processes in the transport calculations, which plays an important role in the locations and seasons where convection dominates the transport.

  20. Basic statistics in cell biology.

    PubMed

    Vaux, David L

    2014-01-01

    The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind. PMID:25000992