Science.gov

Sample records for age statistical analysis

  1. Statistical ecology comes of age

    PubMed Central

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  2. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  3. Mathematical and statistical analysis

    NASA Technical Reports Server (NTRS)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  4. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. (1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  5. Metabonomics evaluations of age-related changes in the urinary compositions of male Sprague Dawley rats and effects of data normalization methods on statistical and quantitative analysis

    PubMed Central

    Schnackenberg, Laura K; Sun, Jinchun; Espandiari, Parvaneh; Holland, Ricky D; Hanig, Joseph; Beger, Richard D

    2007-01-01

    Background: Urine from male Sprague-Dawley rats 25, 40, and 80 days old was analyzed by NMR and UPLC/MS. The effects of data normalization procedures on principal component analysis (PCA) and quantitative analysis of NMR-based metabonomics data were investigated. Additionally, the effects of age on the metabolic profiles were examined by both NMR and UPLC/MS analyses. Results: The data normalization factor was shown to have a great impact on the statistical and quantitative results, indicating the need to carefully consider how best to normalize the data within a particular study and when comparing different studies. PCA applied to the data obtained from both NMR and UPLC/MS platforms reveals similar age-related differences. NMR indicated that many metabolites associated with the Krebs cycle decrease, while citrate and 2-oxoglutarate, also associated with the Krebs cycle, increase in older rats. Conclusion: This study compared four different normalization methods for the NMR-based metabonomics spectra from an age-related study. It was shown that each method of normalization has a great effect on both the statistical and quantitative analyses. Each normalization method resulted in altered relative positions of significant PCA loadings for each sample spectrum but did not alter which chemical shifts had the highest loadings. The more strongly the normalization factor was related to age, the greater the separation between age groups observed in subsequent PCA analyses. The normalization factor that showed the least age dependence was total NMR intensity, which was consistent with the UPLC/MS data. Normalization by total intensity attempts to correct for the dietary and water intake of the individual animal, which is especially useful in metabonomics evaluations of urine. Additionally, metabonomics evaluations of age-related effects showed decreased concentrations of many Krebs cycle intermediates along with increased levels of oxidized antioxidants in the urine of older rats.
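
    As a minimal illustration of the normalization step discussed above, the sketch below normalizes synthetic spectra by total intensity and runs PCA; it assumes spectra stored as rows of a matrix and is not the authors' code.

```python
# Minimal sketch (not the study's code): total-intensity normalization of
# NMR-like spectra followed by PCA, on synthetic data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
spectra = rng.gamma(shape=2.0, scale=1.0, size=(30, 200))  # 30 samples x 200 bins

# Normalize each spectrum by its total intensity to correct for dilution
# differences (e.g., water intake), as discussed in the abstract.
normalized = spectra / spectra.sum(axis=1, keepdims=True)

scores = PCA(n_components=2).fit_transform(normalized)
print(scores[:3])  # PC1/PC2 scores used to look for age-group separation
```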

  6. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J.

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.

  7. Analysis of the Human Adult Urinary Metabolome Variations with Age, Body Mass Index, and Gender by Implementing a Comprehensive Workflow for Univariate and OPLS Statistical Analyses.

    PubMed

    Thévenot, Etienne A; Roux, Aurélie; Xu, Ying; Ezan, Eric; Junot, Christophe

    2015-08-01

    Urine metabolomics is widely used for biomarker research in the fields of medicine and toxicology. As a consequence, characterization of the variations of the urine metabolome under basal conditions becomes critical in order to avoid confounding effects in cohort studies. Such physiological information is however very scarce in the literature and in metabolomics databases so far. Here we studied the influence of age, body mass index (BMI), and gender on metabolite concentrations in a large cohort of 183 adults by using liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS). We implemented a comprehensive statistical workflow for univariate hypothesis testing and modeling by orthogonal partial least-squares (OPLS), which we made available to the metabolomics community within the online Workflow4Metabolomics.org resource. We found 108 urine metabolites displaying concentration variations with either age, BMI, or gender, by integrating the results from univariate p-values and multivariate variable importance in projection (VIP). Several metabolite clusters were further evidenced by correlation analysis, and they allowed stratification of the cohort. In conclusion, our study highlights the impact of gender and age on the urinary metabolome, and thus it indicates that these factors should be taken into account for the design of metabolomics studies.
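
    The sketch below illustrates the univariate half of such a workflow on synthetic data: test each metabolite for a gender effect and control the false discovery rate. Variable names and sizes are illustrative; this is not the Workflow4Metabolomics implementation.

```python
# Illustrative univariate screen: per-metabolite test plus FDR control.
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
n_metabolites = 100
intensities = rng.lognormal(size=(183, n_metabolites))  # 183 subjects
gender = rng.integers(0, 2, size=183)                   # 0 = female, 1 = male

pvals = np.array([
    mannwhitneyu(intensities[gender == 0, j], intensities[gender == 1, j],
                 alternative="two-sided").pvalue
    for j in range(n_metabolites)
])
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} metabolites significant after FDR correction")
```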

  8. Statistical data analysis

    SciTech Connect

    Hahn, A.A.

    1994-11-01

    The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques.

  9. Statistical Approaches for the Study of Cognitive and Brain Aging.

    PubMed

    Chen, Huaihou; Zhao, Bingxin; Cao, Guanqun; Proges, Eric C; O'Shea, Andrew; Woods, Adam J; Cohen, Ronald A

    2016-01-01

    Neuroimaging studies of cognitive and brain aging often yield massive datasets that create many analytic and statistical challenges. In this paper, we discuss and address several limitations in the existing work. (1) Linear models are often used to model the age effects on neuroimaging markers, which may be inadequate in capturing the potential nonlinear age effects. (2) Marginal correlations are often used in brain network analysis, which are not efficient in characterizing a complex brain network. (3) Due to the challenge of high-dimensionality, only a small subset of the regional neuroimaging markers is considered in a prediction model, which could miss important regional markers. To overcome those obstacles, we introduce several advanced statistical methods for analyzing data from cognitive and brain aging studies. Specifically, we introduce semiparametric models for modeling age effects, graphical models for brain network analysis, and penalized regression methods for selecting the most important markers in predicting cognitive outcomes. We illustrate these methods using the healthy aging data from the Active Brain Study. PMID:27486400
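
    A hedged sketch of the third ingredient, penalized regression: the lasso selects a sparse set of regional markers that predict a cognitive outcome. The data and names below are synthetic, not from the Active Brain Study.

```python
# Lasso-based marker selection on synthetic neuroimaging-like data.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)
n_subjects, n_regions = 150, 300
markers = rng.normal(size=(n_subjects, n_regions))
# Only a few regions truly drive the outcome in this toy example.
cognition = (markers[:, :5] @ np.array([1.0, -0.8, 0.6, 0.5, -0.4])
             + rng.normal(scale=0.5, size=n_subjects))

model = LassoCV(cv=5).fit(markers, cognition)
selected = np.flatnonzero(model.coef_)  # regions with nonzero coefficients
print("regions selected:", selected)
```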

  10. Statistical Approaches for the Study of Cognitive and Brain Aging

    PubMed Central

    Chen, Huaihou; Zhao, Bingxin; Cao, Guanqun; Proges, Eric C.; O'Shea, Andrew; Woods, Adam J.; Cohen, Ronald A.

    2016-01-01

    Neuroimaging studies of cognitive and brain aging often yield massive datasets that create many analytic and statistical challenges. In this paper, we discuss and address several limitations in the existing work. (1) Linear models are often used to model the age effects on neuroimaging markers, which may be inadequate in capturing the potential nonlinear age effects. (2) Marginal correlations are often used in brain network analysis, which are not efficient in characterizing a complex brain network. (3) Due to the challenge of high-dimensionality, only a small subset of the regional neuroimaging markers is considered in a prediction model, which could miss important regional markers. To overcome those obstacles, we introduce several advanced statistical methods for analyzing data from cognitive and brain aging studies. Specifically, we introduce semiparametric models for modeling age effects, graphical models for brain network analysis, and penalized regression methods for selecting the most important markers in predicting cognitive outcomes. We illustrate these methods using the healthy aging data from the Active Brain Study. PMID:27486400

  11. Statistical physics of age related macular degeneration

    NASA Astrophysics Data System (ADS)

    Family, Fereydoon; Mazzitello, K. I.; Arizmendi, C. M.; Grossniklaus, H. E.

    Age-related macular degeneration (AMD) is the leading cause of blindness beyond the age of 50 years. The most common pathogenic mechanism that leads to AMD is choroidal neovascularization (CNV). CNV is produced by accumulation of residual material caused by aging of retinal pigment epithelium (RPE) cells. The RPE is a phagocytic system that is essential for renewal of photoreceptors (rods and cones). With time, incompletely degraded membrane material builds up in the form of lipofuscin. Lipofuscin is made of free-radical-damaged protein and fat, which forms not only in AMD, but also in Alzheimer disease and Parkinson disease. The study of lipofuscin formation and growth is important because of its association with cellular aging. We introduce a model of non-equilibrium cluster growth and aggregation that we have developed for studying the formation and growth of lipofuscin in the aging RPE. Our results agree with a linear growth of the number of lipofuscin granules with age. We apply the dynamic scaling approach to our model and find excellent data collapse for the cluster size distribution. An unusual feature of our model is that while small particles are removed from the RPE, the larger ones become fixed and grow by aggregation.

  12. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
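
    For readers outside Excel, the sketch below loosely reproduces two of the described tools, under stated assumptions: descriptive statistics for a data set, and the normal-distribution value corresponding to a cumulative probability given a sample mean and standard deviation.

```python
# Rough Python analogues of two spreadsheet tools described above.
import numpy as np
from scipy.stats import norm

x = np.array([4.1, 5.3, 4.8, 5.0, 4.6, 5.2, 4.9])
print("mean:", x.mean(), "std:", x.std(ddof=1), "n:", x.size)

# Normal Distribution Estimate: the value at cumulative probability 0.95,
# given the sample mean and standard deviation.
print("95th percentile estimate:",
      norm.ppf(0.95, loc=x.mean(), scale=x.std(ddof=1)))
```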

  13. Statistical analysis of biomechanical properties of the adult skull and age-related structural changes by sex in a Japanese forensic sample.

    PubMed

    Torimitsu, Suguru; Nishida, Yoshifumi; Takano, Tachio; Koizumi, Yoshinori; Makino, Yohsuke; Yajima, Daisuke; Hayakawa, Mutsumi; Inokuchi, Go; Motomura, Ayumi; Chiba, Fumiko; Otsuka, Katsura; Kobayashi, Kazuhiro; Odo, Yuriko; Iwase, Hirotaro

    2014-01-01

    The purpose of this research was to investigate the biomechanical properties of the adult human skull and the structural changes that occur with age in both sexes. The heads of 94 Japanese cadavers (54 male cadavers, 40 female cadavers) autopsied in our department were used in this research. A total of 376 cranial samples, four from each skull, were collected. Sample fracture load was measured by a bending test. A statistically significant negative correlation between the sample fracture load and cadaver age was found. This indicates that the stiffness of cranial bones in Japanese individuals decreases with age, and the risk of skull fracture thus probably increases with age. Prior to the bending test, the sample mass, the sample thickness, the ratio of the sample thickness to cadaver stature (ST/CS), and the sample density were measured and calculated. Significant negative correlations between cadaver age and sample thickness, ST/CS, and the sample density were observed only among the female samples. Computerized tomographic (CT) images of 358 cranial samples were available. The computed tomography value (CT value) of cancellous bone which refers to a quantitative scale for describing radiodensity, cancellous bone thickness and cortical bone thickness were measured and calculated. Significant negative correlation between cadaver age and the CT value or cortical bone thickness was observed only among the female samples. These findings suggest that the skull is substantially affected by decreased bone metabolism resulting from osteoporosis. Therefore, osteoporosis prevention and treatment may increase cranial stiffness and reinforce the skull structure, leading to a decrease in the risk of skull fractures.
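
    The core statistical claim is a negative correlation between age and fracture load; a minimal sketch of that test, with made-up numbers, follows.

```python
# Pearson correlation between cadaver age and sample fracture load.
# Values are synthetic placeholders, not the study's measurements.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
age = rng.uniform(20, 90, size=94)                              # years
fracture_load = 1500 - 8.0 * age + rng.normal(scale=150, size=94)  # N

r, p = pearsonr(age, fracture_load)
print(f"r = {r:.2f}, p = {p:.3g}")  # a significant negative r mirrors the finding
```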

  14. Statistical Analysis of RNA Backbone

    PubMed Central

    Hershkovitz, Eli; Sapiro, Guillermo; Tannenbaum, Allen; Williams, Loren Dean

    2009-01-01

    Local conformation is an important determinant of RNA catalysis and binding. The analysis of RNA conformation is particularly difficult due to the large number of degrees of freedom (torsion angles) per residue. Proteins, by comparison, have many fewer degrees of freedom per residue. In this work, we use and extend classical tools from statistics and signal processing to search for clusters in RNA conformational space. Results are reported both for scalar analysis, where each torsion angle is separately studied, and for vectorial analysis, where several angles are simultaneously clustered. Adapting techniques from vector quantization and clustering to the RNA structure, we find torsion angle clusters and RNA conformational motifs. We validate the technique using well-known conformational motifs, showing that the simultaneous study of the total torsion angle space leads to results consistent with known motifs reported in the literature and also to the finding of new ones. PMID:17048391
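
    The sketch below shows one plausible form of the vectorial clustering step: because torsion angles are circular, each angle is embedded as a (cos, sin) pair before k-means. The embedding is a common workaround and an assumption here, not necessarily the authors' exact procedure.

```python
# K-means clustering of backbone torsion angles via a circular embedding.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
# Synthetic residues: 7 backbone torsion angles each, in degrees.
angles = rng.uniform(-180, 180, size=(500, 7))
radians = np.deg2rad(angles)
embedded = np.hstack([np.cos(radians), np.sin(radians)])  # 14-D embedding

labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(embedded)
print(np.bincount(labels))  # cluster occupancies ~ candidate conformational motifs
```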

  15. Statistical Design in Isothermal Aging of Polyimide Resins

    NASA Technical Reports Server (NTRS)

    Sutter, James K.; Jobe, Marcus; Crane, Elizabeth A.

    1995-01-01

    Recent developments in research on polyimides for high temperature applications have led to the synthesis of many new polymers. Among the criteria that determines their thermal oxidative stability, isothermal aging is one of the most important. Isothermal aging studies require that many experimental factors are controlled to provide accurate results. In this article we describe a statistical plan that compares the isothermal stability of several polyimide resins, while minimizing the variations inherent in high-temperature aging studies.

  16. Statistical Handbook on Aging Americans. 1994 Edition. Statistical Handbook Series Number 5.

    ERIC Educational Resources Information Center

    Schick, Frank L., Ed.; Schick, Renee, Ed.

    This statistical handbook contains 378 tables and charts illustrating the changes in the United States' aging population based on data collected during the 1990 census and several other surveys. The tables and charts are organized by topic as follows: demographics (age and sex distribution, life expectancy, race and ethnicity, geographic…

  17. Statistical Analysis of Tsunami Variability

    NASA Astrophysics Data System (ADS)

    Zolezzi, Francesca; Del Giudice, Tania; Traverso, Chiara; Valfrè, Giulio; Poggi, Pamela; Parker, Eric J.

    2010-05-01

    similar to that seen in ground motion attenuation correlations used for seismic hazard assessment. The second issue was intra-event variability. This refers to the differences in tsunami wave run-up along a section of coast during a single event. Intra-event variability was investigated directly by considering field observations. The tsunami events used in the statistical evaluation were selected on the basis of the completeness and reliability of the available data. Tsunamis considered for the analysis included the recent and well-surveyed Boxing Day 2004 tsunami (Great Indian Ocean Tsunami), Java 2006, Okushiri 1993, Kocaeli 1999, Messina 1908, and a case study of several historic events in Hawaii. Basic statistical analysis was performed on the field observations from these tsunamis. For events with very wide survey regions, the run-up heights were grouped in order to maintain a homogeneous distance from the source. Where more than one survey was available for a given event, the original datasets were maintained separately to avoid combining non-homogeneous data. The observed run-up measurements were used to evaluate the minimum, maximum, average, standard deviation, and coefficient of variation for each data set. The minimum coefficient of variation was 0.12, measured for the 2004 Boxing Day tsunami at Nias Island (7 data points), while the maximum was 0.98 for the Okushiri 1993 event (93 data points). The average coefficient of variation is of the order of 0.45.
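
    The per-dataset statistics named above reduce to a few lines of code; the numbers below are placeholders, not survey values.

```python
# Basic run-up statistics for one event dataset.
import numpy as np

runup = np.array([2.1, 3.4, 2.8, 4.0, 3.1, 2.5, 3.7])  # run-up heights, m
mean, std = runup.mean(), runup.std(ddof=1)
print("min:", runup.min(), "max:", runup.max())
print("mean:", mean, "std:", std, "CoV:", std / mean)
```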

  18. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1992-01-01

    Asymptotic Modal Analysis (AMA) is a method which is used to model linear dynamical systems with many participating modes. The AMA method was originally developed to show the relationship between statistical energy analysis (SEA) and classical modal analysis (CMA). In the limit of a large number of modes of a vibrating system, the classical modal analysis result can be shown to be equivalent to the statistical energy analysis result. As the CMA result evolves into the SEA result, a number of systematic assumptions are made. Most of these assumptions are based upon the supposition that the number of modes approaches infinity. It is for this reason that the term 'asymptotic' is used. AMA is the asymptotic result of taking the limit of CMA as the number of modes approaches infinity. AMA refers to any of the intermediate results between CMA and SEA, as well as the SEA result which is derived from CMA. The main advantage of the AMA method is that individual modal characteristics are not required in the model or computations. By contrast, CMA requires that each modal parameter be evaluated at each frequency. In the latter, contributions from each mode are computed and the final answer is obtained by summing over all the modes in the particular band of interest. AMA evaluates modal parameters only at their center frequency and does not sum the individual contributions from each mode in order to obtain a final result. The method is similar to SEA in this respect. However, SEA is only capable of obtaining spatial averages or means, as it is a statistical method. Since AMA is systematically derived from CMA, it can obtain local spatial information as well.

  19. Fake Statistically Valid Isotopic Ages in Impact Crater Geochronology

    NASA Astrophysics Data System (ADS)

    Jourdan, F.; Schmieder, M.; McWilliams, M. M.; Buchner, E.

    2009-05-01

    Precise dating of impact structures is crucial in several fundamental aspects, such as correlating effects on the bio- and geosphere caused by these catastrophic processes. Among the 176 listed impact structures [1], only 25 have a stated age precision better than ±2%. Statistical investigation of these 25 ages showed that 11 ages are accurate, 12 are at best ambiguous, and 2 are not well characterized [2]. In this study, we show that even with statistically valid isotope ages, the age of an impact can be "missed" by several hundred million years. We present a new 40Ar/39Ar plateau age of 444 ± 4 Ma for the Acraman structure (real age ~590 Ma [3]) and four plateau ages ranging from 81.07 ± 0.76 Ma to 74.6 ± 1.5 Ma for the Brent structure (estimated real age ~453 Ma [4]). In addition, we discuss a 40Ar/39Ar plateau age of 994 ± 11 Ma, recently obtained by [5] on the Dhala structure (real age ~2.0 Ga [5]). Despite careful sample preparation (single-grain handpicking and HF leaching to remove alteration phases), these results are much younger than the impact ages. Petrographic observations show that the Acraman and Dhala grain separates all have an orange color and show evidence of alteration. This suggests that these ages are the results of hydrothermal events that triggered intensive 40Ar* loss and crystallization of secondary phases. More intriguing are the Brent samples (glassy melt rocks obtained from a drill core), which appeared very fresh under the microscope. The Brent glass might be a Cretaceous pseudotachylite generated by a late adjustment of the structure and/or by a local earthquake. Because we know the approximate ages of the craters from stratigraphic evidence, these outliers are easy to identify. However, this is a red flag for any uncritical interpretation of isotopic ages (including e.g., 40Ar/39Ar, U/Pb, or U-Th/He [6]). In this paper, we encourage a multi-technique approach (i.e., isotopic, stratigraphic, paleogeographic [7,8]) and

  20. Statistical estimation of mineral age by K-Ar method

    SciTech Connect

    Vistelius, A.B.; Drubetzkoy, E.R.; Faas, A.V.

    1989-11-01

    Statistical estimation of mineral age from 40Ar/40K ratios may be considered a result of the convolution of uniform and normal distributions with different weights for different minerals. Data from the Gul'shad Massif (near Balkhash, Kazakhstan, USSR) indicate that 40Ar/40K ratios reflecting the intensity of geochemical processes can be resolved using convolutions. Loss of 40Ar in biotites is shown, whereas hornblende retained the original content of 40Ar throughout the geological history of the massif. The results demonstrate that different estimation methods must be used for different minerals and different rocks when radiometric ages are employed for dating.
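
    For context, the sketch below evaluates the standard K-Ar age equation from a radiogenic 40Ar/40K ratio with commonly used decay constants; it illustrates the dating step only, not the convolution analysis itself.

```python
# Standard K-Ar age equation: t = (1/lambda) * ln(1 + (lambda/lambda_EC) * 40Ar*/40K).
import math

LAMBDA = 5.543e-10      # total 40K decay constant, 1/yr (commonly used value)
LAMBDA_EC = 0.581e-10   # electron-capture branch (to 40Ar), 1/yr

def k_ar_age(ar40_k40_ratio):
    """Return age in Ma from the radiogenic 40Ar/40K molar ratio."""
    return math.log(1.0 + (LAMBDA / LAMBDA_EC) * ar40_k40_ratio) / LAMBDA / 1e6

print(f"{k_ar_age(0.02):.0f} Ma")  # illustrative ratio, ~315 Ma
```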

  1. Statistical analysis of planetary surfaces

    NASA Astrophysics Data System (ADS)

    Schmidt, Frederic; Landais, Francois; Lovejoy, Shaun

    2015-04-01

    In the last decades, a huge amount of topographic data has been obtained by several techniques (laser and radar altimetry, DTM…) for different bodies in the solar system, including Earth, Mars, the Moon, etc. In each case, topographic fields exhibit an extremely high variability, with details at every scale from millimeters to thousands of kilometers. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, this topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must have multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields that cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model high variability and intermittency. This theory is a good statistical candidate to model the topography field with a limited number of parameters (called the multifractal parameters). In our study, we show that the statistical properties of the Martian topography are accurately reproduced by this model, leading to new interpretations of geomorphological processes.
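
    As a toy version of such a scaling analysis (not the authors' multifractal code), the sketch below estimates a spectral exponent by fitting the log-log slope of a synthetic profile's power spectrum; scaling fields give approximately straight lines in this plot.

```python
# Spectral scaling exponent of a 1-D topographic-like profile.
import numpy as np

rng = np.random.default_rng(5)
profile = np.cumsum(rng.normal(size=4096))  # Brownian-like synthetic profile

power = np.abs(np.fft.rfft(profile)) ** 2
freq = np.fft.rfftfreq(profile.size)
mask = freq > 0                              # drop the DC component
beta, _ = np.polyfit(np.log(freq[mask]), np.log(power[mask]), 1)
print("spectral slope beta ~", beta)         # ~ -2 for Brownian profiles
```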

  2. An R package for statistical provenance analysis

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter; Resentini, Alberto; Garzanti, Eduardo

    2016-05-01

    This paper introduces provenance, a software package within the statistical programming environment R, which aims to facilitate the visualisation and interpretation of large amounts of sedimentary provenance data, including mineralogical, petrographic, chemical and isotopic provenance proxies, or any combination of these. provenance comprises functions to: (a) calculate the sample size required to achieve a given detection limit; (b) plot distributional data such as detrital zircon U-Pb age spectra as Cumulative Age Distributions (CADs) or adaptive Kernel Density Estimates (KDEs); (c) plot compositional data as pie charts or ternary diagrams; (d) correct the effects of hydraulic sorting on sandstone petrography and heavy mineral composition; (e) assess the settling equivalence of detrital minerals and grain-size dependence of sediment composition; (f) quantify the dissimilarity between distributional data using the Kolmogorov-Smirnov and Sircombe-Hazelton distances, or between compositional data using the Aitchison and Bray-Curtis distances; (g) interpret multi-sample datasets by means of (classical and nonmetric) Multidimensional Scaling (MDS) and Principal Component Analysis (PCA); and (h) simplify the interpretation of multi-method datasets by means of Generalised Procrustes Analysis (GPA) and 3-way MDS. All these tools can be accessed through an intuitive query-based user interface, which does not require knowledge of the R programming language. provenance is free software released under the GPL-2 licence and will be further expanded based on user feedback.
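
    The provenance package itself is written in R; as a language-neutral illustration of one building block named above, the sketch below computes the Kolmogorov-Smirnov dissimilarity between two synthetic detrital age distributions.

```python
# KS dissimilarity between two detrital age spectra (synthetic ages in Ma).
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(6)
sample_a = rng.normal(1000, 150, size=120)
sample_b = rng.normal(1100, 200, size=90)

stat, p = ks_2samp(sample_a, sample_b)
print(f"KS distance = {stat:.3f}, p = {p:.3g}")
```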

  3. Statistical Power in Meta-Analysis

    ERIC Educational Resources Information Center

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  4. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1988-01-01

    Statistical Energy Analysis (SEA) is defined by considering the asymptotic limit of Classical Modal Analysis, an approach called Asymptotic Modal Analysis (AMA). The general approach is described for both structural and acoustical systems. The theoretical foundation is presented for structural systems, and experimental verification is presented for a structural plate responding to a random force. Work accomplished subsequent to the grant initiation focusses on the acoustic response of an interior cavity (i.e., an aircraft or spacecraft fuselage) with a portion of the wall vibrating in a large number of structural modes. First results were presented at the ASME Winter Annual Meeting in December, 1987, and accepted for publication in the Journal of Vibration, Acoustics, Stress and Reliability in Design. It is shown that asymptotically as the number of acoustic modes excited becomes large, the pressure level in the cavity becomes uniform except at the cavity boundaries. However, the mean square pressure at the cavity corner, edge and wall is, respectively, 8, 4, and 2 times the value in the cavity interior. Also it is shown that when the portion of the wall which is vibrating is near a cavity corner or edge, the response is significantly higher.

  5. Collecting operational event data for statistical analysis

    SciTech Connect

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis.
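
    Event counts with an exposure time of the kind described feed naturally into a Poisson rate estimate; a minimal sketch with an exact confidence interval (illustrative numbers) follows.

```python
# Exact (chi-square based) confidence interval for a Poisson event rate,
# from event count k and exposure time T.
from scipy.stats import chi2

k, T = 7, 3500.0   # 7 events in 3500 component-hours (illustrative)
alpha = 0.05
lower = chi2.ppf(alpha / 2, 2 * k) / (2 * T)
upper = chi2.ppf(1 - alpha / 2, 2 * (k + 1)) / (2 * T)
print(f"rate = {k/T:.2e}/h, 95% CI = ({lower:.2e}, {upper:.2e})")
```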

  6. Statistical Survey and Analysis Handbook.

    ERIC Educational Resources Information Center

    Smith, Kenneth F.

    The National Food and Agriculture Council of the Philippines regularly requires rapid feedback data for analysis, which will assist in monitoring programs to improve and increase the production of selected crops by small scale farmers. Since many other development programs in various subject matter areas also require similar statistical…

  7. Aging and the statistical learning of grammatical form classes.

    PubMed

    Schwab, Jessica F; Schuler, Kathryn D; Stillman, Chelsea M; Newport, Elissa L; Howard, James H; Howard, Darlene V

    2016-08-01

    Language learners must place unfamiliar words into categories, often with few explicit indicators about when and how that word can be used grammatically. Reeder, Newport, and Aslin (2013) showed that college students can learn grammatical form classes from an artificial language by relying solely on distributional information (i.e., contextual cues in the input). Here, 2 experiments revealed that healthy older adults also show such statistical learning, though they are poorer than the young at distinguishing grammatical from ungrammatical strings. This finding expands knowledge of which aspects of learning vary with aging, with potential implications for second language learning in late adulthood. (PsycINFO Database Record) PMID:27294711

  8. Statistical Analysis For Nucleus/Nucleus Collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1989-01-01

    Report describes use of several statistical techniques to characterize angular distributions of secondary particles emitted in collisions of atomic nuclei in energy range of 24 to 61 GeV per nucleon. Purpose of statistical analysis is to determine correlations between intensities of emitted particles and angles, confirming existence of quark/gluon plasma.

  9. Explorations in Statistics: The Analysis of Change

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas; Williams, Calvin L.

    2015-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This tenth installment of "Explorations in Statistics" explores the analysis of a potential change in some physiological response. As researchers, we often express absolute change as percent change so we can…

  10. A Statistical Analysis of Cotton Fiber Properties

    NASA Astrophysics Data System (ADS)

    Ghosh, Anindya; Das, Subhasis; Majumder, Asha

    2016-04-01

    This paper reports a statistical analysis of different cotton fiber properties, such as strength, breaking elongation, upper half mean length, length uniformity index, short fiber index, micronaire, reflectance, and yellowness, measured from 1200 cotton bales. Uni-variate, bi-variate, and multi-variate statistical analyses have been invoked to elicit the interrelationships between the above-mentioned properties, taking them singly, pairwise, and jointly, respectively. In the multi-variate analysis, all cotton fiber properties are considered simultaneously using the multi-dimensional technique of principal factor analysis.

  11. Statistics analysis embedded in spatial DBMS

    NASA Astrophysics Data System (ADS)

    Chen, Rongguo; Chen, Siqing

    2006-10-01

    This article sets forth the principle and methodology for implementing a spatial database management system (DBMS) using the open source object-relational DBMS PostgreSQL. The geospatial data model and the spatial analysis and processing operations for spatial objects and datasets can be added to the DBMS through extended SQL. To implement statistical analysis embedded in the spatial DBMS, the open source statistical environment R is introduced to extend the capability of the spatial DBMS. R is a language and environment for statistical computing and graphics, and a large number of statistical methods are available as R packages, including many classical and modern spatial statistical techniques. PL/R is a loadable procedural language that exposes most of the capabilities of the R language and enables the user to write DBMS functions and triggers in R. PL/R therefore gains spatial statistics and geostatistics capabilities when the corresponding packages are loaded into R. Because PL/R can be extended without limit, embedding any new method of statistical analysis into the spatial DBMS becomes very convenient.

  12. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The problem of small sample sizes encountered in the analysis of space-flight data is examined. Because such a small amount of data is available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on considerations needed to choose the most appropriate test for a given type of analysis.
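
    A minimal sketch of the report's theme, choosing a small-sample test from its assumptions: check normality first, then pick the t-test or a rank-based alternative.

```python
# Assumption-driven choice of a two-sample test for small samples.
from scipy.stats import shapiro, ttest_ind, mannwhitneyu

a = [5.1, 4.9, 5.4, 5.0, 5.2]
b = [4.6, 4.8, 4.5, 4.9, 4.7]

if shapiro(a).pvalue > 0.05 and shapiro(b).pvalue > 0.05:
    result = ttest_ind(a, b)     # normality plausible: two-sample t-test
else:
    result = mannwhitneyu(a, b)  # otherwise a rank-based test
print(result)
```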

  13. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  14. PREFACE: Statistical Physics of Ageing Phenomena and the Glass Transition

    NASA Astrophysics Data System (ADS)

    Henkel, Malte; Pleimling, Michel; Sanctuary, Roland

    2006-06-01

    A summer school on 'Ageing and the glass transition' was held at the University of Luxembourg on 18-24 September 2005. It brought together about 60 scientists actively studying the related fields of physical ageing and of the thermodynamics of glass-forming systems when undergoing a glass transition. The programme of the school can be found on the homepage (http://www.theorie1.physik.uni-erlangen.de/sommerschule.html). The school contained both invited lectures and contributed talks and posters. This volume presents the works contributed to the summer school, while the invited lectures will be published elsewhere (M Henkel, M Pleimling and R Sanctuary (eds), Ageing and the glass transition, Springer Lecture Notes in Physics, Springer (Heidelberg 2006)). We have tried to encourage the exchange between theorists and experimentalists, to which the topics treated in these proceedings bear witness. They range from experimental studies on the mechanical response of glasses, biopolymers, and granular materials and on the effects of ageing on the long-time modification of the properties of glass-forming polymers; to simulational and analytical studies of theoretical models describing the non-equilibrium statistical mechanics of systems that display the dynamical scaling typical of ageing phenomena and that are thought to capture essential aspects of glass-forming materials close to a glass transition; to more mathematically oriented investigations of the symmetries of these systems. The 'Grande Région' Sar-Lor-Lux is leading European efforts to overcome national and linguistic barriers, with the view of creating a common academic education. Physics has a standing internationalist tradition, and the existing trinational integrated course in Physics SLLS (see the homepage http://www.uni-saarland.de/fak7/krueger/integ/sll/d/cursus.htm) is busily developing ways and means towards this goal, in particular through the delivery of multinational and multilingual university degrees in

  15. Statistical Analysis Experiment for Freshman Chemistry Lab.

    ERIC Educational Resources Information Center

    Salzsieder, John C.

    1995-01-01

    Describes a laboratory experiment dissolving zinc from galvanized nails in which data can be gathered very quickly for statistical analysis. The data have sufficient significant figures and the experiment yields a nice distribution of random errors. Freshman students can gain an appreciation of the relationships between random error, number of…

  16. MICROARRAY DATA ANALYSIS USING MULTIPLE STATISTICAL MODELS

    EPA Science Inventory

    Wenjun Bao1, Judith E. Schmid1, Amber K. Goetz1, Ming Ouyang2, William J. Welsh2,Andrew I. Brooks3,4, ChiYi Chu3,Mitsunori Ogihara3,4, Yinhe Cheng5, David J. Dix1. 1National Health and Environmental Effects Researc...

  17. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  18. Survival analysis of aging aircraft

    NASA Astrophysics Data System (ADS)

    Benavides, Samuel

    This study pushes systems engineering of aging aircraft beyond the boundaries of empirical and deterministic modeling by making a sharp break with the traditional laboratory-derived corrosion prediction algorithms that have shrouded real-world failures of aircraft structure. At the heart of this problem is the aeronautical industry's inability to come forward with an accurate model that predicts corrosion failures in aircraft, in spite of advances in corrosion algorithms or improvements in simulation and modeling. The struggle to develop accurate corrosion probabilistic models stems from a multitude of real-world interacting variables that synergistically influence corrosion in convoluted and complex ways. This dissertation, in essence, offers a statistical framework for the analysis of structural airframe corrosion failure by utilizing real-world data while considering the effects of interacting corrosion variables. This study injects realism into corrosion failures of aging aircraft systems by accomplishing four major goals related to the conceptual and methodological framework of corrosion modeling. First, this work connects corrosion modeling from the traditional, laboratory-derived algorithms to corrosion failures in actual operating aircraft. This work augments physics-based modeling by examining the many confounding and interacting variables, such as environmental, geographical, and operational, that impact failure of airframe structure. Examined through the lens of censored failure data from aircraft flying in a maritime environment, this study enhances the understanding of the triad of theoretical, laboratory, and real-world corrosion. Second, this study explores the importation and successful application of an advanced biomedical statistical tool, survival analysis, to model censored corrosion failure data. This well-grounded statistical methodology is inverted from a methodology that analyzes survival to one that examines failures. Third, this
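
    A hand-rolled Kaplan-Meier estimator, the standard nonparametric survival tool for censored data of this kind, is sketched below; the flight-hour values are invented for illustration.

```python
# Kaplan-Meier survival estimate for right-censored failure data.
import numpy as np

times = np.array([1200., 1500., 1500., 2100., 2600., 3000., 3400.])
observed = np.array([1, 1, 0, 1, 0, 1, 0])  # 1 = corrosion failure, 0 = censored

survival = 1.0
for t in np.unique(times[observed == 1]):   # event times, ascending
    at_risk = np.sum(times >= t)
    deaths = np.sum((times == t) & (observed == 1))
    survival *= 1.0 - deaths / at_risk
    print(f"S({t:.0f} h) = {survival:.3f}")
```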

  19. Statistical shape analysis: From landmarks to diffeomorphisms.

    PubMed

    Zhang, Miaomiao; Golland, Polina

    2016-10-01

    We offer a blazingly brief review of evolution of shape analysis methods in medical imaging. As the representations and the statistical models grew more sophisticated, the problem of shape analysis has been gradually redefined to accept images rather than binary segmentations as a starting point. This transformation enabled shape analysis to take its rightful place in the arsenal of tools for extracting and understanding patterns in large clinical image sets. We speculate on the future developments in shape analysis and potential applications that would bring this mathematically rich area to bear on clinical practice. PMID:27377332

  20. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
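
    One of the reviewed ingredients, inverse-covariance (network) estimation, can be sketched with the graphical lasso; the data below are synthetic stand-ins for expression measurements.

```python
# Sparse inverse-covariance estimation for network modeling.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 20))                   # 200 samples, 20 genes
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=200)   # induce one strong edge

model = GraphicalLassoCV().fit(X)
precision = model.precision_                     # zeros = conditional independence
print("nonzero off-diagonal entries:", (np.abs(precision) > 1e-6).sum() - 20)
```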

  1. Statistical algorithm to test the presence of correlation between time series with age/dating uncertainties.

    NASA Astrophysics Data System (ADS)

    Haam, E. K.; Huybers, P.

    2008-12-01

    To understand the Earth's climate, we must understand the inter-relations between its specific geographical areas, which, in the case of paleoclimatology, can be profitably undertaken from an empirical perspective. However, assessment of the inter-relation between separate paleoclimate records is inevitably hindered by uncertainties in the absolute and relative age/dating of these climate records, because the correlation between two paleoclimate records with age uncertainty can change dramatically when variations of the age are allowed within the uncertainty limit. Through rigorous statistical analysis of the available proxy data, we can hope to gain better insight into the nature and scope of the mechanisms governing their variability. We propose a statistical algorithm to test for the presence of correlation between two paleoclimate time series with age/dating uncertainties. Previous work in this area has focused on searching for the maximum similarity out of all possible realizations of the series, either heuristically (visual wiggle matching) or through more quantitative methods (e.g., cross-correlation maximizers, dynamic programming). In contrast, this algorithm seeks to determine the statistical significance of the maximum covariance. The probability of obtaining a certain maximum covariance from purely random events can provide us with an objective standard for real correlation, and it is assessed using the theory of extreme order statistics, as a multivariate normal integral. Since there is no known closed-form solution for a multivariate normal integral, a numerical method is used. We apply this algorithm to test for correlation between the Dansgaard-Oeschger variability observed during MIS3 in the GISP2 ice core and millennial variability recorded at sites including Botuverá Cave in Brazil, Hulu Cave in China, Eastern Indonesia, the Arabian Sea, Villa Cave in Europe, New Zealand, and the Santa Barbara basin. Results of the analysis are presented as a map of the
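
    The paper evaluates the maximum-covariance significance analytically as a multivariate normal integral; the sketch below conveys the same question by brute force, assuming a simple model in which one record may shift by up to a fixed number of samples.

```python
# Monte Carlo stand-in for the analytic test: how often do random series
# produce a maximum covariance (over allowed age shifts) this large?
import numpy as np

rng = np.random.default_rng(8)
n, max_shift, n_null = 200, 10, 1000
x, y = rng.normal(size=n), rng.normal(size=n)

def max_cov(a, b, k):
    """Maximum covariance of a and b over relative lags in [-k, k]."""
    best = -np.inf
    for lag in range(-k, k + 1):
        if lag >= 0:
            c = np.cov(a[lag:], b[:len(b) - lag])[0, 1]
        else:
            c = np.cov(a[:lag], b[-lag:])[0, 1]
        best = max(best, c)
    return best

observed = max_cov(x, y, max_shift)
null = [max_cov(rng.normal(size=n), rng.normal(size=n), max_shift)
        for _ in range(n_null)]
p = np.mean(np.array(null) >= observed)
print(f"max covariance = {observed:.3f}, Monte Carlo p = {p:.3f}")
```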

  2. Statistical Analysis of Thermal Analysis Margin

    NASA Technical Reports Server (NTRS)

    Garrison, Matthew B.

    2011-01-01

    NASA Goddard Space Flight Center requires that each project demonstrate a minimum of 5 °C of margin between temperature predictions and hot and cold flight operational limits. The bounding temperature predictions include worst-case environment and thermal optical properties. The purpose of this work is to assess how current missions are performing against their pre-launch bounding temperature predictions and to suggest possible changes to the thermal analysis margin rules.

  3. Comparative statistical analysis of planetary surfaces

    NASA Astrophysics Data System (ADS)

    Schmidt, Frédéric; Landais, Francois; Lovejoy, Shaun

    2016-04-01

    In the present study, we aim to provide a statistical and comparative description of topographic fields by using the huge amount of topographic data available for different bodies in the solar system, including Earth, Mars, the Moon, etc. Our goal is to characterize and quantify the geophysical processes involved by a relevant statistical description. In each case, topographic fields exhibit an extremely high variability, with details at every scale from millimeters to thousands of kilometers. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, this topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must have multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields that cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model high variability and intermittency. This theory is a good statistical candidate to model the topography field with a limited number of parameters (called the multifractal parameters). After a global analysis of Mars (Landais et al., 2015), we have performed similar analyses on different bodies in the solar system, including the Moon, Venus, and Mercury, indicating that the multifractal parameters might be relevant to explain the competition between several processes operating on multiple scales.

  4. Statistical Analysis of Iberian Peninsula Megaliths Orientations

    NASA Astrophysics Data System (ADS)

    González-García, A. C.

    2009-08-01

    Megalithic monuments have been intensively surveyed and studied from the archaeoastronomical point of view in the past decades. We have orientation measurements for over one thousand megalithic burial monuments in the Iberian Peninsula, from several different periods. These data, however, still lack a sound interpretation. A way to classify and begin to understand such orientations is by means of statistical analysis of the data. A first attempt is made with simple statistical variables and a mere comparison between the different areas. In order to minimise the subjectivity in the process, a further, more sophisticated analysis is performed. Some interesting results linking the orientation and the geographical location will be presented. Finally, I will present some models comparing the orientation of the megaliths in the Iberian Peninsula with the rising of the sun and the moon at several times of the year.
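
    A common first test for a preferred orientation in circular data of this kind is the Rayleigh test; the sketch below (with synthetic azimuths) is an assumption about a reasonable analysis, not necessarily the author's method.

```python
# Rayleigh test for non-uniformity of orientation (circular) data.
import numpy as np

rng = np.random.default_rng(9)
azimuths = np.deg2rad(rng.normal(95, 15, size=80) % 360)  # synthetic orientations

C, S = np.cos(azimuths).sum(), np.sin(azimuths).sum()
n = azimuths.size
R_bar = np.hypot(C, S) / n                      # mean resultant length
z = n * R_bar**2
p = np.exp(-z) * (1 + (2 * z - z**2) / (4 * n))  # standard approximation
print(f"R_bar = {R_bar:.2f}, p = {p:.3g}")       # small p => preferred orientation
```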

  5. Protein Sectors: Statistical Coupling Analysis versus Conservation

    PubMed Central

    Teşileanu, Tiberiu; Colwell, Lucy J.; Leibler, Stanislas

    2015-01-01

    Statistical coupling analysis (SCA) is a method for analyzing multiple sequence alignments that was used to identify groups of coevolving residues termed “sectors”. The method applies spectral analysis to a matrix obtained by combining correlation information with sequence conservation. It has been asserted that the protein sectors identified by SCA are functionally significant, with different sectors controlling different biochemical properties of the protein. Here we reconsider the available experimental data and note that it involves almost exclusively proteins with a single sector. We show that in this case sequence conservation is the dominating factor in SCA, and can alone be used to make statistically equivalent functional predictions. Therefore, we suggest shifting the experimental focus to proteins for which SCA identifies several sectors. Correlations in protein alignments, which have been shown to be informative in a number of independent studies, would then be less dominated by sequence conservation. PMID:25723535

  6. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156

  7. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.

  8. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality results of the finished parts under different combinations of process variables are evaluated. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence
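
    The decision logic described, normality and homoscedasticity checks gating ANOVA versus a nonparametric alternative, can be sketched as follows with synthetic vibration readings.

```python
# Assumption checks followed by ANOVA or Kruskal-Wallis.
import numpy as np
from scipy.stats import shapiro, bartlett, f_oneway, kruskal

rng = np.random.default_rng(10)
groups = [rng.normal(1.0 + 0.2 * i, 0.3, size=12) for i in range(3)]  # vibration RMS

normal = all(shapiro(g).pvalue > 0.05 for g in groups)
equal_var = bartlett(*groups).pvalue > 0.05

if normal and equal_var:
    print("ANOVA:", f_oneway(*groups))
else:
    print("Kruskal-Wallis:", kruskal(*groups))
```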

  9. Analysis and modeling of resistive switching statistics

    NASA Astrophysics Data System (ADS)

    Long, Shibing; Cagli, Carlo; Ielmini, Daniele; Liu, Ming; Suñé, Jordi

    2012-04-01

    The resistive random access memory (RRAM), based on the reversible switching between different resistance states, is a promising candidate for next-generation nonvolatile memories. One of the most important challenges to foster the practical application of RRAM is the control of the statistical variation of switching parameters to gain low variability and high reliability. In this work, starting from the well-known percolation model of dielectric breakdown (BD), we establish a framework of analysis and modeling of the resistive switching statistics in RRAM devices, which are based on the formation and disconnection of a conducting filament (CF). One key aspect of our proposal is the relation between the CF resistance and the switching statistics. Hence, establishing the correlation between SET and RESET switching variables and the initial resistance of the device in the OFF and ON states, respectively, is a fundamental issue. Our modeling approach to the switching statistics is fully analytical and contains two main elements: (i) a geometrical cell-based description of the CF and (ii) a deterministic model for the switching dynamics. Both ingredients might be slightly different for the SET and RESET processes, for the type of switching (bipolar or unipolar), and for the kind of considered resistive structure (oxide-based, conductive bridge, etc.). However, the basic structure of our approach is thought to be useful for all the cases and should provide a framework for the physics-based understanding of the switching mechanisms and the associated statistics, for the trustful estimation of RRAM performance, and for the successful forecast of reliability. As a first application example, we start by considering the case of the RESET statistics of NiO-based RRAM structures. In particular, we statistically analyze the RESET transitions of a statistically significant number of switching cycles of Pt/NiO/W devices. In the RESET transition, the ON-state resistance (RON) is a
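    As a hedged illustration of breakdown-style switching statistics: percolation models of dielectric breakdown classically lead to Weibull-distributed breakdown variables, so a Weibull fit is a natural first look at SET-voltage variability. The voltages below are invented, and this is not the paper's cell-based analytical model:

        import numpy as np
        from scipy import stats

        v_set = np.array([1.10, 1.22, 1.31, 1.18, 1.27, 1.35, 1.15, 1.29])  # volts

        # Two-parameter Weibull fit (location pinned at zero).
        shape, loc, scale = stats.weibull_min.fit(v_set, floc=0.0)
        print(f"Weibull shape = {shape:.2f}, scale = {scale:.3f} V")
        # A larger shape parameter means a tighter switching-voltage
        # distribution, i.e. lower cycle-to-cycle variability.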

  10. Statistical analysis of diversification with species traits.

    PubMed

    Paradis, Emmanuel

    2005-01-01

    Testing whether some species traits have a significant effect on diversification rates is central in the assessment of macroevolutionary theories. However, we still lack a powerful method to tackle this objective. I present a new method for the statistical analysis of diversification with species traits. The required data are observations of the traits on recent species, the phylogenetic tree of these species, and reconstructions of ancestral values of the traits. Several traits, either continuous or discrete, and in some cases their interactions, can be analyzed simultaneously. The parameters are estimated by the method of maximum likelihood. The statistical significance of the effects in a model can be tested with likelihood ratio tests. A simulation study showed that past random extinction events do not affect the Type I error rate of the tests, whereas statistical power is decreased, though some power remains if the effect of the simulated trait on speciation is strong. The use of the method is illustrated by the analysis of published data on primates. The analysis of these data showed that the apparent overall positive relationship between body mass and species diversity is actually an artifact due to a clade-specific effect. Within each clade the effect of body mass on speciation rate was in fact negative. The present method allows both effects (clade and body mass) to be taken into account simultaneously.
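    The inferential step named above, a likelihood ratio test between nested diversification models, reduces to a chi-squared comparison of model fits. A generic sketch with placeholder log-likelihoods (not values from the paper):

        from scipy import stats

        ll_null = -152.4   # model without the trait effect (hypothetical)
        ll_full = -147.1   # model including the trait effect (hypothetical)
        df = 1             # one extra parameter in the full model

        lr = 2.0 * (ll_full - ll_null)
        p_value = stats.chi2.sf(lr, df)
        print(f"LR = {lr:.2f}, p = {p_value:.4f}")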

  11. Two statistical tests for meiotic breakpoint analysis.

    PubMed Central

    Plaetke, R; Schachtel, G A

    1995-01-01

    Meiotic breakpoint analysis (BPA), a statistical method for ordering genetic markers, is increasing in importance as a method for building genetic maps of human chromosomes. Although BPA does not provide estimates of genetic distances between markers, it efficiently locates new markers on already defined dense maps, when likelihood analysis becomes cumbersome or the sample size is small. However, until now no assessments of statistical significance have been available for evaluating the possibility that the results of a BPA were produced by chance. In this paper, we propose two statistical tests to determine whether the size of a sample and its genetic information content are sufficient to distinguish between "no linkage" and "linkage" of a marker mapped by BPA to a certain region. Both tests are exact and should be conducted after a BPA has assigned the marker to an interval on the map. Applications of the new tests are demonstrated by three examples: (1) a synthetic data set, (2) a data set of five markers on human chromosome 8p, and (3) a data set of four markers on human chromosome 17q. PMID:7847387

  12. Statistical Considerations for Analysis of Microarray Experiments

    PubMed Central

    Owzar, Kouros; Barry, William T.; Jung, Sin-Ho

    2014-01-01

    Microarray technologies enable the simultaneous interrogation of expression from thousands of genes in a biospecimen sample taken from a patient. This large set of expression values generates a genetic profile of the patient that may be used to identify potential prognostic or predictive genes or genetic models for clinical outcomes. The aim of this article is to provide a broad overview of some of the major statistical considerations for the design and analysis of microarray experiments conducted as correlative science studies to clinical trials. An emphasis is placed on how a lack of understanding and improper use of statistical concepts and methods can lead to noise discovery and misinterpretation of experimental results. PMID:22212230
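    One of the central statistical considerations in such experiments is multiplicity: thousands of genes are tested simultaneously, so unadjusted P values invite exactly the "noise discovery" the article warns about. A minimal sketch of false-discovery-rate control with Benjamini-Hochberg, using simulated stand-in p-values:

        import numpy as np
        from statsmodels.stats.multitest import multipletests

        pvals = np.random.uniform(size=10_000)   # stand-in for per-gene p-values
        reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        print(f"{reject.sum()} genes declared significant at FDR 0.05")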

  13. Statistical Analysis of Zebrafish Locomotor Response.

    PubMed

    Liu, Yiwen; Carmer, Robert; Zhang, Gaonan; Venkatraman, Prahatha; Brown, Skye Ashton; Pang, Chi-Pui; Zhang, Mingzhi; Ma, Ping; Leung, Yuk Fai

    2015-01-01

    Zebrafish larvae display rich locomotor behaviour upon external stimulation. The movement can be simultaneously tracked from many larvae arranged in multi-well plates. The resulting time-series locomotor data have been used to reveal new insights into neurobiology and pharmacology. However, the data are of large scale, and the corresponding locomotor behavior is affected by multiple factors. These issues pose a statistical challenge for comparing larval activities. To address this gap, this study has analyzed a visually-driven locomotor behaviour named the visual motor response (VMR) by the Hotelling's T-squared test. This test is congruent with comparing locomotor profiles from a time period. Different wild-type (WT) strains were compared using the test, which shows that they responded differently to light change at different developmental stages. The performance of this test was evaluated by a power analysis, which shows that the test was sensitive for detecting differences between experimental groups with sample numbers that were commonly used in various studies. In addition, this study investigated the effects of various factors that might affect the VMR by multivariate analysis of variance (MANOVA). The results indicate that the larval activity was generally affected by stage, light stimulus, their interaction, and location in the plate. Nonetheless, different factors affected larval activity differently over time, as indicated by a dynamical analysis of the activity at each second. Intriguingly, this analysis also shows that biological and technical repeats had negligible effect on larval activity. This finding is consistent with that from the Hotelling's T-squared test, and suggests that experimental repeats can be combined to enhance statistical power. Together, these investigations have established a statistical framework for analyzing VMR data, a framework that should be generally applicable to other locomotor data with similar structure. PMID:26437184
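    A minimal two-sample Hotelling's T-squared test of the kind used here to compare activity profiles; rows are larvae and columns are time points, and all shapes and values are illustrative rather than taken from the study:

        import numpy as np
        from scipy import stats

        def hotelling_t2(X, Y):
            n1, p = X.shape
            n2, _ = Y.shape
            d = X.mean(axis=0) - Y.mean(axis=0)
            # Pooled covariance matrix of the two groups.
            S = ((n1 - 1) * np.cov(X, rowvar=False)
                 + (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
            t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
            # Convert T-squared to an F statistic.
            f = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
            return t2, stats.f.sf(f, p, n1 + n2 - p - 1)

        rng = np.random.default_rng(0)
        strain_a = rng.normal(1.0, 0.2, size=(20, 5))   # activity profiles, strain A
        strain_b = rng.normal(1.2, 0.2, size=(20, 5))   # activity profiles, strain B
        print(hotelling_t2(strain_a, strain_b))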

  14. Statistical analysis of life history calendar data.

    PubMed

    Eerola, Mervi; Helske, Satu

    2016-04-01

    The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis with the model-free data mining method, sequence analysis. In event history analysis, we estimate, instead of transition hazards, the cumulative prediction probabilities of life events over the entire trajectory. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns while event history analysis is needed for causal inquiries.

  15. Statistical analysis of extreme auroral electrojet indices

    NASA Astrophysics Data System (ADS)

    Nakamura, Masao; Yoneda, Asato; Oda, Mitsunobu; Tsubouchi, Ken

    2015-09-01

    Extreme auroral electrojet activities can damage electrical power grids due to large induced currents in the Earth, degrade radio communications and navigation systems due to ionospheric disturbances, and cause polar-orbiting satellite anomalies due to enhanced auroral electron precipitation. Statistical estimation of extreme auroral electrojet activities is therefore an important part of space weather research. For this estimation, we utilize extreme value theory (EVT), which focuses on the statistical behavior in the tail of a distribution. As measures of auroral electrojet activity, the auroral electrojet indices AL, AU, and AE are used, which describe the maximum current strength of the westward and eastward auroral electrojets and the sum of the two oppositely directed electrojets in the auroral-latitude ionosphere, respectively. We provide statistical evidence for finite upper limits to AL and AU and estimate the annual expected number and probable intensity of their extreme events. We detect two different types of extreme AE events; therefore, application of the appropriate EVT analysis to AE is difficult.
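    A sketch of the peaks-over-threshold step in such an EVT analysis: fit a generalized Pareto distribution to exceedances of |AL| above a high threshold; a fitted shape parameter xi < 0 implies a finite upper limit at threshold - sigma/xi. The synthetic data are placeholders, not AE-index values:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        al_magnitude = rng.gamma(2.0, 300.0, size=50_000)   # stand-in for |AL| (nT)

        threshold = np.quantile(al_magnitude, 0.995)
        excess = al_magnitude[al_magnitude > threshold] - threshold

        xi, loc, sigma = stats.genpareto.fit(excess, floc=0.0)
        print(f"shape xi = {xi:.3f}, scale sigma = {sigma:.1f} nT")
        if xi < 0:
            print(f"implied upper limit ~ {threshold - sigma / xi:.0f} nT")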

  16. Statistical Hot Channel Analysis for the NBSR

    SciTech Connect

    Cuadra A.; Baek J.

    2014-05-27

    A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.

  17. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. Conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  18. Statistical utopianism in the age of aristocratic efficiency.

    PubMed

    Porter, Theodore

    2002-01-01

    The modern history of science is commonly associated with an inexorable move toward increasing specialization and, perhaps, a proliferation of expert discourses at the expense of public discourse. This paper concerns the standing of science as a basis for public authority in late-Victorian and Edwardian Britain, and suggests that, in relation to the political order, this standing remained tenuous. These themes are exemplified by the career of Karl Pearson, founder of the modern school of mathematical statistics and something of a social visionary. Like Huxley and other scientific naturalists, Pearson wished to incorporate science into a reinvigorated "general culture" and in this way to reshape an elite. Statistics, seemingly the archetypal form of specialist expertise, was conceived as an almost utopian program to advance intelligence and morality in what he sometimes referred to as a new aristocracy. PMID:12385323

  19. Methods of the computer-aided statistical analysis of microcircuits

    NASA Astrophysics Data System (ADS)

    Beliakov, Iu. N.; Kurmaev, F. A.; Batalov, B. V.

    Methods that are currently used for the computer-aided statistical analysis of microcircuits at the design stage are summarized. In particular, attention is given to methods for solving problems in statistical analysis, statistical planning, and factorial model synthesis by means of irregular experimental design. Efficient ways of reducing the computer time required for statistical analysis and numerical methods of microcircuit analysis are proposed. The discussion also covers various aspects of the organization of computer-aided microcircuit modeling and analysis systems.

  20. Multivariate statistical analysis of wildfires in Portugal

    NASA Astrophysics Data System (ADS)

    Costa, Ricardo; Caramelo, Liliana; Pereira, Mário

    2013-04-01

    Several studies demonstrate that wildfires in Portugal present high temporal and spatial variability as well as cluster behavior (Pereira et al., 2005, 2011). This study aims to contribute to the characterization of the fire regime in Portugal through multivariate statistical analysis of the time series of number of fires and area burned in Portugal during the 1980-2009 period. The data used in the analysis are an extended version of the Rural Fire Portuguese Database (PRFD) (Pereira et al., 2011), provided by the National Forest Authority (Autoridade Florestal Nacional, AFN), the Portuguese Forest Service, which includes information for more than 500,000 fire records. There are many advanced techniques for examining the relationships among multiple time series at the same time (e.g., canonical correlation analysis, principal components analysis, factor analysis, path analysis, multiple analysis of variance, clustering systems). This study compares and discusses the results obtained with these different techniques. Pereira, M.G., Trigo, R.M., DaCamara, C.C., Pereira, J.M.C., Leite, S.M., 2005: "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology, 129, 11-25. Pereira, M. G., Malamud, B. D., Trigo, R. M., and Alves, P. I.: The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, doi:10.5194/nhess-11-3343-2011, 2011. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692, the project FLAIR (PTDC/AAC-AMB/104702/2008) and the EU 7th Framework Program through FUME (contract number 243888).

  1. On intracluster Faraday rotation. II - Statistical analysis

    NASA Technical Reports Server (NTRS)

    Lawler, J. M.; Dennison, B.

    1982-01-01

    The comparison of a reliable sample of radio source Faraday rotation measurements seen through rich clusters of galaxies with sources seen through the outer parts of clusters, and therefore having little intracluster Faraday rotation, indicates that the distribution of rotation in the former population is broadened, but only at the 80% level of statistical confidence. Employing a physical model for the intracluster medium in which the ratio of magnetic field strength to the square root of the number of turbulent cells per gas core radius is approximately 0.07 microgauss, a Monte Carlo simulation is able to reproduce the observed broadening. An upper limit of 0.20 microgauss on this field-strength/turbulent-cell ratio, combined with lower limits on field strength imposed by limitations on the Compton-scattered flux, shows that intracluster magnetic fields must be tangled on scales greater than about 20 kpc.

  2. FRATS: Functional Regression Analysis of DTI Tract Statistics

    PubMed Central

    Zhu, Hongtu; Styner, Martin; Tang, Niansheng; Liu, Zhexing; Lin, Weili; Gilmore, John H.

    2010-01-01

    Diffusion tensor imaging (DTI) provides important information on the structure of white matter fiber bundles as well as detailed tissue properties along these fiber bundles in vivo. This paper presents a functional regression framework, called FRATS, for the analysis of multiple diffusion properties along fiber bundles as functions in an infinite-dimensional space, and of their association with a set of covariates of interest, such as age, diagnostic status and gender, in real applications. The functional regression framework consists of four integrated components: the local polynomial kernel method for smoothing multiple diffusion properties along individual fiber bundles, a functional linear model for characterizing the association between fiber bundle diffusion properties and a set of covariates, a global test statistic for testing hypotheses of interest, and a resampling method for approximating the p-value of the global test statistic. The proposed methodology is applied to characterizing the development of five diffusion properties, including fractional anisotropy, mean diffusivity, and the three eigenvalues of the diffusion tensor, along the splenium of the corpus callosum tract and the right internal capsule tract in a clinical study of neurodevelopment. Significant age and gestational age effects on the five diffusion properties were found in both tracts. The resulting analysis pipeline can be used for understanding normal brain development, the neural bases of neuropsychiatric disorders, and the joint effects of environmental and genetic factors on white matter fiber bundles. PMID:20335089

  3. Statistical Analysis of Cardiovascular Data from FAP

    NASA Technical Reports Server (NTRS)

    Sealey, Meghan

    2016-01-01

    pressure, etc.) to see which could best predict how long the subjects could tolerate the tilt tests. With this, I plan to analyze an artificial gravity study in order to determine the effects of orthostatic intolerance during spaceflight. From these projects, I became proficient in using the statistical software Stata, which I had never used before. I learned new statistical methods, such as mixed-effects linear regression, maximum likelihood estimation on longitudinal data, and post-model-fitting tests to see whether certain parameters contribute significantly to the model, all of which will deepen my understanding as I continue studying for my master's degree. I was also able to demonstrate my knowledge of statistics by helping other students run statistical analyses for their own projects. The experience and knowledge gained from completing this analysis exemplify the type of work that I would like to pursue in the future. After completing my master's degree, I plan to pursue a career in biostatistics, which is exactly the position in which I interned, and I plan to use this experience to contribute to that goal.

  4. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  5. R: a statistical environment for hydrological analysis

    NASA Astrophysics Data System (ADS)

    Zambrano-Bigiarini, Mauricio; Bellin, Alberto

    2010-05-01

    "R", the free software environment for statistical computing and graphics, is developed and maintained by statistical programmers, with the support of an increasing community of users with many different backgrounds, which allows access to both well-established and experimental techniques. Hydrological modelling practitioners spend a large amount of time pre- and post-processing data and results with traditional instruments. In this work "R" and some of its packages are presented as powerful tools to explore and extract patterns from raw information, to pre-process input data of hydrological models, and to post-process their results. In particular, examples are taken from the analysis of 30 years of daily data for a basin of 85000 km2, saving a large amount of time that could be better spent in doing analysis. In doing so, vectorial and raster GIS files were imported to carry out spatial and geostatistical analyses. Thousands of raw text files with time series of precipitation, temperature and streamflow were summarized and organized. Gauging stations to be used in the modelling process were selected according to the number of days with information, and missing time series data were filled in using spatial interpolation. Time series at the gauging stations were summarized through daily, monthly and annual plots. Input files in dBase format were automatically created in a batch process. Results of a hydrological model were compared with observed values through plots and numerical goodness-of-fit indexes. Two packages specifically developed to assist hydrologists in the previous tasks are briefly presented. In the end, we think the "R" environment would be a valuable tool to support undergraduate and graduate education in hydrology, because it is helpful for capturing the main features of large amounts of data; it is a flexible and fully functional programming language, able to be interfaced to existing Fortran and C code and well suited to the ever-growing demands

  6. Statistical methods to assess the reliability of measurements in the procedures for forensic age estimation.

    PubMed

    Ferrante, L; Cameriere, R

    2009-07-01

    In forensic science, anthropology, and archaeology, several techniques have been developed to estimate chronological age in both children and adults, using the relationship between age and morphological changes in the structure of teeth. Before implementing a statistical model to describe age as a function of the measured morphological variables, the reliability of the measurements of these variables must be evaluated using suitable statistical methods. This paper introduces some commonly used statistical methods for assessing the reliability of measurements in procedures for age estimation in the forensic field. The use of the concordance correlation coefficient and the intraclass correlation coefficient is explained. Finally, some pitfalls in the choice of statistical methods for assessing the reliability of measurements in age estimation are discussed.
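    One of the coefficients named above, Lin's concordance correlation coefficient, is easy to compute directly; the two raters' measurements below are invented for the sketch:

        import numpy as np

        def lins_ccc(x, y):
            x, y = np.asarray(x, float), np.asarray(y, float)
            mx, my = x.mean(), y.mean()
            sxy = np.mean((x - mx) * (y - my))
            # CCC = 2*s_xy / (s_x^2 + s_y^2 + (mean difference)^2)
            return 2.0 * sxy / (x.var() + y.var() + (mx - my) ** 2)

        rater1 = [2.1, 2.4, 3.0, 2.8, 3.5, 2.2]
        rater2 = [2.0, 2.5, 2.9, 2.9, 3.4, 2.3]
        print(f"CCC = {lins_ccc(rater1, rater2):.3f}")   # 1.0 = perfect concordance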

  7. Statistical analysis of regulatory ecotoxicity tests.

    PubMed

    Isnard, P; Flammarion, P; Roman, G; Babut, M; Bastien, P; Bintein, S; Esserméant, L; Férard, J F; Gallotti-Schmitt, S; Saouter, E; Saroli, M; Thiébaud, H; Tomassone, R; Vindimian, E

    2001-11-01

    ANOVA-type data analysis, i.e., determination of lowest-observed-effect concentrations (LOECs) and no-observed-effect concentrations (NOECs), has been widely used for the statistical analysis of chronic ecotoxicity data. However, it is increasingly criticised for several reasons, the most important probably being that the NOEC depends on the choice of test concentrations and the number of replications, and rewards poor experiments, i.e., high variability, with high NOEC values. Thus, a recent OECD workshop concluded that the use of the NOEC should be phased out and that a regression-based estimation procedure should be used. Following this workshop, a working group was established at the French level between government, academia and industry representatives. Twenty-seven sets of chronic data (algae, daphnia, fish) were collected and analysed by ANOVA and regression procedures. Several regression models were compared, and relations between NOECs and ECx, for different values of x, were established in order to find an alternative summary parameter to the NOEC. Biological arguments are scarce to help in defining a negligible level of effect x for the ECx. With regard to their use in risk assessment procedures, a convenient methodology would be to choose x so that ECx are on average similar to the present NOEC. This would lead to no major change in the risk assessment procedure. However, experimental data show that the ECx depend on the regression models and that their accuracy decreases in the low-effect zone. This disadvantage could probably be reduced by adapting existing experimental protocols, but it could mean more experimental effort and higher cost. ECx (derived with existing test guidelines, e.g., regarding the number of replicates) whose lowest bounds of the confidence interval are on average similar to the present NOEC would improve this approach by a priori encouraging more precise experiments. However, narrow confidence intervals are not only
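    The regression-based alternative to the NOEC can be sketched by fitting a concentration-response curve and reading off an ECx; the log-logistic form and data below are illustrative assumptions, not the working group's model:

        import numpy as np
        from scipy.optimize import curve_fit

        def log_logistic(c, ec50, slope):
            """Response falling from 1 (control) towards 0 with concentration c."""
            return 1.0 / (1.0 + (c / ec50) ** slope)

        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
        resp = np.array([0.98, 0.95, 0.84, 0.52, 0.18, 0.04])  # fraction of control

        (ec50, slope), _ = curve_fit(log_logistic, conc, resp, p0=[3.0, 1.0])

        x = 10  # percent effect
        ecx = ec50 * (x / (100.0 - x)) ** (1.0 / slope)
        print(f"EC50 = {ec50:.2f}, EC{x} = {ecx:.2f}")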

  8. Calculating summary statistics for population chemical biomonitoring in women of childbearing age with adjustment for age-specific natality.

    PubMed

    Axelrad, Daniel A; Cohen, Jonathan

    2011-01-01

    The effects of chemical exposures during pregnancy on children's health have been an increasing focus of environmental health research in recent years, leading to greater interest in biomonitoring of chemicals in women of childbearing age in the general population. Measurements of mercury in blood from the National Health and Nutrition Examination Survey are frequently reported for "women of childbearing age," defined to be of ages 16-49 years. The intent is to represent prenatal chemical exposure, but blood mercury levels increase with age. Furthermore, women of different ages have different probabilities of giving birth. We evaluated options to address potential bias in biomonitoring summary statistics for women of childbearing age by accounting for age-specific probabilities of giving birth. We calculated median and 95th percentile levels of mercury, PCBs, and cotinine using these approaches: option 1: women aged 16-49 years without natality adjustment; option 2: women aged 16-39 years without natality adjustment; option 3: women aged 16-49 years, adjusted for natality by age; option 4: women aged 16-49 years, adjusted for natality by age and race/ethnicity. Among the three chemicals examined, the choice of option has the greatest impact on estimated levels of serum PCBs, which are strongly associated with age. Serum cotinine levels among Black non-Hispanic women of childbearing age are understated when age-specific natality is not considered. For characterizing in utero exposures, adjustment using age-specific natality provides a substantial improvement in estimation of biomonitoring summary statistics. PMID:21035114
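    The natality adjustment (option 3) amounts to computing weighted summary statistics, with each woman's value weighted by the age-specific probability of giving birth. A sketch with invented values and weights:

        import numpy as np

        def weighted_percentile(values, weights, q):
            """q in [0, 100]; one simple interpolation convention among several."""
            order = np.argsort(values)
            v = np.asarray(values, float)[order]
            w = np.asarray(weights, float)[order]
            cdf = np.cumsum(w) / np.sum(w)
            return np.interp(q / 100.0, cdf, v)

        mercury = np.array([0.3, 0.5, 0.9, 1.2, 2.1, 3.4])        # ug/L, illustrative
        natality = np.array([0.02, 0.09, 0.11, 0.08, 0.03, 0.01]) # birth prob. by age

        print(weighted_percentile(mercury, natality, 50))  # natality-adjusted median
        print(weighted_percentile(mercury, natality, 95))  # adjusted 95th percentile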

  9. Statistical Analysis of Nondisjunction Assays in Drosophila

    PubMed Central

    Zeng, Yong; Li, Hua; Schweppe, Nicole M.; Hawley, R. Scott; Gilliland, William D.

    2010-01-01

    Many advances in the understanding of meiosis have been made by measuring how often errors in chromosome segregation occur. This process of nondisjunction can be studied by counting experimental progeny, but direct measurement of nondisjunction rates is complicated by not all classes of nondisjunctional progeny being viable. For X chromosome nondisjunction in Drosophila female meiosis, all of the normal progeny survive, while nondisjunctional eggs produce viable progeny only if fertilized by sperm that carry the appropriate sex chromosome. The rate of nondisjunction has traditionally been estimated by assuming a binomial process and doubling the number of observed nondisjunctional progeny, to account for the inviable classes. However, the correct way to derive statistics (such as confidence intervals or hypothesis testing) by this approach is far from clear. Instead, we use the multinomial-Poisson hierarchy model and demonstrate that the old estimator is in fact the maximum-likelihood estimator (MLE). Under more general assumptions, we derive asymptotic normality of this estimator and construct confidence interval and hypothesis testing formulae. Confidence intervals under this framework are always larger than under the binomial framework, and application to published data shows that use of the multinomial approach can avoid an apparent type 1 error made by use of the binomial assumption. The current study provides guidance for researchers designing genetic experiments on nondisjunction and improves several methods for the analysis of genetic data. PMID:20660647
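    A sketch of the traditional doubled-count point estimate (exceptional progeny are doubled because half of the nondisjunctional classes are inviable), paired with a simple parametric-bootstrap interval; this does not reproduce the paper's closed-form multinomial-Poisson intervals, and the counts are invented:

        import numpy as np

        def nd_rate(regular, exceptional):
            return 2 * exceptional / (regular + 2 * exceptional)

        def bootstrap_ci(regular, exceptional, n_boot=10_000, seed=0):
            rng = np.random.default_rng(seed)
            p = nd_rate(regular, exceptional)
            n = regular + 2 * exceptional            # adjusted total progeny
            rates = rng.binomial(n, p, size=n_boot) / n
            return np.percentile(rates, [2.5, 97.5])

        print(nd_rate(980, 10), bootstrap_ci(980, 10))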

  10. Statistical approach to partial equilibrium analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, named willingness price, is highlighted and constitutes the whole theory. The supply and demand functions are formulated as the distributions of corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of excess demand function are analyzed and the necessary conditions for the existence and uniqueness of equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.

  11. Statistical energy analysis of nonlinear vibrating systems.

    PubMed

    Spelman, G M; Langley, R S

    2015-09-28

    Nonlinearities in practical systems can arise in contacts between components, possibly from friction or impacts. However, it is also known that quadratic and cubic nonlinearity can occur in the stiffness of structural elements undergoing large amplitude vibration, without the need for local contacts. Nonlinearity due purely to large amplitude vibration can then result in significant energy being found in frequency bands other than those being driven by external forces. To analyse this phenomenon, a method is developed here in which the response of the structure in the frequency domain is divided into frequency bands, and the energy flow between the frequency bands is calculated. The frequency bands are assigned an energy variable to describe the mean response and the nonlinear coupling between bands is described in terms of weighted summations of the convolutions of linear modal transfer functions. This represents a nonlinear extension to an established linear theory known as statistical energy analysis (SEA). The nonlinear extension to SEA theory is presented for the case of a plate structure with quadratic and cubic nonlinearity. PMID:26303923

  12. Refining Martian Ages and Understanding Geological Processes From Cratering Statistics

    NASA Technical Reports Server (NTRS)

    Hartmann, William K.

    2005-01-01

    Senior Scientist William K. Hartmann presents his final report on Mars Data Analysis Program grant NAG5-12217. The third year of the three-year program was recently completed in mid-2005. The program has been extremely productive in research and data analysis regarding Mars, especially using Mars Global Surveyor and Mars Odyssey imagery. In 2005 alone, three papers to which this work contributed have already been published. 1) Hartmann, W. K. 2005. Martian cratering 8: Isochron refinement and the history of Martian geologic activity. Icarus 174, 294-320. This paper is a summary of my entire program of establishing Martian chronology through counts of Martian impact craters. 2) Arfstrom, John, and W. K. Hartmann 2005. Martian flow features, moraine-like ridges, and gullies: Terrestrial analogs and interrelationships. Icarus 174, 321-335. This paper makes pioneering connections between Martian glacier-like features and terrestrial glacial features. 3) Hartmann, W. K., D. Winterhalter, and J. Geiss. 2005. Chronology and Physical Evolution of Planet Mars. In The Solar System and Beyond: Ten Years of ISSI (Bern: International Space Science Institute). This is a summary of work conducted at the International Space Science Institute with an international team, emphasizing our publication of a conference volume about Mars, edited by Hartmann and published in 2001.

  13. Web-Based Statistical Sampling and Analysis

    ERIC Educational Resources Information Center

    Quinn, Anne; Larson, Karen

    2016-01-01

    Consistent with the Common Core State Standards for Mathematics (CCSSI 2010), the authors write that they have asked students to do statistics projects with real data. To obtain real data, their students use the free Web-based app, Census at School, created by the American Statistical Association (ASA) to help promote civic awareness among school…

  14. Multivariate statistical analysis of environmental monitoring data

    SciTech Connect

    Ross, D.L.

    1997-11-01

    EPA requires statistical procedures to determine whether soil or ground water adjacent to or below waste units is contaminated. These statistical procedures are often based on comparisons between two sets of data: one representing background conditions, and one representing site conditions. Since statistical requirements were originally promulgated in the 1980s, EPA has made several improvements and modifications. There are, however, problems which remain. One problem is that the regulations do not require a minimum probability that contaminated sites will be correctly identified. Another problem is that the effect of testing several correlated constituents on the probable outcome of the statistical tests has not been quantified. Results from computer simulations to determine power functions for realistic monitoring situations are presented here. Power functions for two different statistical procedures, the Student's t-test and the multivariate Hotelling's T² test, are compared. The comparisons indicate that the multivariate test is often more powerful when the tests are applied with significance levels to control the probability of falsely identifying clean sites as contaminated. This program could also be used to verify that statistical procedures achieve some minimum power standard at a regulated waste unit.

  15. [Statistical models for spatial analysis in parasitology].

    PubMed

    Biggeri, A; Catelan, D; Dreassi, E; Lagazio, C; Cringoli, G

    2004-06-01

    The simplest way to study the spatial pattern of a disease is the geographical representation of its cases (or some indicators of them) over a map. Maps based on raw data are generally "wrong" since they do not take sampling errors into consideration. Indeed, the observed differences between areas (or points on the map) are not directly interpretable, as they derive from the composition of true, structural differences and of the noise deriving from the sampling process. This problem is well known in human epidemiology, and several solutions have been proposed to filter the signal from the noise. These statistical methods are usually referred to as disease mapping. In geographical analysis a first goal is to evaluate the statistical significance of the heterogeneity between areas (or points). If the test indicates rejection of the hypothesis of homogeneity, the following task is to study the spatial pattern of the disease. The spatial variability of risk is usually decomposed into two terms: a spatially structured (clustering) term and a non-spatially structured (heterogeneity) one. The heterogeneity term reflects spatial variability due to intrinsic characteristics of the sampling units (e.g. hygienic conditions of farms), while the clustering term models the association due to proximity between sampling units, which usually depends on ecological conditions that vary over the study area and that affect in a similar way breedings that are close to each other. Hierarchical Bayesian models are the main tool for making inference about the clustering and heterogeneity components. The results are based on the marginal posterior distributions of the parameters of the model, which are approximated by Markov chain Monte Carlo methods. Different models can be defined depending on the terms that are considered, namely a model with only the clustering term, a model with only the heterogeneity term, and a model where both are included. Model selection criteria based on a compromise between
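    Not the hierarchical Bayesian model described above, but a toy empirical Bayes (Poisson-gamma) smoother illustrates the underlying idea of filtering sampling noise: each area's raw rate is shrunk toward the overall mean, more strongly where the sample is small. Counts and exposures are invented:

        import numpy as np

        cases = np.array([2, 0, 15, 4, 7])           # observed cases per area
        exposure = np.array([50, 20, 400, 90, 150])  # animals tested per area

        raw = cases / exposure
        overall = cases.sum() / exposure.sum()

        # Method-of-moments gamma prior: mean m = a/b, variance v = a/b**2.
        var_raw = np.average((raw - overall) ** 2, weights=exposure)
        v = max(var_raw - overall / exposure.mean(), 1e-12)  # between-area variance
        b = overall / v
        a = overall * b

        smoothed = (cases + a) / (exposure + b)      # posterior mean rates
        print(np.round(raw, 4), np.round(smoothed, 4))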

  16. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  17. Statistical Analysis of Refractivity in UAE

    NASA Astrophysics Data System (ADS)

    Al-Ansari, Kifah; Al-Mal, Abdulhadi Abu; Kamel, Rami

    2007-07-01

    This paper presents the results of the refractivity statistics in the UAE (United Arab Emirates) for a period of 14 years (1990-2003). Six sites have been considered using meteorological surface data (Abu Dhabi, Dubai, Sharjah, Al-Ain, Ras Al-Kaimah, and Al-Fujairah). Upper air (radiosonde) data were available at one site only, Abu Dhabi airport, which has been considered for the refractivity gradient statistics. Monthly and yearly averages are obtained for the two parameters, refractivity and refractivity gradient. Cumulative distributions are also provided.
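    Such statistics are computed from the standard surface-refractivity relation (the ITU-R P.453 form); a one-line implementation, with illustrative inputs:

        def refractivity(pressure_hpa, temp_k, vapour_pressure_hpa):
            """N-units: N = (77.6 / T) * (P + 4810 * e / T)."""
            return 77.6 / temp_k * (pressure_hpa + 4810.0 * vapour_pressure_hpa / temp_k)

        print(refractivity(1010.0, 305.0, 30.0))  # hot, humid surface conditions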

  18. Measurement of Plethysmogram and Statistical Method for Analysis

    NASA Astrophysics Data System (ADS)

    Shimizu, Toshihiro

    The plethysmogram is measured at different points of the human body by using a photo interrupter, and it depends sensitively on the physical and mental state of the body. In this paper statistical methods of data analysis are investigated to discuss the dependence of the plethysmogram on stress and aging. The first is a representation method based on the return map, which provides useful information about the waveform, the fluctuation in phase and the fluctuation in amplitude. The return map method makes it possible to understand the fluctuation of the plethysmogram in amplitude and in phase more clearly and globally than the conventional power spectrum method. The second is the Lissajous plot and the correlation function, used to analyze the phase difference between the plethysmograms of the right finger tip and of the left finger tip. The third is the R-index, from which we can estimate "the age of the blood flow". The R-index is defined by the global character of the plethysmogram, which is different from the usual APG-index. The stress- and age-dependence of the plethysmogram is discussed by using these methods.
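    A minimal return-map construction (x(k) against x(k+tau)) of the kind the paper uses to visualize amplitude and phase fluctuations; the signal is a synthetic stand-in for a plethysmogram, and the lag is an arbitrary choice:

        import numpy as np
        import matplotlib.pyplot as plt

        t = np.linspace(0, 20, 2000)
        rng = np.random.default_rng(0)
        signal = np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.normal(size=t.size)

        tau = 25   # lag in samples
        plt.plot(signal[:-tau], signal[tau:], ".", markersize=2)
        plt.xlabel("x(k)")
        plt.ylabel(f"x(k+{tau})")
        plt.show()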

  19. Common misconceptions about data analysis and statistics.

    PubMed

    Motulsky, Harvey J

    2015-02-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: (1) P-Hacking. This is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want. (2) Overemphasis on P values rather than on the actual size of the observed effect. (3) Overuse of statistical hypothesis testing, and being seduced by the word "significant". (4) Overreliance on standard errors, which are often misunderstood. PMID:25692012

  20. Common misconceptions about data analysis and statistics.

    PubMed

    Motulsky, Harvey J

    2014-11-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: 1. P-Hacking. This is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want. 2. Overemphasis on P values rather than on the actual size of the observed effect. 3. Overuse of statistical hypothesis testing, and being seduced by the word "significant". 4. Overreliance on standard errors, which are often misunderstood. PMID:25213136

  1. Common misconceptions about data analysis and statistics.

    PubMed

    Motulsky, Harvey J

    2014-10-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, however, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: 1) P-hacking, which is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want; 2) overemphasis on P values rather than on the actual size of the observed effect; 3) overuse of statistical hypothesis testing, and being seduced by the word "significant"; and 4) over-reliance on standard errors, which are often misunderstood. PMID:25204545

  5. Practical Issues in Component Aging Analysis

    SciTech Connect

    Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel

    2008-09-01

    This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair-same-as-new, leading to a renewal process, and repair-same-as-old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process. The common distribution of the times between failures is called the renewal distribution. Under the second assumption, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues that are discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
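    One simple trend check of the kind alluded to above is the Laplace test: under a homogeneous Poisson process the statistic is approximately standard normal, so a clearly positive value points to deteriorating times between failures (repair-same-as-old/NHPP more plausible), while a value near zero is consistent with a renewal-like process. The failure times are invented:

        import math

        def laplace_trend(failure_times, observation_end):
            """Laplace trend statistic for a time-truncated observation window."""
            n = len(failure_times)
            u = sum(failure_times) / n - observation_end / 2.0
            return u / (observation_end * math.sqrt(1.0 / (12.0 * n)))

        times = [180.0, 420.0, 690.0, 850.0, 970.0]  # cumulative failure times (h)
        print(f"Laplace U = {laplace_trend(times, 1000.0):.2f}")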

  6. Critical analysis of adsorption data statistically

    NASA Astrophysics Data System (ADS)

    Kaushal, Achla; Singh, S. K.

    2016-09-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test and chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of calculated and tabulated values of t and χ² showed the results in favour of the data collected from the experiment, and this has been shown on probability charts. The K value for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725; both are <1, indicating favourable isotherms. Karl Pearson's correlation coefficient values for the Langmuir and Freundlich adsorption isotherms were 0.99 and 0.95 respectively, which show a high degree of correlation between the variables. This validates the data obtained for adsorption of zinc ions from the contaminated aqueous solution with the help of mango leaf powder.

  7. GROUNDWATER INFORMATION TRACKING SYSTEM/STATISTICAL ANALYSIS SYSTEM

    EPA Science Inventory

    The Groundwater Information Tracking System with STATistical analysis capability (GRITS/STAT) is a tool designed to facilitate the storage, analysis, and reporting of data collected through groundwater monitoring programs at RCRA, CERCLA, and other regulated facilities an...

  8. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction with the Monte Carlo method due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5 with additional stochastic input of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method based on the probability density function of each nuclide composition. The automatic passing of the stochastic input to MCNP and the repeated criticality calculation are made possible by using a Python script to link MCNP and our Latin hypercube sampling code.
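    The sampling step can be sketched with SciPy's quasi-Monte Carlo module: draw Latin hypercube samples and map them through each nuclide's distribution before writing the MCNP decks. The normal distributions and the 1% relative uncertainty are hypothetical choices for the sketch, not values from the paper:

        import numpy as np
        from scipy.stats import qmc, norm

        nominal = np.array([0.0072, 0.9928])   # e.g. U-235/U-238 fractions (illustrative)
        rel_sd = 0.01                          # assumed 1% relative uncertainty

        sampler = qmc.LatinHypercube(d=len(nominal), seed=42)
        u = sampler.random(n=100)              # 100 stratified samples in [0, 1)^d
        compositions = norm.ppf(u, loc=nominal, scale=rel_sd * nominal)
        # Each row of `compositions` would become one MCNP input deck.
        print(compositions[:3])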

  9. Computer based statistical study of cartography in mortality upto age of one year.

    PubMed

    Bansal, A K; Indrayan, A

    1993-10-01

    Present cartography procedures for quantitative indicators are arbitrary in the choice of the number of categories into which a particular area is to be divided. The choice of the initial cutoff and of the width of each category is also arbitrary. To remove this arbitrariness, and thus to introduce objectivity, we propose the use of a statistical procedure called cluster analysis. This procedure is easy to use on a computer. We also propose using computer-based maps. We use these methods on mortality indicators up to the age of one year for the major states of India to devise objective maps. The terminology of mortality indicators up to the age of one year follows the UNICEF document (1). The mortality indicators analysed are the infant mortality rate, neonatal mortality rate, postneonatal mortality rate, perinatal mortality rate and stillbirth rate. Different indicators reveal different pictures. In this paper, we also propose an innovation to obtain an integrated picture by simultaneously considering all four indicators in a multivariate setting. Such mapping could help health managers and planners devise more effective strategies to control child mortality.
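    The clustering step can be as simple as k-means on the state-level indicators, with the resulting labels defining the map categories; the infant mortality values below are invented placeholders, not Indian state data:

        import numpy as np
        from sklearn.cluster import KMeans

        imr = np.array([[112.0], [97.0], [58.0], [61.0], [30.0], [86.0], [42.0]])
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(imr)
        print(labels)   # states sharing a label get the same shade on the map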

  10. Statistical analysis of Contact Angle Hysteresis

    NASA Astrophysics Data System (ADS)

    Janardan, Nachiketa; Panchagnula, Mahesh

    2015-11-01

    We present the results of a new statistical approach to determining Contact Angle Hysteresis (CAH) by studying the nature of the triple line. A statistical distribution of local contact angles on a random three-dimensional drop is used as the basis for this approach. Drops with randomly shaped triple lines but of fixed volumes were deposited on a substrate and their triple line shapes were extracted by imaging. Using a solution developed by Prabhala et al. (Langmuir, 2010), the complete three dimensional shape of the sessile drop was generated. A distribution of the local contact angles for several such drops but of the same liquid-substrate pairs is generated. This distribution is a result of several microscopic advancing and receding processes along the triple line. This distribution is used to yield an approximation of the CAH associated with the substrate. This is then compared with measurements of CAH by means of a liquid infusion-withdrawal experiment. Static measurements are shown to be sufficient to measure quasistatic contact angle hysteresis of a substrate. The approach also points towards the relationship between microscopic triple line contortions and CAH.

  11. Statistics over features: EEG signals analysis.

    PubMed

    Derya Ubeyli, Elif

    2009-08-01

    This paper presents the use of statistics over the set of features representing electroencephalogram (EEG) signals. Since classification is more accurate when the pattern is simplified through representation by important features, feature extraction and selection play an important role in classifying systems such as neural networks. Multilayer perceptron neural network (MLPNN) architectures were formulated and used as the basis for detection of electroencephalographic changes. Three types of EEG signals (EEG signals recorded from healthy volunteers with eyes open, epilepsy patients in the epileptogenic zone during a seizure-free interval, and epilepsy patients during epileptic seizures) were classified. The selected Lyapunov exponents, wavelet coefficients and power levels of the power spectral density (PSD) values of the EEG signals, obtained by eigenvector methods, were used as inputs to the MLPNN trained with the Levenberg-Marquardt algorithm. The classification results confirmed that the proposed MLPNN has potential in detecting electroencephalographic changes. PMID:19555931

  12. Statistical analysis of low level atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Tieleman, H. W.; Chen, W. W. L.

    1974-01-01

    The statistical properties of low-level wind-turbulence data were obtained with the model 1080 total vector anemometer and the model 1296 dual split-film anemometer, both manufactured by Thermo Systems Incorporated. The data obtained from the above fast-response probes were compared with the results obtained from a pair of Gill propeller anemometers. The digitized time series representing the three velocity components and the temperature were each divided into a number of blocks, the length of which depended on the lowest frequency of interest and also on the storage capacity of the available computer. A moving-average and differencing high-pass filter was used to remove the trend and the low frequency components in the time series. The calculated results for each of the anemometers used are represented in graphical or tabulated form.

  13. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  14. Comparative analysis of positive and negative attitudes toward statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

    Many statistics lecturers and statistics education researchers are interested in their students' attitudes toward statistics during a statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content, whereas students with negative attitudes toward statistics may feel depressed, especially in group assignments, are at risk of failure, are often highly emotional, and may struggle to move forward. This study therefore investigates students' attitudes towards learning statistics. Six latent constructs were used to measure students' attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Toward Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP); the respondents were students from different faculties taking the applied statistics course. The analysis found the questionnaire acceptable, and the proposed relationships among the constructs were investigated. Students showed full effort to master the statistics course, found the course enjoyable, were confident in their intellectual capacity, and held more positive than negative attitudes towards learning statistics. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.

  15. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools make it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  16. Statistical analysis of the 'Almagest' star catalog

    NASA Astrophysics Data System (ADS)

    Kalashnikov, V. V.; Nosovskii, G. V.; Fomenko, A. T.

    The star catalog contained in the 'Almagest', Ptolemy's classical work of astronomy, is examined. An analysis method is proposed which allows the identification of various types of errors committed by the observer. This method not only removes many of the contradictions contained in the catalog but also makes it possible to determine the time period during which the catalog was compiled.

  17. Improved Statistics for Genome-Wide Interaction Analysis

    PubMed Central

    Ueki, Masao; Cordell, Heather J.

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new “joint effects” statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al
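
    As a concrete baseline, the sketch below runs the classical logistic-regression interaction test that the statistics above are compared against; it is not Wu et al.'s statistic nor the authors' adjusted versions, and the case/control genotype data are simulated.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 2000
    g1 = rng.integers(0, 3, size=n)            # allele count at locus 1
    g2 = rng.integers(0, 3, size=n)            # allele count at locus 2
    logit = -1.0 + 0.2 * g1 + 0.2 * g2 + 0.3 * g1 * g2
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

    X = sm.add_constant(np.column_stack([g1, g2, g1 * g2]))
    fit = sm.Logit(y, X).fit(disp=False)
    print("interaction term p-value:", fit.pvalues[3])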

  18. Improved statistics for genome-wide interaction analysis.

    PubMed

    Ueki, Masao; Cordell, Heather J

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new "joint effects" statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al

  19. Development and aging of superficial white matter myelin from young adulthood to old age: Mapping by vertex-based surface statistics (VBSS).

    PubMed

    Wu, Minjie; Kumar, Anand; Yang, Shaolin

    2016-05-01

    Superficial white matter (SWM) lies immediately beneath cortical gray matter and consists primarily of short association fibers. The characteristics of SWM and its development and aging were seldom examined in the literature and warrant further investigation. Magnetization transfer imaging is sensitive to myelin changes in the white matter. Using an innovative multimodal imaging analysis approach, vertex-based surface statistics (VBSS), the current study vertexwise mapped age-related changes of magnetization transfer ratio (MTR) in SWM from young adulthood to old age (30-85 years, N = 66). Results demonstrated regionally selective and temporally heterochronologic changes of SWM MTR with age, including (1) inverted U-shaped trajectories of SWM MTR in the rostral middle frontal, medial temporal, and temporoparietal regions, suggesting continuing myelination and protracted maturation till age 40-50 years and accelerating demyelination at age 60 and beyond, (2) linear decline of SWM MTR in the middle and superior temporal, and pericalcarine areas, indicating early maturation and less acceleration in age-related degeneration, and (3) no significant changes of SWM MTR in the primary motor, somatosensory and auditory regions, suggesting resistance to age-related deterioration. We did not observe similar patterns of changes in cortical thickness in our sample, suggesting the observed SWM MTR changes are not due to cortical atrophy. Hum Brain Mapp 37:1759-1769, 2016. © 2016 Wiley Periodicals, Inc.

  20. Development and aging of superficial white matter myelin from young adulthood to old age: Mapping by vertex-based surface statistics (VBSS).

    PubMed

    Wu, Minjie; Kumar, Anand; Yang, Shaolin

    2016-05-01

    Superficial white matter (SWM) lies immediately beneath cortical gray matter and consists primarily of short association fibers. The characteristics of SWM and its development and aging were seldom examined in the literature and warrant further investigation. Magnetization transfer imaging is sensitive to myelin changes in the white matter. Using an innovative multimodal imaging analysis approach, vertex-based surface statistics (VBSS), the current study vertexwise mapped age-related changes of magnetization transfer ratio (MTR) in SWM from young adulthood to old age (30-85 years, N = 66). Results demonstrated regionally selective and temporally heterochronologic changes of SWM MTR with age, including (1) inverted U-shaped trajectories of SWM MTR in the rostral middle frontal, medial temporal, and temporoparietal regions, suggesting continuing myelination and protracted maturation till age 40-50 years and accelerating demyelination at age 60 and beyond, (2) linear decline of SWM MTR in the middle and superior temporal, and pericalcarine areas, indicating early maturation and less acceleration in age-related degeneration, and (3) no significant changes of SWM MTR in the primary motor, somatosensory and auditory regions, suggesting resistance to age-related deterioration. We did not observe similar patterns of changes in cortical thickness in our sample, suggesting the observed SWM MTR changes are not due to cortical atrophy. Hum Brain Mapp 37:1759-1769, 2016. © 2016 Wiley Periodicals, Inc. PMID:26955787

  1. Importance of data management with statistical analysis set division.

    PubMed

    Wang, Ling; Li, Chan-juan; Jiang, Zhi-wei; Xia, Jie-lai

    2015-11-01

    Hypothesis testing is affected by the division of statistical analysis sets, an important data management task performed before database lock. Objective division of the statistical analysis sets under blinding is the guarantee of a scientific trial conclusion. All subjects who have received at least one trial treatment after randomization should be included in the safety set. The full analysis set should be as close to the intention-to-treat population as possible. The per-protocol set is the most difficult to control in blinded review because its division involves more subjectivity than the other two. The objectivity of statistical analysis set division must be guaranteed by accurate raw data, comprehensive data checking and scientific discussion, all of which are strict requirements of data management. Proper, objective and scientific division of the statistical analysis sets is an important way to improve data management quality. PMID:26911044

  2. Importance of data management with statistical analysis set division.

    PubMed

    Wang, Ling; Li, Chan-juan; Jiang, Zhi-wei; Xia, Jie-lai

    2015-11-01

    Hypothesis testing is affected by the division of statistical analysis sets, an important data management task performed before database lock. Objective division of the statistical analysis sets under blinding is the guarantee of a scientific trial conclusion. All subjects who have received at least one trial treatment after randomization should be included in the safety set. The full analysis set should be as close to the intention-to-treat population as possible. The per-protocol set is the most difficult to control in blinded review because its division involves more subjectivity than the other two. The objectivity of statistical analysis set division must be guaranteed by accurate raw data, comprehensive data checking and scientific discussion, all of which are strict requirements of data management. Proper, objective and scientific division of the statistical analysis sets is an important way to improve data management quality.

  3. Comparison of Statistical Population Reconstruction Using Full and Pooled Adult Age-Class Data

    PubMed Central

    Skalski, John R.; Millspaugh, Joshua J.; Clawson, Michael V.

    2012-01-01

    Background Age-at-harvest data are among the most commonly collected, yet neglected, demographic data gathered by wildlife agencies. Statistical population reconstruction techniques can use this information to estimate the abundance of wild populations over wide geographic areas and concurrently estimate recruitment, harvest, and natural survival rates. Although current reconstruction techniques use full age-class data (0.5, 1.5, 2.5, 3.5, … years), it is not always possible to determine an animal's age due to inaccuracy of the methods, expense, and logistics of sample collection. The ability to inventory wild populations would be greatly expanded if pooled adult age-class data (e.g., 0.5, 1.5, 2.5+ years) could be successfully used in statistical population reconstruction. Methodology/Principal Findings We investigated the performance of statistical population reconstruction models developed to analyze full age-class and pooled adult age-class data. We performed Monte Carlo simulations using a stochastic version of a Leslie matrix model, which generated data over a wide range of abundance levels, harvest rates, and natural survival probabilities, representing medium-to-big game species. Results of full age-class and pooled adult age-class population reconstructions were compared for accuracy and precision. No discernible difference in accuracy was detected, but precision was slightly reduced when using the pooled adult age-class reconstruction. On average, the coefficient of variation increased by 0.059 when the adult age-class data were pooled prior to analyses. The analyses and maximum likelihood model for pooled adult age-class reconstruction are illustrated for a black-tailed deer (Odocoileus hemionus) population in Washington State. Conclusions/Significance Inventorying wild populations is one of the greatest challenges of wildlife agencies. These new statistical population reconstruction models should expand the demographic capabilities of wildlife agencies.
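
    A minimal sketch of the kind of stochastic Leslie-matrix projection behind these simulations, generating age-at-harvest counts year by year; the fecundity, survival and harvest rates are invented for illustration and do not represent the study's parameter ranges.

    import numpy as np

    rng = np.random.default_rng(4)
    fecundity = np.array([0.0, 0.9, 1.2])   # offspring per female by age class
    survival = np.array([0.5, 0.7, 0.8])    # natural survival by age class
    harvest = 0.15                          # common harvest probability

    n = np.array([500, 300, 200])           # abundance: 0.5, 1.5, 2.5+ years
    for year in range(10):
        harvested = rng.binomial(n, harvest)              # age-at-harvest data
        survivors = rng.binomial(n - harvested, survival)
        births = rng.poisson(np.sum(fecundity * n))
        n = np.array([births, survivors[0], survivors[1] + survivors[2]])
        print(year, "harvest by age:", harvested, "total abundance:", n.sum())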

  4. Internet Data Analysis for the Undergraduate Statistics Curriculum

    ERIC Educational Resources Information Center

    Sanchez, Juana; He, Yan

    2005-01-01

    Statistics textbooks for undergraduates have not caught up with the enormous amount of analysis of Internet data that is taking place these days. Case studies that use Web server log data or Internet network traffic data are rare in undergraduate Statistics education. And yet these data provide numerous examples of skewed and bimodal…

  5. Guidelines for Statistical Analysis of Percentage of Syllables Stuttered Data

    ERIC Educational Resources Information Center

    Jones, Mark; Onslow, Mark; Packman, Ann; Gebski, Val

    2006-01-01

    Purpose: The purpose of this study was to develop guidelines for the statistical analysis of percentage of syllables stuttered (%SS) data in stuttering research. Method: Data on %SS from various independent sources were used to develop a statistical model to describe this type of data. On the basis of this model, %SS data were simulated with…

  6. A Realistic Experimental Design and Statistical Analysis Project

    ERIC Educational Resources Information Center

    Muske, Kenneth R.; Myers, John A.

    2007-01-01

    A realistic applied chemical engineering experimental design and statistical analysis project is documented in this article. This project has been implemented as part of the professional development and applied statistics courses at Villanova University over the past five years. The novel aspects of this project are that the students are given a…

  7. A statistical model including age to predict passenger postures in the rear seats of automobiles.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-06-01

    Few statistical models of rear seat passenger posture have been published, and none has taken into account the effects of occupant age. This study developed new statistical models for predicting passenger postures in the rear seats of automobiles. Postures of 89 adults with a wide range of age and body size were measured in a laboratory mock-up in seven seat configurations. Posture-prediction models for female and male passengers were separately developed by stepwise regression using age, body dimensions, seat configurations and two-way interactions as potential predictors. Passenger posture was significantly associated with age and the effects of other two-way interaction variables depended on age. A set of posture-prediction models are presented for women and men, and the prediction results are compared with previously published models. This study is the first study of passenger posture to include a large cohort of older passengers and the first to report a significant effect of age for adults. The presented models can be used to position computational and physical human models for vehicle design and assessment. Practitioner Summary: The significant effects of age, body dimensions and seat configuration on rear seat passenger posture were identified. The models can be used to accurately position computational human models or crash test dummies for older passengers in known rear seat configurations.

  8. A statistical model including age to predict passenger postures in the rear seats of automobiles.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-06-01

    Few statistical models of rear seat passenger posture have been published, and none has taken into account the effects of occupant age. This study developed new statistical models for predicting passenger postures in the rear seats of automobiles. Postures of 89 adults with a wide range of age and body size were measured in a laboratory mock-up in seven seat configurations. Posture-prediction models for female and male passengers were separately developed by stepwise regression using age, body dimensions, seat configurations and two-way interactions as potential predictors. Passenger posture was significantly associated with age and the effects of other two-way interaction variables depended on age. A set of posture-prediction models are presented for women and men, and the prediction results are compared with previously published models. This study is the first study of passenger posture to include a large cohort of older passengers and the first to report a significant effect of age for adults. The presented models can be used to position computational and physical human models for vehicle design and assessment. Practitioner Summary: The significant effects of age, body dimensions and seat configuration on rear seat passenger posture were identified. The models can be used to accurately position computational human models or crash test dummies for older passengers in known rear seat configurations. PMID:26328769

  9. Epigenetic age analysis of children who seem to evade aging.

    PubMed

    Walker, Richard F; Liu, Jia Sophie; Peters, Brock A; Ritz, Beate R; Wu, Timothy; Ophoff, Roel A; Horvath, Steve

    2015-05-01

    We previously reported the unusual case of a teenage girl stricken with multifocal developmental dysfunctions whose physical development was dramatically delayed resulting in her appearing to be a toddler or at best a preschooler, even unto the occasion of her death at the age of 20 years. Her life-long physician felt that the disorder was unique in the world and that future treatments for age-related diseases might emerge from its study. The objectives of our research were to determine if other such cases exist, and if so, whether aging is actually slowed. Of seven children characterized by dramatically slow developmental rates, five also had associated disorders displayed by the first case. All of the identified subjects were female. To objectively measure the age of blood tissue from these subjects, we used a highly accurate biomarker of aging known as "epigenetic clock" based on DNA methylation levels. No statistically significant differences in chronological and epigenetic ages were detected in any of the newly discovered cases.
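
    The comparison reported above can be sketched as a paired test of epigenetic against chronological age; the ages below are placeholders invented for illustration, not the study's measurements.

    import numpy as np
    from scipy import stats

    chronological = np.array([8.0, 10.5, 12.0, 15.5, 17.0, 20.0, 6.5])
    epigenetic = np.array([7.2, 11.0, 12.8, 14.9, 17.5, 19.2, 7.0])

    diff = epigenetic - chronological
    t, p = stats.ttest_rel(epigenetic, chronological)
    print(f"mean difference = {diff.mean():+.2f} years, p = {p:.3f}")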

  10. Epigenetic age analysis of children who seem to evade aging.

    PubMed

    Walker, Richard F; Liu, Jia Sophie; Peters, Brock A; Ritz, Beate R; Wu, Timothy; Ophoff, Roel A; Horvath, Steve

    2015-05-01

    We previously reported the unusual case of a teenage girl stricken with multifocal developmental dysfunctions whose physical development was dramatically delayed resulting in her appearing to be a toddler or at best a preschooler, even unto the occasion of her death at the age of 20 years. Her life-long physician felt that the disorder was unique in the world and that future treatments for age-related diseases might emerge from its study. The objectives of our research were to determine if other such cases exist, and if so, whether aging is actually slowed. Of seven children characterized by dramatically slow developmental rates, five also had associated disorders displayed by the first case. All of the identified subjects were female. To objectively measure the age of blood tissue from these subjects, we used a highly accurate biomarker of aging known as "epigenetic clock" based on DNA methylation levels. No statistically significant differences in chronological and epigenetic ages were detected in any of the newly discovered cases. PMID:25991677

  11. Absolute ages from crater statistics: Using radiometric ages of Martian samples for determining the Martian cratering chronology

    NASA Technical Reports Server (NTRS)

    Neukum, G.

    1988-01-01

    In the absence of dates derived from rock samples, impact crater frequencies are commonly used to date Martian surface units. All models for absolute dating rely on the lunar cratering chronology and on the validity of its extrapolation to Martian conditions. Starting from somewhat different lunar chronologies, rather different Martian cratering chronologies are found in the literature. Currently favored models are compared. The differences at old ages are significant; the differences at younger ages are considerable and give absolute ages for the same crater frequencies that differ by as much as a factor of 3. The total uncertainty could be much higher, though, since the ratio of lunar to Martian cratering rate, which is of basic importance in the models, is believed to be known no better than within a factor of 2. Thus, to understand the evolution of Mars and determine the sequence of events, it is of crucial importance to establish an unambiguous Martian cratering chronology from crater statistics in combination with clean radiometric ages of returned Martian samples. For the dating goal, rocks should be as pristine as possible, from a geologically simple area with a one-stage emplacement history of the local formation. A minimum of at least one highland site for old ages, two intermediate-aged sites, and one very young site is needed.

  12. Discriminatory power of game-related statistics in 14-15 year age group male volleyball, according to set.

    PubMed

    García-Hermoso, Antonio; Dávila-Romero, Carlos; Saavedra, Jose M

    2013-02-01

    This study compared volleyball game-related statistics by outcome (winners and losers of sets) and set number (total, initial, and last) to identify characteristics that discriminated game performance. Game-related statistics from 314 sets (44 matches) played by teams of male 14- to 15-year-olds in a regional volleyball championship (2011) were analysed. Differences between contexts (winning or losing teams) and set number (total, initial, and last) were assessed. A discriminant analysis was then performed according to outcome (winners and losers of sets) and set number (total, initial, and last). The results showed differences between winning and losing sets in several variables of Complex I (attack point and reception error) and Complex II (serve and aces). The game-related statistics that discriminated performance in the sets were the serve, positive reception, and attack point. These predictors of performance at ages when players are still learning could help coaches plan their training. PMID:23829141

  13. System statistical reliability model and analysis

    NASA Technical Reports Server (NTRS)

    Lekach, V. S.; Rood, H.

    1973-01-01

    A digital computer code was developed to simulate the time-dependent behavior of the 5-kwe reactor thermoelectric system. The code was used to determine lifetime sensitivity coefficients for a number of system design parameters, such as thermoelectric module efficiency and degradation rate, radiator absorptivity and emissivity, fuel element barrier defect constant, beginning-of-life reactivity, etc. A probability distribution (mean and standard deviation) was estimated for each of these design parameters. Then, error analysis was used to obtain a probability distribution for the system lifetime (mean = 7.7 years, standard deviation = 1.1 years). From this, the probability that the system will achieve the design goal of 5 years lifetime is 0.993. This value represents an estimate of the degradation reliability of the system.
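
    The quoted reliability figure can be checked directly: treating the system lifetime as normally distributed with the stated mean and standard deviation gives the probability of exceeding the 5-year design goal.

    from scipy.stats import norm

    p_goal = norm.sf(5.0, loc=7.7, scale=1.1)   # P(lifetime > 5 years)
    print(f"P(lifetime > 5 y) = {p_goal:.3f}")  # ~0.993, matching the text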

  14. Applications of statistics to medical science, IV survival analysis.

    PubMed

    Watanabe, Hiroshi

    2012-01-01

    The fundamental principles of survival analysis are reviewed. In particular, the Kaplan-Meier method and a proportional hazard model are discussed. This work is the last part of a series in which medical statistics are surveyed.
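
    A minimal implementation of the Kaplan-Meier estimator reviewed above, written out directly for right-censored data; the survival times and event flags are invented.

    import numpy as np

    def kaplan_meier(times, events):
        """Return (time, S(t)) pairs; events: 1 = event, 0 = censored."""
        times = np.asarray(times, dtype=float)
        events = np.asarray(events, dtype=bool)
        s, curve = 1.0, []
        for t in np.unique(times[events]):
            n_at_risk = np.sum(times >= t)
            d = np.sum((times == t) & events)
            s *= 1.0 - d / n_at_risk          # product-limit update
            curve.append((t, s))
        return curve

    times = [5, 8, 8, 12, 13, 18, 23, 30]
    events = [1, 1, 0, 1, 0, 1, 1, 0]
    for t, s in kaplan_meier(times, events):
        print(f"t = {t:4.0f}   S(t) = {s:.3f}")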

  15. [Statistical analysis using freely-available "EZR (Easy R)" software].

    PubMed

    Kanda, Yoshinobu

    2015-10-01

    Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing-risk analyses and the use of time-dependent covariates, by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and assure that the statistical process is overseen by a supervisor.

  16. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    ERIC Educational Resources Information Center

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…

  17. Statistical analysis of litter experiments in teratology

    SciTech Connect

    Williams, R.; Buschbom, R.L.

    1982-11-01

    Teratological data is binary response data (each fetus is either affected or not) in which the responses within a litter are usually not independent. As a result, the litter should be taken as the experimental unit. For each litter, its size, n, and the number of fetuses, x, possessing the effect of interest are recorded. The ratio p = x/n is then the basic data generated by the experiment. There are currently three general approaches to the analysis of teratological data: nonparametric, transformation followed by t-test or ANOVA, and parametric. The first two are currently in wide use by practitioners while the third is relatively new to the field. These first two also appear to possess comparable power levels while maintaining the nominal level of significance. When transformations are employed, care must be exercised to check that the transformed data has the required properties. Since the data is often highly asymmetric, there may be no transformation which renders the data nearly normal. The parametric procedures, including the beta-binomial model, offer the possibility of increased power.
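
    The nonparametric approach mentioned above can be sketched by taking the litter as the experimental unit and comparing the per-litter fractions p = x/n between groups with a rank-sum test; the litter data below are invented.

    import numpy as np
    from scipy import stats

    # (x affected fetuses, n fetuses) per litter
    control = np.array([(0, 11), (1, 9), (0, 12), (2, 10), (1, 8)])
    treated = np.array([(3, 10), (2, 9), (4, 12), (1, 7), (5, 11)])

    p_control = control[:, 0] / control[:, 1]
    p_treated = treated[:, 0] / treated[:, 1]
    u, p_value = stats.mannwhitneyu(p_treated, p_control,
                                    alternative="greater")
    print(f"Mann-Whitney U = {u}, one-sided p = {p_value:.4f}")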

  18. Basic statistical tools in research and data analysis

    PubMed Central

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis. PMID:27729694

  19. Diphenylamine and derivatives as predictors of gunpowder age by means of HPLC and statistical models.

    PubMed

    López-López, María; Bravo, J Carlos; García-Ruiz, Carmen; Torre, Mercedes

    2013-01-15

    The age of a gunpowder is information of great importance that could help to establish safety regulations for propellant use and handling. In this work, a forced aging treatment (65°C for 120 days) was applied to four gunpowders stabilized with diphenylamine (DPA). The evolution of the concentrations of DPA and its derivatives (N-nitroso-DPA, 2-nitro-DPA, 4-nitro-DPA, and 4,4'-dinitro-DPA) over time was monitored by High Performance Liquid Chromatography (HPLC). The variation of the peak areas of these compounds with time was used to construct different statistical models to predict gunpowder age. These models were validated using nitrocellulose-based gunpowders of known manufacture date. The models that best predicted gunpowder age gave prediction errors lower than 6, 4, and 2 years for single-base gunpowders with dinitrotoluene (≥ 10% (m/m)), single-base gunpowders, and double-base gunpowders, respectively.

  20. A Divergence Statistics Extension to VTK for Performance Analysis.

    SciTech Connect

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
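
    The engine described above is implemented in C++ inside VTK; purely as a conceptual sketch, the snippet below computes one divergence statistic of this kind, the Kullback-Leibler divergence between an observed histogram and a theoretical "ideal" distribution. The data are invented.

    import numpy as np

    def kl_divergence(p, q):
        """KL(p || q) for discrete distributions on a common support."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    observed = np.array([18.0, 22.0, 31.0, 19.0, 10.0])
    observed /= observed.sum()          # empirical distribution
    ideal = np.full(5, 0.2)             # theoretical "ideal": uniform
    print(f"KL(observed || ideal) = {kl_divergence(observed, ideal):.4f}")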

  1. Analysis of Coastal Dunes: A Remote Sensing and Statistical Approach.

    ERIC Educational Resources Information Center

    Jones, J. Richard

    1985-01-01

    Remote sensing analysis and statistical methods were used to analyze the coastal dunes of Plum Island, Massachusetts. The research methodology used provides an example of a student project for remote sensing, geomorphology, or spatial analysis courses at the university level. (RM)

  2. Analysis of Community College Faculty by Age.

    ERIC Educational Resources Information Center

    Connecticut Community Coll. System, Hartford.

    This document is based on research in the area of faculty age statistics based upon data about full time faculty in the Connecticut community college system. The report was done in response to interest expressed by the Chancellor and the Council of Presidents. The data was extracted from the Banner Human Resources Information System as of…

  3. Menopausal Hormone Therapy, Age, and Chronic Diseases: Perspectives on Statistical Trends

    PubMed Central

    2016-01-01

    The release of the Women’s Health Initiative (WHI) study in 2002 was a shock to the medical community. Hormone therapy (HT) had generally been considered to be highly beneficial for postmenopausal women since it was the gold standard for relief of menopausal symptoms (hot flashes, night sweats, vaginal atrophy) and it was thought to protect women from osteoporosis, heart disease, and cognitive decline and to generally improve quality of life. However, WHI showed a statistically significant increase in a number of disease states, including breast cancer, cardiovascular disease, and stroke. One problem with the WHI study was that the average age of women in the study was 63, which is considerably older than the age at which most women enter menopause (about 51). The timing hypothesis attempts to rationalize the effect of age on response to HT and risk of various diseases. The data suggests that younger women (50–60) may be protected from heart disease with only a slight increase in breast cancer risk. In contrast, older women (>65) are more susceptible to breast cancer and heart disease and should avoid HT. This Perspective on Statistical Trends evaluates the current data on HT and risk for chronic diseases as a function of age. PMID:27636306

  4. Statistical inference in behavior analysis: Friend or foe?

    PubMed Central

    Baron, Alan

    1999-01-01

    Behavior analysts are undecided about the proper role to be played by inferential statistics in behavioral research. The traditional view, as expressed in Sidman's Tactics of Scientific Research (1960), was that inferential statistics has no place within a science that focuses on the steady-state behavior of individual organisms. Despite this admonition, there have been steady inroads of statistical techniques into behavior analysis since then, as evidenced by publications in the Journal of the Experimental Analysis of Behavior. The issues raised by these developments were considered at a panel held at the 24th annual convention of the Association for Behavior Analysis, Orlando, Florida (May, 1998). The proceedings are reported in this and the following articles. PMID:22478323

  5. Statistical inference in behavior analysis: Experimental control is better

    PubMed Central

    Perone, Michael

    1999-01-01

    Statistical inference promises automatic, objective, reliable assessments of data, independent of the skills or biases of the investigator, whereas the single-subject methods favored by behavior analysts often are said to rely too much on the investigator's subjective impressions, particularly in the visual analysis of data. In fact, conventional statistical methods are difficult to apply correctly, even by experts, and the underlying logic of null-hypothesis testing has drawn criticism since its inception. By comparison, single-subject methods foster direct, continuous interaction between investigator and subject and development of strong forms of experimental control that obviate the need for statistical inference. Treatment effects are demonstrated in experimental designs that incorporate replication within and between subjects, and the visual analysis of data is adequate when integrated into such designs. Thus, single-subject methods are ideal for shaping—and maintaining—the kind of experimental practices that will ensure the continued success of behavior analysis. PMID:22478328

  6. Harnessing the power of gene microarrays for the study of brain aging and Alzheimer's disease: statistical reliability and functional correlation.

    PubMed

    Blalock, E M; Chen, K-C; Stromberg, A J; Norris, C M; Kadish, I; Kraner, S D; Porter, N M; Landfield, P W

    2005-11-01

    During normal brain aging, numerous alterations develop in the physiology, biochemistry and structure of neurons and glia. Aging changes occur in most brain regions and, in the hippocampus, have been linked to declining cognitive performance in both humans and animals. Age-related changes in hippocampal regions also may be harbingers of more severe decrements to come from neurodegenerative disorders such as Alzheimer's disease (AD). However, unraveling the mechanisms underlying brain aging, AD and impaired function has been difficult because of the complexity of the networks that drive these aging-related changes. Gene microarray technology allows massively parallel analysis of most genes expressed in a tissue, and therefore is an important new research tool that potentially can provide the investigative power needed to address the complexity of brain aging/neurodegenerative processes. However, along with this new analytic power, microarrays bring several major bioinformatics and resource problems that frequently hinder the optimal application of this technology. In particular, microarray analyses generate extremely large and unwieldy data sets and are subject to high false positive and false negative rates. Concerns also have been raised regarding their accuracy and uniformity. Furthermore, microarray analyses can result in long lists of altered genes, most of which may be difficult to evaluate for functional relevance. These and other problems have led to some skepticism regarding the reliability and functional usefulness of microarray data and to a general view that microarray data should be validated by an independent method. Given recent progress, however, we suggest that the major problem for current microarray research is no longer validity of expression measurements, but rather, the reliability of inferences from the data, an issue more appropriately redressed by statistical approaches than by validation with a separate method. If tested using statistically

  7. Data analysis using the Gnu R system for statistical computation

    SciTech Connect

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  8. Statistics.

    PubMed

    1993-02-01

    In 1984, 99% of abortions conducted in Bombay, India, were of female fetuses. In 1986-87, 30,000-50,000 female fetuses were aborted in India. In 1987-88, 7 Delhi clinics conducted 13,000 sex determination tests. Thus, discrimination against females begins before birth in India. Some states (Maharashtra, Goa, and Gujarat) have drafted legislation to prevent the use of prenatal diagnostic tests (e.g., ultrasonography) for sex determination purposes. Families make decisions about an infant's nutrition based on the infant's sex so it is not surprising to see a higher incidence of morbidity among girls than boys (e.g., for respiratory infections in 1985, 55.5% vs. 27.3%). Consequently, they are more likely to die than boys. Even though vasectomy is simpler and safer than tubectomy, the government promotes female sterilizations. The percentage of all sexual sterilizations being tubectomy has increased steadily from 84% to 94% (1986-90). Family planning programs focus on female contraceptive methods, despite the higher incidence of adverse health effects from female methods (e.g., IUD causes pain and heavy bleeding). Some women advocates believe the effects to be so great that India should ban contraceptives and injectable contraceptives. The maternal mortality rate is quite high (460/100,000 live births), equaling a lifetime risk of 1:18 of a pregnancy-related death. 70% of these maternal deaths are preventable. Leading causes of maternal deaths in India are anemia, hemorrhage, eclampsia, sepsis, and abortion. Most pregnant women do not receive prenatal care. Untrained personnel attend about 70% of deliveries in rural areas and 29% in urban areas. Appropriate health services and other interventions would prevent the higher age specific death rates for females between 0 and 35 years old. Even though the government does provide maternal and child health services, it needs to stop decreasing resource allocate for health and start increasing it. PMID:12286355

  9. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
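
    A sketch of the underlying machinery, assuming the usual κ-exponential exp_κ(u) = (sqrt(1 + κ²u²) + κu)^(1/κ) with complementary CDF P(X > x) = exp_κ(−βx^α); the parameter values below are invented, not estimates from the US income data.

    import numpy as np

    def kappa_exp(u, kappa):
        """kappa-exponential; reduces to exp(u) as kappa -> 0."""
        return (np.sqrt(1.0 + kappa**2 * u**2) + kappa * u) ** (1.0 / kappa)

    def survival(x, alpha, beta, kappa):
        """P(X > x) under the kappa-generalized model (assumed form)."""
        return kappa_exp(-beta * x**alpha, kappa)

    x = np.array([0.5, 1.0, 2.0, 5.0])   # income in arbitrary units
    print(survival(x, alpha=2.0, beta=1.0, kappa=0.7))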

  10. Statistical Learning in Specific Language Impairment and Autism Spectrum Disorder: A Meta-Analysis.

    PubMed

    Obeid, Rita; Brooks, Patricia J; Powers, Kasey L; Gillespie-Lynch, Kristen; Lum, Jarrad A G

    2016-01-01

    Impairments in statistical learning might be a common deficit among individuals with Specific Language Impairment (SLI) and Autism Spectrum Disorder (ASD). Using meta-analysis, we examined statistical learning in SLI (14 studies, 15 comparisons) and ASD (13 studies, 20 comparisons) to evaluate this hypothesis. Effect sizes were examined as a function of diagnosis across multiple statistical learning tasks (Serial Reaction Time, Contextual Cueing, Artificial Grammar Learning, Speech Stream, Observational Learning, and Probabilistic Classification). Individuals with SLI showed deficits in statistical learning relative to age-matched controls. In contrast, statistical learning was intact in individuals with ASD relative to controls. Effect sizes did not vary as a function of task modality or participant age. Our findings inform debates about overlapping social-communicative difficulties in children with SLI and ASD by suggesting distinct underlying mechanisms. In line with the procedural deficit hypothesis (Ullman and Pierpont, 2005), impaired statistical learning may account for phonological and syntactic difficulties associated with SLI. In contrast, impaired statistical learning fails to account for the social-pragmatic difficulties associated with ASD.

  11. Statistical Learning in Specific Language Impairment and Autism Spectrum Disorder: A Meta-Analysis

    PubMed Central

    Obeid, Rita; Brooks, Patricia J.; Powers, Kasey L.; Gillespie-Lynch, Kristen; Lum, Jarrad A. G.

    2016-01-01

    Impairments in statistical learning might be a common deficit among individuals with Specific Language Impairment (SLI) and Autism Spectrum Disorder (ASD). Using meta-analysis, we examined statistical learning in SLI (14 studies, 15 comparisons) and ASD (13 studies, 20 comparisons) to evaluate this hypothesis. Effect sizes were examined as a function of diagnosis across multiple statistical learning tasks (Serial Reaction Time, Contextual Cueing, Artificial Grammar Learning, Speech Stream, Observational Learning, and Probabilistic Classification). Individuals with SLI showed deficits in statistical learning relative to age-matched controls. In contrast, statistical learning was intact in individuals with ASD relative to controls. Effect sizes did not vary as a function of task modality or participant age. Our findings inform debates about overlapping social-communicative difficulties in children with SLI and ASD by suggesting distinct underlying mechanisms. In line with the procedural deficit hypothesis (Ullman and Pierpont, 2005), impaired statistical learning may account for phonological and syntactic difficulties associated with SLI. In contrast, impaired statistical learning fails to account for the social-pragmatic difficulties associated with ASD. PMID:27602006

  12. Statistical Learning in Specific Language Impairment and Autism Spectrum Disorder: A Meta-Analysis.

    PubMed

    Obeid, Rita; Brooks, Patricia J; Powers, Kasey L; Gillespie-Lynch, Kristen; Lum, Jarrad A G

    2016-01-01

    Impairments in statistical learning might be a common deficit among individuals with Specific Language Impairment (SLI) and Autism Spectrum Disorder (ASD). Using meta-analysis, we examined statistical learning in SLI (14 studies, 15 comparisons) and ASD (13 studies, 20 comparisons) to evaluate this hypothesis. Effect sizes were examined as a function of diagnosis across multiple statistical learning tasks (Serial Reaction Time, Contextual Cueing, Artificial Grammar Learning, Speech Stream, Observational Learning, and Probabilistic Classification). Individuals with SLI showed deficits in statistical learning relative to age-matched controls. In contrast, statistical learning was intact in individuals with ASD relative to controls. Effect sizes did not vary as a function of task modality or participant age. Our findings inform debates about overlapping social-communicative difficulties in children with SLI and ASD by suggesting distinct underlying mechanisms. In line with the procedural deficit hypothesis (Ullman and Pierpont, 2005), impaired statistical learning may account for phonological and syntactic difficulties associated with SLI. In contrast, impaired statistical learning fails to account for the social-pragmatic difficulties associated with ASD. PMID:27602006

  13. Statistical Learning in Specific Language Impairment and Autism Spectrum Disorder: A Meta-Analysis

    PubMed Central

    Obeid, Rita; Brooks, Patricia J.; Powers, Kasey L.; Gillespie-Lynch, Kristen; Lum, Jarrad A. G.

    2016-01-01

    Impairments in statistical learning might be a common deficit among individuals with Specific Language Impairment (SLI) and Autism Spectrum Disorder (ASD). Using meta-analysis, we examined statistical learning in SLI (14 studies, 15 comparisons) and ASD (13 studies, 20 comparisons) to evaluate this hypothesis. Effect sizes were examined as a function of diagnosis across multiple statistical learning tasks (Serial Reaction Time, Contextual Cueing, Artificial Grammar Learning, Speech Stream, Observational Learning, and Probabilistic Classification). Individuals with SLI showed deficits in statistical learning relative to age-matched controls. In contrast, statistical learning was intact in individuals with ASD relative to controls. Effect sizes did not vary as a function of task modality or participant age. Our findings inform debates about overlapping social-communicative difficulties in children with SLI and ASD by suggesting distinct underlying mechanisms. In line with the procedural deficit hypothesis (Ullman and Pierpont, 2005), impaired statistical learning may account for phonological and syntactic difficulties associated with SLI. In contrast, impaired statistical learning fails to account for the social-pragmatic difficulties associated with ASD.

  14. A novel statistic for genome-wide interaction analysis.

    PubMed

    Wu, Xuesen; Dong, Hua; Luo, Li; Zhu, Yun; Peng, Gang; Reveille, John D; Xiong, Momiao

    2010-09-23

    Although great progress in genome-wide association studies (GWAS) has been made, the significant SNP associations identified by GWAS account for only a few percent of the genetic variance, leading many to question where and how we can find the missing heritability. There is increasing interest in genome-wide interaction analysis as a possible source of finding heritability unexplained by current GWAS. However, the existing statistics for testing interaction have low power for genome-wide interaction analysis. To meet challenges raised by genome-wide interaction analysis, we have developed a novel statistic for testing interaction between two loci (either linked or unlinked). The null distribution and the type I error rates of the new statistic for testing interaction are validated using simulations. Extensive power studies show that the developed statistic has much higher power to detect interaction than classical logistic regression. The results identified 44 and 211 pairs of SNPs showing significant evidence of interactions with FDR < 0.001 and 0.001 [...]. Genome-wide interaction analysis is a valuable tool for finding remaining missing heritability unexplained by the current GWAS, and the developed novel statistic is able to search for significant interactions between SNPs across the genome. Real data analysis showed that the results of genome-wide interaction analysis can be replicated in two independent studies.

  15. Multizone Age-of-Air Analysis

    SciTech Connect

    Sherman, Max H.

    2007-07-01

    Age of air is a technique for evaluating ventilation that has been actively used for over 20 years. Age of air quantifies the time it takes for outdoor air to reach a particular location or zone within the indoor environment. Age of air is often also used to quantify ventilation effectiveness with respect to indoor air quality. In a purely single-zone situation this use of age of air is straightforward, but application of age-of-air techniques in the general multizone environment has not been fully developed. This article looks at expanding those single-zone techniques to the more complicated environment of multizone buildings and, in doing so, develops further the general concept of age of air. The results of this analysis show that the nominal age of air as often used cannot be directly used for determining ventilation effectiveness unless specific assumptions are made regarding source distributions.

  16. Statistical Analysis of Tsunamis of the Italian Coasts

    SciTech Connect

    Caputo, M.; Faita, G.F.

    1982-01-20

    A study of a catalog of 138 tsunamis of the Italian coasts has been made. Intensities of 106 tsunamis have been assigned and cataloged. The statistical analysis of these data fits a density distribution of the form log n = 3.00 - 0.425 I, where n is the number of tsunamis of intensity I per thousand years.

  17. Introduction to Statistics and Data Analysis With Computer Applications I.

    ERIC Educational Resources Information Center

    Morris, Carl; Rolph, John

    This document consists of unrevised lecture notes for the first half of a 20-week in-house graduate course at Rand Corporation. The chapter headings are: (1) Histograms and descriptive statistics; (2) Measures of dispersion, distance and goodness of fit; (3) Using JOSS for data analysis; (4) Binomial distribution and normal approximation; (5)…

  18. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
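
    A minimal sketch of the two-parameter Weibull fit named above, with the location parameter fixed at zero so only the Weibull modulus (shape) and characteristic strength (scale) are estimated; the strength values are synthetic, not the A357-T6 data.

    import numpy as np
    from scipy.stats import weibull_min

    strengths = np.array([268., 274., 281., 285., 290., 293.,
                          299., 302., 307., 311., 318., 325.])  # MPa, invented
    shape, loc, scale = weibull_min.fit(strengths, floc=0.0)
    print(f"Weibull modulus m = {shape:.1f}, "
          f"characteristic strength = {scale:.0f} MPa")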

  19. Statistical Software for spatial analysis of stratigraphic data sets

    2003-04-08

    Stratistics is a tool for statistical analysis of spatially explicit data sets and model output, for description and for model-data comparisons. It is intended for the analysis of data sets commonly used in geology, such as gamma-ray logs and lithologic sequences, as well as 2-D data such as maps. Stratistics incorporates a far wider range of spatial analysis methods, drawn from multiple disciplines, than is currently available in other packages, including techniques from spatial and landscape ecology, fractal analysis, and mathematical geology. Its use should substantially reduce the risk associated with the use of predictive models.

  20. Mapping of Planetary Surface Age Based on Crater Statistics Obtained by an Automatic Detection Algorithm

    NASA Astrophysics Data System (ADS)

    Salih, A. L.; Mühlbauer, M.; Grumpe, A.; Pasckert, J. H.; Wöhler, C.; Hiesinger, H.

    2016-06-01

    The analysis of the impact crater size-frequency distribution (CSFD) is a well-established approach to the determination of the age of planetary surfaces. Classically, estimation of the CSFD is achieved by manual crater counting and size determination in spacecraft images, which, however, becomes very time-consuming for large surface areas and/or high image resolution. With increasing availability of high-resolution (nearly) global image mosaics of planetary surfaces, a variety of automated methods for the detection of craters based on image data and/or topographic data have been developed. In this contribution a template-based crater detection algorithm is used which analyses image data acquired under known illumination conditions. Its results are used to establish the CSFD for the examined area, which is then used to estimate the absolute model age of the surface. The detection threshold of the automatic crater detection algorithm is calibrated based on a region with available manually determined CSFD such that the age inferred from the manual crater counts corresponds to the age inferred from the automatic crater detection results. With this detection threshold, the automatic crater detection algorithm can be applied to a much larger surface region around the calibration area. The proposed age estimation method is demonstrated for a Kaguya Terrain Camera image mosaic of 7.4 m per pixel resolution of the floor region of the lunar crater Tsiolkovsky, which consists of dark and flat mare basalt and has an area of nearly 10,000 km2. The region used for calibration, for which manual crater counts are available, has an area of 100 km2. In order to obtain a spatially resolved age map, CSFDs and surface ages are computed for overlapping quadratic regions of about 4.4 x 4.4 km2 size offset by a step width of 74 m. Our constructed surface age map of the floor of Tsiolkovsky shows age values of typically 3.2-3.3 Ga, while for small regions lower (down to 2.9 Ga) and higher
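
    The quantity the age estimate rests on, the cumulative crater size-frequency distribution, can be sketched directly from a list of detected crater diameters; the diameters and counting area below are invented, and the final step from CSFD to absolute model age via a cratering chronology function is omitted.

    import numpy as np

    diam_km = np.array([0.12, 0.15, 0.2, 0.2, 0.3, 0.35, 0.5, 0.8, 1.1, 2.0])
    area_km2 = 100.0                                  # counted area

    d_sorted = np.sort(diam_km)[::-1]
    cumulative = np.arange(1, d_sorted.size + 1) / area_km2  # N(>=D) per km^2
    for d, n_cum in zip(d_sorted, cumulative):
        print(f"D >= {d:4.2f} km : {n_cum:6.3f} craters/km^2")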

  1. Systematic misregistration and the statistical analysis of surface data.

    PubMed

    Gee, Andrew H; Treece, Graham M

    2014-02-01

    Spatial normalisation is a key element of statistical parametric mapping and related techniques for analysing cohort statistics on voxel arrays and surfaces. The normalisation process involves aligning each individual specimen to a template using some sort of registration algorithm. Any misregistration will result in data being mapped onto the template at the wrong location. At best, this will introduce spatial imprecision into the subsequent statistical analysis. At worst, when the misregistration varies systematically with a covariate of interest, it may lead to false statistical inference. Since misregistration generally depends on the specimen's shape, we investigate here the effect of allowing for shape as a confound in the statistical analysis, with shape represented by the dominant modes of variation observed in the cohort. In a series of experiments on synthetic surface data, we demonstrate how allowing for shape can reveal true effects that were previously masked by systematic misregistration, and also guard against misinterpreting systematic misregistration as a true effect. We introduce some heuristics for disentangling misregistration effects from true effects, and demonstrate the approach's practical utility in a case study of the cortical bone distribution in 268 human femurs.
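
    A schematic Python version of the shape-as-confound idea, with synthetic data in place of the registered femur cohort: the dominant modes of shape variation are extracted with PCA and entered as nuisance covariates alongside the covariate of interest in a per-vertex linear model. All array sizes and variable names are illustrative.

        import numpy as np
        import statsmodels.api as sm
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        n_subj, n_vert = 40, 500
        surface_data = rng.normal(2.0, 0.2, (n_subj, n_vert))  # e.g. cortical thickness
        age = rng.uniform(40, 80, n_subj)                      # covariate of interest
        coords = rng.normal(0, 1, (n_subj, 3 * n_vert))        # registered vertex coordinates

        # Dominant modes of shape variation observed in the cohort.
        shape_scores = PCA(n_components=5).fit_transform(coords)

        design = sm.add_constant(np.column_stack([age, shape_scores]))
        pvals = np.empty(n_vert)
        for v in range(n_vert):
            pvals[v] = sm.OLS(surface_data[:, v], design).fit().pvalues[1]  # age term
        print("vertices with p < 0.01 after shape adjustment:", int((pvals < 0.01).sum()))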

  2. HistFitter software framework for statistical data analysis

    NASA Astrophysics Data System (ADS)

    Baak, M.; Besjes, G. J.; Côté, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-04-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface.

  3. Transparent Meta-Analysis of Prospective Memory and Aging

    PubMed Central

    Uttl, Bob

    2008-01-01

    Prospective memory (ProM) refers to our ability to become aware of a previously formed plan at the right time and place. After two decades of research on prospective memory and aging, narrative reviews and summaries have arrived at widely different conclusions. One view is that prospective memory shows large age declines, larger than age declines on retrospective memory (RetM). Another view is that prospective memory is an exception to age declines and remains invariant across the adult lifespan. The present meta-analysis of over twenty years of research settles this controversy. It shows that prospective memory declines with aging and that the magnitude of age decline varies by prospective memory subdomain (vigilance, prospective memory proper, habitual prospective memory) as well as test setting (laboratory, natural). Moreover, this meta-analysis demonstrates that previous claims of no age declines in prospective memory are artifacts of methodological and conceptual issues afflicting prior research including widespread ceiling effects, low statistical power, age confounds, and failure to distinguish between various subdomains of prospective memory (e.g., vigilance and prospective memory proper). PMID:18286167

  4. Statistical analysis of flight times for space shuttle ferry flights

    NASA Technical Reports Server (NTRS)

    Graves, M. E.; Perlmutter, M.

    1974-01-01

    Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
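
    The report's actual constraints and weather statistics are not reproduced here, but the Monte Carlo side of such a study can be sketched with invented parameters: each trip is a sequence of ferry legs, and inclement weather imposes ground holds whose persistence is crudely modelled with a geometric distribution.

        import numpy as np

        rng = np.random.default_rng(2)

        def simulate_trip(n_legs=4, leg_hours=3.0, p_delay=0.3, mean_hold_days=1.5):
            """One simulated ferry trip; every parameter here is hypothetical."""
            days = 0.0
            for _ in range(n_legs):
                if rng.random() < p_delay:                       # weather hold at this stop
                    days += rng.geometric(1.0 / mean_hold_days)  # persistence of bad weather
                days += leg_hours / 24.0
            return days

        trips = np.array([simulate_trip() for _ in range(10_000)])
        print(f"median {np.median(trips):.1f} d, 90th percentile {np.percentile(trips, 90):.1f} d")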

  5. [Some basic aspects in statistical analysis of visual acuity data].

    PubMed

    Ren, Ze-Qin

    2007-06-01

    All visual acuity charts currently in use have their own shortcomings, which makes it difficult for ophthalmologists to evaluate visual acuity data. Many problems arise in the use of statistical methods for handling visual acuity data in clinical research. The quantitative relationship between visual acuity and visual angle varies among visual acuity charts, and visual acuity and visual angle are different types of measures. Therefore, different statistical methods should be used for different data sources. A correct understanding and analysis of visual acuity data can be obtained only after these aspects are elucidated.

  6. AstroStat-A VO tool for statistical analysis

    NASA Astrophysics Data System (ADS)

    Kembhavi, A. K.; Mahabal, A. A.; Kale, T.; Jagade, S.; Vibhute, A.; Garg, P.; Vaghmare, K.; Navelkar, S.; Agrawal, T.; Chattopadhyay, A.; Nandrekar, D.; Shaikh, M.

    2015-06-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analyses are done using the public-domain statistical software R, and the output is returned to the user in a neatly formatted form. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run using any standard browser and as an offline application. AstroStat provides an easy-to-use interface that allows for both fetching data and performing powerful statistical analysis on them.

  7. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    SciTech Connect

    Reed, J.K.

    1999-10-20

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of the large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and 'expert' data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) Development of a groundwater quality baseline prior to remediation startup, (2) Targeting of constituents for removal from the RCRA GWPS, (3) Targeting of constituents for removal from the UIC permit, (4) Targeting of constituents for reduced monitoring, (5) Targeting of monitoring wells not producing representative samples, (6) Reduction in statistical evaluation, and (7) Identification of contamination from other facilities.

  8. Multivariate statistical analysis of atom probe tomography data.

    PubMed

    Parish, Chad M; Miller, Michael K

    2010-10-01

    The application of spectrum imaging multivariate statistical analysis methods, specifically principal component analysis (PCA), to atom probe tomography (APT) data has been investigated. The mathematical method of analysis is described and the results for two example datasets are analyzed and presented. The first dataset is from the analysis of a PM 2000 Fe-Cr-Al-Ti steel containing two different ultrafine precipitate populations. PCA properly describes the matrix and precipitate phases in a simple and intuitive manner. A second APT example is from the analysis of an irradiated reactor pressure vessel steel. Fine, nm-scale Cu-enriched precipitates having a core-shell structure were identified and qualitatively described by PCA. Advantages, disadvantages, and future prospects for implementing these data analysis methodologies for APT datasets, particularly with regard to quantitative analysis, are also discussed. PMID:20650566
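
    A toy illustration of the unfold-then-decompose step, with a synthetic matrix standing in for APT composition data: each voxel contributes one row (its local spectrum), PCA returns per-voxel scores and spectral loadings, and an enriched cluster of voxels shows up as a distinct component. Matrix sizes and values are invented.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        n_voxels, n_channels = 2000, 64
        spectra = rng.poisson(5.0, (n_voxels, n_channels)).astype(float)
        spectra[:200, 10:14] += 30.0   # voxels enriched in a precipitate-like signal

        pca = PCA(n_components=5)
        scores = pca.fit_transform(spectra)   # per-voxel component scores
        loadings = pca.components_            # spectral signatures of each component
        print("scores:", scores.shape, "loadings:", loadings.shape)
        print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))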

  9. Proteome analysis in the assessment of ageing.

    PubMed

    Nkuipou-Kenfack, Esther; Koeck, Thomas; Mischak, Harald; Pich, Andreas; Schanstra, Joost P; Zürbig, Petra; Schumacher, Björn

    2014-11-01

    Based on demographic trends, the societies in many developed countries are facing an increasing number and proportion of people over the age of 65. The rise in elderly populations, along with improved health care, will be concomitant with an increased prevalence of ageing-associated chronic conditions like cardiovascular, renal, and respiratory diseases, arthritis, dementia, and diabetes mellitus. This is expected to pose unprecedented challenges both for individuals and for societies and their health care systems. An ultimate goal of ageing research is therefore the understanding of physiological ageing and the achievement of 'healthy' ageing by decreasing age-related pathologies. However, on a molecular level, ageing is a complex multi-mechanistic process whose contributing factors may vary individually, partly overlap with pathological alterations, and are often poorly understood. Proteome analysis potentially allows modelling of these multifactorial processes. This review summarises recent proteomic research on age-related changes identified in animal models and human studies. We combined this information with pathway analysis to identify molecular mechanisms associated with ageing. We identified some molecular pathways that are affected in most or even all organs and others that are organ-specific. However, appropriately powered studies are needed to confirm these findings, which are so far based on in silico evaluation. PMID:25257180

  10. Statistical analysis and interpolation of compositional data in materials science.

    PubMed

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-01

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
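
    One of the specialised tools implied above is the centred log-ratio (clr) transform, which maps simplex-constrained compositions into ordinary Euclidean space, where means, correlations, PCA, and interpolation behave sensibly. A minimal sketch with made-up atomic fractions:

        import numpy as np

        def clr(compositions):
            """Centred log-ratio transform: simplex -> Euclidean coordinates."""
            x = np.asarray(compositions, dtype=float)
            g = np.exp(np.log(x).mean(axis=1, keepdims=True))  # per-sample geometric mean
            return np.log(x / g)

        # Atomic fractions of a hypothetical three-component film (rows sum to 1).
        comp = np.array([[0.60, 0.30, 0.10],
                         [0.55, 0.35, 0.10],
                         [0.50, 0.30, 0.20]])
        z = clr(comp)
        print(z)              # safe input for standard multivariate statistics
        print(z.sum(axis=1))  # clr coordinates sum to zero by construction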

  11. Feature-Based Statistical Analysis of Combustion Simulation Data

    SciTech Connect

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion research.

  12. Statistic analyses of the color experience according to the age of the observer.

    PubMed

    Hunjet, Anica; Parac-Osterman, Durdica; Vucaj, Edita

    2013-04-01

    The psychological experience of color is a real state of communication between the environment and color, and it depends on the light source, the viewing angle, and in particular on the observer and his or her health condition. Hering's theory, or the theory of opponent processes, supposes that the cones situated in the retina of the eye are not sensitive to three chromatic domains (red, green, and purple-blue) but instead produce a signal based on the principle of opposed pairs of colors. This theory is supported by the fact that certain disorders of color eyesight, which include blindness to certain colors, cause blindness to pairs of opponent colors. This paper presents the experience of blue and yellow tones according to the age of the observer. To test for statistically significant differences in the color experience according to the color of the background, we used the following statistical tests: the Mann-Whitney U test, Kruskal-Wallis ANOVA, and the median test. The differences were shown to be statistically significant for older observers (older than 35 years).
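
    All three tests named above are available in scipy; the sketch below applies them to synthetic scores for two age groups (the study's actual stimuli and response measure are not reproduced).

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        younger = rng.normal(5.0, 1.0, 40)   # hypothetical colour-experience scores
        older = rng.normal(5.8, 1.2, 35)

        u, p_u = stats.mannwhitneyu(younger, older, alternative="two-sided")
        h, p_kw = stats.kruskal(younger, older)
        m, p_med, grand_median, table = stats.median_test(younger, older)
        print(f"Mann-Whitney U: p = {p_u:.4f}")
        print(f"Kruskal-Wallis: p = {p_kw:.4f}")
        print(f"median test:    p = {p_med:.4f}")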

  13. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and an opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  14. SMART: Statistical Metabolomics Analysis-An R Tool.

    PubMed

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10⁻⁴ in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10⁻⁴ in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm.
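
    SMART itself is an R tool, so the fragment below is only a language-neutral sketch of the ANCOVA step: a metabolite level is regressed on a medication indicator while adjusting for a categorical grouping term standing in for latent groups or hidden substructures. All data are simulated and all column names are invented.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n = 120
        df = pd.DataFrame({
            "metabolite": rng.normal(0, 1, n),            # abundance of one peak
            "ace_inhibitor": rng.integers(0, 2, n),       # medication indicator
            "group": rng.integers(0, 3, n).astype(str),   # latent-group surrogate
        })
        df.loc[df.ace_inhibitor == 1, "metabolite"] += 0.8  # injected association

        fit = smf.ols("metabolite ~ ace_inhibitor + C(group)", data=df).fit()
        print("adjusted p-value for medication:", fit.pvalues["ace_inhibitor"])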

  15. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review statistical seismology software packages.

  16. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology-especially to those aspects with great impact on public policy-statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each will contain between four and eight articles. CORSSA now includes seven articles with an additional six in draft form along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  17. Aging and the human vestibular nuclei: morphometric analysis.

    PubMed

    Alvarez, J C; Díaz, C; Suárez, C; Fernández, J A; González del Rey, C; Navarro, A; Tolivia, J

    2000-04-14

    The data concerning the effects of age on the brainstem are scarce, and few works are devoted to the human vestibular nuclear complex. The study of the effects of aging in the vestibular nuclei could have clinical interest due to the high prevalence of balance control and gait problems in the elderly. We have used in this work eight human brainstems of different ages, sectioned and stained by the formaldehyde-thionin technique. Neuronal profiles were drawn with a camera lucida, and Abercrombie's method was used to estimate the total number of neurons. The Kolmogorov-Smirnov test with the Lilliefors correction was used to evaluate the fit of our data to a normal distribution, and a regression analysis was done to determine whether the variation of our data with age was statistically significant. Aging does not affect the volume or length of the vestibular nuclear complex. Our results clearly show that neuronal loss occurs with aging in the descending (DVN), medial (MVN), and lateral (LVN) vestibular nuclei, but not in the superior (SVN). There are changes in the proportions of neurons of different sizes, but they are not statistically significant. The neuronal loss could be related to the difficulty elderly people have in compensating for unilateral vestibular lesions and to the alterations of the vestibulospinal reflexes. The preservation of SVN neurons can explain why vestibulo-ocular reflexes are compensated after unilateral vestibular injuries.
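
    Abercrombie's method, used above to convert raw profile counts into neuron number estimates, is simple enough to state directly; the counts and dimensions below are illustrative only.

        def abercrombie(raw_count, section_thickness_um, mean_profile_height_um):
            """Abercrombie's correction for overcounting in sectioned tissue:
            N = n * T / (T + h), where T is the section thickness and h is the
            mean height (diameter) of the counted profiles."""
            T, h = section_thickness_um, mean_profile_height_um
            return raw_count * T / (T + h)

        print(abercrombie(raw_count=1200, section_thickness_um=25.0,
                          mean_profile_height_um=12.0))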

  18. Wavelet analysis in ecology and epidemiology: impact of statistical tests.

    PubMed

    Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario

    2014-02-01

    Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from the original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise, which are nevertheless the methods currently used in the wide majority of wavelet applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods, such as the hidden Markov model algorithm and the 'beta-surrogate' method, should be used.
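
    A schematic of surrogate-based significance testing of wavelet power, deliberately using the AR(1) ("red noise") null that the paper criticises, so that the resampler is an explicit, swappable choice. It uses PyWavelets' continuous wavelet transform; the series and all parameters are synthetic.

        import numpy as np
        import pywt

        rng = np.random.default_rng(6)
        t = np.arange(512)
        series = np.sin(2 * np.pi * t / 36.0) + rng.normal(0, 1.0, t.size)

        scales = np.arange(8, 128)
        power = np.abs(pywt.cwt(series, scales, "morl")[0]) ** 2

        # AR(1) surrogates matched to the data's lag-1 autocorrelation; a
        # data-driven resampler would replace this function.
        phi = np.corrcoef(series[:-1], series[1:])[0, 1]
        sigma = series.std() * np.sqrt(1.0 - phi ** 2)

        def ar1_surrogate():
            x = np.empty(series.size)
            x[0] = rng.normal(0, series.std())
            for i in range(1, x.size):
                x[i] = phi * x[i - 1] + rng.normal(0, sigma)
            return x

        null = np.stack([np.abs(pywt.cwt(ar1_surrogate(), scales, "morl")[0]) ** 2
                         for _ in range(200)])
        significant = power > np.percentile(null, 95, axis=0)
        print("fraction of significant scale-time cells:", significant.mean())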

  19. Statistical analysis of single-trial Granger causality spectra.

    PubMed

    Brovelli, Andrea

    2012-01-01

    Granger causality analysis is becoming central for the analysis of interactions between neural populations and oscillatory networks. However, it is currently unclear whether single-trial estimates of Granger causality spectra can be used reliably to assess directional influence. We addressed this issue by combining single-trial Granger causality spectra with statistical inference based on general linear models. The approach was assessed on synthetic and neurophysiological data. Synthetic bivariate data was generated using two autoregressive processes with unidirectional coupling. We simulated two hypothetical experimental conditions: the first mimicked a constant and unidirectional coupling, whereas the second modelled a linear increase in coupling across trials. The statistical analysis of single-trial Granger causality spectra, based on t-tests and linear regression, successfully recovered the underlying pattern of directional influence. In addition, we characterised the minimum number of trials and coupling strengths required for significant detection of directionality. Finally, we demonstrated the relevance for neurophysiology by analysing two local field potentials (LFPs) simultaneously recorded from the prefrontal and premotor cortices of a macaque monkey performing a conditional visuomotor task. Our results suggest that the combination of single-trial Granger causality spectra and statistical inference provides a valuable tool for the analysis of large-scale cortical networks and brain connectivity.
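
    The inference step can be sketched independently of how the spectra are computed: given one Granger causality estimate per trial (synthetic here), a one-sample t-test addresses the constant-coupling condition and an ordinary linear regression on trial number addresses the linearly increasing condition.

        import numpy as np
        import statsmodels.api as sm
        from scipy import stats

        rng = np.random.default_rng(7)
        n_trials = 120
        # Hypothetical single-trial Granger causality values in one frequency band,
        # simulated with a linear increase in coupling across trials.
        gc = 0.05 + 0.002 * np.arange(n_trials) + rng.normal(0, 0.03, n_trials)

        print(stats.ttest_1samp(gc, popmean=0.0))   # is coupling present at all?

        X = sm.add_constant(np.arange(n_trials))    # does coupling grow across trials?
        fit = sm.OLS(gc, X).fit()
        print(f"slope = {fit.params[1]:.4f}, t = {fit.tvalues[1]:.2f}, p = {fit.pvalues[1]:.2e}")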

  20. STATISTICAL ANALYSIS OF THE HEAVY NEUTRAL ATOMS MEASURED BY IBEX

    SciTech Connect

    Park, Jeewoo; Kucharek, Harald; Möbius, Eberhard; Galli, André; Livadiotis, George; Fuselier, Steve A.; McComas, David J.

    2015-10-15

    We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years from 2009 to 2011. The interstellar neutral (ISN) O and Ne gas flow was found in the first-year heavy neutral map at 601 eV, and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to get a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to exclude background from areas where the heavy neutral signal is statistically significant. These methods also allow the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O and Ne gas flow. We call this emission the extended tail. It may be an imprint of the secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.
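
    Of the three methods, the signal-to-noise filter is the simplest to sketch: with Poisson counting noise the uncertainty scales as the square root of the counts, so pixels are kept only when their excess over background exceeds a chosen number of standard deviations. The map, background model, and threshold below are all invented.

        import numpy as np

        rng = np.random.default_rng(8)
        counts = rng.poisson(2.0, (30, 60)).astype(float)  # toy all-sky count map
        counts[12:16, 20:28] += 8.0                        # injected emission feature
        background = np.median(counts)

        snr = (counts - background) / np.sqrt(np.maximum(counts, 1.0))
        mask = snr > 3.0                                   # 3-sigma signal-to-noise filter
        print("pixels retained:", int(mask.sum()), "of", mask.size)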

  1. Precipitation Hardening and Statistical Modeling of the Aging Parameters and Alloy Compositions in Al-Cu-Mg-Ag Alloys

    NASA Astrophysics Data System (ADS)

    Al-Obaisi, A. M.; El-Danaf, E. A.; Ragab, A. E.; Soliman, M. S.

    2016-06-01

    The addition of Ag to Al-Cu-Mg systems has been proposed to replace the existing high-strength 2xxx and 7xxx Al alloys. The aged Al-Cu-Mg-Ag alloys exhibited promising properties, due to a special type of precipitate named Ω, which cooperates with other precipitates to enhance the mechanical properties significantly. In the present investigation, the effect of changing the percentages of alloying elements, aging time, and aging temperature on hardness was studied based on a factorial design. According to this design of experiments (DOE), a 2³ factorial design, eight alloys were cast and hot rolled, with Cu, Mg, and Ag added to aluminum at two different levels for each alloying element. These alloys were aged at different temperatures (160, 190, and 220 °C) over a wide range of time intervals, from 10 min to 64 h. The resulting hardness data were used as input for Minitab software to model and relate the process variables with hardness through a regression analysis. Raising all alloying elements' weight percentages to the high level enhanced the hardness of the alloy by about 40% compared with the alloy containing the low level of all alloying elements. Through analysis of variance (ANOVA), it was found that altering the fraction of Cu had the greatest effect on the hardness values, with a contribution of about 49%. Second-level interaction terms accounted for about 21% of the impact on the hardness values. Aging time, quadratic terms, and third-level interaction terms had almost the same level of influence on hardness values (about 10% contribution each). Furthermore, the results showed that a small addition of Mg and Ag was enough to improve the mechanical properties of the alloy significantly. The formulated statistical model explained about 80% of the variation in hardness values.
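
    A compact sketch of fitting and decomposing such a factorial experiment, with coded -1/+1 factor levels, an invented hardness response, and replicated runs so the full interaction model leaves residual degrees of freedom; the study's actual data and coefficients are not reproduced.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf
        from itertools import product

        # Coded 2^3 factorial design, replicated three times.
        runs = [dict(Cu=cu, Mg=mg, Ag=ag)
                for cu, mg, ag in product([-1, 1], repeat=3)] * 3
        df = pd.DataFrame(runs)
        rng = np.random.default_rng(9)
        df["hardness"] = (100 + 20 * df.Cu + 6 * df.Mg + 4 * df.Ag
                          + 5 * df.Cu * df.Mg + rng.normal(0, 2, len(df)))

        fit = smf.ols("hardness ~ Cu * Mg * Ag", data=df).fit()
        print(sm.stats.anova_lm(fit, typ=2))   # per-term contributions, as in an ANOVA
        print(f"R^2 = {fit.rsquared:.2f}")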

  2. Statistical Analysis of speckle noise reduction techniques for echocardiographic Images

    NASA Astrophysics Data System (ADS)

    Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar

    2011-12-01

    Echocardiography is a safe, easy, and fast technology for diagnosing cardiac diseases. As in other ultrasound images, these images also contain speckle noise. In some cases this speckle noise is useful, such as in motion detection, but in general noise removal is required for better analysis of the image and proper diagnosis. Different adaptive and anisotropic filters are included for statistical analysis. Statistical parameters such as Signal-to-Noise Ratio (SNR), Peak Signal-to-Noise Ratio (PSNR), and Root Mean Square Error (RMSE) are calculated for performance measurement. One more important aspect is that there may be blurring during speckle noise removal, so it is preferred that the filter be able to enhance edges during noise removal.
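
    The performance metrics are easy to state directly; a short sketch for 8-bit images with synthetic data (a real comparison would use the filtered echocardiographic frames):

        import numpy as np

        def rmse(reference, processed):
            return np.sqrt(np.mean((reference.astype(float) - processed.astype(float)) ** 2))

        def psnr(reference, processed, peak=255.0):
            """Peak signal-to-noise ratio in dB for 8-bit images."""
            e = rmse(reference, processed)
            return np.inf if e == 0 else 20.0 * np.log10(peak / e)

        rng = np.random.default_rng(10)
        ref = rng.integers(0, 256, (128, 128))
        noisy = np.clip(ref + rng.normal(0, 10, ref.shape), 0, 255)
        print(f"RMSE = {rmse(ref, noisy):.2f}, PSNR = {psnr(ref, noisy):.1f} dB")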

  3. Collagen morphology and texture analysis: from statistics to classification

    NASA Astrophysics Data System (ADS)

    Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.

    2013-07-01

    In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as the gray-level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerotic arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen remodeling, such as in skin disorders, different types of fibrosis and musculoskeletal diseases affecting ligaments and cartilage.
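
    Both families of texture features are available in scikit-image; this sketch computes first-order statistics and a few GLCM properties for a synthetic patch (a real analysis would use the SHG collagen images, and older scikit-image releases spell the functions greycomatrix/greycoprops).

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(11)
        img = (rng.random((64, 64)) * 32).astype(np.uint8)  # toy 32-level image patch

        fos = {"mean": float(img.mean()), "std": float(img.std())}  # first-order stats

        glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                            levels=32, symmetric=True, normed=True)
        texture = {p: float(graycoprops(glcm, p).mean())
                   for p in ("contrast", "homogeneity", "energy", "correlation")}
        print(fos, texture)   # feature vector that would feed a classifier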

  4. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    PubMed

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins, which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, focus is put on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels, as well as methods for data preprocessing are covered.
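
    For the sample-size planning aspect, a minimal sketch using statsmodels' power calculator for a two-group comparison of a single protein; in a real screen of hundreds of proteins the significance level would be adjusted for multiple testing, and the effect size here is purely illustrative.

        from statsmodels.stats.power import TTestIndPower

        # Biological replicates per group needed to detect a one-standard-deviation
        # difference in one protein's expression with 80% power at alpha = 0.05.
        n_per_group = TTestIndPower().solve_power(effect_size=1.0, alpha=0.05, power=0.8)
        print(f"about {n_per_group:.0f} samples per group")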

  5. Improved statistical power with a sparse shape model in detecting an aging effect in the hippocampus and amygdala

    NASA Astrophysics Data System (ADS)

    Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.

    2014-03-01

    The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which will reduce the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.

  6. Statistical Analysis of the Exchange Rate of Bitcoin.

    PubMed

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate. PMID:26222702
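
    A sketch of the model-comparison idea on simulated log-returns, scoring a few of scipy's heavy-tailed families by AIC; the paper's full set of fifteen candidates, including the winning generalized hyperbolic distribution, is not reproduced here.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(12)
        # Heavy-tailed synthetic data standing in for daily BTC/USD log-returns.
        log_returns = stats.t.rvs(df=3, scale=0.02, size=1500, random_state=rng)

        candidates = {"normal": stats.norm, "student-t": stats.t,
                      "laplace": stats.laplace, "cauchy": stats.cauchy}
        for name, dist in candidates.items():
            params = dist.fit(log_returns)
            loglik = dist.logpdf(log_returns, *params).sum()
            aic = 2 * len(params) - 2 * loglik   # lower AIC indicates a better fit
            print(f"{name:9s} AIC = {aic:.1f}")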

  7. Statistical Analysis of the Exchange Rate of Bitcoin

    PubMed Central

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate. PMID:26222702

  8. The statistical analysis of multivariate serological frequency data.

    PubMed

    Reyment, Richard A

    2005-11-01

    Data occurring in the form of frequencies are common in genetics-for example, in serology. Examples are provided by the AB0 group, the Rhesus group, and also DNA data. The statistical analysis of tables of frequencies is carried out using the available methods of multivariate analysis with usually three principal aims. One of these is to seek meaningful relationships between the components of a data set, the second is to examine relationships between populations from which the data have been obtained, the third is to bring about a reduction in dimensionality. This latter aim is usually realized by means of bivariate scatter diagrams using scores computed from a multivariate analysis. The multivariate statistical analysis of tables of frequencies cannot safely be carried out by standard multivariate procedures because they represent compositions and are therefore embedded in simplex space, a subspace of full space. Appropriate procedures for simplex space are compared and contrasted with simple standard methods of multivariate analysis ("raw" principal component analysis). The study shows that the differences between a log-ratio model and a simple logarithmic transformation of proportions may not be very great, particularly as regards graphical ordinations, but important discrepancies do occur. The divergencies between logarithmically based analyses and raw data are, however, great. Published data on Rhesus alleles observed for Italian populations are used to exemplify the subject. PMID:16024067

  9. Along-tract statistics allow for enhanced tractography analysis

    PubMed Central

    Colby, John B.; Soderberg, Lindsay; Lebel, Catherine; Dinov, Ivo D.; Thompson, Paul M.; Sowell, Elizabeth R.

    2011-01-01

    Diffusion imaging tractography is a valuable tool for neuroscience researchers because it allows the generation of individualized virtual dissections of major white matter tracts in the human brain. It facilitates between-subject statistical analyses tailored to the specific anatomy of each participant. There is prominent variation in diffusion imaging metrics (e.g., fractional anisotropy, FA) within tracts, but most tractography studies use a “tract-averaged” approach to analysis by averaging the scalar values from the many streamline vertices in a tract dissection into a single point-spread estimate for each tract. Here we describe a complete workflow needed to conduct an along-tract analysis of white matter streamline tract groups. This consists of 1) A flexible MATLAB toolkit for generating along-tract data based on B-spline resampling and compilation of scalar data at different collections of vertices along the curving tract spines, and 2) Statistical analysis and rich data visualization by leveraging tools available through the R platform for statistical computing. We demonstrate the effectiveness of such an along-tract approach over the tract-averaged approach in an example analysis of 10 major white matter tracts in a single subject. We also show that these techniques easily extend to between-group analyses typically used in neuroscience applications, by conducting an along-tract analysis of differences in FA between 9 individuals with fetal alcohol spectrum disorders (FASDs) and 11 typically-developing controls. This analysis reveals localized differences between FASD and control groups that were not apparent using a tract-averaged method. Finally, to validate our approach and highlight the strength of this extensible software framework, we implement 2 other methods from the literature and leverage the existing workflow tools to conduct a comparison study. PMID:22094644
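
    The resampling step of such a workflow can be sketched in Python (the published toolkit itself is MATLAB plus R): fit a B-spline to each streamline, sample it at a fixed number of nodes, and collect the scalar values into a streamlines-by-nodes matrix for pointwise statistics. The streamlines and FA values below are synthetic, and real pipelines handle correspondence between streamlines more carefully.

        import numpy as np
        from scipy.interpolate import splev, splprep

        def resample_streamline(points, n_nodes=50):
            """Fit a B-spline to one (N x 3) streamline and sample n_nodes points."""
            tck, _ = splprep(points.T, s=0)
            u = np.linspace(0.0, 1.0, n_nodes)
            return np.array(splev(u, tck)).T

        rng = np.random.default_rng(13)
        t = np.linspace(0.0, 1.0, 80)
        streamlines = [np.c_[t, np.sin(3 * t) + rng.normal(0, 0.01, t.size), t ** 2]
                       for _ in range(20)]
        fa = [0.4 + 0.1 * np.sin(6 * t) + rng.normal(0, 0.02, t.size)
              for _ in range(20)]

        nodes = [resample_streamline(s) for s in streamlines]
        fa_grid = np.array([np.interp(np.linspace(0, 1, 50), t, f) for f in fa])
        print("along-tract mean FA at first 5 nodes:", np.round(fa_grid.mean(axis=0)[:5], 3))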

  10. Multivariate statistical analysis of low-voltage EDS spectrum images

    SciTech Connect

    Anderson, I.M.

    1998-03-01

    Whereas energy-dispersive X-ray spectrometry (EDS) has been used for compositional analysis in the scanning electron microscope for 30 years, the benefits of using low operating voltages for such analyses have been explored only during the last few years. This paper couples low-voltage EDS with two other emerging areas of characterization: spectrum imaging and multivariate statistical analysis. The specimen analyzed for this study was a finished Intel Pentium processor, with the polyimide protective coating stripped off to expose the final active layers.

  11. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    SciTech Connect

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Sorensen, Daniel N.; Remmers, Daniel L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; Hsu, Peter C.; Reynolds, John G.

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small- Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.

  12. HistFitter: a flexible framework for statistical data analysis

    NASA Astrophysics Data System (ADS)

    Besjes, G. J.; Baak, M.; Côté, D.; Koutsman, A.; Lorenz, J. M.; Short, D.

    2015-12-01

    HistFitter is a software framework for statistical data analysis that has been used extensively in the ATLAS Collaboration to analyze data of proton-proton collisions produced by the Large Hadron Collider at CERN. Most notably, HistFitter has become a de-facto standard in searches for supersymmetric particles since 2012, with some usage for Exotic and Higgs boson physics. HistFitter coherently combines several statistics tools in a programmable and flexible framework that is capable of bookkeeping hundreds of data models under study using thousands of generated input histograms. HistFitter interfaces with the statistics tools HistFactory and RooStats to construct parametric models and to perform statistical tests of the data, and extends these tools in four key areas. The key innovations are to weave the concepts of control, validation and signal regions into the very fabric of HistFitter, and to treat these with rigorous methods. Multiple tools to visualize and interpret the results through a simple configuration interface are also provided.

  13. Dark-ages reionization and galaxy formation simulation V: morphology and statistical signatures of reionization

    NASA Astrophysics Data System (ADS)

    Geil, Paul M.; Mutch, Simon J.; Poole, Gregory B.; Angel, Paul W.; Duffy, Alan R.; Mesinger, Andrei; Wyithe, J. Stuart B.

    2016-10-01

    We use the Dark-ages, Reionization And Galaxy formation Observables from Numerical Simulations (DRAGONS) framework to investigate the effect of galaxy formation physics on the morphology and statistics of ionized hydrogen (H II) regions during the Epoch of Reionization (EoR). DRAGONS self-consistently couples a semi-analytic galaxy formation model with the inhomogeneous ionizing UV background, and can therefore be used to study the dependence of morphology and statistics of reionization on feedback phenomena of the ionizing source galaxy population. Changes in galaxy formation physics modify the sizes of H II regions and the amplitude and shape of 21-cm power spectra. Of the galaxy physics investigated, we find that supernova feedback plays the most important role in reionization, with H II regions up to ≈20 per cent smaller and a fractional difference in the amplitude of power spectra of up to ≈17 per cent at fixed ionized fraction in the absence of this feedback. We compare our galaxy formation-based reionization models with past calculations that assume constant stellar-to-halo mass ratios and find that with the correct choice of minimum halo mass, such models can mimic the predicted reionization morphology. Reionization morphology at fixed neutral fraction is therefore not uniquely determined by the details of galaxy formation, but is sensitive to the mass of the haloes hosting the bulk of the ionizing sources. Simple EoR parametrizations are therefore accurate predictors of reionization statistics. However, a complete understanding of reionization using future 21-cm observations will require interpretation with realistic galaxy formation models, in combination with other observations.
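
    As a minimal illustration of the summary statistic involved, the sketch below computes a spherically averaged power spectrum of a 3-D box by binning the FFT in shells of constant wavenumber; the field is random noise and the normalisation is schematic, not the convention of any particular 21-cm pipeline.

        import numpy as np

        def spherical_power_spectrum(field, box_size, n_bins=12):
            """Spherically averaged P(k) of a cubic field (toy normalisation)."""
            n = field.shape[0]
            delta = field - field.mean()
            power3d = np.abs(np.fft.fftn(delta)) ** 2 / field.size
            k1d = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)
            kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
            kmag = np.sqrt(kx ** 2 + ky ** 2 + kz ** 2).ravel()
            pk = power3d.ravel()
            bins = np.linspace(kmag[kmag > 0].min(), kmag.max(), n_bins + 1)
            idx = np.digitize(kmag, bins)
            centres = 0.5 * (bins[1:] + bins[:-1])
            means = np.array([pk[idx == i].mean() if np.any(idx == i) else np.nan
                              for i in range(1, n_bins + 1)])
            return centres, means

        rng = np.random.default_rng(16)
        box = rng.normal(0, 1, (32, 32, 32))   # stand-in for a brightness-temperature box
        k, p = spherical_power_spectrum(box, box_size=100.0)  # box size is illustrative
        print(np.round(k, 3))
        print(np.round(p, 3))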

  14. Statistically significant faunal differences among Middle Ordovician age, Chickamauga Group bryozoan bioherms, central Alabama

    SciTech Connect

    Crow, C.J.

    1985-01-01

    Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.
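
    The G-test used here is available through scipy's chi-square contingency machinery with the log-likelihood statistic selected; the counts below are invented stand-ins for per-bioherm faunal tallies.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Hypothetical counts of organism groups (columns) in three bioherms (rows).
        counts = np.array([[120, 45, 30,  5],
                           [ 90, 60, 10, 15],
                           [140, 30, 25, 10]])

        # lambda_="log-likelihood" selects the G statistic rather than Pearson's X^2.
        g, p, dof, expected = chi2_contingency(counts, lambda_="log-likelihood")
        print(f"G = {g:.1f}, dof = {dof}, p = {p:.3g}")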

  15. Statistical analysis of heartbeat data with wavelet techniques

    NASA Astrophysics Data System (ADS)

    Pazsit, Imre

    2004-05-01

    The purpose of this paper is to demonstrate the use of some methods of signal analysis, performed on ECG and in some cases blood pressure signals, for the classification of the health status of the heart of mice and rats. Spectral and wavelet analysis were performed on the raw signals. FFT-based coherence and phase were also calculated between blood pressure and raw ECG signals. Finally, RR-intervals were deduced from the ECG signals and an analysis of the fractal dimensions was performed. A correlation was found between the health status of the mice and the rats and some of the statistical descriptors, most notably the phase of the cross-spectra between ECG and blood pressure, and the fractal properties and dimensions of the interbeat series (RR-interval fluctuations).
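
    The coherence-and-phase step can be sketched with scipy's spectral estimators on synthetic signals; the sampling rate, frequencies, and lag are invented, and real heartbeat recordings would replace them.

        import numpy as np
        from scipy import signal

        rng = np.random.default_rng(15)
        fs = 250.0                                 # sampling rate in Hz (illustrative)
        t = np.arange(0, 60, 1 / fs)
        ecg = np.sin(2 * np.pi * 6.0 * t) + rng.normal(0, 0.5, t.size)
        bp = np.sin(2 * np.pi * 6.0 * t - 0.8) + rng.normal(0, 0.5, t.size)  # lagged copy

        f, coh = signal.coherence(ecg, bp, fs=fs, nperseg=1024)
        _, pxy = signal.csd(ecg, bp, fs=fs, nperseg=1024)
        band = (f > 5.0) & (f < 7.0)
        print(f"peak coherence near 6 Hz: {coh[band].max():.2f}")
        print(f"mean cross-spectral phase near 6 Hz: {np.angle(pxy)[band].mean():.2f} rad")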

  16. Bayesian statistical analysis of protein side-chain rotamer preferences.

    PubMed Central

    Dunbrack, R. L.; Cohen, F. E.

    1997-01-01

    We present a Bayesian statistical analysis of the conformations of side chains in proteins from the Protein Data Bank. This is an extension of the backbone-dependent rotamer library, and includes rotamer populations and average chi angles for a full range of phi, psi values. The Bayesian analysis used here provides a rigorous statistical method for taking account of varying amounts of data. Bayesian statistics requires the assumption of a prior distribution for parameters over their range of possible values. This prior distribution can be derived from previous data or from pooling some of the present data. The prior distribution is combined with the data to form the posterior distribution, which is a compromise between the prior distribution and the data. For the chi 2, chi 3, and chi 4 rotamer prior distributions, we assume that the probability of each rotamer type is dependent only on the previous chi rotamer in the chain. For the backbone-dependence of the chi 1 rotamers, we derive prior distributions from the product of the phi-dependent and psi-dependent probabilities. Molecular mechanics calculations with the CHARMM22 potential show a strong similarity with the experimental distributions, indicating that proteins attain their lowest energy rotamers with respect to local backbone-side-chain interactions. The new library is suitable for use in homology modeling, protein folding simulations, and the refinement of X-ray and NMR structures. PMID:9260279
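
    The conjugate update at the heart of this scheme fits in a few lines: with a Dirichlet prior over rotamer probabilities, the posterior is again Dirichlet with the data counts added, so sparsely observed bins shrink toward the prior. Both sets of numbers below are illustrative.

        import numpy as np

        prior_pseudocounts = np.array([60.0, 25.0, 15.0])  # e.g. from pooled backbone bins
        observed_counts = np.array([4, 9, 1])              # sparse data in one (phi, psi) bin

        posterior = prior_pseudocounts + observed_counts   # Dirichlet(prior + data)
        posterior_mean = posterior / posterior.sum()       # compromise of prior and data
        print(np.round(posterior_mean, 3))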

  17. Self-Contained Statistical Analysis of Gene Sets

    PubMed Central

    Cannon, Judy L.; Ricoy, Ulises M.; Johnson, Christopher

    2016-01-01

    Microarrays are a powerful tool for studying differential gene expression. However, lists of many differentially expressed genes are often generated, and unraveling meaningful biological processes from the lists can be challenging. For this reason, investigators have sought to quantify the statistical probability of compiled gene sets rather than individual genes. The gene sets typically are organized around a biological theme or pathway. We compute correlations between different gene set tests and elect to use Fisher’s self-contained method for gene set analysis. We improve Fisher’s differential expression analysis of a gene set by limiting the p-value of an individual gene within the gene set to prevent a small percentage of genes from determining the statistical significance of the entire set. In addition, we also compute dependencies among genes within the set to determine which genes are statistically linked. The method is applied to T-ALL (T-lineage Acute Lymphoblastic Leukemia) to identify differentially expressed gene sets between T-ALL and normal patients and T-ALL and AML (Acute Myeloid Leukemia) patients. PMID:27711232
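
    A sketch of Fisher's combined statistic with the per-gene p-value floor described above; note that flooring perturbs the null distribution, so the chi-square reference used here is only approximate and the paper's calibration may differ.

        import numpy as np
        from scipy.stats import chi2

        def fisher_gene_set(pvalues, floor=1e-4):
            """Fisher's self-contained gene-set p-value with a per-gene floor,
            so one extreme gene cannot dominate the whole set."""
            p = np.maximum(np.asarray(pvalues, dtype=float), floor)
            stat = -2.0 * np.log(p).sum()
            return chi2.sf(stat, df=2 * p.size)   # approximate under the floor

        # Per-gene p-values for one hypothetical pathway; the third gene is floored.
        print(fisher_gene_set([0.03, 0.20, 1e-9, 0.07, 0.55]))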

  20. Agriculture, population growth, and statistical analysis of the radiocarbon record

    PubMed Central

    Zahid, H. Jabran; Robinson, Erick; Kelly, Robert L.

    2016-01-01

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide. PMID:26699457
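
    One common way to extract such a long-term rate is a log-linear fit to a summed probability distribution (SPD) of calibrated dates. A sketch under that assumption, with a synthetic SPD built to grow at 0.04% per year (the paper's actual pipeline is more involved):

      import numpy as np

      years = np.arange(-10000, -2000, 10)              # calendar years
      true_rate = 0.0004                                # 0.04% per year
      spd = np.exp(true_rate * (years - years[0]))
      spd *= np.exp(0.1 * np.random.randn(years.size))  # multiplicative noise

      slope, intercept = np.polyfit(years, np.log(spd), 1)
      print(f"estimated growth rate: {100 * slope:.3f}% per year")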

  1. Agriculture, population growth, and statistical analysis of the radiocarbon record.

    PubMed

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L

    2016-01-26

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.

  3. Statistical strategies to reveal potential vibrational markers for in vivo analysis by confocal Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Oliveira Mendes, Thiago de; Pinto, Liliane Pereira; Santos, Laurita dos; Tippavajhala, Vamshi Krishna; Téllez Soto, Claudio Alberto; Martin, Airton Abrahão

    2016-07-01

    The analysis of biological systems by spectroscopic techniques involves the evaluation of hundreds to thousands of variables. Hence, different statistical approaches are used to identify spectral regions that discriminate classes of samples and to propose new vibrational markers for explaining various phenomena such as disease monitoring, mechanisms of drug action, food properties, and so on. However, the underlying statistical techniques are not always discussed in detail in the applied sciences. In this context, this work presents a detailed discussion of the various steps necessary for a proper statistical analysis, including univariate parametric and nonparametric tests as well as multivariate unsupervised and supervised approaches. The main objective is to promote a proper understanding of the application of these statistical tools in spectroscopic methods used for the analysis of biological samples. The discussion is based on a set of in vivo confocal Raman spectra of human skin, analyzed with the aim of identifying skin aging markers. In the Appendix, a complete data-analysis routine is implemented in free software that can be used by the scientific community involved in these studies.

  4. Spatial statistical analysis of tree deaths using airborne digital imagery

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Mei; Baddeley, Adrian; Wallace, Jeremy; Canci, Michael

    2013-04-01

    High resolution digital airborne imagery offers unprecedented opportunities for observation and monitoring of vegetation, providing the potential to identify, locate and track individual vegetation objects over time. Analytical tools are required to quantify relevant information. In this paper, locations of trees over a large area of native woodland vegetation were identified using morphological image analysis techniques. Methods of spatial point process statistics were then applied to estimate the spatially-varying tree death risk, and to show that it is significantly non-uniform. [Tree deaths over the area were detected in our previous work (Wallace et al., 2008).] The study area is a major source of ground water for the city of Perth, and the work was motivated by the need to understand and quantify vegetation changes in the context of water extraction and drying climate. The influence of hydrological variables on tree death risk was investigated using spatial statistics (graphical exploratory methods, spatial point pattern modelling and diagnostics).

  5. [Statistical analysis of DNA sequences nearby splicing sites].

    PubMed

    Korzinov, O M; Astakhova, T V; Vlasov, P K; Roĭtberg, M A

    2008-01-01

    Recognition of coding regions within eukaryotic genomes is one of the oldest, and still unsolved, problems in bioinformatics. New high-accuracy methods of splice site recognition are needed to solve it. A question of current interest is to identify specific features of nucleotide sequences near splicing sites and to recognize sites in their sequence context. We performed a statistical analysis of a database of human gene fragments and revealed some characteristics of the nucleotide sequences in the neighborhood of splicing sites. Frequencies of all nucleotides and dinucleotides in the splicing site environment were computed, and nucleotides and dinucleotides with extremely high/low occurrences were identified. The statistical information obtained in this work can be used in the further development of methods for splicing site annotation and exon-intron structure recognition.
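
    Counting nucleotide and dinucleotide frequencies in windows around splice sites is simple to sketch; the sequences and window size below are toy assumptions:

      from collections import Counter

      donor_windows = [            # hypothetical 10-nt windows at 5' splice sites
          "AAGGTAAGTC",
          "CAGGTGAGTA",
          "AAGGTAAGAA",
      ]

      nuc = Counter()
      dinuc = Counter()
      for w in donor_windows:
          nuc.update(w)
          dinuc.update(w[i:i + 2] for i in range(len(w) - 1))

      total = sum(nuc.values())
      for base, n in sorted(nuc.items()):
          print(base, n / total)
      print("most common dinucleotides:", dinuc.most_common(3))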

  6. Analysis of the Spatial Organization of Molecules with Robust Statistics

    PubMed Central

    Lagache, Thibault; Lang, Gabriel; Sauvonnet, Nathalie; Olivo-Marin, Jean-Christophe

    2013-01-01

    One major question in molecular biology is whether the spatial distribution of observed molecules is random or organized in clusters. Indeed, this analysis gives information about molecules’ interactions and physical interplay with their environment. The standard tool for analyzing molecules’ distribution statistically is the Ripley’s K function, which tests spatial randomness through the computation of its critical quantiles. However, quantiles’ computation is very cumbersome, hindering its use. Here, we present an analytical expression of these quantiles, leading to a fast and robust statistical test, and we derive the characteristic clusters’ size from the maxima of the Ripley’s K function. Subsequently, we analyze the spatial organization of endocytic spots at the cell membrane and we report that clathrin spots are randomly distributed while clathrin-independent spots are organized in clusters with a radius of , which suggests distinct physical mechanisms and cellular functions for each pathway. PMID:24349021
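
    A minimal sketch of the K-function idea on synthetic points, using the naive estimator without the edge corrections a production analysis would need; under complete spatial randomness K(r) ≈ πr²:

      import numpy as np
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(0)
      pts = rng.random((200, 2))             # n points in a unit-square window
      n, area = len(pts), 1.0

      d = pdist(pts)                         # pairwise distances
      for r in np.linspace(0.03, 0.15, 5):
          k_hat = area * 2 * np.sum(d <= r) / n**2   # factor 2: pdist counts pairs once
          print(f"r={r:.2f}  K_hat={k_hat:.4f}  CSR pi*r^2={np.pi * r**2:.4f}")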

  7. Statistical analysis of subjective preferences for video enhancement

    NASA Astrophysics Data System (ADS)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
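
    A sketch of the logistic-regression route to a Thurstone-like scale, on synthetic pairwise preferences for four hypothetical enhancement levels (signed design matrix: +1 for the shown item, -1 for its competitor):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      true_scale = np.array([0.0, 0.5, 1.3, 2.0])    # latent quality of 4 levels
      X, y = [], []
      for _ in range(2000):
          a, b = rng.choice(4, size=2, replace=False)
          row = np.zeros(4)
          row[a], row[b] = 1.0, -1.0                 # A compared against B
          p = 1 / (1 + np.exp(-(true_scale[a] - true_scale[b])))
          X.append(row)
          y.append(rng.random() < p)                 # 1 = A preferred

      model = LogisticRegression(fit_intercept=False, C=1e6, max_iter=1000)
      model.fit(np.array(X), np.array(y))
      print(model.coef_[0] - model.coef_[0][0])      # estimated scale, anchored at level 0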

  8. Noise removing in encrypted color images by statistical analysis

    NASA Astrophysics Data System (ADS)

    Islam, N.; Puech, W.

    2012-03-01

    Cryptographic techniques are used to secure confidential data from unauthorized access, but these techniques are very sensitive to noise. A single bit change in encrypted data can have a catastrophic impact on the decrypted data. This paper addresses the problem of removing bit errors in visual data encrypted using the AES algorithm in CBC mode. In order to remove the noise, a method is proposed which is based on the statistical analysis of each block during decryption. The proposed method exploits local statistics of the visual data and the confusion/diffusion properties of the encryption algorithm to remove the errors. Experimental results show that the proposed method can be used at the receiving end as a possible solution for noise removal in visual data in the encrypted domain.

  9. Statistical methods for the detection and analysis of radioactive sources

    NASA Astrophysics Data System (ADS)

    Klumpp, John

    We consider four topics from areas of radioactive statistical analysis in the present study: Bayesian methods for the analysis of count rate data, analysis of energy data, a model for non-constant background count rate distributions, and a zero-inflated model of the sample count rate. The study begins with a review of Bayesian statistics and techniques for analyzing count rate data. Next, we consider a novel system for incorporating energy information into count rate measurements which searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data in real time to sequentially update a probability distribution for the sample count rate. We then consider a "moving target" model of background radiation in which the instantaneous background count rate is a function of time, rather than being fixed. Unlike the sequential update system, this model assumes a large body of pre-existing data which can be analyzed retrospectively. Finally, we propose a novel Bayesian technique which allows for simultaneous source detection and count rate analysis. This technique is fully compatible with, but independent of, the sequential update system and moving target model.
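
    The sequential count-rate update has a classic conjugate analogue. A sketch assuming a gamma prior on the rate and Poisson counts, standing in for (not reproducing) the time-interval system described above:

      import numpy as np
      from scipy import stats

      alpha, beta = 1.0, 1.0             # gamma prior on rate (1/s): shape, rate
      for counts, dwell in [(3, 1.0), (5, 1.0), (12, 1.0)]:  # successive measurements
          alpha += counts                # conjugate update per interval
          beta += dwell
          post = stats.gamma(a=alpha, scale=1 / beta)
          print(f"posterior mean rate = {post.mean():.2f}/s, "
                f"95% interval = {post.interval(0.95)}")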

  10. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  11. Statistical analysis of pitting corrosion in condenser tubes

    SciTech Connect

    Ault, J.P.; Gehring, G.A. Jr.

    1997-12-31

    Condenser tube failure via wall penetration allows cooling water to contaminate the working fluid (steam). Contamination, especially from brackish or saltwater, will lower steam quality and thus lower overall plant efficiency. Because of the importance of minimizing leakages, power plant engineers are primarily concerned with the maximum localized corrosion in a unit rather than average corrosion values or rates. Extreme value analysis is a useful tool for evaluating the condition of condenser tubing. Extreme value statistical techniques allow the prediction of the most probable deepest pit in a given surface area based upon data acquired from a smaller surface area. Data is gathered from a physical examination of actual tubes (either in-service or from a sidestream unit) rather than small sample coupons. Three distinct applications of extreme value statistics to condenser tube evaluation are presented in this paper: (1) condition assessment of an operating condenser, (2) design data for material selection, and (3) research tool for assessing impact of various factors on condenser tube corrosion. The projections for operating units based on extreme value analysis are shown to be more useful than those made on the basis of other techniques such as eddy current or electrochemical measurements. Extreme value analysis would benefit from advances in two key areas: (1) development of an accurate and economical method for the measurement of maximum pit depths of condenser tubes in-situ would enhance the application of extreme value statistical analysis to the assessment of condenser tubing corrosion pitting and (2) development of methodologies to predict pit depth-time relationship in addition to pit depth-area relationship would be useful for modeling purposes.
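
    A sketch of the extreme-value extrapolation, assuming a Gumbel model for per-area maximum pit depths and invented data; the most probable deepest pit over a T-times larger area shifts by scale·ln(T):

      import numpy as np
      from scipy import stats

      max_depths = np.array([0.31, 0.42, 0.28, 0.36, 0.45,
                             0.33, 0.39, 0.41, 0.30, 0.37])  # mm, one max per unit area
      mu, sigma = stats.gumbel_r.fit(max_depths)             # location, scale

      T = 500.0                            # target area / inspected unit area
      deepest = mu + sigma * np.log(T)     # most probable deepest pit
      print(f"predicted deepest pit over {T:.0f}x area: {deepest:.2f} mm")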

  12. On Statistical Analysis of Neuroimages with Imperfect Registration

    PubMed Central

    Kim, Won Hwa; Ravi, Sathya N.; Johnson, Sterling C.; Okonkwo, Ozioma C.; Singh, Vikas

    2016-01-01

    A variety of studies in neuroscience/neuroimaging seek to perform statistical inference on the acquired brain image scans for diagnosis as well as understanding the pathological manifestation of diseases. To do so, an important first step is to register (or co-register) all of the image data into a common coordinate system. This permits meaningful comparison of the intensities at each voxel across groups (e.g., diseased versus healthy) to evaluate the effects of the disease and/or use machine learning algorithms in a subsequent step. But errors in the underlying registration make this problematic, they either decrease the statistical power or make the follow-up inference tasks less effective/accurate. In this paper, we derive a novel algorithm which offers immunity to local errors in the underlying deformation field obtained from registration procedures. By deriving a deformation invariant representation of the image, the downstream analysis can be made more robust as if one had access to a (hypothetical) far superior registration procedure. Our algorithm is based on recent work on scattering transform. Using this as a starting point, we show how results from harmonic analysis (especially, non-Euclidean wavelets) yields strategies for designing deformation and additive noise invariant representations of large 3-D brain image volumes. We present a set of results on synthetic and real brain images where we achieve robust statistical analysis even in the presence of substantial deformation errors; here, standard analysis procedures significantly under-perform and fail to identify the true signal. PMID:27042168

  13. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    SciTech Connect

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
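
    The UCL95% computation described in both tank reports is the standard one-sided t-interval; a sketch with invented concentrations:

      import numpy as np
      from scipy import stats

      x = np.array([12.1, 10.8, 13.4, 11.7, 12.9, 11.2])  # six sample results
      n, mean, sd = len(x), x.mean(), x.std(ddof=1)
      t95 = stats.t.ppf(0.95, df=n - 1)
      ucl95 = mean + t95 * sd / np.sqrt(n)
      print(f"UCL95% = {ucl95:.2f}")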

  14. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    SciTech Connect

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  15. Midface anatomy, aging, and aesthetic analysis.

    PubMed

    Levesque, Andre Yuan; de la Torre, Jorge I

    2015-05-01

    This article reviews the key anatomic structures in the region of the midface, including important surface and bony landmarks, innervation, blood supply, muscle layers, and fat compartments. It also discusses changes in these structures related to the aging process and aesthetic analysis of the midface to aid with operative planning. PMID:25921564

  16. Statistical energy analysis of complex structures, phase 2

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1980-01-01

    A method for estimating the structural vibration properties of complex systems in high frequency environments was investigated. The structure analyzed was the Materials Experiment Assembly (MEA), which is a portion of the OST-2A payload for the space transportation system. Statistical energy analysis (SEA) techniques were used to model the structure and predict the structural element response to acoustic excitation. A comparison of the initial response predictions and measured acoustic test data is presented. The conclusions indicate that: the SEA predicted the response of primary structure to acoustic excitation over a wide range of frequencies; and the contribution of mechanically induced random vibration to the total MEA response is not significant.

  17. Statistical Analysis of Strength Data for an Aerospace Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Neergaard, L.; Malone, T.

    2001-01-01

    Aerospace vehicles are produced in limited quantities that do not always allow development of MIL-HDBK-5 A-basis design allowables. One method of examining production and composition variations is to perform 100% lot acceptance testing for aerospace Aluminum (Al) alloys. This paper discusses statistical trends seen in strength data for one Al alloy. A four-step approach reduced the data to residuals, visualized residuals as a function of time, grouped data with quantified scatter, and conducted analysis of variance (ANOVA).
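
    The ANOVA step of the four-step approach can be sketched directly: compare strength residuals grouped by production lot (synthetic residuals; the grouping variable is an assumption):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      lot_a = rng.normal(0.0, 1.0, 30)   # residuals (measured - predicted)
      lot_b = rng.normal(0.4, 1.0, 30)
      lot_c = rng.normal(-0.2, 1.0, 30)

      f_stat, p_value = stats.f_oneway(lot_a, lot_b, lot_c)
      print(f"F = {f_stat:.2f}, p = {p_value:.3f}")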

  18. Statistical design and analysis of RNA sequencing data.

    PubMed

    Auer, Paul L; Doerge, R W

    2010-06-01

    Next-generation sequencing technologies are quickly becoming the preferred approach for characterizing and quantifying entire genomes. Even though data produced from these technologies are proving to be the most informative of any thus far, very little attention has been paid to fundamental design aspects of data collection and analysis, namely sampling, randomization, replication, and blocking. We discuss these concepts in an RNA sequencing framework. Using simulations we demonstrate the benefits of collecting replicated RNA sequencing data according to well known statistical designs that partition the sources of biological and technical variation. Examples of these designs and their corresponding models are presented with the goal of testing differential expression.

  19. Statistical Analysis in Genetic Studies of Mental Illnesses

    PubMed Central

    Zhang, Heping

    2011-01-01

    Identifying the risk factors for mental illnesses is of significant public health importance. Diagnosis, stigma associated with mental illnesses, comorbidity, and complex etiologies, among others, make it very challenging to study mental disorders. Genetic studies of mental illnesses date back at least a century, beginning with descriptive studies based on Mendelian laws of inheritance. A variety of study designs including twin studies, family studies, linkage analysis, and more recently, genomewide association studies have been employed to study the genetics of mental illnesses, or complex diseases in general. In this paper, I will present the challenges and methods from a statistical perspective and focus on genetic association studies. PMID:21909187

  20. Statistical Analysis of Strength Data for an Aerospace Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Neergaard, Lynn; Malone, Tina; Gentz, Steven J. (Technical Monitor)

    2000-01-01

    Aerospace vehicles are produced in limited quantities that do not always allow development of MIL-HDBK-5 A-basis design allowables. One method of examining production and composition variations is to perform 100% lot acceptance testing for aerospace Aluminum (Al) alloys. This paper discusses statistical trends seen in strength data for one Al alloy. A four-step approach reduced the data to residuals, visualized residuals as a function of time, grouped data with quantified scatter, and conducted analysis of variance (ANOVA).

  1. Multi-scale statistical analysis of coronal solar activity

    DOE PAGES

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-08

    Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.
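
    POD of an image time series is conveniently computed with the SVD. A sketch on random stand-in maps (the coronal temperature maps themselves are not reproduced here):

      import numpy as np

      rng = np.random.default_rng(3)
      nt, ny, nx = 40, 32, 32
      maps = rng.standard_normal((nt, ny, nx))   # time series of 2-D maps

      X = maps.reshape(nt, -1)
      X -= X.mean(axis=0)                        # remove the mean field
      U, S, Vt = np.linalg.svd(X, full_matrices=False)

      energy = S**2 / np.sum(S**2)               # fraction captured per mode
      modes = Vt.reshape(-1, ny, nx)             # spatial POD modes
      print("leading-mode energy fractions:", energy[:5].round(3))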

  2. Omics integrating physical techniques: aged Piedmontese meat analysis.

    PubMed

    Lana, Alessandro; Longo, Valentina; Dalmasso, Alessandra; D'Alessandro, Angelo; Bottero, Maria Teresa; Zolla, Lello

    2015-04-01

    The tenderness of Piedmontese meat increases as the post-slaughter ageing period is extended up to 44 days. Classical physical analyses only partially explain this evidence, so in order to discover the reasons for the potential beneficial effects of prolonged ageing, we performed omics analyses of the Longissimus thoracis muscle, examining the main biochemical changes through mass spectrometry-based metabolomics and proteomics. We observed a progressive decline in myofibrillar structural integrity (underpinning meat tenderness) and impaired energy metabolism. Markers of autophagic responses (e.g. serine and glutathione metabolism) and nitrogen metabolism (urea cycle intermediates) accumulated until the end of the assayed period. Key metabolites such as glutamate, a mediator of the appreciated umami taste of the meat, were found to accumulate steadily until day 44. Finally, statistical analyses revealed that glutamate, serine and arginine could serve as good predictors of ultimate meat quality parameters, although further studies are required.

  3. Predicting typology of landslide occurrences by statistical GIS analysis

    NASA Astrophysics Data System (ADS)

    Mancini, Francesco; Ceppi, Claudia; Ritrovato, Giuliano

    2010-05-01

    This study aims at landslide susceptibility mapping by multivariate statistical methods, with the additional capability of distinguishing among typologies of landslide occurrence. The methodology is tested in a hilly area of the Daunia region (Apulia, southern Italy), where small settlements are historically threatened by landslide phenomena. In the multivariate statistical analysis, all variables were managed in a GIS together with the landslide inventory, whose geometric and descriptive properties were implemented in a suitable data structure so that the independent variables could be referred to the typology of each landslide occurrence. The independent variables selected as possible triggering factors of slope instability phenomena are: elevation, slope, aspect, planform and profile curvature, drained area, lithology, land use, and distance from the road and river networks. The implementation of the landslide inventory was more demanding than in a usual multivariate analysis, such as multiple regression analysis, where the simple presence/absence of occurrences is required. According to the classification proposed by Cruden and Varnes, three main landslide typologies were included in the inventory after recognition by geomorphological survey: a) intermediate to deep-seated compound landslides with failure surface depth > 30 m; b) mudslides with shallow to intermediate sliding surface depth; c) deep-seated to intermediate rotational landslides with sliding surface depth < 30 m. The inventory implementation constitutes a significant effort, supported through several areas of expertise by the project "Landslide risk assessment for the planning of small urban settlements within chain areas: the case of Daunia". The outcomes of the analysis provide the proneness to landslide, as a predicted level of probability, additionally taking into account the failure mechanism recorded in the landslide inventory. A map of landslide susceptibility along

  4. Estimating the age structure of a buried adult population: a new statistical approach applied to archaeological digs in France.

    PubMed

    Séguy, Isabelle; Caussinus, Henri; Courgeau, Daniel; Buchet, Luc

    2013-02-01

    Paleodemographers have developed several methods for estimating the age structure of historical populations in the absence of civil registration data. Starting from biological indicators alone, they use a reference population of known sex and age to assess the conditional distribution of the biological indicator given age. However, the small amount of data available and the unstable nature of the related statistical problem mean that most methods are disappointing. Using the most reliable reference data possible, we propose a simple statistical method, integrating the maximum amount of information included in the actual data, which quite significantly improves age estimates for a buried population. Here the method is applied to a French cemetery used from Late Antiquity to the end of the Early Middle Ages.

  5. Statistical characterization of life drivers for a probabilistic design analysis

    NASA Technical Reports Server (NTRS)

    Fox, Eric P.; Safie, Fayssal

    1992-01-01

    This paper discusses the issue of statistical characterization of life drivers for a probabilistic design analysis (PDA) approach to support the conventional deterministic structural design methods that are currently used. The probabilistic approach takes into consideration the modeling inadequacies and uncertainties in many design variables such as loads, environments, and material properties. The importance of the distributional assumption is motivated by illustrating an example where the results differ substantially due to the distribution selected. Different types of distributions are discussed and techniques for estimating the parameters are given. Given this information, procedures are outlined for selecting the appropriate distribution based on the particular type of variable (i.e., dimensional, performance) as well as the information that is available (i.e., test data, engineering analysis). Finally, techniques are given for generating random numbers from these selected distributions within the PDA process.

  6. Detection of bearing damage by statistic vibration analysis

    NASA Astrophysics Data System (ADS)

    Sikora, E. A.

    2016-04-01

    The condition of bearings, which are essential components in mechanisms, is crucial to safety. Analysis of the bearing vibration signal, which is always contaminated by certain types of noise, is very important for diagnosing the mechanical condition of the bearing and the nature of a failure. In this paper, a method of rolling-bearing fault detection by statistical analysis of vibration is proposed, which filters out the Gaussian noise contained in a raw vibration signal. The results of experiments show that the vibration signal can be significantly enhanced by application of the proposed method. The proposed method is also used to analyse real acoustic signals of bearings with inner-race and outer-race faults, respectively. The values of the attributes are determined according to the degree of the fault. The results confirm that the periods between the transients, which represent bearing fault characteristics, can be successfully detected.
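
    Simple statistical indicators make the idea concrete: impulsive bearing faults raise the kurtosis and crest factor of the vibration signal well above their Gaussian baselines. A sketch on a synthetic signal:

      import numpy as np
      from scipy import stats

      fs = 20000
      t = np.arange(0, 1, 1 / fs)
      sig = 0.5 * np.random.randn(t.size)        # Gaussian background
      sig[::1700] += 6.0                         # periodic fault transients

      kurt = stats.kurtosis(sig, fisher=False)   # ~3 for a healthy/Gaussian signal
      crest = np.max(np.abs(sig)) / np.sqrt(np.mean(sig**2))
      print(f"kurtosis = {kurt:.1f}, crest factor = {crest:.1f}")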

  7. Vibroacoustic optimization using a statistical energy analysis model

    NASA Astrophysics Data System (ADS)

    Culla, Antonio; D'Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia

    2016-08-01

    In this paper, an optimization technique for medium-high frequency dynamic problems based on Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILF) and coupling loss factors (CLF), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to CLF's is performed to select CLF's that are most effective on subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLF's, injected power and physical parameters are derived. The approach is applied on a typical aeronautical structure: the cabin of a helicopter.

  8. Statistical learning analysis in neuroscience: aiming for transparency.

    PubMed

    Hanke, Michael; Halchenko, Yaroslav O; Haxby, James V; Pollmann, Stefan

    2010-01-01

    Encouraged by a rise of reciprocal interest between the machine learning and neuroscience communities, several recent studies have demonstrated the explanatory power of statistical learning techniques for the analysis of neural data. In order to facilitate a wider adoption of these methods, neuroscientific research needs to ensure a maximum of transparency to allow for comprehensive evaluation of the employed procedures. We argue that such transparency requires "neuroscience-aware" technology for the performance of multivariate pattern analyses of neural data that can be documented in a comprehensive, yet comprehensible way. Recently, we introduced PyMVPA, a specialized Python framework for machine learning based data analysis that addresses this demand. Here, we review its features and applicability to various neural data modalities. PMID:20582270

  9. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Pia, Maria Grazia; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  10. Statistical analysis of cascading failures in power grids

    SciTech Connect

    Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin

    2010-12-01

    We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete-time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between the average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.

  11. Design and statistical analysis of oral medicine studies: common pitfalls.

    PubMed

    Baccaglini, L; Shuster, J J; Cheng, J; Theriaque, D W; Schoenbach, V J; Tomar, S L; Poole, C

    2010-04-01

    A growing number of articles are emerging in the medical and statistics literature that describe epidemiologic and statistical flaws of research studies. Many examples of these deficiencies are encountered in the oral, craniofacial, and dental literature. However, only a handful of methodologic articles have been published in the oral literature warning investigators of potential errors that may arise early in the study and that can irreparably bias the final results. In this study, we briefly review some of the most common pitfalls that our team of epidemiologists and statisticians has identified during the review of submitted or published manuscripts and research grant applications. We use practical examples from the oral medicine and dental literature to illustrate potential shortcomings in the design and analysis of research studies, and how these deficiencies may affect the results and their interpretation. A good study design is essential, because errors in the analysis can be corrected if the design was sound, but flaws in study design can lead to data that are not salvageable. We recommend consultation with an epidemiologist or a statistician during the planning phase of a research study to optimize study efficiency, minimize potential sources of bias, and document the analytic plan.

  12. Bayes Method Plant Aging Risk Analysis

    1992-03-13

    DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g. for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized aging models (i.e., possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities.
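
    A sketch of the model-probability update in the spirit of DORIAN, with invented aging models and failure counts (DORIAN's internals are not shown here):

      import numpy as np
      from scipy import stats

      years = np.arange(5)
      failures = np.array([1, 2, 2, 4, 6])       # observed failures per year

      def likelihood(rates):
          return np.prod(stats.poisson.pmf(failures, rates))

      models = {
          "no aging":     likelihood(np.full(5, 3.0)),
          "linear aging": likelihood(1.0 + 1.2 * years),
      }
      prior = {"no aging": 0.5, "linear aging": 0.5}
      evidence = sum(prior[m] * models[m] for m in models)
      for m in models:
          print(m, prior[m] * models[m] / evidence)   # posterior model probability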

  13. Neutral dynamics with environmental noise: Age-size statistics and species lifetimes

    NASA Astrophysics Data System (ADS)

    Kessler, David; Suweis, Samir; Formentin, Marco; Shnerb, Nadav M.

    2015-08-01

    Neutral dynamics, where taxa are assumed to be demographically equivalent and their abundance is governed solely by the stochasticity of the underlying birth-death process, has proved itself as an important minimal model that accounts for many empirical datasets in genetics and ecology. However, the restriction of the model to demographic [O(√N)] noise yields relatively slow dynamics that appears to be in conflict with both short-term and long-term characteristics of the observed systems. Here we analyze two of these problems—age-size relationships and species extinction time—in the framework of a neutral theory with both demographic and environmental stochasticity. It turns out that environmentally induced variations of the demographic rates control the long-term dynamics and modify dramatically the predictions of the neutral theory with demographic noise only, yielding much better agreement with empirical data. We consider two prototypes of "zero mean" environmental noise, one which is balanced with regard to the arithmetic abundance, another balanced in the logarithmic (fitness) space, study their species lifetime statistics, and discuss their relevance to realistic models of community dynamics.

  14. Statistical analysis of gait maturation in children based on probability density functions.

    PubMed

    Wu, Yunfeng; Zhong, Zhangting; Lu, Meng; He, Jia

    2011-01-01

    Analysis of gait patterns in children is useful for the study of maturation of locomotor control. In this paper, we utilized the Parzen-window method to estimate the probability density functions (PDFs) of the stride interval for 50 children. With the estimated PDFs, the statistical measures, i.e., averaged stride interval (ASI), variation of stride interval (VSI), PDF skewness (SK), and PDF kurtosis (KU), were computed for the gait maturation in three age groups (aged 3-5 years, 6-8 years, and 10-14 years) of young children. The results indicated that the ASI and VSI values are significantly different between the three age groups. The VSI is decreased rapidly until 8 years of age, and then continues to be decreased at a slower rate. The SK values of the PDFs for all of the three age groups are positive, which shows a slight imbalance in the stride interval distribution within each age group. In addition, the decrease of the KU values of the PDFs is age-dependent, which suggests the effects of the musculo-skeletal growth on the gait maturation in young children. PMID:22254641
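
    The four descriptors are direct to compute once the Parzen-window density is in hand. A sketch with synthetic stride intervals, using a Gaussian kernel density estimate:

      import numpy as np
      from scipy import stats

      strides = 1.05 + 0.04 * np.random.randn(400)  # stride intervals (s)

      pdf = stats.gaussian_kde(strides)             # Parzen-window estimate
      grid = np.linspace(0.9, 1.2, 200)
      density = pdf(grid)
      print("PDF peak near", grid[density.argmax()].round(3), "s")

      asi = strides.mean()                          # averaged stride interval
      vsi = strides.std(ddof=1)                     # variation of stride interval
      sk = stats.skew(strides)                      # PDF skewness
      ku = stats.kurtosis(strides)                  # PDF (excess) kurtosis
      print(asi, vsi, sk, ku)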

  15. The Statistical Analysis of stars with Hα emission in IC 348 Cluster

    NASA Astrophysics Data System (ADS)

    Nikoghosyan, E. H.; Vardanyan, A. V.; Khachatryan, K. G.

    2016-09-01

    In this work, the results of a statistical analysis of ˜200 stars with Hα emission in the IC 348 cluster are presented. The sample is complete up to R < 20.0. The percentage of emission stars increases from bright to fainter objects and reaches 80% in the range 13.0 ≤ R-AR ≤ 19.0. The proportions of WTTau and CTTau objects are 64% and 36%, respectively. 70% of the X-ray sources are WTTau stars. The ages of both WTTau and CTTau objects are ˜2·10^6 years. Non-emission stars with masses below solar have a similar age (˜2·10^6 years), whereas more massive non-emission objects are "older", with ages of ˜7·10^6 years.

  16. Statistical Analysis of Surface Water Quality Data of Eastern Massachusetts

    NASA Astrophysics Data System (ADS)

    Andronache, C.; Hon, R.; Tedder, N.; Xian, Q.; Schaudt, B.

    2008-05-01

    We present a characterization of current state of surface water, changes in time and dependence on land use, precipitation regime, and possible other natural and human influences based on data from the USGS National Water Quality Assessment (NAWQA) Program for New England streams. Time series analysis is used to detect changes and relationship with discharge and precipitation regime. Statistical techniques are employed to analyze relationships among multiple chemical variable monitored. Analysis of ion concentrations reveals information about possible natural sources and processes, and anthropogenic influences. A notable example is the increase in salt concentration in ground and surface waters, with impact on drinking water quality. Salt concentration increase in water can be linked to road salt usage during winters with heavy snowfall and other factors. Road salt enters water supplies by percolation through soil into groundwater or runoff and drainage into reservoirs. After entering fast-flowing streams, rivers and lakes, salt runoff concentrations are rapidly diluted. Road salt infiltration is more common for groundwater-based supplies, such as wells, springs, and reservoirs that are recharged mainly by groundwater. We use principal component analysis and other statistical procedures to obtain a description of the dominant independent variables that influence the observed chemical compositional range. In most cases, over 85 percent of the total variation can be explained by 3 to 4 components. The overwhelming variation is attributed to a large compositional range of Na and Cl seen even if all data are combined into a single dataset. Na versus Cl correlation coefficients are commonly greater than 0.9. Second components are typically associated with dilutions by overland flows (non winter months) and/or increased concentrations due to evaporation (summer season) or overland flows (winter season) if a snow storm is followed by the application of deicers on road
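
    A sketch of the principal component step, with synthetic ion concentrations in which a single latent driver (road salt) couples Na and Cl; the cumulative explained variance mirrors the "3 to 4 components" observation:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(5)
      salt = rng.lognormal(0.0, 0.5, 300)           # latent road-salt driver
      data = np.column_stack([
          35.0 * salt + rng.normal(0, 2, 300),      # Cl
          20.0 * salt + rng.normal(0, 2, 300),      # Na
          rng.normal(10, 1, 300),                   # Ca (independent)
          rng.normal(5, 1, 300),                    # SO4 (independent)
      ])

      pca = PCA().fit(StandardScaler().fit_transform(data))
      print(pca.explained_variance_ratio_.cumsum().round(3))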

  17. Statistical Scalability Analysis of Communication Operations in Distributed Applications

    SciTech Connect

    Vetter, J S; McCracken, M O

    2001-02-27

    Current trends in high performance computing suggest that users will soon have widespread access to clusters of multiprocessors with hundreds, if not thousands, of processors. This unprecedented degree of parallelism will undoubtedly expose scalability limitations in existing applications, where scalability is the ability of a parallel algorithm on a parallel architecture to effectively utilize an increasing number of processors. Users will need precise and automated techniques for detecting the cause of limited scalability. This paper addresses this dilemma. First, we argue that users face numerous challenges in understanding application scalability: managing substantial amounts of experiment data, extracting useful trends from this data, and reconciling performance information with their application's design. Second, we propose a solution to automate this data analysis problem by applying fundamental statistical techniques to scalability experiment data. Finally, we evaluate our operational prototype on several applications, and show that statistical techniques offer an effective strategy for assessing application scalability. In particular, we find that non-parametric correlation of the number of tasks to the ratio of the time for individual communication operations to overall communication time provides a reliable measure for identifying communication operations that scale poorly.
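
    The proposed measure reduces to a rank correlation. A sketch with invented scaling data, where a communication operation's share of total communication time grows with task count:

      from scipy import stats

      tasks = [16, 32, 64, 128, 256, 512]
      allreduce_share = [0.08, 0.11, 0.17, 0.24, 0.35, 0.48]  # fraction of comm time

      rho, p = stats.spearmanr(tasks, allreduce_share)
      print(f"rho = {rho:.2f}, p = {p:.3f}")   # rho near 1 flags poor scaling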

  18. Statistical analysis of the autoregressive modeling of reverberant speech.

    PubMed

    Gaubitch, Nikolay D; Ward, Darren B; Naylor, Patrick A

    2006-12-01

    Hands-free speech input is required in many modern telecommunication applications that employ autoregressive (AR) techniques such as linear predictive coding. When the hands-free input is obtained in enclosed reverberant spaces such as typical office rooms, the speech signal is distorted by the room transfer function. This paper utilizes theoretical results from statistical room acoustics to analyze the AR modeling of speech under these reverberant conditions. Three cases are considered: (i) AR coefficients calculated from a single observation; (ii) AR coefficients calculated jointly from an M-channel observation (M > 1); and (iii) AR coefficients calculated from the output of a delay-and-sum beamformer. The statistical analysis, with supporting simulations, shows that the spatial expectations of the AR coefficients for cases (i) and (ii) are approximately equal to those from the original speech, while for case (iii) there is a discrepancy due to spatial correlation between the microphones which can be significant. It is subsequently demonstrated that at each individual source-microphone position (without spatial expectation), the M-channel AR coefficients from case (ii) provide the best approximation to the clean speech coefficients when the microphones are closely spaced (< 0.3 m). PMID:17225429
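
    A generic single-channel version of case (i) is the Yule-Walker estimate of the AR coefficients from the autocorrelation sequence; the "speech" below is a synthetic colored-noise stand-in:

      import numpy as np
      from scipy.linalg import solve_toeplitz

      rng = np.random.default_rng(4)
      x = rng.standard_normal(8000)
      for _ in range(3):                      # crude coloring to mimic formants
          x = np.convolve(x, [1.0, 0.6], mode="same")

      p = 10                                  # AR model order
      r = np.correlate(x, x, mode="full")[x.size - 1:] / x.size
      a = solve_toeplitz(r[:p], r[1:p + 1])   # Yule-Walker equations: R a = r
      print("AR coefficients:", a.round(3))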

  19. Statistical Analysis of NAS Parallel Benchmarks and LINPACK Results

    NASA Technical Reports Server (NTRS)

    Meuer, Hans-Werner; Simon, Horst D.; Strohmeier, Erich; Lasinski, T. A. (Technical Monitor)

    1994-01-01

    In the last three years extensive performance data have been reported for parallel machines, both based on the NAS Parallel Benchmarks and on LINPACK. In this study we have used the reported benchmark results and performed a number of statistical experiments using factor, cluster, and regression analyses. In addition to the performance results of LINPACK and the eight NAS parallel benchmarks, we have also included the peak performance of the machine and the LINPACK n and n_1/2 values. Some of the results and observations can be summarized as follows: 1) All benchmarks are strongly correlated with peak performance. 2) LINPACK and EP each have a unique signature. 3) The remaining NPBs can be grouped into three groups as follows: (CG and IS), (LU and SP), and (MG, FT, and BT). Hence three (or four with EP) benchmarks are sufficient to characterize the overall NPB performance. Our poster presentation will follow a standard poster format and will present the data of our statistical analysis in detail.

  20. Constraining cosmology with shear peak statistics: tomographic analysis

    NASA Astrophysics Data System (ADS)

    Martinet, Nicolas; Bartlett, James G.; Kiessling, Alina; Sartoris, Barbara

    2015-09-01

    The abundance of peaks in weak gravitational lensing maps is a potentially powerful cosmological tool, complementary to measurements of the shear power spectrum. We study peaks detected directly in shear maps, rather than convergence maps, an approach that has the advantage of working directly with the observable quantity, the galaxy ellipticity catalog. Using large numbers of numerical simulations to accurately predict the abundance of peaks and their covariance, we quantify the cosmological constraints attainable by a large-area survey similar to that expected from the Euclid mission, focusing on the density parameter, Ωm, and on the power spectrum normalization, σ8, for illustration. We present a tomographic peak counting method that improves the conditional (marginal) constraints by a factor of 1.2 (2) over those from a two-dimensional (i.e., non-tomographic) peak-count analysis. We find that peak statistics provide constraints an order of magnitude less accurate than those from the cluster sample in the ideal situation of a perfectly known observable-mass relation; however, when the scaling relation is not known a priori, the shear-peak constraints are twice as strong and orthogonal to the cluster constraints, highlighting the value of using both clusters and shear-peak statistics.

  1. Statistical analysis of test data for APM rod issue

    SciTech Connect

    Edwards, T.B.; Harris, S.P.; Reeve, C.P.

    1992-05-01

    The uncertainty associated with the use of the K-Reactor axial power monitors (APMs) to measure roof-top-ratios is investigated in this report. Internal heating test data acquired under both DC-flow conditions and AC-flow conditions have been analyzed. These tests were conducted to simulate gamma heating at the lower power levels planned for reactor operation. The objective of this statistical analysis is to investigate the relationship between the observed and true roof-top-ratio (RTR) values and associated uncertainties at power levels within this lower operational range. Conditional on a given, known power level, a prediction interval for the true RTR value corresponding to a new, observed RTR is given. This is done for a range of power levels. Estimates of total system uncertainty are also determined by combining the analog-to-digital converter uncertainty with the results from the test data.

  2. Statistical models of video structure for content analysis and characterization.

    PubMed

    Vasconcelos, N; Lippman, A

    2000-01-01

    Content structure plays an important role in the understanding of video. In this paper, we argue that knowledge about structure can be used both as a means to improve the performance of content analysis and to extract features that convey semantic information about the content. We introduce statistical models for two important components of this structure, shot duration and activity, and demonstrate the usefulness of these models with two practical applications. First, we develop a Bayesian formulation for the shot segmentation problem that is shown to extend the standard thresholding model in an adaptive and intuitive way, leading to improved segmentation accuracy. Second, by applying the transformation into the shot duration/activity feature space to a database of movie clips, we also illustrate how the Bayesian model captures semantic properties of the content. We suggest ways in which these properties can be used as a basis for intuitive content-based access to movie libraries.

  3. Statistical analysis of a carcinogen mixture experiment. I. Liver carcinogens.

    PubMed

    Elashoff, R M; Fears, T R; Schneiderman, M A

    1987-09-01

    This paper describes factorial experiments designed to determine whether 2 liver carcinogens act synergistically to produce liver cancers in Fischer 344 rats. Four hepatocarcinogens, cycad flour, lasiocarpine (CAS: 303-34-4), aflatoxin B1 (CAS: 1162-65-8), and dipentylnitrosamine (CAS: 13256-06-9), were studied in pairwise combinations. Each of the 6 possible pairs was studied by means of a 4 × 4 factorial experiment, each agent being fed at zero and at 3 non-zero doses. Methods of analysis designed explicitly for this study were derived to study interaction. These methods were supplemented by standard statistical methods appropriate for one-at-a-time studies. Antagonism was not discovered in any chemical mixture. Some chemical mixtures did interact synergistically. Findings for male and female animals were generally, but not always, in agreement.

  4. Barcode localization with region based gradient statistical analysis

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyuan; Zhao, Yuming

    2015-03-01

    Barcodes, as a data representation method, have been adopted in a wide range of areas. Especially with the rise of smart phones and hand-held devices equipped with high-resolution cameras and substantial computational power, barcode techniques have found ever more extensive application. In industrial settings, barcode reading systems are required to be robust to blur, illumination change, pitch, rotation, and scale change. This paper presents a new approach to barcode localization based on region-level gradient statistical analysis. Using this idea as the basis, four algorithms have been developed for dealing with Linear, PDF417, Stacked 1D1D and Stacked 1D2D barcodes, respectively. After being evaluated on our challenging dataset of more than 17000 images, the results show that our methods can achieve an average localization accuracy of 82.17% with respect to 8 kinds of distortions, within an average time of 12 ms.

  5. Statistical analysis of arch shape with conic sections.

    PubMed

    Sampson, P D

    1983-06-01

    Arcs of conic sections are used to model the shapes of human dental arches and to provide a basis for the statistical and graphical analysis of a population of shapes. The Bingham distribution, an elliptical distribution on a hypersphere, is applied in order to model the coefficients of the conic arcs. It provides a definition of an 'average shape' and it quantifies variation in shape. Geometric envelopes of families of conic arcs whose coefficients satisfy a quadratic constraint are used to depict the distribution of shapes in the plane and to make graphical inferences about the average shape. The methods are demonstrated with conic arcs fitted to a sample of 66 maxillary dental arches.
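
    A minimal sketch of conic fitting by total least squares, under the normalization that the coefficient vector has unit norm; Sampson's fitting algorithm and the Bingham shape model are not reproduced here, and the arch points below are synthetic.

```python
import numpy as np

def fit_conic(x, y):
    """Coefficients (a, b, c, d, e, f) of a x^2 + b xy + c y^2 + d x + e y + f = 0
    minimizing algebraic error, taken as the smallest right singular vector."""
    D = np.column_stack([x*x, x*y, y*y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(D)
    return vt[-1]

theta = np.linspace(0.2, np.pi - 0.2, 20)       # synthetic arch landmarks
x, y = 30.0 * np.cos(theta), 25.0 * np.sin(theta)
print(fit_conic(x, y))
```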

  6. Dental arch shape: a statistical analysis using conic sections.

    PubMed

    Sampson, P D

    1981-05-01

    This report addresses two problems in the study of the shape of human dental arches; (1) the description of arch shape by mathematical functions and (2) the description of variation among the dental arch shapes in a population. A new algorithm for fitting conic sections is used to model the maxillary dental arches of a sample of sixty-six subjects. A statistical model for shapes represented by arcs of conic sections is demonstrated on the sample of sixty-six dental arches. It permits the definition of an "average shape" and the graphic representation of variation in shape. The model and methods of analysis presented should help dental scientists to better define and quantify "normal" or "ideal" shapes and "normal ranges of variation" for the shape of the dental arch.

  7. A statistical analysis of the internal organ weights of normal Japanese people

    SciTech Connect

    Ogiu, Nobuko; Nakamura, Yuji; Ogiu, Toshiaki

    1997-03-01

    Correlation of the weights of various organs with age, body weight, and/or body height was statistically analyzed using data on the Japanese physique collected by the Medico-Legal Society from universities and research institutes in almost all areas of Japan. After excluding individual data unsuitable for statistical analysis, findings for 4,667 Japanese, aged 0-95 y, including 3,023 males and 1,644 females, were used in the present study. Analyses of age-dependent changes in the weights of the brain, heart, lung, kidney, spleen, pancreas, thymus, thyroid gland and adrenal gland, and also of correlations between organ weights and body height, weight, or surface area, were carried out. It was concluded that organ weights in the growing generation (under 19 y) generally increased with a coefficient expressed as (body height × body weight^0.5). Because clear age-dependent changes were not observed in adults over 20 y, the adults were classified into 4 physical types, thin, standard, plump and obese, and the relations of organ weights with these physical types were assessed. Some organs were relatively heavier in the fat groups and lighter in thin individuals, or vice versa. 36 refs., 5 figs., 11 tabs.
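
    A hedged sketch of the reported growth-phase relation: organ weight regressed on the composite size variable height × sqrt(body weight). All numbers below are simulated for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
height = rng.uniform(60, 180, 200)                        # cm, growing ages
weight = (height / 100) ** 2 * rng.uniform(14, 22, 200)   # kg, plausible BMI
size = height * np.sqrt(weight)                           # composite variable
organ = 0.05 * size + rng.normal(0, 20, 200)              # hypothetical organ (g)

slope, intercept = np.polyfit(size, organ, 1)
r = np.corrcoef(size, organ)[0, 1]
print(f"organ ~ {slope:.3f} * (H * W^0.5) + {intercept:.1f}, r = {r:.2f}")
```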

  8. A statistical analysis of the daily streamflow hydrograph

    NASA Astrophysics Data System (ADS)

    Kavvas, M. L.; Delleur, J. W.

    1984-03-01

    In this study a periodic statistical analysis of daily streamflow data in Indiana, U.S.A., was performed to gain some new insight into the stochastic structure which describes the daily streamflow process. This analysis was performed by the periodic mean and covariance functions of the daily streamflows, by the time- and peak-discharge-dependent recession limb of the daily streamflow hydrograph, by the time- and discharge-exceedance-level (DEL)-dependent probability distribution of the hydrograph peak interarrival time, and by the time-dependent probability distribution of the time to peak discharge. Some new statistical estimators were developed and used in this study. In general features, this study has shown that: (a) the persistence properties of daily flows depend on the storage state of the basin at the specified time origin of the flow process; (b) the daily streamflow process is time irreversible; (c) the probability distribution of the daily hydrograph peak interarrival time depends both on the occurrence time of the peak from which the interarrival time originates and on the discharge exceedance level; and (d) if the daily streamflow process is modeled as the release from a linear watershed storage, this release should depend on the state of the storage and on the time of the release, as the persistence properties and the recession limb decay rates were observed to change with the state of the watershed storage and time. Therefore, a time-varying reservoir system needs to be considered if the daily streamflow process is to be modeled as the release from a linear watershed storage.

  9. Spectral signature verification using statistical analysis and text mining

    NASA Astrophysics Data System (ADS)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Owing to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum are validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to uncover local learning patterns and trends within the spectral data that are indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security (text encryption/decryption), biomedical, and marketing applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum in a database without the need for an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is available.
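
    A minimal sketch of the SAM ranking step: the spectral angle between a test spectrum and the mean spectrum of a reference population, where smaller angles indicate closer agreement. The spectra below are simulated.

```python
import numpy as np

def spectral_angle(s, r):
    """Angle (radians) between test spectrum s and reference spectrum r."""
    cos = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

rng = np.random.default_rng(3)
population = rng.uniform(0.2, 0.9, size=(50, 200))   # 50 reference spectra
mean_spectrum = population.mean(axis=0)
test = mean_spectrum + rng.normal(0.0, 0.02, 200)    # a near-ideal test spectrum
print(spectral_angle(test, mean_spectrum))
```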

  10. Statistical Signal Analysis for Systems with Interferenced Inputs

    NASA Technical Reports Server (NTRS)

    Bai, R. M.; Mielnicka-Pate, A. L.

    1985-01-01

    A new approach is introduced, based on statistical signal analysis, which overcomes the error due to input signal interference. The model analyzed is given. The input signals u1(t) and u2(t) are assumed to be unknown. The measurable signals x1(t) and x2(t) interfere according to the frequency response functions H12(f) and H21(f). The goal of the analysis was to evaluate the power output due to each input, u1(t) and u2(t), for the case where both are applied at the same time. In addition, all frequency response functions are calculated. The interfered system is described by a set of five equations with six unknown functions. An IBM XT Personal Computer, interfaced with the FFT, was used to solve the set of equations. The software was tested on an electrical two-input, one-output system. The results were excellent. The research presented includes the analysis of the acoustic radiation from a rectangular plate with two force inputs and the sound pressure as an output signal.

  11. Statistical analysis and modelling of small satellite reliability

    NASA Astrophysics Data System (ADS)

    Guo, Jian; Monas, Liora; Gill, Eberhard

    2014-05-01

    This paper attempts to characterize failure behaviour of small satellites through statistical analysis of actual in-orbit failures. A unique Small Satellite Anomalies Database comprising empirical failure data of 222 small satellites has been developed. A nonparametric analysis of the failure data has been implemented by means of a Kaplan-Meier estimation. An innovative modelling method, i.e. Bayesian theory in combination with Markov Chain Monte Carlo (MCMC) simulations, has been proposed to model the reliability of small satellites. An extensive parametric analysis using the Bayesian/MCMC method has been performed to fit a Weibull distribution to the data. The influence of several characteristics such as the design lifetime, mass, launch year, mission type and the type of satellite developers on the reliability has been analyzed. The results clearly show the infant mortality of small satellites. Compared with the classical maximum-likelihood estimation methods, the proposed Bayesian/MCMC method results in better fitting Weibull models and is especially suitable for reliability modelling where only very limited failures are observed.
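
    A minimal sketch of the nonparametric step, a Kaplan-Meier product-limit estimate built from (lifetime, failed?) pairs. The lifetimes below are illustrative; censored entries stand for satellites still operating at their last record.

```python
import numpy as np

def kaplan_meier(times, failed):
    """Product-limit survival estimate. `failed` is True for an observed
    failure, False for a censored observation."""
    order = np.argsort(times)
    t_sorted = np.asarray(times, float)[order]
    f_sorted = np.asarray(failed, bool)[order]
    at_risk, s, curve = len(t_sorted), 1.0, []
    for t, f in zip(t_sorted, f_sorted):
        if f:                              # failure: survival steps down
            s *= (at_risk - 1) / at_risk
            curve.append((t, s))
        at_risk -= 1                       # failure or censoring shrinks risk set
    return curve

times = [0.1, 0.3, 0.5, 1.0, 2.0, 3.0, 4.0, 5.0]     # years on orbit
failed = [True, True, False, True, False, True, False, False]
for t, s in kaplan_meier(times, failed):
    print(f"S({t:.1f}) = {s:.3f}")
```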

  12. Helioseismology of pre-emerging active regions. III. Statistical analysis

    SciTech Connect

    Barnes, G.; Leka, K. D.; Braun, D. C.; Birch, A. C.

    2014-05-01

    The subsurface properties of active regions (ARs) prior to their appearance at the solar surface may shed light on the process of AR formation. Helioseismic holography has been applied to samples taken from two populations of regions on the Sun (pre-emergence and without emergence), each sample having over 100 members, that were selected to minimize systematic bias, as described in Paper I. Paper II showed that there are statistically significant signatures in the average helioseismic properties that precede the formation of an AR. This paper describes a more detailed analysis of the samples of pre-emergence regions and regions without emergence based on discriminant analysis. The property that is best able to distinguish the populations is found to be the surface magnetic field, even a day before the emergence time. However, after accounting for the correlations between the surface field and the quantities derived from helioseismology, there is still evidence of a helioseismic precursor to AR emergence that is present for at least a day prior to emergence, although the analysis presented cannot definitively determine the subsurface properties prior to emergence due to the small sample sizes.

  13. Confirmatory Factor Analysis of the Statistical Anxiety Rating Scale With Online Graduate Students.

    PubMed

    DeVaney, Thomas A

    2016-04-01

    The Statistical Anxiety Rating Scale was examined using data from a convenience sample of 450 female and 65 male students enrolled in online, graduate-level introductory statistics courses. The mean age of the students was 33.1 (SD = 8.2), and 58.3% had completed six or fewer online courses. The majority of students were enrolled in education or counseling degree programs. Confirmatory factor analysis using unweighted least squares estimation was used to test three proposed models, and alpha coefficients were used to examine the internal consistency. The confirmatory factor analysis results supported the six-factor structure and indicated that proper models should include correlations among the six factors or two second-order factors (anxiety and attitude). Internal consistency estimates ranged from .82 to .95 and were consistent with values reported by previous researchers. The findings suggest that, when measuring statistics anxiety of online students using the Statistical Anxiety Rating Scale, researchers and instructors can use scores from the individual subscales or generate two composite scores, anxiety and attitude, instead of a total score. PMID:27154380

  14. Statistical analysis of suicide characteristics in Iaşi County.

    PubMed

    Herea, Speranta-Giulia; Scripcaru, C

    2012-01-01

    A prospective study was performed for the statistical analysis of suicide events occurring in Iaşi County during the 2004-2009 period. Data from the conventional investigation were collected on the sex, age, seasonality, marital status, occupational status, blood alcohol concentration, religious affiliation, and previous suicide attempts of the persons who committed the lethal self-aggression. The results showed a male:female (M:F) ratio of 4.13:1, a central tendency towards 46 years, a mean age of 45 years, and a modal age of 49 years. The interquartile range extended from 33 to 56 years. The rural:urban (R:U) ratio was 1.38:1, and a statistically significant seasonal variation was found in villages. Suicide events occurred more frequently around Easter and Christmas, and Orthodox Christian believers appeared to commit suicide more often than Catholics. Additionally, a correlated analysis, based essentially on data provided by the local Institute of Legal Medicine and Psychiatry Hospital, offered a comprehensive understanding of the mental state and psychiatric profile of the deceased.

  15. Classification of Malaysia aromatic rice using multivariate statistical analysis

    SciTech Connect

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.

    2015-05-15

    Aromatic rice (Oryza sativa L.) is considered the best quality premium rice. Its varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice because of the special growth conditions it requires, for instance a specific climate and soil. Presently, aromatic rice quality is identified using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, human sensory panels have significant drawbacks, such as lengthy training time, proneness to fatigue as the number of samples increases, and inconsistency. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis and is quite costly. This paper presents the application of an in-house developed Electronic Nose (e-nose) to classify new aromatic rice varieties. The e-nose is used to classify the aromatic rice varieties based on the odour of samples taken from each variety. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify the unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify the unspecified samples. Visual inspection of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters. The results of LDA and KNN, with low misclassification error, support these findings, and we conclude that the e-nose was successfully applied to the classification of the aromatic rice varieties.
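
    A hedged sketch of the PCA + KNN portion of the pipeline with leave-one-out validation, using simulated e-nose responses; the class names, feature dimensions, and component counts are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
# Three hypothetical varieties, 30 samples each, 12 sensor channels
X = np.vstack([rng.normal(mu, 0.3, size=(30, 12)) for mu in (0.0, 0.8, 1.6)])
y = np.repeat(["variety_A", "variety_B", "variety_C"], 30)

model = make_pipeline(PCA(n_components=3), KNeighborsClassifier(n_neighbors=3))
acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
print(f"LOO accuracy: {acc:.2%}")
```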

  16. Classification of Malaysia aromatic rice using multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.

    2015-05-01

    Aromatic rice (Oryza sativa L.) is considered the best quality premium rice. Its varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice because of the special growth conditions it requires, for instance a specific climate and soil. Presently, aromatic rice quality is identified using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, human sensory panels have significant drawbacks, such as lengthy training time, proneness to fatigue as the number of samples increases, and inconsistency. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis and is quite costly. This paper presents the application of an in-house developed Electronic Nose (e-nose) to classify new aromatic rice varieties. The e-nose is used to classify the aromatic rice varieties based on the odour of samples taken from each variety. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify the unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify the unspecified samples. Visual inspection of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters. The results of LDA and KNN, with low misclassification error, support these findings, and we conclude that the e-nose was successfully applied to the classification of the aromatic rice varieties.

  17. A longitudinal functional analysis framework for analysis of white matter tract statistics.

    PubMed

    Yuan, Ying; Gilmore, John H; Geng, Xiujuan; Styner, Martin A; Chen, Kehui; Wang, Jane-Ling; Zhu, Hongtu

    2013-01-01

    Many longitudinal imaging studies have been/are being widely conducted to use diffusion tensor imaging (DTI) to better understand white matter maturation in normal controls and diseased subjects. There is an urgent demand for the development of statistical methods for analyzing diffusion properties along major fiber tracts obtained from longitudinal DTI studies. Jointly analyzing fiber-tract diffusion properties and covariates from longitudinal studies raises several major challenges, including (i) infinite-dimensional functional response data, (ii) complex spatial-temporal correlation structure, and (iii) complex spatial smoothness. To address these challenges, this article develops a longitudinal functional analysis framework (LFAF) to delineate the dynamic changes of diffusion properties along major fiber tracts, their association with a set of covariates of interest (e.g., age and group status), and the structure of the variability of these white matter tract properties in various longitudinal studies. Our LFAF consists of a functional mixed effects model addressing all three challenges, an efficient method for spatially smoothing varying coefficient functions, a method for estimating the spatial-temporal correlation structure, a test procedure with a global test statistic for testing hypotheses of interest associated with the functional response, and a simultaneous confidence band for quantifying the uncertainty in the estimated coefficient functions. Simulated data are used to evaluate the finite sample performance of LFAF and to demonstrate that LFAF significantly outperforms a voxel-wise mixed model method. We apply LFAF to study the spatial-temporal dynamics of white-matter fiber tracts in a clinical study of neurodevelopment.

  18. The Importance of Understanding Statistics: An Analysis of Document Supply Statistics at Macquarie University Library

    ERIC Educational Resources Information Center

    Pearson, Kathryn

    2008-01-01

    Macquarie University Library was concerned at the length of time that elapsed between the placement of an interlibrary loan request and the satisfaction of that request. Taking advantage of improved statistical information available to them through membership of the CLIC Consortium, library staff investigated the reasons for delivery delay. This led to…

  19. Statistical Analysis of Tank 5 Floor Sample Results

    SciTech Connect

    Shine, E. P.

    2013-01-31

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs.
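
    A minimal sketch of a one-sided 95% upper confidence limit (UCL95) on a mean concentration from triplicate composite-sample results, using the t-statistic form appropriate for approximately normal data. The EPA guidance cited in the report covers additional cases (lognormal, nonparametric, and non-detects) not shown here; the concentrations below are illustrative.

```python
import numpy as np
from scipy import stats

def ucl95(x):
    """One-sided 95% upper confidence limit on the mean of sample x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return x.mean() + stats.t.ppf(0.95, n - 1) * x.std(ddof=1) / np.sqrt(n)

print(ucl95([12.1, 13.4, 12.8]))   # illustrative triplicate concentrations
```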

  20. Statistical Analysis Of Tank 5 Floor Sample Results

    SciTech Connect

    Shine, E. P.

    2012-08-01

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs.

  1. STATISTICAL ANALYSIS OF TANK 5 FLOOR SAMPLE RESULTS

    SciTech Connect

    Shine, E.

    2012-03-14

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, radionuclide, inorganic, and anion concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs.

  2. A statistical design for testing apomictic diversification through linkage analysis.

    PubMed

    Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling

    2014-03-01

    The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remains elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from two crosses. The EM algorithm is implemented to estimate the rate of apomixis and test its difference between two plant populations or species as the parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utilization and usefulness of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis.

  3. A statistical method for draft tube pressure pulsation analysis

    NASA Astrophysics Data System (ADS)

    Doerfler, P. K.; Ruchonnet, N.

    2012-11-01

    Draft tube pressure pulsation (DTPP) in Francis turbines is composed of various components originating from different physical phenomena. These components may be separated because they differ by their spatial relationships and by their propagation mechanism. The first step for such an analysis was to distinguish between so-called synchronous and asynchronous pulsations; only approximately periodic phenomena could be described in this manner. However, less regular pulsations are always present, and these become important when turbines have to operate in the far off-design range, in particular at very low load. The statistical method described here makes it possible to separate the stochastic (random) component from the two traditional 'regular' components. It works in connection with the standard technique of model testing, with several pressure signals measured in the draft tube cone. The difference between the individual signals and the averaged pressure signal, together with the coherence between the individual pressure signals, is used for the analysis. An example reveals that a generalized, non-periodic version of the asynchronous pulsation is important at low load.
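
    A minimal sketch of the first separation step, assuming the synchronous component is what is common to all cone sensors (the cross-sensor average). The further split of the remainder into rotating and stochastic parts via cross-coherence is not reproduced; the signals below are simulated.

```python
import numpy as np

def split_synchronous(signals):
    """signals: (n_sensors, n_samples) array of simultaneous cone pressures.
    Returns the common (synchronous) component and the per-sensor remainder
    (asynchronous plus stochastic parts)."""
    synchronous = signals.mean(axis=0)
    remainder = signals - synchronous
    return synchronous, remainder

rng = np.random.default_rng(6)
t = np.linspace(0.0, 1.0, 2048)
common = np.sin(2 * np.pi * 4 * t)                 # synchronous pulsation
sensors = np.array([common + rng.normal(0, 0.3, t.size) for _ in range(4)])
sync, rest = split_synchronous(sensors)
print(sync.std(), rest.std())
```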

  4. Data Analysis & Statistical Methods for Command File Errors

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

    This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve-fitting and distribution-fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics identified as critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.
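
    One plausible form of such a model, sketched under assumptions: a Poisson regression of error counts on workload and novelty scores, with the number of files radiated as exposure. The variables and coefficients below are illustrative stand-ins for the drivers named above, not the paper's fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(12)
df = pd.DataFrame({
    "files": rng.integers(20, 120, 48),     # files radiated per period
    "workload": rng.uniform(0, 1, 48),      # subjective workload score
    "novelty": rng.uniform(0, 1, 48),       # operational novelty score
})
# Simulate error counts with a known rate structure
lam = df.files * np.exp(-4 + 1.2 * df.workload + 0.8 * df.novelty)
df["errors"] = rng.poisson(lam)

model = smf.glm("errors ~ workload + novelty", data=df,
                family=sm.families.Poisson(),
                exposure=df["files"]).fit()
print(model.params)
```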

  5. Autotasked Performance in the NAS Workload: A Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)

    1998-01-01

    A statistical analysis of the workload performance of a production quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure that was designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. UNICOS 6 releases show consistent wall clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request less, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristics when applied to the data: higher overhead jobs requesting 8 CPUs are penalized when compared to moderate overhead jobs requesting 4 CPUs, thereby providing a charging incentive to NAS users to use autotasking in a manner that provides them with significantly improved turnaround while also maintaining system throughput.

  6. External quality assessment in water microbiology: statistical analysis of performance.

    PubMed

    Tillett, H E; Lightfoot, N F; Eaton, S

    1993-04-01

    A UK-based scheme of water microbiology assessment requires participants to record counts of relevant organisms. Not every sample will contain the target number of organisms because of natural variation and therefore a range of results is acceptable. Results which are tail-end (i.e. at the extreme low or high end of this range) could occasionally be reported by any individual laboratory by chance. Several tail-end results might imply a laboratory problem. Statistical assessment is done in two stages. A non-parametric test of the distribution of tail-end counts amongst laboratories is performed (Cochran's Q) and, if they are not random, then observed and expected frequencies of tail-end counts are compared to identify participants who may have reported excessive numbers of low or high results. Analyses so far have shown that laboratories find high counts no more frequently than would be expected by chance, but that significant clusters of low counts can be detected among participants. These findings have been observed both in short-term and in long-term assessments, thus allowing detection of new episodes of poor performance and intermittent problems. The analysis relies on an objective definition of tail-end results. Working definitions are presented which should identify poor performance in terms of microbiological significance, and which allow fair comparison between membrane-filtration and multiple-tube techniques. Smaller differences between laboratories, which may be statistically significant, will not be detected. Different definitions of poor performance could be incorporated into future assessments.
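
    A minimal sketch of the first assessment stage, Cochran's Q applied to a samples-by-laboratories table of tail-end indicators. The data below are simulated, with one laboratory given an elevated tail-end rate to illustrate a non-random pattern.

```python
import numpy as np
from statsmodels.stats.contingency_tables import cochrans_q

rng = np.random.default_rng(7)
# 12 distributed samples (rows) x 8 laboratories (columns);
# entry 1 means that lab's count was tail-end for that sample.
tail_end = rng.binomial(1, 0.1, size=(12, 8))
tail_end[:, 2] = rng.binomial(1, 0.5, size=12)   # one lab with a problem
res = cochrans_q(tail_end)
print(res.statistic, res.pvalue)    # small p suggests non-random clustering
```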

  7. Constraints on Statistical Computations at 10 Months of Age: The Use of Phonological Features

    ERIC Educational Resources Information Center

    Gonzalez-Gomez, Nayeli; Nazzi, Thierry

    2015-01-01

    Recently, several studies have argued that infants capitalize on the statistical properties of natural languages to acquire the linguistic structure of their native language, but the kinds of constraints which apply to statistical computations remain largely unknown. Here we explored French-learning infants' perceptual preference for…

  8. Statistical Analysis of Data with Non-Detectable Values

    SciTech Connect

    Frome, E.L.

    2004-08-26

    Environmental exposure measurements are, in general, positive and may be subject to left censoring, i.e. the measured value is less than a "limit of detection". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. A basic problem of interest in environmental risk assessment is to determine if the mean concentration of an analyte is less than a prescribed action level. Parametric methods, used to determine acceptable levels of exposure, are often based on a two-parameter lognormal distribution. The mean exposure level and/or an upper percentile (e.g. the 95th percentile) are used to characterize exposure levels, and upper confidence limits are needed to describe the uncertainty in these estimates. In certain situations it is of interest to estimate the probability of observing a future (or "missed") value of a lognormal variable. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left-censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left-censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on the 95th percentile (i.e. the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known, but computational complexity has limited their use in routine data analysis with left-censored data. The recent development of the R environment for statistical data analysis and graphics has greatly reduced this barrier.
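
    A minimal sketch of the censored-likelihood idea, assuming a single detection limit: detected values contribute the normal log-density on the log scale, while non-detects contribute the log-CDF at the limit. The sample and detection limit below are simulated.

```python
import numpy as np
from scipy import stats, optimize

def censored_lognormal_mle(values, detected, lod):
    """MLE of (mu, sigma) on the log scale for left-censored lognormal data."""
    z = np.log(np.where(detected, values, lod))
    def negloglik(params):
        mu, sigma = params[0], abs(params[1])      # keep sigma positive
        ll = stats.norm.logpdf(z[detected], mu, sigma).sum()
        ll += stats.norm.logcdf(z[~detected], mu, sigma).sum()
        return -ll
    res = optimize.minimize(negloglik, x0=[z.mean(), z.std(ddof=1) + 0.1],
                            method="Nelder-Mead")
    return res.x[0], abs(res.x[1])

rng = np.random.default_rng(8)
sample = rng.lognormal(mean=0.0, sigma=1.0, size=60)
lod = 0.5
detected = sample >= lod
print(censored_lognormal_mle(sample, detected, lod))
```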

  10. On Conceptual Analysis as the Primary Qualitative Approach to Statistics Education Research in Psychology

    ERIC Educational Resources Information Center

    Petocz, Agnes; Newbery, Glenn

    2010-01-01

    Statistics education in psychology often falls disappointingly short of its goals. The increasing use of qualitative approaches in statistics education research has extended and enriched our understanding of statistical cognition processes, and thus facilitated improvements in statistical education and practices. Yet conceptual analysis, a…

  11. Hydrogeochemical characteristics of groundwater in Latvia using multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Retike, Inga; Kalvans, Andis; Bikse, Janis; Popovs, Konrads; Babre, Alise

    2015-04-01

    The aim of this study is to determine the geochemical processes controlling trace element levels and variations in the fresh groundwater of Latvia. A database of 1398 groundwater samples was compiled, containing records of major ion chemistry, trace elements and geological conditions. The accuracy of the groundwater analyses was checked, and erroneous analyses were excluded prior to statistical analysis. Groundwater hydrogeochemical groups were delineated on the basis of major ion concentrations using Hierarchical Cluster Analysis (HCA) and Principal Component Analysis (PCA). The results of PCA showed three main geochemical components explaining 84% of the total variance in the data set. Component 1 explains the greatest amount of variance, 51%, with main positive loadings of Cl, Na, K and Mg. Component 2 explains 21% of the variance, with highest loadings of HCO3, Ca and Mg. Component 3 shows the highest loadings of SO4 and Ca and explains 12% of the total variance. HCA was chosen for its ability to group a large amount of data (groundwater samples) into several clusters based on similar characteristics. As a result, three large groups comprising nine distinctive clusters were identified. It was possible to characterise each cluster by its depth of sampling, aquifer material and geochemical processes: carbonate dissolution (weathering), groundwater mixing, gypsum dissolution, ion exchange, and seawater and upward saline water intrusion. Cluster 1 is the least altered infiltration water with a very low load of dissolved salts. It is concluded that the groundwater in Cluster 5 has evolved from Cluster 1 by carbonate weathering under open system conditions. Cluster 4 is similar to Cluster 5, yet has been affected by reduction of sulphate and iron species. Cluster 3 is characterised by the highest loading of chloride salts, while Cluster 9 represents groundwater with the highest sulphate concentrations, resulting from gypsum dissolution. However, Cluster 8 is an intermediate group.

  12. Transcriptome analysis of aging mouse meibomian glands

    PubMed Central

    Parfitt, Geraint J.; Brown, Donald J.

    2016-01-01

    Purpose Dry eye disease is a common condition associated with age-related meibomian gland dysfunction (ARMGD). We have previously shown that ARMGD occurs in old mice, similar to that observed in human patients with MGD. To begin to understand the mechanism underlying ARMGD, we generated transcriptome profiles of eyelids excised from young and old mice of both sexes. Methods Male and female C57BL/6 mice were euthanized at ages of 3 months or 2 years and their lower eyelids removed, the conjunctival epithelium scraped off, and the tarsal plate, containing the meibomian glands, dissected from the overlying muscle and lid epidermis. RNA was isolated, enriched, and transcribed into cDNA and processed to generate four non-stranded libraries with distinct bar codes on each adaptor. The libraries were then sequenced and mapped to the mm10 reference genome, and expression results were gathered as reads per kilobase of transcript per million mapped reads (RPKM). Differential gene expression analyses were performed using CyberT. Results Approximately 55 million reads were generated from each library. Expression data indicated that about 15,000 genes were expressed in these tissues. Of the genes that showed more than twofold significant differences in either young or old tissue, 698 were identified as differentially expressed. According to the Gene Ontology (GO) analysis, the cellular, developmental, and metabolic processes were found to be highly represented, with Wnt function noted to be altered in the aging mouse. Conclusions The RNA sequencing data identified several signaling pathways, including fibroblast growth factor (FGF) and Wnt, that were altered in the meibomian glands of aging mice. PMID:27279727
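
    For reference, a minimal sketch of the RPKM normalization named above; the counts, transcript lengths, and library size are illustrative.

```python
def rpkm(counts, lengths_bp, total_mapped):
    """Reads per kilobase of transcript per million mapped reads."""
    return [c / (l / 1_000) / (total_mapped / 1_000_000)
            for c, l in zip(counts, lengths_bp)]

# Two hypothetical transcripts in a library of 55 million mapped reads
print(rpkm(counts=[500, 1200], lengths_bp=[2000, 3500],
           total_mapped=55_000_000))
```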

  13. Common pitfalls in statistical analysis: Intention-to-treat versus per-protocol analysis

    PubMed Central

    Ranganathan, Priya; Pramesh, C. S.; Aggarwal, Rakesh

    2016-01-01

    During the conduct of clinical trials, it is not uncommon to have protocol violations or inability to assess outcomes. This article in our series on common pitfalls in statistical analysis explains the complexities of analyzing results from such trials and highlights the importance of “intention-to-treat” analysis. PMID:27453832

  14. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arc second resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be tracked. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.

  15. Statistical Analysis of the AIAA Drag Prediction Workshop CFD Solutions

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop (DPW), held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third AIAA Drag Prediction Workshop, held in June 2006, focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This report compares the transonic cruise prediction results of the second and third workshops using statistical analysis.

  16. Statistical analysis of mission profile parameters of civil transport airplanes

    NASA Technical Reports Server (NTRS)

    Buxbaum, O.

    1972-01-01

    The statistical analysis of flight times as well as airplane gross weights and fuel weights of jet-powered civil transport airplanes has shown that the distributions of their frequency of occurrence per flight can be presented approximately in general form. However, before these results may be used during the project stage of an airplane for defining a typical mission profile (the parameters of which are assumed to occur, for example, with a probability of 50 percent), the following points have to be taken into account. Because the individual airplanes were rotated during service, the scatter between the distributions of mission profile parameters for airplanes of the same type, which were flown with similar payload, has proven to be very small. Significant deviations from the generalized distributions may occur if an operator uses one airplane predominantly on one or two specific routes. Another reason for larger deviations could be that the maintenance services of the operators of the observed airplanes are not representative of other airlines. Although there are indications that this is unlikely, similar information should be obtained from other operators. Such information would improve the reliability of the data.

  17. Statistical Analysis of Resistivity Anomalies Caused by Underground Caves

    NASA Astrophysics Data System (ADS)

    Frid, V.; Averbach, A.; Frid, M.; Dudkinski, D.; Liskevich, G.

    2015-05-01

    Geophysical prospecting for underground caves on a construction site is often still a challenging procedure. Estimating the likelihood of a located anomaly is frequently a mandatory requirement of a project principal for risk and safety assessment. However, a methodology for such estimation has not hitherto been developed. Aiming to put forward such a methodology, the present study (performed as part of an underground cave mapping prior to land development on the site area) applied electrical resistivity tomography (ERT) together with statistical analysis for the likelihood assessment of the underground anomalies located. The methodology was first verified via a synthetic modeling technique, applied to the ERT data collected in situ, and then cross-referenced with intrusive investigations (excavation and drilling) for verification. The drilling/excavation results showed that underground caves can be properly discovered if the anomaly probability level is not lower than 90%. Such a probability value was shown to be consistent with the modeling results. More than 30 underground cavities were discovered on the site using this methodology.

  18. Plutonium metal exchange program : current status and statistical analysis

    SciTech Connect

    Tandon, L.; Eglin, J. L.; Michalak, S. E.; Picard, R. R.; Temer, D. J.

    2004-01-01

    The Rocky Flats Plutonium (Pu) Metal Sample Exchange program was conducted to ensure the quality and intercomparability of measurements such as Pu assay, Pu isotopics, and impurity analyses. The Rocky Flats program was discontinued in 1989 after more than 30 years. In 2001, Los Alamos National Laboratory (LANL) reestablished the Pu Metal Exchange program. In addition to the Atomic Weapons Establishment (AWE) at Aldermaston, six Department of Energy (DOE) facilities (Argonne East, Argonne West, Livermore, Los Alamos, New Brunswick Laboratory, and Savannah River) are currently participating in the program. Plutonium metal samples are prepared and distributed to the sites for destructive measurements to determine elemental concentration, isotopic abundance, and both metallic and nonmetallic impurity levels. The program provides independent verification of analytical measurement capabilities for each participating facility and allows problems in analytical methods to be identified. The current status of the program will be discussed with emphasis on the unique statistical analysis and modeling of the data developed for the program. The discussion includes the definition of the consensus values for each analyte (in the presence and absence of anomalous values and/or censored values) and interesting features of the data and the results.

  19. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis

    SciTech Connect

    Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad

    2015-10-02

    Uncertainties associated with solar forecasts present challenges to maintain grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
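
    A minimal sketch of the NRMSE metric used to score each model's power output. Normalizing by plant capacity is one common convention and is an assumption here; the abstract does not state the paper's exact normalization.

```python
import numpy as np

def nrmse(forecast, actual, capacity_kw=51.0):
    """Root mean squared error normalized by plant capacity (assumed)."""
    forecast, actual = np.asarray(forecast), np.asarray(actual)
    rmse = np.sqrt(np.mean((forecast - actual) ** 2))
    return rmse / capacity_kw

print(nrmse([30, 42, 18], [28, 45, 15]))   # illustrative kW values
```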

  20. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint

    SciTech Connect

    Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad

    2015-12-08

    Uncertainties associated with solar forecasts present challenges to maintain grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.

  1. Slow and fast solar wind - data selection and statistical analysis

    NASA Astrophysics Data System (ADS)

    Wawrzaszek, Anna; Macek, Wiesław M.; Bruno, Roberto; Echim, Marius

    2014-05-01

    In this work we consider the important problem of selecting slow and fast solar wind data measured in situ by the Ulysses spacecraft during two solar minima (1995-1997, 2007-2008) and one solar maximum (1999-2001). To recognize the different types of solar wind we use the following set of parameters: radial velocity, proton density, proton temperature, the distribution of charge states of oxygen ions, and the compressibility of the magnetic field. We show how this data selection scheme works on Ulysses data. In the next step we consider the chosen intervals of fast and slow solar wind and perform a statistical analysis of the fluctuating magnetic field components. In particular, we check the possibility of identifying the inertial range by considering the scale dependence of the third- and fourth-order scaling exponents of the structure functions. We examine how the extent of the inertial range depends on heliographic latitude, heliocentric distance and the phase of the solar cycle. Research supported by the European Community's Seventh Framework Programme (FP7/2007 - 2013) under grant agreement no 313038/STORM.
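
    A minimal sketch of the scaling-exponent estimation: qth-order structure functions S_q(tau) = <|B(t + tau) - B(t)|^q> with a log-log power-law fit over the lag range. The Brownian test signal and the fit range are assumptions for illustration; a Brownian signal should recover zeta(q) close to q/2.

```python
import numpy as np

def structure_function(b, lags, q):
    """S_q(tau) = mean of |b(t+tau) - b(t)|^q over the series."""
    return np.array([np.mean(np.abs(b[lag:] - b[:-lag]) ** q) for lag in lags])

rng = np.random.default_rng(9)
b = np.cumsum(rng.normal(size=100_000))      # Brownian proxy for a B-component
lags = np.unique(np.logspace(0, 3, 20).astype(int))
for q in (3, 4):
    s = structure_function(b, lags, q)
    zeta, _ = np.polyfit(np.log(lags), np.log(s), 1)
    print(f"zeta({q}) = {zeta:.2f}")         # expect roughly q/2 here
```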

  2. A Statistical Aggregation Engine for Climatology and Trend Analysis

    NASA Astrophysics Data System (ADS)

    Chapman, D. R.; Simon, T. A.; Halem, M.

    2014-12-01

    Fundamental climate data records (FCDRs) from satellite instruments often span tens to hundreds of terabytes or even petabytes in scale. These large volumes make it difficult to aggregate or summarize their climatology and climate trends, and it is especially cumbersome to supply the full derivation (provenance) of these aggregate calculations. We present a lightweight and resilient software platform, Gridderama, that simplifies the calculation of climatology by exploiting the "Data-Cube" topology often present in earth observing satellite records. By using the large array storage (LAS) paradigm, Gridderama allows the analyst to more easily produce a series of aggregate climate data products at progressively coarser spatial and temporal resolutions. Furthermore, provenance tracking and extensive visualization capabilities allow the analyst to track down and correct for data problems such as missing data and outliers that may impact the scientific results. We have developed and applied Gridderama to a trend analysis of 55 terabytes of AIRS Level 1b infrared radiances, and show statistically significant trends in the greenhouse gas absorption bands as observed by AIRS over the 2003-2012 decade. We will extend this calculation to show regional changes in CO2 concentration from AIRS over the same decade by using a neural network retrieval algorithm.

  3. Statistical analysis of the seasonal variation in the twinning rate.

    PubMed

    Fellman, J; Eriksson, A W

    1999-03-01

    There have been few secular analyses of the seasonal variation in human twinning, and their results are conflicting. One reason is that the seasonal pattern of twinning differs between populations and periods; another is that the statistical methods used differ. The changing pattern of seasonal variation in twinning rates and total maternities in Denmark was traced for three periods (1855-69, 1870-94, and 1937-84). Two alternative methods of analysis are considered: the method of Walter and Elwood and a trigonometric regression model, which give closely similar results. The seasonal distribution of twin maternities in the 19th-century periods showed highly significant departures from uniformity. For both twin and general maternities, the main peaks occur from March to June, with a local peak in September. During the spring-summer season the twinning rates were higher than the total birth rates, indicating a stronger seasonal variation for twin maternities than for general maternities. For 1937-84, there was a similar, but less accentuated, pattern. Studies of other populations are compared with the Danish results. The more accentuated seasonal variation of twinning in the past indicates that some factors affected women during summer-autumn and around Christmas time, making them more fecund and, in particular, more prone to polyovulation and/or more able to complete a gestation with multiple embryos.
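
    A trigonometric (harmonic) regression of the kind referenced above can be fit by ordinary least squares. The monthly counts below are hypothetical, and a single annual harmonic is assumed; the original analysis may include more harmonics.

    ```python
    import numpy as np

    # Hypothetical monthly twin-maternity counts; fit the first-harmonic model
    # r_m = a + b*cos(2*pi*m/12) + c*sin(2*pi*m/12)
    counts = np.array([310, 295, 340, 355, 360, 345, 320, 300, 330, 305, 290, 315])
    m = np.arange(1, 13)
    X = np.column_stack([np.ones(12), np.cos(2*np.pi*m/12), np.sin(2*np.pi*m/12)])
    a, b, c = np.linalg.lstsq(X, counts, rcond=None)[0]

    amplitude = np.hypot(b, c)                                # seasonal amplitude
    peak_month = (np.degrees(np.arctan2(c, b)) % 360) / 30.0  # phase converted to months
    print(f"mean={a:.1f}, amplitude={amplitude:.1f}, peak near month {peak_month:.1f}")
    ```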

  4. Age estimation of bloodstains using smartphones and digital image analysis.

    PubMed

    Thanakiatkrai, Phuvadol; Yaodam, Alisa; Kitpipit, Thitika

    2013-12-10

    Recent studies on bloodstains have focused on determining the time since deposition, which can provide useful temporal information to forensic investigations. This study is the first to use smartphone cameras in combination with a truly low-cost illumination system as a tool to estimate the age of bloodstains. Bloodstains were deposited on various substrates and photographed with a smartphone camera. Three smartphones (Samsung Galaxy S Plus, Apple iPhone 4, and Apple iPad 2) were compared. The effects of environmental factors - temperature, humidity, light exposure, and anticoagulant - on bloodstain age estimation were explored. Color values were extracted from the digital images and correlated with time since deposition. Magenta had the highest correlation (R² = 0.966) and was used in subsequent experiments. The Samsung Galaxy S Plus was the most suitable smartphone, as its magenta value decreased exponentially with increasing time and had the highest repeatability (low variation within and between pictures). The quantifiable color change observed is consistent with the well-established hemoglobin denaturation process. Using a statistical classification technique called Random Forests™, we could predict bloodstain age accurately up to 42 days with an error rate of 12%. Additionally, the ages of forty blind stains were all correctly predicted, and 83% of mock casework samples were correctly classified. No within- or between-person variations were observed (p>0.05), while smartphone camera, temperature, humidity, and substrate color influenced the age determination process in different ways. Our technique provides a cheap, rapid, easy-to-use, and truly portable alternative to more complicated analyses using specialized equipment, e.g. spectroscopy and HPLC. No training is necessary with our method, and we envision a smartphone application that could take user inputs of environmental factors and provide an accurate estimate of bloodstain age. PMID:24314532
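
    A Random Forests classifier of the kind named in the abstract can be sketched as follows; the feature (a single magenta value per stain), the age bins, and the synthetic exponential-decay model are all assumptions for illustration, not the paper's data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)

    # Hypothetical training data: magenta values extracted from photographs,
    # decaying with time since deposition; classes are deposition-age bins (days)
    ages = np.repeat([0, 1, 3, 7, 14, 28, 42], 30)
    magenta = 0.60 * np.exp(-ages / 20.0) + rng.normal(0, 0.02, ages.size)
    X = magenta.reshape(-1, 1)            # one color feature per stain

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, ages, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    ```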

  5. The Statistics Concept Inventory: Development and analysis of a cognitive assessment instrument in statistics

    NASA Astrophysics Data System (ADS)

    Allen, Kirk

    The Statistics Concept Inventory (SCI) is a multiple choice test designed to assess students' conceptual understanding of topics typically encountered in an introductory statistics course. This dissertation documents the development of the SCI from Fall 2002 up to Spring 2006. The first phase of the project essentially sought to answer the question: "Can you write a test to assess topics typically encountered in introductory statistics?" Book One presents the results utilized in answering this question in the affirmative. The bulk of the results present the development and evolution of the items, primarily relying on objective metrics to gauge effectiveness but also incorporating student feedback. The second phase boils down to: "Now that you have the test, what else can you do with it?" This includes an exploration of Cronbach's alpha, the most commonly-used measure of test reliability in the literature. An online version of the SCI was designed, and its equivalency to the paper version is assessed. Adding an extra wrinkle to the online SCI, subjects rated their answer confidence. These results show a general positive trend between confidence and correct responses. However, some items buck this trend, revealing potential sources of misunderstandings, with comparisons offered to the extant statistics and probability educational research. The third phase is a re-assessment of the SCI: "Are you sure?" A factor analytic study favored a uni-dimensional structure for the SCI, although maintaining the likelihood of a deeper structure if more items can be written to tap similar topics. A shortened version of the instrument is proposed, demonstrated to be able to maintain a reliability nearly identical to that of the full instrument. Incorporating student feedback and a faculty topics survey, improvements to the items and recommendations for further research are proposed. The state of the concept inventory movement is assessed, to offer a comparison to the work presented
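
    Cronbach's alpha, the reliability measure explored in the second phase, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with hypothetical item scores:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_subjects, k_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical 0/1-scored responses of 6 students to 4 items
    scores = np.array([[1, 1, 0, 1],
                       [0, 0, 0, 1],
                       [1, 1, 1, 1],
                       [0, 1, 0, 0],
                       [1, 0, 1, 1],
                       [1, 1, 1, 0]])
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```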

  6. Bayesian Analysis of Order-Statistics Models for Ranking Data.

    ERIC Educational Resources Information Center

    Yu, Philip L. H.

    2000-01-01

    Studied the order-statistics models, extending the usual normal order-statistics model into one in which the underlying random variables followed a multivariate normal distribution. Used a Bayesian approach and the Gibbs sampling technique. Applied the proposed method to analyze presidential election data from the American Psychological…

  7. The Higher Education System in Israel: Statistical Abstract and Analysis.

    ERIC Educational Resources Information Center

    Herskovic, Shlomo

    This edition of a statistical abstract published every few years on the higher education system in Israel presents the most recent data available through 1990-91. The data were gathered through the cooperation of the Central Bureau of Statistics and institutions of higher education. Chapter 1 presents a summary of principal findings covering the…

  8. A statistical analysis of icing prediction in complex terrains

    NASA Astrophysics Data System (ADS)

    Terborg, Amanda M.

    The issue of icing has been around for decades in the aviation industry, and while notable improvements have been made in the study of the formation and process of icing, the prediction of icing events is a challenge that has yet to be completely overcome. Low-level icing prediction, particularly in complex terrain, has been bumped to the back burner in an attempt to perfect the models created for in-flight icing. Over the years, however, a number of non-model methods have been used to better refine the variables involved in low-level icing prediction. One of those methods is statistical analysis and modeling, particularly the Classification and Regression Tree (CART) techniques. These techniques examine the statistical significance of each predictor within a data set to determine various decision rules. The rules with the smallest overall misclassification error are then used to construct a decision tree that can provide forecasts of icing events. Using adiabatically adjusted Rapid Update Cycle (RUC) interpolated sounding data, these CART techniques are applied in this study to icing events in the White Mountains of New Hampshire, specifically on the summit of Mount Washington. The Mount Washington Observatory (MWO), which sits on the summit and is manned year-round by weather observers, is no stranger to icing: the summit sees icing events from October until April, and occasionally even into May. In this study, these events are examined in detail for the October 2010 to April 2011 season, and five CART models are generated for icing in general, rime icing, and glaze icing in an attempt to create one or more decision trees with high predictive accuracy. Also examined for the same icing season is the Air Weather Service Pamphlet (AWSP) algorithm, a decision tree model currently in use by the Air Force to predict icing events. Producing
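
    A CART-style model of the kind described can be grown with any decision tree library. The sketch below uses scikit-learn with synthetic sounding-derived predictors and a toy labeling rule; the predictor set and thresholds are illustrative assumptions, not those of the study.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(2)
    n = 500

    # Hypothetical sounding-derived predictors at summit level
    temp_c  = rng.uniform(-25, 10, n)    # temperature (deg C)
    rh_pct  = rng.uniform(30, 100, n)    # relative humidity (%)
    wind_ms = rng.uniform(0, 40, n)      # wind speed (m/s)
    # Toy labeling rule: icing when subfreezing and near-saturated
    icing = ((temp_c < 0) & (rh_pct > 85)).astype(int)

    X = np.column_stack([temp_c, rh_pct, wind_ms])
    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
    tree.fit(X, icing)
    # Print the recovered decision rules, CART-style
    print(export_text(tree, feature_names=["temp_c", "rh_pct", "wind_ms"]))
    ```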

  9. Combined statistical analysis of landslide release and propagation

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Rohmaneo, Mohammad; Chu, Hone-Jay

    2016-04-01

    Statistical methods - often coupled with stochastic concepts - are commonly employed to relate areas affected by landslides to environmental layers and to estimate spatial landslide probabilities from these relationships. However, such methods concern only the release of landslides, disregarding their motion. Conceptual models for mass flow routing are used to estimate landslide travel distances and possible impact areas, but automated approaches combining release and impact probabilities are rare. The present work attempts to fill this gap with a fully automated procedure combining statistical and stochastic elements, building on the open-source GRASS GIS software: (1) The landslide inventory is subset into release and deposition zones. (2) We employ a traditional statistical approach to estimate the spatial release probability of landslides. (3) We back-calculate the probability distribution of the angle of reach of the observed landslides, employing the software tool r.randomwalk. One set of random walks is routed downslope from each pixel defined as release area; each random walk stops when leaving the observed impact area of the landslide. (4) The cumulative distribution function (cdf) derived in (3) is used as input to route a set of random walks downslope from each pixel in the study area through the DEM, assigning the probability gained from the cdf to each pixel along the path (impact probability). The impact probability of a pixel is defined as the average impact probability of all sets of random walks impacting that pixel. Further, the average release probabilities of the release pixels of all sets of random walks impacting a given pixel are stored along with the area of the possible release zone. (5) We compute the zonal release probability by increasing the release probability according to the size of the release zone - the larger the zone, the larger the probability that a landslide will originate from at least one pixel within it. We

  10. Investigating Moderator Hypotheses in Aging Research: Statistical, Methodological, and Conceptual Difficulties with Comparing Separate Regressions

    ERIC Educational Resources Information Center

    Newsom, Jason T.; Prigerson, Holly G.; Schulz, Richard; Reynolds, Charles F., III

    2003-01-01

    Many topics in aging research address questions about group differences in prediction. Such questions can be viewed in terms of interaction or moderator effects, and the use of appropriate methods to test these hypotheses is necessary to arrive at accurate conclusions about age differences. This article discusses the conceptual, methodological, and…

  11. Parallelization of the Physical-Space Statistical Analysis System (PSAS)

    NASA Technical Reports Server (NTRS)

    Larson, J. W.; Guo, J.; Lyster, P. M.

    1999-01-01

    Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128MB) run-time database of parameters used in the calculation of these operators. Implementing a message passing parallel computing paradigm into an existing yet developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. The problem of computational

  12. Carpal bone analysis in bone age assessment

    NASA Astrophysics Data System (ADS)

    Zhang, Aifeng; Gertych, Arkadiusz; Kurkowska-Pospiech, Sylwia; Liu, Brent J.; Huang, H. K.

    2006-03-01

    A computer-aided-diagnosis (CAD) method has been previously developed in our laboratory based on features extracted from regions of interest (ROI) in the phalanges of a digital hand atlas. Due to various factors, including the diversity of size, shape, and orientation of the carpal bones, the non-uniformity of soft tissue, and the low contrast between bony structure and soft tissue, automatic identification and segmentation of bone boundaries is an extremely challenging task. Past work on carpal bone segmentation has used dynamic thresholding. However, owing to discrepancies in carpal bone development and the limitations of segmentation algorithms, the carpal bone ROI has not previously been taken into consideration in the bone age assessment procedure. In this paper, we present a method for fully automatic carpal bone segmentation and feature analysis in hand X-ray radiographs, segmenting the carpal bones with anisotropic diffusion and Canny edge detection techniques. By adding features extracted from the carpal bone ROI to the phalangeal ROI feature space, the accuracy of bone age assessment can be improved, especially when image processing in the phalangeal ROI fails in younger children.

  13. First-passage statistics for aging diffusion in systems with annealed and quenched disorder

    NASA Astrophysics Data System (ADS)

    Krüsemann, Henning; Godec, Aljaž; Metzler, Ralf

    2014-04-01

    Aging, the dependence of the dynamics of a physical process on the time t_a since its original preparation, is observed in systems ranging from the motion of charge carriers in amorphous semiconductors, through the blinking dynamics of quantum dots, to tracer dispersion in living biological cells. Here we study the effects of aging on one of the most fundamental properties of a stochastic process, the first-passage dynamics. We find that for an aging continuous time random walk process, the scaling exponent of the density of first-passage times changes twice as the aging progresses, revealing an intermediate scaling regime. The first-passage dynamics depends on t_a differently for intermediate and strong aging. Similar crossovers are obtained for the first-passage dynamics of a confined and driven particle. Comparison to the motion of an aged particle in the quenched trap model with a bias shows excellent agreement with our analytical findings. Our results demonstrate how first-passage measurements can be used to unravel the age t_a of a physical system.

  14. Thermal hydraulic limits analysis using statistical propagation of parametric uncertainties

    SciTech Connect

    Chiang, K. Y.; Hu, L. W.; Forget, B.

    2012-07-01

    The MIT Research Reactor (MITR) is evaluating conversion from highly enriched uranium (HEU) to low-enrichment uranium (LEU) fuel. In addition to the fuel element re-design, a reactor power upgrade from 6 MW to 7 MW is proposed in order to maintain the same reactor performance as the HEU core. The previous approach to analyzing the impact of engineering uncertainties on thermal hydraulic limits, via engineering hot channel factors (EHCFs), was unable to explicitly quantify the uncertainty and confidence level in reactor parameters. The objective of this study is to develop a methodology for MITR thermal hydraulic limits analysis that statistically combines engineering uncertainties, with the aim of eliminating unnecessary conservatism inherent in traditional analyses. This method was employed to analyze the Limiting Safety System Settings (LSSS) for the MITR, defined by avoidance of the onset of nucleate boiling (ONB). Key parameters, such as coolant channel tolerances and heat transfer coefficients, were treated as normal distributions using Oracle Crystal Ball to calculate ONB. The LSSS power is determined with a 99.7% confidence level. The LSSS power calculated using this new methodology is 9.1 MW, based on a core outlet coolant temperature of 60 °C and a primary coolant flow rate of 1800 gpm, compared to 8.3 MW obtained from the analytical method using the EHCFs under the same operating conditions. The same methodology was also used to calculate the safety limit (SL) for the MITR, conservatively determined using the onset of flow instability (OFI) as the criterion, to verify that an adequate safety margin exists between LSSS and SL. The calculated SL is 10.6 MW, which is 1.5 MW higher than the LSSS. (authors)
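
    The statistical propagation described - sampling uncertain inputs as normal distributions and reading a limit off the output distribution at the desired confidence level - can be sketched without Crystal Ball. The response function and distribution parameters below are illustrative assumptions only, not MITR values.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    # Hypothetical normal distributions for uncertain inputs
    channel_gap = rng.normal(2.0, 0.05, n)   # coolant channel gap (mm)
    htc         = rng.normal(30.0, 2.0, n)   # heat transfer coefficient (kW/m^2-K)

    # Toy response: allowable power scales with gap and heat transfer coefficient
    power_limit = 9.0 * (channel_gap / 2.0) * (htc / 30.0)   # MW

    # Power below which ONB is avoided with 99.7% confidence = 0.3% quantile
    print(f"LSSS-style power limit: {np.percentile(power_limit, 0.3):.2f} MW")
    ```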

  15. 3D statistical failure analysis of monolithic dental ceramic crowns.

    PubMed

    Nasrin, Sadia; Katsube, Noriko; Seghi, Robert R; Rokhlin, Stanislav I

    2016-07-01

    For adhesively retained ceramic crowns of various types, it has been clinically observed that the most catastrophic failures initiate from the cement interface as a result of radial crack formation, as opposed to Hertzian contact stresses originating on the occlusal surface. In this work, a 3D failure prognosis model is developed for interface-initiated failures of monolithic ceramic crowns. The surface flaw distribution parameters determined by biaxial flexural tests on ceramic plates, together with the point-to-point variations of the multi-axial stress state at the intaglio surface obtained by finite element stress analysis, are combined in a fracture-mechanics-based statistical failure probability model to predict the failure probability of a monolithic crown subjected to a single-cycle indentation load. The proposed method is verified against a prior 2D axisymmetric model and experimental data. Under conditions where the crowns are completely bonded to the tooth substrate, both high flexural stress and high interfacial shear stress are shown to occur in the wall region, where the crown thickness is relatively thin, while a high interfacial normal tensile stress distribution is observed at the margin region. A significant impact of reduced cement modulus on these stress states is shown. While the analyses are limited to single-cycle load-to-failure tests, high interfacial normal tensile stress or high interfacial shear stress may contribute to degradation of the cement bond between ceramic and dentin. In addition, the crown failure probability is shown to be controlled by high flexural stress concentrations over a small area, and the proposed method might be of some value in detecting initial crown design errors. PMID:27215334

  16. Statistical analysis of simple repeats in the human genome

    NASA Astrophysics Data System (ADS)

    Piazza, F.; Liò, P.

    2005-03-01

    The human genome contains repetitive DNA at different levels of sequence length, number, and dispersion. Highly repetitive DNA is particularly rich in homo- and di-nucleotide repeats, while middle repetitive DNA is rich in families of interspersed, mobile elements hundreds of base pairs (bp) long, among them the Alu families. A link between homo- and di-polymeric tracts and mobile elements has recently been highlighted. In particular, the mobility of Alu repeats, which form 10% of the human genome, has been correlated with the length of the poly(A) tracts located at one end of the Alu. These tracts have a rigid, non-bendable structure and have an inhibitory effect on nucleosomes, which normally compact the DNA. We performed a statistical analysis of the genome-wide distribution of lengths and inter-tract separations of poly(X) and poly(XY) tracts in the human genome. Our study shows that in humans the length distributions of these sequences reflect the dynamics of their expansion and DNA replication. By means of general tools from linguistics, we show that the latter play the role of highly significant content-bearing terms in the DNA text. Furthermore, we find that such tracts are positioned in a non-random fashion, with an apparent periodicity of 150 bases. This allows us to extend the link between repetitive, highly mobile elements such as Alus and low-complexity words in human DNA. More precisely, we show that Alus are sources of poly(X) tracts, which in turn affect in a subtle way the combination and diversification of gene expression and the fixation of multigene families.

  17. SUBMILLIMETER NUMBER COUNTS FROM STATISTICAL ANALYSIS OF BLAST MAPS

    SciTech Connect

    Patanchon, Guillaume; Ade, Peter A. R.; Griffin, Matthew; Hargrave, Peter C.; Mauskopf, Philip; Moncelsi, Lorenzo; Pascale, Enzo; Bock, James J.; Chapin, Edward L.; Halpern, Mark; Marsden, Gaelen; Scott, Douglas; Devlin, Mark J.; Dicker, Simon R.; Klein, Jeff; Rex, Marie; Gundersen, Joshua O.; Hughes, David H.; Netterfield, Calvin B.; Olmi, Luca

    2009-12-20

    We describe the application of a statistical method to estimate submillimeter galaxy number counts from confusion-limited observations by the Balloon-borne Large Aperture Submillimeter Telescope (BLAST). Our method is based on a maximum likelihood fit to the pixel histogram, sometimes called 'P(D)', an approach which has been used before to probe faint counts, the difference being that here we advocate its use even for sources with relatively high signal-to-noise ratios. This method has an advantage over standard techniques of source extraction in providing an unbiased estimate of the counts from the bright end down to flux densities well below the confusion limit. We specifically analyze BLAST observations of a roughly 10 deg² map centered on the Great Observatories Origins Deep Survey South field. We provide estimates of number counts at the three BLAST wavelengths 250, 350, and 500 μm; instead of counting sources in flux bins we estimate the counts at several flux density nodes connected with power laws. We observe a generally very steep slope for the counts of about -3.7 at 250 μm, and -4.5 at 350 and 500 μm, over the range approximately 0.02-0.5 Jy, breaking to a shallower slope below about 0.015 Jy at all three wavelengths. We also describe how to estimate the uncertainties and correlations in this method so that the results can be used for model-fitting. This method should be well suited for analysis of data from the Herschel satellite.

  18. Bayesian Statistical Analysis of Circadian Oscillations in Fibroblasts

    PubMed Central

    Cohen, Andrew L.; Leise, Tanya L.; Welsh, David K.

    2012-01-01

    Precise determination of a noisy biological oscillator’s period from limited experimental data can be challenging. The common practice is to calculate a single number (a point estimate) for the period of a particular time course. Uncertainty is inherent in any statistical estimator applied to noisy data, so our confidence in such point estimates depends on the quality and quantity of the data. Ideally, a period estimation method should both produce an accurate point estimate of the period and measure the uncertainty in that point estimate. A variety of period estimation methods are known, but few assess the uncertainty of the estimates, and a measure of uncertainty is rarely reported in the experimental literature. We compare the accuracy of point estimates using six common methods, only one of which can also produce uncertainty measures. We then illustrate the advantages of a new Bayesian method for estimating period, which outperforms the other six methods in accuracy of point estimates for simulated data and also provides a measure of uncertainty. We apply this method to analyze circadian oscillations of gene expression in individual mouse fibroblast cells and compute the number of cells and sampling duration required to reduce the uncertainty in period estimates to a desired level. This analysis indicates that, due to the stochastic variability of noisy intracellular oscillators, achieving a narrow margin of error can require an impractically large number of cells. In addition, we use a hierarchical model to determine the distribution of intrinsic cell periods, thereby separating the variability due to stochastic gene expression within each cell from the variability in period across the population of cells. PMID:22982138

  19. 3D statistical failure analysis of monolithic dental ceramic crowns.

    PubMed

    Nasrin, Sadia; Katsube, Noriko; Seghi, Robert R; Rokhlin, Stanislav I

    2016-07-01

    For adhesively retained ceramic crowns of various types, it has been clinically observed that the most catastrophic failures initiate from the cement interface as a result of radial crack formation, as opposed to Hertzian contact stresses originating on the occlusal surface. In this work, a 3D failure prognosis model is developed for interface-initiated failures of monolithic ceramic crowns. The surface flaw distribution parameters determined by biaxial flexural tests on ceramic plates, together with the point-to-point variations of the multi-axial stress state at the intaglio surface obtained by finite element stress analysis, are combined in a fracture-mechanics-based statistical failure probability model to predict the failure probability of a monolithic crown subjected to a single-cycle indentation load. The proposed method is verified against a prior 2D axisymmetric model and experimental data. Under conditions where the crowns are completely bonded to the tooth substrate, both high flexural stress and high interfacial shear stress are shown to occur in the wall region, where the crown thickness is relatively thin, while a high interfacial normal tensile stress distribution is observed at the margin region. A significant impact of reduced cement modulus on these stress states is shown. While the analyses are limited to single-cycle load-to-failure tests, high interfacial normal tensile stress or high interfacial shear stress may contribute to degradation of the cement bond between ceramic and dentin. In addition, the crown failure probability is shown to be controlled by high flexural stress concentrations over a small area, and the proposed method might be of some value in detecting initial crown design errors.

  20. Statistical Design, Models and Analysis for the Job Change Framework.

    ERIC Educational Resources Information Center

    Gleser, Leon Jay

    1990-01-01

    Proposes statistical methodology for testing Loughead and Black's "job change thermostat." Discusses choice of target population; relationship between job satisfaction and values, perceptions, and opportunities; and determinants of job change. (SK)

  1. Analysis of statistical model properties from discrete nuclear structure data

    NASA Astrophysics Data System (ADS)

    Firestone, Richard B.

    2012-02-01

    Experimental M1, E1, and E2 photon strengths have been compiled from experimental data in the Evaluated Nuclear Structure Data File (ENSDF) and the Evaluated Gamma-ray Activation File (EGAF). Over 20,000 Weisskopf reduced transition probabilities were recovered from the ENSDF and EGAF databases. These transition strengths have been analyzed for their dependence on transition energy, initial and final level energies, spin/parity, and nuclear deformation. ENSDF BE1W values were found to increase exponentially with energy, possibly consistent with the Axel-Brink hypothesis, although considerable excess strength was observed for transitions between 4 and 8 MeV. No similar energy dependence was observed in EGAF or ARC data. BM1W average values were nearly constant at all energies above 1 MeV, with substantial excess strength below 1 MeV and between 4 and 8 MeV. BE2W values decreased exponentially by a factor of 1000 from 0 to 16 MeV. The distribution of ENSDF transition probabilities for all multipolarities could be described by a lognormal statistical distribution. BE1W, BM1W, and BE2W strengths all increased substantially for initial level energies between 4 and 8 MeV, possibly due to the dominance of spin-flip and pygmy resonance transitions at those excitations. Analysis of the average resonance capture data indicated no dependence of transition probability on final level spins or energies between 0 and 3 MeV. The comparison of favored to unfavored transition probabilities for odd-A or odd-Z targets indicated only partial support for the expected branching intensity ratios, with many unfavored transitions having nearly the same strength as favored ones. Average resonance capture BE2W transition strengths generally increased with greater deformation. Analysis of ARC data suggests a large E2 admixture in M1 transitions, with mixing ratio δ ≈ 1.0. The ENSDF reduced transition strengths were considerably stronger than those derived from capture gamma ray

  2. Statistical Learning in Typically Developing Children: The Role of Age and Speed of Stimulus Presentation

    ERIC Educational Resources Information Center

    Arciuli, Joanne; Simpson, Ian C.

    2011-01-01

    It is possible that statistical learning (SL) plays a role in almost every mental activity. Indeed, research on SL has grown rapidly over recent decades in an effort to better understand perception and cognition. Yet, there remain gaps in our understanding of how SL operates, in particular with regard to its (im)mutability. Here, we investigated…

  3. The Coming of Age of Statistics Education in New Zealand, and Its Influence Internationally

    ERIC Educational Resources Information Center

    Forbes, Sharleen

    2014-01-01

    New Zealand has been leading the world in terms of the data handling, and in more recent years, data visualisation approach in its school statistics curriculum. In 2013, bootstrapping and randomisation were added to the senior secondary school (Ministry of Education 2012). This paper gives an historical perspective of the people and groups that…

  4. Exploratory Factor Analysis of Diagnostic and Statistical Manual, 5th Edition, Criteria for Posttraumatic Stress Disorder.

    PubMed

    McSweeney, Lauren B; Koch, Ellen I; Saules, Karen K; Jefferson, Stephen

    2016-01-01

    One change to the posttraumatic stress disorder (PTSD) nomenclature highlighted in the Diagnostic and Statistical Manual, 5th Edition (DSM-5; American Psychiatric Association, 2013) is the conceptualization of PTSD as a diagnostic category with four distinct symptom clusters. This article presents an exploratory factor analysis testing the structural validity of the DSM-5 conceptualization of PTSD via an online survey that included the PTSD Checklist-5. The study utilized a sample of 113 college students from a large Midwestern university and 177 Amazon Mechanical Turk users. Participants were primarily female, Caucasian, single, and heterosexual, with an average age of 32 years. Approximately 30% to 35% of participants met diagnostic criteria for PTSD based on two different scoring criteria. Results of the exploratory factor analysis revealed five distinct symptom clusters. The implications for the classification of PTSD are discussed.

  5. Statistical analysis of synaptic transmission: model discrimination and confidence limits.

    PubMed Central

    Stricker, C; Redman, S; Daley, D

    1994-01-01

    Procedures for discriminating between competing statistical models of synaptic transmission, and for providing confidence limits on the parameters of these models, have been developed. These procedures were tested against simulated data and were used to analyze the fluctuations in synaptic currents evoked in hippocampal neurones. All models were fitted to data using the Expectation-Maximization algorithm and a maximum likelihood criterion. Competing models were evaluated using the log-likelihood ratio (Wilks statistic). When the competing models were not nested, Monte Carlo sampling of the model used as the null hypothesis (H0) provided density functions against which H0 and the alternate model (H1) were tested. The statistic for the log-likelihood ratio was determined from the fit of H0 and H1 to these probability densities. This statistic was used to determine the significance level at which H0 could be rejected for the original data. When the competing models were nested, log-likelihood ratios and the χ² statistic were used to determine the confidence level for rejection. Once the model that provided the best statistical fit to the data was identified, many estimates for the model parameters were calculated by resampling the original data. Bootstrap techniques were then used to obtain the confidence limits of these parameters. PMID:7948672
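
    The bootstrap step described at the end of the abstract can be sketched compactly; the amplitude data below are synthetic stand-ins for evoked synaptic currents, and the resampled statistic (the mean) is an assumption for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    # Hypothetical peak synaptic current amplitudes (pA)
    amplitudes = rng.normal(50.0, 12.0, 80)

    # Nonparametric bootstrap: resample with replacement, recompute the estimate
    boot = np.array([rng.choice(amplitudes, amplitudes.size, replace=True).mean()
                     for _ in range(10_000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"mean = {amplitudes.mean():.1f} pA, 95% CI = ({lo:.1f}, {hi:.1f})")
    ```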

  6. A statistical model for the study of U-Nb aging (u)

    SciTech Connect

    Hemphill, Geralyn M; Hackenberg, Robert E

    2009-01-01

    This study was undertaken to model the aging response of U-Nb alloys in order to quantify property and lifetime predictions and uncertainties, in response to concerns that aging during long-term stockpile storage may change the microstructure and properties of U-6 wt%Nb alloy components in ways adversely affecting performance. U-6Nb has many desirable properties, but is a complex material because of its gross compositional inhomogeneity (its chemical banding spans 4-8 wt%), its metastable starting microstructure, and the fact that a variety of external factors such as temperature, stress, and gaseous species can cause aging through multiple mechanisms. The most significant aging mechanism identified in earlier studies [2007hac2] is age hardening, phenomenologically defined as increasing hardness and strength and decreasing ductility observed as a function of increasing aging time-at-temperature. The scientific fundamentals of age hardening at temperatures relevant to U-6Nb material processing (≤200 °C) and stockpile storage (≤60 °C) remain unresolved in spite of significant experimental efforts [2007hac2, 2009cla]. Equally problematic is the lack of a well-established U-6Nb component failure criterion. These limitations make the most desirable approach of property response and lifetime prediction - that based on fundamental physics - unattainable at the present time. Therefore, a semi-empirical approach was taken to model the phenomenological property evolution during aging. This enabled lifetime estimates to be made from an assumed failure criterion (derived from a manufacturing acceptance criterion) couched in terms of an age-sensitive property, namely quasi-static tensile elongation to failure. The predictions of this and other age-sensitive properties are also useful for U-6Nb component surveillance studies. Drawing upon a large body of artificial aging data obtained from nonbanded (chemically homogeneous) U-5.6Nb and U-7.7Nb material [2007hacJ] over 100

  7. Statistical analysis of observational study of the influence of radon and other risk factors on lung cancer incidence.

    PubMed

    Zhukovsky, Michael; Varaksin, Anatole; Pakholkina, Olga

    2014-07-01

    An observational study is a type of epidemiological study in which the researcher observes the situation but is not able to change the conditions of the experiment. A statistical analysis of an observational study of the population of the city of Lermontov (North Caucasus) was conducted. The initial group comprised 121 people with a lung cancer diagnosis and 196 people in the control group. Statistical analysis was performed only for men (95 cases and 76 controls). Logistic regression with correction for age gives an odds ratio of 1.95 (0.87-4.37; 90% CI) per 100 working level months of combined (occupational and domestic) radon exposure. It was demonstrated that chronic lung disease is an additional risk factor for uranium miners but not a significant risk factor for the general population. Thus, the possibility of obtaining statistically reliable results in observational studies, when correct methods of analysis are used, is demonstrated.
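
    Age-corrected logistic regression of this kind takes only a few lines with statsmodels. Everything below (the exposure distribution and the toy true model) is a synthetic assumption; only the analysis pattern - an odds ratio per 100 WLM with a 90% confidence interval - follows the abstract.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 171                                  # 95 cases + 76 controls, as in the study
    exposure = rng.gamma(2.0, 50.0, n)       # cumulative radon exposure (WLM), toy
    age = rng.normal(60, 8, n)
    logit = -0.7 + 0.0067 * exposure + 0.03 * (age - 60)    # toy true model
    case = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

    # Covariates: intercept, exposure per 100 WLM, age (the correction term)
    X = sm.add_constant(np.column_stack([exposure / 100.0, age]))
    fit = sm.Logit(case, X).fit(disp=False)

    or_100wlm = np.exp(fit.params[1])
    ci = np.exp(fit.conf_int(alpha=0.10)[1])  # 90% CI, matching the abstract
    print(f"OR per 100 WLM = {or_100wlm:.2f}, 90% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```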

  8. Application of artificial aging techniques to samples of rum and comparison with traditionally aged rums by analysis with artificial neural nets.

    PubMed

    Quesada Granados, J; Merelo Guervós, J J; Oliveras López, M J; González Peñalver, J; Olalla Herrera, M; Blanca Herrera, R; López Martinez, M C

    2002-03-13

    Artificial aging techniques were applied to samples of rum. These samples were then compared with traditionally aged rums by means of artificial neural nets, based on the phenolic and furanic composition of each sample. Few statistically significant differences were found between samples, confirming the possibility of applying artificial aging techniques to obtain rum with phenolic and furanic characteristics similar to those of rum obtained by traditional methods.

  9. Age-Associated Changes in the Spectral and Statistical Parameters of Surface Electromyogram of Tibialis Anterior

    PubMed Central

    2016-01-01

    Age-related neuromuscular change of Tibialis Anterior (TA) is a leading cause of muscle strength decline among the elderly. This study has established the baseline for age-associated changes in sEMG of TA at different levels of voluntary contraction. We have investigated the use of Gaussianity and maximal power of the power spectral density (PSD) as suitable features to identify age-associated changes in the surface electromyogram (sEMG). Eighteen younger (20–30 years) and 18 older (60–85 years) cohorts completed two trials of isometric dorsiflexion at four different force levels between 10% and 50% of the maximal voluntary contraction. Gaussianity and maximal power of the PSD of sEMG were determined. Results show a significant increase in sEMG's maximal power of the PSD and Gaussianity with increase in force for both cohorts. It was also observed that older cohorts had higher maximal power of the PSD and lower Gaussianity. These age-related differences observed in the PSD and Gaussianity could be due to motor unit remodelling. This can be useful for noninvasive tracking of age-associated neuromuscular changes. PMID:27610379

  10. Estimating Small-area Populations by Age and Sex Using Spatial Interpolation and Statistical Inference Methods

    SciTech Connect

    Qai, Qiang; Rushton, Gerald; Bhaduri, Budhendra L; Bright, Eddie A; Coleman, Phil R

    2006-01-01

    The objective of this research is to compute population estimates by age and sex for small areas whose boundaries are different from those for which the population counts were made. In our approach, population surfaces and age-sex proportion surfaces are separately estimated. Age-sex population estimates for small areas and their confidence intervals are then computed using a binomial model with the two surfaces as inputs. The approach was implemented for Iowa using a 90 m resolution population grid (LandScan USA) and U.S. Census 2000 population. Three spatial interpolation methods, the areal weighting (AW) method, the ordinary kriging (OK) method, and a modification of the pycnophylactic method, were used on Census Tract populations to estimate the age-sex proportion surfaces. To verify the model, age-sex population estimates were computed for paired Block Groups that straddled Census Tracts and therefore were spatially misaligned with them. The pycnophylactic method and the OK method were more accurate than the AW method. The approach is general and can be used to estimate subgroup-count types of variables from information in existing administrative areas for custom-defined areas used as the spatial basis of support in other applications.
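
    The final step - a binomial model combining an estimated area population with an interpolated age-sex proportion - can be sketched as follows; the area total and proportion below are hypothetical inputs standing in for the population and proportion surfaces.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical inputs for one custom-defined area
    n_total = 1840      # estimated persons in the area, from the population surface
    p_f65   = 0.085     # proportion female aged 65+, from the proportion surface

    estimate = n_total * p_f65
    # Binomial-model confidence interval for the subgroup count
    lo, hi = stats.binom.interval(0.95, n_total, p_f65)
    print(f"females 65+: {estimate:.0f} (95% interval {lo:.0f}-{hi:.0f})")
    ```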

  12. Age-Associated Changes in the Spectral and Statistical Parameters of Surface Electromyogram of Tibialis Anterior.

    PubMed

    Siddiqi, Ariba; Arjunan, Sridhar Poosapadi; Kumar, Dinesh Kant

    2016-01-01

    Age-related neuromuscular change of Tibialis Anterior (TA) is a leading cause of muscle strength decline among the elderly. This study has established the baseline for age-associated changes in sEMG of TA at different levels of voluntary contraction. We have investigated the use of Gaussianity and maximal power of the power spectral density (PSD) as suitable features to identify age-associated changes in the surface electromyogram (sEMG). Eighteen younger (20-30 years) and 18 older (60-85 years) cohorts completed two trials of isometric dorsiflexion at four different force levels between 10% and 50% of the maximal voluntary contraction. Gaussianity and maximal power of the PSD of sEMG were determined. Results show a significant increase in sEMG's maximal power of the PSD and Gaussianity with increase in force for both cohorts. It was also observed that older cohorts had higher maximal power of the PSD and lower Gaussianity. These age-related differences observed in the PSD and Gaussianity could be due to motor unit remodelling. This can be useful for noninvasive tracking of age-associated neuromuscular changes. PMID:27610379
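
    The two sEMG features can be estimated as below. Excess kurtosis is used here as a simple stand-in for the paper's Gaussianity measure, and the signal is synthetic; both are assumptions for illustration.

    ```python
    import numpy as np
    from scipy import signal, stats

    rng = np.random.default_rng(6)
    fs = 1000                                 # sampling rate (Hz)
    emg = rng.standard_normal(10 * fs)        # stand-in for a 10 s sEMG record

    # Maximal power of the PSD (Welch estimate)
    f, psd = signal.welch(emg, fs=fs, nperseg=1024)
    print(f"max PSD power {psd.max():.3e} at {f[np.argmax(psd)]:.0f} Hz")

    # Excess kurtosis as a simple Gaussianity index (0 for a Gaussian signal)
    print(f"excess kurtosis = {stats.kurtosis(emg):.3f}")
    ```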

  13. Statistical Analysis of CMC Constituent and Processing Data

    NASA Technical Reports Server (NTRS)

    Fornuff, Jonathan

    2004-01-01

    In this study, SiC/SiC composites of varying architectures, utilizing a boron-nitride (BN) …, were observed using statistical analysis software. The work undertaken this summer explores, in general, the key properties needed; the ultimate purpose is to determine what variations in material processing can lead to the most critical changes in the material's properties.

  14. Constraints on statistical computations at 10 months of age: the use of phonological features.

    PubMed

    Gonzalez-Gomez, Nayeli; Nazzi, Thierry

    2015-11-01

    Recently, several studies have argued that infants capitalize on the statistical properties of natural languages to acquire the linguistic structure of their native language, but the kinds of constraints that apply to statistical computations remain largely unknown. Here we explored French-learning infants' perceptual preference for labial-coronal (LC) words over coronal-labial (CL) words (e.g. preferring bat over tab) to determine whether this phonotactic preference is based on the acquisition of the statistical properties of the input based on a single phonological feature (i.e. place of articulation), multiple features (i.e. place and manner of articulation), or individual consonant pairs. Results from four experiments revealed that infants had a labial-coronal bias for nasal sequences (Experiment 1) and for all plosive sequences (Experiments 2 and 4) but a coronal-labial bias for all fricative sequences (Experiments 3 and 4), independently of the frequencies of individual consonant pairs. These results establish for the first time that constellations of multiple phonological features, defining broad consonant classes, constrain the early acquisition of the phonotactic regularities of the native language.

  15. χ² versus median statistics in supernova type Ia data analysis

    SciTech Connect

    Barreira, A.; Avelino, P. P.

    2011-10-15

    In this paper we compare the performances of the χ² and median likelihood analyses in the determination of cosmological constraints using type Ia supernovae data. We perform a statistical analysis using the 307 supernovae of the Union 2 compilation of the Supernova Cosmology Project and find that the χ² statistical analysis yields tighter cosmological constraints than the median statistic if only supernovae data are taken into account. We also show that when additional measurements from the cosmic microwave background and baryonic acoustic oscillations are considered, the combined cosmological constraints are not strongly dependent on whether one applies the χ² statistic or the median statistic to the supernovae data. This indicates that, when complementary information from other cosmological probes is taken into account, the performances of the χ² and median statistics are very similar, demonstrating the robustness of the statistical analysis.
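
    The two statistics being compared can be sketched side by side. The residuals below are synthetic; the median-statistics approach exploits the fact that, for the true model, the count of residuals above zero follows a Binomial(N, 1/2) distribution, while χ² sums the squared standardized residuals.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Hypothetical standardized distance-modulus residuals for a trial cosmology,
    # (mu_obs - mu_model)/sigma, with a Union 2-sized sample
    resid = rng.standard_normal(307)

    chi2 = np.sum(resid**2)                   # chi-squared statistic
    n_above = np.sum(resid > 0)               # median statistic
    p_median = stats.binomtest(n_above, resid.size, 0.5).pvalue

    print(f"chi2 = {chi2:.1f} for N = {resid.size}")
    print(f"{n_above}/{resid.size} residuals above zero, binomial p = {p_median:.2f}")
    ```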

  16. Interfaces between statistical analysis packages and the ESRI geographic information system

    NASA Technical Reports Server (NTRS)

    Masuoka, E.

    1980-01-01

    Interfaces between ESRI's geographic information system (GIS) data files and real valued data files written to facilitate statistical analysis and display of spatially referenced multivariable data are described. An example of data analysis which utilized the GIS and the statistical analysis system is presented to illustrate the utility of combining the analytic capability of a statistical package with the data management and display features of the GIS.

  17. Gene Identification Algorithms Using Exploratory Statistical Analysis of Periodicity

    NASA Astrophysics Data System (ADS)

    Mukherjee, Shashi Bajaj; Sen, Pradip Kumar

    2010-10-01

    Studying periodic patterns is a standard line of attack for characterizing DNA sequences in gene identification and similar problems, yet surprisingly little significant work has been done in this direction. This paper studies statistical properties of DNA sequences of a complete genome using a new technique. A DNA sequence is converted to a numeric sequence using various types of mappings, and standard Fourier techniques are applied to study the periodicity. Distinct statistical behaviour of the periodicity parameters is found in coding and non-coding sequences, which can be used to distinguish between these parts. Here DNA sequences of Drosophila melanogaster were analyzed with significant accuracy.
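
    The mapping-plus-Fourier procedure can be illustrated with a binary indicator mapping, one standard choice among the "various types of mappings" the abstract mentions. The sequence below is random, so unlike coding DNA it should show no period-3 codon peak.

    ```python
    import numpy as np

    def indicator_spectrum(seq, base):
        """Power spectrum of the 0/1 indicator sequence of one nucleotide."""
        x = np.array([1.0 if s == base else 0.0 for s in seq])
        x -= x.mean()                         # remove the DC component
        power = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(len(x))       # cycles per base pair
        return freqs, power

    rng = np.random.default_rng(8)
    seq = "".join(rng.choice(list("ACGT"), 3000))   # stand-in for a genomic window

    freqs, power = indicator_spectrum(seq, "G")
    peak = freqs[1:][np.argmax(power[1:])]    # skip the zero frequency
    print(f"dominant period ~ {1/peak:.1f} bp")
    # Coding DNA typically shows a peak at period 3 (f = 1/3) due to codon structure.
    ```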

  18. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    NASA Technical Reports Server (NTRS)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  19. Statistical Power Analysis in Education Research. NCSER 2010-3006

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Rhoads, Christopher

    2010-01-01

    This paper provides a guide to calculating statistical power for the complex multilevel designs that are used in most field studies in education research. For multilevel evaluation studies in the field of education, it is important to account for the impact of clustering on the standard errors of estimates of treatment effects. Using ideas from…

  20. Statistical models and NMR analysis of polymer microstructure

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Statistical models can be used in conjunction with NMR spectroscopy to study polymer microstructure and polymerization mechanisms. Thus, Bernoullian, Markovian, and enantiomorphic-site models are well known. Many additional models have been formulated over the years for additional situations. Typica...

  1. Did Tanzania Achieve the Second Millennium Development Goal? Statistical Analysis

    ERIC Educational Resources Information Center

    Magoti, Edwin

    2016-01-01

    Development Goal "Achieve universal primary education", the challenges faced, along with the way forward towards achieving the fourth Sustainable Development Goal "Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all". Statistics show that Tanzania has made very promising steps…

  2. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  3. Private School Universe Survey, 1991-92. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Broughman, Stephen; And Others

    This report on the private school universe, a data collection system developed by the National Center for Education Statistics, presents data on schools with grades kindergarten through 12 by school size, school level, religious orientation, geographical region, and program emphasis. Numbers of students and teachers are reported in the same…

  4. The PRIME System: Computer Programs for Statistical Analysis.

    ERIC Educational Resources Information Center

    Veldman, Donald J.

    PRIME is a library of 44 batch-oriented computer routines: 20 major package programs, which use 12 statistical utility routines, and 12 other utility routines for input/output and data manipulation. This manual contains a general description of data preparation and coding, standard control cards, input deck arrangement, standard options, and…

  5. Toward a comprehensive framework for the spatiotemporal statistical analysis of longitudinal shape data

    PubMed Central

    Durrleman, S.; Pennec, X.; Trouvé, A.; Braga, J.; Gerig, G.; Ayache, N.

    2013-01-01

    This paper proposes an original approach for the statistical analysis of longitudinal shape data. The proposed method allows the characterization of typical growth patterns and subject-specific shape changes in repeated time-series observations of several subjects. This can be seen as the extension of the usual longitudinal statistics of scalar measurements to high-dimensional shape or image data. The method is based on the estimation of continuous subject-specific growth trajectories and the comparison of such temporal shape changes across subjects. Differences between growth trajectories are decomposed into morphological deformations, which account for shape changes independent of the time, and time warps, which account for different rates of shape changes over time. Given a longitudinal shape data set, we estimate a mean growth scenario representative of the population, and the variations of this scenario both in terms of shape changes and in terms of change in growth speed. Then, intrinsic statistics are derived in the space of spatiotemporal deformations, which characterize the typical variations in shape and in growth speed within the studied population. They can be used to detect systematic developmental delays across subjects. In the context of neuroscience, we apply this method to analyze the differences in the growth of the hippocampus in children diagnosed with autism, in children with developmental delays, and in controls. Results suggest that group differences may be better characterized by a different speed of maturation rather than shape differences at a given age. In the context of anthropology, we assess the differences in the typical growth of the endocranium between chimpanzees and bonobos. We take advantage of this study to show the robustness of the method with respect to change of parameters and perturbation of the age estimates. PMID:23956495

  6. Improved reporting of statistical design and analysis: guidelines, education, and editorial policies.

    PubMed

    Mazumdar, Madhu; Banerjee, Samprit; Van Epps, Heather L

    2010-01-01

    A majority of original articles published in biomedical journals include some form of statistical analysis. Unfortunately, many of these articles contain errors in statistical design and/or analysis. These errors are worrisome, as the misuse of statistics jeopardizes the process of scientific discovery and the accumulation of scientific knowledge. To help avoid these errors and improve statistical reporting, four approaches are suggested: (1) development of guidelines for statistical reporting that could be adopted by all journals, (2) improvement of the statistics curricula in biomedical research programs, with an emphasis on hands-on teaching by biostatisticians, (3) expansion and enhancement of biomedical science curricula in statistics programs, and (4) increased participation of biostatisticians in the peer review process, along with the adoption of more rigorous journal editorial policies regarding statistics. In this chapter, we provide an overview of these issues with emphasis on the field of molecular biology and highlight the need for continuing efforts on all fronts.

  7. Digital Natives, Digital Immigrants: An Analysis of Age and ICT Competency in Teacher Education

    ERIC Educational Resources Information Center

    Guo, Ruth Xiaoqing; Dobson, Teresa; Petrina, Stephen

    2008-01-01

    This article examines the intersection of age and ICT (information and communication technology) competency and critiques the "digital natives versus digital immigrants" argument proposed by Prensky (2001a, 2001b). Quantitative analysis was applied to a statistical data set collected in the context of a study with over 2,000 pre-service teachers…

  8. Research on the integrative strategy of spatial statistical analysis of GIS

    NASA Astrophysics Data System (ADS)

    Xie, Zhong; Han, Qi Juan; Wu, Liang

    2008-12-01

    At present, spatial social and natural phenomena are studied with both GIS techniques and statistical methods. However, many complex practical applications constrain these research methods, and the data models and technologies they exploit remain highly specialized. This paper first summarizes the requirements of spatial statistical analysis. On the basis of these requirements, universal spatial statistical models are transformed into function tools in a statistical GIS system, and a pyramidal structure of three layers is put forward. It thus becomes feasible to combine the techniques of spatial data management, search, and visualization in GIS with the data-processing methods of statistical analysis, forming an integrative statistical GIS environment for the management, analysis, and application of spatial statistical information and for decision support.

  9. Modeling statistical properties of the X-ray emission from aged pulsar wind nebulae

    NASA Astrophysics Data System (ADS)

    Bandiera, R.

    2014-03-01

    The number of known pulsar wind nebulae (PWNe) has recently increased considerably, and the majority of them are now middle-aged objects. Recent studies have shown a clear correlation of both X-ray luminosity and size with the PWN age, but fail to provide a thorough explanation of the observed trends. Here I propose a different approach to these effects, based on the hypothesis that the observed trends do not simply reproduce the evolution of a "typical" PWN, but are a combined effect of PWNe evolving under different ambient conditions, the leading parameter being the ambient medium density. Using a simple analytic approach, I show that most middle-aged PWNe are more likely observable during the reverberation phase, and I succeed in reproducing trends consistent with those observed, provided that the evolution of the X-ray emitting electrons remains adiabatic over the whole reverberation phase. As a direct consequence, I show that the X-ray spectra of older PWNe should be harder, also consistent with observations.

  10. Spatial statistical analysis of basal stem root disease under natural field epidemic of oil palm

    NASA Astrophysics Data System (ADS)

    Kamu, Assis; Phin, Chong Khim; Seman, Idris Abu; Wan, Hoong Hak; Mun, Ho Chong

    2015-02-01

    Oil palm, scientifically known as Elaeis guineensis Jacq., is the most important commodity crop in Malaysia and has contributed greatly to the economic growth of the country. As far as disease is concerned, basal stem rot (BSR), caused by Ganoderma boninense, remains the most important disease in the industry and is the most widely studied oil palm disease in Malaysia. However, studies of the spatial and temporal pattern of the disease, especially under natural field epidemic conditions in oil palm plantations, remain limited. The objective of this study is to identify the spatial pattern of BSR disease under a natural field epidemic using two geospatial analytical techniques: quadrat analysis for the first-order properties of point pattern analysis and nearest-neighbor analysis (NNA) for the second-order properties. Two study sites with trees of different ages were selected, both located in Tawau, Sabah, and managed by the same company. The results showed that at least one of the point pattern analyses used, NNA (i.e., the second-order properties), confirmed that the disease exhibits complete spatial randomness. This suggests that the disease does not spread from tree to tree and that palm age does not play a significant role in determining the spatial pattern of the disease. The spatial pattern of the disease should help disease management programs and the industry in the future, and statistical modelling is expected to help identify the right model for estimating the yield loss of oil palm due to BSR disease.
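
    The NNA step above can be illustrated with a Clark-Evans nearest-neighbour test for complete spatial randomness; a minimal sketch in which the palm coordinates and plot size are synthetic placeholders rather than the study's data:

    ```python
    # Clark-Evans nearest-neighbour test for complete spatial randomness (CSR).
    # Diseased-palm coordinates and plot area here are synthetic placeholders.
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 100, size=(200, 2))   # tree coordinates (m)
    area = 100.0 * 100.0                      # plot area (m^2)

    n = len(xy)
    # k=2 because each point's nearest neighbour at distance 0 is itself
    d, _ = cKDTree(xy).query(xy, k=2)
    mean_obs = d[:, 1].mean()

    density = n / area
    mean_exp = 0.5 / np.sqrt(density)         # expected mean NN distance under CSR
    se = 0.26136 / np.sqrt(n * density)       # its standard error under CSR
    R = mean_obs / mean_exp                   # R ~ 1 random, < 1 clustered, > 1 regular
    z = (mean_obs - mean_exp) / se
    print(f"R = {R:.3f}, z = {z:.2f}, p = {2 * norm.sf(abs(z)):.3f}")
    ```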

  11. Descriptive statistics tables from a detailed analysis of the National Human Activity Pattern Survey (NHAPS) data

    SciTech Connect

    Tsang, A.M.; Klepeis, N.E.

    1996-07-01

    Detailed results tables are presented from an unweighted statistical analysis of selected portions of the 1992-1994 National Human Activity Pattern Survey (NHAPS) database. This survey collected data on the potential exposure of Americans to important household pollutants. Randomly selected individuals (9,386) supplied minute-by-minute diaries spanning a 24-hour day as well as answers to follow-up questions on specific exposure types. Selected 24-hour diary locations and activities, selected regroupings of the 24-hour diary locations, activities, and smoker-present categories, and most of the follow-up question variables in the NHAPS database were statistically analyzed across 12 subgroups (gender, age, Hispanic, education, employment, census region, day-of-week, season, asthma, angina and bronchitis/emphysema). Overall statistics were also generated for the 9,386 total respondents. Tables show descriptive statistics (including frequency distributions) of time spent and frequency of occurrence in each of 39 locations and for 22 activities (that were experienced by more than 50 respondents), along with equivalent tables for 10 regrouped locations (Residence-Indoors, Residence-Outdoors, Inside Vehicle, Near Vehicle, Other Outdoor, Office/Factory, Mall/Store, Public Building, Bar/Restaurant, Other Indoor), seven regrouped activities and smoker present. Tables of frequency distributions of time spent in exposure activities, or the frequency of occurrence of exposure activities, as determined from the follow-up questions that were analyzed are also presented. Detailed indices provide page numbers for each table. An Appendix contains a condensed listing of the questionnaires (Versions A and B for adults, child-direct and child-proxy interview types), including the question number, the NHAPS database variable name, and the verbatim question text.

  12. A Statistical Framework for the Functional Analysis of Metagenomes

    SciTech Connect

    Sharon, Itai; Pati, Amrita; Markowitz, Victor; Pinter, Ron Y.

    2008-10-01

    Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. The authors present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations, which can be used to remove seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.

  13. Statistical Analysis of CFD Solutions from the Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.

    2002-01-01

    A simple, graphical framework is presented for robust statistical evaluation of results obtained from N-Version testing of a series of RANS CFD codes. The solutions were obtained by a variety of code developers and users for the June 2001 Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration used for the computational tests is the DLR-F4 wing-body combination previously tested in several European wind tunnels and for which a previous N-Version test had been conducted. The statistical framework is used to evaluate code results for (1) a single cruise design point, (2) drag polars and (3) drag rise. The paper concludes with a discussion of the meaning of the results, especially with respect to predictability, validation, and reporting of solutions.

  14. Statistical Methods for Rapid Aerothermal Analysis and Design Technology

    NASA Technical Reports Server (NTRS)

    Morgan, Carolyn; DePriest, Douglas; Thompson, Richard (Technical Monitor)

    2002-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to establish statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The research work was focused on establishing the suitable mathematical/statistical models for these purposes. It is anticipated that the resulting models can be incorporated into a software tool to provide rapid, variable-fidelity, aerothermal environments to predict heating along an arbitrary trajectory. This work will support development of an integrated design tool to perform automated thermal protection system (TPS) sizing and material selection.

  15. Common misconceptions about data analysis and statistics

    PubMed Central

    Motulsky, Harvey J

    2015-01-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: (1) P-Hacking. This is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want. (2) Overemphasis on P values rather than on the actual size of the observed effect. (3) Overuse of statistical hypothesis testing, and being seduced by the word “significant”. (4) Overreliance on standard errors, which are often misunderstood. PMID:25692012
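
    Mistake (1) is easy to demonstrate by simulation: when two samples come from the same distribution but replicates are added and the test repeated until p < 0.05, the false-positive rate climbs well above the nominal 5%. A minimal sketch (the sample sizes and the number of re-tests are arbitrary choices):

    ```python
    # Simulate sequential testing: add replicates and re-test until p < 0.05.
    # Both groups share the same distribution, so every rejection is false.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(1)
    n_experiments, false_positives = 1000, 0
    for _ in range(n_experiments):
        a, b = list(rng.normal(size=5)), list(rng.normal(size=5))
        for _ in range(10):                      # keep adding one replicate per group...
            if ttest_ind(a, b).pvalue < 0.05:    # ...and re-testing after each addition
                false_positives += 1
                break
            a.append(rng.normal())
            b.append(rng.normal())
    print(f"false-positive rate: {false_positives / n_experiments:.3f}")  # well above 0.05
    ```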

  16. Statistical analysis of motion contrast in optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Cheng, Yuxuan; Guo, Li; Pan, Cong; Lu, Tongtong; Hong, Tianyu; Ding, Zhihua; Li, Peng

    2015-11-01

    Optical coherence tomography angiography (Angio-OCT), mainly based on the temporal dynamics of OCT scattering signals, has found a range of potential applications in clinical and scientific research. Based on the model of random phasor sums, temporal statistics of the complex-valued OCT signals are mathematically described. Statistical distributions of the amplitude differential and complex differential Angio-OCT signals are derived. The theories are validated through the flow phantom and live animal experiments. Using the model developed, the origin of the motion contrast in Angio-OCT is mathematically explained, and the implications in the improvement of motion contrast are further discussed, including threshold determination and its residual classification error, averaging method, and scanning protocol. The proposed mathematical model of Angio-OCT signals can aid in the optimal design of the system and associated algorithms.

  17. Statistical analysis of modeling error in structural dynamic systems

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1990-01-01

    The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.

  18. Ambiguity and nonidentifiability in the statistical analysis of neural codes

    PubMed Central

    Amarasingham, Asohan; Geman, Stuart; Harrison, Matthew T.

    2015-01-01

    Many experimental studies of neural coding rely on a statistical interpretation of the theoretical notion of the rate at which a neuron fires spikes. For example, neuroscientists often ask, “Does a population of neurons exhibit more synchronous spiking than one would expect from the covariability of their instantaneous firing rates?” For another example, “How much of a neuron’s observed spiking variability is caused by the variability of its instantaneous firing rate, and how much is caused by spike timing variability?” However, a neuron’s theoretical firing rate is not necessarily well-defined. Consequently, neuroscientific questions involving the theoretical firing rate do not have a meaning in isolation but can only be interpreted in light of additional statistical modeling choices. Ignoring this ambiguity can lead to inconsistent reasoning or wayward conclusions. We illustrate these issues with examples drawn from the neural-coding literature. PMID:25934918

  19. Computational and Statistical Analysis of Protein Mass Spectrometry Data

    PubMed Central

    Noble, William Stafford; MacCoss, Michael J.

    2012-01-01

    High-throughput proteomics experiments involving tandem mass spectrometry produce large volumes of complex data that require sophisticated computational analyses. As such, the field offers many challenges for computational biologists. In this article, we briefly introduce some of the core computational and statistical problems in the field and then describe a variety of outstanding problems that readers of PLoS Computational Biology might be able to help solve. PMID:22291580

  20. Statistical mechanics analysis of thresholding 1-bit compressed sensing

    NASA Astrophysics Data System (ADS)

    Xu, Yingying; Kabashima, Yoshiyuki

    2016-08-01

    The one-bit compressed sensing framework aims to reconstruct a sparse signal by using only the sign information of its linear measurements. To compensate for the loss of scale information, past studies in the area have proposed recovering the signal by imposing an additional constraint on the l2-norm of the signal. Recently, an alternative strategy that captures scale information by introducing a threshold parameter to the quantization process was advanced. In this paper, we analyze the typical behavior of thresholding 1-bit compressed sensing utilizing the replica method of statistical mechanics, in order to gain insight into properly setting the threshold value. Our result shows that, statistically, fixing the threshold at an optimally tuned constant value yields better performance than varying it randomly. Unfortunately, the optimal threshold value depends on the statistical properties of the target signal, which may not be known in advance. In order to handle this inconvenience, we develop a heuristic that adaptively tunes the threshold parameter based on the frequency of positive (or negative) values in the binary outputs. Numerical experiments show that the heuristic exhibits satisfactory performance while incurring low computational cost.
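
    The adaptive heuristic can be caricatured in a few lines: the measurements take the form y = sign(Ax - tau), and tau is tuned until the empirical fraction of positive outputs reaches a target. The signal model, dimensions, and target fraction below are illustrative assumptions, not the paper's settings:

    ```python
    # Tune the quantization threshold tau from the sign of the 1-bit outputs.
    # Signal model, sizes, and the 25% target fraction are illustrative only.
    import numpy as np

    rng = np.random.default_rng(2)
    n, m, k = 256, 512, 16
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.normal(size=k)   # k-sparse signal
    A = rng.normal(size=(m, n)) / np.sqrt(n)                  # measurement matrix

    # Bisection on tau: the fraction of +1 outputs decreases as tau grows
    target, lo, hi = 0.25, -3.0, 3.0
    for _ in range(40):
        tau = 0.5 * (lo + hi)
        frac_pos = (np.sign(A @ x - tau) > 0).mean()
        if frac_pos > target:
            lo = tau
        else:
            hi = tau
    print(f"tuned threshold tau = {tau:.3f}, fraction of +1 = {frac_pos:.3f}")
    ```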

  1. Aging and Alienation: A Longitudinal Analysis.

    ERIC Educational Resources Information Center

    Wong, Tso Sang

    Alienation has been a key concept and major area of empirical studies in sociology and psychology; however, most alienation studies have not dealt with the elderly. In an attempt to explore the effects of the aging process and the major events of later life on the aging person's vulnerability to alienation, older residents (50 years or more) in a…

  2. Condition of America's Public School Facilities, 1999. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Lewis, Laurie; Snow, Kyle; Farris, Elizabeth; Smerdon, Becky; Cronen, Stephanie; Kaplan, Jessica

    This report provides national data for 903 U.S. public elementary and secondary schools on the condition of public schools in 1999 and the costs to bring them into good condition. Additionally provided are school plans for repairs, renovations, and replacements; data on the age of public schools; and overcrowding and practices used to address…

  3. Child Mortality in a Developing Country: A Statistical Analysis

    ERIC Educational Resources Information Center

    Uddin, Md. Jamal; Hossain, Md. Zakir; Ullah, Mohammad Ohid

    2009-01-01

    This study uses data from the "Bangladesh Demographic and Health Survey (BDHS) 1999-2000" to investigate the predictors of child (age 1-4 years) mortality in a developing country like Bangladesh. The cross-tabulation and multiple logistic regression techniques have been used to estimate the predictors of child mortality. The cross-tabulation…

  4. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... HUMAN SERVICES Food and Drug Administration Guidance for Industry on Documenting Statistical Analysis...: The Food and Drug Administration (FDA) is announcing the availability of a guidance for industry 197 entitled "Documenting Statistical Analysis Programs and Data Files." This guidance is provided to...

  5. Statistics Education Research in Malaysia and the Philippines: A Comparative Analysis

    ERIC Educational Resources Information Center

    Reston, Enriqueta; Krishnan, Saras; Idris, Noraini

    2014-01-01

    This paper presents a comparative analysis of statistics education research in Malaysia and the Philippines by modes of dissemination, research areas, and trends. An electronic search for published research papers in the area of statistics education from 2000-2012 yielded 20 for Malaysia and 19 for the Philippines. Analysis of these papers showed…

  6. Cost-effectiveness analysis: a proposal of new reporting standards in statistical analysis.

    PubMed

    Bang, Heejung; Zhao, Hongwei

    2014-01-01

    Cost-effectiveness analysis (CEA) is a method for evaluating the outcomes and costs of competing strategies designed to improve health, and has been applied to a variety of different scientific fields. Yet there are inherent complexities in cost estimation and CEA from statistical perspectives (e.g., skewness, bidimensionality, and censoring). The incremental cost-effectiveness ratio that represents the additional cost per unit of outcome gained by a new strategy has served as the most widely accepted methodology in the CEA. In this article, we call for expanded perspectives and reporting standards reflecting a more comprehensive analysis that can elucidate different aspects of available data. Specifically, we propose that mean- and median-based incremental cost-effectiveness ratios and average cost-effectiveness ratios be reported together, along with relevant summary and inferential statistics, as complementary measures for informed decision making.
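
    A minimal sketch of the proposed complementary reporting, under simulated skewed costs and effects for two strategies: mean- and median-based ICERs plus per-arm average cost-effectiveness ratios (ACERs), each with a percentile-bootstrap interval:

    ```python
    # Mean- and median-based ICERs plus ACERs with percentile bootstrap CIs.
    # Costs/effects are simulated (log-normal costs mimic the usual skewness).
    import numpy as np

    rng = np.random.default_rng(3)
    cost_new, eff_new = rng.lognormal(8.2, 0.5, 300), rng.normal(1.9, 0.4, 300)
    cost_old, eff_old = rng.lognormal(8.0, 0.5, 300), rng.normal(1.7, 0.4, 300)

    def ratios(cn, en, co, eo):
        icer_mean = (cn.mean() - co.mean()) / (en.mean() - eo.mean())
        icer_med = (np.median(cn) - np.median(co)) / (np.median(en) - np.median(eo))
        return icer_mean, icer_med, cn.mean() / en.mean(), co.mean() / eo.mean()

    def boot_once():
        i = rng.integers(0, len(cost_new), len(cost_new))   # resample patients,
        j = rng.integers(0, len(cost_old), len(cost_old))   # keeping cost-effect pairs
        return ratios(cost_new[i], eff_new[i], cost_old[j], eff_old[j])

    boot = np.array([boot_once() for _ in range(2000)])
    lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
    names = ("ICER (mean-based)", "ICER (median-based)", "ACER new", "ACER old")
    for name, est, l, h in zip(names, ratios(cost_new, eff_new, cost_old, eff_old), lo, hi):
        print(f"{name}: {est:,.0f}  (95% CI {l:,.0f} to {h:,.0f})")
    ```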

  7. Statistical analysis of wing/fin buffeting response

    NASA Astrophysics Data System (ADS)

    Lee, B. H. K.

    2002-05-01

    The random nature of the aerodynamic loading on the wing and tail structures of an aircraft makes it necessary to adopt a statistical approach in the prediction of the buffeting response. This review describes a buffeting prediction technique based on rigid model pressure measurements that is commonly used in North America, and also the buffet excitation parameter technique favored by many researchers in the UK. It is shown that the two models are equivalent and have their origin based on a statistical theory of the response of a mechanical system to a random load. In formulating the model for predicting aircraft response at flight conditions using rigid model wind tunnel pressure measurements, the wing (fin) is divided into panels, and the load is computed from measured pressure fluctuations at the center of each panel. The methods used to model pressure correlation between panels are discussed. The coupling between the wing (fin) motion and the induced aerodynamics using a doublet-lattice unsteady aerodynamics code is described. The buffet excitation parameter approach to predict flight test response using wind tunnel model data is derived from the equations for the pressure model formulation. Examples of flight correlation with prediction based on wind tunnel measurements for wing and vertical tail buffeting response are presented for a number of aircraft. For rapid maneuvers inside the buffet regime, the statistical properties of the buffet load are usually non-stationary because of the short time records and difficulties in maintaining constant flight conditions. The time history of the applied load is segmented into a number of time intervals. In each time segment, the non-stationary load is represented as a product of a deterministic shaping function and a random function. Various forms of the load power spectral density that permit analytical solutions of the mean-square displacement and acceleration response are considered. Illustrations are given using F

  8. Statistical analysis of epidemiologic data of pregnancy outcomes

    SciTech Connect

    Butler, W.J.; Kalasinski, L.A.

    1989-02-01

    In this paper, a generalized logistic regression model for correlated observations is used to analyze epidemiologic data on the frequency of spontaneous abortion among a group of women office workers. The results are compared to those obtained from the use of the standard logistic regression model that assumes statistical independence among all the pregnancies contributed by one woman. In this example, the correlation among pregnancies from the same woman is fairly small and did not have a substantial impact on the magnitude of estimates of parameters of the model. This is due at least partly to the small average number of pregnancies contributed by each woman.
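
    A modern analogue of such an analysis is a GEE logistic regression with an exchangeable working correlation, clustering the pregnancies contributed by the same woman; this sketches the general approach rather than the authors' exact model, and all variable names and effect sizes are invented:

    ```python
    # GEE logistic regression with an exchangeable working correlation,
    # clustering pregnancies within women. All data below are simulated.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n_women = 300
    woman = np.repeat(np.arange(n_women), rng.integers(1, 4, n_women))  # 1-3 pregnancies each
    exposure = rng.integers(0, 2, len(woman))                           # hypothetical exposure
    frailty = rng.normal(0.0, 0.3, n_women)[woman]                      # within-woman correlation
    p = 1.0 / (1.0 + np.exp(-(-1.8 + 0.4 * exposure + frailty)))
    df = pd.DataFrame({"sab": rng.binomial(1, p),                       # spontaneous abortion (0/1)
                       "exposure": exposure, "woman": woman})

    model = smf.gee("sab ~ exposure", groups="woman", data=df,
                    family=sm.families.Binomial(),
                    cov_struct=sm.cov_struct.Exchangeable())
    result = model.fit()
    print(result.summary())
    print(model.cov_struct.summary())   # estimated within-woman correlation
    ```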

  9. Statistical Analysis of Noisy Signals Using Classification Tools

    SciTech Connect

    Thompson, Sandra E.; Heredia-Langner, Alejandro; Johnson, Timothy J.; Foster, Nancy S.; Valentine, Nancy B.; Amonette, James E.

    2005-06-04

    The potential use of chemicals, biotoxins, and biological pathogens is a threat to military and police forces as well as the general public. Rapid identification of these agents is made difficult by the noisy nature of the signal obtained from portable, in-field sensors. In previously published articles, we created a flowchart that illustrated a method for triaging bacterial identification by combining standard statistical techniques for discrimination and identification with mid-infrared spectroscopic data. The present work documents the process of characterizing and eliminating the sources of the noise and outlines how multidisciplinary teams are necessary to accomplish that goal.

  10. Period04: Statistical analysis of large astronomical time series

    NASA Astrophysics Data System (ADS)

    Lenz, Patrick; Breger, Michel

    2014-07-01

    Period04 statistically analyzes large astronomical time series containing gaps. It calculates formal uncertainties, can extract the individual frequencies from the multiperiodic content of time series, and provides a flexible interface to perform multiple-frequency fits with a combination of least-squares fitting and the discrete Fourier transform algorithm. Period04, written in Java/C++, supports the SAMP communication protocol to provide interoperability with other applications of the Virtual Observatory. It is a reworked and extended version of Period98 (Sperl 1998) and PERIOD/PERDET (Breger 1990).
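
    Not Period04 itself, but a minimal sketch of its two core ingredients on assumed data: a discrete Fourier amplitude spectrum evaluated directly on a gapped, unevenly sampled time series, followed by a least-squares fit of the detected sinusoid:

    ```python
    # Deeming-style DFT amplitude spectrum on a gapped, uneven time series,
    # then a least-squares sinusoid fit at the detected peak frequency.
    import numpy as np

    rng = np.random.default_rng(5)
    t = np.sort(rng.uniform(0.0, 30.0, 240))
    t = t[(t < 11.0) | (t > 17.0)]                 # an observing gap
    y = 0.02 * np.sin(2 * np.pi * 5.1234 * t + 0.7) + rng.normal(0.0, 0.01, t.size)

    freqs = np.arange(0.02, 10.0, 5e-4)            # cycles per day
    spec = 2.0 / t.size * np.abs(np.exp(-2j * np.pi * np.outer(freqs, t)) @ y)
    f0 = freqs[spec.argmax()]

    # Least-squares fit of y ~ a sin(2 pi f0 t) + b cos(2 pi f0 t) + c
    X = np.column_stack([np.sin(2 * np.pi * f0 * t),
                         np.cos(2 * np.pi * f0 * t),
                         np.ones_like(t)])
    (a, b, c), *_ = np.linalg.lstsq(X, y, rcond=None)
    print(f"peak at {f0:.4f} c/d, fitted amplitude {np.hypot(a, b):.4f}")
    ```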

  11. Statistical analysis of multivariate atmospheric variables. [cloud cover]

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.

    1979-01-01

    Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.

  12. CAG-Repeat Length and the Age of Onset in Huntington Disease (HD): A Review and Validation Study of Statistical Approaches

    PubMed Central

    Langbehn, Douglas R.; Hayden, Michael; Paulsen, Jane S.

    2011-01-01

    Background: CAG-repeat length in the gene for HD is inversely correlated with age of onset (AOO). A number of statistical models elucidating the relationship between CAG length and AOO have recently been published. In the present article, we review the published formulae and summarize essential differences in subject sources, statistical methodologies, and predictive results. We argue that unrepresentative sampling and failure to use appropriate survival analysis methodology may have substantially biased much of the literature. We also explain why the survival analysis perspective is necessary if any such model is to undergo prospective validation. Methods: We use prospective diagnostic data from the PREDICT-HD longitudinal study of CAG-expanded participants to test conditional predictions derived from two survival models of age of onset of HD. Principal Findings: A prior model of the relationship of CAG and AOO originally published by Langbehn et al. yields reasonably accurate predictions, while a similar model by Gutierrez and MacDonald substantially overestimates diagnosis risk for all but the highest-risk subjects in this sample. Conclusions/Significance: The Langbehn et al. model appears accurate enough to have substantial utility in various research contexts. We also emphasize remaining caveats, many of which are relevant for any direct application to genetic counseling. PMID:19548255

  13. Longitudinal analysis in Plantago: strength of selection and reverse-age analysis reveal age-indeterminate senescence

    PubMed Central

    Shefferson, Richard P.; Roach, Deborah A.

    2013-01-01

    Summary: 1. Senescence is usually viewed as increased age-specific mortality or decreased age-specific fecundity due to the declining ability of natural selection to remove deleterious age-specific mutations with age. In herbaceous perennial plants, trends in age-specific mortality are often confounded by size. Age-indeterminate senescence, where accumulated physiological damage varies strongly with environment, may be a better model of senescence in these species. 2. We analysed trends in size and fertility in Plantago lanceolata, using a long-term demographic census involving >10 years and >8,000 individuals in 4 cohorts. We used elasticity and pairwise invasion analysis of life history function-parameterized age × stage matrices to assess whether the force of natural selection declined with age. Then, we used reverse-age analysis of size and fertility to assess whether age-indeterminate senescence occurred. Reverse-age analysis uses longitudinal data for individuals that have died to look at trait patterns as a function of both age and remaining time to death. We hypothesized that i) the strength of natural selection would decline strongly with age, and ii) physiological condition would deteriorate for several years prior to death. 3. Both elasticity and invasion analyses suggested that the strength of natural selection through mortality declined strongly with age once size was accounted for. Further, reverse-age analyses showed that individuals shrank for ~3 years prior to death, suggesting physiological decline. Inflorescence production declined with age, and also declined in the 3 years prior to death regardless of overall age. 4. Synthesis: The hypothesis that plants escape senescence generally assumes that plants can continue to grow larger and increase reproduction as they get older. The results here show that size and reproduction decline with age and the rates of these declines toward death are lifespan- and age-dependent. Further research is needed to

  14. [Statistical analysis of fabrication of indirect single restorations].

    PubMed

    Sato, T; Kawawa, A; Okada, D; Ohno, S; Akiba, H; Watanabe, Y; Endo, K; Mayanagi, A; Miura, H; Hasegawa, S

    1999-09-01

    A statistical survey based on laboratory records was performed on the number of indirect restorations fabricated at the dental hospital of Tokyo Medical and Dental University from April 1 to September 30, 1997. A comparison was also carried out with a previous survey, which had been carried out in 1986, in order to detect any change and possible alterations in the near future. Based on the results of this statistical survey, the conclusions were as follows: 1. A total of 9,126 indirect restorations were fabricated during the six month period in 1997; among them, 8,007 (87.7%) restorations were covered by health insurance and 1,119 (12.3%) restorations were not. 2. The most common restoration was the cast post and core (28.6%), followed by full crowns (18.5%) and removable partial dentures (15.6%). On the other hand, the least number were post crowns (0.03%) and resin jacket crowns (0.2%). 3. When making a comparison with the data in 1986, an increase in the number of removable partial dentures and a decrease in the number of inlays were the most distinctive features. 4. For anterior teeth, resin-veneered crowns were most common, especially for lower teeth. The percentage of restorations, which were not covered by health insurance, decreased from 45.0% (in 1986) to 12.3% (in 1997).

  15. Statistical analysis of bankrupting and non-bankrupting stocks

    NASA Astrophysics Data System (ADS)

    Li, Qian; Wang, Fengzhong; Wei, Jianrong; Liang, Yuan; Huang, Jiping; Stanley, H. Eugene

    2012-04-01

    The recent financial crisis has caused extensive world-wide economic damage, affecting in particular those who invested in companies that eventually filed for bankruptcy. A better understanding of stocks that become bankrupt would be helpful in reducing risk in future investments. Economists have conducted extensive research on this topic, and here we ask whether statistical physics concepts and approaches may offer insights into pre-bankruptcy stock behavior. To this end, we study all 20092 stocks listed in US stock markets for the 20-year period 1989-2008, including 4223 (21 percent) that became bankrupt during that period. We find that, surprisingly, the distributions of the daily returns of those stocks that become bankrupt differ significantly from those that do not. Moreover, these differences are consistent for the entire period studied. We further study the relation between the distribution of returns and the length of time until bankruptcy, and observe that larger differences of the distribution of returns correlate with shorter time periods preceding bankruptcy. This behavior suggests that sharper fluctuations in the stock price occur when the stock is closer to bankruptcy. We also analyze the cross-correlations between the return and the trading volume, and find that stocks approaching bankruptcy tend to have larger return-volume cross-correlations than stocks that are not. Furthermore, the difference increases as bankruptcy approaches. We conclude that before a firm becomes bankrupt its stock exhibits unusual behavior that is statistically quantifiable.
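
    One simple way to quantify such distributional differences is a two-sample Kolmogorov-Smirnov test on the daily returns of the two groups; the Student-t samples below merely stand in for milder- and heavier-tailed return distributions:

    ```python
    # Two-sample Kolmogorov-Smirnov comparison of daily-return distributions.
    # Student-t samples stand in for the two groups' returns (heavier tails
    # playing the role of stocks approaching bankruptcy).
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(6)
    returns_ok = rng.standard_t(df=5, size=5000) * 0.010   # milder tails
    returns_bk = rng.standard_t(df=2, size=5000) * 0.015   # sharper fluctuations

    stat, p = ks_2samp(returns_ok, returns_bk)
    print(f"KS statistic = {stat:.3f}, p = {p:.2e}")
    ```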

  16. Statistical analysis of biotissues Mueller matrix images in cancer diagnostics

    NASA Astrophysics Data System (ADS)

    Yermolenko, Sergey; Ivashko, Pavlo; Goudail, François; Gruia, Ion

    2010-11-01

    This work investigates the scope of laser polarimetry and polarization spectrometry for detecting oncological changes of human prostate tissue under conditions of multiple scattering. The third statistical moment of the intensity distribution proved to be the most sensitive to pathological changes in the orientation structure: its value in the intensity distribution of the polarization image I(0-90) of oncologically changed tissue is 21 times higher than the same statistical parameter for healthy tissue. Results are also presented from studies of the linear dichroism of the prostate gland, both healthy and affected by a malignant tumor at different stages of its development. Significant differences in the values of linear dichroism and its spectral dependence over the range λ = 280-840 nm were shown, both between research objects and between biotissues - healthy (or affected by benign tumors) versus cancerous. These results may have diagnostic value for the detection and assessment of the development of cancer.

  17. Texture analysis with statistical methods for wheat ear extraction

    NASA Astrophysics Data System (ADS)

    Bakhouche, M.; Cointault, F.; Gouton, P.

    2007-01-01

    In the agronomic domain, simplifying crop counting, which is necessary for yield prediction and agronomic studies, is an important project for technical institutes such as Arvalis. Although the main objective of our global project is to conceive a mobile robot for natural image acquisition directly in the field, Arvalis first proposed that we detect the number of wheat ears in images by image processing before counting them, which provides the first component of the yield. In this paper we compare different texture-based image segmentation techniques that extract features with first- and higher-order statistical methods, applied to our images. The extracted features are used for unsupervised pixel classification to obtain the different classes in the image: the K-means algorithm is applied before choosing a threshold to highlight the ears. Three methods have been tested in this feasibility study, with an average error of 6%. Although the quality of the detection is currently evaluated visually, automatic evaluation algorithms are being implemented. Moreover, other higher-order statistical methods will be implemented in the future, jointly with methods based on spatio-frequency transforms and specific filtering.

  18. Statistical analysis of the Indus script using n-grams.

    PubMed

    Yadav, Nisha; Joglekar, Hrishikesh; Rao, Rajesh P N; Vahia, Mayank N; Adhikari, Ronojoy; Mahadevan, Iravatham

    2010-03-19

    The Indus script is one of the major undeciphered scripts of the ancient world. The small size of the corpus, the absence of bilingual texts, and the lack of definite knowledge of the underlying language has frustrated efforts at decipherment since the discovery of the remains of the Indus civilization. Building on previous statistical approaches, we apply the tools of statistical language processing, specifically n-gram Markov chains, to analyze the syntax of the Indus script. We find that unigrams follow a Zipf-Mandelbrot distribution. Text beginner and ender distributions are unequal, providing internal evidence for syntax. We see clear evidence of strong bigram correlations and extract significant pairs and triplets using a log-likelihood measure of association. Highly frequent pairs and triplets are not always highly significant. The model performance is evaluated using information-theoretic measures and cross-validation. The model can restore doubtfully read texts with an accuracy of about 75%. We find that a quadrigram Markov chain saturates information theoretic measures against a held-out corpus. Our work forms the basis for the development of a stochastic grammar which may be used to explore the syntax of the Indus script in greater detail.
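
    The log-likelihood measure of association mentioned above is commonly computed as Dunning's G2 on a 2x2 bigram contingency table; a toy sketch in which small integers stand in for Indus sign identifiers:

    ```python
    # Dunning's log-likelihood ratio (G^2) for bigram association.
    # The "texts" are toy stand-ins for the Indus corpus.
    import math
    from collections import Counter

    texts = [[1, 2, 3], [1, 2, 4], [2, 3, 5], [1, 2, 3, 5], [4, 2, 3]]
    bigrams = Counter()
    for t in texts:
        bigrams.update(zip(t, t[1:]))
    N = sum(bigrams.values())

    def g2(pair):
        """Log-likelihood ratio for the bigram (a, b) vs. independence."""
        a, b = pair
        k11 = bigrams[pair]                                       # a then b
        k1_ = sum(c for (x, _), c in bigrams.items() if x == a)   # a then anything
        k_1 = sum(c for (_, y), c in bigrams.items() if y == b)   # anything then b
        obs = (k11, k1_ - k11, k_1 - k11, N - k1_ - k_1 + k11)    # 2x2 table
        exp = (k1_ * k_1 / N, k1_ * (N - k_1) / N,
               (N - k1_) * k_1 / N, (N - k1_) * (N - k_1) / N)    # expected counts
        return 2 * sum(o * math.log(o / e) for o, e in zip(obs, exp) if o > 0)

    for pair in sorted(bigrams, key=g2, reverse=True)[:3]:
        print(pair, round(g2(pair), 2))
    ```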

  19. How Many Studies Do You Need? A Primer on Statistical Power for Meta-Analysis

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; Pigott, Therese D.; Rothstein, Hannah R.

    2010-01-01

    In this article, the authors outline methods for using fixed and random effects power analysis in the context of meta-analysis. Like statistical power analysis for primary studies, power analysis for meta-analysis can be done either prospectively or retrospectively and requires assumptions about parameters that are unknown. The authors provide…
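
    For the fixed-effect case, prospective power follows from the standard error of the combined effect; a sketch assuming k equally sized two-arm studies and a standardized-mean-difference effect (all numbers illustrative):

    ```python
    # Prospective power of a fixed-effect meta-analysis of standardized mean
    # differences; k studies, each with n_per_arm subjects per arm (assumed).
    import numpy as np
    from scipy.stats import norm

    def fixed_effect_power(delta, k, n_per_arm, alpha=0.05):
        # Large-sample variance of one study's standardized mean difference
        v = 2.0 / n_per_arm + delta**2 / (4.0 * n_per_arm)
        se_combined = np.sqrt(v / k)       # equal weights -> SE shrinks as 1/sqrt(k)
        z_crit = norm.ppf(1.0 - alpha / 2.0)
        # Two-sided power, ignoring the negligible opposite tail
        return norm.sf(z_crit - delta / se_combined)

    for k in (2, 5, 10, 20):
        print(f"k = {k:2d} studies: power = {fixed_effect_power(0.2, k, 40):.3f}")
    ```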

  20. A STATISTICAL ANALYSIS OF THE LATE-TYPE STELLAR CONTENT IN THE ANDROMEDA HALO

    SciTech Connect

    Koch, Andreas; Rich, R. Michael

    2010-06-15

    We present a statistical characterization of the carbon-star to M-giant (C/M) ratio in the halo of M31. Based on the application of pseudo-filter bandpasses to our Keck/DEIMOS spectra, we measure the 81-77 color index of 1288 stars in the giant stellar stream and in halo fields out to large distances. From this well-established narrow-band system, supplemented by V - I colors, we find only a low number (five in total) of C-star candidates. The resulting low C/M ratio of 10% is consistent with the values in the M31 disk and inner halo from the literature. Although our analysis is challenged by small-number statistics and our sample selection, there is an indication that the oxygen-rich M-giants occur in similar numbers throughout the entire halo. We also find no difference in the C-star population of the halo fields compared to the giant stream. The very low C/M ratio is at odds with the observed low metallicities and the presence of intermediate-age stars at large radii. The observed absence of a substantial carbon-star population in these regions indicates that the (outer) M31 halo cannot be dominated by the debris of disk-like or Small-Magellanic-Cloud-type galaxies, but rather resembles the dwarf elliptical NGC 147.

  1. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.

  2. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  3. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase is focused on assessing the performance of these models to accurately predict the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  4. Statistical Analysis of Haralick Texture Features to Discriminate Lung Abnormalities

    PubMed Central

    Zayed, Nourhan; Elnemr, Heba A.

    2015-01-01

    Haralick texture features are a well-known mathematical method for detecting lung abnormalities, giving the physician the opportunity to localize the abnormal tissue type, either lung tumor or pulmonary edema. In this paper, a statistical evaluation of the different features represents the reported performance of the proposed method. Thirty-seven patients' CT datasets, with either lung tumor or pulmonary edema, were included in this study. The CT images are first preprocessed for noise reduction and image enhancement, followed by segmentation techniques to segment the lungs, and finally Haralick texture features to detect the type of abnormality within the lungs. In spite of the low contrast and high noise in the images, the proposed algorithms produce promising results in detecting lung abnormalities in most of the patients, and suggest that some of the features are significantly more useful than others. PMID:26557845

  5. Statistical Analysis of Haralick Texture Features to Discriminate Lung Abnormalities.

    PubMed

    Zayed, Nourhan; Elnemr, Heba A

    2015-01-01

    Haralick texture features are a well-known mathematical method for detecting lung abnormalities, giving the physician the opportunity to localize the abnormal tissue type, either lung tumor or pulmonary edema. In this paper, a statistical evaluation of the different features represents the reported performance of the proposed method. Thirty-seven patients' CT datasets, with either lung tumor or pulmonary edema, were included in this study. The CT images are first preprocessed for noise reduction and image enhancement, followed by segmentation techniques to segment the lungs, and finally Haralick texture features to detect the type of abnormality within the lungs. In spite of the low contrast and high noise in the images, the proposed algorithms produce promising results in detecting lung abnormalities in most of the patients, and suggest that some of the features are significantly more useful than others. PMID:26557845
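
    The texture step can be sketched with scikit-image's gray-level co-occurrence matrix utilities (spelled graycomatrix/graycoprops in recent scikit-image releases); the CT preprocessing and lung segmentation are out of scope here, so a random patch stands in for a segmented lung region:

    ```python
    # GLCM-based Haralick-style texture features with scikit-image.
    # A random patch stands in for a segmented lung ROI from the CT pipeline.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(7)
    patch = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)   # stand-in ROI, 64 gray levels

    glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, float(graycoprops(glcm, prop).mean()))
    ```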

  6. Statistical analysis of Nomao customer votes for spots of France

    NASA Astrophysics Data System (ADS)

    Pálovics, Róbert; Daróczy, Bálint; Benczúr, András; Pap, Julia; Ermann, Leonardo; Phan, Samuel; Chepelianskii, Alexei D.; Shepelyansky, Dima L.

    2015-08-01

    We investigate the statistical properties of votes of customers for spots of France collected by the startup company Nomao. The frequencies of votes per spot and per customer are characterized by a power-law distribution which remains stable on a time scale of a decade when the number of votes is varied by almost two orders of magnitude. Using computer science methods, we explore the spectrum and the eigenvalues of a matrix containing user ratings of geolocalized items. Eigenvalues map nicely to large towns and regions but show a certain level of instability as we modify the interpretation of the underlying matrix. We evaluate imputation strategies that provide improved prediction performance by producing geographically smooth eigenvectors. We point out possible links between the distribution of votes and the phenomenon of self-organized criticality.

  7. Analysis of surface sputtering on a quantum statistical basis

    NASA Technical Reports Server (NTRS)

    Wilhelm, H. E.

    1975-01-01

    Surface sputtering is explained theoretically by means of a 3-body sputtering mechanism involving the ion and two surface atoms of the solid. By means of quantum-statistical mechanics, a formula for the sputtering ratio S(E) is derived from first principles. At low ion energies, the sputtering rate S(E) is found to be proportional to the square of the difference between the incident ion energy and the threshold energy for sputtering of surface atoms. Extrapolation of the theoretical sputtering formula to larger ion energies indicates that S(E) reaches a saturation value and finally decreases at high ion energies. The theoretical sputtering ratios S(E) for wolfram, tantalum, and molybdenum are compared with the corresponding experimental sputtering curves in the low-energy region from the threshold sputtering energy to 120 eV above the respective threshold energy. Theory and experiment are shown to be in good agreement.
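
    The stated low-energy behaviour, S(E) proportional to (E - E_th)^2 above threshold, is easy to encode; the roll-off factor and all constants below are ad hoc illustrations, not the paper's derived formula:

    ```python
    # Toy sputtering-ratio model: quadratic rise above threshold, with an
    # ad hoc roll-off standing in for saturation/decline. Constants invented.
    import numpy as np

    def sputter_ratio(E, E_th=25.0, c=1.2e-4, E_sat=400.0):
        """Toy sputtering ratio vs. ion energy E in eV; zero below threshold."""
        dE = np.clip(np.asarray(E, dtype=float) - E_th, 0.0, None)
        return c * dE**2 / (1.0 + (dE / E_sat) ** 3)   # rises, saturates, declines

    for E in (20.0, 50.0, 100.0, 145.0):
        print(f"E = {E:5.1f} eV  ->  S = {float(sputter_ratio(E)):.5f}")
    ```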

  8. Power flow as a complement to statistical energy analysis and finite element analysis

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly; SEA methods, on the other hand, can only predict average levels. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to provide a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  9. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    PubMed

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The Megastat add-on program, which will soon be supported by MS Excel 2016, would eliminate some limitations of using statistical formulas within MS Excel.

  10. Multivariate Statistical Analysis of MSL APXS Bulk Geochemical Data

    NASA Astrophysics Data System (ADS)

    Hamilton, V. E.; Edwards, C. S.; Thompson, L. M.; Schmidt, M. E.

    2014-12-01

    We apply cluster and factor analyses to bulk chemical data of 130 soil and rock samples measured by the Alpha Particle X-ray Spectrometer (APXS) on the Mars Science Laboratory (MSL) rover Curiosity through sol 650. Multivariate approaches such as principal components analysis (PCA), cluster analysis, and factor analysis complement more traditional approaches (e.g., Harker diagrams), with the advantage of simultaneously examining the relationships between multiple variables for large numbers of samples. Principal components analysis has been applied with success to APXS, Pancam, and Mössbauer data from the Mars Exploration Rovers. Factor analysis and cluster analysis have been applied with success to thermal infrared (TIR) spectral data of Mars. Cluster analyses group the input data by similarity, where there are a number of different methods for defining similarity (hierarchical, density, distribution, etc.). For example, without any assumptions about the chemical contributions of surface dust, preliminary hierarchical and K-means cluster analyses clearly distinguish the physically adjacent rock targets Windjana and Stephen as being distinctly different than lithologies observed prior to Curiosity's arrival at The Kimberley. In addition, they are separated from each other, consistent with chemical trends observed in variation diagrams but without requiring assumptions about chemical relationships. We will discuss the variation in cluster analysis results as a function of clustering method and pre-processing (e.g., log transformation, correction for dust cover) and implications for interpreting chemical data. Factor analysis shares some similarities with PCA, and examines the variability among observed components of a dataset so as to reveal variations attributable to unobserved components. Factor analysis has been used to extract the TIR spectra of components that are typically observed in mixtures and only rarely in isolation; there is the potential for similar
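
    A sketch of this kind of workflow with scikit-learn, using a simulated 130-sample oxide table in place of the APXS data and a centered log-ratio transform (a common pre-processing choice for closed compositional data, not necessarily the authors'):

    ```python
    # K-means and PCA on simulated compositional data, after a centered
    # log-ratio (clr) transform; stands in for the 130-sample APXS table.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(8)
    X = rng.dirichlet(np.ones(8) * 5.0, size=130) * 100.0   # wt% of 8 "oxides"

    logX = np.log(X)
    clr = logX - logX.mean(axis=1, keepdims=True)           # clr transform for closed data
    Z = StandardScaler().fit_transform(clr)

    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
    scores = PCA(n_components=2).fit_transform(Z)
    print("cluster sizes:", np.bincount(labels))
    print("PC1/PC2 of first sample:", scores[0].round(3))
    ```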

  11. Statistical analysis of imperfection effect on cylindrical buckling response

    NASA Astrophysics Data System (ADS)

    Ismail, M. S.; Purbolaksono, J.; Muhammad, N.; Andriyana, A.; Liew, H. L.

    2015-12-01

    It is widely reported that no efficient guidelines are available for modelling imperfections in composite structures. In response, this work evaluates the imperfection factors of axially compressed carbon fibre reinforced polymer (CFRP) cylinders with different ply angles through finite element (FE) analysis. The sensitivity of the imperfection factors was analysed using a design-of-experiments factorial approach. The analysis identified three critical factors to which the buckling load is sensitive. Furthermore, an empirical equation is proposed for each type of cylinder. Critical buckling loads estimated by the empirical equations showed good agreement with FE analysis. The design-of-experiments methodology is useful in identifying the parameters that govern the imperfection tolerance of the structures.
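
    The factorial-design step can be sketched with a coded 2^3 full factorial and main-effect estimates; the factor names and buckling loads below are hypothetical:

    ```python
    # Coded 2^3 full factorial and main-effect estimates for three
    # hypothetical imperfection factors; the eight FE loads are invented.
    import itertools
    import numpy as np

    X = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
    y = np.array([182, 175, 168, 160, 181, 174, 166, 159], dtype=float)  # buckling loads (kN)

    effects = 2.0 * X.T @ y / len(y)   # mean(y at +1) - mean(y at -1), per factor
    for name, eff in zip(("ply angle", "imperfection amplitude", "wavelength"), effects):
        print(f"{name}: {eff:+.2f} kN per low-to-high step")
    ```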

  12. Multispectral image analysis of bruise age

    NASA Astrophysics Data System (ADS)

    Sprigle, Stephen; Yi, Dingrong; Caspall, Jayme; Linden, Maureen; Kong, Linghua; Duckworth, Mark

    2007-03-01

    The detection and aging of bruises is important within clinical and forensic environments. Traditionally, visual and photographic assessment of bruise color is used to determine age, but this substantially subjective technique has been shown to be inaccurate and unreliable. The purpose of this study was to develop a technique to spectrally age bruises using a reflective multi-spectral imaging system that minimizes the filtering and hardware requirements while achieving acceptable accuracy. This approach will then be incorporated into a handheld, point-of-care technology that is clinically viable and affordable. Sixteen bruises from elderly residents of a long-term care facility were imaged over time. A multi-spectral system collected images through eleven narrow-band (~10 nm FWHM) filters with center wavelengths between 370 and 970 nm, corresponding to specific skin and blood chromophores. Normalized bruise reflectance (NBR), defined as the ratio of the optical reflectance coefficient of bruised skin over that of normal skin, was calculated for all bruises at all wavelengths. The smallest mean NBR, regardless of bruise age, was found at wavelengths between 555 and 577 nm, suggesting that bruise contrast comes from hemoglobin and lingers for a long duration. A contrast metric based on the NBR at 460 nm and 650 nm was found to be sensitive to age and requires further investigation. Overall, the study identified four key wavelengths that show promise for characterizing bruise age. However, the high variability across the bruises imaged in this study complicates the development of a handheld detection system until additional data are available.
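
    The NBR quantity itself is straightforward to compute; the reflectance values below are synthetic, and the exact form of the 460/650 nm contrast metric is an assumption, since the abstract does not give it:

    ```python
    # Normalized bruise reflectance (NBR) per band, plus one plausible form of
    # the 460/650 nm contrast metric. All reflectance values are synthetic.
    import numpy as np

    wavelengths = np.array([370, 460, 555, 577, 650, 970])      # nm (subset of the 11 bands)
    r_bruise = np.array([0.18, 0.22, 0.10, 0.11, 0.35, 0.55])   # bruised skin
    r_normal = np.array([0.20, 0.30, 0.17, 0.18, 0.42, 0.58])   # adjacent normal skin

    nbr = r_bruise / r_normal            # NBR < 1 marks chromophore absorption
    print(dict(zip(wavelengths.tolist(), nbr.round(2).tolist())))
    print("minimum NBR at", wavelengths[nbr.argmin()], "nm")    # hemoglobin band

    # Assumed contrast metric: difference of NBR between the two key bands
    contrast = nbr[wavelengths == 650][0] - nbr[wavelengths == 460][0]
    print("contrast metric (650 vs 460 nm):", round(contrast, 3))
    ```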

  13. New Statistical Methods for the Analysis of the Cratering on Venus

    NASA Astrophysics Data System (ADS)

    Xie, M.; Smrekar, S. E.; Handcock, M. S.

    2014-12-01

    The sparse crater population (~1000 craters) on Venus is the most important clue for determining the planet's surface age and aids in understanding its geologic history. What processes (volcanism, tectonism, weathering, etc.) modify the total impact crater population? Are the processes regional or global in occurrence? The heated debate on these questions points to the need for better approaches. We present new statistical methods for the analysis of crater locations and characteristics. Specifically: 1) We produce a map of crater density and the proportion of no-halo craters (inferred to be modified) by using generalized additive models and smoothing splines with a spherical spline basis set. Based on this map, we are able to predict the probability that a crater has no halo, given that there is a crater at that point. We also obtain a continuous representation of the ratio of craters with no halo as a function of crater density. This approach allows us to look for regions that appear to have experienced more or less modification, and are thus potentially older or younger. 2) We examine the randomness or clustering of distributions of craters by type (e.g. dark-floored, intermediate). For example, for dark-floored craters we consider two hypotheses: i) the dark-floored craters are randomly distributed on the surface; ii) the dark-floored craters are random given the locations of the crater population. Instead of only using a single measure such as the average nearest-neighbor distance, we use the probability density function of these distances and compare it to complete spatial randomness to get the relative probability density function. This function gives us a clearer picture of how and where the nearest-neighbor distances differ from complete spatial randomness. We also conduct statistical tests of these hypotheses. Confidence intervals with specified global coverage are constructed. Software to reproduce the methods is available in the open source statistics

  14. Statistical Modeling and Analysis of Laser-Evoked Potentials of Electrocorticogram Recordings from Awake Humans

    PubMed Central

    Chen, Zhe; Ohara, Shinji; Cao, Jianting; Vialatte, François; Lenz, Fred A.; Cichocki, Andrzej

    2007-01-01

    This article is devoted to statistical modeling and analysis of electrocorticogram (ECoG) signals induced by painful cutaneous laser stimuli, which were recorded from implanted electrodes in awake humans. Specifically, with the statistical tools of factor analysis and independent component analysis, the pain-induced laser-evoked potentials (LEPs) were extracted and investigated under different controlled conditions. With the help of wavelet analysis, quantitative and qualitative analyses were conducted regarding the LEPs' attributes of power, amplitude, and latency, in both averaged and single-trial experiments. Statistical hypothesis tests were also applied in various experimental setups. The experimental results reported herein confirm previous findings in the neurophysiology literature. In addition, single-trial analysis has revealed many new observations that may be of interest to neuroscientists and clinical neurophysiologists. These promising results provide convincing evidence that advanced signal processing and statistical analysis may open new avenues for future studies of such ECoG and other relevant biomedical recordings. PMID:18369410
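
    A hedged sketch of the independent component analysis step on a multichannel matrix, using scikit-learn's FastICA as a generic stand-in; the data are synthetic, and the paper's preprocessing, channel counts, and component-selection criteria are not reproduced:

```python
import numpy as np
from sklearn.decomposition import FastICA

# X: samples stacked as (n_samples, n_channels), standing in for an ECoG matrix.
rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 16))

ica = FastICA(n_components=8, random_state=0)
sources = ica.fit_transform(X)   # independent components over time
mixing = ica.mixing_             # channel weights for each component
# A component whose trial-locked average peaks in the post-stimulus window
# would be a candidate LEP; selection criteria are study-specific.
```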

  15. New Statistical Approach to the Analysis of Hierarchical Data

    NASA Astrophysics Data System (ADS)

    Neuman, S. P.; Guadagnini, A.; Riva, M.

    2014-12-01

    Many variables possess a hierarchical structure reflected in how their increments vary in space and/or time. Quite commonly the increments (a) fluctuate in a highly irregular manner; (b) possess symmetric, non-Gaussian frequency distributions characterized by heavy tails that often decay with separation distance or lag; (c) exhibit nonlinear power-law scaling of sample structure functions in a midrange of lags, with breakdown in such scaling at small and large lags; (d) show extended power-law scaling (ESS) at all lags; and (e) display nonlinear scaling of power-law exponent with order of sample structure function. Some interpret this to imply that the variables are multifractal, which explains neither breakdowns in power-law scaling nor ESS. We offer an alternative interpretation consistent with all above phenomena. It views data as samples from stationary, anisotropic sub-Gaussian random fields subordinated to truncated fractional Brownian motion (tfBm) or truncated fractional Gaussian noise (tfGn). The fields are scaled Gaussian mixtures with random variances. Truncation of fBm and fGn entails filtering out components below data measurement or resolution scale and above domain scale. Our novel interpretation of the data allows us to obtain maximum likelihood estimates of all parameters characterizing the underlying truncated sub-Gaussian fields. These parameters in turn make it possible to downscale or upscale all statistical moments to situations entailing smaller or larger measurement or resolution and sampling scales, respectively. They also allow one to perform conditional or unconditional Monte Carlo simulations of random field realizations corresponding to these scales. Aspects of our approach are illustrated on field and laboratory measured porous and fractured rock permeabilities, as well as soil texture characteristics and neural network estimates of unsaturated hydraulic parameters in a deep vadose zone near Phoenix, Arizona. We also use our approach
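
    A minimal illustration of the sample structure functions in points (c) to (e), computed for a synthetic 1-D series; the power-law exponent xi(q) is estimated over an assumed midrange of lags:

```python
import numpy as np

def structure_function(z, q, lags):
    """Sample structure function S_q(r) = <|z(x+r) - z(x)|^q> for a 1-D series."""
    return np.array([np.mean(np.abs(z[lag:] - z[:-lag]) ** q) for lag in lags])

rng = np.random.default_rng(2)
z = np.cumsum(rng.standard_normal(4096))  # Brownian-like stand-in for field data
lags = np.arange(1, 200)

# Power-law scaling S_q(r) ~ r^xi(q): estimate xi(q) by a log-log fit in a
# midrange of lags; nonlinearity of xi(q) in q is what is often read as
# multifractality, which the abstract's interpretation reexamines.
mid = lags[5:50]
xi = {q: np.polyfit(np.log(mid), np.log(structure_function(z, q, mid)), 1)[0]
      for q in (0.5, 1, 2, 3)}
print(xi)
```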

  16. ANOVA like analysis of cancer death age

    NASA Astrophysics Data System (ADS)

    Areia, Aníbal; Mexia, João T.

    2016-06-01

    We use ANOVA to study the influence of year, sex, country and location on the average cancer death age. The data used were from the World Health Organization (WHO) files for 1999, 2003, 2007 and 2011. The locations considered were kidney, leukaemia, melanoma of skin and oesophagus, and the countries were Portugal, Norway, Greece and Romania.
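
    A sketch of this kind of multi-factor ANOVA using statsmodels, with hypothetical average death ages in place of the WHO values:

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(3)
years = ["1999", "2003", "2007", "2011"]
sexes = ["M", "F"]
countries = ["Portugal", "Norway", "Greece", "Romania"]
locations = ["kidney", "leukaemia", "melanoma of skin", "oesophagus"]

# Hypothetical average death ages for every factor combination; real values
# would come from the WHO mortality files for these years.
rows = [dict(year=y, sex=s, country=c, location=l, age=70 + rng.normal(scale=3.0))
        for y, s, c, l in itertools.product(years, sexes, countries, locations)]
df = pd.DataFrame(rows)

model = ols("age ~ C(year) + C(sex) + C(country) + C(location)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # main-effects ANOVA table
```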

  17. Statistical analysis of properties of dwarf novae outbursts

    NASA Astrophysics Data System (ADS)

    Otulakowska-Hypka, Magdalena; Olech, Arkadiusz; Patterson, Joseph

    2016-08-01

    We present a statistical study of all measurable photometric features of a large sample of dwarf novae during their outbursts and superoutbursts. We used all accessible photometric data for all our objects to make the study as complete and up to date as possible. Our aim was to check correlations between these photometric features in order to constrain theoretical models which try to explain the nature of dwarf novae outbursts. We confirmed several of the known correlations, namely the Stolz and Schoembs relation, the Bailey relation for long outbursts above the period gap, and the relations between the cycle and supercycle lengths, the amplitudes of normal outbursts and superoutbursts, the amplitude and duration of superoutbursts, outburst duration and orbital period, and outburst duration and mass ratio for short and normal outbursts, as well as the relation between the rise and decline rates of superoutbursts. However, we question the existence of the Kukarkin-Parenago relation, although we found an analogous relation for superoutbursts. We also failed to find the presumed relation between outburst duration and mass ratio for superoutbursts. This study should help to direct theoretical work dedicated to dwarf novae.

  18. Statistical analysis of AFE GN&C aeropass performance

    NASA Technical Reports Server (NTRS)

    Chang, Ho-Pen; French, Raymond A.

    1990-01-01

    Performance of the guidance, navigation, and control (GN&C) system used on the Aeroassist Flight Experiment (AFE) spacecraft has been studied with Monte Carlo techniques. The performance of the AFE GN&C is investigated with a 6-DOF numerical dynamic model which includes a Global Reference Atmospheric Model (GRAM) and a gravitational model with oblateness corrections. The study considers all the uncertainties due to the environment and the system itself. In the AFE's aeropass phase, perturbations on the system performance are caused by an error space which has over 20 dimensions of the correlated/uncorrelated error sources. The goal of this study is to determine, in a statistical sense, how much flight path angle error can be tolerated at entry interface (EI) and still have acceptable delta-V capability at exit to position the AFE spacecraft for recovery. Assuming there is fuel available to produce 380 ft/sec of delta-V at atmospheric exit, a 3-sigma standard deviation in flight path angle error of 0.04 degrees at EI would result in a 98-percent probability of mission success.

  19. A statistical mechanics analysis of the set covering problem

    NASA Astrophysics Data System (ADS)

    Fontanari, J. F.

    1996-02-01

    The dependence of the average cost of the optimal solution of the set covering problem on the density of 1's of the incidence matrix and on the number of constraints (P) is investigated in the limit where the number of items (N) goes to infinity. The annealed approximation is employed to study two stochastic models: the constant density model, where the elements of the incidence matrix are statistically independent random variables, and the Karp model, where the rows of the incidence matrix possess the same number of 1's. Lower bounds for the average optimal cost are presented in the case that P scales with ln N and the density is of order 1, as well as in the case that P scales linearly with N and the density is of order 1/N. It is shown that in the case that P scales with exp N and the density is of order 1, the annealed approximation yields exact results for both models.
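
    For intuition about the constant density model, the sketch below measures the average cost of a greedy cover on random incidence matrices; this is an upper bound on the optimal cost obtained by a heuristic, not the paper's annealed calculation:

```python
import numpy as np

rng = np.random.default_rng(4)

def random_instance(n_items, n_constraints, density):
    """Constant-density model: each incidence-matrix entry is 1 with prob. 'density'."""
    return rng.random((n_constraints, n_items)) < density

def greedy_cover_cost(A):
    """Number of items (columns) a greedy heuristic needs to cover all rows."""
    uncovered = np.ones(A.shape[0], dtype=bool)
    cost = 0
    while uncovered.any():
        gains = A[uncovered].sum(axis=0)     # rows each item would newly cover
        best = int(np.argmax(gains))
        if gains[best] == 0:
            raise ValueError("infeasible instance")
        uncovered &= ~A[:, best]
        cost += 1
    return cost

N = 500
P = int(np.log(N) * 10)  # P scaling with ln N, one of the regimes studied
for rho in (0.05, 0.1, 0.2):
    costs = [greedy_cover_cost(random_instance(N, P, rho)) for _ in range(20)]
    print("density %.2f: mean greedy cost %.1f" % (rho, np.mean(costs)))
```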

  20. Statistical analysis of dendritic spine distributions in rat hippocampal cultures

    PubMed Central

    2013-01-01

    Background Dendritic spines serve as key computational structures in brain plasticity. Much remains to be learned about their spatial and temporal distribution among neurons. Our aim in this study was to perform exploratory analyses based on the population distributions of dendritic spines with regard to their morphological characteristics and period of growth in dissociated hippocampal neurons. We fit a log-linear model to the contingency table of spine features such as spine type and distance from the soma to first determine which features were important in modeling the spines, as well as the relationships between such features. A multinomial logistic regression was then used to predict the spine types using the features suggested by the log-linear model, along with neighboring spine information. Finally, an important variant of Ripley’s K-function applicable to linear networks was used to study the spatial distribution of spines along dendrites. Results Our study indicated that in the culture system, (i) dendritic spine densities were "completely spatially random", (ii) spine type and distance from the soma were independent quantities, and most importantly, (iii) spines had a tendency to cluster with other spines of the same type. Conclusions Although these results may vary with other systems, our primary contribution is the set of statistical tools for morphological modeling of spines which can be used to assess neuronal cultures following gene manipulation such as RNAi, and to study induced pluripotent stem cells differentiated to neurons. PMID:24088199

  1. Statistical analysis of the properties of foreshock density cavitons

    NASA Astrophysics Data System (ADS)

    Kajdič, P.; Blanco-Cano, X.; Omidi, N.; Russell, C. T.

    2008-12-01

    Global hybrid simulations (kinetic ions, fluid electrons) have shown the existence of foreshock density cavitons immersed in regions permeated by ULF waves (Omidi, 2007; Blanco-Cano et al., 2008). These cavitons are characterized by large depressions in magnetic field magnitude and density, and are bounded by regions with enhanced field and density. In this work we study the statistical properties of foreshock cavitons observed by the Cluster spacecraft between the years 2001 and 2005. We have identified approximately 90 foreshock cavitons and use magnetic field and plasma data to analyze their durations, sizes, amplitudes, and orientations. We compare caviton B and n values with ambient values. We also study the foreshock conditions in which the cavitons are detected, i.e. θBV (the angle between the incoming solar wind flow and the IMF) and Mach number, among others. We also determine the characteristics of the waves that surround the cavitons or even appear within them. We find that the foreshock cavitons can be observed in various ways - some are found as single cavitons immersed in ULF waves, while others appear in groups, separated by only a few minutes. In some cases we find two or three cavitons that are in the process of merging into a larger structure, and still developing.

  2. On the Statistical Analysis of X-ray Polarization Measurements

    NASA Technical Reports Server (NTRS)

    Strohmayer, T. E.; Kallman, T. R.

    2013-01-01

    In many polarimetry applications, including observations in the X-ray band, the measurement of a polarization signal can be reduced to the detection and quantification of a deviation from uniformity of a distribution of measured angles of the form α + β·cos²(φ − φ₀), with 0 < φ < π. We explore the statistics of such polarization measurements using both Monte Carlo simulations and analytic calculations based on the appropriate probability distributions. We derive relations for the number of counts required to reach a given detection level (parameterized by β, the "number of sigmas" of the measurement) appropriate for measuring the modulation amplitude α by itself (the single-interesting-parameter case) or jointly with the position angle φ₀ (the two-interesting-parameters case). We show that for the former case, when the intrinsic amplitude is equal to the well-known minimum detectable polarization (MDP), it is, on average, detected at the 3σ level. For the latter case, when one requires a joint measurement at the same confidence level, more counts are needed, by a factor of approximately 2.2, than are required to achieve the MDP level. We find that the position angle uncertainty at 1σ confidence is well described by the relation σ_φ₀ = 28.5°/β.
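
    A Monte Carlo sketch of such a measurement: angles are drawn from a modulated distribution at the standard 99% MDP amplitude (4.29/√N for unit modulation factor, an assumed textbook form rather than the paper's derivation) and the amplitude is recovered with the usual Fourier estimator:

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_angles(n, a, phi0):
    """Draw n photon angles from p(phi) ∝ 1 + a*cos(2*(phi - phi0)), 0 < phi < pi,
    by rejection sampling (equivalent to the alpha + beta*cos^2 form up to constants)."""
    out = np.empty(0)
    while out.size < n:
        phi = rng.uniform(0, np.pi, 2 * n)
        keep = rng.uniform(0, 1 + a, 2 * n) < 1 + a * np.cos(2 * (phi - phi0))
        out = np.concatenate([out, phi[keep]])
    return out[:n]

def modulation_estimate(phi):
    """Standard Fourier estimator of modulation amplitude and position angle."""
    c = 2 * np.mean(np.cos(2 * phi))
    s = 2 * np.mean(np.sin(2 * phi))
    return np.hypot(c, s), 0.5 * np.arctan2(s, c)

n = 20000
mdp99 = 4.29 / np.sqrt(n)  # MDP at 99% confidence for n counts, unit modulation
amps = [modulation_estimate(simulate_angles(n, mdp99, 0.3))[0] for _ in range(200)]
print("mean detected amplitude %.4f vs MDP %.4f" % (np.mean(amps), mdp99))
```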

  3. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
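
    A toy sketch of the LDA step using scikit-learn, with tokenized log events standing in for real network records; the tokenization scheme and the anomaly criterion here are assumptions, not the report's pipeline:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy stand-in for tokenized network-log events; real input would encode fields
# such as protocol, direction, destination and payload size as "words".
events = [
    "http get internal small",
    "http get internal small",
    "dns query external tiny",
    "ftp put external large night",
    "ftp put external large night",
]

X = CountVectorizer().fit_transform(events)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

doc_topics = lda.transform(X)  # per-event topic mixture
print(doc_topics.round(2))
# Events whose topic mixture sits far from the bulk of traffic (by a distance
# or likelihood criterion) become candidates for exfiltration review.
```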

  4. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.

  5. Computer-assisted analysis of cervical vertebral bone age using cephalometric radiographs in Brazilian subjects.

    PubMed

    Caldas, Maria de Paula; Ambrosano, Gláucia Maria Bovi; Haiter Neto, Francisco

    2010-01-01

    The aims of this study were to develop a computerized program for objectively evaluating skeletal maturation on cephalometric radiographs, and to apply the new method to Brazilian subjects. The samples were taken from the patient files of oral radiological clinics in the North, Northeast, Midwest and South regions of the country. A total of 717 subjects aged 7.0 to 15.9 years who had lateral cephalometric radiographs and hand-wrist radiographs were selected. A cervical vertebral computerized analysis was created in the Radiocef Studio 2 software for digital cephalometric analysis, and cervical vertebral bone age was calculated using the formulas developed by Caldas et al. (2007). Hand-wrist bone age was evaluated by the TW3 method. Analysis of variance (ANOVA) and the Tukey test were used to compare cervical vertebral bone age, hand-wrist bone age and chronological age (P < 0.05). No significant difference was found between cervical vertebral bone age and chronological age in any of the regions studied. When analyzing bone age, a statistically significant difference was observed between cervical vertebral bone age and hand-wrist bone age for female and male subjects in the North and Northeast regions, as well as for male subjects in the Midwest region. No significant difference was observed between bone age and chronological age in any region, except for male subjects in the North and female subjects in the Northeast. Using cervical vertebral bone age, it might be possible to evaluate skeletal maturation in an objective manner using cephalometric radiographs.

  6. New Test Statistics for MANOVA/Descriptive Discriminant Analysis.

    ERIC Educational Resources Information Center

    Coombs, William T.; Algina, James

    1996-01-01

    Univariate procedures proposed by M. Brown and A. Forsythe (1974) and the multivariate procedures from D. Nel and C. van der Merwe (1986) were generalized to form five new multivariate alternatives to one-way multivariate analysis of variance (MANOVA) for use when dispersion matrices are heteroscedastic. These alternatives are evaluated for Type I…

  7. The Patterns of Teacher Compensation. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Chambers, Jay; Bobbitt, Sharon A.

    This report presents information regarding the patterns of variation in the salaries paid to public and private school teachers in relation to various personal and job characteristics. Specifically, the analysis examines the relationship between compensation and variables such as public/private schools, gender, race/ethnic background, school level…

  8. Using Neural Networks for Descriptive Statistical Analysis of Educational Data.

    ERIC Educational Resources Information Center

    Tirri, Henry; And Others

    Methodological issues of using a class of neural networks called Mixture Density Networks (MDN) for discriminant analysis are discussed. MDN models have the advantage of having a rigorous probabilistic interpretation, and they have proven to be a viable alternative as a classification procedure in discrete domains. Both classification and…

  9. Statistical analysis of unsolicited thermal sensation complaints in commercial buildings

    SciTech Connect

    Federspiel, C.C.

    1998-10-01

    Unsolicited complaints from 23,500 occupants in 690 commercial buildings were examined with regard to absolute and relative frequency of complaints, temperatures at which thermal sensation complaints (too hot or too cold) occurred, and response times and actions. The analysis shows that thermal sensation complaints are the single most common complaint of any type and that they are the overwhelming majority of environmental complaints. The analysis indicates that thermal sensation complaints are mostly the result of poor control performance and HVAC system faults rather than inter-individual differences in preferred temperatures. The analysis also shows that the neutral temperature in summer is greater than in winter, and the difference between summer and winter neutral temperatures is smaller than the difference between the midpoints of the summer and winter ASHRAE comfort zones. On average, women complain that it is cold at a higher temperature than men, and the temperature at which men complain that it is hot is more variable than for women. Analysis of response times and actions provides information that may be useful for designing a dispatching policy, and it also demonstrates that there is potential to reduce the labor cost of HVAC maintenance by 20% by reducing the frequency of thermal sensation complaints.

  10. Granger causality--statistical analysis under a configural perspective.

    PubMed

    von Eye, Alexander; Wiedermann, Wolfgang; Mun, Eun-Young

    2014-03-01

    The concept of Granger causality can be used to examine putative causal relations between two series of scores. Based on regression models, it is asked whether one series can be considered the cause of the second series. In this article, we propose extending the pool of methods available for testing hypotheses that are compatible with Granger causation by adopting a configural perspective. This perspective allows researchers to assume that effects exist only for specific categories or specific sectors of the data space, but not for other categories or sectors. Configural Frequency Analysis (CFA) is proposed as the method of analysis from a configural perspective. CFA base models are derived for the exploratory analysis of Granger causation. These models are specified so that they parallel the regression models used for variable-oriented analysis of hypotheses of Granger causation. An example from the development of aggression in adolescence is used. The example shows that only one pattern of change in aggressive impulses over time Granger-causes change in physical aggression against peers.
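
    The configural (CFA) analysis has no standard library implementation; the sketch below instead shows the regression-based Granger test that the proposed base models parallel, using statsmodels on synthetic series:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(6)

# Two synthetic series: y depends on lagged x, so x should Granger-cause y.
n = 300
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + 0.5 * rng.standard_normal()

# Column order matters: the test asks whether the 2nd column Granger-causes the 1st.
data = pd.DataFrame({"y": y, "x": x})
results = grangercausalitytests(data[["y", "x"]], maxlag=2)
```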

  11. Open Access Publishing Trend Analysis: Statistics beyond the Perception

    ERIC Educational Resources Information Center

    Poltronieri, Elisabetta; Bravo, Elena; Curti, Moreno; Ferri, Maurizio; Mancini, Cristina

    2016-01-01

    Introduction: The purpose of this analysis was twofold: to track the number of open access journals acquiring impact factor, and to investigate the distribution of subject categories pertaining to these journals. As a case study, journals in which the researchers of the National Institute of Health (Istituto Superiore di Sanità) in Italy have…

  12. Statistical analysis of geodetic networks for detecting regional events

    NASA Technical Reports Server (NTRS)

    Granat, Robert

    2004-01-01

    We present an application of hidden Markov models (HMMs) to analysis of geodetic time series in Southern California. Our model fitting method uses a regularized version of the deterministic annealing expectation-maximization algorithm to ensure that model solutions are both robust and of high quality.
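
    A hedged sketch of HMM segmentation of a displacement series using the hmmlearn package; plain EM is used as a baseline, since the paper's regularized deterministic-annealing EM is not part of standard libraries:

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(7)

# Synthetic daily displacement series with a step event at day 500, standing in
# for a geodetic (GPS) time series from a Southern California station.
quiet = rng.normal(0.0, 1.0, (500, 1))
event = rng.normal(4.0, 1.0, (500, 1))
series = np.vstack([quiet, event])

model = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                        n_iter=100, random_state=0)
model.fit(series)
states = model.predict(series)  # a change in state flags a candidate regional event
print("first day of state change:", int(np.argmax(states != states[0])))
```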

  13. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  14. Using the statistical analysis method to assess the landslide susceptibility

    NASA Astrophysics Data System (ADS)

    Chan, Hsun-Chuan; Chen, Bo-An; Wen, Yo-Ting

    2015-04-01

    This study assessed the landslide susceptibility in the Jing-Shan River upstream watershed, central Taiwan. The landslide inventories for typhoons Toraji in 2001, Mindulle in 2004, Kalmaegi and Sinlaku in 2008, Morakot in 2009, and the 0719 rainfall event in 2011, established by the Taiwan Central Geological Survey, were used as landslide data. This study assessed landslide susceptibility using three statistical methods: logistic regression, the instability index method, and support vector machines (SVM). After evaluation, elevation, slope, slope aspect, lithology, terrain roughness, slope roughness, plan curvature, profile curvature, total curvature, and average rainfall were chosen as the landslide factors. The validity of the three models was further examined using the receiver operating characteristic (ROC) curve. Logistic regression showed that terrain roughness and slope roughness had the strongest impact on the susceptibility value, while the instability index method indicated terrain roughness and lithology. However, the instability index method may lead to underestimation near the river side, and it raises a potential issue concerning the number of factor classes: too many classes may cause excessive variation in the factor coefficients, while too few may place a large range of nearby cells in the same susceptibility level. Finally, the ROC curve was used to discriminate among the three models. SVM was the preferred method for assessing landslide susceptibility, and it performed close to logistic regression in recognizing the medium-high and high susceptibility levels.
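
    A sketch of this kind of model comparison via ROC/AUC with scikit-learn, on synthetic factor data; the factors, sample sizes, and labels are placeholders:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(8)

# Synthetic stand-ins for the landslide factors (elevation, slope, roughness, ...).
X = rng.standard_normal((2000, 10))
y = (X[:, 0] + 0.8 * X[:, 1] + rng.standard_normal(2000)) > 0.5  # 1 = landslide cell

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "SVM": SVC(probability=True, random_state=0),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, m.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```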

  15. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  16. Characterization of Nuclear Fuel using Multivariate Statistical Analysis

    SciTech Connect

    Robel, M; Robel, M; Robel, M; Kristo, M J; Kristo, M J

    2007-11-27

    Various combinations of reactor type and fuel composition have been characterized using principal components analysis (PCA) of the concentrations of 9 U and Pu isotopes in the fuel as a function of burnup. The use of PCA allows the reduction of the 9-dimensional data (isotopic concentrations) into a 3-dimensional approximation, giving a visual representation of the changes in nuclear fuel composition with burnup. Real-world variation in the concentrations of ²³⁴U and ²³⁶U in the fresh (unirradiated) fuel was accounted for. The effects of reprocessing were also simulated. The results suggest that, even after reprocessing, Pu isotopes can be used to determine both the type of reactor and the initial fuel composition with good discrimination. Finally, partial least squares discriminant analysis (PLSDA) was investigated as a substitute for PCA. Our results suggest that PLSDA is a better tool for this application, where separation between known classes is most important.
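
    A minimal PCA sketch of this reduction from 9 isotopic dimensions to 3, on synthetic concentrations; standardizing before PCA is an assumed preprocessing choice, not necessarily the authors':

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)

# Rows: fuel samples at different burnups; columns: 9 U/Pu isotope concentrations.
# Synthetic placeholder data; real inputs would come from depletion calculations
# or measured assays.
X = rng.lognormal(mean=0.0, sigma=0.5, size=(120, 9))

pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("variance explained:", pca.explained_variance_ratio_.round(3))
# 'scores' gives the 3-D approximation used to visualize how composition
# evolves with burnup and separates reactor types.
```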

  17. Three dimensional graphics in the statistical analysis of scientific data

    SciTech Connect

    Grotch, S.L.

    1986-05-01

    In scientific data analysis, the two-dimensional plot has become an indispensable tool. As the scientist more commonly encounters multivariate data, three-dimensional graphics form the natural extension of these more traditional representations. There can be little doubt that as accessibility to ever more powerful graphics tools increases, their use will expand dramatically. Having used three-dimensional graphics in routine data analysis for nearly a decade, the author has found them to be a powerful means of obtaining insights into data that are simply not available with traditional 2D methods. Examples of this work, taken primarily from chemistry and meteorology, are presented to illustrate a variety of 3D graphics found to be practically useful. Some approaches for improving these presentations are also highlighted.

  18. Statistical theory and methodology for remote sensing data analysis

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1974-01-01

    A model is developed for the evaluation of acreages (proportions) of different crop types over a geographical area using a classification approach, and methods for estimating the crop acreages are given. In estimating the acreage of a specific crop type such as wheat, it is suggested that the problem be treated as a two-crop problem (wheat vs. nonwheat), since this simplifies the estimation problem considerably. The error analysis and the sample size problem are investigated for the two-crop approach. Numerical results for sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large-area crop acreage inventory, a sampling scheme is suggested for acquiring sample data, and the problem of crop acreage estimation and the error analysis are discussed.

  19. Phosphoproteomic analysis of aged skeletal muscle.

    PubMed

    Gannon, Joan; Staunton, Lisa; O'Connell, Kathleen; Doran, Philip; Ohlendieck, Kay

    2008-07-01

    One of the most important post-translational modifications is represented by phosphorylation on tyrosine, threonine and serine residues. Since abnormal phosphorylation is associated with various pathologies, it was of interest to perform a phosphoproteomic profiling of age-related skeletal muscle degeneration. We used the fluorescent phospho-specific Pro-Q Diamond dye to determine whether changes in the overall phosphorylation of the soluble skeletal muscle proteome differs significantly between young adult and senescent fibres. As an established model system of sarcopenia, we employed 30-month-old rat gastrocnemius fibres. Following the mass spectrometric identification of 59 major 2-D phosphoprotein landmark spots, the fluorescent dye staining survey revealed that 22 muscle proteins showed a differential expression pattern between 3-month- and 30-month-old muscle. Increased phosphorylation levels were shown for myosin light chain 2, tropomyosin alpha, lactate dehydrogenase, desmin, actin, albumin and aconitase. In contrast, decreased phospho-specific dye binding was observed for cytochrome c oxidase, creatine kinase and enolase. Thus, aging-induced alterations in phosphoproteins appear to involve the contractile machinery and the cytoskeleton, as well as the cytosolic and mitochondrial metabolism. This confirms that sarcopenia of old age is a complex neuromuscular pathology that is associated with drastic changes in the abundance and structure of key muscle proteins. PMID:18575773

  20. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  1. Measuring the Success of an Academic Development Programme: A Statistical Analysis

    ERIC Educational Resources Information Center

    Smith, L. C.

    2009-01-01

    This study uses statistical analysis to estimate the impact of first-year academic development courses in microeconomics, statistics, accountancy, and information systems, offered by the University of Cape Town's Commerce Academic Development Programme, on students' graduation performance relative to that achieved by mainstream students. The data…

  2. A new statistic for the analysis of circular data in gamma-ray astronomy

    NASA Technical Reports Server (NTRS)

    Protheroe, R. J.

    1985-01-01

    A new statistic is proposed for the analysis of circular data. The statistic is designed specifically for situations where a test of uniformity is required which is powerful against alternatives in which a small fraction of the observations is grouped in a small range of directions, or phases.
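
    For contrast with the proposed statistic (whose exact form is given in the paper), the sketch below implements the classical Rayleigh test of uniformity, which is known to be weak against precisely the alternative described: a small fraction of events grouped in a narrow phase range:

```python
import numpy as np

def rayleigh_test(phases):
    """Classical Rayleigh test of uniformity for phases in [0, 1): returns the
    statistic z and a first-order approximate p-value. Protheroe's statistic is
    designed to be more powerful when only a small fraction of events share a
    narrow phase range; this baseline is shown for comparison only."""
    theta = 2 * np.pi * np.asarray(phases)
    n = theta.size
    R2 = np.sum(np.cos(theta)) ** 2 + np.sum(np.sin(theta)) ** 2
    z = R2 / n
    return z, np.exp(-z)

rng = np.random.default_rng(10)
# 90% uniform background plus 10% of events clustered near phase 0.3.
phases = np.concatenate([rng.uniform(0, 1, 180), rng.normal(0.3, 0.01, 20) % 1])
print(rayleigh_test(phases))
```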

  3. Statistical and Scientometric Analysis of International Research in Geographical and Environmental Education

    ERIC Educational Resources Information Center

    Papadimitriou, Fivos; Kidman, Gillian

    2012-01-01

    Certain statistical and scientometric features of articles published in the journal "International Research in Geographical and Environmental Education" (IRGEE) are examined in this paper for the period 1992-2009 by applying nonparametric statistics and Shannon's entropy (diversity) formula. The main findings of this analysis are: (a) after 2004,…

  4. Comparing Methods for Item Analysis: The Impact of Different Item-Selection Statistics on Test Difficulty

    ERIC Educational Resources Information Center

    Jones, Andrew T.

    2011-01-01

    Practitioners often depend on item analysis to select items for exam forms and have a variety of options available to them. These include the point-biserial correlation, the agreement statistic, the B index, and the phi coefficient. Although research has demonstrated that these statistics can be useful for item selection, no research as of yet has…

  5. Meaningfulness, Statistical Significance, Effect Size, and Power Analysis: A General Discussion with Implications for MANOVA.

    ERIC Educational Resources Information Center

    Huston, Holly L.

    This paper begins with a general discussion of statistical significance, effect size, and power analysis; and concludes by extending the discussion to the multivariate case (MANOVA). Historically, traditional statistical significance testing has guided researchers' thinking about the meaningfulness of their data. The use of significance testing…

  6. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and a symmetric analytical portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is established with small classical structures.

  7. Meta-analysis of age-related gene expression profiles identifies common signatures of aging

    PubMed Central

    de Magalhães, João Pedro; Curado, João; Church, George M.

    2009-01-01

    Motivation: Numerous microarray studies of aging have been conducted, yet given the noisy nature of gene expression changes with age, elucidating the transcriptional features of aging and how these relate to physiological, biochemical and pathological changes remains a critical problem. Results: We performed a meta-analysis of age-related gene expression profiles using 27 datasets from mice, rats and humans. Our results reveal several common signatures of aging, including 56 genes consistently overexpressed with age, the most significant of which was APOD, and 17 genes underexpressed with age. We characterized the biological processes associated with these signatures and found that age-related gene expression changes most notably involve an overexpression of inflammation and immune response genes and of genes associated with the lysosome. An underexpression of collagen genes and of genes associated with energy metabolism, particularly mitochondrial genes, as well as alterations in the expression of genes related to apoptosis, cell cycle and cellular senescence biomarkers, were also observed. By employing a new method that emphasizes sensitivity, our work further reveals previously unknown transcriptional changes with age in many genes, processes and functions. We suggest these molecular signatures reflect a combination of degenerative processes but also transcriptional responses to the process of aging. Overall, our results help to understand how transcriptional changes relate to the process of aging and could serve as targets for future studies. Availability: http://genomics.senescence.info/uarrays/signatures.html Contact: jp@senescence.info Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19189975

  8. Analysis of compressive fracture in rock using statistical techniques

    SciTech Connect

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.

  9. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.

  10. Statistical analysis of the ambiguities in the asteroid period determinations

    NASA Astrophysics Data System (ADS)

    Butkiewicz, M.; Kwiatkowski, T.; Bartczak, P.; Dudziński, G.

    2014-07-01

    A synodic period of an asteroid can be derived from its lightcurve by standard methods like Fourier-series fitting. A problem appears when observations cover less than a full lightcurve and/or carry a high level of noise. Also, long gaps between individual lightcurves create an ambiguity in the cycle count, which leads to aliases. Excluding binary systems and objects with non-principal-axis rotation, the rotation period is usually identical to the period of the second Fourier harmonic of the lightcurve. There are cases, however, where it may be connected with the 1st, 3rd, or 4th harmonic, and it is difficult to choose among them when searching for the period. To help remove such uncertainties we analysed asteroid lightcurves for a range of shapes and observing/illuminating geometries. We simulated them using a modified internal code from the ISAM service (Marciniak et al. 2012, A&A 545, A131). In our computations, shapes of asteroids were modeled as Gaussian random spheres (Muinonen 1998, A&A, 332, 1087). A combination of Lommel-Seeliger and Lambert scattering laws was assumed. For each of the 100 shapes, we randomly selected 1000 positions of the spin axis, systematically changing the solar phase angle with a step of 5°. For each lightcurve, we determined its peak-to-peak amplitude, fitted the 6th-order Fourier series and derived the amplitudes of its harmonics. Instead of the number of the lightcurve extrema, which in many cases is subjective, we characterized each lightcurve by the order of the highest-amplitude Fourier harmonic. The goal of our simulations was to derive statistically significant conclusions (based on the underlying assumptions) about the dominance of different harmonics in the lightcurves of specified amplitude and phase angle. The results, presented in the Figure, can be used in individual cases to estimate the probability that the obtained lightcurve is dominated by a specified Fourier harmonic. Some of the
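
    A sketch of the Fourier step: fit a 6th-order series to a synthetic lightcurve at a known period and report the highest-amplitude harmonic (all data and the fitting details are illustrative assumptions):

```python
import numpy as np

def fit_fourier(t, flux, period, order=6):
    """Least-squares fit of a Fourier series of the given order to a lightcurve;
    returns the amplitude of each harmonic."""
    omega = 2 * np.pi / period
    cols = [np.ones_like(t)]
    for k in range(1, order + 1):
        cols += [np.sin(k * omega * t), np.cos(k * omega * t)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), flux, rcond=None)
    return np.hypot(coef[1::2], coef[2::2])  # harmonic amplitudes A_1..A_order

rng = np.random.default_rng(11)
t = np.sort(rng.uniform(0, 10, 400))  # observation times (hours)
period = 5.0
flux = 1 + 0.2 * np.cos(2 * 2 * np.pi * t / period) + 0.02 * rng.standard_normal(400)

amps = fit_fourier(t, flux, period)
print("dominant harmonic:", int(np.argmax(amps)) + 1)  # expect 2 here
```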

  11. Statistical Analysis Strategies for Association Studies Involving Rare Variants

    PubMed Central

    Bansal, Vikas; Libiger, Ondrej; Torkamani, Ali; Schork, Nicholas J.

    2013-01-01

    The limitations of genome-wide association (GWA) studies that focus on the phenotypic influence of common genetic variants have motivated human geneticists to consider the contribution of rare variants to phenotypic expression. The increasing availability of high-throughput sequencing technology has enabled studies of rare variants, but will not be sufficient for their success since appropriate analytical methods are also needed. We consider data analysis approaches to testing associations between a phenotype and collections of rare variants in a defined genomic region or set of regions. Ultimately, although a wide variety of analytical approaches exist, more work is needed to refine them and determine their properties and power in different contexts. PMID:20940738

  12. STATISTICAL ANALYSIS OF THE VERY QUIET SUN MAGNETISM

    SciTech Connect

    Martinez Gonzalez, M. J.; Manso Sainz, R.; Asensio Ramos, A.

    2010-03-10

    The behavior of the observed polarization amplitudes with spatial resolution is a strong constraint on the nature and organization of solar magnetic fields below the resolution limit. We study the polarization of the very quiet Sun at different spatial resolutions using ground- and space-based observations. It is shown that 80% of the observed polarization signals do not change with spatial resolution, suggesting that, observationally, the very quiet Sun magnetism remains the same despite the high spatial resolution of space-based observations. Our analysis also reveals a cascade of spatial scales for the magnetic field within the resolution element. It is manifest that the Zeeman effect is sensitive to the microturbulent field usually associated with Hanle diagnostics. This demonstrates that Zeeman and Hanle studies show complementary perspectives of the same magnetism.

  13. Statistical Analysis of Factors Affecting Child Mortality in Pakistan.

    PubMed

    Ahmed, Zoya; Kamal, Asifa; Kamal, Asma

    2016-06-01

    Child mortality is a composite indicator reflecting the economic, social, environmental, and healthcare-services delivery situation in a country. Globally, Pakistan has the third highest burden of fetal, maternal, and child mortality. Factors affecting child mortality in Pakistan are investigated using binary logistic regression analysis. Region, education of the mother, birth order, preceding birth interval (the period between the previous child birth and the index child birth), size of child at birth, breastfeeding, and family size were found to be significantly associated with child mortality in Pakistan. Child mortality decreased as the level of the mother's education, the preceding birth interval, the size of the child at birth, and family size increased. Child mortality was significantly higher in Balochistan than in other regions, and lower for low birth orders. Child survival was significantly higher for children who were breastfed than for those who were not.
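
    A sketch of such a binary logistic regression in statsmodels, on synthetic survey-style records; the covariate names and effect sizes are illustrative stand-ins for the study's variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(12)
n = 1500

# Synthetic records; names are stand-ins for the survey covariates discussed.
df = pd.DataFrame({
    "mother_edu_yrs": rng.integers(0, 16, n),
    "birth_interval_mo": rng.integers(9, 60, n),
    "birth_order": rng.integers(1, 8, n),
    "breastfed": rng.integers(0, 2, n),
})
logit_p = (-1.5 - 0.08 * df.mother_edu_yrs - 0.02 * df.birth_interval_mo
           + 0.10 * df.birth_order - 0.7 * df.breastfed)
df["died"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("died ~ mother_edu_yrs + birth_interval_mo + birth_order + breastfed",
                  data=df).fit(disp=False)
print(np.exp(model.params))  # odds ratios for each factor
```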

  14. Statistical analysis of the temporal properties of BL Lacertae

    NASA Astrophysics Data System (ADS)

    Guo, Yu Cheng; Hu, Shao Ming; Li, Yu Tong; Chen, Xu

    2016-08-01

    A comprehensive temporal analysis has been performed on optical light curves of BL Lacertae in the B, V and R bands. The light curves were denoised by Gaussian smoothing and decomposed into individual flares using an exponential profile. The asymmetry, duration, peak flux and equivalent energy output of the flares were measured and their frequency distributions presented. Most optical flares of BL Lacertae are highly symmetric, with a weak tendency towards gradual rises and rapid decays. The distribution of flare durations is not random, but consistent with a gamma distribution. Peak fluxes and energy outputs of flares all follow a log-normal distribution. A positive correlation is detected between flare durations and peak fluxes. The temporal properties of BL Lacertae provide evidence of a stochastic magnetohydrodynamic process in the accretion disc and jet. The results presented here can serve as constraints on physical models attempting to interpret blazar variations.
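
    A sketch of fitting and testing a log-normal distribution for flare peak fluxes with SciPy, on synthetic values standing in for the measured sample:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)

# Synthetic flare peak fluxes (mJy); a log-normal sample stands in for the
# measured distribution reported in the paper.
peaks = rng.lognormal(mean=2.0, sigma=0.4, size=300)

# Fit a log-normal with location fixed at zero and test the fit quality.
shape, loc, scale = stats.lognorm.fit(peaks, floc=0)
ks = stats.kstest(peaks, "lognorm", args=(shape, loc, scale))
print(f"sigma={shape:.3f}, median={scale:.2f} mJy, KS p-value={ks.pvalue:.3f}")
```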

  15. Spectral reflectance of surface soils - A statistical analysis

    NASA Technical Reports Server (NTRS)

    Crouse, K. R.; Henninger, D. L.; Thompson, D. R.

    1983-01-01

    The relationship of the physical and chemical properties of soils to their spectral reflectance as measured at six wavebands of Thematic Mapper (TM) aboard NASA's Landsat-4 satellite was examined. The results of performing regressions of over 20 soil properties on the six TM bands indicated that organic matter, water, clay, cation exchange capacity, and calcium were the properties most readily predicted from TM data. The middle infrared bands, bands 5 and 7, were the best bands for predicting soil properties, and the near infrared band, band 4, was nearly as good. Clustering 234 soil samples on the TM bands and characterizing the clusters on the basis of soil properties revealed several clear relationships between properties and reflectance. Discriminant analysis found organic matter, fine sand, base saturation, sand, extractable acidity, and water to be significant in discriminating among clusters.

  16. Fine needle aspiration biopsy diagnosis of mucoepidermoid carcinoma. Statistical analysis.

    PubMed

    Cohen, M B; Fisher, P E; Holly, E A; Ljung, B M; Löwhagen, T; Bottles, K

    1990-01-01

    Fine needle aspiration (FNA) biopsy is an increasingly popular method for the evaluation of salivary gland tumors. Of the common salivary gland tumors, mucoepidermoid carcinoma is probably the most difficult to diagnose accurately by this means. A series of 96 FNA biopsy specimens of salivary gland masses, including 34 mucoepidermoid carcinomas, 51 other benign and malignant neoplasms, 7 nonneoplastic lesions and 4 normal salivary glands, were analyzed in order to identify the most useful criteria for diagnosing mucoepidermoid carcinoma. Thirteen cytologic criteria were evaluated in the FNA specimens, and a stepwise logistic regression analysis was performed. The three cytologic features selected as most predictive of mucoepidermoid carcinoma were intermediate cells, squamous cells and overlapping epithelial groups. Using these three features together, the sensitivity and specificity of accurately diagnosing mucoepidermoid carcinoma were 97% and 100%, respectively.

  17. Ordinary chondrites - Multivariate statistical analysis of trace element contents

    NASA Technical Reports Server (NTRS)

    Lipschutz, Michael E.; Samuels, Stephen M.

    1991-01-01

    The contents of mobile trace elements (Co, Au, Sb, Ga, Se, Rb, Cs, Te, Bi, Ag, In, Tl, Zn, and Cd) in Antarctic and non-Antarctic populations of H4-6 and L4-6 chondrites, were compared using standard multivariate discriminant functions borrowed from linear discriminant analysis and logistic regression. A nonstandard randomization-simulation method was developed, making it possible to carry out probability assignments on a distribution-free basis. Compositional differences were found both between the Antarctic and non-Antarctic H4-6 chondrite populations and between two L4-6 chondrite populations. It is shown that, for various types of meteorites (in particular, for the H4-6 chondrites), the Antarctic/non-Antarctic compositional difference is due to preterrestrial differences in the genesis of their parent materials.

  18. Statistical Analysis of Temple Orientation in Ancient India

    NASA Astrophysics Data System (ADS)

    Aller, Alba; Belmonte, Juan Antonio

    2015-05-01

    The great diversity of religions that have been followed in India for over 3000 years is the reason why there are hundreds of temples built to worship dozens of different divinities. In this work, more than one hundred temples geographically distributed over the whole Indian land have been analyzed, obtaining remarkable results. For this purpose, a deep analysis of the main deities who are worshipped in each of them, as well as of the different dynasties (or cultures) who built them has also been conducted. As a result, we have found that the main axes of the temples dedicated to Shiva seem to be oriented to the east cardinal point while those temples dedicated to Vishnu would be oriented to both the east and west cardinal points. To explain these cardinal directions we propose to look back to the origins of Hinduism. Besides these cardinal orientations, clear solar orientations have also been found, especially at the equinoctial declination.

  19. Statistical Analysis of Acoustic Wave Parameters Near Solar Active Regions

    NASA Astrophysics Data System (ADS)

    Rabello-Soares, M. Cristina; Bogart, Richard S.; Scherrer, Philip H.

    2016-08-01

    In order to quantify the influence of magnetic fields on acoustic mode parameters and flows in and around active regions, we analyze the differences in the parameters in magnetically quiet regions near an active region (which we call "nearby regions"), compared with those of quiet regions at the same disk locations for which there are no neighboring active regions. We also compare the mode parameters in active regions with those in comparably located quiet regions. Our analysis is based on ring-diagram analysis of all active regions observed by the Helioseismic and Magnetic Imager (HMI) during almost five years. We find that the frequency at which the mode amplitude changes from attenuation to amplification in the quiet nearby regions is around 4.2 mHz, in contrast to the active regions, for which it is about 5.1 mHz. This amplitude enhancement (the "acoustic halo effect") is as large as that observed in the active regions, and has a very weak dependence on the wave propagation direction. The mode energy difference in nearby regions also changes from a deficit to an excess at around 4.2 mHz, but averages to zero over all modes. The frequency difference in nearby regions increases with increasing frequency until a point at which the frequency shifts turn over sharply, as in active regions. However, this turnover occurs around 4.9 mHz, which is significantly below the acoustic cutoff frequency. Inverting the horizontal flow parameters in the direction of the neighboring active regions, we find flows that are consistent with a model of the thermal energy flow being blocked directly below the active region.

  20. Statistical analysis of large-scale neuronal recording data

    PubMed Central

    Reed, Jamie L.; Kaas, Jon H.

    2010-01-01

    Relating stimulus properties to the response properties of individual neurons and neuronal networks is a major goal of sensory research. Many investigators implant electrode arrays in multiple brain areas and record from chronically implanted electrodes over time to answer a variety of questions. Technical challenges related to analyzing large-scale neuronal recording data are not trivial. Several analysis methods traditionally used by neurophysiologists do not account for dependencies in the data that are inherent in multi-electrode recordings. In addition, when neurophysiological data are not best modeled by the normal distribution and when the variables of interest may not be linearly related, extensions of the linear modeling techniques are recommended. A variety of methods exist to analyze correlated data, even when data are not normally distributed and the relationships are nonlinear. Here we review expansions of the Generalized Linear Model designed to address these data properties. Such methods are used in other research fields, and the application to large-scale neuronal recording data will enable investigators to determine the variable properties that convincingly contribute to the variances in the observed neuronal measures. Standard measures of neuron properties such as response magnitudes can be analyzed using these methods, and measures of neuronal network activity such as spike timing correlations can be analyzed as well. We have done just that in recordings from 100-electrode arrays implanted in the primary somatosensory cortex of owl monkeys. Here we illustrate how one example method, Generalized Estimating Equations analysis, is a useful method to apply to large-scale neuronal recordings. PMID:20472395
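
    A minimal GEE sketch in statsmodels: Poisson-distributed spike counts clustered by electrode, with an exchangeable working correlation; the electrode counts, trial structure, and effect sizes are synthetic assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(14)

# Synthetic spike counts from 100 electrodes x 20 trials; counts within an
# electrode are correlated, which GEE handles via the cluster structure.
rows = []
for electrode in range(100):
    base = rng.gamma(2.0, 2.0)        # electrode-specific firing rate
    for trial in range(20):
        stim = trial % 2              # alternating stimulus condition
        rate = base * (1.5 if stim else 1.0)
        rows.append({"electrode": electrode, "stim": stim,
                     "count": rng.poisson(rate)})
df = pd.DataFrame(rows)

# Poisson GEE with exchangeable within-electrode correlation.
model = smf.gee("count ~ stim", groups="electrode", data=df,
                family=sm.families.Poisson(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```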

  1. A new statistical analysis of rare earth element diffusion data in garnet

    NASA Astrophysics Data System (ADS)

    Chu, X.; Ague, J. J.

    2015-12-01

    The incorporation of rare earth elements (REE) in garnet, Sm and Lu in particular, links garnet chemical zoning to absolute age determinations. The application of REE-based geochronology depends critically on the diffusion behaviors of the parent and daughter isotopes. Previous experimental studies on REE diffusion in garnet, however, exhibit significant discrepancies that impact interpretations of garnet Sm/Nd and Lu/Hf ages. We present a new statistical framework to analyze diffusion data for REE using an Arrhenius relationship that accounts for oxygen fugacity, cation radius and garnet unit-cell dimensions [1]. Our approach is based on Bayesian statistics and is implemented by the Markov chain Monte Carlo method. A similar approach has been recently applied to model diffusion of divalent cations in garnet [2]. The analysis incorporates recent data [3] in addition to the data compilation in ref. [1]. We also include the inter-run bias that helps reconcile the discrepancies among data sets. This additional term estimates the reproducibility and other experimental variabilities not explicitly incorporated in the Arrhenius relationship [2] (e.g., compositional dependence [3] and water content). The fitted Arrhenius relationships are consistent with the models in ref. [3], as well as refs. [1] & [4] at high temperatures. Down-temperature extrapolation leads to >0.5 order of magnitude faster diffusion coefficients than in refs. [1] & [4] at <750 °C. The predicted diffusion coefficients are significantly slower than in ref. [5]. The fast diffusion [5] was supported by a field test of the Pikwitonei Granulite: the garnet Sm/Nd age postdates the metamorphic peak (750 °C) by ~30 Myr [6], suggesting considerable resetting of the Sm/Nd system during cooling. However, the Pikwitonei Granulite is a recently recognized UHT terrane with peak temperature exceeding 900 °C [7]. The revised closure temperature (~730 °C) is consistent with our new diffusion model. [1] Carlson (2012) Am
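
    A minimal random-walk Metropolis sketch of fitting an Arrhenius relationship ln D = ln D0 - Ea/(R*T) to synthetic diffusion data; the paper's fuller model (oxygen fugacity, cation radius, unit-cell dimension, and inter-run bias terms) is deliberately omitted:

```python
import numpy as np

rng = np.random.default_rng(15)

# Synthetic Arrhenius data with scatter standing in for inter-run bias.
R = 8.314                                   # J/(mol K)
true_lnD0, true_Ea = -18.0, 250e3
T = rng.uniform(1100, 1700, 40)             # kelvin
lnD = true_lnD0 - true_Ea / (R * T) + rng.normal(0, 0.5, T.size)

def log_post(theta):
    """Log-posterior with broad uniform priors on ln D0 and Ea."""
    lnD0, Ea = theta
    if not (-40 < lnD0 < 0 and 0 < Ea < 1e6):
        return -np.inf
    resid = lnD - (lnD0 - Ea / (R * T))
    return -0.5 * np.sum((resid / 0.5) ** 2)

# Random-walk Metropolis sampler, a minimal stand-in for the paper's MCMC.
theta = np.array([-15.0, 2e5])
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.3, 5e3])
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
chain = np.array(chain)[5000:]              # drop burn-in
print("posterior means (ln D0, Ea):", chain.mean(axis=0))
```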

  2. STATISTICAL METHODOLOGY FOR THE SIMULTANEOUS ANALYSIS OF MULTIPLE TYPES OF OUTCOMES IN NONLINEAR THRESHOLD MODELS.

    EPA Science Inventory

    Multiple outcomes are often measured on each experimental unit in toxicology experiments. These multiple observations typically imply the existence of correlation between endpoints, and a statistical analysis that incorporates it may result in improved inference. When both disc...

  3. Meta-analysis for Discovering Rare-Variant Associations: Statistical Methods and Software Programs

    PubMed Central

    Tang, Zheng-Zheng; Lin, Dan-Yu

    2015-01-01

    There is heightened interest in using next-generation sequencing technologies to identify rare variants that influence complex human diseases and traits. Meta-analysis is essential to this endeavor because large sample sizes are required for detecting associations with rare variants. In this article, we provide a comprehensive overview of statistical methods for meta-analysis of sequencing studies for discovering rare-variant associations. Specifically, we discuss the calculation of relevant summary statistics from participating studies, the construction of gene-level association tests, the choice of transformation for quantitative traits, the use of fixed-effects versus random-effects models, and the removal of shadow association signals through conditional analysis. We also show that meta-analysis based on properly calculated summary statistics is as powerful as joint analysis of individual-participant data. In addition, we demonstrate the performance of different meta-analysis methods by using both simulated and empirical data. We then compare four major software packages for meta-analysis of rare-variant associations—MASS, RAREMETAL, MetaSKAT, and seqMeta—in terms of the underlying statistical methodology, analysis pipeline, and software interface. Finally, we present PreMeta, a software interface that integrates the four meta-analysis packages and allows a consortium to combine otherwise incompatible summary statistics. PMID:26094574
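
    The power equivalence of meta-analysis and joint analysis rests on combining score statistics additively. A hedged sketch with hypothetical per-study gene-level burden scores (real pipelines such as MASS, RAREMETAL, MetaSKAT, and seqMeta exchange richer summaries, per-variant score vectors plus covariance matrices):

```python
# Fixed-effects combination of gene-level score statistics across studies.
# The per-study scores U and variances V below are hypothetical.
import numpy as np
from scipy.stats import chi2

U = np.array([12.3, -4.1, 20.6])   # per-study score statistics
V = np.array([55.0, 40.2, 90.8])   # their variances

stat = U.sum() ** 2 / V.sum()      # 1-df chi-square test statistic
p = chi2.sf(stat, df=1)
print(f"meta chi-square = {stat:.2f}, p = {p:.3g}")
```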

  4. Meta-analysis for Discovering Rare-Variant Associations: Statistical Methods and Software Programs.

    PubMed

    Tang, Zheng-Zheng; Lin, Dan-Yu

    2015-07-01

    There is heightened interest in using next-generation sequencing technologies to identify rare variants that influence complex human diseases and traits. Meta-analysis is essential to this endeavor because large sample sizes are required for detecting associations with rare variants. In this article, we provide a comprehensive overview of statistical methods for meta-analysis of sequencing studies for discovering rare-variant associations. Specifically, we discuss the calculation of relevant summary statistics from participating studies, the construction of gene-level association tests, the choice of transformation for quantitative traits, the use of fixed-effects versus random-effects models, and the removal of shadow association signals through conditional analysis. We also show that meta-analysis based on properly calculated summary statistics is as powerful as joint analysis of individual-participant data. In addition, we demonstrate the performance of different meta-analysis methods by using both simulated and empirical data. We then compare four major software packages for meta-analysis of rare-variant associations-MASS, RAREMETAL, MetaSKAT, and seqMeta-in terms of the underlying statistical methodology, analysis pipeline, and software interface. Finally, we present PreMeta, a software interface that integrates the four meta-analysis packages and allows a consortium to combine otherwise incompatible summary statistics.

  5. Statistical analysis of low-voltage EDS spectrum images

    SciTech Connect

    Anderson, I.M.

    1998-03-01

    The benefits of using low (≤5 kV) operating voltages for energy-dispersive X-ray spectrometry (EDS) of bulk specimens have been explored only during the last few years. This paper couples low-voltage EDS with two other emerging areas of characterization: spectrum imaging and multivariate statistical analysis (MSA), applied here to a computer chip manufactured by a major semiconductor company. Data acquisition was performed with a Philips XL30-FEG SEM operated at 4 kV and equipped with an Oxford super-ATW detector and XP3 pulse processor. The specimen was normal to the electron beam and the take-off angle for acquisition was 35°. The microscope was operated with a 150 μm diameter final aperture at spot size 3, which yielded an X-ray count rate of ~2,000 s⁻¹. EDS spectrum images were acquired as Adobe Photoshop files with the 4pi plug-in module. (The spectrum images could also be stored as NIH Image files, but the raw data are automatically rescaled as maximum-contrast (0-255) 8-bit TIFF images -- even at 16-bit resolution -- which poses an inconvenience for quantitative analysis.) The 4pi plug-in module is designed for EDS X-ray mapping and allows simultaneous acquisition of maps from 48 elements plus an SEM image. The spectrum image was acquired by re-defining the energy intervals of the 48 elements to form a series of contiguous 20 eV windows from 1.25 keV to 2.19 keV. A spectrum image of 450 x 344 pixels was acquired from the specimen with a sampling density of 50 nm/pixel and a dwell time of 0.25 live seconds per pixel, for a total acquisition time of ~14 h. The binary data files were imported into Mathematica for analysis with software developed by the author at Oak Ridge National Laboratory. A 400 x 300 pixel section of the original image was analyzed. MSA required ~185 MB of memory and ~18 h of CPU time on a 300 MHz Power Macintosh 9600.

  6. Wheat signature modeling and analysis for improved training statistics

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Malila, W. A.; Cicone, R. C.; Gleason, J. M.

    1976-01-01

    The author has identified the following significant results. The spectral, spatial, and temporal characteristics of wheat and other signatures in LANDSAT multispectral scanner data were examined through empirical analysis and simulation. Irrigation patterns varied widely within Kansas; 88 percent of the wheat acreage in Finney County was irrigated, compared with 24 percent in Morton County and less than 3 percent in the western two-thirds of the state. Irrigation practice was definitely correlated with the observed spectral response; wheat variety differences produced observable spectral differences due to leaf coloration and different dates of maturation. Between-field differences were generally greater than within-field differences, and boundary pixels produced spectral features distinct from those within field centers. Multiclass boundary pixels contributed much of the observed bias in proportion estimates. The variability between signatures obtained by different draws of training data decreased as the sample size became larger; the resulting signatures also became more robust, and the particular decision threshold value became less important.

  7. Accounting for Multiple Sources of Uncertainty in the Statistical Analysis of Holocene Sea Levels

    NASA Astrophysics Data System (ADS)

    Cahill, N.; Parnell, A. C.; Kemp, A.; Horton, B.

    2014-12-01

    We perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level change, to determine when modern rates of rise began and to observe how these rates have evolved over time. Many current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and it can ignore uncertainties that arise as part of the data collection exercise. This can lead to over-confidence in the sea-level trends being characterized. The proposed model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level change evolve over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, the model is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method allows for the estimation of the rate process with full consideration of all sources of uncertainty. The model captures the continuous and dynamic evolution of sea-level change and results show that modern rates of rise are consistently increasing. Analysis of a global tide-gauge record (Church and White, 2011) indicated that the rate of sea-level rise has increased continuously since 1880 AD and is currently 1.9 mm/yr (95% credible interval of 1.84 to 2.03 mm/yr). Applying the model to a proxy reconstruction from North Carolina (Kemp et al., 2011) indicated that the mean rate of rise in this locality since the middle of the 19th century (current rate of 2.44 mm/yr with a 95% credible interval of 1.91 to 3.01 mm/yr) is unprecedented in at least the last 2000 years.
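
    The core construction, a GP prior on the rate with the observed level as its integral, can be sketched in a few lines on a discretized grid. The data below are synthetic; the published model additionally treats proxy age errors in an errors-in-variables framework and includes the GIA covariance.

```python
# Discretized sketch of an integrated-GP model: rate r(t) ~ GP, observed
# sea level y ~ integral of r plus noise. Synthetic data for illustration.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(1880.0, 2010.0, 66)          # years
dt = t[1] - t[0]
true_rate = 1.0 + 0.01 * (t - 1880.0)        # mm/yr, slowly accelerating
y = np.cumsum(true_rate) * dt + rng.normal(0, 5.0, t.size)  # mm, noisy levels

def sqexp(a, b, amp=2.0, ell=40.0):          # squared-exponential kernel
    return amp**2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

K = sqexp(t, t)                              # prior covariance of the rate
A = np.tril(np.ones((t.size, t.size))) * dt  # discrete integration operator
C = A @ K @ A.T + 5.0**2 * np.eye(t.size)    # covariance of observed levels
rate_post = K @ A.T @ np.linalg.solve(C, y)  # posterior mean of r(t)
print("posterior rate in 2010: %.2f mm/yr" % rate_post[-1])
```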

  8. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cells, were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years in the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels) and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly
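
    The FMECA prioritization step reduces to computing and ranking Risk Priority Numbers, RPN = severity x occurrence x detection. A toy sketch with made-up scores for a few PV failure modes:

```python
# Sketch of the FMECA ranking step. The modes and 1-10 scores are
# hypothetical, chosen only to illustrate the computation.
failure_modes = {
    # mode: (severity, occurrence, detection)
    "encapsulant discoloration": (4, 8, 3),
    "solder bond fatigue":       (7, 5, 6),
    "glass breakage":            (9, 2, 2),
    "backsheet cracking":        (6, 4, 7),
}

rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{mode:28s} RPN = {score}")
```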

  9. Statistical analysis of molecular nanotemplate driven DNA adsorption on graphite.

    PubMed

    Dubrovin, E V; Speller, S; Yaminsky, I V

    2014-12-30

    In this work, we have studied the conformation of DNA molecules aligned on nanotemplates of octadecylamine, stearyl alcohol, and stearic acid on highly oriented pyrolytic graphite (HOPG). For this purpose, fluctuations of the contours of adsorbed biopolymers obtained from atomic force microscopy (AFM) images were analyzed using the wormlike chain model. Moreover, the conformations of adsorbed biopolymer molecules were characterized by analysis of the scaling exponent ν, which relates the mean squared end-to-end distance to the contour length of the polymer. During adsorption on octadecylamine and stearyl alcohol nanotemplates, DNA forms straight segments, which order along the crystallographic axes of graphite. In this case, the conformation of DNA molecules can be described using two different length scales. On a large length scale (at contour lengths l > 200-400 nm), aligned DNA molecules have either a 2D compact globule or a partially relaxed 2D conformation, whereas on a short length scale (at l ≤ 200-400 nm) their conformation is close to that of rigid rods. The latter type of conformation can also be assigned to DNA adsorbed on a stearic acid nanotemplate. The different conformations of DNA molecules observed on the studied monolayers are connected with the different DNA-nanotemplate interactions associated with the nature of the functional group of the alkane derivative in the nanotemplate (amine, alcohol, or acid). The persistence length of λ-DNA adsorbed on octadecylamine nanotemplates is 31 ± 2 nm, indicating a loss of DNA rigidity in comparison with its native state. Similar values of the persistence length (34 ± 2 nm) obtained for 24-times-shorter DNA molecules adsorbed on an octadecylamine nanotemplate demonstrate that this rigidity change does not depend on biopolymer length. Possible reasons for the reduction of the DNA persistence length are discussed in view of the internal DNA structure and the DNA-surface interaction.
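
    For a chain equilibrated in two dimensions, the wormlike chain model predicts a tangent-tangent correlation <cos(theta(s))> = exp(-s/(2 l_p)), which is the standard route from digitized contours to a persistence length. A sketch on a synthetic contour, not the authors' AFM data:

```python
# Estimate persistence length from a digitized 2D contour via the WLC
# tangent correlation <cos(theta(s))> = exp(-s / (2*l_p)). Synthetic chain.
import numpy as np

rng = np.random.default_rng(3)
lp, step, n = 31.0, 2.0, 400                    # nm; made-up contour
# generate a 2D WLC: Gaussian angular steps with variance step/lp
angles = np.cumsum(rng.normal(0, np.sqrt(step / lp), n))
xy = np.cumsum(np.column_stack([np.cos(angles), np.sin(angles)]) * step, axis=0)

seg = np.diff(xy, axis=0)
theta = np.arctan2(seg[:, 1], seg[:, 0])
max_lag = 50
s = np.arange(1, max_lag) * step
corr = np.array([np.mean(np.cos(theta[k:] - theta[:-k])) for k in range(1, max_lag)])

# linear fit of log correlation: slope = -1/(2*lp)
slope = np.polyfit(s, np.log(np.clip(corr, 1e-6, None)), 1)[0]
print("estimated persistence length: %.1f nm" % (-1.0 / (2.0 * slope)))
```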

  10. Processing and statistical analysis of soil-root images

    NASA Astrophysics Data System (ADS)

    Razavi, Bahar S.; Hoang, Duyen; Kuzyakov, Yakov

    2016-04-01

    The importance of hotspots such as the rhizosphere, the small soil volume that surrounds and is influenced by plant roots, calls for spatially explicit methods to visualize the distribution of microbial activities in this active site (Kuzyakov and Blagodatskaya, 2015). The zymography technique has previously been adapted to visualize the spatial dynamics of enzyme activities in the rhizosphere (Spohn and Kuzyakov, 2014). Following further development of soil zymography to obtain a higher resolution of enzyme activities, we aimed to 1) quantify the images, and 2) determine whether the pattern (e.g. the distribution of hotspots in space) is clumped (aggregated) or regular (dispersed). To this end, we incubated soil-filled rhizoboxes with maize (Zea mays L.) and without maize (control box) for two weeks. In situ soil zymography was applied to visualize the enzymatic activity of β-glucosidase and phosphatase at the soil-root interface. The spatial resolution of the fluorescent images was improved by direct application of a substrate-saturated membrane to the soil-root system. Furthermore, we applied spatial point pattern analysis to determine whether the distribution of hotspots in space is clumped (aggregated) or regular (dispersed). Our results demonstrated that the distribution of hotspots in the rhizosphere is clumped (aggregated), whereas the control box without plants showed a regular (dispersed) pattern. These patterns were similar in all three replicates and for both enzymes. We conclude that improved zymography is a promising in situ technique to identify, analyze, visualize and quantify the spatial distribution of enzyme activities in the rhizosphere. Moreover, such differing patterns should be considered in assessments and modeling of rhizosphere extension and the corresponding effects on soil properties and functions. Key words: rhizosphere, spatial point pattern, enzyme activity, zymography, maize.
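
    One simple point-pattern statistic in the spirit of this analysis is the Clark-Evans nearest-neighbour index; the abstract does not state which test the authors used, so this is an illustrative stand-in on hypothetical hotspot coordinates:

```python
# Clark-Evans index: R = observed mean nearest-neighbour distance divided by
# the expected value 0.5/sqrt(density). R < 1 suggests clumping, R > 1
# regularity. Points below are random stand-ins for digitized hotspots.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(4)
pts = rng.uniform(0, 100, size=(200, 2))      # hotspot centres in a 100x100 box
area = 100.0 * 100.0

tree = cKDTree(pts)
d, _ = tree.query(pts, k=2)                   # k=1 is the point itself
mean_nn = d[:, 1].mean()
expected = 0.5 / np.sqrt(len(pts) / area)
R = mean_nn / expected
print(f"Clark-Evans index R = {R:.2f} ({'clumped' if R < 1 else 'regular/random'})")
```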

  11. R: A Software Environment for Comprehensive Statistical Analysis of Astronomical Data

    NASA Astrophysics Data System (ADS)

    Feigelson, E. D.

    2012-09-01

    R is the largest public-domain software language for statistical analysis of data. Together with CRAN, its rapidly growing collection of >3000 add-on specialized packages, it implements around 60,000 statistical functionalities in a cohesive software environment. Extensive graphical capabilities and interfaces with other programming languages are also available. The scope and language of R/CRAN are briefly described, along with efforts to promulgate its use in astronomy. R can become an important tool for advanced statistical analysis of astronomical data.

  12. Relative age effects in Japanese baseball: an historical analysis.

    PubMed

    Nakata, Hiroki; Sakamoto, Kiwako

    2013-08-01

    The present study investigated the existence of the relative age effect, a biased distribution of birth dates, in Japanese professional baseball players born from 1911 to 1980. Japan applies a unique annual-age grouping for sport and education, which runs from April 1 to March 31 of the following year. Thus, athletes were divided into four groups based on their month of birth: quarters Q1 (April-June), Q2 (July-September), Q3 (October-December), and Q4 (January-March of the following year). There were statistically biased distributions of birth dates among players born in the 1940s and subsequent decades (medium effects), and similar (but small) relative age effects were observed among players born in the 1910s, 1920s, and 1930s. The magnitude of the relative age effect changed with time, and socio-cultural factors such as international competition and media coverage may have contributed greatly to this effect.
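
    The underlying test is a chi-square goodness-of-fit comparison of observed birth-quarter counts against a uniform expectation; a sketch with hypothetical counts, reporting Cohen's w as the effect size:

```python
# Chi-square goodness-of-fit test of birth quarters. Counts are made up.
import numpy as np
from scipy.stats import chisquare

counts = np.array([310, 270, 240, 180])       # Q1..Q4 birth counts
expected = np.full(4, counts.sum() / 4)       # uniform quarters, ignoring
                                              # small month-length differences
stat, p = chisquare(counts, expected)
w = np.sqrt(stat / counts.sum())              # Cohen's w effect size
print(f"chi2 = {stat:.1f}, p = {p:.2g}, effect size w = {w:.2f}")
```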

  13. Relative age effects in Japanese baseball: an historical analysis.

    PubMed

    Nakata, Hiroki; Sakamoto, Kiwako

    2013-08-01

    The present study investigated the existence of the relative age effect, a biased distribution of birth dates, in Japanese professional baseball players born from 1911 to 1980. Japan applies a unique annual-age grouping for sport and education, which runs from April 1 to March 31 of the following year. Thus, athletes were divided into four groups based on their month of birth: quarters Q1 (April-June), Q2 (July-September), Q3 (October-December), and Q4 (January-March of the following year). There were statistically biased distributions of birth dates among players born in the 1940s and subsequent decades (medium effects), and similar (but small) relative age effects were observed among players born in the 1910s, 1920s, and 1930s. The magnitude of the relative age effect changed with time, and socio-cultural factors such as international competition and media coverage may have contributed greatly to this effect. PMID:24422356

  14. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package based on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). Energy Finite Element Analysis (EFEA) was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) and experimental results. Statistical Energy Analysis (SEA) predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  15. Feasibility of voxel-based statistical analysis method for myocardial PET

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Paik, Chang H.; Kim, Kyeong Min; Moo Lim, Sang

    2014-09-01

    Although statistical parametric mapping (SPM) analysis is widely used in neuroimaging studies, to the best of our knowledge it has not previously been applied to myocardial PET data. In this study, we developed a voxel-based statistical analysis method for myocardial PET which provides statistical comparisons between groups in image space. PET emission data from normal and myocardial-infarction rats were acquired. For the SPM analysis, a rat heart template was created. In addition, individual PET data were spatially normalized and smoothed. Two-sample t-tests were performed to identify the myocardial infarct region. The SPM method was compared with conventional ROI methods. Myocardial glucose metabolism was decreased in the lateral wall of the left ventricle. In the ROI analysis, the mean value of the lateral wall was decreased by 29%. The newly developed SPM method for myocardial PET could provide quantitative information in myocardial PET studies.
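
    The voxel-wise core of such an analysis is a mass-univariate two-sample t-test with a multiple-comparison correction. The sketch below runs it on synthetic volumes standing in for the normalized, smoothed PET data:

```python
# Voxel-wise two-sample t-tests with a simple FDR correction. Arrays are
# synthetic stand-ins for the normal and infarct groups.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(5)
normal = rng.normal(1.0, 0.1, size=(8, 32, 32, 32))    # 8 rats, 32^3 voxels
infarct = rng.normal(1.0, 0.1, size=(8, 32, 32, 32))
infarct[:, 20:26, 8:14, 8:14] -= 0.3                   # simulated wall defect

t, p = ttest_ind(normal, infarct, axis=0)              # test at every voxel
reject = multipletests(p.ravel(), alpha=0.05, method="fdr_bh")[0]
print("significant voxels:", int(reject.sum()))
```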

  16. In Vivo Brillouin Analysis of the Aging Crystalline Lens

    PubMed Central

    Besner, Sebastien; Scarcelli, Giuliano; Pineda, Roberto; Yun, Seok-Hyun

    2016-01-01

    Purpose To analyze the age dependence of the longitudinal modulus of the crystalline lens in vivo using Brillouin scattering data in healthy subjects. Methods Brillouin scans were performed along the crystalline lens in 56 eyes from 30 healthy subjects aged from 19 to 63 years. Longitudinal elastic modulus was acquired along the sagittal axis of the lens with a transverse and axial resolution of 4 and 60 μm, respectively. The relative lens stiffness was computed, and correlations with age were analyzed. Results Brillouin axial profiles revealed a nonuniform longitudinal modulus within the lens, increasing from a softer periphery toward a stiffer central plateau at all ages. The longitudinal modulus at the central plateau showed no age dependence in the range of 19 to 45 years and a slight decrease with age from 45 to 63 years. Significant intersubject variability was observed in an age-matched analysis. Importantly, the extent of the central stiff plateau region increased steadily with age from 19 to 63 years. The slopes of change in Brillouin modulus in the peripheral regions were nearly age-invariant. Conclusions The adult human lens showed no measurable age-related increase in the peak longitudinal modulus. The expansion of the stiff central region of the lens is likely to be the major contributing factor in age-related lens stiffening. Brillouin microscopy may be useful in characterizing the crystalline lens for the optimization of surgical or pharmacological treatments aimed at restoring accommodative power. PMID:27699407

  17. [Anthropometry: the modern statistical analysis and significance for clinics of internal diseases and nutrition].

    PubMed

    Petykhov, A B; Maev, I V; Deriabin, V E

    2012-01-01

    Anthropometry is a technique that provides the features needed to characterize changes in the human body in health and disease. A statistical analysis of anthropometric parameters, namely body mass, body length, waist, hip, shoulder and wrist circumferences, and skinfold thicknesses on the triceps, below the shoulder blade, on the chest, on the abdomen and on the biceps, with calculation of indices and an assessment of possible age influence, was carried out for the first time in domestic medicine. Groups of interrelated anthropometric characteristics were identified. Correlation coefficients (r) were calculated, and factor analysis (principal components with subsequent varimax rotation), covariance analysis and discriminant analysis (using the Kaiser and Wilks criteria and the F-test) were applied. Intergroup variability of body composition was studied for individual characteristics in groups of healthy individuals (135 subjects aged 45.6 +/- 1.2 years; 56.3% men and 43.7% women) and in internal pathology: patients after gastrectomy (121 subjects; 57.7 +/- 1.2 years; 52% men and 48% women); after Billroth operation (214 subjects; 56.1 +/- 1.0 years; 53% men and 47% women); after enterectomy (103 subjects; 44.5 +/- 1.8 years; 53% men and 47% women); and with protein-energy wasting of mixed origin (206 subjects; 29.04 +/- 1.6 years; 79% men and 21% women). The analysis identified a group of interrelated characteristics comprising parameters of subcutaneous fat deposition (skinfold thicknesses on the triceps, on the biceps, below the shoulder blade and on the abdomen) and fat body mass. These characteristics are related to age and height, and show a more pronounced dependence in women, reflecting the development of the fat component of the body when body mass index is assessed in women (unlike men). The waist-hip index varies irrespective of body composition indicators, which does not allow it to be characterized in terms of truncal or

  18. [Anthropometry: the modern statistical analysis and significance for clinics of internal diseases and nutrition].

    PubMed

    Petykhov, A B; Maev, I V; Deriabin, V E

    2012-01-01

    Anthropometry is a technique that provides the features needed to characterize changes in the human body in health and disease. A statistical analysis of anthropometric parameters, namely body mass, body length, waist, hip, shoulder and wrist circumferences, and skinfold thicknesses on the triceps, below the shoulder blade, on the chest, on the abdomen and on the biceps, with calculation of indices and an assessment of possible age influence, was carried out for the first time in domestic medicine. Groups of interrelated anthropometric characteristics were identified. Correlation coefficients (r) were calculated, and factor analysis (principal components with subsequent varimax rotation), covariance analysis and discriminant analysis (using the Kaiser and Wilks criteria and the F-test) were applied. Intergroup variability of body composition was studied for individual characteristics in groups of healthy individuals (135 subjects aged 45.6 +/- 1.2 years; 56.3% men and 43.7% women) and in internal pathology: patients after gastrectomy (121 subjects; 57.7 +/- 1.2 years; 52% men and 48% women); after Billroth operation (214 subjects; 56.1 +/- 1.0 years; 53% men and 47% women); after enterectomy (103 subjects; 44.5 +/- 1.8 years; 53% men and 47% women); and with protein-energy wasting of mixed origin (206 subjects; 29.04 +/- 1.6 years; 79% men and 21% women). The analysis identified a group of interrelated characteristics comprising parameters of subcutaneous fat deposition (skinfold thicknesses on the triceps, on the biceps, below the shoulder blade and on the abdomen) and fat body mass. These characteristics are related to age and height, and show a more pronounced dependence in women, reflecting the development of the fat component of the body when body mass index is assessed in women (unlike men). The waist-hip index varies irrespective of body composition indicators, which does not allow it to be characterized in terms of truncal or

  19. Childhood autism in India: A case-control study using tract-based spatial statistics analysis

    PubMed Central

    Assis, Zarina Abdul; Bagepally, Bhavani Shankara; Saini, Jitender; Srinath, Shoba; Bharath, Rose Dawn; Naidu, Purushotham R.; Gupta, Arun Kumar

    2015-01-01

    Context: Autism is a serious behavioral disorder among young children that now occurs at epidemic rates in developing countries like India. We have used tract-based spatial statistics (TBSS) of diffusion tensor imaging (DTI) measures to investigate the microstructure of the primary neurocircuitry involved in autistic spectrum disorders as compared to typically developing children. Objective: To evaluate the various white matter tracts in Indian autistic children as compared to controls using TBSS. Materials and Methods: Prospective, case-control, voxel-based, whole-brain DTI analysis using TBSS was performed. The study included 19 autistic children (mean age 8.7 ± 3.84 years; 16 males and 3 females) and 34 controls (mean age 12.38 ± 3.76 years; all males). Fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD), and axial diffusivity (AD) values were used as outcome variables. Results: Compared to the control group, TBSS demonstrated multiple areas of markedly reduced FA involving multiple long white matter tracts, the entire corpus callosum, bilateral posterior thalami, and bilateral optic tracts (OTs). Notably, there were no voxels where FA was significantly increased in the autism group. Increased RD was also noted in these regions, suggesting an underlying myelination defect. The MD was elevated in many of the projection and association fibers, and notably in the OTs. There were no significant changes in the AD in these regions, indicating no significant axonal injury. There was no significant correlation between the FA values and the Childhood Autism Rating Scale. Conclusion: This is a first-of-its-kind study evaluating DTI findings in autistic children in India. In our study, DTI has shown a significant fault in the underlying intricate brain wiring system in autism. The OT abnormality is a novel finding and needs further research. PMID:26600581

  20. Age inclusive services or separate old age and working age services? A historical analysis from the formative years of old age psychiatry c.1940-1989.

    PubMed

    Hilton, Claire

    2015-04-01

    The Equality Act 2010 made it unlawful to discriminate in the provision of services on the grounds of age. This legislation is open to interpretation, but it is affecting the way older people's services are defined and provided. Historical evidence indicates that, since the 1940s, apart from psychiatrists working in dedicated old age services, most were unenthusiastic about working with mentally unwell older people and unsupportive of those who chose to do so. A historical analysis might shed light on current dilemmas about 'all age' or 'old age' services and inform decision-making on future mental health services.

  1. Space Shuttle Columbia Aging Wiring Failure Analysis

    NASA Technical Reports Server (NTRS)

    McDaniels, Steven J.

    2005-01-01

    A Space Shuttle Columbia main engine controller 14 AWG wire short circuited during the launch of STS-93. Post-flight examination divulged that the wire had electrically arced against the head of a nearby bolt. More extensive inspection revealed additional damage to the subject wire, and to other wires as well from the mid-body of Columbia. The shorted wire was to have been constructed from nickel-plated copper conductors surrounded by the polyimide insulation Kapton, top-coated with an aromatic polyimide resin. The wires were analyzed via scanning electron microscope (SEM), energy dispersive X-Ray spectroscopy (EDX), and electron spectroscopy for chemical analysis (ESCA); differential scanning calorimetry (DSC) and thermal gravimetric analysis (TGA) were performed on the polyimide. Exemplar testing under laboratory conditions was performed to replicate the mechanical damage characteristics evident on the failed wires. The exemplar testing included a step test, where, as the name implies, a person stepped on a simulated wire bundle that rested upon a bolt head. Likewise, a shear test that forced a bolt head and a torque tip against a wire was performed to attempt to damage the insulation and conductor. Additionally, a vibration test was performed to determine if a wire bundle would abrade when vibrated against the head of a bolt. Also, an abrasion test was undertaken to determine if the polyimide of the wire could be damaged by rubbing against convolex helical tubing. Finally, an impact test was performed to ascertain if the use of the tubing would protect the wire from the strike of a foreign object.

  2. Statistical analysis of ground based magnetic field measurements with the field line resonance detector

    NASA Astrophysics Data System (ADS)

    Plaschke, F.; Glassmeier, K.-H.; Constantinescu, O. D.; Mann, I. R.; Milling, D. K.; Motschmann, U.; Rae, I. J.

    2008-11-01

    In this paper we introduce the field line resonance detector (FLRD), a wave telescope technique which has been specially adapted to estimate the spectral energy density of field line resonance (FLR) phase structures in a superposed wave field. The field line resonance detector is able to detect and correctly characterize several superposed FLR structures of a wave field and therefore constitutes a new and powerful tool in ULF pulsation studies. In our work we derive the technique from the classical wave telescope beamformer and present a statistical analysis of one year of ground based magnetometer data from the Canadian magnetometer network CANOPUS, now known as CARISMA. The statistical analysis shows that the FLRD is capable of detecting and characterizing superposed or hidden FLR structures in most of the detected ULF pulsation events; the one year statistical database is therefore extraordinarily comprehensive. The results of this analysis confirm the results of previous FLR characterizations and furthermore allow a detailed generalized dispersion analysis of FLRs.

  3. [Application of multivariate statistical analysis and thinking in quality control of Chinese medicine].

    PubMed

    Liu, Na; Li, Jun; Li, Bao-Guo

    2014-11-01

    The study of quality control of Chinese medicine has always been a hot spot and a difficulty in the development of traditional Chinese medicine (TCM), and it is one of the key problems restricting the modernization and internationalization of Chinese medicine. Multivariate statistical analysis is an analytical approach suited to the characteristics of TCM and has been widely used in the study of TCM quality control. It is applied to the multiple indicators and variables that arise in quality control studies and that are correlated with one another, in order to uncover hidden patterns or relationships in the data; these can then serve decision-making and enable effective quality evaluation of TCM. In this paper, the application of multivariate statistical analysis in the quality control of Chinese medicine is summarized, providing a basis for further study. PMID:25775806

  4. Application of multivariate statistical methods to the analysis of ancient Turkish potsherds

    SciTech Connect

    Martin, R.C.

    1986-01-01

    Three hundred ancient Turkish potsherds were analyzed by instrumental neutron activation analysis, and the resulting data analyzed by several techniques of multivariate statistical analysis, some only recently developed. The programs AGCLUS, MASLOC, and SIMCA were sequentially employed to characterize and group the samples by type of pottery and site of excavation. Comparison of the statistical analyses by each method provided archaeological insight into the site/type relationships of the samples and ultimately evidence relevant to the commercial relations between the ancient communities and specialization of pottery production over time. The techniques used for statistical analysis were found to be of significant potential utility in the future analysis of other archaeometric data sets. 25 refs., 33 figs.
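
    AGCLUS, MASLOC, and SIMCA are the specific historical programs used; a generic modern analogue of the grouping step is hierarchical clustering on standardized log concentrations, sketched here on hypothetical NAA data:

```python
# Hierarchical clustering of sherds on standardized log element
# concentrations; a generic stand-in for the paper's software, not a
# reimplementation of it.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(6)
# hypothetical NAA data: 30 sherds x 10 element concentrations (ppm)
conc = rng.lognormal(mean=2.0, sigma=0.5, size=(30, 10))

X = np.log10(conc)
X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each element

Z = linkage(X, method="ward")                # agglomerative clustering
groups = fcluster(Z, t=4, criterion="maxclust")
print("cluster assignment per sherd:", groups)
```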

  5. Analysis of Variance with Summary Statistics in Microsoft® Excel®

    ERIC Educational Resources Information Center

    Larson, David A.; Hsu, Ko-Cheng

    2010-01-01

    Students regularly are asked to solve Single Factor Analysis of Variance problems given only the sample summary statistics (number of observations per category, category means, and corresponding category standard deviations). Most undergraduate students today use Excel for data analysis of this type. However, Excel, like all other statistical…
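
    The computation itself needs only the summary statistics: the between-group and within-group sums of squares follow directly from the group sizes, means, and standard deviations. A sketch with hypothetical numbers:

```python
# One-way ANOVA from summary statistics alone (no raw data needed).
import numpy as np
from scipy.stats import f as f_dist

n = np.array([12, 15, 10])          # observations per category (hypothetical)
m = np.array([5.1, 6.3, 4.8])       # category means
s = np.array([1.2, 1.5, 1.1])       # category standard deviations

grand = np.sum(n * m) / n.sum()
ss_between = np.sum(n * (m - grand) ** 2)
ss_within = np.sum((n - 1) * s ** 2)
df_b, df_w = len(n) - 1, n.sum() - len(n)
F = (ss_between / df_b) / (ss_within / df_w)
p = f_dist.sf(F, df_b, df_w)
print(f"F({df_b}, {df_w}) = {F:.2f}, p = {p:.4f}")
```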

  6. Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began and to observe how these rates have been changing over time. Many of the current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and it can ignore uncertainties that arise as part of the data collection exercise. This can lead to overconfidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level change evolve over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and allows for the direct estimation of the rate process with full consideration of all sources of uncertainty. Analysis of tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible. The model captures the continuous and dynamic evolution of sea-level change and results show that not only are modern sea levels rising but that the rates

  7. Statistical-fluctuation analysis for quantum key distribution with consideration of after-pulse contributions

    NASA Astrophysics Data System (ADS)

    Li, Hongxin; Jiang, Haodong; Gao, Ming; Ma, Zhi; Ma, Chuangui; Wang, Wei

    2015-12-01

    The statistical fluctuation problem is a critical factor in all quantum key distribution (QKD) protocols under finite-key conditions. The current statistical fluctuation analysis is mainly based on independent random samples; however, this precondition cannot always be satisfied because of different choices of samples and actual parameters. As a result, proper statistical fluctuation methods are required to solve this problem. Taking the after-pulse contributions into consideration, this paper gives the expression for the secure key rate and the mathematical model for statistical fluctuations, focusing on a decoy-state QKD protocol [Z.-C. Wei et al., Sci. Rep. 3, 2453 (2013), 10.1038/srep02453] with a biased basis choice. On this basis, a classified analysis of statistical fluctuations is presented according to the mutual relationships between random samples. First, for independent identical relations, a deviation comparison is made between the law of large numbers and standard error analysis. Second, a sufficient condition is given under which the Chernoff bound achieves a better result than Hoeffding's inequality based on only independent relations. Third, by constructing a proper martingale, a stringent way is proposed to deal with issues based on dependent random samples through the use of Azuma's inequality. In numerical optimization, the impact on the secure key rate, the comparison of secure key rates, and the respective deviations under various kinds of statistical fluctuation analyses are depicted.
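
    The advantage of a Chernoff bound over Hoeffding's inequality in the rare-count regime can be seen numerically. The sketch below compares the deviation terms each bound guarantees for a sum of Bernoulli trials with a small mean; the parameters are illustrative, not the paper's:

```python
# Deviation guaranteed by Hoeffding vs a multiplicative Chernoff bound for a
# sum of N Bernoulli trials with mean mu, at failure probability eps.
import numpy as np
from scipy.optimize import brentq

N, p, eps = 1e9, 1e-6, 1e-10
mu = N * p                                   # expected count (rare events)

# Hoeffding: P(X - mu >= t) <= exp(-2 t^2 / N)
t_hoeffding = np.sqrt(N * np.log(1 / eps) / 2)

# Chernoff: P(X >= (1+d) mu) <= exp(-d^2 mu / (2 + d)); solve for d
d = brentq(lambda d: d * d * mu / (2 + d) - np.log(1 / eps), 1e-9, 1e9)
t_chernoff = d * mu

print(f"Hoeffding deviation: {t_hoeffding:.3e}")
print(f"Chernoff  deviation: {t_chernoff:.3e}")   # far tighter when mu << N
```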

  8. Q-Type Factor Analysis of Healthy Aged Men.

    ERIC Educational Resources Information Center

    Kleban, Morton H.

    Q-type factor analysis was used to re-analyze baseline data collected in 1957, on 47 men aged 65-91. Q-type analysis is the use of factor methods to study persons rather than tests. Although 550 variables were originally studied involving psychiatry, medicine, cerebral metabolism and chemistry, personality, audiometry, dichotic and diotic memory,…

  9. The linear statistical d.c. model of GaAs MESFET using factor analysis

    NASA Astrophysics Data System (ADS)

    Dobrzanski, Lech

    1995-02-01

    The linear statistical model of the GaAs MESFET's current generator is obtained by means of factor analysis. Three different MESFET deterministic models are taken into account in the analysis: the Statz model (ST), the Materka-type model (MT) and a new proprietary model of a MESFET with an implanted channel (PLD). It is shown that statistical models obtained using factor analysis provide excellent generation of the multidimensional random variable representing the drain current of the MESFET. The method of implementation of the statistical model into the SPICE program is presented. It is proved that, for a strongly limited number of Monte Carlo analysis runs in that program, the statistical models considered in each case (ST, MT and PLD) enable good reconstruction of the empirical factor structure. The empirical correlation matrix of model parameters is not reconstructed exactly by statistical modelling, but the values of correlation matrix elements obtained from simulated data are within the confidence intervals for a small sample. This paper shows that a formal approach to statistical modelling using factor analysis is the right path to follow, in spite of the fact that CAD systems (PSpice [MicroSim Corp.], Microwave Harmonica [Compact Software]) are not properly designed for generation of the multidimensional random variable. It is obvious that further progress in the implementation of statistical methods in CAD software is required. Furthermore, a new approach to the MESFET's d.c. model is presented. The separate functions, describing the linear as well as the saturated region of the MESFET output characteristics, are combined in a single equation. This way of modelling is particularly suitable for transistors with an implanted channel.
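
    The generation step of such a statistical model is simple once the factor loadings are known: correlated parameter vectors are produced as p = mean + L f + e, with common factors f ~ N(0, I) and unique errors e. A sketch with a made-up loading matrix (the paper estimates loadings from measured devices):

```python
# Generate correlated device parameters from an assumed factor structure.
import numpy as np

rng = np.random.default_rng(7)
mean = np.array([50e-3, 2.5, -1.8])          # e.g. Idss (A), alpha, Vp0 (V)
L = np.array([[8e-3, 1e-3],                  # loadings on two common factors
              [0.20, 0.05],
              [0.10, -0.15]])
psi = np.array([1e-3, 0.02, 0.03])           # unique (specific) std devs

n_mc = 1000                                  # Monte Carlo sample of devices
f = rng.normal(size=(n_mc, 2))               # common factors
e = rng.normal(size=(n_mc, 3)) * psi         # unique errors
params = mean + f @ L.T + e
print("empirical correlation matrix:\n", np.corrcoef(params.T).round(2))
```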

  10. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  11. Linguistic Analysis of the Human Heartbeat Using Frequency and Rank Order Statistics

    NASA Astrophysics Data System (ADS)

    Yang, Albert C.-C.; Hseu, Shu-Shya; Yien, Huey-Wen; Goldberger, Ary L.; Peng, C.-K.

    2003-03-01

    Complex physiologic signals may carry unique dynamical signatures that are related to their underlying mechanisms. We present a method based on rank order statistics of symbolic sequences to investigate the profile of different types of physiologic dynamics. We apply this method to heart rate fluctuations, the output of a central physiologic control system. The method robustly discriminates patterns generated from healthy and pathologic states, as well as aging. Furthermore, we observe increased randomness in the heartbeat time series with physiologic aging and pathologic states and also uncover nonrandom patterns in the ventricular response to atrial fibrillation.
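
    A minimal version of the symbolic approach: binarize successive heartbeat-interval changes, slice the symbol stream into fixed-length words, and inspect the rank-ordered word frequencies. The series, word length, and symbolization rule below are illustrative choices, not the authors' exact scheme:

```python
# Rank-order statistics of symbolic words built from a synthetic RR series.
import numpy as np
from collections import Counter

rng = np.random.default_rng(8)
rr = 0.8 + 0.05 * np.sin(np.arange(2000) / 10) + rng.normal(0, 0.01, 2000)

symbols = (np.diff(rr) > 0).astype(int)          # 1 = lengthening, 0 = shortening
m = 8                                            # word length
words = ["".join(map(str, symbols[i:i + m])) for i in range(len(symbols) - m)]
ranked = Counter(words).most_common()

# A flat rank-frequency profile indicates a more random series; a steep one,
# more structured dynamics. Shannon entropy summarizes the profile.
freq = np.array([c for _, c in ranked], dtype=float)
prob = freq / freq.sum()
print("top 5 words:", ranked[:5])
print("word entropy: %.2f bits" % -(prob * np.log2(prob)).sum())
```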

  12. Aging Chart: a community resource for rapid exploratory pathway analysis of age-related processes.

    PubMed

    Moskalev, Alexey; Zhikrivetskaya, Svetlana; Shaposhnikov, Mikhail; Dobrovolskaya, Evgenia; Gurinovich, Roman; Kuryan, Oleg; Pashuk, Aleksandr; Jellen, Leslie C; Aliper, Alex; Peregudov, Alex; Zhavoronkov, Alex

    2016-01-01

    Aging research is a multi-disciplinary field encompassing knowledge from many areas of basic, applied and clinical research. Age-related processes occur on molecular, cellular, tissue, organ, system, organismal and even psychological levels, trigger the onset of multiple debilitating diseases and lead to a loss of function, and there is a need for a unified knowledge repository designed to track, analyze and visualize the cause and effect relationships and interactions between the many elements and processes on all levels. Aging Chart (http://agingchart.org/) is a new, community-curated collection of aging pathways and knowledge that provides a platform for rapid exploratory analysis. Building on an initial content base constructed by a team of experts from peer-reviewed literature, users can integrate new data into biological pathway diagrams for a visible, intuitive, top-down framework of aging processes that fosters knowledge-building and collaboration. As the body of knowledge in aging research is rapidly increasing, an open visual encyclopedia of aging processes will be useful to both the new entrants and experts in the field. PMID:26602690

  13. Aging Chart: a community resource for rapid exploratory pathway analysis of age-related processes.

    PubMed

    Moskalev, Alexey; Zhikrivetskaya, Svetlana; Shaposhnikov, Mikhail; Dobrovolskaya, Evgenia; Gurinovich, Roman; Kuryan, Oleg; Pashuk, Aleksandr; Jellen, Leslie C; Aliper, Alex; Peregudov, Alex; Zhavoronkov, Alex

    2016-01-01

    Aging research is a multi-disciplinary field encompassing knowledge from many areas of basic, applied and clinical research. Age-related processes occur on molecular, cellular, tissue, organ, system, organismal and even psychological levels, trigger the onset of multiple debilitating diseases and lead to a loss of function, and there is a need for a unified knowledge repository designed to track, analyze and visualize the cause and effect relationships and interactions between the many elements and processes on all levels. Aging Chart (http://agingchart.org/) is a new, community-curated collection of aging pathways and knowledge that provides a platform for rapid exploratory analysis. Building on an initial content base constructed by a team of experts from peer-reviewed literature, users can integrate new data into biological pathway diagrams for a visible, intuitive, top-down framework of aging processes that fosters knowledge-building and collaboration. As the body of knowledge in aging research is rapidly increasing, an open visual encyclopedia of aging processes will be useful to both the new entrants and experts in the field.

  14. Aging Chart: a community resource for rapid exploratory pathway analysis of age-related processes

    PubMed Central

    Moskalev, Alexey; Zhikrivetskaya, Svetlana; Shaposhnikov, Mikhail; Dobrovolskaya, Evgenia; Gurinovich, Roman; Kuryan, Oleg; Pashuk, Aleksandr; Jellen, Leslie C.; Aliper, Alex; Peregudov, Alex; Zhavoronkov, Alex

    2016-01-01

    Aging research is a multi-disciplinary field encompassing knowledge from many areas of basic, applied and clinical research. Age-related processes occur on molecular, cellular, tissue, organ, system, organismal and even psychological levels, trigger the onset of multiple debilitating diseases and lead to a loss of function, and there is a need for a unified knowledge repository designed to track, analyze and visualize the cause and effect relationships and interactions between the many elements and processes on all levels. Aging Chart (http://agingchart.org/) is a new, community-curated collection of aging pathways and knowledge that provides a platform for rapid exploratory analysis. Building on an initial content base constructed by a team of experts from peer-reviewed literature, users can integrate new data into biological pathway diagrams for a visible, intuitive, top-down framework of aging processes that fosters knowledge-building and collaboration. As the body of knowledge in aging research is rapidly increasing, an open visual encyclopedia of aging processes will be useful to both the new entrants and experts in the field. PMID:26602690

  15. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
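
    A simplified single-detection-limit ROS conveys the idea directly; the actual software handles multiple limits via Hirsch-Stedinger plotting positions, which this stripped-down sketch omits:

```python
# Simplified regression-on-order-statistics (ROS) for one detection limit.
import numpy as np
from scipy.stats import norm

DL = 1.0                                       # detection limit (ug/L)
detects = np.array([1.2, 1.5, 2.3, 3.1, 4.8, 7.0])
n_censored = 4                                 # values reported as "<1"
n = len(detects) + n_censored

# plotting positions: censored values occupy the lowest ranks
pp = (np.arange(1, n + 1) - 0.5) / n
pp_cens, pp_det = pp[:n_censored], pp[n_censored:]

# regress log concentration on normal quantiles using detects only
slope, intercept = np.polyfit(norm.ppf(pp_det), np.log(np.sort(detects)), 1)

# impute censored observations from the fitted line, then summarize
imputed = np.exp(intercept + slope * norm.ppf(pp_cens))
full = np.concatenate([imputed, detects])
print(f"ROS mean = {full.mean():.2f}, std = {full.std(ddof=1):.2f}")
```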

  16. Statistical analysis of interaction between lake seepage rates and groundwater and lake levels

    NASA Astrophysics Data System (ADS)

    Ala-aho, P.; Rossi, P. M.; Klöve, B.

    2012-04-01

    In Finland, the main sources of groundwater are the esker deposits from the last ice age. Small lakes with no outlets or inlets, imbedded in the aquifer, are typically found in eskers. Some lakes at the Rokua esker, in Northern Finland, have been suffering from changes in water stage and quality. A possibly permanent decline of the water level has raised considerable concern, as the area is also used for recreation and tourism. Rare biotypes supported by the oligotrophic lakes may also be endangered by the level decline. Drainage of peatlands located in the discharge zone of the aquifer is a possible threat to the lakes and the whole aquifer. Drainage can potentially lower the aquifer water table, which can affect groundwater-lake interaction. The aim of this study was to understand in more detail the interaction of the aquifer and the lake systems, so that potential causes of the lake level variations could be better understood and managed. An in-depth understanding of the hydrogeological system also provides a foundation for studying nutrient inputs to the lakes, which affect lake ecosystems. A small lake imbedded in the Rokua esker aquifer was studied in detail. Direct measurements of the seepage rate between the lake and the aquifer were carried out using seepage meters. Seepage was measured at six locations on eight occasions during May 2010 - November 2010. Precipitation was recorded with a tipping-bucket rain gauge adjacent to the lake. Lake stage and groundwater levels from three piezometers were registered at hourly intervals using pressure probes. Statistical methods were applied to examine the relationships between seepage rates, lake and groundwater levels, and the amount of precipitation. Distinct areas of inseepage and outseepage of the lake were distinguished with the seepage meter measurements. Seepage rates showed only little variation within individual measurement locations. Nevertheless, the analysis revealed statistically significant correlation of seepage rate variation in four

  17. New ordering principle for the classical statistical analysis of Poisson processes with background

    NASA Astrophysics Data System (ADS)

    Giunti, C.

    1999-03-01

    Inspired by the recent proposal by Feldman and Cousins of a "unified approach to the classical statistical analysis of small signals" based on a choice of ordering in Neyman's construction of classical confidence intervals, I propose a new ordering principle for the classical statistical analysis of Poisson processes with background, one that minimizes the effect on the resulting confidence intervals of observing fewer background events than expected. The new ordering principle is applied to the calculation of the confidence region implied by the recent null result of the KARMEN neutrino oscillation experiment.
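
    For reference, the Neyman construction with the Feldman-Cousins likelihood-ratio ordering, which the proposed principle modifies, can be sketched compactly; a different ordering rule would simply replace the line defining R:

```python
# Neyman construction for a Poisson signal mu over known background b,
# with likelihood-ratio (Feldman-Cousins) ordering of the acceptance set.
import numpy as np
from scipy.stats import poisson

def confidence_interval(n_obs, b, cl=0.90, mu_max=15.0, n_max=60, n_grid=600):
    ns = np.arange(n_max)
    accepted_mus = []
    for mu in np.linspace(0.0, mu_max, n_grid):
        p = poisson.pmf(ns, mu + b)
        mu_best = np.maximum(0.0, ns - b)        # best-fit signal given n
        R = p / poisson.pmf(ns, mu_best + b)     # likelihood-ratio ordering
        order = np.argsort(-R)                   # add n values in decreasing R
        cum = np.cumsum(p[order])
        accept = ns[order][: np.searchsorted(cum, cl) + 1]
        if n_obs in accept:
            accepted_mus.append(mu)
    return min(accepted_mus), max(accepted_mus)

print(confidence_interval(n_obs=2, b=3.0))       # hypothetical null-result case
```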

  18. Differential Expression Analysis for RNA-Seq: An Overview of Statistical Methods and Computational Software

    PubMed Central

    Huang, Huei-Chung; Niu, Yi; Qin, Li-Xuan

    2015-01-01

    Deep sequencing has recently emerged as a powerful alternative to microarrays for the high-throughput profiling of gene expression. In order to account for the discrete nature of RNA sequencing data, new statistical methods and computational tools have been developed for the analysis of differential expression to identify genes that are relevant to a disease such as cancer. In this paper, it is thus timely to provide an overview of these analysis methods and tools. For readers with statistical background, we also review the parameter estimation algorithms and hypothesis testing strategies used in these methods. PMID:26688660

  19. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    PubMed

    Ribes, Delphine; Parafita, Julia; Charrier, Rémi; Magara, Fulvio; Magistretti, Pierre J; Thiran, Jean-Philippe

    2010-11-23

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  20. A Comparative Review of Sensitivity and Uncertainty Analysis of Large-Scale Systems - II: Statistical Methods

    SciTech Connect

    Cacuci, Dan G.; Ionescu-Bujor, Mihaela

    2004-07-15

    Part II of this review paper highlights the salient features of the most popular statistical methods currently used for local and global sensitivity and uncertainty analysis of both large-scale computational models and indirect experimental measurements. These statistical procedures represent sampling-based methods (random sampling, stratified importance sampling, and Latin Hypercube sampling), first- and second-order reliability algorithms (FORM and SORM, respectively), variance-based methods (correlation ratio-based methods, the Fourier Amplitude Sensitivity Test, and the Sobol Method), and screening design methods (classical one-at-a-time experiments, global one-at-a-time design methods, systematic fractional replicate designs, and sequential bifurcation designs). It is emphasized that all statistical uncertainty and sensitivity analysis procedures first commence with the 'uncertainty analysis' stage and only subsequently proceed to the 'sensitivity analysis' stage; this path is the exact reverse of the conceptual path underlying the methods of deterministic sensitivity and uncertainty analysis where the sensitivities are determined prior to using them for uncertainty analysis. By comparison to deterministic methods, statistical methods for uncertainty and sensitivity analysis are relatively easier to develop and use but cannot yield exact values of the local sensitivities. Furthermore, current statistical methods have two major inherent drawbacks as follows: 1. Since many thousands of simulations are needed to obtain reliable results, statistical methods are at best expensive (for small systems) or, at worst, impracticable (e.g., for large time-dependent systems). 2. Since the response sensitivities and parameter uncertainties are inherently and inseparably amalgamated in the results produced by these methods, improvements in parameter uncertainties cannot be directly propagated to improve response uncertainties; rather, the entire set of simulations and
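
    As one concrete example of the sampling-based methods surveyed, Latin Hypercube sampling can be implemented in a few lines; scipy.stats.qmc provides an equivalent, but this hand-rolled sketch shows the stratification explicitly:

```python
# Latin Hypercube sampling: one stratified draw per equal-probability bin,
# per dimension, with bins shuffled independently in each dimension.
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    u = rng.uniform(size=(n_samples, n_dims))                 # jitter in bins
    strata = np.array([rng.permutation(n_samples) for _ in range(n_dims)]).T
    return (strata + u) / n_samples                           # points in [0,1)

rng = np.random.default_rng(9)
samples = latin_hypercube(100, 3, rng)        # 100 points in [0,1)^3
# map to model parameter ranges, run the model, then correlate outputs with
# inputs for a sampling-based sensitivity measure
print(samples.min(axis=0), samples.max(axis=0))
```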

  1. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    PubMed

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and the Satorra-Bentler mean-scaled statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.

  2. Long-term Statistical Analysis of the Simultaneity of Forbush Decrease Events at Middle Latitudes

    NASA Astrophysics Data System (ADS)

    Lee, Seongsuk; Oh, Suyeon; Yi, Yu; Evenson, Paul; Jee, Geonhwa; Choi, Hwajin

    2015-03-01

    Forbush Decreases (FDs) are transient, sudden reductions of cosmic ray (CR) intensity lasting from a few days to a week. Such events are observed globally using ground neutron monitors (NMs). Most studies of FD events indicate that an FD event is observed simultaneously at NM stations located all over the Earth. However, using statistical analysis, previous researchers verified that while FD events can occur simultaneously, in some cases they occur non-simultaneously. Previous studies confirmed the statistical reality of non-simultaneous FD events, and the mechanism by which they occur, using data from high-latitude and middle-latitude NM stations. In this study, we used long-term data (1971-2006) from middle-latitude NM stations (Irkutsk, Climax, and Jungfraujoch) to enhance statistical reliability. According to the results of this analysis, the variation of cosmic ray intensity during the main phase is larger (statistically significantly so) for simultaneous FD events than for non-simultaneous ones. Moreover, the distributions of main-phase onset time show statistically significant differences: while the onset times of simultaneous FDs are distributed evenly over 24-hour intervals (day and night), those of non-simultaneous FDs are mostly distributed over 12-hour intervals, in daytime. Thus, the existence of two kinds of FD events, distinguished by their statistical properties, was verified based on data from middle-latitude NM stations.

  3. [Aging at home with telecare in Spain. A discourse analysis].

    PubMed

    Aceros, Juan C; Cavalcante, Maria Tereza Leal; Domènech, Miquel

    2016-08-01

    Care for the elderly is increasingly turning to forms of community care and home care, and telecare is one of these emergent modalities of care. This article explores the meanings that older people give to the experience of staying at home in later life with telecare. Discourse analysis is used to examine a set of focus groups and interviews with telecare users from different cities of Catalonia (Spain). The outcomes include three interpretative repertoires that we call "aging at home", "normal aging" and "unsafe aging". For each repertoire we examine how the permanence of older people in their homes is accounted for, and what role telecare plays in that experience.

  4. Gene set analysis for GWAS: assessing the use of modified Kolmogorov-Smirnov statistics.

    PubMed

    Debrabant, Birgit; Soerensen, Mette

    2014-10-01

    We discuss the use of modified Kolmogorov-Smirnov (KS) statistics in the context of gene set analysis and review the corresponding null and alternative hypotheses. In particular, we show that, when enhancing the impact of highly significant genes in the calculation of the test statistic, the corresponding test can be considered to infer the classical self-contained null hypothesis. We use simulations to estimate the power for different kinds of alternatives, and to assess the impact of the weight parameter of the modified KS statistic on the power. Finally, we show the analogy between the weight parameter and the genesis and distribution of the gene-level statistics, and illustrate the effects of differential weighting in a real-life example.
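
    A minimal sketch of a weighted (GSEA-style) KS statistic follows; the exact weighting scheme, the toy data and the planted gene set are illustrative assumptions rather than the authors' formulation.

```python
import numpy as np

def weighted_ks(gene_stats, in_set, p=1.0):
    """Signed maximum deviation between the weighted ECDF of set genes and
    the ECDF of background genes; p=0 recovers the classical KS statistic."""
    order = np.argsort(gene_stats)[::-1]       # rank genes by significance
    member = in_set[order]
    w = np.where(member, np.abs(gene_stats[order]) ** p, 0.0)
    cdf_hit = np.cumsum(w) / w.sum()
    cdf_miss = np.cumsum(~member) / (~member).sum()
    dev = cdf_hit - cdf_miss
    return dev[np.argmax(np.abs(dev))]

rng = np.random.default_rng(0)
stats_ = rng.normal(size=1000)                 # toy gene-level statistics
gene_set = np.zeros(1000, dtype=bool)
gene_set[np.argsort(stats_)[-50:]] = True      # plant an enriched 50-gene set
print(f"enrichment score: {weighted_ks(stats_, gene_set, p=1.0):.3f}")
```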

  5. Geochemical and statistical analysis of toxic elements in tsunami deposits from the March 11, 2011 event

    NASA Astrophysics Data System (ADS)

    Komai, T.; Kuwatani, T.; Kawabe, Y.; Hara, J.; Okada, M.

    2013-12-01

    A huge amount of tsunami deposits remained after the large earthquake and tsunami of March 11, 2011. This event raised the possibility of environmental pollution, particularly of soils and sediments around the coastal areas of eastern Japan. A geochemical survey and investigation of soil contamination risk was therefore carried out to clarify the risk level caused by the tsunami event and its deposits. First, more than 200 soil and sediment sampling points were selected on the basis of tsunami hazard and topographic features. Samples were analyzed by chemical and physical methods to build a database for evaluating the environmental risk. Various kinds of tsunami deposits were observed in the coastal areas, some of them sandy sediments and others muddy with a high clay content. The chemical analysis showed that some portions of the deposits contain slightly elevated contents of arsenic and lead; most, however, are similar in composition to normal subsurface soils. Environmental risk assessment using our GERAS system indicated that tsunami deposits sampled around northern Miyagi and Iwate prefectures have a relatively higher risk level, and some form of risk management is necessary for their storage and utilization. The remaining deposits and soils can be safely used for reconstruction activity because their risk level is acceptable. In the analysis of the physical properties of the deposits, a database was developed for particle-size distribution, soil and clay components, and organic matter content. The biological effects and aging trends of tsunami deposits containing sulfide minerals were clarified by precise investigation using a long-term testing method. The authors also conduct a statistical analysis of elements in tsunami deposits using an original technique of sparse modeling, in which the discrimination between tsunami deposits and

  6. The statistical analysis techniques to support the NGNP fuel performance experiments

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson

    2013-10-01

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three statistical analysis techniques, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that the regression models relating calculated fuel temperatures to thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the fuel temperature within a given range.
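
    To illustrate the control-charting component (a sketch, not the actual NGNP Data Management and Analysis System implementation), the following flags readings that fall outside Shewhart-style 3-sigma limits estimated from an assumed in-control window; all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
readings = rng.normal(1100.0, 5.0, size=500)   # hypothetical fuel temperatures (C)
readings[400:] += 25.0                          # simulated drift, e.g. a failing sensor

baseline = readings[:200]                       # assumed in-control window
mu, sigma = baseline.mean(), baseline.std(ddof=1)
lo, hi = mu - 3 * sigma, mu + 3 * sigma         # Shewhart control limits

# Occasional isolated false alarms are expected at the 3-sigma level.
alarms = np.flatnonzero((readings < lo) | (readings > hi))
print(f"first out-of-control index: {alarms[0]}, n alarms: {alarms.size}")
```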

  7. Linearised and non-linearised isotherm models optimization analysis by error functions and statistical means.

    PubMed

    Subramanyam, Busetty; Das, Ashutosh

    2014-01-01

    In adsorption studies, describing the sorption process and identifying the best-fitting isotherm model are key steps in testing the theoretical hypothesis. Hence, statistical analyses are widely used to assess how well predicted equilibrium values agree with the experimental equilibrium adsorption values. In the present study, several statistical measures were used to evaluate the fitness of the adsorption isotherm models: the Pearson correlation, the coefficient of determination, and the chi-square test. An ANOVA test was carried out to evaluate the significance of the various error functions, and the coefficient of dispersion was evaluated for the linearised and non-linearised models. The adsorption of phenol onto natural soil (locally known as Kalathur soil) was carried out in batch mode at 30 ± 2 °C. To obtain a holistic view of the analysis, the isotherm parameter estimates from the linear and non-linear forms of the models were compared. Among the above-mentioned error functions and statistical measures, the results reveal those best suited to determining the best-fitting isotherm. PMID:25018878
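
    The linear-versus-non-linear comparison can be sketched for the Langmuir isotherm q = q_max*K*C/(1 + K*C) as follows; the data points and the choice of a chi-square error function are illustrative, not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    return q_max * K * C / (1 + K * C)

C = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])      # equilibrium conc. (mg/L)
q = np.array([4.8, 8.9, 15.1, 22.6, 30.2, 35.8, 38.9])   # uptake (mg/g), toy data

# Non-linear least squares on the original form.
(q_max_nl, K_nl), _ = curve_fit(langmuir, C, q, p0=(40.0, 0.1))

# Linearised form: 1/q = 1/q_max + (1/(q_max*K)) * (1/C).
slope, intercept = np.polyfit(1 / C, 1 / q, 1)
q_max_lin, K_lin = 1 / intercept, intercept / slope

# Chi-square error function as one of several possible fit criteria.
for name, qm, K in [("non-linear", q_max_nl, K_nl), ("linearised", q_max_lin, K_lin)]:
    pred = langmuir(C, qm, K)
    chi2 = np.sum((q - pred) ** 2 / pred)
    print(f"{name}: q_max = {qm:.1f}, K = {K:.3f}, chi2 = {chi2:.3f}")
```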

  8. Statistical analysis of eruptive vent distribution from post-subduction monogenetic fields in Baja California, Mexico

    NASA Astrophysics Data System (ADS)

    Germa, A.; Cañon-Tapia, E.; Connor, L.; Le Corvec, N.

    2012-04-01

    Volcanism in Baja California (BC, Mexico) was active from the end of the subduction of the Farallon plate (12.5 Ma) until recently (<1 Ma). Most of this volcanism formed twelve volcanic fields, seven of them monogenetic, delineating a ~600-km-long array parallel to the Gulf of California. Previous studies of these fields have focused on the compositional diversity of magmatic products. Although the geochemistry and ages of a few lava flows are constrained, only two studies have investigated the spatial distribution of eruptive vents, both at San Borja. Within a monogenetic volcanic field, cone alignments and linear arrays are considered to reflect the geometry of feeder dikes, formed either parallel to the maximum principal stress (σ1) in the lithosphere or along pre-existing crustal fractures. These intrinsic local structures can be compared with the shape of the field, which may reflect the shape of the source at depth. Using satellite imagery to locate the eruptive centres of four monogenetic volcanic fields in central Baja California (Jaraguay, San Borja, Santa Clara, and San Ignacio), we completed statistical analyses of their spatial distribution. Using commercially available GIS software, spatial density analysis, and statistical scripts, each volcanic field was analysed for the number and density of vents, clustering, vent spacing, and alignment azimuths. Our preliminary results reveal that vent densities are within the range of 0.001 to 0.2 vents/100 km2. Eruptive vents are generally clustered, with densities higher than 0.1 vents/100 km2. A common elongation direction trends N135° to N152° in most clusters and fields. We thus propose a NW-SE direction as the preferred orientation of the maximum principal stress (σ1), a direction that needs to be confirmed by the azimuths of vent alignments. Using a combination of different computational methods, this study allows us to quantify the influence of tectonic stresses at the deep and shallow level within
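
    A two-point azimuth calculation of the kind used when searching for preferred vent alignments might look like this; the vent coordinates are hypothetical and the binning is arbitrary.

```python
import numpy as np

vents = np.array([[0.0, 0.0], [1.2, 1.5], [2.1, 3.0],
                  [0.8, -1.1], [3.3, 4.1], [4.0, 5.2]])     # (x, y) in km

dx = vents[:, 0][:, None] - vents[:, 0]
dy = vents[:, 1][:, None] - vents[:, 1]
i, j = np.triu_indices(len(vents), k=1)                     # each pair once
azimuth = np.degrees(np.arctan2(dx[i, j], dy[i, j])) % 180  # from north, mod 180

counts, edges = np.histogram(azimuth, bins=np.arange(0, 181, 30))
for lo, n in zip(edges[:-1], counts):
    print(f"{lo:3.0f}-{lo + 30:3.0f} deg: {n} pairs")
```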

  9. Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    2010-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.

  10. Landing Site Dispersion Analysis and Statistical Assessment for the Mars Phoenix Lander

    NASA Technical Reports Server (NTRS)

    Bonfiglio, Eugene P.; Adams, Douglas; Craig, Lynn; Spencer, David A.; Strauss, William; Seelos, Frank P.; Seelos, Kimberly D.; Arvidson, Ray; Heet, Tabatha

    2008-01-01

    The Mars Phoenix Lander launched on August 4, 2007 and successfully landed on Mars 10 months later, on May 25, 2008. Landing ellipse predictions and hazard maps were key in selecting safe surface targets for Phoenix. Hazard maps were based on terrain slopes, geomorphology maps, and automated rock counts from images taken by MRO's High Resolution Imaging Science Experiment (HiRISE). The expected landing dispersion that led to the selection of Phoenix's surface target is discussed, as are the actual landing dispersion predictions determined during operations in the weeks, days, and hours before landing. A statistical assessment of these dispersions is performed, comparing the actual landing-safety probabilities to criteria levied by the project. Also discussed are applications of this statistical analysis that were used by the Phoenix project, including verifying the effectiveness of a pre-planned maneuver menu and calculating the probability of future maneuvers.

  11. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded-support data and is therefore non-Gaussian distributed. In order to capture such properties, we introduce several non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method; they are meaningful tools for DNA methylation analysis. Moreover, among the several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687
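
    The benefit of a bounded-support model can be sketched by fitting a beta and a Gaussian distribution to toy methylation beta-values and comparing log-likelihoods; this single-distribution sketch stands in for the mixture-model machinery actually used for clustering.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.beta(2, 8, size=1000)                            # toy beta-values on (0, 1)

a, b, loc, scale = stats.beta.fit(x, floc=0, fscale=1)   # ML fit, support fixed
ll_beta = stats.beta.logpdf(x, a, b).sum()
mu, sd = stats.norm.fit(x)                               # Gaussian ML fit
ll_norm = stats.norm.logpdf(x, mu, sd).sum()
print(f"log-likelihood: beta {ll_beta:.1f} vs normal {ll_norm:.1f}")
```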

  12. Multivariate statistical analysis: Principles and applications to coorbital streams of meteorite falls

    NASA Technical Reports Server (NTRS)

    Wolf, S. F.; Lipschutz, M. E.

    1993-01-01

    Multivariate statistical analysis techniques (linear discriminant analysis and logistic regression) can provide powerful discrimination tools which are generally unfamiliar to the planetary science community. Fall parameters were used to identify a group of 17 H chondrites (Cluster 1) that were part of a coorbital stream which intersected Earth's orbit in May during 1855-1895 and can be distinguished from all other H chondrite falls. Using multivariate statistical techniques, it was demonstrated that by a totally different criterion, labile trace element contents - hence thermal histories - of 13 Cluster 1 meteorites are distinguishable from those of 45 non-Cluster 1 H chondrites. Here, we focus upon the principles of multivariate statistical techniques and illustrate their application using non-meteoritic and meteoritic examples.
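
    For readers unfamiliar with the technique, a small linear discriminant analysis sketch in the same two-class spirit (13 versus 45 samples, with hypothetical trace-element features) is given below.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)
cluster1 = rng.normal([1.0, 2.0, 0.5], 0.3, size=(13, 3))   # 13 "Cluster 1" samples
others = rng.normal([1.4, 1.6, 0.8], 0.3, size=(45, 3))     # 45 other samples
X = np.vstack([cluster1, others])
y = np.array([1] * 13 + [0] * 45)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("resubstitution accuracy:", lda.score(X, y))
print("discriminant weights:", lda.coef_)
```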

  13. Risk assessment of coal production: an information system user's manual. [SAS (Statistical Analysis System) format]

    SciTech Connect

    Watson, A.P.; Birchfield, T.E.; Fore, C.S.

    1982-10-01

    A specialized information system comprising all US domestic coal mine and processing plant injuries as reported to the Mine Safety and Health Administration of the US Department of Labor for the years 1975 through 1980 has been developed at Oak Ridge National Laboratory (ORNL) for online and batch users. The data are stored in two principal datasets: (1) annual summaries of accidental injuries and fatalities in both surface and underground bituminous and anthracite mines, as well as information on injuries suffered by workers employed in coal-processing (blending, crushing, etc.) facilities; and (2) annual summaries of employment (person-hours, number of individuals) and production (tons) of each domestic mine or processing facility for which the US Department of Labor has granted an operating permit. There are currently more than 232 000 records available online to interested users. Data are recorded for the following variables: county, state, date of injury, sex of victim, age at time of accident, degree of injury, occupation title at time of injury, activity during injury, location of accident, type of coal, type of mine, type of mining machine, type of accident, source and nature of injury, part of body injured, total mine experience, experience at current mine and job title held at time of injury, and number of days away from work or number of days restricted or charged due to the injury. As these values are organized by FIPS (Federal Information Processing Standards) county code for each reporting facility, compilations may be made on a subregional or substate basis. The datasets have been established in SAS (Statistical Analysis System) format and are readily manipulated by SAS routines available at ORNL. Several appendices are included in the manual to provide the user with a detailed description of all the codes available for data retrieval. Sample retrieval sessions are also incorporated.

  14. Dental computed tomographic imaging as age estimation: morphological analysis of the third molar of a group of Turkish population.

    PubMed

    Cantekin, Kenan; Sekerci, Ahmet Ercan; Buyuk, Suleyman Kutalmis

    2013-12-01

    Computed tomography (CT) is capable of providing accurate and measurable 3-dimensional images of the third molar. The aims of this study were to analyze the development of the mandibular third molar and its relation to chronological age, and to create new reference data for a group of Turkish participants aged 9 to 25 years on the basis of cone-beam CT images. All data were obtained from the patients' records, including medical, social, and dental anamnesis, and cone-beam CT images of 752 patients. Linear regression analysis was performed to obtain regression formulas for dental age calculation from chronological age and to determine the coefficient of determination (r2) for each sex. Statistical analysis showed a strong correlation between age and third-molar development for both males (r2 = 0.80) and females (r2 = 0.78). Computed tomographic images are clinically useful for accurate and reliable estimation of the dental ages of children and youth.

  15. A statistical method for estimating rates of soil development and ages of geologic deposits: A design for soil-chronosequence studies

    USGS Publications Warehouse

    Switzer, P.; Harden, J.W.; Mark, R.K.

    1988-01-01

    A statistical method for estimating rates of soil development in a given region based on calibration from a series of dated soils is used to estimate ages of soils in the same region that are not dated directly. The method is designed specifically to account for sampling procedures and uncertainties that are inherent in soil studies. Soil variation and measurement error, uncertainties in calibration dates and their relation to the age of the soil, and the limited number of dated soils are all considered. Maximum likelihood (ML) is employed to estimate a parametric linear calibration curve, relating soil development to time or age on suitably transformed scales. Soil variation on a geomorphic surface of a certain age is characterized by replicate sampling of soils on each surface; such variation is assumed to have a Gaussian distribution. The age of a geomorphic surface is described by older and younger bounds. This technique allows age uncertainty to be characterized by either a Gaussian distribution or by a triangular distribution using minimum, best-estimate, and maximum ages. The calibration curve is taken to be linear after suitable (in certain cases logarithmic) transformations, if required, of the soil parameter and age variables. Soil variability, measurement error, and departures from linearity are described in a combined fashion using Gaussian distributions with variances particular to each sampled geomorphic surface and the number of sample replicates. Uncertainty in age of a geomorphic surface used for calibration is described using three parameters by one of two methods. In the first method, upper and lower ages are specified together with a coverage probability; this specification is converted to a Gaussian distribution with the appropriate mean and variance. In the second method, "absolute" older and younger ages are specified together with a most probable age; this specification is converted to an asymmetric triangular distribution with mode at the

  16. Multicanonical simulation of biomolecules and microcanonical statistical analysis of conformational transitions

    NASA Astrophysics Data System (ADS)

    Bachmann, Michael

    2013-05-01

    The simulation of biomolecular structural transitions such as folding and aggregation does not only require adequate models that reflect the key aspects of the cooperative transition behaviour. It is likewise important to employ thermodynamically correct simulation methods and to perform an accurate statistical analysis of the data obtained in the simulation. The efficient combination of methodology and analysis can be quite sophisticated, but also very instructive in its feedback towards a better understanding of the physics of the underlying cooperative processes that drive the conformational transition. We show here that the density of states, which is the central result of multicanonical sampling and any other generalized-ensemble simulation, serves as the optimal basis for the microcanonical statistical analysis of transitions. The microcanonical inflection-point analysis method, recently introduced for this purpose, is a perfect tool for the precise, unique identification and classification of all structural transitions.
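
    A minimal numerical sketch of the microcanonical route: from a density of states g(E), form the entropy S(E) = ln g(E) and the inverse temperature beta(E) = dS/dE, then look for inflection points of beta(E). The closed-form g(E) below is a synthetic placeholder, not a simulation result.

```python
import numpy as np

E = np.linspace(-1.99, 0.0, 2000)                                # toy energy grid
S = 300 * (E + 2) ** 0.7 + 2 * np.exp(-((E + 1.0) / 0.1) ** 2)   # synthetic ln g(E)

beta = np.gradient(S, E)        # microcanonical inverse temperature beta(E)
gamma = np.gradient(beta, E)    # its slope; a positive peak flags a
                                # first-order-like inflection signal
i = np.argmax(gamma)
print(f"transition signal near E = {E[i]:.3f} (beta = {beta[i]:.1f})")
```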

  17. Exstatix: Expandable Statistical Analysis System for the Macintosh. A Software Review.

    ERIC Educational Resources Information Center

    Ferrell, Barbara G.

    The Exstatix statistical analysis software package by K. C. Killion for use with Macintosh computers is evaluated. In evaluating the package, the framework developed by C. J. Ansorge et al. (1986) was used. This framework encompasses features such as transportability of files, compatibility of files with other Macintosh software, and ability to…

  18. Relationships between Association of Research Libraries (ARL) Statistics and Bibliometric Indicators: A Principal Components Analysis

    ERIC Educational Resources Information Center

    Hendrix, Dean

    2010-01-01

    This study analyzed 2005-2006 Web of Science bibliometric data from institutions belonging to the Association of Research Libraries (ARL) and corresponding ARL statistics to find any associations between indicators from the two data sets. Principal components analysis on 36 variables from 103 universities revealed obvious associations between…

  19. 1977-78 Cost Analysis for Florida Schools and Districts. Statistical Report. Series 79-01.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Public Schools.

    This statistical report describes some of the cost analysis information available from computer reports produced by the Florida Department of Education. It reproduces examples of Florida school and school district financial data that can be used by state, district, and school-level administrators as they analyze program costs and expenditures. The…

  1. A Statistical Analysis of College Biochemistry Textbooks in China: The Statuses on the Publishing and Usage

    ERIC Educational Resources Information Center

    Zhou, Ping; Wang, Qinwen; Yang, Jie; Li, Jingqiu; Guo, Junming; Gong, Zhaohui

    2015-01-01

    This study aimed to investigate the statuses on the publishing and usage of college biochemistry textbooks in China. A textbook database was constructed and the statistical analysis was adopted to evaluate the textbooks. The results showed that there were 945 (~57%) books for theory teaching, 379 (~23%) books for experiment teaching and 331 (~20%)…

  2. Indexing Combined with Statistical Deflation as a Tool for Analysis of Longitudinal Data.

    ERIC Educational Resources Information Center

    Babcock, Judith A.

    Indexing is a tool that can be used with longitudinal, quantitative data for analysis of relative changes and for comparisons of changes among items. For greater accuracy, raw financial data should be deflated into constant dollars prior to indexing. This paper demonstrates the procedures for indexing, statistical deflation, and the use of…

  3. Clustered Stomates in "Begonia": An Exercise in Data Collection & Statistical Analysis of Biological Space

    ERIC Educational Resources Information Center

    Lau, Joann M.; Korn, Robert W.

    2007-01-01

    In this article, the authors present a laboratory exercise in data collection and statistical analysis in biological space using clustered stomates on leaves of "Begonia" plants. The exercise can be done in middle school classes by students making their own slides and seeing imprints of cells, or at the high school level through collecting data of…

  4. Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination

    NASA Astrophysics Data System (ADS)

    Chiesa, D.; Previtali, E.; Sisti, M.

    2014-04-01

    In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided into energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with a Bayesian statistical analysis that includes the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the statistical model of the problem and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy, and the correlations between the groups are analyzed. Finally, the dependence of the results on the choice of prior distribution and on the group cross section data is investigated to confirm the reliability of the analysis.
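
    The flavour of such an analysis can be conveyed with a hand-rolled Metropolis sketch for a linear unfolding problem R = A*phi (activation rates from grouped cross sections and group fluxes); the matrix, noise level and flat positivity prior are hypothetical, and the paper used a dedicated hierarchical-MCMC package rather than this sampler.

```python
import numpy as np

rng = np.random.default_rng(11)
A = np.array([[0.8, 0.2], [0.3, 0.7], [0.5, 0.5]])   # grouped cross sections (toy)
phi_true = np.array([2.0, 5.0])                       # "true" group fluxes
R = A @ phi_true + rng.normal(0, 0.05, size=3)        # measured activation rates
sigma = 0.05

def log_post(phi):
    if np.any(phi <= 0):                              # flat prior on phi > 0
        return -np.inf
    resid = R - A @ phi
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

phi = np.array([1.0, 1.0])
lp = log_post(phi)
samples = []
for _ in range(20_000):
    prop = phi + rng.normal(0, 0.05, size=2)          # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:           # Metropolis acceptance
        phi, lp = prop, lp_prop
    samples.append(phi)
post = np.asarray(samples)[5_000:]                    # discard burn-in
print("posterior mean:", post.mean(axis=0), "+/-", post.std(axis=0))
```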

  5. Development of Statistically Parallel Tests by Analysis of Unique Item Variance.

    ERIC Educational Resources Information Center

    Ree, Malcolm James

    A method for developing statistically parallel tests based on the analysis of unique item variance was developed. A test population of 907 basic airmen trainees was required to estimate the angle at which an object in a photograph was viewed, selecting from eight possibilities. A FORTRAN program known as VARSEL was used to rank all the test items…

  6. The Impact of Training and Demographics in WIA Program Performance: A Statistical Analysis

    ERIC Educational Resources Information Center

    Moore, Richard W.; Gorman, Philip C.

    2009-01-01

    The Workforce Investment Act (WIA) measures participant labor market outcomes to drive program performance. This article uses statistical analysis to examine the relationship between participant characteristics and key outcome measures in one large California local WIA program. This study also measures the impact of different training…

  7. Data analysis and graphing in an introductory physics laboratory: spreadsheet versus statistics suite

    NASA Astrophysics Data System (ADS)

    Peterlin, Primož

    2010-07-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.

  8. A Statistical Analysis of Infrequent Events on Multiple-Choice Tests that Indicate Probable Cheating

    ERIC Educational Resources Information Center

    Sundermann, Michael J.

    2008-01-01

    A statistical analysis of multiple-choice answers is performed to identify anomalies that can be used as evidence of student cheating. The ratio of exact errors in common (EEIC: two students put the same wrong answer for a question) to differences (D: two students get different answers) was found to be a good indicator of cheating under a wide…
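
    The EEIC-to-D ratio is straightforward to compute for a pair of answer sheets; the answer key and responses below are invented, and the published thresholds for flagging are not reproduced.

```python
import numpy as np

key = np.array([0, 2, 1, 3, 0, 1, 2, 3, 1, 0])   # correct options (toy exam)
a = np.array([0, 2, 1, 1, 0, 1, 2, 0, 1, 0])     # student A's answers
b = np.array([0, 2, 1, 1, 0, 1, 2, 0, 3, 0])     # student B's answers

eeic = np.sum((a == b) & (a != key))    # exact errors in common
d = np.sum(a != b)                      # answer differences
print(f"EEIC = {eeic}, D = {d}, ratio = {eeic / d:.2f}")
```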

  9. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    ERIC Educational Resources Information Center

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  10. The incidence of cervical spondylosis decreases with aging in the elderly, and increases with aging in the young and adult population: a hospital-based clinical analysis

    PubMed Central

    Wang, Chuanling; Tian, Fuming; Zhou, Yingjun; He, Wenbo; Cai, Zhiyou

    2016-01-01

    Background and purpose: Cervical spondylosis is well accepted as a common degenerative change in the cervical spine, and compelling evidence has shown that its incidence increases with age. However, the detailed relationship between age and the incidence of cervical spondylosis remains obscure, and it is essential to examine this relationship with more clinical data. Methods: In the case-control study reported here, a retrospective clinical analysis of 1,276 cases of cervical spondylosis was conducted. We analyzed the general clinical data, the relationship between age and the incidence of cervical spondylosis, and the relationship between age-related risk factors and the incidence of cervical spondylosis. A chi-square test was used to analyze the associations between different variables. Statistical significance was defined as a P-value of less than 0.05. Results: The imaging examination demonstrated the most prominent characteristic features of cervical spondylosis: bulge or herniation at C3-C4, C4-C5, and C5-C6. The incidence of cervical spondylosis increased with aging before age 50 years and decreased with aging after age 50 years, especially in the elderly after 60 years of age. The occurrence rate of bulge or herniation at C3-C4, C4-C5, C5-C6, and C6-C7 likewise increased with aging before age 50 years and decreased after age 50 years, especially after 60 years. Moreover, the incidence of hyperosteogeny and spinal stenosis increased with aging before age 60 years and decreased after age 60 years, although there was no obvious change in calcification. The age-related risk factors, such as hypertension, hyperlipidemia, diabetes, cerebral infarct, cardiovascular diseases, smoking, and drinking, showed no relationship with the incidence of cervical spondylosis. Conclusion: A decreasing proportion of cervical spondylosis with aging occurs in the elderly, while the proportion of

  11. Age estimation in the living: Transition analysis on developing third molars.

    PubMed

    Tangmose, Sara; Thevissen, Patrick; Lynnerup, Niels; Willems, Guy; Boldsen, Jesper

    2015-12-01

    A radiographic assessment of third molar development is essential for differentiating between juveniles and adolescents in forensic age estimations. As the developmental stages of third molars are highly correlated, age estimates based on a combination of a full set of third molar scores are statistically complicated. Transition analysis (TA) is a statistical method developed for estimating age at death in skeletons, which combines several correlated developmental traits into one age estimate including a 95% prediction interval. The aim of this study was to evaluate the performance of TA in the living on a full set of third molar scores. A cross sectional sample of 854 panoramic radiographs, homogenously distributed by sex and age (15.0-24.0 years), were randomly split in two; a reference sample for obtaining age estimates including a 95% prediction interval according to TA; and a validation sample to test the age estimates against actual age. The mean inaccuracy of the age estimates was 1.82 years (±1.35) in males and 1.81 years (±1.44) in females. The mean bias was 0.55 years (±2.20) in males and 0.31 years (±2.30) in females. Of the actual ages, 93.7% of the males and 95.9% of the females (validation sample) fell within the 95% prediction interval. Moreover, at a sensitivity and specificity of 0.824 and 0.937 in males and 0.814 and 0.827 in females, TA performs well in differentiating between being a minor as opposed to an adult. Although accuracy does not outperform other methods, TA provides unbiased age estimates which minimize the risk of wrongly estimating minors as adults. Furthermore, when corrected ad hoc, TA produces appropriate prediction intervals. As TA allows expansion with additional traits, i.e. stages of development of the left hand-wrist and the clavicle, it has a great potential for future more accurate and reproducible age estimates, including an estimated probability of having attained the legal age limit of 18 years.

  12. Demographic analysis from summaries of an age-structured population

    USGS Publications Warehouse

    Link, W.A.; Royle, J. Andrew; Hatfield, J.S.

    2003-01-01

    Demographic analyses of age-structured populations typically rely on life history data for individuals, or when individual animals are not identified, on information about the numbers of individuals in each age class through time. While it is usually difficult to determine the age class of a randomly encountered individual, it is often the case that the individual can be readily and reliably assigned to one of a set of age classes. For example, it is often possible to distinguish first-year from older birds. In such cases, the population age structure can be regarded as a latent variable governed by a process prior, and the data as summaries of this latent structure. In this article, we consider the problem of uncovering the latent structure and estimating process parameters from summaries of age class information. We present a demographic analysis for the critically endangered migratory population of whooping cranes (Grus americana), based only on counts of first-year birds and of older birds. We estimate age and year-specific survival rates. We address the controversial issue of whether management action on the breeding grounds has influenced recruitment, relating recruitment rates to the number of seventh-year and older birds, and examining the pattern of variation through time in this rate.

  13. The age-related posterior-anterior shift as revealed by voxelwise analysis of functional brain networks

    PubMed Central

    McCarthy, Paul; Benuskova, Lubica; Franz, Elizabeth A.

    2014-01-01

    The posterior-anterior shift in aging (PASA) is a commonly observed phenomenon in functional neuroimaging studies of aging, characterized by age-related reductions in occipital activity alongside increases in frontal activity. In this work we investigate whether the PASA is also manifested in functional brain network measures such as degree, clustering coefficient, path length, and local efficiency. We performed statistical analysis of functional networks derived from an fMRI dataset containing data from healthy young, healthy aged, and aged individuals with very mild to mild Alzheimer's disease (AD). Analysis of both task-based and resting-state functional network properties indicates that the PASA can also be characterized in terms of the modulation of functional network properties, and that the onset of AD appears to accentuate this modulation. We also explore the effect of spatial normalization upon the results of our analysis. PMID:25426065

  14. HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.

    PubMed

    Song, Chi; Tseng, George C

    2014-01-01

    Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculation and simulation show better performance of rOP compared to classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
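
    A toy sketch of the rOP statistic: under the null, independent uniform p-values make the rth order statistic Beta(r, K - r + 1) distributed, which supplies the reference distribution. The choice of r, the effect model and the cutoff below are illustrative; the paper's data-driven estimation of r and its one-sided correction are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
K, n_genes, r = 8, 1000, 6            # "DE in the majority of studies" -> r > K/2

p = rng.uniform(size=(n_genes, K))                   # null p-values
p[:50] = rng.beta(0.2, 3.0, size=(50, K))            # 50 genes DE in most studies
rop = np.sort(p, axis=1)[:, r - 1]                   # r-th ordered p-value per gene
p_meta = stats.beta.cdf(rop, r, K - r + 1)           # null: Beta(r, K - r + 1)
print("detections at a Bonferroni cutoff:", int(np.sum(p_meta < 0.05 / n_genes)))
```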

  15. A roadmap for the genetic analysis of renal aging.

    PubMed

    Noordmans, Gerda A; Hillebrands, Jan-Luuk; van Goor, Harry; Korstanje, Ron

    2015-10-01

    Several studies show evidence for the genetic basis of renal disease, which renders some individuals more prone than others to accelerated renal aging. Studying the genetics of renal aging can help us to identify genes involved in this process and to unravel the underlying pathways. First, this opinion article gives an overview of the phenotypes that can be observed in age-related kidney disease. Accurate phenotyping is essential in performing genetic analysis; for kidney aging, this could include both functional and structural changes. Subsequently, this article reviews the studies that report on candidate genes associated with renal aging in humans and mice. Several loci or candidate genes have been found to be associated with kidney disease, but identification of the specific genetic variants involved has proven to be difficult. CUBN, UMOD, and SHROOM3 were identified by human GWAS as being associated with albuminuria, kidney function, and chronic kidney disease (CKD). These are promising examples of genes that could be involved in renal aging, and they were further mechanistically evaluated in animal models. Finally, we provide approaches for performing genetic analysis. We should leverage the power of mouse models, as testing in humans is limited. Mouse and other animal models can be used to explain the underlying biological mechanisms of genes and loci identified by human GWAS. Furthermore, mouse models can be used to identify genetic variants associated with age-associated histological changes, of which Far2, Wisp2, and Esrrg are examples. A new outbred mouse population with high genetic diversity will facilitate the identification of genes associated with renal aging by enabling high-resolution genetic mapping while also allowing the control of environmental factors, and by enabling access to renal tissues at specific time points for histology, proteomics, and gene expression.

  16. Meta-analysis of correlated traits via summary statistics from GWASs with an application in hypertension.

    PubMed

    Zhu, Xiaofeng; Feng, Tao; Tayo, Bamidele O; Liang, Jingjing; Young, J Hunter; Franceschini, Nora; Smith, Jennifer A; Yanek, Lisa R; Sun, Yan V; Edwards, Todd L; Chen, Wei; Nalls, Mike; Fox, Ervin; Sale, Michele; Bottinger, Erwin; Rotimi, Charles; Liu, Yongmei; McKnight, Barbara; Liu, Kiang; Arnett, Donna K; Chakravati, Aravinda; Cooper, Richard S; Redline, Susan

    2015-01-01

    Genome-wide association studies (GWASs) have identified many genetic variants underlying complex traits. Many detected genetic loci harbor variants that associate with multiple, even distinct, traits. Most current analysis approaches focus on single traits, even though the final results from multiple traits are evaluated together. Such approaches miss the opportunity to systemically integrate the phenome-wide data available for genetic association analysis. In this study, we propose a general approach that can integrate association evidence from summary statistics of multiple traits, whether correlated, independent, continuous, or binary, which might come from the same or different studies. We allow for trait heterogeneity effects. Population structure and cryptic relatedness can also be controlled. Our simulations suggest that the proposed method has improved statistical power over single-trait analysis in most of the cases we studied. We applied our method to the Continental Origins and Genetic Epidemiology Network (COGENT) African ancestry samples for three blood pressure traits and identified four loci (CHIC2, HOXA-EVX1, IGFBP1/IGFBP3, and CDH17; p < 5.0 × 10^(-8)) associated with hypertension-related traits that were missed by a single-trait analysis in the original report. Six additional loci with suggestive association evidence (p < 5.0 × 10^(-7)) were also observed, including CACNA1D and WNT3. Our study strongly suggests that analyzing multiple phenotypes can improve statistical power and that such analysis can be executed with the summary statistics from GWASs. Our method also provides a way to study cross-phenotype (CP) associations by using summary statistics from GWASs of multiple phenotypes. PMID:25500260
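
    One simple member of this family of tests, a fixed-effects combination of correlated z-scores (not the paper's full method, which also allows trait heterogeneity), can be sketched as follows; the z-scores and trait correlation matrix are hypothetical.

```python
import numpy as np
from scipy import stats

z = np.array([2.1, 1.8, 2.4])                 # per-trait z-scores for one SNP
Sigma = np.array([[1.0, 0.4, 0.3],
                  [0.4, 1.0, 0.5],
                  [0.3, 0.5, 1.0]])           # trait correlation (hypothetical)

# T = (w' z) / sqrt(w' Sigma w) with w = Sigma^{-1} 1 is N(0, 1) under the
# joint null for correlated traits.
w = np.linalg.solve(Sigma, np.ones(3))
T = w @ z / np.sqrt(w @ Sigma @ w)            # note w' Sigma w = 1' Sigma^{-1} 1
p = 2 * stats.norm.sf(abs(T))
print(f"combined z = {T:.2f}, p = {p:.2e}")
```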

  17. Quantitative shape analysis with weighted covariance estimates for increased statistical efficiency

    PubMed Central

    2013-01-01

    Background: The introduction and statistical formalisation of landmark-based methods for analysing biological shape has made a major impact on comparative morphometric analyses. However, a satisfactory solution for including information from 2D/3D shapes represented by ‘semi-landmarks’ alongside well-defined landmarks in the analyses is still missing. Also, there has not been an integration of a statistical treatment of measurement error in the current approaches. Results: We propose a procedure based upon the description of landmarks with measurement covariance, which extends statistical linear modelling processes to semi-landmarks for further analysis. Our formulation is based upon a self-consistent approach to the construction of likelihood-based parameter estimation and includes corrections for parameter bias induced by the degrees of freedom within the linear model. The method has been implemented and tested on measurements from 2D fly wing, 2D mouse mandible and 3D mouse skull data. We use these data to explore possible advantages and disadvantages over the use of standard Procrustes/PCA analysis via a combination of Monte-Carlo studies and quantitative statistical tests. In the process we show how appropriate weighting provides not only greater stability but also more efficient use of the available landmark data. The set of new landmarks generated in our procedure (‘ghost points’) can then be used in any further downstream statistical analysis. Conclusions: Our approach provides a consistent way of including different forms of landmarks in an analysis and reduces instabilities due to poorly defined points. Our results suggest that the method has the potential to be utilised for the analysis of 2D/3D data, and in particular, for the inclusion of information from surfaces represented by multiple landmark points. PMID:23548043

  18. Meta-analysis of Correlated Traits via Summary Statistics from GWASs with an Application in Hypertension

    PubMed Central

    Zhu, Xiaofeng; Feng, Tao; Tayo, Bamidele O.; Liang, Jingjing; Young, J. Hunter; Franceschini, Nora; Smith, Jennifer A.; Yanek, Lisa R.; Sun, Yan V.; Edwards, Todd L.; Chen, Wei; Nalls, Mike; Fox, Ervin; Sale, Michele; Bottinger, Erwin; Rotimi, Charles; Liu, Yongmei; McKnight, Barbara; Liu, Kiang; Arnett, Donna K.; Chakravati, Aravinda; Cooper, Richard S.; Redline, Susan

    2015-01-01

    Genome-wide association studies (GWASs) have identified many genetic variants underlying complex traits. Many detected genetic loci harbor variants that associate with multiple—even distinct—traits. Most current analysis approaches focus on single traits, even though the final results from multiple traits are evaluated together. Such approaches miss the opportunity to systemically integrate the phenome-wide data available for genetic association analysis. In this study, we propose a general approach that can integrate association evidence from summary statistics of multiple traits, either correlated, independent, continuous, or binary traits, which might come from the same or different studies. We allow for trait heterogeneity effects. Population structure and cryptic relatedness can also be controlled. Our simulations suggest that the proposed method has improved statistical power over single-trait analysis in most of the cases we studied. We applied our method to the Continental Origins and Genetic Epidemiology Network (COGENT) African ancestry samples for three blood pressure traits and identified four loci (CHIC2, HOXA-EVX1, IGFBP1/IGFBP3, and CDH17; p < 5.0 × 10−8) associated with hypertension-related traits that were missed by a single-trait analysis in the original report. Six additional loci with suggestive association evidence (p < 5.0 × 10−7) were also observed, including CACNA1D and WNT3. Our study strongly suggests that analyzing multiple phenotypes can improve statistical power and that such analysis can be executed with the summary statistics from GWASs. Our method also provides a way to study a cross phenotype (CP) association by using summary statistics from GWASs of multiple phenotypes. PMID:25500260

  19. Age at menarche and risk of ovarian cancer: a meta-analysis of epidemiological studies.

    PubMed

    Gong, Ting-Ting; Wu, Qi-Jun; Vogtmann, Emily; Lin, Bei; Wang, Yong-Lai

    2013-06-15

    Epidemiological studies have reported inconsistent associations between menarcheal age and ovarian cancer risk, and to our knowledge no meta-analysis of this association has been reported. Relevant published studies of menarcheal age and ovarian cancer were identified using MEDLINE, EMBASE and Web of Science through the end of April 2012. Two authors (T-T.G. and Q-J.W.) independently assessed eligibility and extracted data. We pooled the relative risks (RRs) from individual studies using a random-effects model and performed heterogeneity and publication bias analyses. A total of 27 observational studies, consisting of 22 case-control and five cohort studies, were included in our analysis. In a pooled analysis of all studies, a statistically significant inverse association was observed between menarcheal age (for the oldest compared to the youngest category) and ovarian cancer risk (RR = 0.85; 95% confidence interval [CI] = 0.75-0.97). The pooled RRs of ovarian cancer for the oldest versus the youngest categories of menarcheal age in prospective and case-control studies were 0.89 (95% CI = 0.76-1.03) and 0.84 (95% CI = 0.70-0.99), respectively. Inverse associations between menarcheal age and ovarian cancer risk were observed in most subgroups; however, the significant association was restricted to invasive and borderline serous ovarian cancer. In conclusion, the findings from this meta-analysis support an inverse association between menarcheal age and the risk of ovarian cancer. More large studies are warranted to stratify these results by cancer grading and histotype.
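
    Random-effects pooling of relative risks of this kind is commonly done with the DerSimonian-Laird estimator; a sketch follows, with invented study-level numbers rather than the meta-analysis's actual data.

```python
import numpy as np

rr = np.array([0.78, 0.92, 0.85, 1.02, 0.74])        # per-study RRs (toy)
se = np.array([0.10, 0.08, 0.12, 0.15, 0.09])        # SEs of log(RR)
y = np.log(rr)

w = 1 / se**2                                        # fixed-effect weights
q = np.sum(w * (y - np.sum(w * y) / w.sum())**2)     # Cochran's Q
df = len(y) - 1
tau2 = max(0.0, (q - df) / (w.sum() - np.sum(w**2) / w.sum()))  # DL tau^2
w_re = 1 / (se**2 + tau2)                            # random-effects weights
mu = np.sum(w_re * y) / w_re.sum()
se_mu = np.sqrt(1 / w_re.sum())
lo, hi = np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu)
print(f"pooled RR = {np.exp(mu):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```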

  1. Operant Analysis of Intellectual Behavior in Old Age

    ERIC Educational Resources Information Center

    Labouvie-Vief, G.; And Others

    1974-01-01

    Proposes an operant framework for the analysis of environment-intelligence interactions in old age and calls for an implementation of research aimed at examining the range of modifiability of intellectual proficiency in the elderly. Intellectual decrement is interpreted to reflect the lack of supportive environmental contingencies. (Author/SDH)

  2. Analysis of spontaneous MEG activity in mild cognitive impairment and Alzheimer's disease using spectral entropies and statistical complexity measures

    NASA Astrophysics Data System (ADS)

    Bruña, Ricardo; Poza, Jesús; Gómez, Carlos; García, María; Fernández, Alberto; Hornero, Roberto

    2012-06-01

    Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the López-Ruiz, Mancini and Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p < 0.01). In addition, statistically significant differences between MCI subjects and controls were achieved by ED and LMC (p < 0.05). In order to assess the diagnostic ability of the parameters, a linear discriminant analysis with a leave-one-out cross-validation procedure was applied. The accuracies reached 83.9% and 65.9% to discriminate AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
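
    These parameters are simple functionals of a normalized power spectrum; a sketch for a toy signal follows (the q value, normalization and signal are illustrative, and a real MEG pipeline would add windowing and artifact handling).

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(2048) / 256.0
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)   # toy signal

psd = np.abs(np.fft.rfft(x)) ** 2
p = psd / psd.sum()                                   # normalized power spectrum
n = p.size

shannon = -np.sum(p * np.log(p + 1e-30)) / np.log(n)  # normalized Shannon entropy
q = 2.0
tsallis = (1 - np.sum(p ** q)) / (q - 1)              # Tsallis entropy, q = 2
renyi = np.log(np.sum(p ** q)) / (1 - q)              # Renyi entropy, q = 2
diseq = np.sum((p - 1 / n) ** 2)                      # Euclidean disequilibrium
lmc = shannon * diseq                                 # LMC-style complexity
print(f"H = {shannon:.3f}, Tsallis = {tsallis:.3f}, "
      f"Renyi = {renyi:.3f}, C = {lmc:.5f}")
```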

  3. A Statistical Analysis of SEEDS and Other High-contrast Exoplanet Surveys: Massive Planets or Low-mass Brown Dwarfs?

    NASA Astrophysics Data System (ADS)

    Brandt, Timothy D.; McElwain, Michael W.; Turner, Edwin L.; Mede, Kyle; Spiegel, David S.; Kuzuhara, Masayuki; Schlieder, Joshua E.; Wisniewski, John P.; Abe, L.; Biller, B.; Brandner, W.; Carson, J.; Currie, T.; Egner, S.; Feldt, M.; Golota, T.; Goto, M.; Grady, C. A.; Guyon, O.; Hashimoto, J.; Hayano, Y.; Hayashi, M.; Hayashi, S.; Henning, T.; Hodapp, K. W.; Inutsuka, S.; Ishii, M.; Iye, M.; Janson, M.; Kandori, R.; Knapp, G. R.; Kudo, T.; Kusakabe, N.; Kwon, J.; Matsuo, T.; Miyama, S.; Morino, J.-I.; Moro-Martín, A.; Nishimura, T.; Pyo, T.-S.; Serabyn, E.; Suto, H.; Suzuki, R.; Takami, M.; Takato, N.; Terada, H.; Thalmann, C.; Tomono, D.; Watanabe, M.; Yamada, T.; Takami, H.; Usuda, T.; Tamura, M.

    2014-10-01

    We conduct a statistical analysis of a combined sample of direct imaging data, totalling nearly 250 stars. The stars cover a wide range of ages and spectral types, and include five detections (κ And b, two ~60 M_J brown dwarf companions in the Pleiades, PZ Tel B, and CD-35 2722B). For some analyses we add a currently unpublished set of SEEDS observations, including the detections GJ 504b and GJ 758B. We conduct a uniform, Bayesian analysis of all stellar ages using both membership in a kinematic moving group and activity/rotation age indicators. We then present a new statistical method for computing the likelihood of a substellar distribution function. By performing most of the integrals analytically, we achieve an enormous speedup over brute-force Monte Carlo. We use this method to place upper limits on the maximum semimajor axis of the distribution function derived from radial-velocity planets, finding model-dependent values of ~30-100 AU. Finally, we model the entire substellar sample, from massive brown dwarfs to a theoretically motivated cutoff at ~5 M_J, with a single power-law distribution. We find that p(M, a) ∝ M^(-0.65 ± 0.60) a^(-0.85 ± 0.39) (1σ errors) provides an adequate fit to our data, with 1.0%-3.1% (68% confidence) of stars hosting 5-70 M_J companions between 10 and 100 AU. This suggests that many of the directly imaged exoplanets known, including most (if not all) of the low-mass companions in our sample, formed by fragmentation in a cloud or disk, and represent the low-mass tail of the brown dwarfs. Based on data collected at Subaru Telescope, which is operated by the National Astronomical Observatory of Japan.

  4. A statistical analysis of SEEDS and other high-contrast exoplanet surveys: massive planets or low-mass brown dwarfs?

    SciTech Connect

    Brandt, Timothy D.; Spiegel, David S.; McElwain, Michael W.; Grady, C. A.; Turner, Edwin L.; Mede, Kyle; Kuzuhara, Masayuki; Schlieder, Joshua E.; Brandner, W.; Feldt, M.; Wisniewski, John P.; Abe, L.; Biller, B.; Carson, J.; Currie, T.; Egner, S.; Golota, T.; Guyon, O.; Goto, M.; Hashimoto, J.; and others

    2014-10-20

    We conduct a statistical analysis of a combined sample of direct imaging data, totalling nearly 250 stars. The stars cover a wide range of ages and spectral types, and include five detections (κ And b, two ∼60 M_J brown dwarf companions in the Pleiades, PZ Tel B, and CD-35 2722B). For some analyses we add a currently unpublished set of SEEDS observations, including the detections GJ 504b and GJ 758B. We conduct a uniform, Bayesian analysis of all stellar ages using both membership in a kinematic moving group and activity/rotation age indicators. We then present a new statistical method for computing the likelihood of a substellar distribution function. By performing most of the integrals analytically, we achieve an enormous speedup over brute-force Monte Carlo. We use this method to place upper limits on the maximum semimajor axis of the distribution function derived from radial-velocity planets, finding model-dependent values of ∼30-100 AU. Finally, we model the entire substellar sample, from massive brown dwarfs to a theoretically motivated cutoff at ∼5 M_J, with a single power-law distribution. We find that p(M, a) ∝ M^(−0.65 ± 0.60) a^(−0.85 ± 0.39) (1σ errors) provides an adequate fit to our data, with 1.0%-3.1% (68% confidence) of stars hosting 5-70 M_J companions between 10 and 100 AU. This suggests that many of the directly imaged exoplanets known, including most (if not all) of the low-mass companions in our sample, formed by fragmentation in a cloud or disk, and represent the low-mass tail of the brown dwarfs.

  5. SEM/EDX spectrum imaging and statistical analysis of a metal/ceramic braze

    SciTech Connect

    KOTULA,PAUL G.; KEENAN,MICHAEL R.; ANDERSON,IAN M.

    2000-01-25

    Energy dispersive x-ray (EDX) spectrum imaging has been performed in a scanning electron microscope (SEM) on a metal/ceramic braze to characterize the elemental distribution near the interface. Statistical methods were utilized to extract the relevant information (i.e., chemical phases and their distributions) from the spectrum image data set in a robust and unbiased way. The raw spectrum image was over 15 Mbytes (7500 spectra) while the statistical analysis resulted in five spectra and five images which describe the phases resolved above the noise level and their distribution in the microstructure.
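
    The abstract does not name the factorization used; one common way to reduce a spectrum image to a handful of component spectra and matching abundance maps is non-negative matrix factorization. A hypothetical sketch with scikit-learn, using invented array shapes and synthetic counts:

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical spectrum image: 75 x 100 pixels, 1024 energy channels
# (shapes chosen for illustration; the original data set had 7500 spectra).
ny, nx, nch = 75, 100, 1024
rng = np.random.default_rng(1)
cube = rng.poisson(5.0, size=(ny, nx, nch)).astype(float)  # stand-in for EDX counts

# Unfold to (pixels x channels) and factor into 5 components, mirroring the
# "five spectra and five images" description in the abstract.
X = cube.reshape(ny * nx, nch)
model = NMF(n_components=5, init="nndsvda", max_iter=500)
abundances = model.fit_transform(X)          # (pixels x 5) component weights
spectra = model.components_                  # (5 x channels) component spectra

maps = abundances.reshape(ny, nx, 5)         # five abundance images
```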

  6. A mesoscale gravity wave event observed during CCOPE. I - Multiscale statistical analysis of wave characteristics

    NASA Technical Reports Server (NTRS)

    Koch, Steven E.; Golus, Robert E.

    1988-01-01

    This paper presents a statistical analysis of the characteristics of the wavelike activity that occurred over the north-central United States on July 11-12, 1981, using data from the Cooperative Convective Precipitation Experiment in Montana. In particular, two distinct wave episodes of about 8-h duration within a longer (33 h) period of wave activity were studied in detail. It is demonstrated that the observed phenomena display features consistent with those of mesoscale gravity waves. The principles of statistical methods used to detect and track mesoscale gravity waves are discussed together with their limitations.

  7. Statistical models for the analysis and design of digital polymerase chain (dPCR) experiments

    USGS Publications Warehouse

    Dorazio, Robert; Hunter, Margaret

    2015-01-01

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log-log link function and an offset that depends on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, such as selecting the sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
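
    In the simplest, intercept-only case, the GLM described above reduces to the classical closed-form dPCR estimator: with k positive partitions out of n, each of volume v, the complementary log-log model gives lambda = -ln(1 - k/n) / v. A minimal sketch of that special case (the function name and the example numbers are illustrative, not from the paper):

```python
import numpy as np

def dpcr_concentration(k_positive, n_partitions, volume):
    """Estimate nucleic acid concentration from dPCR counts.

    Model: each partition is positive with probability 1 - exp(-lam * v),
    i.e. a binomial GLM with complementary log-log link and offset log(v).
    The intercept-only fit has the closed form lam = -ln(1 - k/n) / v.
    """
    phat = k_positive / n_partitions
    lam = -np.log1p(-phat) / volume
    # Delta-method standard error of lam
    se = np.sqrt(phat / (n_partitions * (1.0 - phat))) / volume
    return lam, se

# Example: 4,500 of 20,000 partitions positive, 0.85 nL partition volume
lam, se = dpcr_concentration(4500, 20000, 0.85e-3)  # volume in microliters
print(f"{lam:.1f} +/- {se:.1f} copies per microliter")
```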

  8. Exploring the Multi-Scale Statistical Analysis of Ionospheric Scintillation via Wavelets and Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Piersanti, Mirko; Materassi, Massimo; Spogli, Luca; Cicone, Antonio; Alberti, Tommaso

    2016-04-01

    Highly irregular fluctuations in the power of trans-ionospheric GNSS signals, namely radio power scintillation, are, at least to a large extent, the effect of ionospheric plasma turbulence, a by-product of the non-linear and non-stationary evolution of the plasma fields defining the Earth's upper atmosphere. One could expect the characteristics of ionospheric turbulence (inter-scale coupling, local randomness and high time variability) to be inherited by the scintillation of radio signals crossing the medium. On this basis, remote sensing of local features of the turbulent plasma by studying radio scintillation appears feasible. The dependence of the statistical properties of the medium's fluctuations on space- and time-scale is the distinctive character of intermittent turbulent media. In this paper, a multi-scale statistical analysis of samples of GPS radio scintillation is presented: the idea is that assessing how the statistics of signal fluctuations vary with time scale under different helio-geophysical conditions will help in understanding the corresponding multi-scale statistics of the turbulent medium causing the scintillation. In particular, two techniques are tested as multi-scale decomposition schemes for the signals: discrete wavelet analysis and Empirical Mode Decomposition. We discuss the results of the two analyses side by side, trying to highlight the benefits and limits of each scheme, also under suitably different helio-geophysical conditions.
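
    For the wavelet half of such an analysis, a minimal sketch is shown below: a multi-level discrete wavelet decomposition with PyWavelets, followed by per-scale variance and excess kurtosis of the detail coefficients, a common scale-by-scale intermittency diagnostic. The wavelet family, level count, and synthetic input are assumptions, not the authors' settings.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis

rng = np.random.default_rng(2)
signal = rng.standard_normal(4096)   # stand-in for detrended GPS power samples

# Multi-level discrete wavelet decomposition (Daubechies-4, 6 levels)
coeffs = pywt.wavedec(signal, "db4", level=6)

# Scale-by-scale statistics of the detail bands (ordered coarsest to finest):
# growing excess kurtosis toward small scales is a standard intermittency signature.
for level, detail in enumerate(coeffs[1:], start=1):
    print(f"band {level} (coarse->fine): var={np.var(detail):.3f}, "
          f"excess kurtosis={kurtosis(detail):.3f}")
```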

  9. An Application of Multivariate Statistical Analysis for Query-Driven Visualization

    SciTech Connect

    Gosink, Luke J.; Garth, Christoph; Anderson, John C.; Bethel, E. Wes; Joy, Kenneth I.

    2010-03-01

    Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.

  10. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs (SCDs) that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments, and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between-case variance to total variance (between-case plus within-case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, with syntax included in an appendix. This syntax covers how to read data, compute fixed and random effects average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and perform various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs.
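
    The article's own tooling is SPSS macros and R; purely for illustration, the sketch below shows the fixed- and random-effects averaging step (here via the DerSimonian-Laird estimator, a standard choice not necessarily identical to the article's) with hypothetical d estimates and sampling variances.

```python
import numpy as np

def meta_analysis(d, var):
    """Fixed- and random-effects (DerSimonian-Laird) averages of effect sizes."""
    d, var = np.asarray(d, float), np.asarray(var, float)
    w = 1.0 / var                               # fixed-effect (inverse-variance) weights
    d_fixed = np.sum(w * d) / np.sum(w)
    # Heterogeneity: Q statistic and DerSimonian-Laird tau^2
    q = np.sum(w * (d - d_fixed) ** 2)
    df = len(d) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_re = 1.0 / (var + tau2)                   # random-effects weights
    d_random = np.sum(w_re * d) / np.sum(w_re)
    se_random = np.sqrt(1.0 / np.sum(w_re))
    return d_fixed, d_random, se_random, tau2

# Hypothetical single-case-design d estimates and their sampling variances
d = [0.8, 1.2, 0.5, 1.6, 0.9]
v = [0.10, 0.15, 0.08, 0.20, 0.12]
print(meta_analysis(d, v))
```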

  11. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    SciTech Connect

    Belianinov, Alex; Panchapakesan, G.; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena Safa; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.

    2014-12-02

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
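
    The abstract does not specify the multivariate algorithms; a typical workflow for such data is principal component analysis of the per-pixel tunneling spectra followed by clustering in component space. A hypothetical sketch with scikit-learn, with invented array shapes and synthetic data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical current-imaging tunneling-spectroscopy cube:
# 64 x 64 pixels, 128 bias points per I-V (or dI/dV) curve.
ny, nx, nbias = 64, 64, 128
rng = np.random.default_rng(3)
cube = rng.standard_normal((ny, nx, nbias))

spectra = cube.reshape(ny * nx, nbias)

# Multivariate statistical analysis: project spectra onto leading components...
scores = PCA(n_components=8).fit_transform(spectra)

# ...then cluster in component space to separate regions of dissimilar
# electronic behavior (e.g., Te-rich vs Se-rich environments).
labels = KMeans(n_clusters=4, n_init=10).fit_predict(scores)
label_map = labels.reshape(ny, nx)   # spatial localization of each cluster
```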

  12. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  13. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena S.; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.

    2014-12-01

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  14. On the Interpretation of Running Trends as Summary Statistics for Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Vigo, Isabel M.; Trottini, Mario; Belda, Santiago

    2016-04-01

    In recent years, running trends analysis (RTA) has been widely used in applied climate research as a summary statistic for time series analysis. RTA can be a useful descriptive tool, but despite its widespread use in applied research, precisely what it reveals about the underlying time series, and hence how it should be interpreted, remains unclear. This work contributes to that interpretation in two ways: 1) an explicit formula is obtained for the set of time series with a given series of running trends, making it possible to show that running trends, alone, perform very poorly as summary statistics for time series analysis; and 2) an equivalence is established between RTA and the estimation of a (possibly nonlinear) trend component of the underlying time series using a weighted moving average filter. This equivalence provides solid ground for the implementation and interpretation/validation of RTA.
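
    The equivalence in point 2 is easy to see in code: the OLS slope over a sliding window is a fixed linear combination of the window's values, i.e., a weighted moving-average filter applied to the series. A minimal sketch (the window length and synthetic data are illustrative):

```python
import numpy as np

def running_trends(y, window):
    """OLS slope of y against time in each sliding window of given length."""
    t = np.arange(window, dtype=float)
    t -= t.mean()
    # The OLS slope is a fixed linear filter: a weighted sum of the window's
    # values, which is exactly the weighted-moving-average reading of RTA.
    weights = t / np.sum(t ** 2)
    return np.convolve(y, weights[::-1], mode="valid")

rng = np.random.default_rng(4)
series = np.cumsum(rng.standard_normal(200))   # hypothetical climate series
trends = running_trends(series, window=30)     # one slope per 30-step window
```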

  15. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  16. How complex climate networks complement eigen techniques for the statistical analysis of climatological data

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Petrova, Irina; Löw, Alexander; Marwan, Norbert; Kurths, Jürgen

    2015-04-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP) / maximum covariance analysis have been frequently used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network (CN) analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships as well as conceptual differences between both eigen and network approaches are derived and illustrated using global precipitation, evaporation and surface air temperature data sets. These results allow us to pinpoint that CN analysis can complement classical eigen techniques and provide additional information on the higher-order structure of statistical interrelationships in climatological data. Hence, CNs are a valuable supplement to the statistical toolbox of the climatologist, particularly for making sense out of very large data sets such as those generated by satellite observations and climate model intercomparison exercises.

  17. How complex climate networks complement eigen techniques for the statistical analysis of climatological data

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Petrova, Irina; Loew, Alexander; Marwan, Norbert; Kurths, Jürgen

    2015-11-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP)/maximum covariance analysis have been frequently used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network (CN) analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships as well as conceptual differences between both eigen and network approaches are derived and illustrated using global precipitation, evaporation and surface air temperature data sets. These results allow us to pinpoint that CN analysis can complement classical eigen techniques and provide additional information on the higher-order structure of statistical interrelationships in climatological data. Hence, CNs are a valuable supplement to the statistical toolbox of the climatologist, particularly for making sense out of very large data sets such as those generated by satellite observations and climate model intercomparison exercises.
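
    Both routes start from the same similarity matrix, so the contrast is easy to sketch: eigendecomposition of the correlation matrix yields EOF patterns, while thresholding the same matrix yields a network whose degree field is a simple higher-order connectivity measure. A hypothetical illustration (the field dimensions and the 0.5 threshold are arbitrary choices, not from the papers):

```python
import numpy as np

# Hypothetical anomaly field: 500 time steps at 300 grid points
rng = np.random.default_rng(5)
field = rng.standard_normal((500, 300))

# Shared starting point: the spatial correlation matrix
corr = np.corrcoef(field, rowvar=False)          # (points x points)

# Eigen route: EOFs are the eigenvectors of the correlation matrix
evals, eofs = np.linalg.eigh(corr)               # eigenvalues ascending
leading_eof = eofs[:, -1]                        # pattern of largest variance

# Network route: threshold the same matrix into an adjacency matrix,
# then use e.g. the degree field as a connectivity measure
adjacency = (np.abs(corr) > 0.5) & ~np.eye(300, dtype=bool)
degree = adjacency.sum(axis=1)
```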

  18. Statistical Analysis of Current Sheets in Three-dimensional Magnetohydrodynamic Turbulence

    NASA Astrophysics Data System (ADS)

    Zhdankin, Vladimir; Uzdensky, Dmitri A.; Perez, Jean C.; Boldyrev, Stanislav

    2013-07-01

    We develop a framework for studying the statistical properties of current sheets in numerical simulations of magnetohydrodynamic (MHD) turbulence with a strong guide field, as modeled by reduced MHD. We describe an algorithm that identifies current sheets in a simulation snapshot and then determines their geometrical properties (including length, width, and thickness) and intensities (peak current density and total energy dissipation rate). We then apply this procedure to simulations of reduced MHD and perform a statistical analysis on the obtained population of current sheets. We evaluate the role of reconnection by separately studying the populations of current sheets which contain magnetic X-points and those which do not. We find that the statistical properties of the two populations are different in general. We compare the scaling of these properties to phenomenological predictions obtained for the inertial range of MHD turbulence. Finally, we test whether the reconnecting current sheets are consistent with the Sweet-Parker model.
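
    The identification step described above can be sketched as connected-component labeling of a thresholded current-density field. The sketch below uses scipy.ndimage on a synthetic 2D snapshot, with an illustrative 3-rms threshold rather than the authors' criterion, and reports only crude per-sheet statistics (area, peak current density, summed j squared as a dissipation proxy).

```python
import numpy as np
from scipy import ndimage

# Hypothetical 2D snapshot of current density j(x, y) from a reduced-MHD run
rng = np.random.default_rng(6)
j = ndimage.gaussian_filter(rng.standard_normal((512, 512)), sigma=3)

# Identify candidate sheets: connected regions where |j| exceeds a multiple
# of the rms current density (the threshold choice is illustrative).
threshold = 3.0 * np.sqrt(np.mean(j ** 2))
mask = np.abs(j) > threshold
labels, n_sheets = ndimage.label(mask)

# Per-sheet statistics: area, peak current density, total "dissipation" ~ j^2
for region in range(1, n_sheets + 1):
    cells = labels == region
    print(region, cells.sum(), np.abs(j[cells]).max(), np.sum(j[cells] ** 2))
```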

  19. Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech

    NASA Astrophysics Data System (ADS)

    Přibil, J.; Přibilová, A.

    2009-01-01

    The paper addresses how microintonation and spectral properties are reflected in male and female acted emotional speech. The microintonation component of speech melody is analyzed with regard to its spectral and statistical parameters. According to psychological research on emotional speech, different emotions are accompanied by different spectral noise. We control its amount by the spectral flatness, according to which high-frequency noise is mixed into voiced frames during cepstral speech synthesis. Our experiments are aimed at a statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger) and a neutral state for comparison. Calculated histograms of the spectral flatness distribution are visually compared and modelled by a Gamma probability distribution. Histograms of the cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. The achieved statistical results show good correlation between male and female voices for all emotional states, portrayed by several Czech and Slovak professional actors.
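
    As a rough sketch of the measurements involved, the code below computes per-frame spectral flatness (the ratio of geometric to arithmetic mean of the power spectrum), fits a Gamma distribution to the resulting values, and reports skewness and kurtosis. The framing parameters and synthetic input are stand-ins, not the authors' configuration.

```python
import numpy as np
from scipy.stats import gamma, skew, kurtosis

def spectral_flatness(frame):
    """Ratio of geometric to arithmetic mean of the power spectrum."""
    psd = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    psd = psd[psd > 0]
    return np.exp(np.mean(np.log(psd))) / np.mean(psd)

rng = np.random.default_rng(7)
speech = rng.standard_normal(16000)     # stand-in for one utterance at 16 kHz

# Frame the signal (25 ms frames, 10 ms hop) and collect flatness values
frame, hop = 400, 160
sf = np.array([spectral_flatness(speech[i:i + frame])
               for i in range(0, len(speech) - frame, hop)])

# Model the flatness histogram with a Gamma distribution and summarize shape
a, loc, scale = gamma.fit(sf, floc=0.0)
print(f"gamma shape={a:.2f} scale={scale:.3f} "
      f"skewness={skew(sf):.2f} kurtosis={kurtosis(sf):.2f}")
```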

  20. Statistical analysis of CMEs' geoeffectiveness over one year of solar maximum during cycle 23

    NASA Astrophysics Data System (ADS)

    Schmieder, Brigitte; Bocchialini, Karine; Menvielle, Michel

    2016-07-01

    Using different propagation models from the Sun to the Earth, we performed a statistical analysis over the year 2002 of the geoeffectiveness of CMEs linked to sudden storm commencements (SSCs). We also classified the perturbations of the interplanetary medium that trigger the SSCs. For each CME, its solar source is identified, together with the parameters deduced from spacecraft measurements along the path of the CME-related event in the solar atmosphere, the interplanetary medium, and the Earth's ionized (magnetosphere and ionosphere) and neutral (thermosphere) environments. The set of observations is statistically analysed to evaluate the geoeffectiveness of CMEs in terms of ionospheric and thermospheric signatures, with attention to possible differences related to different kinds of solar sources. The observed Sun-to-Earth travel times are compared to those estimated using existing models of propagation in the interplanetary medium, and this comparison is used to statistically assess the performance of the various models.