Science.gov

Sample records for age statistical analysis

  1. Statistical Data Analysis in the Computer Age

    NASA Astrophysics Data System (ADS)

    Efron, Bradley; Tibshirani, Robert

    1991-07-01

    Most of our familiar statistical methods, such as hypothesis testing, linear regression, analysis of variance, and maximum likelihood estimation, were designed to be implemented on mechanical calculators. Modern electronic computation has encouraged a host of new statistical methods that require fewer distributional assumptions than their predecessors and can be applied to more complicated statistical estimators. These methods allow the scientist to explore and describe data and draw valid statistical inferences without the usual concerns for mathematical tractability. This is possible because traditional methods of mathematical analysis are replaced by specially constructed computer algorithms. Mathematics has not disappeared from statistical theory. It is the main method for deciding which algorithms are correct and efficient tools for automating statistical inference.
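
    One computer-intensive method of the kind this abstract describes is the bootstrap, which Efron introduced. The following is a minimal Python sketch, with made-up sample data, that estimates a standard error by resampling rather than by a closed-form formula:

        import random
        import statistics

        def bootstrap_se(data, stat=statistics.median, n_boot=2000, seed=42):
            """Estimate the standard error of `stat` by resampling with replacement."""
            rng = random.Random(seed)
            replicates = []
            for _ in range(n_boot):
                resample = [rng.choice(data) for _ in data]
                replicates.append(stat(resample))
            return statistics.stdev(replicates)

        sample = [4.1, 5.6, 3.8, 7.2, 5.0, 6.3, 4.9, 5.8]  # invented measurements
        print(bootstrap_se(sample))  # SE of the median, no closed form required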

  2. Statistical analysis of accelerated temperature aging of semiconductor devices

    NASA Astrophysics Data System (ADS)

    Johnson, W. A.; Milles, M. F.

    1981-05-01

    A number of semiconductor devices taken from a distribution were operated at several elevated temperatures to induce failure in all devices within a reasonable time. Assuming general characteristics of the device failure probability density function (pdf) and its temperature dependence, the expected cumulative failure function (cff) for devices in normal operation was estimated by statistical inference, taking the average probability of a random device (from the same distribution but operated at a normal temperature) failing as a function of time. A review of the mathematical formalism employed in semiconductor reliability discussions is included. Three failure pdf's of particular usefulness to this analysis (exponential, normal, and lognormal) are discussed. The cff at times orders of magnitude less than, as well as at times comparable to, the desired system useful life (10^4 to 10^5 hr) is considered. A review of accelerated temperature aging is presented, and the assumptions concerning the general characteristics of the failure pdf, which are fundamental to this analysis, are emphasized.
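
    As a hedged illustration of the kind of calculation described above, one can assume a lognormal failure pdf at the stress temperature and an Arrhenius acceleration factor to map stress time to use time; every parameter value in this Python sketch is invented:

        import math

        K_BOLTZ_EV = 8.617e-5  # Boltzmann constant in eV/K

        def lognormal_cdf(t, median, sigma):
            """Cumulative failure fraction F(t) under a lognormal failure pdf."""
            return 0.5 * (1.0 + math.erf(math.log(t / median) / (sigma * math.sqrt(2))))

        def acceleration_factor(ea_ev, t_use_k, t_stress_k):
            """Arrhenius acceleration factor from stress to use temperature."""
            return math.exp((ea_ev / K_BOLTZ_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

        # Invented parameters: median life 500 h at 150 C stress, sigma = 0.7, Ea = 0.7 eV
        af = acceleration_factor(0.7, 328.0, 423.0)  # 55 C use vs. 150 C stress
        for hours in (1e4, 1e5):  # the useful-life range quoted in the abstract
            print(hours, lognormal_cdf(hours / af, 500.0, 0.7))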

  3. Clusters in the distribution of pulsars in period, pulse-width, and age. [statistical analysis/statistical distributions]

    NASA Technical Reports Server (NTRS)

    Baker, K. B.; Sturrock, P. A.

    1975-01-01

    The question of whether pulsars form a single group or whether pulsars come in two or more different groups is discussed. It is proposed that such groups might be related to several factors such as the initial creation of the neutron star, or the orientation of the magnetic field axis with the spin axis. Various statistical models are examined.

  4. Mathematical and statistical analysis

    NASA Technical Reports Server (NTRS)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  5. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article or making a new product or service legitimate needs to be monitored and questioned for accuracy. (1) The more complicated the statistical analysis and research, the fewer the learned readers who can understand it. This adds a…

  6. Statistical Energy Analysis Program

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.

    1985-01-01

    Statistical Energy Analysis (SEA) is a powerful tool for estimating the high-frequency vibration spectra of complex structural systems; it has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.

  7. Age-Specific Sex-Related Differences in Infections: A Statistical Analysis of National Surveillance Data in Japan

    PubMed Central

    Eshima, Nobuoki; Tokumaru, Osamu; Hara, Shohei; Bacal, Kira; Korematsu, Seigo; Karukaya, Shigeru; Uruma, Kiyo; Okabe, Nobuhiko; Matsuishi, Toyojiro

    2012-01-01

    Background: To prevent and control infectious diseases, it is important to understand how sex and age influence morbidity rates, but consistent clear descriptions of differences in the reported incidence of infectious diseases in terms of sex and age are sparse. Methods and Findings: Data from the Japanese surveillance system for infectious diseases from 2000 to 2009 were used in the analysis of seven viral and four bacterial infectious diseases with relatively large impact on the Japanese community. The male-to-female morbidity (MFM) ratios in different age groups were estimated to compare incidence rates of symptomatic reported infection between the sexes at different ages. MFM ratios were >1 for five viral infections out of seven in childhood, i.e., male children were more frequently reported as infected than females with pharyngoconjunctival fever, herpangina, hand-foot-and-mouth disease, mumps, and varicella. More males were also reported to be infected with erythema infectiosum and exanthema subitum, but only in children 1 year of age. By contrast, in adulthood the MFM ratios decreased to <1 for all of the viral infections above except varicella, i.e., adult women were more frequently reported to be infected than men. Sex- and age-related differences in reported morbidity were also documented for bacterial infections. Reported morbidity for enterohemorrhagic Escherichia coli infection was higher in adult females, and females were reportedly more infected with mycoplasma pneumonia than males in all age groups up to 70 years. Conclusions: Sex-related differences in reported morbidity for viral and bacterial infections were documented among different age groups. Changes in MFM ratios with age may reflect differences between the sexes in underlying development processes, including those affecting the immune, endocrine, and reproductive systems, or differences in reporting rates. PMID:22848753
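
    The MFM ratio itself is a simple ratio of incidence rates. A short Python sketch with invented counts (the abstract does not give the surveillance denominators):

        # Invented reported-case counts for one disease and one age group
        male_cases, male_population = 420, 310_000
        female_cases, female_population = 265, 298_000

        male_rate = male_cases / male_population
        female_rate = female_cases / female_population
        mfm_ratio = male_rate / female_rate  # >1: males more frequently reported infected
        print(round(mfm_ratio, 2))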

  8. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J.

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.

  9. Statistical data analysis

    SciTech Connect

    Hahn, A.A.

    1994-11-01

    The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques.

  10. Statistical Approaches for the Study of Cognitive and Brain Aging

    PubMed Central

    Chen, Huaihou; Zhao, Bingxin; Cao, Guanqun; Proges, Eric C.; O'Shea, Andrew; Woods, Adam J.; Cohen, Ronald A.

    2016-01-01

    Neuroimaging studies of cognitive and brain aging often yield massive datasets that create many analytic and statistical challenges. In this paper, we discuss and address several limitations in the existing work. (1) Linear models are often used to model the age effects on neuroimaging markers, which may be inadequate in capturing the potential nonlinear age effects. (2) Marginal correlations are often used in brain network analysis, which are not efficient in characterizing a complex brain network. (3) Due to the challenge of high-dimensionality, only a small subset of the regional neuroimaging markers is considered in a prediction model, which could miss important regional markers. To overcome those obstacles, we introduce several advanced statistical methods for analyzing data from cognitive and brain aging studies. Specifically, we introduce semiparametric models for modeling age effects, graphical models for brain network analysis, and penalized regression methods for selecting the most important markers in predicting cognitive outcomes. We illustrate these methods using the healthy aging data from the Active Brain Study. PMID:27486400
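
    The third method named above, penalized regression for marker selection, can be sketched with the lasso. This is a generic illustration of the technique on synthetic data, not the authors' exact model:

        import numpy as np
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(0)
        n_subjects, n_regions = 120, 300  # more markers than subjects
        X = rng.standard_normal((n_subjects, n_regions))  # regional neuroimaging markers
        beta = np.zeros(n_regions)
        beta[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]  # only a few truly relevant regions
        y = X @ beta + rng.standard_normal(n_subjects)  # cognitive outcome

        model = LassoCV(cv=5).fit(X, y)  # penalty strength chosen by cross-validation
        selected = np.flatnonzero(model.coef_)  # markers kept by the penalty
        print(selected, model.alpha_)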

  11. Analysis of network statistics

    NASA Astrophysics Data System (ADS)

    Cottrell, R. L. A.

    1987-08-01

    This talk discusses the types and sources of data obtainable from networks of computer systems and terminals connected by communications paths. These paths often utilize mixtures of protocols and devices (such as modems, multiplexors, switches and front-ends) from multiple vendors. The talk describes how the data can be gathered from these devices and protocol layers, consolidated, stored, and analyzed. The analysis typically includes merging information from data bases describing the network topology, components, etc. Examples of reports and displays of the information gleaned are shown, together with illustrations of how the information may be useful for troubleshooting, performance measurement, auditing, accounting, and trend prediction.

  12. STATISTICAL SAMPLING AND DATA ANALYSIS

    EPA Science Inventory

    Research is being conducted to develop approaches to improve soil and sediment sampling techniques, measurement design and geostatistics, and data analysis via chemometric, environmetric, and robust statistical methods. Improvements in sampling contaminated soil and other hetero...

  13. Federal Interagency Forum on Aging-Related Statistics

    MedlinePlus

    Member agencies include the Administration on Aging, the Agency for Healthcare Research and Quality, the National Center for Health Statistics, the National Institute on Aging, and the Office of the Assistant Secretary for Planning & Evaluation, among others.

  14. Statistical physics of age related macular degeneration

    NASA Astrophysics Data System (ADS)

    Family, Fereydoon; Mazzitello, K. I.; Arizmendi, C. M.; Grossniklaus, H. E.

    Age-related macular degeneration (AMD) is the leading cause of blindness beyond the age of 50 years. The most common pathogenic mechanism that leads to AMD is choroidal neovascularization (CNV). CNV is produced by accumulation of residual material caused by aging of retinal pigment epithelium (RPE) cells. The RPE is a phagocytic system that is essential for renewal of photoreceptors (rods and cones). With time, incompletely degraded membrane material builds up in the form of lipofuscin. Lipofuscin is made of free-radical-damaged protein and fat, which forms not only in AMD, but also in Alzheimer disease and Parkinson disease. The study of lipofuscin formation and growth is important because of its association with cellular aging. We introduce a model of non-equilibrium cluster growth and aggregation that we have developed for studying the formation and growth of lipofuscin in the aging RPE. Our results reproduce the linear growth of the number of lipofuscin granules with age. We apply the dynamic scaling approach to our model and find excellent data collapse for the cluster size distribution. An unusual feature of our model is that while small particles are removed from the RPE, the larger ones become fixed and grow by aggregation.

  15. STATISTICS AND DATA ANALYSIS WORKSHOP

    EPA Science Inventory

    On January 15 and 16, 2003, a workshop for Tribal water resources staff on Statistics and Data Analysis was held at the Indian Springs Lodge on the Forest County Potawatomi Reservation near Wabeno, WI. The workshop was co-sponsored by the EPA, Sokaogon Chippewa (Mole Lake) Comm...

  16. Statistical Analysis in Climate Research

    NASA Astrophysics Data System (ADS)

    von Storch, Hans; Zwiers, Francis W.

    2002-03-01

    The purpose of this book is to help the climatologist understand the basic precepts of the statistician's art and to provide some of the background needed to apply statistical methodology correctly and usefully. The book is self-contained: introductory material, standard advanced techniques, and the specialized techniques used specifically by climatologists are all contained within this one source. There is a wealth of real-world examples drawn from the climate literature to demonstrate the need, power and pitfalls of statistical analysis in climate research.

  17. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3, ...) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will do an ANOVA to check its significance.
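
    The "Normal Distribution Estimates" calculation described above is an inverse normal CDF, which, along with the descriptive statistics, can be reproduced with Python's standard library; the data here are made up:

        from statistics import NormalDist, mean, stdev

        data = [9.8, 10.1, 10.4, 9.6, 10.0, 10.3]  # invented user-entered data
        mu, sigma = mean(data), stdev(data)  # descriptive statistics
        dist = NormalDist(mu, sigma)
        print(dist.inv_cdf(0.95))  # value with 95% cumulative probability
        print(dist.cdf(10.5))  # cumulative probability of a given value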

  18. Performance of statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Davis, R. F.; Hines, D. E.

    1973-01-01

    Statistical energy analysis (SEA) methods have been developed for high frequency modal analyses on random vibration environments. These SEA methods are evaluated by comparing analytical predictions to test results. Simple test methods are developed for establishing SEA parameter values. Techniques are presented, based on the comparison of the predictions with test values, for estimating SEA accuracy as a function of frequency for a general structure.

  19. Statistical Analysis of RNA Backbone

    PubMed Central

    Hershkovitz, Eli; Sapiro, Guillermo; Tannenbaum, Allen; Williams, Loren Dean

    2009-01-01

    Local conformation is an important determinant of RNA catalysis and binding. The analysis of RNA conformation is particularly difficult due to the large number of degrees of freedom (torsion angles) per residue. Proteins, by comparison, have many fewer degrees of freedom per residue. In this work, we use and extend classical tools from statistics and signal processing to search for clusters in RNA conformational space. Results are reported both for scalar analysis, where each torsion angle is separately studied, and for vectorial analysis, where several angles are simultaneously clustered. Adapting techniques from vector quantization and clustering to the RNA structure, we find torsion angle clusters and RNA conformational motifs. We validate the technique using well-known conformational motifs, showing that the simultaneous study of the total torsion angle space leads to results consistent with known motifs reported in the literature and also to the finding of new ones. PMID:17048391

  20. Statistical analysis of nucleotide sequences.

    PubMed Central

    Stückle, E E; Emmrich, C; Grob, U; Nielsen, P J

    1990-01-01

    In order to scan nucleic acid databases for potentially relevant but as yet unknown signals, we have developed an improved statistical model for pattern analysis of nucleic acid sequences by modifying previous methods based on Markov chains. We demonstrate the importance of selecting the appropriate parameters in order for the method to function at all. The model allows the simultaneous analysis of several short sequences with unequal base frequencies and Markov order k not equal to 0 as is usually the case in databases. As a test of these modifications, we show that in E. coli sequences there is a bias against palindromic hexamers which correspond to known restriction enzyme recognition sites. PMID:2251125
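
    A sketch of the counting step that such Markov-chain models rest on: the expected number of occurrences of a word (here the EcoRI recognition site) given base and conditional frequencies. The frequencies below are hypothetical, and for brevity the conditionals are set to independence, whereas the paper stresses Markov order k not equal to 0:

        def expected_word_count(word, seq_len, base_freq, cond_freq):
            """Expected count of `word` in a sequence under a first-order Markov chain.

            base_freq[b]    : stationary frequency of base b
            cond_freq[a][b] : probability of base b given the previous base a
            """
            p = base_freq[word[0]]
            for prev, cur in zip(word, word[1:]):
                p *= cond_freq[prev][cur]
            return (seq_len - len(word) + 1) * p

        # Hypothetical E. coli-like base frequencies; conditionals set to independence
        base = {"A": 0.246, "C": 0.254, "G": 0.254, "T": 0.246}
        cond = {a: dict(base) for a in base}  # real data would have cond[a][b] != base[b]
        print(expected_word_count("GAATTC", 4_600_000, base, cond))  # EcoRI site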

  1. Statistical analysis of pyroshock data

    NASA Astrophysics Data System (ADS)

    Hughes, William O.

    2002-05-01

    The sample size of aerospace pyroshock test data is typically small. This often forces the engineer to make assumptions on its population distribution and to use conservative margins or methodologies in determining shock specifications. For example, the maximum expected environment is often derived by adding 3-6 dB to the maximum envelope of a limited amount of shock data. The recent availability of a large amount of pyroshock test data has allowed a rare statistical analysis to be performed. Findings and procedures from this analysis will be explained, including information on population distributions, procedures to properly combine families of test data, and methods of deriving appropriate shock specifications for a multipoint shock source.
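
    The dB margin mentioned above acts as a multiplicative factor on amplitude: adding N dB multiplies a shock response spectrum value by 10^(N/20). A small Python sketch with invented envelope values:

        def add_db(envelope_g, margin_db):
            """Scale a shock envelope (amplitude units, e.g. SRS in g) by a dB margin."""
            factor = 10.0 ** (margin_db / 20.0)  # 20 log10 rule for amplitude quantities
            return [g * factor for g in envelope_g]

        max_envelope = [120.0, 450.0, 2100.0, 5000.0]  # invented SRS values in g
        print(add_db(max_envelope, 4.5))  # mid-range of the 3-6 dB practice noted above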

  2. Statistical Design in Isothermal Aging of Polyimide Resins

    NASA Technical Reports Server (NTRS)

    Sutter, James K.; Jobe, Marcus; Crane, Elizabeth A.

    1995-01-01

    Recent developments in research on polyimides for high temperature applications have led to the synthesis of many new polymers. Among the criteria used to determine their thermal oxidative stability, isothermal aging is one of the most important. Isothermal aging studies require that many experimental factors be controlled to provide accurate results. In this article we describe a statistical plan that compares the isothermal stability of several polyimide resins, while minimizing the variations inherent in high-temperature aging studies.

  3. Statistical Handbook on Aging Americans. 1994 Edition. Statistical Handbook Series Number 5.

    ERIC Educational Resources Information Center

    Schick, Frank L., Ed.; Schick, Renee, Ed.

    This statistical handbook contains 378 tables and charts illustrating the changes in the United States' aging population based on data collected during the 1990 census and several other surveys. The tables and charts are organized by topic as follows: demographics (age and sex distribution, life expectancy, race and ethnicity, geographic…

  4. Statistical Analysis of Tsunami Variability

    NASA Astrophysics Data System (ADS)

    Zolezzi, Francesca; Del Giudice, Tania; Traverso, Chiara; Valfrè, Giulio; Poggi, Pamela; Parker, Eric J.

    2010-05-01

    similar to that seen in ground motion attenuation correlations used for seismic hazard assessment. The second issue was intra-event variability. This refers to the differences in tsunami wave run-up along a section of coast during a single event. Intra-event variability was investigated directly by considering field observations. The tsunami events used in the statistical evaluation were selected on the basis of the completeness and reliability of the available data. Tsunamis considered for the analysis included the recent and well-surveyed tsunami of Boxing Day 2004 (Great Indian Ocean Tsunami), Java 2006, Okushiri 1993, Kocaeli 1999, Messina 1908 and a case study of several historic events in Hawaii. Basic statistical analysis was performed on the field observations from these tsunamis. For events with very wide survey regions, the run-up heights have been grouped in order to maintain a homogeneous distance from the source. Where more than one survey was available for a given event, the original datasets were maintained separately to avoid combination of non-homogeneous data. The observed run-up measurements were used to evaluate the minimum, maximum, average, standard deviation and coefficient of variation for each data set. The minimum coefficient of variation was 0.12, measured for the 2004 Boxing Day tsunami at Nias Island (7 data points), while the maximum was 0.98 for the Okushiri 1993 event (93 data points). The average coefficient of variation is of the order of 0.45.
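
    The summary statistics named above are straightforward to compute; a Python sketch with hypothetical run-up heights:

        from statistics import mean, stdev

        runup_m = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4]  # invented run-up heights (m)
        cv = stdev(runup_m) / mean(runup_m)  # coefficient of variation
        print(f"min={min(runup_m)} max={max(runup_m)} "
              f"mean={mean(runup_m):.2f} CV={cv:.2f}")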

  5. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1992-01-01

    Asymptotic Modal Analysis (AMA) is a method which is used to model linear dynamical systems with many participating modes. The AMA method was originally developed to show the relationship between statistical energy analysis (SEA) and classical modal analysis (CMA). In the limit of a large number of modes of a vibrating system, the classical modal analysis result can be shown to be equivalent to the statistical energy analysis result. As the CMA result evolves into the SEA result, a number of systematic assumptions are made. Most of these assumptions are based upon the supposition that the number of modes approaches infinity. It is for this reason that the term 'asymptotic' is used. AMA is the asymptotic result of taking the limit of CMA as the number of modes approaches infinity. AMA refers to any of the intermediate results between CMA and SEA, as well as the SEA result which is derived from CMA. The main advantage of the AMA method is that individual modal characteristics are not required in the model or computations. By contrast, CMA requires that each modal parameter be evaluated at each frequency. In the latter, contributions from each mode are computed and the final answer is obtained by summing over all the modes in the particular band of interest. AMA evaluates modal parameters only at their center frequency and does not sum the individual contributions from each mode in order to obtain a final result. The method is similar to SEA in this respect. However, SEA is only capable of obtaining spatial averages or means, as it is a statistical method. Since AMA is systematically derived from CMA, it can obtain local spatial information as well.

  6. Fake Statistically Valid Isotopic Ages in Impact Crater Geochronology

    NASA Astrophysics Data System (ADS)

    Jourdan, F.; Schmieder, M.; McWilliams, M. M.; Buchner, E.

    2009-05-01

    Precise dating of impact structures is crucial in several fundamental respects, such as correlating effects on the bio- and geosphere caused by these catastrophic processes. Among the 176 listed impact structures [1], only 25 have a stated age precision better than ± 2%. Statistical investigation of these 25 ages showed that 11 ages are accurate, 12 are at best ambiguous, and 2 are not well characterized [2]. In this study, we show that even with statistically valid isotope ages, the age of an impact can be "missed" by several hundred million years. We present a new 40Ar/39Ar plateau age of 444 ± 4 Ma for the Acraman structure (real age ˜590 Ma [3]) and four plateau ages ranging from 81.07 ± 0.76 Ma to 74.6 ± 1.5 Ma for the Brent structure (estimated real age ˜453 Ma [4]). In addition, we discuss a 40Ar/39Ar plateau age of 994 ± 11 Ma, recently obtained by [5] on the Dhala structure (real age ˜2.0 Ga [5]). Despite careful sample preparation (single grain handpicking and HF leaching, in order to remove alteration phases), these results are much younger than the impact ages. Petrographic observations show that the Acraman and Dhala grain separates all have an orange color and show evidence of alteration. This suggests that these ages are the results of hydrothermal events that triggered intensive 40Ar* loss and crystallization of secondary phases. More intriguing are the Brent samples (glassy melt rocks obtained from a drill core), which appeared very fresh under the microscope. The Brent glass might be a Cretaceous pseudotachylite generated by a late adjustment of the structure and/or by a local earthquake. Because we know the approximate ages of the craters from stratigraphic evidence, these outliers are easy to identify. However, this is a red flag for any uncritical interpretation of isotopic ages (including e.g., 40Ar/39Ar, U/Pb, or U-Th/He [6]). In this paper, we encourage a multi-technique approach (i.e., isotopic, stratigraphic, paleogeographic [7,8]) and

  7. Statistical estimation of mineral age by K-Ar method

    SciTech Connect

    Vistelius, A.B.; Drubetzkoy, E.R.; Faas, A.V.

    1989-11-01

    Statistical estimation of age from 40Ar/40K ratios may be considered a result of convolution of uniform and normal distributions with different weights for different minerals. Data from Gul'shad Massif (Nearbalkhash, Kazakhstan, USSR) indicate that 40Ar/40K ratios reflecting the intensity of geochemical processes can be resolved using convolutions. Loss of 40Ar in biotites is shown, whereas hornblende retained the original content of 40Ar throughout the geological history of the massif. Results demonstrate that different estimation methods must be used for different minerals and different rocks when radiometric ages are employed for dating.

  8. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.; Peretti, Linda F.

    1990-01-01

    The sound field of a structural-acoustic enclosure was subject to experimental analysis and theoretical description in order to develop an efficient and accurate method for predicting sound pressure levels in enclosures such as aircraft fuselages. Asymptotic Modal Analysis (AMA) is the method under investigation. AMA is derived from classical modal analysis (CMA) by considering the asymptotic limit of the sound pressure level as the number of acoustic and/or structural modes approaches infinity. Using AMA, results identical to those of Statistical Energy Analysis (SEA) were obtained for the spatially-averaged sound pressure levels in the interior. AMA is systematically derived from CMA and therefore the degree of generality of the end result can be adjusted through the choice of appropriate simplifying assumptions. For example, AMA can be used to obtain local sound pressure levels at particular points inside the enclosure, or to include the effects of varying the size and/or location of the sound source. AMA theoretical results were compared with CMA theory and also with experiment for the case where the structural-acoustic enclosure is a rectangular cavity with part of one wall flexible and vibrating, while the rest of the cavity is rigid.

  9. Statistical analysis of barefoot impressions.

    PubMed

    Kennedy, Robert B; Pressman, Irwin S; Chen, Sanping; Petersen, Peter H; Pressman, Ari E

    2003-01-01

    Comparison of the shapes of barefoot impressions from an individual with footprints or shoes linked to a crime may be useful as a means of including or excluding that individual as possibly being at the scene of a crime. The question of the distinguishability of a person's barefoot print arises frequently. This study indicates that measurements taken from the outlines of inked footprint impressions show a great degree of variability between donors and a great degree of similarity for multiple impressions taken from the same donor. The normality of the set of measurements on footprint outlines that we have selected for this study is confirmed. A statistical justification for the use of the product rule on individual statistical precisions is developed. PMID:12570199

  10. An R package for statistical provenance analysis

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter; Resentini, Alberto; Garzanti, Eduardo

    2016-05-01

    This paper introduces provenance, a software package within the statistical programming environment R, which aims to facilitate the visualisation and interpretation of large amounts of sedimentary provenance data, including mineralogical, petrographic, chemical and isotopic provenance proxies, or any combination of these. provenance comprises functions to: (a) calculate the sample size required to achieve a given detection limit; (b) plot distributional data such as detrital zircon U-Pb age spectra as Cumulative Age Distributions (CADs) or adaptive Kernel Density Estimates (KDEs); (c) plot compositional data as pie charts or ternary diagrams; (d) correct the effects of hydraulic sorting on sandstone petrography and heavy mineral composition; (e) assess the settling equivalence of detrital minerals and grain-size dependence of sediment composition; (f) quantify the dissimilarity between distributional data using the Kolmogorov-Smirnov and Sircombe-Hazelton distances, or between compositional data using the Aitchison and Bray-Curtis distances; (g) interpret multi-sample datasets by means of (classical and nonmetric) Multidimensional Scaling (MDS) and Principal Component Analysis (PCA); and (h) simplify the interpretation of multi-method datasets by means of Generalised Procrustes Analysis (GPA) and 3-way MDS. All these tools can be accessed through an intuitive query-based user interface, which does not require knowledge of the R programming language. provenance is free software released under the GPL-2 licence and will be further expanded based on user feedback.
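
    The provenance package itself is written in R, but the dissimilarity-plus-MDS pipeline of items (f) and (g) can be sketched in Python; the age samples below are synthetic stand-ins for detrital zircon U-Pb spectra:

        import numpy as np
        from scipy.stats import ks_2samp
        from sklearn.manifold import MDS

        rng = np.random.default_rng(1)
        # Synthetic stand-ins for detrital zircon U-Pb age samples (Ma)
        samples = [rng.normal(loc, 80.0, size=100) for loc in (600, 650, 1100, 1150)]

        n = len(samples)
        diss = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                d = ks_2samp(samples[i], samples[j]).statistic  # K-S distance
                diss[i, j] = diss[j, i] = d

        mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
        print(mds.fit_transform(diss))  # a 2-D map of inter-sample dissimilarity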

  11. Statistical analysis of planetary surfaces

    NASA Astrophysics Data System (ADS)

    Schmidt, Frederic; Landais, Francois; Lovejoy, Shaun

    2015-04-01

    In the last decades, a huge amount of topographic data has been obtained by several techniques (laser and radar altimetry, DTM…) for different bodies in the solar system, including Earth, Mars, and the Moon. In each case, topographic fields exhibit an extremely high variability with details at each scale, from millimeters to thousands of kilometers. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, this topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must have multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields that cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model high variability and intermittency. This theory is a good statistical candidate to model the topography field with a limited number of parameters (called the multifractal parameters). In our study, we show that the statistical properties of the Martian topography are accurately reproduced by this model, leading to a new interpretation of geomorphological processes.

  12. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1988-01-01

    Statistical Energy Analysis (SEA) is defined by considering the asymptotic limit of Classical Modal Analysis, an approach called Asymptotic Modal Analysis (AMA). The general approach is described for both structural and acoustical systems. The theoretical foundation is presented for structural systems, and experimental verification is presented for a structural plate responding to a random force. Work accomplished subsequent to the grant initiation focuses on the acoustic response of an interior cavity (i.e., an aircraft or spacecraft fuselage) with a portion of the wall vibrating in a large number of structural modes. First results were presented at the ASME Winter Annual Meeting in December 1987, and accepted for publication in the Journal of Vibration, Acoustics, Stress and Reliability in Design. It is shown that asymptotically as the number of acoustic modes excited becomes large, the pressure level in the cavity becomes uniform except at the cavity boundaries. However, the mean square pressure at the cavity corner, edge and wall is, respectively, 8, 4, and 2 times the value in the cavity interior. Also it is shown that when the portion of the wall which is vibrating is near a cavity corner or edge, the response is significantly higher.

  13. Statistical Power in Meta-Analysis

    ERIC Educational Resources Information Center

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  14. Aging and the statistical learning of grammatical form classes.

    PubMed

    Schwab, Jessica F; Schuler, Kathryn D; Stillman, Chelsea M; Newport, Elissa L; Howard, James H; Howard, Darlene V

    2016-08-01

    Language learners must place unfamiliar words into categories, often with few explicit indicators about when and how that word can be used grammatically. Reeder, Newport, and Aslin (2013) showed that college students can learn grammatical form classes from an artificial language by relying solely on distributional information (i.e., contextual cues in the input). Here, 2 experiments revealed that healthy older adults also show such statistical learning, though they are poorer than young adults at distinguishing grammatical from ungrammatical strings. This finding expands knowledge of which aspects of learning vary with aging, with potential implications for second language learning in late adulthood. PMID:27294711

  15. Statistical Survey and Analysis Handbook.

    ERIC Educational Resources Information Center

    Smith, Kenneth F.

    The National Food and Agriculture Council of the Philippines regularly requires rapid feedback data for analysis, which will assist in monitoring programs to improve and increase the production of selected crops by small scale farmers. Since many other development programs in various subject matter areas also require similar statistical…

  16. Statistical Analysis of DWPF ARG-1 Data

    SciTech Connect

    Harris, S.P.

    2001-03-02

    A statistical analysis of analytical results for ARG-1, an Analytical Reference Glass, blanks, and the associated calibration and bench standards has been completed. These statistics provide a means for DWPF to review the performance of their laboratory as well as identify areas of improvement.

  17. Explorations in Statistics: The Analysis of Change

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas; Williams, Calvin L.

    2015-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This tenth installment of "Explorations in Statistics" explores the analysis of a potential change in some physiological response. As researchers, we often express absolute change as percent change so we can…

  18. Statistical Analysis For Nucleus/Nucleus Collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1989-01-01

    Report describes use of several statistical techniques to characterize angular distributions of secondary particles emitted in collisions of atomic nuclei in energy range of 24 to 61 GeV per nucleon. Purpose of statistical analysis is to determine correlations between intensities of emitted particles and angles, confirming existence of quark/gluon plasma.

  19. STATISTICAL ANALYSIS OF A DETERMINISTIC STOCHASTIC ORBIT

    SciTech Connect

    Kaufman, Allan N.; Abarbanel, Henry D.I.; Grebogi, Celso

    1980-05-01

    If the solution of a deterministic equation is stochastic (in the sense of orbital instability), it can be subjected to a statistical analysis. This is illustrated for a coded orbit of the Chirikov mapping. Statistical dependence and the Markov assumption are tested. The Kolmogorov-Sinai entropy is related to the probability distribution for the orbit.
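
    A minimal Python sketch of this kind of exercise: iterate the Chirikov standard map at a strongly chaotic parameter value and compute simple orbit statistics, here including a lag-1 autocorrelation as a crude probe of statistical dependence (the paper's actual tests are more elaborate):

        import math

        def standard_map_orbit(theta0, p0, k, n_steps):
            """Iterate the Chirikov standard map and return the theta orbit."""
            theta, p = theta0, p0
            orbit = []
            for _ in range(n_steps):
                p = (p + k * math.sin(theta)) % (2.0 * math.pi)
                theta = (theta + p) % (2.0 * math.pi)
                orbit.append(theta)
            return orbit

        orbit = standard_map_orbit(0.5, 0.1, k=5.0, n_steps=10_000)  # chaotic regime
        m = sum(orbit) / len(orbit)
        var = sum((x - m) ** 2 for x in orbit) / len(orbit)
        # Lag-1 autocorrelation as a crude probe of statistical dependence
        acf1 = sum((orbit[i] - m) * (orbit[i + 1] - m)
                   for i in range(len(orbit) - 1)) / (len(orbit) * var)
        print(m, var, acf1)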

  20. Statistical analysis of histopathological endpoints.

    PubMed

    Green, John W; Springer, Timothy A; Saulnier, Amy N; Swintek, Joe

    2014-05-01

    Histopathological assessments of fish from aquatic ecotoxicology studies are being performed with increasing frequency. Aquatic ecotoxicology studies performed for submission to regulatory agencies are usually conducted with multiple subjects (e.g., fish) in each of multiple vessels (replicates) within a water control and within each of several concentrations of a test substance. A number of histopathological endpoints are evaluated in each fish, and a severity score is generally recorded for each endpoint. The severity scores are often recorded using a nonquantitative scale of 0 to 4, with 0 indicating no effect, 1 indicating minimal effect, through 4 for severe effect. Statistical methods often used to analyze these scores suffer from several shortcomings: computing average scores as though scores were quantitative values, considering only the frequency of abnormality while ignoring severity, ignoring any concentration-response trend, and ignoring the possible correlation between responses of individuals within test vessels. A new test, the Rao-Scott Cochran-Armitage by Slices (RSCABS), is proposed that incorporates the replicate vessel experimental design and the biological expectation that the severity of the effect tends to increase with increasing doses or concentrations, while retaining the individual subject scores and taking into account the severity as well as frequency of scores. A power simulation and examples demonstrate the performance of the test. R-based software has been developed to carry out this test and is available free of charge at www.epa.gov/med/Prods_Pubs/rscabs.htm. The SAS-based RSCABS software is available from the first and third authors. PMID:24464649
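
    As a hedged sketch only: RSCABS builds on the Cochran-Armitage trend test, and the plain (uncorrected) Cochran-Armitage statistic for one severity threshold, or "slice", can be computed as below. The Rao-Scott replicate-vessel correction that the authors add is omitted, and the counts are invented:

        import math

        def cochran_armitage_z(responders, totals, doses):
            """Plain Cochran-Armitage trend statistic (no Rao-Scott correction)."""
            big_n = sum(totals)
            p_bar = sum(responders) / big_n
            t_stat = sum(d * (r - n * p_bar)
                         for d, r, n in zip(doses, responders, totals))
            d_bar = sum(d * n for d, n in zip(doses, totals)) / big_n
            var = p_bar * (1.0 - p_bar) * sum(n * (d - d_bar) ** 2
                                              for d, n in zip(doses, totals))
            return t_stat / math.sqrt(var)

        # Invented counts: fish with severity score >= 1, out of 20 per concentration
        print(cochran_armitage_z([1, 2, 5, 9], [20, 20, 20, 20], [0, 1, 2, 3]))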

  1. Advice on statistical analysis for Circulation Research.

    PubMed

    Kusuoka, Hideo; Hoffman, Julien I E

    2002-10-18

    Since the late 1970s when many journals published articles warning about the misuse of statistical methods in the analysis of data, researchers have become more careful about statistical analysis, but errors including low statistical power and inadequate analysis of repeated-measurement studies are still prevalent. In this review, several statistical methods are introduced that are not always familiar to basic and clinical cardiologists but may be useful for revealing the correct answer from the data. The aim of this review is not only to draw the attention of investigators to these tests but also to stress the conditions in which they are applicable. These methods are now generally available in statistical program packages. Researchers need not know how to calculate the statistics from the data but are required to select the correct method from the menu and interpret the statistical results accurately. With the choice of appropriate statistical programs, the issue is no longer how to do the test but when to do it. PMID:12386142

  2. A Statistical Analysis of Cotton Fiber Properties

    NASA Astrophysics Data System (ADS)

    Ghosh, Anindya; Das, Subhasis; Majumder, Asha

    2016-04-01

    This paper reports a statistical analysis of different cotton fiber properties, such as strength, breaking elongation, upper half mean length, length uniformity index, short fiber index, micronaire, reflectance and yellowness, measured from 1200 cotton bales. Uni-variate, bi-variate and multi-variate statistical analyses have been invoked to elicit the interrelationships between the above-mentioned properties, taking them singly, pairwise and jointly, respectively. In the multi-variate analysis, all cotton fiber properties are simultaneously considered using the multi-dimensional technique of principal factor analysis.
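
    As an illustration of the multi-variate step, principal component analysis (a close relative of the principal factor analysis named above) can be run on standardized fiber properties. The data below are random stand-ins, not the 1200-bale dataset:

        import numpy as np

        rng = np.random.default_rng(7)
        # Random stand-in: rows = bales, columns = 8 measured fiber properties
        X = rng.standard_normal((1200, 8))

        Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each property
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        explained = s**2 / np.sum(s**2)  # variance share of each component
        scores = Z @ Vt.T  # bale scores on the components; Vt.T holds the loadings
        print(explained.round(3))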

  3. Survival analysis of aging aircraft

    NASA Astrophysics Data System (ADS)

    Benavides, Samuel

    This study pushes systems engineering of aging aircraft beyond the boundaries of empirical and deterministic modeling by making a sharp break with the traditional laboratory-derived corrosion prediction algorithms that have shrouded real-world failures of aircraft structure. At the heart of this problem is the aeronautical industry's inability to be forthcoming in an accurate model that predicts corrosion failures in aircraft in spite of advances in corrosion algorithms or improvements in simulation and modeling. The struggle to develop accurate corrosion probabilistic models stems from a multitude of real-world interacting variables that synergistically influence corrosion in convoluted and complex ways. This dissertation, in essence, offers a statistical framework for the analysis of structural airframe corrosion failure by utilizing real-world data while considering the effects of interacting corrosion variables. This study injects realism into corrosion failures of aging aircraft systems by accomplishing four major goals related to the conceptual and methodological framework of corrosion modeling. First, this work connects corrosion modeling from the traditional, laboratory derived algorithms to corrosion failures in actual operating aircraft. This work augments physics-based modeling by examining the many confounding and interacting variables, such as environmental, geographical and operational, that impact failure of airframe structure. Examined through the lens of censored failure data from aircraft flying in a maritime environment, this study enhances the understanding between the triad of the theoretical, laboratory and real-world corrosion. Secondly, this study explores the importation and successful application of an advanced biomedical statistical tool---survival analysis---to model censored corrosion failure data. This well-grounded statistical methodology is inverted from a methodology that analyzes survival to one that examines failures. Third, this
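
    The borrowed biomedical tool, survival analysis of right-censored failure times, can be sketched with a hand-rolled Kaplan-Meier estimator; the inspection data below are invented, not the dissertation's dataset:

        def kaplan_meier(times, failed):
            """Kaplan-Meier survival estimate from right-censored data.

            times  : time of failure or censoring for each unit
            failed : True if the unit failed at that time, False if censored
            """
            order = sorted(range(len(times)), key=lambda i: times[i])
            at_risk = len(times)
            survival, s = [], 1.0
            for i in order:
                if failed[i]:
                    s *= 1.0 - 1.0 / at_risk
                    survival.append((times[i], s))
                at_risk -= 1
            return survival

        # Invented airframe data: hours to corrosion finding (or censoring)
        hours = [8200, 9100, 10400, 11000, 12500, 13100, 14000]
        failed = [True, False, True, True, False, True, False]
        print(kaplan_meier(hours, failed))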

  4. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered in the analysis of space-flight data is examined. Because of the small amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on considerations needed to choose the most appropriate test for a given type of analysis.
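
    A sketch of the kind of small-sample comparison discussed, using Welch's t-test (which does not assume equal variances) together with a nonparametric alternative; the data are made up:

        from scipy import stats

        # Two small flight-data samples (invented values)
        a = [12.1, 13.4, 11.8, 12.9, 13.1]
        b = [14.2, 13.8, 14.9, 13.5]

        # Welch's t-test does not assume equal variances, a safer default for small n
        t, p = stats.ttest_ind(a, b, equal_var=False)
        # A nonparametric alternative when normality itself is in doubt
        u, p_u = stats.mannwhitneyu(a, b, alternative="two-sided")
        print(t, p, u, p_u)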

  5. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a ''degree of association'' between matches of similarly produced toolmarks. The basis for statistical method development relies on ''discriminating criteria'' that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  6. Statistical energy analysis computer program, user's guide

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1981-01-01

    A high-frequency random vibration analysis method, the statistical energy analysis (SEA) method, is examined. The SEA method accomplishes high-frequency response prediction for arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and a complete program listing are presented.

  7. Multiset Statistics for Gene Set Analysis

    PubMed Central

    Newton, Michael A.; Wang, Zhishi

    2015-01-01

    An important data analysis task in statistical genomics involves the integration of genome-wide gene-level measurements with preexisting data on the same genes. A wide variety of statistical methodologies and computational tools have been developed for this general task. We emphasize one particular distinction among methodologies, namely whether they process gene sets one at a time (uniset) or simultaneously via some multiset technique. Owing to the complexity of collections of gene sets, the multiset approach offers some advantages, as it naturally accommodates set-size variations and among-set overlaps. However, this approach presents both computational and inferential challenges. After reviewing some statistical issues that arise in uniset analysis, we examine two model-based multiset methods for gene list data. PMID:25914887

  8. MICROARRAY DATA ANALYSIS USING MULTIPLE STATISTICAL MODELS

    EPA Science Inventory

    Microarray Data Analysis Using Multiple Statistical Models

    Wenjun Bao1, Judith E. Schmid1, Amber K. Goetz1, Ming Ouyang2, William J. Welsh2,Andrew I. Brooks3,4, ChiYi Chu3,Mitsunori Ogihara3,4, Yinhe Cheng5, David J. Dix1. 1National Health and Environmental Effects Researc...

  9. Statistical Analysis Experiment for Freshman Chemistry Lab.

    ERIC Educational Resources Information Center

    Salzsieder, John C.

    1995-01-01

    Describes a laboratory experiment dissolving zinc from galvanized nails in which data can be gathered very quickly for statistical analysis. The data have sufficient significant figures and the experiment yields a nice distribution of random errors. Freshman students can gain an appreciation of the relationships between random error, number of…

  10. Bayesian Statistics for Biological Data: Pedigree Analysis

    ERIC Educational Resources Information Center

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college students of biology can be introduced to Bayesian statistics.

  11. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  12. Statistical shape analysis: From landmarks to diffeomorphisms.

    PubMed

    Zhang, Miaomiao; Golland, Polina

    2016-10-01

    We offer a blazingly brief review of the evolution of shape analysis methods in medical imaging. As the representations and the statistical models grew more sophisticated, the problem of shape analysis has been gradually redefined to accept images rather than binary segmentations as a starting point. This transformation enabled shape analysis to take its rightful place in the arsenal of tools for extracting and understanding patterns in large clinical image sets. We speculate on future developments in shape analysis and potential applications that would bring this mathematically rich area to bear on clinical practice. PMID:27377332

  13. Unified statistical approach to cortical thickness analysis.

    PubMed

    Chung, Moo K; Robbins, Steve; Evans, Alan C

    2005-01-01

    This paper presents a unified image processing and analysis framework for cortical thickness in characterizing a clinical population. The emphasis is placed on the development of the data smoothing and analysis framework. The human brain cortex is a highly convoluted surface. Due to this convoluted non-Euclidean surface geometry, data smoothing and analysis on the cortex are inherently difficult. When measurements lie on a curved surface, it is natural to assign kernel smoothing weights based on the geodesic distance along the surface rather than the Euclidean distance. We present a new data smoothing framework that addresses this problem implicitly, without actually computing the geodesic distance, and present its statistical properties. Afterwards, statistical inference is based on a multiple comparison correction derived from random field theory. As an illustration, we have applied the method in detecting regions of abnormal cortical thickness in 16 high-functioning autistic children. PMID:17354731

  14. Statistical Analysis of Thermal Analysis Margin

    NASA Technical Reports Server (NTRS)

    Garrison, Matthew B.

    2011-01-01

    NASA Goddard Space Flight Center requires that each project demonstrate a minimum of 5 C margin between temperature predictions and hot and cold flight operational limits. The bounding temperature predictions include the worst-case environment and thermal optical properties. The purpose of this work is to assess how current missions are performing against their pre-launch bounding temperature predictions and to suggest possible changes to the thermal analysis margin rules.

  15. Comparative statistical analysis of planetary surfaces

    NASA Astrophysics Data System (ADS)

    Schmidt, Frédéric; Landais, Francois; Lovejoy, Shaun

    2016-04-01

    In the present study, we aim to provide a statistical and comparative description of topographic fields by using the huge amount of topographic data available for different bodies in the solar system, including Earth, Mars, and the Moon. Our goal is to characterize and quantify, by means of a relevant statistical description, the geophysical processes involved. In each case, topographic fields exhibit an extremely high variability with details at each scale, from millimeters to thousands of kilometers. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, this topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must have multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields that cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model high variability and intermittency. This theory is a good statistical candidate to model the topography field with a limited number of parameters (called the multifractal parameters). After a global analysis of Mars (Landais et al., 2015), we have performed similar analyses on different bodies in the solar system, including the Moon, Venus and Mercury, indicating that the multifractal parameters might be relevant to explaining the competition between several processes operating on multiple scales.

  16. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156

  17. Statistical Analysis of Iberian Peninsula Megaliths Orientations

    NASA Astrophysics Data System (ADS)

    González-García, A. C.

    2009-08-01

    Megalithic monuments have been intensively surveyed and studied from the archaeoastronomical point of view in the past decades. We have orientation measurements for over one thousand megalithic burial monuments in the Iberian Peninsula, from several different periods. These data, however, still lack a sound interpretation. A way to classify and begin to understand such orientations is by means of statistical analysis of the data. A first attempt is made with simple statistical variables and a mere comparison between the different areas. In order to minimise the subjectivity in the process, a further, more elaborate analysis is performed. Some interesting results linking the orientation and the geographical location will be presented. Finally, I will present some models comparing the orientation of the megaliths in the Iberian Peninsula with the rising of the sun and the moon at several times of the year.
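
    A sketch of the simplest statistics appropriate to orientation data, which are circular rather than linear: the mean direction and resultant length of a set of azimuths (the values here are invented):

        import cmath
        import math

        def circular_summary(azimuths_deg):
            """Mean direction and resultant length of a set of orientations."""
            z = sum(cmath.exp(1j * math.radians(a)) for a in azimuths_deg)
            z /= len(azimuths_deg)
            mean_dir = math.degrees(cmath.phase(z)) % 360.0
            return mean_dir, abs(z)  # resultant near 1 means tightly clustered

        # Invented dolmen azimuths in degrees east of north
        print(circular_summary([95, 102, 110, 99, 120, 87, 105]))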

  18. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, in education, in evaluating clusters in finance, and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in an image retrieval method and of correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.

  19. A statistical analysis of NMR spectrometer noise.

    PubMed

    Grage, Halfdan; Akke, Mikael

    2003-05-01

    Estimation of NMR spectral parameters, using e.g. maximum likelihood methods, is commonly based on the assumption of white complex Gaussian noise in the signal obtained by quadrature detection. Here we present a statistical analysis with the purpose of discussing and testing the validity of this fundamental assumption. Theoretical expressions are derived for the correlation structure of the noise under various conditions, showing that in general the noise in the sampled signal is not strictly white, even if the thermal noise in the receiver steps prior to digitisation can be characterised as white Gaussian noise. It is shown that the noise correlation properties depend on the ratio between the sampling frequency and the filter cut-off frequency, as well as the filter characteristics. The theoretical analysis identifies conditions that are expected to yield non-white noise in the sampled signal. Extensive statistical characterisation of experimental noise confirms the theoretical predictions. The statistical methods outlined here are also useful for residual analysis in connection with validation of the model and the parameter estimates. PMID:12762994
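
    One check in this spirit is to estimate the sample autocorrelation of a quadrature-detected noise record and compare it with the white-noise bound. In this hedged Python sketch the synthetic noise is white by construction, unlike the filtered records the paper analyses:

        import numpy as np

        def sample_acf(x, max_lag):
            """Sample autocorrelation of a (possibly complex) noise record."""
            x = np.asarray(x) - np.mean(x)
            denom = np.sum(np.abs(x) ** 2)
            return [float(np.sum(x[lag:] * np.conj(x[:-lag])).real) / denom
                    for lag in range(1, max_lag + 1)]

        rng = np.random.default_rng(3)
        noise = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)  # white
        acf = sample_acf(noise, 5)
        # For white noise each lag is roughly N(0, 1/N); compare with 1.96/sqrt(N)
        print(acf, 1.96 / np.sqrt(noise.size))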

  20. Statistical Analysis of Spectra with Many Lines

    NASA Astrophysics Data System (ADS)

    van Dyk, D. A.; Kang, H. S.; Connors, A.; Kashyap, V. L.; Siemiginowska, A.

    2001-12-01

    Please join us in our wider effort to engage the strengths of modern computational statistics methods in solving challenging stellar and solar data analysis problems. As just one example (of a great breadth of possibilities) consider analyzing a spectrum with a very large number of lines. Some of these may be faint, merged, indistinguishable from each other and the underlying smooth continuum. The ensemble of line intensities follows a predictable distribution. The shape of this distribution depends on the properties of the source, e.g., its temperature, abundances, and emission measure. Hence, a better understanding of the distribution of line fluxes in a particular source may tighten our inference for other model parameters such as temperature, even when very few lines are actually easy to distinguish. To take advantage of this structure, we directly model the distribution of the line fluxes rather than fitting each line flux directly or "inverting" the emissivities to get a DEM. Statistically, this strategy reduces the number of free parameters, which we expect will lead to improved statistical properties. We believe this method holds much promise for improved analysis, especially for low-count sources. For example, we expect this method to correctly account for the "pseudo-continuum" that results from the large number of faint, unresolvable lines in X-ray grating spectra. Moreover, our statistical methods should apply directly to other settings involving a multitude of lines, such as timing data. We hope that these methods will increase our statistical power to set the continuum level in the presence of a multitude of lines and to distinguish weak lines from fluctuations in the continuum. Funding for this project partially provided by NSF grant DMS-01-04129 and by NASA contract NAS8-39073 (CXC).

  1. Statistical Tolerance and Clearance Analysis for Assembly

    NASA Technical Reports Server (NTRS)

    Lee, S.; Yi, C.

    1996-01-01

    Tolerance is inevitable because manufacturing exactly equal parts is known to be impossible. Furthermore, the specification of tolerances is an integral part of product design, since tolerances directly affect the assemblability, functionality, manufacturability, and cost effectiveness of a product. In this paper, we present a statistical tolerance and clearance analysis for assembly. Our proposed work is expected to make the following contributions: (i) to help designers evaluate products for assemblability, (ii) to provide a new perspective on tolerance problems, and (iii) to provide a tolerance analysis tool which can be incorporated into a CAD or solid modeling system.
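
    The abstract gives no algorithmic detail; a minimal Monte Carlo tolerance stack-up sketch (with hypothetical part dimensions, and tolerances taken as +/-3 sigma) conveys the statistical approach:

        # Monte Carlo tolerance stack-up: a hypothetical three-part assembly
        # whose clearance is the housing length minus two stacked parts.
        # Dimensions are drawn from normal distributions with the stated
        # tolerance interpreted as +/-3 sigma.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        housing = rng.normal(50.00, 0.10 / 3, n)  # nominal 50.00 mm, tol +/-0.10
        part_a  = rng.normal(24.90, 0.05 / 3, n)  # nominal 24.90 mm, tol +/-0.05
        part_b  = rng.normal(24.90, 0.05 / 3, n)

        clearance = housing - part_a - part_b

        print(f"mean clearance  : {clearance.mean():.4f} mm")
        print(f"std of clearance: {clearance.std():.4f} mm")
        print(f"P(interference) : {(clearance <= 0).mean():.2e}")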

  2. Meaningful statistical analysis of large computational clusters.

    SciTech Connect

    Gentile, Ann C.; Marzouk, Youssef M.; Brandt, James M.; Pebay, Philippe Pierre

    2005-07-01

    Effective monitoring of large computational clusters demands the analysis of a vast amount of raw data from a large number of machines. The fundamental interactions of the system are not, however, well-defined, making it difficult to draw meaningful conclusions from this data, even if one were able to efficiently handle and process it. In this paper we show that computational clusters, because they are composed of a large number of identical machines, behave in a statistically meaningful fashion. We can therefore employ standard statistical methods to derive information about individual systems and their environment and to detect problems sooner than with traditional mechanisms. We discuss design details necessary to use these methods on a large system in a timely and low-impact fashion.
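
    The paper's own methods are not reproduced here; one simple way to exploit the statistical similarity of identical machines is to flag nodes whose metrics deviate from the fleet baseline, sketched below with hypothetical readings and a robust median/MAD z-score:

        # Flag nodes whose metric (e.g. CPU temperature) deviates from the
        # fleet. Because the machines are nominally identical, the fleet
        # median and MAD give a robust baseline; nodes beyond a z-threshold
        # are reported.
        import numpy as np

        def flag_outlier_nodes(metric, threshold=3.5):
            """metric: 1-D array, one reading per node; returns outlier indices."""
            med = np.median(metric)
            mad = np.median(np.abs(metric - med))
            z = 0.6745 * (metric - med) / mad  # approx. standard-normal scale
            return np.where(np.abs(z) > threshold)[0]

        readings = np.array([61.2, 60.8, 61.5, 79.9, 60.9, 61.1])  # hypothetical
        print(flag_outlier_nodes(readings))  # -> [3]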

  3. Apparatus for statistical time-series analysis of electrical signals

    NASA Technical Reports Server (NTRS)

    Stewart, C. H. (Inventor)

    1973-01-01

    An apparatus for performing statistical time-series analysis of complex electrical signal waveforms, permitting prompt and accurate determination of statistical characteristics of the signal is presented.

  4. Statistical analysis of life history calendar data.

    PubMed

    Eerola, Mervi; Helske, Satu

    2016-04-01

    The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis with the model-free data mining method, sequence analysis. In event history analysis, instead of transition hazards we estimate the cumulative prediction probabilities of life events over the entire trajectory. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns, while event history analysis is needed for causal inquiries. PMID:23117406

  5. Statistical analysis of extreme auroral electrojet indices

    NASA Astrophysics Data System (ADS)

    Nakamura, Masao; Yoneda, Asato; Oda, Mitsunobu; Tsubouchi, Ken

    2015-09-01

    Extreme auroral electrojet activities can damage electrical power grids due to large induced currents in the Earth, degrade radio communications and navigation systems due to ionospheric disturbances, and cause polar-orbiting satellite anomalies due to enhanced auroral electron precipitation. Statistical estimation of extreme auroral electrojet activities is an important factor in space weather research. For this estimation, we utilize extreme value theory (EVT), which focuses on the statistical behavior in the tail of a distribution. As a measure of auroral electrojet activities, the auroral electrojet indices AL, AU, and AE are used, which describe the maximum current strength of the westward and eastward auroral electrojets and the sum of the two oppositely directed electrojets in the auroral-latitude ionosphere, respectively. We provide statistical evidence for finite upper limits to AL and AU and estimate the annual expected number and probable intensity of their extreme events. We detect two different types of extreme AE events; therefore, application of the appropriate EVT analysis to AE is difficult.
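
    A hedged sketch of a peaks-over-threshold analysis of this kind (synthetic data standing in for the AL index; not the authors' procedure): fit a generalized Pareto distribution to threshold exceedances and inspect the shape parameter, whose negative sign implies a finite limit:

        # Peaks-over-threshold extreme value analysis: fit a generalized
        # Pareto distribution (GPD) to exceedances below a low threshold and
        # check the sign of the shape parameter (xi < 0 implies a finite
        # limit). The synthetic series below is bounded, so xi should be < 0.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        al_index = -3000.0 * rng.beta(2.0, 5.0, size=50_000)  # stand-in for AL (nT)

        threshold = np.quantile(al_index, 0.01)      # extreme = most negative AL
        exceedances = threshold - al_index[al_index < threshold]

        xi, loc, sigma = stats.genpareto.fit(exceedances, floc=0.0)
        print(f"shape xi = {xi:.3f}, scale = {sigma:.1f}")
        if xi < 0:
            print(f"implied finite lower limit on AL: {threshold - sigma / -xi:.0f} nT")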

  6. Statistical Hot Channel Analysis for the NBSR

    SciTech Connect

    Cuadra A.; Baek J.

    2014-05-27

    A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.

  7. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, and this has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. Conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  8. Statistical Analysis to Select Evacuation Route

    NASA Astrophysics Data System (ADS)

    Musyarof, Z.; Sutarto, D. Y.; Atika, D. R.; Fajriya Hakim, R. B.

    2015-06-01

    Each country should be responsible for the safety of its people, especially those living in disaster-prone areas. One such service is providing evacuation routes. Until now, however, the selection of evacuation routes does not seem to have been well organized: when a disaster happens, people accumulate on the steps of the evacuation route. That condition is dangerous because it hampers the evacuation process. Using several methods of statistical analysis, the authors suggest how to prepare an evacuation route that is organized and based on people's habits. Those methods are association rules, sequential pattern mining, hierarchical cluster analysis, and fuzzy logic.

  9. Statistical utopianism in the age of aristocratic efficiency.

    PubMed

    Porter, Theodore

    2002-01-01

    The modern history of science is commonly associated with an inexorable move toward increasing specialization and, perhaps, a proliferation of expert discourses at the expense of public discourse. This paper concerns the standing of science as a basis for public authority in late-Victorian and Edwardian Britain, and suggests that, in relation to the political order, this standing remained tenuous. These themes are exemplified by the career of Karl Pearson, founder of the modern school of mathematical statistics and something of a social visionary. Like Huxley and other scientific naturalists, Pearson wished to incorporate science into a reinvigorated "general culture" and in this way to reshape an elite. Statistics, seemingly the archetypal form of specialist expertise, was conceived as an almost utopian program to advance intelligence and morality in what he sometimes referred to as a new aristocracy. PMID:12385323

  10. FRATS: Functional Regression Analysis of DTI Tract Statistics

    PubMed Central

    Zhu, Hongtu; Styner, Martin; Tang, Niansheng; Liu, Zhexing; Lin, Weili; Gilmore, John H.

    2010-01-01

    Diffusion tensor imaging (DTI) provides important information on the structure of white matter fiber bundles as well as detailed tissue properties along these fiber bundles in vivo. This paper presents a functional regression framework, called FRATS, for the analysis of multiple diffusion properties along fiber bundles as functions in an infinite dimensional space and their association with a set of covariates of interest, such as age, diagnostic status and gender, in real applications. The functional regression framework consists of four integrated components: the local polynomial kernel method for smoothing multiple diffusion properties along individual fiber bundles, a functional linear model for characterizing the association between fiber bundle diffusion properties and a set of covariates, a global test statistic for testing hypotheses of interest, and a resampling method for approximating the p-value of the global test statistic. The proposed methodology is applied to characterizing the development of five diffusion properties, including fractional anisotropy, mean diffusivity, and the three eigenvalues of the diffusion tensor, along the splenium of the corpus callosum tract and the right internal capsule tract in a clinical study of neurodevelopment. Significant age and gestational age effects on the five diffusion properties were found in both tracts. The resulting analysis pipeline can be used for understanding normal brain development, the neural bases of neuropsychiatric disorders, and the joint effects of environmental and genetic factors on white matter fiber bundles. PMID:20335089
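
    The first component can be caricatured in a few lines; the sketch below uses a local-constant (Nadaraya-Watson) smoother on synthetic arc-length data, rather than the local polynomial method of the FRATS pipeline:

        # Kernel smoothing of a diffusion property (e.g. FA) sampled at
        # arc-length positions along one fiber bundle. FRATS uses a local
        # polynomial kernel method; this local-constant version conveys the
        # idea of estimating a smooth function of arc length.
        import numpy as np

        def kernel_smooth(s, y, grid, bandwidth=0.05):
            """Smooth y(s) onto grid positions with a Gaussian kernel."""
            w = np.exp(-0.5 * ((grid[:, None] - s[None, :]) / bandwidth) ** 2)
            return (w @ y) / w.sum(axis=1)

        rng = np.random.default_rng(2)
        s = np.sort(rng.uniform(0, 1, 200))              # arc-length positions
        fa = 0.5 + 0.2 * np.sin(2 * np.pi * s) + 0.03 * rng.normal(size=200)
        grid = np.linspace(0, 1, 50)
        fa_smooth = kernel_smooth(s, fa, grid)
        print(fa_smooth[:5])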

  11. Environmental studies: Mathematical, computational, and statistical analysis

    SciTech Connect

    Wheeler, M.F.

    1996-12-31

    The Summer Program on Mathematical, Computational, and Statistical Analyses in Environmental Studies held 6--31 July 1992 was designed to provide a much needed interdisciplinary forum for joint exploration of recent advances in the formulation and application of (A) environmental models, (B) environmental data and data assimilation, (C) stochastic modeling and optimization, and (D) global climate modeling. These four conceptual frameworks provided common themes among a broad spectrum of specific technical topics at this workshop. The program brought together physical concepts and processes such as chemical kinetics, atmospheric dynamics, cloud physics and dynamics, flow in porous media, remote sensing, climate statistics, stochastic processes, parameter identification, model performance evaluation, aerosol physics and chemistry, and data sampling, together with mathematical concepts in stiff differential systems, advective-diffusive-reactive PDEs, inverse scattering theory, time series analysis, particle dynamics, stochastic equations, optimal control, and others. Nineteen papers are presented in this volume. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  12. Multivariate statistical analysis of wildfires in Portugal

    NASA Astrophysics Data System (ADS)

    Costa, Ricardo; Caramelo, Liliana; Pereira, Mário

    2013-04-01

    Several studies demonstrate that wildfires in Portugal present high temporal and spatial variability as well as cluster behavior (Pereira et al., 2005, 2011). This study aims to contribute to the characterization of the fire regime in Portugal through multivariate statistical analysis of the time series of the number of fires and area burned in Portugal during the 1980-2009 period. The data used in the analysis is an extended version of the Rural Fire Portuguese Database (PRFD) (Pereira et al., 2011), provided by the National Forest Authority (Autoridade Florestal Nacional, AFN), the Portuguese Forest Service, which includes information for more than 500,000 fire records. There are many advanced techniques for examining the relationships among multiple time series at the same time (e.g., canonical correlation analysis, principal components analysis, factor analysis, path analysis, multiple analyses of variance, clustering systems). This study compares and discusses the results obtained with these different techniques. Pereira, M.G., Trigo, R.M., DaCamara, C.C., Pereira, J.M.C., Leite, S.M., 2005: "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology, 129, 11-25. Pereira, M. G., Malamud, B. D., Trigo, R. M., and Alves, P. I.: The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, doi:10.5194/nhess-11-3343-2011, 2011. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692, the project FLAIR (PTDC/AAC-AMB/104702/2008) and the EU 7th Framework Program through FUME (contract number 243888).

  13. On intracluster Faraday rotation. II - Statistical analysis

    NASA Technical Reports Server (NTRS)

    Lawler, J. M.; Dennison, B.

    1982-01-01

    The comparison of a reliable sample of radio source Faraday rotation measurements seen through rich clusters of galaxies with sources seen through the outer parts of clusters (and therefore having little intracluster Faraday rotation) indicates that the distribution of rotation in the former population is broadened, but only at the 80% level of statistical confidence. Employing a physical model for the intracluster medium in which the magnetic field strength times the square root of the number of turbulent cells per gas core radius is approximately 0.07 microgauss, a Monte Carlo simulation is able to reproduce the observed broadening. An upper-limit analysis gives less than 0.20 microgauss for this quantity, which, combined with lower limits on field strength imposed by limitations on the Compton-scattered flux, shows that intracluster magnetic fields must be tangled on scales greater than about 20 kpc.

  14. Statistical analysis of ultrasonic measurements in concrete

    NASA Astrophysics Data System (ADS)

    Chiang, Chih-Hung; Chen, Po-Chih

    2002-05-01

    Stress wave techniques such as measurements of ultrasonic pulse velocity are often used to evaluate concrete quality in structures. For proper interpretation of measurement results, the dependence of pulse transit time on the average acoustic impedance and the material homogeneity along the sound path need to be examined. Semi-direct measurement of pulse velocity can be more convenient than through-transmission measurement, since it is not necessary to access both sides of concrete floors or walls. A novel measurement scheme is proposed and verified based on statistical analysis. It is shown that semi-direct measurements are very effective for gathering large amounts of pulse velocity data from concrete reference specimens. The variability of the measurements is comparable with that reported by the American Concrete Institute using either break-off or pullout tests.

  15. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  16. Statistical Analysis of Cardiovascular Data from FAP

    NASA Technical Reports Server (NTRS)

    Sealey, Meghan

    2016-01-01

    pressure, etc.) to see which could best predict how long the subjects could tolerate the tilt tests. With this I plan to analyze an artificial gravity study in order to determine the effects of orthostatic intolerance during spaceflight. From these projects, I became proficient in using the statistical software Stata, which I had never used before. I learned new statistical methods, such as mixed-effects linear regression, maximum likelihood estimation on longitudinal data, and post model-fitting tests to see whether certain parameters contribute significantly to the model, all of which will better my understanding when I continue studying for my master's degree. I was also able to demonstrate my knowledge of statistics by helping other students run statistical analyses for their own projects. The experience and knowledge gained from completing this analysis exemplify the type of work that I would like to pursue in the future. After completing my master's degree, I plan to pursue a career in biostatistics, which is exactly the position in which I interned, and I plan to use this experience to contribute to that goal.

  17. Statistical Analysis of Streamflow Trends in Slovenia

    NASA Astrophysics Data System (ADS)

    Jurko, M.; Kobold, M.; Mikoš, M.

    2009-04-01

    In the context of climate change, trends in river discharges were analyzed, showing the hydrological changes and projected future hydrological behaviour in Slovenia. In recent years, droughts and floods have become more and more frequent. In the statistical analysis of streamflow trends of Slovenian rivers, available data on the low, mean and high discharges were examined using mean daily discharges and the Hydrospect software, which was developed under the auspices of the WMO for detecting changes in hydrological data (Kundzewicz and Robson, 2000). The Mann-Kendall test was applied for the estimation of trends in the river flow index series. Trend analysis requires long records of observation to distinguish climate change-induced trends from climate variability. The problems of missing values, seasonal and other short-term fluctuations or anthropogenic impacts, and lack of homogeneity of data due to changes in instruments and observation techniques are frequently present in existing hydrological data sets. Therefore the analysis was carried out for 77 water gauging stations representatively distributed across Slovenia with sufficiently long and reliable continuous data sets. The average length of the data sets from the selected water gauging stations is about 50 years. Different indices were used to assess the temporal variation of discharges: annual mean daily discharge, annual maximum daily discharge, two magnitude and frequency series by the peak-over-threshold (POT) approach (POT1 and POT3), and two low flow indices describing different durations of low flows (7 and 30 days). The clustering method was used to classify the results of the trends into groups. The assumption of a general decrease of water quantities in Slovenian rivers was confirmed. The annual mean daily discharges of the analyzed water gauging stations show a significant negative trend for the majority of the stations. The annual minimum 7-day and 30-day discharges show similar results with lower statistical significance.
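
    For readers wishing to reproduce the trend-detection step, a minimal Mann-Kendall test on an annual discharge series (a sketch assuming no ties, not the Hydrospect implementation) is:

        # Mann-Kendall trend test for an annual discharge series: the S
        # statistic counts concordant minus discordant pairs; under H0 (no
        # trend) the normalized statistic Z is approximately standard normal
        # (no-ties variance formula used here).
        import numpy as np
        from scipy import stats

        def mann_kendall(x):
            x = np.asarray(x, dtype=float)
            n = len(x)
            s = np.sum(np.sign(x[None, :] - x[:, None])[np.triu_indices(n, k=1)])
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            p = 2 * stats.norm.sf(abs(z))
            return z, p

        rng = np.random.default_rng(4)
        flow = 100 - 0.5 * np.arange(50) + rng.normal(0, 5, 50)  # declining series
        print(mann_kendall(flow))  # negative Z, small p -> significant decrease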

  18. Statistical image analysis of longitudinal RAVENS images

    PubMed Central

    Lee, Seonjoo; Zipunnikov, Vadim; Reich, Daniel S.; Pham, Dzung L.

    2015-01-01

    Regional analysis of volumes examined in normalized space (RAVENS) images are transformation-based images used in the study of brain morphometry. In this paper, RAVENS images are analyzed using a longitudinal variant of voxel-based morphometry (VBM) and longitudinal functional principal component analysis (LFPCA) for high-dimensional images. We demonstrate that the latter overcomes the limitations of standard longitudinal VBM analyses, which do not separate registration errors from other longitudinal changes and baseline patterns. This is especially important in contexts where longitudinal changes are only a small fraction of the overall observed variability, which is typical in normal aging and many chronic diseases. Our simulation study shows that LFPCA effectively separates registration error from baseline and longitudinal signals of interest by decomposing RAVENS images measured at multiple visits into three components: a subject-specific imaging random intercept that quantifies the cross-sectional variability, a subject-specific imaging slope that quantifies the irreversible changes over multiple visits, and a subject-visit specific imaging deviation. We describe strategies to identify baseline/longitudinal variation and registration errors combined with covariates of interest. Our analysis suggests that specific regional brain atrophy and ventricular enlargement are associated with multiple sclerosis (MS) disease progression. PMID:26539071

  19. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
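
    The record is truncated, but the described pipeline can be sketched as follows; the standard large-sample normal approximation stands in here for whatever Monte Carlo normalization the authors used:

        # Running Mann-Whitney Z: compare each moving window against the rest
        # of the series. U is converted to Z by the usual normal approximation
        # (a stand-in for the abstract's Monte Carlo normalization).
        import numpy as np
        from scipy.stats import mannwhitneyu

        def running_mw_z(x, window):
            x = np.asarray(x, dtype=float)
            z = np.full(len(x) - window + 1, np.nan)
            for i in range(len(z)):
                inside = x[i:i + window]
                outside = np.concatenate([x[:i], x[i + window:]])
                u = mannwhitneyu(inside, outside).statistic
                m, n = len(inside), len(outside)
                mu = m * n / 2.0
                sigma = np.sqrt(m * n * (m + n + 1) / 12.0)
                z[i] = (u - mu) / sigma
            return z

        rng = np.random.default_rng(5)
        series = np.concatenate([rng.normal(0, 1, 60), rng.normal(2, 1, 60)])
        print(running_mw_z(series, 20).round(2))  # Z rises at the shift point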

  20. Calculating summary statistics for population chemical biomonitoring in women of childbearing age with adjustment for age-specific natality.

    PubMed

    Axelrad, Daniel A; Cohen, Jonathan

    2011-01-01

    The effects of chemical exposures during pregnancy on children's health have been an increasing focus of environmental health research in recent years, leading to greater interest in biomonitoring of chemicals in women of childbearing age in the general population. Measurements of mercury in blood from the National Health and Nutrition Examination Survey are frequently reported for "women of childbearing age," defined to be of ages 16-49 years. The intent is to represent prenatal chemical exposure, but blood mercury levels increase with age. Furthermore, women of different ages have different probabilities of giving birth. We evaluated options to address potential bias in biomonitoring summary statistics for women of childbearing age by accounting for age-specific probabilities of giving birth. We calculated median and 95th percentile levels of mercury, PCBs, and cotinine using these approaches: option 1: women aged 16-49 years without natality adjustment; option 2: women aged 16-39 years without natality adjustment; option 3: women aged 16-49 years, adjusted for natality by age; option 4: women aged 16-49 years, adjusted for natality by age and race/ethnicity. Among the three chemicals examined, the choice of option has the greatest impact on estimated levels of serum PCBs, which are strongly associated with age. Serum cotinine levels among Black non-Hispanic women of childbearing age are understated when age-specific natality is not considered. For characterizing in utero exposures, adjustment using age-specific natality provides a substantial improvement in estimation of biomonitoring summary statistics. PMID:21035114
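
    A hedged sketch of the natality adjustment (options 3 and 4): weight each respondent's biomarker value by the birth probability for her age and compute weighted percentiles. All rates and exposure values below are hypothetical stand-ins, not NHANES or vital-statistics data:

        # Natality-adjusted summary statistics: each respondent's biomarker
        # value is weighted by a (hypothetical) probability that a woman of
        # her age gives birth, so the summary better reflects prenatal
        # exposure than an unweighted percentile over ages 16-49.
        import numpy as np

        def weighted_percentile(values, weights, q):
            order = np.argsort(values)
            v, w = np.asarray(values)[order], np.asarray(weights)[order]
            cum = np.cumsum(w) / np.sum(w)
            return np.interp(q / 100.0, cum, v)

        rng = np.random.default_rng(6)
        ages = rng.integers(16, 50, 500)
        mercury = rng.lognormal(mean=-0.5 + 0.02 * (ages - 16), sigma=0.6)

        # hypothetical births-per-woman rates by age band
        natality = np.where(ages < 20, 0.02, np.where(ages < 30, 0.10,
                   np.where(ages < 40, 0.06, 0.01)))

        print("unadjusted 95th  :", weighted_percentile(mercury, np.ones_like(mercury), 95))
        print("natality-adj 95th:", weighted_percentile(mercury, natality, 95))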

  1. Statistical Analysis of Nondisjunction Assays in Drosophila

    PubMed Central

    Zeng, Yong; Li, Hua; Schweppe, Nicole M.; Hawley, R. Scott; Gilliland, William D.

    2010-01-01

    Many advances in the understanding of meiosis have been made by measuring how often errors in chromosome segregation occur. This process of nondisjunction can be studied by counting experimental progeny, but direct measurement of nondisjunction rates is complicated by not all classes of nondisjunctional progeny being viable. For X chromosome nondisjunction in Drosophila female meiosis, all of the normal progeny survive, while nondisjunctional eggs produce viable progeny only if fertilized by sperm that carry the appropriate sex chromosome. The rate of nondisjunction has traditionally been estimated by assuming a binomial process and doubling the number of observed nondisjunctional progeny, to account for the inviable classes. However, the correct way to derive statistics (such as confidence intervals or hypothesis testing) by this approach is far from clear. Instead, we use the multinomial-Poisson hierarchy model and demonstrate that the old estimator is in fact the maximum-likelihood estimator (MLE). Under more general assumptions, we derive asymptotic normality of this estimator and construct confidence interval and hypothesis testing formulae. Confidence intervals under this framework are always larger than under the binomial framework, and application to published data shows that use of the multinomial approach can avoid an apparent type 1 error made by use of the binomial assumption. The current study provides guidance for researchers designing genetic experiments on nondisjunction and improves several methods for the analysis of genetic data. PMID:20660647
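
    The traditional doubling estimator is easy to state in code; the sketch below pairs it with a parametric bootstrap interval as a stand-in for the paper's closed-form asymptotic confidence intervals (counts are hypothetical):

        # Doubling estimator for X nondisjunction in Drosophila: only half of
        # the nondisjunctional eggs yield viable progeny, so the observed
        # count is doubled. A parametric bootstrap gives a confidence
        # interval (the paper derives closed-form asymptotic intervals).
        import numpy as np

        def ndj_rate(normal, ndj_observed):
            # traditional estimator: double the observed NDJ progeny
            return 2 * ndj_observed / (normal + 2 * ndj_observed)

        def bootstrap_ci(normal, ndj_observed, reps=20_000, alpha=0.05, seed=7):
            rng = np.random.default_rng(seed)
            p = ndj_rate(normal, ndj_observed)
            eggs = normal + 2 * ndj_observed          # inferred total egg count
            ndj_eggs = rng.binomial(eggs, p, reps)    # eggs that nondisjoin
            sim_ndj = rng.binomial(ndj_eggs, 0.5)     # only half yield progeny
            sim_norm = eggs - ndj_eggs                # normal eggs all survive
            est = ndj_rate(sim_norm, sim_ndj)
            return p, np.quantile(est, [alpha / 2, 1 - alpha / 2])

        print(bootstrap_ci(normal=950, ndj_observed=25))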

  2. Statistical approach to partial equilibrium analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, named willingness price, is highlighted and constitutes the whole theory. The supply and demand functions are formulated as the distributions of corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of excess demand function are analyzed and the necessary conditions for the existence and uniqueness of equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.

  3. Statistical energy analysis of nonlinear vibrating systems.

    PubMed

    Spelman, G M; Langley, R S

    2015-09-28

    Nonlinearities in practical systems can arise in contacts between components, possibly from friction or impacts. However, it is also known that quadratic and cubic nonlinearity can occur in the stiffness of structural elements undergoing large amplitude vibration, without the need for local contacts. Nonlinearity due purely to large amplitude vibration can then result in significant energy being found in frequency bands other than those being driven by external forces. To analyse this phenomenon, a method is developed here in which the response of the structure in the frequency domain is divided into frequency bands, and the energy flow between the frequency bands is calculated. The frequency bands are assigned an energy variable to describe the mean response and the nonlinear coupling between bands is described in terms of weighted summations of the convolutions of linear modal transfer functions. This represents a nonlinear extension to an established linear theory known as statistical energy analysis (SEA). The nonlinear extension to SEA theory is presented for the case of a plate structure with quadratic and cubic nonlinearity. PMID:26303923

  4. Refining Martian Ages and Understanding Geological Processes From Cratering Statistics

    NASA Technical Reports Server (NTRS)

    Hartmann, William K.

    2005-01-01

    Senior Scientist William K. Hartman presents his final report on Mars Data Analysis Program grant number NAG5-12217: The third year of the three-year program was recently completed in mid-2005. The program has been extremely productive in research and data analysis regarding Mars, especially using Mars Global Surveyor and Mars Odyssey imagery. In the 2005 alone, three papers have already been published, to which this work contributed.1) Hartmann, W. K. 200.5. Martian cratering 8. Isochron refinement and the history of Martian geologic activity Icarus 174, 294-320. This paper is a summary of my entire program of establishing Martian chronology through counts of Martian impact craters. 2) Arfstrom, John, and W. K. Hartmann 2005. Martian flow features, moraine-like rieges, and gullies: Terrestrial analogs and interrelationships. Icarus 174,32 1-335. This paper makes pioneering connections between Martian glacier-like features and terrestrial glacial features. 3) Hartmann, W.K., D. Winterhalter, and J. Geiss. 2005 Chronology and Physical Evolution of Planet Mars. In The Solar System and Beyond: Ten Years of ISSI (Bern: International Space Science Institute). This is a summary of work conducted at the International Space Science Institute with an international team, emphasizing our publication of a conference volume about Mars, edited by Hartmann and published in 2001.

  5. Web-Based Statistical Sampling and Analysis

    ERIC Educational Resources Information Center

    Quinn, Anne; Larson, Karen

    2016-01-01

    Consistent with the Common Core State Standards for Mathematics (CCSSI 2010), the authors write that they have asked students to do statistics projects with real data. To obtain real data, their students use the free Web-based app, Census at School, created by the American Statistical Association (ASA) to help promote civic awareness among school…

  6. Statistical Error Analysis for Digital Recursive Filters

    NASA Astrophysics Data System (ADS)

    Wu, Kevin Chi-Rung

    The study of arithmetic roundoff error has attracted many researchers to investigate how the signal-to-noise ratio (SNR) is affected by algorithmic parameters, especially since the VLSI (Very Large Scale Integrated circuits) technologies have become more promising for digital signal processing. Typically, digital signal processing involving, either with or without matrix inversion, will have tradeoffs on speed and processor cost. Hence, the problems of an area-time efficient matrix computation and roundoff error behavior analysis will play an important role in this dissertation. A newly developed non-Cholesky square-root matrix will be discussed which precludes the arithmetic roundoff error over some interesting operations, such as complex -valued matrix inversion with its SNR analysis and error propagation effects. A non-CORDIC parallelism approach for complex-valued matrix will be presented to upgrade speed at the cost of moderate increase of processor. The lattice filter will also be looked into, in such a way, that one can understand the SNR behavior under the conditions of different inputs in the joint process system. Pipelining technique will be demonstrated to manifest the possibility of high-speed non-matrix-inversion lattice filter. Floating point arithmetic modelings used in this study have been focused on effective methodologies that have been proved to be reliable and feasible. With the models in hand, we study the roundoff error behavior based on some statistical assumptions. Results are demonstrated by carrying out simulation to show the feasibility of SNR analysis. We will observe that non-Cholesky square-root matrix has advantage of saving a time of O(n^3) as well as a reduced realization cost. It will be apparent that for a Kalman filter the register size is increasing significantly, if pole of the system matrix is moving closer to the edge of the unit circle. By comparing roundoff error effect due to floating-point and fixed-point arithmetics, we

  7. Practical Issues in Component Aging Analysis

    SciTech Connect

    Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel

    2008-09-01

    This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair-same-as-new, leading to a renewal process, and repair-same-as-old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process. The common distribution of the times between failures is called the renewal distribution. Under the second process, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues that are discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
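
    One simple nonparametric check of the kind alluded to above is the Laplace trend test, sketched here with hypothetical failure times (the paper's own tests are not reproduced):

        # Laplace trend test: under a homogeneous Poisson process the failure
        # times are uniform on (0, T], so the centered, scaled mean is
        # approximately N(0,1). A large |U| suggests a trend (repair-same-as-
        # old / NHPP) rather than a stationary renewal process.
        import numpy as np
        from scipy import stats

        def laplace_test(failure_times, T):
            t = np.asarray(failure_times, dtype=float)
            n = len(t)
            u = (t.mean() - T / 2.0) / (T * np.sqrt(1.0 / (12.0 * n)))
            return u, 2 * stats.norm.sf(abs(u))

        times = [210.0, 850.0, 1400.0, 1750.0, 1950.0, 2050.0]  # hypothetical hours
        print(laplace_test(times, T=2200.0))  # failures cluster late -> positive U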

  8. Statistical Analysis of Examination to Detect Cheating.

    ERIC Educational Resources Information Center

    Code, Ronald P.

    1985-01-01

    A number of statistical procedures that were developed in 1983 at the University of Medicine and Dentistry of New Jersey-Rutgers Medical School to verify the suspicion that a student cheated during an examination are described. (MLW)

  9. Detection and analysis of statistical differences in anatomical shape.

    PubMed

    Golland, Polina; Grimson, W Eric L; Shenton, Martha E; Kikinis, Ron

    2005-02-01

    We present a computational framework for image-based analysis and interpretation of statistical differences in anatomical shape between populations. Applications of such analysis include understanding developmental and anatomical aspects of disorders when comparing patients versus normal controls, studying morphological changes caused by aging, or even differences in normal anatomy, for example, differences between genders. Once a quantitative description of organ shape is extracted from input images, the problem of identifying differences between the two groups can be reduced to one of the classical questions in machine learning of constructing a classifier function for assigning new examples to one of the two groups while making as few misclassifications as possible. The resulting classifier must be interpreted in terms of shape differences between the two groups back in the image domain. We demonstrate a novel approach to such interpretation that allows us to argue about the identified shape differences in anatomically meaningful terms of organ deformation. Given a classifier function in the feature space, we derive a deformation that corresponds to the differences between the two classes while ignoring shape variability within each class. Based on this approach, we present a system for statistical shape analysis using distance transforms for shape representation and the support vector machines learning algorithm for the optimal classifier estimation and demonstrate it on artificially generated data sets, as well as real medical studies. PMID:15581813
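
    The classification step can be caricatured with a linear support vector machine on shape feature vectors, whose weight vector gives the discriminative direction one would map back to the image domain; the features below are synthetic stand-ins for the distance-transform descriptors:

        # Linear SVM on shape descriptors: the fitted weight vector defines
        # the direction in feature space separating the groups, which is the
        # quantity interpreted as a deformation between populations.
        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(8)
        controls = rng.normal(0.0, 1.0, size=(40, 10))
        patients = rng.normal(0.0, 1.0, size=(40, 10))
        patients[:, 2] += 1.5                  # group difference in one feature

        X = np.vstack([controls, patients])
        y = np.repeat([0, 1], 40)

        clf = LinearSVC(C=1.0, max_iter=10_000).fit(X, y)
        w = clf.coef_.ravel()
        print("discriminative direction:", np.round(w / np.linalg.norm(w), 2))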

  10. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  11. Statistical Analysis of Refractivity in UAE

    NASA Astrophysics Data System (ADS)

    Al-Ansari, Kifah; Al-Mal, Abdulhadi Abu; Kamel, Rami

    2007-07-01

    This paper presents the results of the refractivity statistics in the UAE (United Arab Emirates) for a period of 14 years (1990-2003). Six sites have been considered using meteorological surface data (Abu Dhabi, Dubai, Sharjah, Al-Ain, Ras Al-Kaimah, and Al-Fujairah). Upper air (radiosonde) data were available at one site only, Abu Dhabi airport, which has been considered for the refractivity gradient statistics. Monthly and yearly averages are obtained for the two parameters, refractivity and refractivity gradient. Cumulative distributions are also provided.

  12. Statistical measures for workload capacity analysis.

    PubMed

    Houpt, Joseph W; Townsend, James T

    2012-10-01

    A critical component of how we understand a mental process is given by measuring the effect of varying the workload. The capacity coefficient (Townsend & Nozawa, 1995; Townsend & Wenger, 2004) is a measure on response times for quantifying changes in performance due to workload. Despite its precise mathematical foundation, until now rigorous statistical tests have been lacking. In this paper, we demonstrate statistical properties of the components of the capacity measure and propose a significance test for comparing the capacity coefficient to a baseline measure or two capacity coefficients to each other. PMID:23175582

  13. EXPERIMENTAL DESIGN: STATISTICAL CONSIDERATIONS AND ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this book chapter, information on how field experiments in invertebrate pathology are designed and the data collected, analyzed, and interpreted is presented. The practical and statistical issues that need to be considered and the rationale and assumptions behind different designs or procedures ...

  14. Measurement of Plethysmogram and Statistical Method for Analysis

    NASA Astrophysics Data System (ADS)

    Shimizu, Toshihiro

    The plethysmogram is measured at different points of the human body by using a photo interrupter, and it depends sensitively on the physical and mental state of the body. In this paper, statistical methods of data analysis are investigated to discuss the dependence of the plethysmogram on stress and aging. The first is the representation method based on the return map, which provides useful information on the waveform, the fluctuation in phase, and the fluctuation in amplitude. The return map method makes it possible to understand the fluctuation of the plethysmogram in amplitude and in phase more clearly and globally than the conventional power spectrum method. The second is the Lissajous plot and the correlation function, used to analyze the phase difference between the plethysmograms of the right finger tip and of the left finger tip. The third is the R-index, from which we can estimate "the age of the blood flow". The R-index is defined by the global character of the plethysmogram, which is different from the usual APG index. The stress- and age-dependence of the plethysmogram is discussed using these methods.
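
    A return map of the kind described can be generated from successive peak amplitudes and peak intervals; the signal below is a synthetic stand-in for a measured plethysmogram:

        # Return map for a quasi-periodic signal: pair successive peak
        # amplitudes (a[k+1] vs a[k]) and successive peak intervals.
        # Amplitude fluctuations spread points off the diagonal of the
        # amplitude map; phase fluctuations spread the interval map.
        import numpy as np
        from scipy.signal import find_peaks

        rng = np.random.default_rng(9)
        t = np.linspace(0, 60, 6000)
        pulse = np.sin(2 * np.pi * 1.2 * t) * (1 + 0.1 * rng.normal(size=t.size))

        peaks, _ = find_peaks(pulse, distance=50)
        amps = pulse[peaks]
        intervals = np.diff(t[peaks])

        print("amplitude return pairs:", np.c_[amps[:-1], amps[1:]][:3])
        print("interval return pairs :", np.c_[intervals[:-1], intervals[1:]][:3])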

  15. Common misconceptions about data analysis and statistics.

    PubMed

    Motulsky, Harvey J

    2015-02-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: (1) P-Hacking. This is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want. (2) Overemphasis on P values rather than on the actual size of the observed effect. (3) Overuse of statistical hypothesis testing, and being seduced by the word "significant". (4) Overreliance on standard errors, which are often misunderstood. PMID:25692012

  16. Common misconceptions about data analysis and statistics.

    PubMed

    Motulsky, Harvey J

    2014-10-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, however, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: 1) P-hacking, which is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want; 2) overemphasis on P values rather than on the actual size of the observed effect; 3) overuse of statistical hypothesis testing, and being seduced by the word "significant"; and 4) over-reliance on standard errors, which are often misunderstood. PMID:25204545

  17. Common misconceptions about data analysis and statistics.

    PubMed

    Motulsky, Harvey J

    2014-11-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: 1. P-Hacking. This is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want. 2. Overemphasis on P values rather than on the actual size of the observed effect. 3. Overuse of statistical hypothesis testing, and being seduced by the word "significant". 4. Overreliance on standard errors, which are often misunderstood. PMID:25213136

  18. A statistical package for computing time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Brownlow, J.

    1978-01-01

    The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.

  19. Computer program performs statistical analysis for random processes

    NASA Technical Reports Server (NTRS)

    Newberry, M. H.

    1966-01-01

    Random Vibration Analysis Program /RAVAN/ performs statistical analysis on a number of phenomena associated with flight and captive tests, but can also be used in analyzing data from many other random processes.

  20. Personal Ad Content Analysis Teaches Statistical Applications.

    ERIC Educational Resources Information Center

    Rajecki, D. W.

    2002-01-01

    Focuses on an undergraduate student project which asked students to write a paper based on their examination of age preferences indicated by writers of personal advertisements appearing in newspapers. Reports on the student responses to this project using a questionnaire. Examines the student scores on the final examination for the course. (CMK)

  1. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction using the Monte Carlo method, due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5 with additional stochastic inputs for the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method based on the probability density function of each nuclide composition. The automatic passing of the stochastic inputs to MCNP and the repeated criticality calculations are made possible by a Python script that links MCNP and our Latin hypercube sampling code.
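
    The sampling step can be illustrated generically (a Latin hypercube sketch with hypothetical nuclide fractions, not the authors' script; in the paper each sampled row would be written out as an MCNP input):

        # Latin hypercube sampling of isotopic fractions: each of the n
        # samples falls in a distinct stratum of each parameter's
        # distribution, giving better coverage than plain random sampling.
        import numpy as np
        from scipy import stats

        def latin_hypercube(n, dists, seed=10):
            rng = np.random.default_rng(seed)
            strata = np.tile(np.arange(n), (len(dists), 1))
            u = (rng.permuted(strata, axis=1)
                 + rng.uniform(size=(len(dists), n))) / n
            return np.column_stack([d.ppf(ui) for d, ui in zip(dists, u)])

        dists = [stats.norm(0.0420, 0.0004),   # hypothetical U-235 fraction
                 stats.norm(0.9560, 0.0004)]   # hypothetical U-238 fraction
        samples = latin_hypercube(100, dists)
        print(samples[:3])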

  2. Statistical Analysis of Random Number Generators

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi; Gäbler, Markus

    2011-01-01

    In many applications, for example cryptography and Monte Carlo simulation, there is a need for random numbers. Any procedure, algorithm or device which is intended to produce such numbers is called a random number generator (RNG). What makes a good RNG? This paper gives an overview of empirical testing of the statistical properties of the sequences produced by RNGs and of special software packages designed for that purpose. We also present the results of applying a particular test suite--TestU01--to a family of RNGs currently being developed at the Centro Interdipartimentale Vito Volterra (CIVV), Roma, Italy.
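
    As a tiny example of the kind of empirical test such suites automate, a chi-square frequency (equidistribution) test on generator output might look like:

        # Chi-square frequency test, one of the simplest tests in suites like
        # TestU01: bin generator output into k cells and compare observed
        # counts with the uniform expectation.
        import numpy as np
        from scipy import stats

        def chi2_frequency_test(u, k=100):
            """u: samples claimed to be i.i.d. uniform on [0, 1)."""
            counts = np.bincount((np.asarray(u) * k).astype(int), minlength=k)
            chi2, p = stats.chisquare(counts)  # H0: equal cell probabilities
            return chi2, p

        u = np.random.default_rng(11).random(1_000_000)
        print(chi2_frequency_test(u))  # large p -> no evidence against H0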

  3. Uncertainty analysis of statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Khan, Mohammad Sajjad; Coulibaly, Paulin; Dibike, Yonas

    2006-03-01

    Three downscaling models, namely the Statistical Down-Scaling Model (SDSM), the Long Ashton Research Station Weather Generator (LARS-WG) model and the Artificial Neural Network (ANN) model, have been compared in terms of various uncertainty assessments of their downscaled results for daily precipitation and daily maximum and minimum temperatures. In the case of daily maximum and minimum temperature, uncertainty is assessed by comparing the monthly mean and variance of downscaled and observed daily maximum and minimum temperature for each month of the year at the 95% confidence level. In addition, uncertainties of the monthly means and variances of downscaled daily temperature have been calculated using 95% confidence intervals, which are compared with the observed uncertainties of means and variances. In daily precipitation downscaling, in addition to comparing means and variances, uncertainties have been assessed by comparing monthly mean dry and wet spell lengths and their confidence intervals, cumulative frequency distributions (cdfs) of monthly means of daily precipitation, and the distributions of monthly wet and dry days for observed and downscaled daily precipitation. The study has been carried out using 40 years (1961-2000) of observed and downscaled daily precipitation and daily maximum and minimum temperature data, using NCEP (National Center for Environmental Prediction) reanalysis predictors. The uncertainty assessment results indicate that the SDSM is the most capable of reproducing the various statistical characteristics of the observed data in its downscaled results at the 95% confidence level, the ANN is the least capable in this respect, and the LARS-WG is in between SDSM and ANN.

  4. On the statistical analysis of maximal magnitude

    NASA Astrophysics Data System (ADS)

    Holschneider, M.; Zöller, G.; Hainzl, S.

    2012-04-01

    We show how the maximum expected magnitude within a time horizon [0,T] may be estimated from earthquake catalog data within the context of truncated Gutenberg-Richter statistics. We present the results in a frequentist and in a Bayesian setting. Instead of deriving point estimates of this parameter and reporting their performance in terms of expectation value and variance, we focus on the calculation of confidence intervals based on an imposed level of confidence α. We present an estimate of the maximum magnitude within an observational time interval T in the future, given a complete earthquake catalog for a time period Tc in the past and optionally some paleoseismic events. We argue that from a statistical point of view the maximum magnitude in a time window is a reasonable parameter for probabilistic seismic hazard assessment, while the commonly used maximum possible magnitude for all times almost certainly does not allow the calculation of useful (i.e. non-trivial) confidence intervals. In the context of an unbounded GR law we show that Jeffreys' invariant prior distribution yields normalizable posteriors. The predictive distribution based on this prior is explicitly computed.

  5. Statistical Seismic Landslide Analysis: an Update

    NASA Astrophysics Data System (ADS)

    Lee, Chyi-Tyi

    2015-04-01

    Landslides are secondary or induced features whose recurrence is controlled by the repetition of triggering events, such as earthquakes or heavy rainfall. This makes seismic landslide hazard analysis more complicated than ordinary seismic hazard analysis, requiring a multi-stage approach. First, susceptibility analysis is used to divide a region into successive classes. Next, a relationship between the probability of landslide failure and earthquake intensity is constructed for each susceptibility class, or a probability-of-failure surface is found using the susceptibility value and earthquake intensity as independent variables for the study region. Then, hazard analysis for the exceedance probability of earthquake intensity is performed. Finally, a map of the spatial probability of landslide failure under a given return-period earthquake is produced. This study uses data for landslides induced by the Chi-Chi earthquake as the training data set for the susceptibility analysis and the probability-of-failure surface analysis. A regular probabilistic seismic hazard analysis is also conducted to map Arias intensities for different return periods. Finally, a seismic landslide hazard map for the whole of Taiwan is provided.

  6. Diagnostic rhyme test statistical analysis programs

    NASA Astrophysics Data System (ADS)

    Sim, A.; Bain, R.; Belyavin, A. J.; Pratt, R. L.

    1991-08-01

    The statistical techniques and associated computer programs used to analyze data from the Diagnostic Rhyme Test (DRT) are described. The DRT is used extensively for assessing the intelligibility of military communications systems and became an accepted NATO standard for testing linear predictive coders. The DRT vocabulary comprises ninety-six minimally contrasting rhyming word pairs, the initial consonants of which differ only by a single acoustic feature, or attribute. There are six such attributes: voicing, nasality, sustention, sibilation, graveness, and compactness. The attribute voicing is present when the vocal cords are excited: in the word pair 'veal-feel', the consonant 'v' is voiced, but the consonant 'f' is unvoiced. The procedure for the implementation of the DRT is presented. To ensure the stability of the results, tests using not less than eight talkers and eight listeners are conducted.

  7. Statistics over features: EEG signals analysis.

    PubMed

    Derya Ubeyli, Elif

    2009-08-01

    This paper presents the use of statistics over the set of features representing electroencephalogram (EEG) signals. Since classification is more accurate when the pattern is simplified through representation by important features, feature extraction and selection play an important role in classifying systems such as neural networks. Multilayer perceptron neural network (MLPNN) architectures were formulated and used as the basis for detection of electroencephalographic changes. Three types of EEG signals (EEG signals recorded from healthy volunteers with eyes open, epilepsy patients in the epileptogenic zone during a seizure-free interval, and epilepsy patients during epileptic seizures) were classified. The selected Lyapunov exponents, wavelet coefficients and the power levels of power spectral density (PSD) values obtained by eigenvector methods of the EEG signals were used as inputs to the MLPNN trained with the Levenberg-Marquardt algorithm. The classification results confirmed that the proposed MLPNN has potential for detecting electroencephalographic changes. PMID:19555931

  8. Statistical analysis of low level atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Tieleman, H. W.; Chen, W. W. L.

    1974-01-01

    The statistical properties of low-level wind-turbulence data were obtained with the model 1080 total vector anemometer and the model 1296 dual split-film anemometer, both manufactured by Thermo Systems Incorporated. The data obtained from these fast-response probes were compared with the results obtained from a pair of Gill propeller anemometers. The digitized time series representing the three velocity components and the temperature were each divided into a number of blocks, the length of which depended on the lowest frequency of interest and also on the storage capacity of the available computer. A moving-average and differencing high-pass filter was used to remove the trend and the low-frequency components in the time series. The calculated results for each of the anemometers used are presented in graphical or tabulated form.
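
    The detrending step described above can be sketched as a moving-average high-pass filter (synthetic wind record; the window length is an arbitrary choice here):

        # Moving-average-and-differencing high-pass filter: subtract a
        # centered running mean from the series, removing trend and
        # low-frequency content before computing turbulence statistics.
        import numpy as np

        def highpass_moving_average(x, window):
            kernel = np.ones(window) / window
            trend = np.convolve(x, kernel, mode="same")
            return x - trend

        rng = np.random.default_rng(12)
        t = np.arange(2000)
        wind = 5.0 + 0.002 * t + rng.normal(0, 0.5, t.size)  # trend + turbulence
        fluct = highpass_moving_average(wind, window=201)
        print(f"variance of filtered series: {fluct.var():.3f}")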

  9. Statistical Analysis of Galaxy Redshift Surveys

    NASA Astrophysics Data System (ADS)

    Percival, Will J.

    2008-12-01

    The statistical distribution of galaxies encodes significant cosmological information. For Gaussian random fields, 2-point functions (the correlation function in real space and the power spectrum in Fourier space) are complete, and offer the most direct route to this information. In these proceedings, I consider three mechanisms for extracting information from the power spectrum. The relative amplitude of small-scale and large-scale power can constrain the matter-radiation equality scale, but this is hard to disentangle from galaxy bias. Baryon acoustic oscillations are more robust to galaxy bias effects, and lead to constraints on the evolution of the Universe by providing a standard ruler whose distance can be compared at different redshifts. Redshift-space distortions, resulting from galaxy peculiar velocities, can be used to measure the cosmological growth of structure, and are immune to density bias as the velocities are independent of galaxy properties.

  10. Development and aging of superficial white matter myelin from young adulthood to old age: Mapping by vertex-based surface statistics (VBSS).

    PubMed

    Wu, Minjie; Kumar, Anand; Yang, Shaolin

    2016-05-01

    Superficial white matter (SWM) lies immediately beneath cortical gray matter and consists primarily of short association fibers. The characteristics of SWM and its development and aging were seldom examined in the literature and warrant further investigation. Magnetization transfer imaging is sensitive to myelin changes in the white matter. Using an innovative multimodal imaging analysis approach, vertex-based surface statistics (VBSS), the current study vertexwise mapped age-related changes of magnetization transfer ratio (MTR) in SWM from young adulthood to old age (30-85 years, N = 66). Results demonstrated regionally selective and temporally heterochronologic changes of SWM MTR with age, including (1) inverted U-shaped trajectories of SWM MTR in the rostral middle frontal, medial temporal, and temporoparietal regions, suggesting continuing myelination and protracted maturation till age 40-50 years and accelerating demyelination at age 60 and beyond, (2) linear decline of SWM MTR in the middle and superior temporal, and pericalcarine areas, indicating early maturation and less acceleration in age-related degeneration, and (3) no significant changes of SWM MTR in the primary motor, somatosensory and auditory regions, suggesting resistance to age-related deterioration. We did not observe similar patterns of changes in cortical thickness in our sample, suggesting the observed SWM MTR changes are not due to cortical atrophy. Hum Brain Mapp 37:1759-1769, 2016. © 2016 Wiley Periodicals, Inc. PMID:26955787

  11. Comparative analysis of positive and negative attitudes toward statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

    Many statistics lecturers and statistics education researchers are interested in their students' attitudes toward statistics during the statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content. Students who hold negative attitudes toward statistics tend to feel discouraged, especially in group assignments, are at risk of failure, are often highly emotional, and struggle to move forward. This study therefore investigates students' attitudes toward learning statistics. Six latent constructs were used to measure students' attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Towards Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP); the respondents were students from different faculties taking the applied statistics course. The analysis found the questionnaire acceptable, and the proposed relationships among the constructs were investigated. Students reported making a full effort to master the statistics course, found the course enjoyable, were confident in their intellectual capacity, and held more positive than negative attitudes toward learning statistics. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest, and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.

  12. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  13. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that readers can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  14. Improved Statistics for Genome-Wide Interaction Analysis

    PubMed Central

    Ueki, Masao; Cordell, Heather J.

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new “joint effects” statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al

  15. Importance of data management with statistical analysis set division.

    PubMed

    Wang, Ling; Li, Chan-juan; Jiang, Zhi-wei; Xia, Jie-lai

    2015-11-01

    Hypothesis testing is affected by the division of statistical analysis sets, an important data management task performed before database lock. Objective division of statistical analysis sets under blinding is the guarantee of a scientific trial conclusion. All subjects who have received at least one trial treatment after randomization should be included in the safety set. The full analysis set should be as close to the intention-to-treat population as possible. The per-protocol set is the most difficult of the three to control in a blinded review because its division involves more subjectivity than the other two. The objectivity of statistical analysis set division must be guaranteed by accurate raw data, comprehensive data checks, and scientific discussion, all of which are strict requirements of data management. Properly dividing the statistical analysis sets objectively and scientifically is an important approach to improving data management quality. PMID:26911044

  16. Statistical analysis of the 'Almagest' star catalog

    NASA Astrophysics Data System (ADS)

    Kalashnikov, V. V.; Nosovskii, G. V.; Fomenko, A. T.

    The star catalog contained in the 'Almagest', Ptolemy's classical work of astronomy, is examined. An analysis method is proposed which allows the identification of various types of errors committed by the observer. This method not only removes many of the contradictions contained in the catalog but also makes it possible to determine the time period during which the catalog was compiled.

  17. Statistical analysis of fixed income market

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Grilli, Luca; Vergni, Davide

    2002-05-01

    We present cross-sectional and time series analyses of price fluctuations in the US Treasury fixed income market. Bonds have been classified according to a suitable metric based on the correlation among them. The classification shows how the correlation among fixed income securities depends strongly on their maturity. We also study the structure of price fluctuations for single time series.
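
    A sketch of one standard way to implement such a correlation-based classification: convert correlations to the distance d = sqrt(2(1 - rho)) and apply hierarchical clustering. The paper's exact metric and clustering choices are not specified here, so treat these details, and the synthetic returns, as assumptions.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    # Synthetic (n_days, n_bonds) matrix of price fluctuations.
    rng = np.random.default_rng(0)
    returns = rng.normal(0.0, 1.0, (500, 8))
    rho = np.corrcoef(returns, rowvar=False)          # bond-bond correlations
    dist = np.sqrt(2.0 * (1.0 - rho))                 # a common correlation metric
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    labels = fcluster(Z, t=2, criterion="maxclust")   # e.g. short vs long maturity
    ```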

  18. A statistical model including age to predict passenger postures in the rear seats of automobiles.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-06-01

    Few statistical models of rear seat passenger posture have been published, and none has taken into account the effects of occupant age. This study developed new statistical models for predicting passenger postures in the rear seats of automobiles. Postures of 89 adults with a wide range of age and body size were measured in a laboratory mock-up in seven seat configurations. Posture-prediction models for female and male passengers were separately developed by stepwise regression using age, body dimensions, seat configurations and two-way interactions as potential predictors. Passenger posture was significantly associated with age, and the effects of other variables depended on age through two-way interactions. A set of posture-prediction models is presented for women and men, and the prediction results are compared with previously published models. This study is the first study of passenger posture to include a large cohort of older passengers and the first to report a significant effect of age for adults. The presented models can be used to position computational and physical human models for vehicle design and assessment. Practitioner Summary: The significant effects of age, body dimensions and seat configuration on rear seat passenger posture were identified. The models can be used to accurately position computational human models or crash test dummies for older passengers in known rear seat configurations. PMID:26328769
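
    The modelling approach, regression with age interactions, can be sketched as follows. The variable names, coefficients, and data below are hypothetical stand-ins for the study's measured predictors, not its actual models.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical posture data: 89 subjects, one seat variable, one outcome.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "age": rng.uniform(20, 90, 89),
        "stature": rng.normal(1700, 80, 89),           # mm
        "seat_back_angle": rng.choice([21, 25, 29], 89),
    })
    df["hip_x"] = (1.5 * df.stature + 2.0 * df.age
                   + 0.05 * df.age * df.seat_back_angle
                   + rng.normal(0, 30, 89))

    # Main effects plus an age-by-seat two-way interaction.
    model = smf.ols("hip_x ~ age * seat_back_angle + stature", data=df).fit()
    print(model.summary())
    ```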

  19. A Realistic Experimental Design and Statistical Analysis Project

    ERIC Educational Resources Information Center

    Muske, Kenneth R.; Myers, John A.

    2007-01-01

    A realistic applied chemical engineering experimental design and statistical analysis project is documented in this article. This project has been implemented as part of the professional development and applied statistics courses at Villanova University over the past five years. The novel aspects of this project are that the students are given a…

  20. Internet Data Analysis for the Undergraduate Statistics Curriculum

    ERIC Educational Resources Information Center

    Sanchez, Juana; He, Yan

    2005-01-01

    Statistics textbooks for undergraduates have not caught up with the enormous amount of analysis of Internet data that is taking place these days. Case studies that use Web server log data or Internet network traffic data are rare in undergraduate Statistics education. And yet these data provide numerous examples of skewed and bimodal…

  1. Guidelines for Statistical Analysis of Percentage of Syllables Stuttered Data

    ERIC Educational Resources Information Center

    Jones, Mark; Onslow, Mark; Packman, Ann; Gebski, Val

    2006-01-01

    Purpose: The purpose of this study was to develop guidelines for the statistical analysis of percentage of syllables stuttered (%SS) data in stuttering research. Method; Data on %SS from various independent sources were used to develop a statistical model to describe this type of data. On the basis of this model, %SS data were simulated with…

  2. Explorations in Statistics: The Analysis of Ratios and Normalized Data

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2013-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…

  3. Absolute ages from crater statistics: Using radiometric ages of Martian samples for determining the Martian cratering chronology

    NASA Technical Reports Server (NTRS)

    Neukum, G.

    1988-01-01

    In the absence of dates derived from rock samples, impact crater frequencies are commonly used to date Martian surface units. All models for absolute dating rely on the lunar cratering chronology and on the validity of its extrapolation to Martian conditions. Starting from somewhat different lunar chronologies, rather different Martian cratering chronologies are found in the literature. Currently favored models are compared. The differences at old ages are significant; the differences at younger ages are considerable and give absolute ages for the same crater frequencies that differ by as much as a factor of 3. The total uncertainty could be much higher, though, since the ratio of lunar to Martian cratering rate, which is of basic importance in the models, is believed to be known no better than within a factor of 2. Thus, it is of crucial importance for understanding the evolution of Mars and determining the sequence of events to establish an unambiguous Martian cratering chronology from crater statistics in combination with clean radiometric ages of returned Martian samples. For the dating goal, rocks should be as pristine as possible, from a geologically simple area with a one-stage emplacement history of the local formation. A minimum of at least one highland site for old ages, two intermediate-aged sites, and one very young site is needed.
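
    To make the chronology idea concrete, here is a sketch that inverts a commonly quoted form of the Neukum lunar chronology function for a measured crater density. The rate_ratio parameter stands in for the factor-of-2-uncertain Mars/Moon cratering-rate ratio discussed above, and the numbers are illustrative.

    ```python
    from math import exp
    from scipy.optimize import brentq

    def n1_lunar(t_ga):
        """Cumulative crater density N(D >= 1 km) per km^2 after t_ga Gyr
        (a commonly quoted Neukum lunar chronology function)."""
        return 5.44e-14 * (exp(6.93 * t_ga) - 1.0) + 8.38e-4 * t_ga

    def age_from_density(n1, rate_ratio=1.0):
        """Invert the chronology for a measured N(1); rate_ratio folds in an
        assumed Mars/Moon cratering-rate ratio (known only to a factor ~2)."""
        return brentq(lambda t: rate_ratio * n1_lunar(t) - n1, 0.0, 4.6)

    print(age_from_density(3.0e-3))   # about 3.2 Gyr for this crater density
    ```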

  4. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  5. Improving the Statistical Methodology of Astronomical Data Analysis

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, Gutti Jogesh

    Contemporary observational astronomers are generally unfamiliar with the extensive advances made in mathematical and applied statistics during the past several decades. Astronomical problems can often be addressed by methods developed in statistical fields such as spatial point processes, density estimation, Bayesian statistics, and sampling theory. The common problem of bivariate linear regression illustrates the need for sophisticated methods. Astronomical problems often require combinations of ordinary least-squares lines, double-weighted and errors-in-variables models, censored and truncated regressions, each with its own error analysis procedure. The recent conference Statistical Challenges in Modern Astronomy highlighted issues of mutual interest to statisticians and astronomers including clustering of point processes and time series analysis. We conclude with advice on how the astronomical community can advance its statistical methodology with improvements in education of astrophysicists, collaboration and consultation with professional statisticians, and acquisition of new software.
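
    The bivariate regression point can be shown directly: ordinary least-squares of Y on X, the inverse regression of X on Y, and a symmetric line such as the reduced major axis generally give three different slopes for the same data. A minimal sketch with synthetic data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(0.0, 1.0, 200)
    y = 2.0 * x + rng.normal(0.0, 1.0, 200)

    b_yx = np.polyfit(x, y, 1)[0]            # OLS(Y|X) slope
    b_xy = 1.0 / np.polyfit(y, x, 1)[0]      # OLS(X|Y) slope, expressed as dy/dx
    b_rma = np.sign(b_yx) * np.sqrt(abs(b_yx * b_xy))  # reduced-major-axis slope
    print(b_yx, b_xy, b_rma)                 # three different answers to one question
    ```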

  6. System statistical reliability model and analysis

    NASA Technical Reports Server (NTRS)

    Lekach, V. S.; Rood, H.

    1973-01-01

    A digital computer code was developed to simulate the time-dependent behavior of the 5-kwe reactor thermoelectric system. The code was used to determine lifetime sensitivity coefficients for a number of system design parameters, such as thermoelectric module efficiency and degradation rate, radiator absorptivity and emissivity, fuel element barrier defect constant, beginning-of-life reactivity, etc. A probability distribution (mean and standard deviation) was estimated for each of these design parameters. Then, error analysis was used to obtain a probability distribution for the system lifetime (mean = 7.7 years, standard deviation = 1.1 years). From this, the probability that the system will achieve the design goal of 5 years lifetime is 0.993. This value represents an estimate of the degradation reliability of the system.
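
    The quoted reliability follows directly from the lifetime distribution. Assuming a normal distribution with the stated mean of 7.7 years and standard deviation of 1.1 years, a one-line check reproduces the 0.993 probability of exceeding the 5-year design goal:

    ```python
    from scipy.stats import norm

    # Assumed normal lifetime distribution with the quoted parameters.
    p_5yr = norm.sf(5.0, loc=7.7, scale=1.1)  # P(lifetime > 5 years)
    print(round(p_5yr, 3))                    # 0.993, the reported reliability
    ```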

  7. The HONEYPOT Randomized Controlled Trial Statistical Analysis Plan

    PubMed Central

    Pascoe, Elaine Mary; Lo, Serigne; Scaria, Anish; Badve, Sunil V.; Beller, Elaine Mary; Cass, Alan; Hawley, Carmel Mary; Johnson, David W.

    2013-01-01

    ♦ Background: The HONEYPOT study is a multicenter, open-label, blinded-outcome, randomized controlled trial designed to determine whether, compared with standard topical application of mupirocin for nasal staphylococcal carriage, exit-site application of antibacterial honey reduces the rate of catheter-associated infections in peritoneal dialysis patients. ♦ Objective: To make public the pre-specified statistical analysis principles to be adhered to and the procedures to be performed by statisticians who will analyze the data for the HONEYPOT trial. ♦ Methods: Statisticians and clinical investigators who were blinded to treatment allocation and treatment-related study results and who will remain blinded until the central database is locked for final data extraction and analysis determined the statistical methods and procedures to be used for analysis and wrote the statistical analysis plan. The plan describes basic analysis principles, methods for dealing with a range of commonly encountered data analysis issues, and the specific statistical procedures for analyzing the primary, secondary, and safety outcomes. ♦ Results: A statistical analysis plan containing the pre-specified principles, methods, and procedures to be adhered to in the analysis of the data from the HONEYPOT trial was developed in accordance with international guidelines. The structure and content of the plan provide sufficient detail to meet the guidelines on statistical principles for clinical trials produced by the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use. ♦ Conclusions: Making public the pre-specified statistical analysis plan for the HONEYPOT trial minimizes the potential for bias in the analysis of trial data and the interpretation and reporting of trial results. PMID:23843589

  8. A Warning System for Stromboli Volcano Based on Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Nunnari, Giuseppe; Puglisi, Giuseppe; Spata, Alessandro

    2008-08-01

    In this paper we describe a warning system based on statistical analysis for the purpose of monitoring ground deformation at the Sciara del Fuoco (Stromboli Volcano, Sicily). After a statistical analysis of ground deformation time-series measured at Stromboli by the monitoring system known as THEODOROS (THEOdolite and Distancemeter Robot Observatory of Stromboli), the paper describes the solution adopted for implementing the warning system. A robust statistical index has been defined in order to evaluate the movements of the area. A fuzzy approach has been proposed to evaluate an AI (Alarm Intensity) index which indicates the level of hazard of the Sciara del Fuoco sliding.

  9. Cross-population validation of statistical distance as a measure of physiological dysregulation during aging.

    PubMed

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Legault, Véronique; Fried, Linda P; Ferrucci, Luigi

    2014-09-01

    Measuring physiological dysregulation during aging could be a key tool both to understand underlying aging mechanisms and to predict clinical outcomes in patients. However, most existing indices are either circular or hard to interpret biologically. Recently, we showed that statistical distance of 14 common blood biomarkers (a measure of how strange an individual's biomarker profile is) was associated with age and mortality in the WHAS II data set, validating its use as a measure of physiological dysregulation. Here, we extend the analyses to other data sets (WHAS I and InCHIANTI) to assess the stability of the measure across populations. We found that the statistical criteria used to determine the original 14 biomarkers produced diverging results across populations; in other words, had we started with a different data set, we would have chosen a different set of markers. Nonetheless, the same 14 markers (or the subset of 12 available for InCHIANTI) produced highly similar predictions of age and mortality. We include analyses of all combinatorial subsets of the markers and show that results do not depend much on biomarker choice or data set, but that more markers produce a stronger signal. We conclude that statistical distance as a measure of physiological dysregulation is stable across populations in Europe and North America. PMID:24802990
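
    The statistical distance used in this line of work is of the Mahalanobis type: how far an individual's biomarker profile lies from the center of a reference population, in units that account for biomarker variances and correlations. A minimal sketch with synthetic data standing in for the 14 blood biomarkers:

    ```python
    import numpy as np

    def statistical_distance(profiles, reference):
        """Mahalanobis distance of each biomarker profile from a reference
        population; large values flag 'strange' (dysregulated) profiles."""
        mu = reference.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
        diff = profiles - mu
        return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

    rng = np.random.default_rng(0)
    reference = rng.normal(0.0, 1.0, (500, 14))    # 14 biomarkers, synthetic
    subjects = rng.normal(0.2, 1.1, (66, 14))
    d = statistical_distance(subjects, reference)  # one distance per subject
    ```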

  10. Statistical analysis of litter experiments in teratology

    SciTech Connect

    Williams, R.; Buschbom, R.L.

    1982-11-01

    Teratological data is binary response data (each fetus is either affected or not) in which the responses within a litter are usually not independent. As a result, the litter should be taken as the experimental unit. For each litter, its size, n, and the number of fetuses, x, possessing the effect of interest are recorded. The ratio p = x/n is then the basic data generated by the experiment. There are currently three general approaches to the analysis of teratological data: nonparametric, transformation followed by t-test or ANOVA, and parametric. The first two are currently in wide use by practitioners while the third is relatively new to the field. These first two also appear to possess comparable power levels while maintaining the nominal level of significance. When transformations are employed, care must be exercised to check that the transformed data has the required properties. Since the data is often highly asymmetric, there may be no transformation which renders the data nearly normal. The parametric procedures, including the beta-binomial model, offer the possibility of increased power.
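
    The beta-binomial model mentioned above captures the within-litter correlation that makes a naive binomial analysis invalid. A sketch of fitting it by maximum likelihood with synthetic litter data; the parameterization (shape parameters a, b optimized on a log scale) is an implementation choice, not taken from the paper.

    ```python
    import numpy as np
    from scipy.special import betaln, gammaln
    from scipy.optimize import minimize

    def beta_binom_negloglik(params, x, n):
        """Negative log-likelihood of litter data (x affected of n) under a
        beta-binomial model, which allows within-litter correlation."""
        a, b = np.exp(params)                      # keep a, b positive
        comb = gammaln(n + 1) - gammaln(x + 1) - gammaln(n - x + 1)
        ll = comb + betaln(x + a, n - x + b) - betaln(a, b)
        return -ll.sum()

    x = np.array([0, 1, 0, 3, 2, 0, 5, 1])         # affected fetuses per litter
    n = np.array([8, 10, 9, 12, 11, 7, 13, 10])    # litter sizes (synthetic)
    fit = minimize(beta_binom_negloglik, x0=[0.0, 1.0], args=(x, n))
    a_hat, b_hat = np.exp(fit.x)
    ```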

  11. Life cycle cost analysis of aging aircraft airframe maintenance

    NASA Astrophysics Data System (ADS)

    Sperry, Kenneth Robert

    Scope and method of study. The purpose of this study was to examine the relationship between an aircraft's age and its annual airframe maintenance costs. Common life cycle costing methodology has previously not recognized the existence of this cost growth potential, and has therefore not determined the magnitude or significance of this cost element. This study analyzed twenty-five years of DOT Form 41 airframe maintenance cost data for the Boeing 727, 737, 747 and McDonnell Douglas DC-9 and DC-10 aircraft. Statistical analysis included regression analysis, Pearson's r, and t-tests to test the null hypothesis. Findings and conclusion. Airframe maintenance cost growth was confirmed to increase after an aircraft's age exceeded its designed service objective of approximately twenty years. Annual airframe maintenance cost growth increases were measured ranging from 3.5% annually for a DC-9 to approximately 9% annually for a DC-10 aircraft. The average measured coefficient of determination between age and airframe maintenance cost exceeded 0.80, confirming a strong relationship between cost and age. The statistical significance of the difference between airframe costs sampled in 1985 and airframe costs sampled in 1998 was confirmed by t-tests performed on each subject aircraft group. Future cost forecasts involving aging aircraft must address cost growth due to aging when attempting to model an aircraft's economic service life.

  12. A Divergence Statistics Extension to VTK for Performance Analysis.

    SciTech Connect

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
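
    The report's C++ snippets are not reproduced here, but the underlying idea, quantifying the discrepancy between an observed and an ideal distribution, can be sketched with one common choice of divergence, the Kullback-Leibler divergence. The specific divergences offered by the VTK engine are not assumed here.

    ```python
    import numpy as np
    from scipy.stats import norm

    # KL divergence between an observed (binned) distribution and a
    # theoretical "ideal" one, here a standard normal.
    rng = np.random.default_rng(0)
    samples = rng.normal(0.1, 1.2, 10_000)          # observed data
    edges = np.linspace(-5.0, 5.0, 51)
    p_obs, _ = np.histogram(samples, bins=edges, density=True)
    p_obs = p_obs * np.diff(edges)                  # bin probabilities
    p_mod = np.diff(norm.cdf(edges))                # ideal model probabilities
    mask = p_obs > 0
    kl = np.sum(p_obs[mask] * np.log(p_obs[mask] / p_mod[mask]))
    ```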

  13. Data explorer: a prototype expert system for statistical analysis.

    PubMed Central

    Aliferis, C.; Chao, E.; Cooper, G. F.

    1993-01-01

    The inadequate analysis of medical research data, due mainly to the unavailability of local statistical expertise, seriously jeopardizes the quality of new medical knowledge. Data Explorer is a prototype Expert System that builds on the versatility and power of existing statistical software to provide automatic analyses and interpretation of medical data. The system draws much of its power from belief network methods used in place of more traditional, but difficult to automate, classical multivariate statistical techniques. Data Explorer identifies statistically significant relationships among variables and, using power-size analysis, belief network inference/learning and various explanatory techniques, helps the user understand the importance of the findings. Finally the system can be used as a tool for the automatic development of predictive/diagnostic models from patient databases. PMID:8130501

  14. A Statistical Analysis of the Charles F. Kettering Climate Scale.

    ERIC Educational Resources Information Center

    Johnson, William L.; Dixon, Paul N.

    A statistical analysis was performed on the Charles F. Kettering (CFK) Scale, a popular four-section measure of school climate. The study centered on a multivariate analysis of Part A, the General Climate Factors section of the instrument, using data gathered from several elementary, junior high, and high school campuses in a large school district…

  15. Analysis of Coastal Dunes: A Remote Sensing and Statistical Approach.

    ERIC Educational Resources Information Center

    Jones, J. Richard

    1985-01-01

    Remote sensing analysis and statistical methods were used to analyze the coastal dunes of Plum Island, Massachusetts. The research methodology used provides an example of a student project for remote sensing, geomorphology, or spatial analysis courses at the university level. (RM)

  16. Statistical analysis in dBASE-compatible databases.

    PubMed

    Hauer-Jensen, M

    1991-01-01

    Database management in clinical and experimental research often requires statistical analysis of the data in addition to the usual functions for storing, organizing, manipulating and reporting. With most database systems, transfer of data to a dedicated statistics package is a relatively simple task. However, many statistics programs lack the powerful features found in database management software. dBASE IV and compatible programs are currently among the most widely used database management programs. d4STAT is a utility program for dBASE, containing a collection of statistical functions and tests for data stored in the dBASE file format. By using d4STAT, statistical calculations may be performed directly on the data stored in the database without having to exit dBASE IV or export data. Record selection and variable transformations are performed in memory, thus obviating the need for creating new variables or data files. The current version of the program contains routines for descriptive statistics, paired and unpaired t-tests, correlation, linear regression, frequency tables, the Mann-Whitney U-test, the Wilcoxon signed rank test, a time-saving procedure for counting observations according to user-specified selection criteria, survival analysis (product limit estimate analysis, log-rank test, and graphics), and normal, t, and chi-squared distribution functions. PMID:2004275

  17. Fisher statistics for analysis of diffusion tensor directional information.

    PubMed

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups, however in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation. PMID:22342971
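
    A sketch of the core Fisher computations: the mean direction, the resultant length R, and the standard concentration estimate kappa ≈ (N-1)/(N-R). The direction vectors below are synthetic stand-ins for per-voxel principal diffusion directions in an ROI.

    ```python
    import numpy as np

    def fisher_stats(vectors):
        """Mean direction, resultant length R, and concentration estimate kappa
        for unit direction vectors (Fisher statistics on the sphere)."""
        v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
        resultant = v.sum(axis=0)
        R = np.linalg.norm(resultant)
        mean_dir = resultant / R
        n = len(v)
        kappa = (n - 1) / (n - R)       # standard large-kappa approximation
        return mean_dir, R, kappa

    # Synthetic principal eigenvectors of diffusion tensors within an ROI.
    rng = np.random.default_rng(0)
    dirs = rng.normal([0.9, 0.1, 0.1], 0.05, (30, 3))
    mean_dir, R, kappa = fisher_stats(dirs)
    ```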

  18. Proteome analysis in the assessment of ageing.

    PubMed

    Nkuipou-Kenfack, Esther; Koeck, Thomas; Mischak, Harald; Pich, Andreas; Schanstra, Joost P; Zürbig, Petra; Schumacher, Björn

    2014-11-01

    Based on demographic trends, the societies in many developed countries are facing an increasing number and proportion of people over the age of 65. The rise in elderly populations, along with improved health care, will be concomitant with an increased prevalence of ageing-associated chronic conditions like cardiovascular, renal, and respiratory diseases, arthritis, dementia, and diabetes mellitus. This is expected to pose unprecedented challenges both for individuals and for societies and their health care systems. An ultimate goal of ageing research is therefore the understanding of physiological ageing and the achievement of 'healthy' ageing by decreasing age-related pathologies. However, on a molecular level, ageing is a complex multi-mechanistic process whose contributing factors may vary individually, partly overlap with pathological alterations, and are often poorly understood. Proteome analysis potentially allows modelling of these multifactorial processes. This review summarises recent proteomic research on age-related changes identified in animal models and human studies. We combined this information with pathway analysis to identify molecular mechanisms associated with ageing. We identified some molecular pathways that are affected in most or even all organs and others that are organ-specific. However, appropriately powered studies are needed to confirm these findings, which are based on in silico evaluation. PMID:25257180

  19. Adaptive Strategy for the Statistical Analysis of Connectomes

    PubMed Central

    Meskaldji, Djalel Eddine; Ottet, Marie-Christine; Cammoun, Leila; Hagmann, Patric; Meuli, Reto; Eliez, Stephan; Thiran, Jean Philippe; Morgenthaler, Stephan

    2011-01-01

    We study an adaptive statistical approach to analyze brain networks represented by brain connection matrices of interregional connectivity (connectomes). Our approach is at a middle level between a global analysis and a single-connection analysis, considering subnetworks of the global brain network. These subnetworks represent either the inter-connectivity between two brain anatomical regions or the intra-connectivity within the same brain anatomical region. An appropriate summary statistic that characterizes a meaningful feature of the subnetwork is evaluated. Based on this summary statistic, a statistical test is performed to derive the corresponding p-value. The reformulation of the problem in this way reduces the number of statistical tests in an orderly fashion based on our understanding of the problem. Considering the global testing problem, the p-values are corrected to control the rate of false discoveries. Finally, the procedure is followed by a local investigation within the significant subnetworks. We contrast this strategy with the one based on the individual measures in terms of power. We show that this strategy has a great potential, in particular in cases where the subnetworks are well defined and the summary statistics are properly chosen. As an application example, we compare structural brain connection matrices of two groups of subjects with a 22q11.2 deletion syndrome, distinguished by their IQ scores. PMID:21829681
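
    A sketch of the pipeline's testing stage: one summary statistic per subnetwork, a two-group test for each, then false-discovery-rate correction across subnetworks. The summary statistic, group sizes, and data below are synthetic assumptions, not the study's connectomes.

    ```python
    import numpy as np
    from scipy.stats import ttest_ind
    from statsmodels.stats.multitest import multipletests

    # One summary statistic per subnetwork (e.g. mean connection density
    # between two anatomical regions), compared across two groups.
    rng = np.random.default_rng(2)
    n_subnets = 45
    group_a = rng.normal(0.0, 1.0, (20, n_subnets))
    group_b = rng.normal(0.3, 1.0, (18, n_subnets))

    pvals = np.array([ttest_ind(group_a[:, j], group_b[:, j]).pvalue
                      for j in range(n_subnets)])
    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
    significant = np.flatnonzero(reject)   # subnetworks to investigate locally
    ```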

  20. Data analysis using the Gnu R system for statistical computation

    SciTech Connect

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
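
    As an illustration of the kind of chi-square minimization fit such packages automate, here is a sketch (in Python rather than R, for consistency with the other examples in this listing) of fitting a two-point correlator ansatz C(t) = A exp(-m t) to synthetic, uncorrelated data; the report's actual examples and error treatment are not reproduced.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Chi-square fit of a 2-pt correlator ansatz C(t) = A exp(-m t).
    rng = np.random.default_rng(0)
    t = np.arange(1, 16, dtype=float)
    c_true = 1.3 * np.exp(-0.45 * t)
    sigma = 0.02 * c_true                                # 2% errors (synthetic)
    c_obs = c_true + rng.normal(0.0, 1.0, t.size) * sigma

    popt, pcov = curve_fit(lambda t, a, m: a * np.exp(-m * t),
                           t, c_obs, p0=[1.0, 0.5],
                           sigma=sigma, absolute_sigma=True)
    a_fit, m_fit = popt                                  # amplitude and mass
    ```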

  1. Statistical Learning in Specific Language Impairment and Autism Spectrum Disorder: A Meta-Analysis.

    PubMed

    Obeid, Rita; Brooks, Patricia J; Powers, Kasey L; Gillespie-Lynch, Kristen; Lum, Jarrad A G

    2016-01-01

    Impairments in statistical learning might be a common deficit among individuals with Specific Language Impairment (SLI) and Autism Spectrum Disorder (ASD). Using meta-analysis, we examined statistical learning in SLI (14 studies, 15 comparisons) and ASD (13 studies, 20 comparisons) to evaluate this hypothesis. Effect sizes were examined as a function of diagnosis across multiple statistical learning tasks (Serial Reaction Time, Contextual Cueing, Artificial Grammar Learning, Speech Stream, Observational Learning, and Probabilistic Classification). Individuals with SLI showed deficits in statistical learning relative to age-matched controls. In contrast, statistical learning was intact in individuals with ASD relative to controls. Effect sizes did not vary as a function of task modality or participant age. Our findings inform debates about overlapping social-communicative difficulties in children with SLI and ASD by suggesting distinct underlying mechanisms. In line with the procedural deficit hypothesis (Ullman and Pierpont, 2005), impaired statistical learning may account for phonological and syntactic difficulties associated with SLI. In contrast, impaired statistical learning fails to account for the social-pragmatic difficulties associated with ASD. PMID:27602006

  3. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
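
    A sketch of the κ-generalized building blocks: the Kaniadakis κ-exponential and the resulting survival function P(X > x) = exp_κ(-βx^α), which behaves like a stretched exponential at low incomes and develops a Pareto power-law tail at high incomes. The parameter values below are illustrative, not fitted.

    ```python
    import numpy as np

    def exp_kappa(x, kappa):
        """Kaniadakis kappa-exponential, a power-law deformation of exp(x)."""
        return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

    def survival(x, alpha, beta, kappa):
        """P(X > x) for the kappa-generalized income distribution."""
        return exp_kappa(-beta * x**alpha, kappa)

    x = np.logspace(-2, 2, 200)     # income in units of its mean (illustrative)
    s = survival(x, alpha=2.0, beta=1.0, kappa=0.7)
    ```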

  4. Statistical Analysis of Tsunamis of the Italian Coasts

    SciTech Connect

    Caputo, M.; Faita, G.F.

    1982-01-20

    A study of a catalog of 138 tsunamis of the Italian coasts has been made. Intensities of 106 tsunamis have been assigned and cataloged. The statistical analysis of these data fits a density distribution of the form log n = 3.00 - 0.425 I, where n is the number of tsunamis of intensity I per thousand years.
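
    The fitted frequency law is directly usable. Assuming log denotes the base-10 logarithm, the expected number of events per thousand years at each intensity is:

    ```python
    # Expected Italian-coast tsunamis per thousand years at intensity I,
    # from the fitted law log10(n) = 3.00 - 0.425 I.
    def tsunamis_per_kyr(intensity):
        return 10.0 ** (3.00 - 0.425 * intensity)

    for i in range(1, 7):
        print(i, round(tsunamis_per_kyr(i), 1))
    ```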

  5. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-07-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data to statistical methods and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can be checked for any data set easily. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a data set is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology, we chose two large data sets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose the incorporation of this analysis into future investigations to check the objectivity of results achieved by different working groups and guarantee the meaningfulness of the interpretation.

  6. Mapping of Planetary Surface Age Based on Crater Statistics Obtained by an Automatic Detection Algorithm

    NASA Astrophysics Data System (ADS)

    Salih, A. L.; Mühlbauer, M.; Grumpe, A.; Pasckert, J. H.; Wöhler, C.; Hiesinger, H.

    2016-06-01

    The analysis of the impact crater size-frequency distribution (CSFD) is a well-established approach to the determination of the age of planetary surfaces. Classically, estimation of the CSFD is achieved by manual crater counting and size determination in spacecraft images, which, however, becomes very time-consuming for large surface areas and/or high image resolution. With increasing availability of high-resolution (nearly) global image mosaics of planetary surfaces, a variety of automated methods for the detection of craters based on image data and/or topographic data have been developed. In this contribution a template-based crater detection algorithm is used which analyses image data acquired under known illumination conditions. Its results are used to establish the CSFD for the examined area, which is then used to estimate the absolute model age of the surface. The detection threshold of the automatic crater detection algorithm is calibrated based on a region with available manually determined CSFD such that the age inferred from the manual crater counts corresponds to the age inferred from the automatic crater detection results. With this detection threshold, the automatic crater detection algorithm can be applied to a much larger surface region around the calibration area. The proposed age estimation method is demonstrated for a Kaguya Terrain Camera image mosaic of 7.4 m per pixel resolution of the floor region of the lunar crater Tsiolkovsky, which consists of dark and flat mare basalt and has an area of nearly 10,000 km2. The region used for calibration, for which manual crater counts are available, has an area of 100 km2. In order to obtain a spatially resolved age map, CSFDs and surface ages are computed for overlapping quadratic regions of about 4.4 x 4.4 km2 size offset by a step width of 74 m. Our constructed surface age map of the floor of Tsiolkovsky shows age values of typically 3.2-3.3 Ga, while for small regions lower (down to 2.9 Ga) and higher

  7. Statistical inference for exploratory data analysis and model diagnostics.

    PubMed

    Buja, Andreas; Cook, Dianne; Hofmann, Heike; Lawrence, Michael; Lee, Eun-Kyung; Swayne, Deborah F; Wickham, Hadley

    2009-11-13

    We propose to furnish visual statistical methods with an inferential framework and protocol, modelled on confirmatory statistical testing. In this framework, plots take on the role of test statistics, and human cognition the role of statistical tests. Statistical significance of 'discoveries' is measured by having the human viewer compare the plot of the real dataset with collections of plots of simulated datasets. A simple but rigorous protocol that provides inferential validity is modelled after the 'lineup' popular from criminal legal procedures. Another protocol modelled after the 'Rorschach' inkblot test, well known from (pop-)psychology, will help analysts acclimatize to random variability before being exposed to the plot of the real data. The proposed protocols will be useful for exploratory data analysis, with reference datasets simulated by using a null assumption that structure is absent. The framework is also useful for model diagnostics in which case reference datasets are simulated from the model in question. This latter point follows up on previous proposals. Adopting the protocols will mean an adjustment in working procedures for data analysts, adding more rigour, and teachers might find that incorporating these protocols into the curriculum improves their students' statistical thinking. PMID:19805449
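
    A minimal sketch of the lineup protocol described above: the plot of the real data is hidden among null plots generated under the no-structure hypothesis (here, by permuting y), and a viewer who picks out the real panel has made a discovery significant at roughly the 1-in-20 level.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(7)
    x = rng.normal(size=100)
    y = 0.3 * x + rng.normal(size=100)       # real data with weak structure

    pos = rng.integers(20)                   # secret position of the real plot
    fig, axes = plt.subplots(4, 5, figsize=(10, 8))
    for i, ax in enumerate(axes.flat):
        yy = y if i == pos else rng.permutation(y)   # nulls destroy structure
        ax.scatter(x, yy, s=5)
        ax.set_xticks([]); ax.set_yticks([])
    plt.show()
    ```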

  8. Statistical Software for spatial analysis of stratigraphic data sets

    SciTech Connect

    2003-04-08

    Stratistics is a tool for statistical analysis of spatially explicit data sets and model output, for description and for model-data comparisons. It is intended for the analysis of data sets commonly used in geology, such as gamma ray logs and lithologic sequences, as well as 2-D data such as maps. Stratistics incorporates a far wider range of spatial analysis methods, drawn from multiple disciplines, than is currently available in other packages. These include techniques from spatial and landscape ecology, fractal analysis, and mathematical geology. Its use should substantially reduce the risk associated with the use of predictive models.

  10. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
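
    A sketch of the two-parameter Weibull fit used for such strength comparisons, with the location parameter fixed at zero. The strength values below are synthetic stand-ins for the coupon data.

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    # Two-parameter Weibull fit (location fixed at zero) to fracture strengths.
    strengths = np.array([262., 270., 255., 280., 265., 275., 268., 259.])  # MPa
    shape, loc, scale = weibull_min.fit(strengths, floc=0.0)
    # shape = Weibull modulus m; scale = characteristic strength sigma_0
    p_survive = weibull_min.sf(250.0, shape, loc, scale)  # reliability at 250 MPa
    ```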

  11. HistFitter software framework for statistical data analysis

    NASA Astrophysics Data System (ADS)

    Baak, M.; Besjes, G. J.; Côté, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-04-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface.

  12. Statistical analysis of flight times for space shuttle ferry flights

    NASA Technical Reports Server (NTRS)

    Graves, M. E.; Perlmutter, M.

    1974-01-01

    Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.

  13. A novel statistical analysis and interpretation of flow cytometry data

    PubMed Central

    Banks, H.T.; Kapraun, D.F.; Thompson, W. Clayton; Peligero, Cristina; Argilaguet, Jordi; Meyerhans, Andreas

    2013-01-01

    A recently developed class of models incorporating the cyton model of population generation structure into a conservation-based model of intracellular label dynamics is reviewed. Statistical aspects of the data collection process are quantified and incorporated into a parameter estimation scheme. This scheme is then applied to experimental data for PHA-stimulated CD4+ T and CD8+ T cells collected from two healthy donors. This novel mathematical and statistical framework is shown to form the basis for accurate, meaningful analysis of cellular behaviour for a population of cells labelled with the dye carboxyfluorescein succinimidyl ester and stimulated to divide. PMID:23826744

  14. [The meaning of statistical data in medical science and their examination--true and false analysis of statistical data].

    PubMed

    Hayashi, C

    1986-04-01

    The subjects which are often encountered in the statistical design and analysis of data in medical science studies are discussed. The five topics examined were: (I) medical science and statistical methods; (II) so-called mathematical statistics and medical science; (III) fundamentals of cross-tabulation analysis of statistical data and inference; (IV) exploratory study by multidimensional data analyses; and (V) optimal process control of the individual, medical science, and the informatics of statistical data. In I, the author's statistico-mathematical idea is characterized as the analysis of phenomena by statistical data. This is closely related to the logic, methodology and philosophy of science. This statistical concept and method are based on operational and pragmatic ideas. Self-examination of mathematical statistics is the particular focus of II and III. In II, the effectiveness of experimental design and statistical testing is thoroughly examined with regard to the study of medical science, and the limitation of its application is discussed. In III, the apparent paradox between cross-tabulation analysis of statistical data and statistical inference is shown. This is due to the operation of a simple two- or three-fold cross-tabulation analysis of (more than two or three) multidimensional data, apart from the sophisticated statistical test theory of association. In IV, the necessity of informatics of multidimensional data analysis in medical science is stressed. In V, the following point is discussed: the essential point of clinical trials is that they are not based on any simple statistical test in a traditional experimental design but on the optimal process control of individuals in the information space of the body and mind, which is based on knowledge of medical science and the informatics of multidimensional statistical data analysis. PMID:3729436

  15. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    SciTech Connect

    Reed, J.K.

    1999-10-20

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of the large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and "expert" data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct, interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) development of a groundwater quality baseline prior to remediation startup, (2) targeting of constituents for removal from the RCRA GWPS, (3) targeting of constituents for removal from the UIC permit, (4) targeting of constituents for reduced monitoring, (5) targeting of monitoring wells not producing representative samples, (6) reduction in statistical evaluation, and (7) identification of contamination from other facilities.

  16. Multivariate statistical analysis of atom probe tomography data

    SciTech Connect

    Parish, Chad M; Miller, Michael K

    2010-01-01

    The application of spectrum imaging multivariate statistical analysis methods, specifically principal component analysis (PCA), to atom probe tomography (APT) data has been investigated. The mathematical method of analysis is described and the results for two example datasets are analyzed and presented. The first dataset is from the analysis of a PM 2000 Fe-Cr-Al-Ti steel containing two different ultrafine precipitate populations. PCA properly describes the matrix and precipitate phases in a simple and intuitive manner. A second APT example is from the analysis of an irradiated reactor pressure vessel steel. Fine, nm-scale Cu-enriched precipitates having a core-shell structure were identified and qualitatively described by PCA. Advantages, disadvantages, and future prospects for implementing these data analysis methodologies for APT datasets, particularly with regard to quantitative analysis, are also discussed.
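
    A sketch of the spectrum-imaging PCA workflow described above, applied to a voxel-by-mass-spectrum count matrix. The matrix dimensions and Poisson counts are synthetic stand-ins for an APT reconstruction, not data from the paper.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Rows = voxels of the reconstruction, columns = mass-to-charge bins.
    rng = np.random.default_rng(3)
    counts = rng.poisson(2.0, size=(5000, 120)).astype(float)

    pca = PCA(n_components=4)
    scores = pca.fit_transform(counts)         # per-voxel component scores
    loadings = pca.components_                 # spectral signature of each component
    explained = pca.explained_variance_ratio_  # how many components matter
    ```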

  18. Feature-based statistical analysis of combustion simulation data.

    PubMed

    Bennett, Janine C; Krishnamoorthy, Vaidyanathan; Liu, Shusen; Grout, Ray W; Hawkes, Evatt R; Chen, Jacqueline H; Shepherd, Jason; Pascucci, Valerio; Bremer, Peer-Timo

    2011-12-01

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  19. Feature-Based Statistical Analysis of Combustion Simulation Data

    SciTech Connect

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  20. Statistic analyses of the color experience according to the age of the observer.

    PubMed

    Hunjet, Anica; Parac-Osterman, Durdica; Vucaj, Edita

    2013-04-01

    The psychological experience of color reflects the communication between the observer and the environment and depends on the light source, the viewing angle, and in particular on the observer and his or her health condition. Hering's theory, the theory of opponent processes, supposes that the cones situated in the retina of the eye are not sensitive to three separate chromatic domains (red, green, and purple-blue) but produce signals based on pairs of opponent colors. Support for this theory comes from the fact that certain disorders of color vision, which include blindness to certain colors, cause blindness to pairs of opponent colors. This paper presents a study of the experience of blue and yellow tones according to the age of the observer. To test for statistically significant differences in color experience according to the color of the background, the following statistical tests were used: the Mann-Whitney U test, the Kruskal-Wallis ANOVA, and the median test. The differences were shown to be statistically significant in elderly persons (older than 35 years). PMID:23837226
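
    A minimal sketch of the three tests named in the abstract, run in Python on synthetic color-difference scores for hypothetical age groups (all data, group boundaries, and effect sizes are assumptions):

      import numpy as np
      from scipy.stats import mannwhitneyu, kruskal, median_test

      rng = np.random.default_rng(1)
      young = rng.normal(0.0, 1.0, 40)    # observers under 35 (synthetic scores)
      middle = rng.normal(0.3, 1.0, 40)
      older = rng.normal(0.8, 1.0, 40)    # observers older than 35 (synthetic scores)

      u, p_u = mannwhitneyu(young, older)              # two-group rank test
      h, p_kw = kruskal(young, middle, older)          # k-group rank test (Kruskal-Wallis)
      stat, p_med, grand_median, table = median_test(young, middle, older)

      print(f"Mann-Whitney p={p_u:.4f}, Kruskal-Wallis p={p_kw:.4f}, median test p={p_med:.4f}")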

  1. SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS

    NASA Technical Reports Server (NTRS)

    Brownlow, J. D.

    1994-01-01

    The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval
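
    The time and frequency domain estimates described can be sketched with standard NumPy/SciPy calls; the synthetic signal, sampling rate, and window choice below are assumptions, not SPA itself:

      import numpy as np
      from scipy import signal
      from scipy.stats import normaltest

      fs = 100.0                                   # sampling rate in Hz (assumed)
      t = np.arange(0, 60, 1 / fs)
      x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)

      # Time domain estimates
      estimates = dict(mean=x.mean(), variance=x.var(), std=x.std(),
                       mean_square=np.mean(x**2), rms=np.sqrt(np.mean(x**2)),
                       minimum=x.min(), maximum=x.max(), n=x.size)
      k2, p = normaltest(x)                        # goodness-of-fit test against normality

      # Frequency domain estimate: power spectrum with a Hamming window
      f, pxx = signal.welch(x, fs=fs, window="hamming", nperseg=1024)
      print(estimates, f"normality p={p:.3f}, spectral peak at {f[np.argmax(pxx)]:.2f} Hz")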

  2. Statistical analysis of high density diffuse optical tomography

    PubMed Central

    Hassanpour, Mahlega S.; White, Brian R.; Eggebrecht, Adam T.; Ferradal, Silvina L.; Snyder, Abraham Z.; Culver, Joseph P.

    2014-01-01

    High density diffuse optical tomography (HD-DOT) is a noninvasive neuroimaging modality with moderate spatial resolution and localization accuracy. Due to portability and wearability advantages, HD-DOT has the potential to be used in populations that are not amenable to functional magnetic resonance imaging (fMRI), such as hospitalized patients and young children. However, whereas the use of event-related stimulus designs, general linear model (GLM) analysis, and imaging statistics are standardized and routine with fMRI, such tools are not yet common practice in HD-DOT. In this paper we adapt and optimize fundamental elements of fMRI analysis for application to HD-DOT. We show the use of event-related protocols and GLM deconvolution analysis in unmixing multi-stimulus event-related HD-DOT data. Statistical parametric mapping (SPM) in the framework of a general linear model is developed considering the temporal and spatial characteristics of HD-DOT data. The statistical analysis utilizes a random field noise model that incorporates estimates of the local temporal and spatial correlations of the GLM residuals. The multiple-comparison problem is addressed using a cluster analysis based on non-stationary Gaussian random field theory. These analysis tools provide access to a wide range of experimental designs necessary for the study of complex brain functions. In addition, they provide a foundation for understanding and interpreting HD-DOT results with quantitative estimates for the statistical significance of detected activation foci. PMID:23732886

  3. HistFitter - A flexible framework for statistical data analysis

    NASA Astrophysics Data System (ADS)

    Lorenz, J. M.; Baak, M.; Besjes, G. J.; Côté, D.; Koutsman, A.; Short, D.

    2015-05-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively in the ATLAS Collaboration to analyze data of proton-proton collisions produced by the Large Hadron Collider at CERN. Most notably, HistFitter has become a de-facto standard in searches for supersymmetric particles since 2012, with some usage for Exotic and Higgs boson physics. HistFitter coherently combines several statistics tools in a programmable and flexible framework that is capable of bookkeeping hundreds of data models under study using thousands of generated input histograms. The key innovations of HistFitter are to weave the concepts of control, validation and signal regions into its very fabric, and to treat them with rigorous statistical methods, while providing multiple tools to visualize and interpret the results through a simple configuration interface.

  4. Statistics.

    PubMed

    1993-02-01

    In 1984, 99% of abortions conducted in Bombay, India, were of female fetuses. In 1986-87, 30,000-50,000 female fetuses were aborted in India. In 1987-88, 7 Delhi clinics conducted 13,000 sex determination tests. Thus, discrimination against females begins before birth in India. Some states (Maharashtra, Goa, and Gujarat) have drafted legislation to prevent the use of prenatal diagnostic tests (e.g., ultrasonography) for sex determination purposes. Families make decisions about an infant's nutrition based on the infant's sex, so it is not surprising to see a higher incidence of morbidity among girls than boys (e.g., for respiratory infections in 1985, 55.5% vs. 27.3%). Consequently, girls are more likely to die than boys. Even though vasectomy is simpler and safer than tubectomy, the government promotes female sterilizations. The percentage of all sexual sterilizations being tubectomy increased steadily from 84% to 94% (1986-90). Family planning programs focus on female contraceptive methods, despite the higher incidence of adverse health effects from female methods (e.g., the IUD causes pain and heavy bleeding). Some women's advocates believe the effects to be so great that India should ban such contraceptives, including injectable contraceptives. The maternal mortality rate is quite high (460/100,000 live births), equaling a lifetime risk of 1 in 18 of a pregnancy-related death. 70% of these maternal deaths are preventable. Leading causes of maternal deaths in India are anemia, hemorrhage, eclampsia, sepsis, and abortion. Most pregnant women do not receive prenatal care. Untrained personnel attend about 70% of deliveries in rural areas and 29% in urban areas. Appropriate health services and other interventions would prevent the higher age-specific death rates for females between 0 and 35 years old. Even though the government does provide maternal and child health services, it needs to stop decreasing, and start increasing, its resource allocation for health. PMID:12286355

  5. SMART: Statistical Metabolomics Analysis-An R Tool.

    PubMed

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^-4 in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^-4 in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm. PMID:27248514
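
    SMART itself is an R tool; as a hedged illustration of the ANCOVA step described, the following Python/statsmodels sketch fits a metabolite level against drug use with a covariate adjustment (all column names and data are invented):

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n = 200
      df = pd.DataFrame({"drug": rng.integers(0, 2, n),      # ACE-inhibitor use (0/1)
                         "age": rng.normal(55, 10, n)})      # adjustment covariate
      df["metabolite"] = 0.8 * df["drug"] + 0.02 * df["age"] + rng.normal(size=n)

      model = smf.ols("metabolite ~ C(drug) + age", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))                 # ANCOVA table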

  6. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunities for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  7. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  8. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  9. Precipitation Hardening and Statistical Modeling of the Aging Parameters and Alloy Compositions in Al-Cu-Mg-Ag Alloys

    NASA Astrophysics Data System (ADS)

    Al-Obaisi, A. M.; El-Danaf, E. A.; Ragab, A. E.; Soliman, M. S.

    2016-04-01

    The addition of Ag to Al-Cu-Mg systems has been proposed to replace the existing high-strength 2xxx and 7xxx Al alloys. The aged Al-Cu-Mg-Ag alloys exhibited promising properties due to a special type of precipitate, named Ω, which cooperates with other precipitates to enhance the mechanical properties significantly. In the present investigation, the effect of changing the percentages of alloying elements, aging time, and aging temperature on hardness values was studied based on a factorial design. According to this design of experiments (DOE), a 2^3 factorial design, eight alloys were cast and hot rolled, with Cu, Mg, and Ag added to aluminum at two different levels for each alloying element. These alloys were aged at different temperatures (160, 190, and 220 °C) over a wide range of time intervals, from 10 min to 64 h. The resulting hardness data were used as input for Minitab software to model and relate the process variables with hardness through a regression analysis. Raising the alloying elements' weight percentages to the high level enhanced the hardness of the alloy by about 40% compared to the alloy containing the low level of all alloying elements. Through analysis of variance (ANOVA), it was found that altering the fraction of Cu had the greatest effect on the hardness values, with a contribution of about 49%. Second-level interaction terms accounted for about 21% of the impact on the hardness values. Aging time, quadratic terms, and third-level interaction terms had almost the same level of influence on hardness values (about 10% contribution each). Furthermore, the results showed that a small addition of Mg and Ag was enough to improve the mechanical properties of the alloy significantly. The statistical model formulated explained about 80% of the variation in hardness values.
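
    A sketch of the 2^3 factorial ANOVA workflow, using statsmodels on synthetic hardness data; the coded factor levels, replicate count, and effect sizes are assumptions:

      import itertools
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      rows = []
      for cu, mg, ag in itertools.product([-1, 1], repeat=3):   # 2^3 = 8 alloys
          for _ in range(3):                                    # replicate measurements
              hardness = 100 + 12*cu + 4*mg + 3*ag + 2*cu*mg + rng.normal(0, 2)
              rows.append(dict(Cu=cu, Mg=mg, Ag=ag, hardness=hardness))
      df = pd.DataFrame(rows)

      model = smf.ols("hardness ~ Cu * Mg * Ag", data=df).fit()
      anova = sm.stats.anova_lm(model, typ=2)
      anova["contrib_%"] = 100 * anova["sum_sq"] / anova["sum_sq"].sum()
      print(anova)    # percent contribution of each term, as in the abstract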

  10. Precipitation Hardening and Statistical Modeling of the Aging Parameters and Alloy Compositions in Al-Cu-Mg-Ag Alloys

    NASA Astrophysics Data System (ADS)

    Al-Obaisi, A. M.; El-Danaf, E. A.; Ragab, A. E.; Soliman, M. S.

    2016-06-01

    The addition of Ag to Al-Cu-Mg systems has been proposed to replace the existing high-strength 2xxx and 7xxx Al alloys. The aged Al-Cu-Mg-Ag alloys exhibited promising properties due to a special type of precipitate, named Ω, which cooperates with other precipitates to enhance the mechanical properties significantly. In the present investigation, the effect of changing the percentages of alloying elements, aging time, and aging temperature on hardness values was studied based on a factorial design. According to this design of experiments (DOE), a 2^3 factorial design, eight alloys were cast and hot rolled, with Cu, Mg, and Ag added to aluminum at two different levels for each alloying element. These alloys were aged at different temperatures (160, 190, and 220 °C) over a wide range of time intervals, from 10 min to 64 h. The resulting hardness data were used as input for Minitab software to model and relate the process variables with hardness through a regression analysis. Raising the alloying elements' weight percentages to the high level enhanced the hardness of the alloy by about 40% compared to the alloy containing the low level of all alloying elements. Through analysis of variance (ANOVA), it was found that altering the fraction of Cu had the greatest effect on the hardness values, with a contribution of about 49%. Second-level interaction terms accounted for about 21% of the impact on the hardness values. Aging time, quadratic terms, and third-level interaction terms had almost the same level of influence on hardness values (about 10% contribution each). Furthermore, the results showed that a small addition of Mg and Ag was enough to improve the mechanical properties of the alloy significantly. The statistical model formulated explained about 80% of the variation in hardness values.

  11. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-03-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Among the characteristics studied in physical volcanology, the density and porosity of juvenile clasts are some of the most frequently used to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can easily be checked for any dataset. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a dataset is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology we chose two large datasets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose adding this analysis to future investigations to check the objectivity of results achieved by different working groups and to guarantee the meaningfulness of the interpretation.

  12. Statistical Analysis of Single-Trial Granger Causality Spectra

    PubMed Central

    Brovelli, Andrea

    2012-01-01

    Granger causality analysis is becoming central for the analysis of interactions between neural populations and oscillatory networks. However, it is currently unclear whether single-trial estimates of Granger causality spectra can be used reliably to assess directional influence. We addressed this issue by combining single-trial Granger causality spectra with statistical inference based on general linear models. The approach was assessed on synthetic and neurophysiological data. Synthetic bivariate data was generated using two autoregressive processes with unidirectional coupling. We simulated two hypothetical experimental conditions: the first mimicked a constant and unidirectional coupling, whereas the second modelled a linear increase in coupling across trials. The statistical analysis of single-trial Granger causality spectra, based on t-tests and linear regression, successfully recovered the underlying pattern of directional influence. In addition, we characterised the minimum number of trials and coupling strengths required for significant detection of directionality. Finally, we demonstrated the relevance for neurophysiology by analysing two local field potentials (LFPs) simultaneously recorded from the prefrontal and premotor cortices of a macaque monkey performing a conditional visuomotor task. Our results suggest that the combination of single-trial Granger causality spectra and statistical inference provides a valuable tool for the analysis of large-scale cortical networks and brain connectivity. PMID:22649482
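
    A minimal sketch of the simulation-and-test idea in the time domain (the paper works with Granger causality spectra); the coupling constants and series lengths are assumptions:

      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      rng = np.random.default_rng(5)
      n = 500
      x = np.zeros(n); y = np.zeros(n)
      for t in range(1, n):
          x[t] = 0.5 * x[t-1] + rng.normal()
          y[t] = 0.5 * y[t-1] + 0.4 * x[t-1] + rng.normal()   # unidirectional coupling x -> y

      # Test whether column 2 (x) Granger-causes column 1 (y); expect a small p-value
      res = grangercausalitytests(np.column_stack([y, x]), maxlag=2, verbose=False)
      print(res[1][0]["ssr_ftest"])    # (F statistic, p-value, df_denom, df_num)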

  13. Statistical Analysis of the Heavy Neutral Atoms Measured by IBEX

    NASA Astrophysics Data System (ADS)

    Park, Jeewoo; Kucharek, Harald; Möbius, Eberhard; Galli, André; Livadiotis, George; Fuselier, Steve A.; McComas, David J.

    2015-10-01

    We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years, from 2009 to 2011. The interstellar neutral (ISN) O&Ne gas flow was found in the first-year heavy neutral map at 601 eV, and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to get a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to exclude background and to identify areas where the heavy neutral signal is statistically significant. These methods also allow the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O&Ne gas flow. We call this emission the extended tail. It may be an imprint of the secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.

  14. Improved Statistical Power with a Sparse Shape Model in Detecting an Aging Effect in the Hippocampus and Amygdala.

    PubMed

    Chung, Moo K; Kim, Seung-Goo; Schaefer, Stacey M; van Reekum, Carien M; Peschke-Schmitz, Lara; Sutterer, Matthew J; Davidson, Richard J

    2014-03-21

    The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which will reduce the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power. PMID:25302007
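
    A sketch of the sparse-basis idea using a stand-in Fourier basis rather than Laplace-Beltrami eigenfunctions; the signal, basis size, and penalty below are assumptions:

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(6)
      t = np.linspace(0, 1, 200)
      clean = np.sin(2*np.pi*t) + 0.3*np.sin(2*np.pi*5*t)
      noisy = clean + rng.normal(0, 0.2, t.size)

      # Basis matrix: 15 sine and 15 cosine modes (analogue of LB eigenfunctions)
      k = np.arange(1, 16)
      basis = np.hstack([np.sin(2*np.pi*np.outer(t, k)), np.cos(2*np.pi*np.outer(t, k))])

      fit = Lasso(alpha=0.01).fit(basis, noisy)     # L1 penalty keeps only significant modes
      print("nonzero coefficients:", int(np.sum(fit.coef_ != 0)), "of", basis.shape[1])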

  15. Statistical analysis of spectral data for vegetation detection

    NASA Astrophysics Data System (ADS)

    Love, Rafael; Cathcart, J. Michael

    2006-05-01

    Identification and reduction of false alarms provide a critical component in the detection of landmines. Research at Georgia Tech over the past several years has focused on this problem through an examination of the signature characteristics of various background materials. These efforts seek to understand the physical basis and features of these signatures as an aid to the development of false target identification techniques. The investigation presented in this paper concentrated on the detection of foliage in long wave infrared imagery. Data collected by a hyperspectral long-wave infrared sensor provided the background signatures used in this study. These studies focused on an analysis of the statistical characteristics of both the intensity signature and derived emissivity data. Results from these studies indicate foliage signatures possess unique characteristics that can be exploited to enable detection of vegetation in LWIR images. This paper presents a review of the approach and the results of the statistical analysis.

  16. Statistical analysis of mineral soils in the Odra valley

    NASA Astrophysics Data System (ADS)

    Hudak, Magda; Rojna, Arkadiusz

    2012-10-01

    The aim of this article is to present the results of statistical analyses of laboratory experiment results obtained from an ITB ZW-K2 apparatus, Kamieński tubes, and grain-size distribution curves. Besides basic statistical parameters (mean, sum, minimum, and maximum), correlation analysis and multivariate analysis of variance at significance levels α < 0.01 and α < 0.05 were taken into account, as well as calculations of LSD confidence half-intervals. The research material was collected from the valley of the Odra river near the town of Słubice in Lubuskie province. The research involved mineral, non-rock, fine-grained, non-cohesive soils lying at a depth of 0.3-1.5 m.

  17. A statistical analysis of mesoscale rainfall as a random cascade

    NASA Technical Reports Server (NTRS)

    Gupta, Vijay K.; Waymire, Edward C.

    1993-01-01

    The paper considers the random cascade theory for spatial rainfall. Particular attention was given to the following four areas: (1) the relationship of the random cascade theory of rainfall to the simple scaling and the hierarchical cluster-point-process theories, (2) the mathematical foundations for some of the formalisms commonly applied in the development of statistical cascade theory, (3) the empirical evidence for a random cascade theory of rainfall, and (4) how data can be used to estimate parameters and to make statistical inferences within this theoretical framework. An analysis of space-time rainfall data is presented. Cascade simulations are carried out to provide a comparison with the methods of analysis that are applied to the rainfall data.
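
    A sketch of a discrete random multiplicative cascade of the kind analyzed in the paper, on a 1D domain; the branching number, depth, and lognormal generator parameters are illustrative:

      import numpy as np

      rng = np.random.default_rng(7)
      levels, branching = 10, 2
      field = np.ones(1)
      for _ in range(levels):
          field = np.repeat(field, branching)     # subdivide each cell
          # i.i.d. unit-mean lognormal cascade generators
          w = rng.lognormal(mean=-0.125, sigma=0.5, size=field.size)
          field *= w

      # Sample moments of the resulting field; moment scaling across levels
      # is the usual diagnostic for multiplicative cascades
      for q in (1, 2, 3):
          print(q, np.mean(field**q))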

  18. Statistical Analysis of speckle noise reduction techniques for echocardiographic Images

    NASA Astrophysics Data System (ADS)

    Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar

    2011-12-01

    Echocardiography is a safe, easy, and fast technology for diagnosing cardiac diseases. As in other ultrasound images, these images contain speckle noise. In some cases this speckle noise is useful, such as in motion detection, but in general noise removal is required for better analysis of the image and proper diagnosis. Different adaptive and anisotropic filters are included in the statistical analysis. Statistical parameters such as signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and root mean square error (RMSE) are calculated for performance measurement. One further consideration is that blurring may occur during speckle noise removal, so a filter is preferred that can enhance edges while removing noise.
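
    A sketch of the performance metrics named above, computed on synthetic arrays standing in for a reference image and a filter output:

      import numpy as np

      rng = np.random.default_rng(8)
      reference = rng.uniform(0, 255, size=(128, 128))
      filtered = reference + rng.normal(0, 5, size=reference.shape)  # stand-in filter output

      mse = np.mean((reference - filtered) ** 2)
      rmse = np.sqrt(mse)
      snr = 10 * np.log10(np.mean(reference ** 2) / mse)     # in dB
      psnr = 10 * np.log10(255.0 ** 2 / mse)                 # 8-bit peak value, in dB
      print(f"RMSE={rmse:.2f}, SNR={snr:.2f} dB, PSNR={psnr:.2f} dB")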

  19. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    SciTech Connect

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Sorensen, Daniel N.; Remmers, Daniel L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; Hsu, Peter C.; Reynolds, John G.

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small- Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.

  20. Lifetime statistics of quantum chaos studied by a multiscale analysis

    SciTech Connect

    Di Falco, A.; Krauss, T. F.; Fratalocchi, A.

    2012-04-30

    In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two-dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data are found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.

  1. Statistical sampling analysis for stratospheric measurements from satellite missions

    NASA Technical Reports Server (NTRS)

    Drewry, J. W.; Harrison, E. F.; Brooks, D. R.; Robbins, J. L.

    1978-01-01

    Earth orbiting satellite experiments can be designed to measure stratospheric constituents such as ozone by utilizing remote sensing techniques. Statistical analysis techniques, mission simulation, and model development have been utilized to develop a method for analyzing various mission/sensor combinations. Existing and planned NASA satellite missions such as Nimbus-4 and -G and the Stratospheric Aerosol and Gas Experiment-Application Explorer Mission (SAGE-AEM) have been analyzed to determine the ability of the missions to adequately sample the global field.

  2. Statistical Analysis of the Exchange Rate of Bitcoin.

    PubMed

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate. PMID:26222702
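
    A sketch of the fitting-and-ranking exercise on synthetic log-returns, comparing three candidate distributions by AIC (SciPy also ships scipy.stats.genhyperbolic for the generalized hyperbolic distribution favored above):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(9)
      log_returns = stats.t.rvs(df=3, scale=0.02, size=2000, random_state=rng)  # synthetic

      for name, dist in [("normal", stats.norm), ("laplace", stats.laplace),
                         ("student t", stats.t)]:
          params = dist.fit(log_returns)
          loglik = np.sum(dist.logpdf(log_returns, *params))
          aic = 2 * len(params) - 2 * loglik
          print(f"{name:9s} AIC = {aic:.1f}")    # lower is better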

  3. Statistical Analysis of the Exchange Rate of Bitcoin

    PubMed Central

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate. PMID:26222702

  4. The statistical analysis of multivariate serological frequency data.

    PubMed

    Reyment, Richard A

    2005-11-01

    Data occurring in the form of frequencies are common in genetics, for example in serology. Examples are provided by the ABO group, the Rhesus group, and also DNA data. The statistical analysis of tables of frequencies is carried out using the available methods of multivariate analysis, usually with three principal aims. One of these is to seek meaningful relationships between the components of a data set, the second is to examine relationships between the populations from which the data have been obtained, and the third is to bring about a reduction in dimensionality. This latter aim is usually realized by means of bivariate scatter diagrams using scores computed from a multivariate analysis. The multivariate statistical analysis of tables of frequencies cannot safely be carried out by standard multivariate procedures because such tables represent compositions and are therefore embedded in simplex space, a subspace of full space. Appropriate procedures for simplex space are compared and contrasted with simple standard methods of multivariate analysis ("raw" principal component analysis). The study shows that the differences between a log-ratio model and a simple logarithmic transformation of proportions may not be very great, particularly as regards graphical ordinations, but important discrepancies do occur. The divergences between logarithmically based analyses and raw data are, however, great. Published data on Rhesus alleles observed for Italian populations are used to exemplify the subject. PMID:16024067
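
    A sketch contrasting "raw" PCA on compositional frequency data with PCA after a centered log-ratio (clr) transform, the kind of comparison the abstract discusses; the Dirichlet-generated table is an assumption:

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(10)
      # Hypothetical table: 30 populations x 4 allele frequencies, rows summing to 1
      comp = rng.dirichlet(alpha=[8, 4, 2, 1], size=30)

      # Centered log-ratio transform maps compositions out of the simplex
      clr = np.log(comp) - np.log(comp).mean(axis=1, keepdims=True)

      for label, X in [("raw", comp), ("clr", clr)]:
          pca = PCA(n_components=2).fit(X)
          print(label, pca.explained_variance_ratio_.round(3))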

  5. Statistical Analysis of Longitudinal Psychiatric Data with Dropouts

    PubMed Central

    Mazumdar, Sati; Tang, Gong; Houck, Patricia R.; Dew, Mary Amanda; Begley, Amy E.; Scott, John; Mulsant, Benoit H.; Reynolds, Charles F.

    2007-01-01

    Longitudinal studies are used in psychiatric research to address outcome changes over time within and between individuals. However, because participants may drop out of a study prematurely, ignoring the nature of dropout often leads to biased inference, and in turn, wrongful conclusions. The purpose of the present paper is: (1) to review several dropout processes, corresponding inferential issues and recent methodological advances; (2) to evaluate the impact of assumptions regarding the dropout processes on inference by simulation studies and an illustrative example using psychiatric data; and (3) to provide a general strategy for practitioners to perform analyses of longitudinal data with dropouts, using software available commercially or in the public domain. The statistical methods used in this paper are maximum likelihood, multiple imputation and semi-parametric regression methods for inference, as well as Little’s test and ISNI (Index of Sensitivity to Nonignorability) for assessing statistical dropout mechanisms. We show that accounting for the nature of the dropout process influences results and that sensitivity analysis is useful in assessing the robustness of parameter estimates and related uncertainties. We conclude that recording the causes of dropouts should be an integral part of any statistical analysis with longitudinal psychiatric data, and we recommend performing a sensitivity analysis when the exact nature of the dropout process cannot be discerned. PMID:17092516

  6. A study of brain white matter plasticity in early blinds using tract-based spatial statistics and tract statistical analysis.

    PubMed

    Lao, Yi; Kang, Yue; Collignon, Olivier; Brun, Caroline; Kheibai, Shadi B; Alary, Flamine; Gee, James; Nelson, Marvin D; Lepore, Franco; Lepore, Natasha

    2015-12-16

    Early blind individuals are known to exhibit structural brain reorganization. Particularly, early-onset blindness may trigger profound brain alterations that affect not only the visual system but also the remaining sensory systems. Diffusion tensor imaging (DTI) allows in-vivo visualization of brain white matter connectivity, and has been extensively used to study brain white matter structure. Among statistical approaches based on DTI, tract-based spatial statistics (TBSS) is widely used because of its ability to automatically perform whole brain white matter studies. Tract specific analysis (TSA) is a more recent method that localizes changes in specific white matter bundles. In the present study, we compare TBSS and TSA results of DTI scans from 12 early blind individuals and 13 age-matched sighted controls, with two aims: (a) to investigate white matter alterations associated with early visual deprivation; (b) to examine the relative sensitivity of TSA when compared with TBSS, for both deficit and hypertrophy of white matter microstructures. Both methods give consistent results for broad white matter regions of deficits. However, TBSS does not detect hypertrophy of white matter, whereas TSA shows a higher sensitivity in detecting subtle differences in white matter colocalized to the posterior parietal lobe. PMID:26559727

  7. Spectral Analysis of B Stars: An Application of Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2012-12-01

    To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.

  8. A statistical analysis of sea temperature data

    NASA Astrophysics Data System (ADS)

    Lorentzen, Torbjørn

    2015-02-01

    The paper analyzes sea temperature series measured at two geographical locations along the coast of Norway. We address the question whether the series are stable over the sample period 1936-2012 and whether we can measure any signal of climate change in the regional data. We use nonstandard supF, OLS-based CUSUM, RE, and Chow tests in combination with the Bai-Perron's structural break test to identify potential changes in the temperature. The augmented Dickey-Fuller, the KPSS, and the nonparametric Phillips-Perron tests are in addition applied in the evaluation of the stochastic properties of the series. The analysis indicates that both series undergo similar structural instabilities in the form of small shifts in the temperature level. The temperature at Lista (58° 06' N, 06° 38' E) shifts downward about 1962 while the Skrova series (68° 12' N, 14° 10' E) shifts to a lower level about 1977. Both series shift upward about 1987, and after a period of increasing temperature, both series start leveling off about the turn of the millennium. The series have no significant stochastic or deterministic trend. The analysis indicates that the mean temperature has moved upward in decadal, small steps since the 1980s. The result is in accordance with recent analyses of sea temperatures in the North Atlantic. The findings are also related to the so-called hiatus phenomenon where natural variation in climate can mask global warming processes. The paper contributes to the discussion of applying objective methods in measuring climate change.
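
    A sketch of the two unit-root tests named above, applied to a synthetic temperature-like series containing a small level shift (the series length and shift size are assumptions):

      import numpy as np
      from statsmodels.tsa.stattools import adfuller, kpss

      rng = np.random.default_rng(11)
      n = 77 * 12                                  # monthly values, 1936-2012 (assumed)
      series = 8.0 + 0.4 * (np.arange(n) > n // 2) + rng.normal(0, 0.5, n)

      adf_stat, adf_p, *_ = adfuller(series)       # H0: unit root
      kpss_stat, kpss_p, *_ = kpss(series, regression="c", nlags="auto")  # H0: stationary
      print(f"ADF p={adf_p:.3f}, KPSS p={kpss_p:.3f}")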

  9. A Laboratory Exercise in Statistical Analysis of Data

    NASA Astrophysics Data System (ADS)

    Vitha, Mark F.; Carr, Peter W.

    1997-08-01

    An undergraduate laboratory exercise in statistical analysis of data has been developed based on facile weighings of vitamin E pills. The use of electronic top-loading balances allows for very rapid data collection. Therefore, students obtain a sufficiently large number of replicates to provide statistically meaningful data sets. Through this exercise, students explore the effects of sample size and different types of sample averaging on the standard deviation of the average weight per pill. An emphasis is placed on the difference between the standard deviation of the mean and the standard deviation of the population. Students also perform the Q-test and t-test and are introduced to the χ²-test. In this report, the class data from two consecutive offerings of the course are compared and reveal a statistically significant increase in the average weight per pill, presumably due to the absorption of water over time. Histograms of the class data are shown and used to illustrate the importance of plotting the data. Overall, through this brief laboratory exercise, students are exposed to many important statistical tests and concepts which are then used and further developed throughout the remainder of the course.
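
    A sketch of the exercise's core calculations on synthetic pill masses: the standard deviation of the population versus that of the mean, and a two-sample t-test between two class years (all masses and sample sizes are invented):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(12)
      year1 = rng.normal(1.005, 0.012, 120)    # pill masses in grams (synthetic)
      year2 = rng.normal(1.010, 0.012, 120)    # slightly heavier, as the class data showed

      s_pop = year1.std(ddof=1)                # standard deviation of the population
      s_mean = s_pop / np.sqrt(year1.size)     # standard deviation of the mean
      t_stat, p = stats.ttest_ind(year1, year2)
      print(f"s = {s_pop:.4f} g, s_mean = {s_mean:.5f} g, t-test p = {p:.4f}")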

  10. HistFitter: a flexible framework for statistical data analysis

    NASA Astrophysics Data System (ADS)

    Besjes, G. J.; Baak, M.; Côté, D.; Koutsman, A.; Lorenz, J. M.; Short, D.

    2015-12-01

    HistFitter is a software framework for statistical data analysis that has been used extensively in the ATLAS Collaboration to analyze data of proton-proton collisions produced by the Large Hadron Collider at CERN. Most notably, HistFitter has become a de-facto standard in searches for supersymmetric particles since 2012, with some usage for Exotic and Higgs boson physics. HistFitter coherently combines several statistics tools in a programmable and flexible framework that is capable of bookkeeping hundreds of data models under study using thousands of generated input histograms. HistFitter interfaces with the statistics tools HistFactory and RooStats to construct parametric models and to perform statistical tests of the data, and extends these tools in four key areas. The key innovations are to weave the concepts of control, validation and signal regions into the very fabric of HistFitter, and to treat these with rigorous methods. Multiple tools to visualize and interpret the results through a simple configuration interface are also provided.

  11. Region-based Statistical Analysis of 2D PAGE Images

    PubMed Central

    Li, Feng; Seillier-Moiseiwitsch, Françoise; Korostyshevskiy, Valeriy R.

    2011-01-01

    A new comprehensive procedure for statistical analysis of two-dimensional polyacrylamide gel electrophoresis (2D PAGE) images is proposed, including protein region quantification, normalization and statistical analysis. Protein regions are defined by the master watershed map that is obtained from the mean gel. By working with these protein regions, the approach bypasses the current bottleneck in the analysis of 2D PAGE images: it does not require spot matching. Background correction is implemented in each protein region by local segmentation. Two-dimensional locally weighted smoothing (LOESS) is proposed to remove any systematic bias after quantification of protein regions. Proteins are separated into mutually independent sets based on detected correlations, and a multivariate analysis is used on each set to detect the group effect. A strategy for multiple hypothesis testing based on this multivariate approach combined with the usual Benjamini-Hochberg FDR procedure is formulated and applied to the differential analysis of 2D PAGE images. Each step in the analytical protocol is shown by using an actual dataset. The effectiveness of the proposed methodology is shown using simulated gels in comparison with the commercial software packages PDQuest and Dymension. We also introduce a new procedure for simulating gel images. PMID:21850152

  12. Statistical strategies to reveal potential vibrational markers for in vivo analysis by confocal Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Oliveira Mendes, Thiago de; Pinto, Liliane Pereira; Santos, Laurita dos; Tippavajhala, Vamshi Krishna; Téllez Soto, Claudio Alberto; Martin, Airton Abrahão

    2016-07-01

    The analysis of biological systems by spectroscopic techniques involves the evaluation of hundreds to thousands of variables. Hence, different statistical approaches are used to elucidate regions that discriminate classes of samples and to propose new vibrational markers for explaining various phenomena such as disease monitoring, mechanisms of action of drugs, food properties, and so on. However, the underlying statistical techniques are not always widely discussed in the applied sciences. In this context, this work presents a detailed discussion of the steps necessary for proper statistical analysis. It includes univariate parametric and nonparametric tests, as well as multivariate unsupervised and supervised approaches. The main objective of this study is to promote proper understanding of the application of various statistical tools in these spectroscopic methods used for the analysis of biological samples. The discussion of these methods is carried out on a set of in vivo confocal Raman spectra of human skin, in an analysis that aims to identify skin aging markers. In the Appendix, a complete data analysis routine is executed in free software that can be used by the scientific community involved in these studies.

  13. Statistically significant faunal differences among Middle Ordovician age, Chickamauga Group bryozoan bioherms, central Alabama

    SciTech Connect

    Crow, C.J.

    1985-01-01

    Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.
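
    A sketch of a G-test of homogeneity on a hypothetical bioherm-by-taxon count table; SciPy's chi2_contingency with the log-likelihood option computes the G statistic used in the study:

      import numpy as np
      from scipy.stats import chi2_contingency

      # Rows: two bioherms; columns: counts of four organism groups (invented)
      counts = np.array([[120, 30, 18, 6],
                         [ 80, 55, 10, 20]])
      g, p, dof, expected = chi2_contingency(counts, lambda_="log-likelihood")
      print(f"G = {g:.2f}, dof = {dof}, p = {p:.4g}")   # small p: heterogeneous bioherms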

  14. Statistical analysis of heartbeat data with wavelet techniques

    NASA Astrophysics Data System (ADS)

    Pazsit, Imre

    2004-05-01

    The purpose of this paper is to demonstrate the use of some methods of signal analysis, performed on ECG and in some cases blood pressure signals, for the classification of the health status of the heart of mice and rats. Spectral and wavelet analysis were performed on the raw signals. FFT-based coherence and phase was also calculated between blood pressure and raw ECG signals. Finally, RR-intervals were deduced from the ECG signals and an analysis of the fractal dimensions was performed. The analysis was made on data from mice and rats. A correlation was found between the health status of the mice and the rats and some of the statistical descriptors, most notably the phase of the cross-spectra between ECG and blood pressure, and the fractal properties and dimensions of the interbeat series (RR-interval fluctuations).
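
    A sketch of the FFT-based coherence and phase computation between two synthetic signals standing in for ECG and blood pressure (the sampling rate, beat frequency, and lag are assumptions):

      import numpy as np
      from scipy import signal

      fs = 250.0                                   # sampling rate in Hz (assumed)
      t = np.arange(0, 120, 1 / fs)
      rng = np.random.default_rng(13)
      ecg = np.sin(2*np.pi*6*t) + 0.5*rng.normal(size=t.size)        # ~6 Hz beat
      bp = np.sin(2*np.pi*6*t - 0.8) + 0.5*rng.normal(size=t.size)   # delayed response

      f, cxy = signal.coherence(ecg, bp, fs=fs, nperseg=2048)
      _, pxy = signal.csd(ecg, bp, fs=fs, nperseg=2048)
      i = np.argmax(cxy)
      print(f"peak coherence {cxy[i]:.2f} at {f[i]:.2f} Hz, phase {np.angle(pxy)[i]:.2f} rad")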

  15. Bayes Method Plant Aging Risk Analysis

    1992-03-13

    DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g., for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized aging models (i.e., possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities.
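
    A sketch of the Bayesian update described, for two hypothesized aging models scored against Poisson-distributed yearly failure counts; the rates, priors, and counts are illustrative and are not DORIAN's actual models:

      import numpy as np
      from scipy.stats import poisson

      years = np.arange(10)
      failures = np.array([1, 0, 2, 1, 2, 3, 2, 4, 3, 5])   # observed counts per year

      models = {"no aging": np.full(years.size, 2.0),        # constant failure rate
                "linear aging": 1.0 + 0.35 * years}          # increasing failure rate
      prior = {"no aging": 0.5, "linear aging": 0.5}

      # Posterior model probability: prior x likelihood, then normalize
      post = {m: prior[m] * np.prod(poisson.pmf(failures, lam)) for m, lam in models.items()}
      total = sum(post.values())
      for m in post:
          print(m, round(post[m] / total, 3))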

  16. Agriculture, population growth, and statistical analysis of the radiocarbon record

    PubMed Central

    Zahid, H. Jabran; Robinson, Erick; Kelly, Robert L.

    2016-01-01

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide. PMID:26699457

  17. Agriculture, population growth, and statistical analysis of the radiocarbon record.

    PubMed

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L

    2016-01-26

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide. PMID:26699457

  18. Statistical wind analysis for near-space applications

    NASA Astrophysics Data System (ADS)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
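
    A minimal sketch of the distribution-fitting step described above, assuming SciPy is available; the wind-speed sample is synthetic, not the Akron or White Sands data. It fits a two-parameter Weibull and reads off the 50%, 95%, and 99% winds.

      import numpy as np
      from scipy.stats import weibull_min

      # Synthetic wind-speed sample (m/s) at one altitude; shape 2, scale 12.
      rng = np.random.default_rng(0)
      speeds = 12.0 * rng.weibull(2.0, 1000)

      # Two-parameter Weibull fit (location pinned at zero, as usual for winds).
      shape, loc, scale = weibull_min.fit(speeds, floc=0)

      # Percentile winds that drive station-keeping power requirements.
      for q in (0.50, 0.95, 0.99):
          print(f"{q:.0%} wind: {weibull_min.ppf(q, shape, loc, scale):.1f} m/s")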

  19. Analysis of the Spatial Organization of Molecules with Robust Statistics

    PubMed Central

    Lagache, Thibault; Lang, Gabriel; Sauvonnet, Nathalie; Olivo-Marin, Jean-Christophe

    2013-01-01

    One major question in molecular biology is whether the spatial distribution of observed molecules is random or organized in clusters. Indeed, this analysis gives information about molecules’ interactions and physical interplay with their environment. The standard tool for analyzing molecules’ distribution statistically is the Ripley’s K function, which tests spatial randomness through the computation of its critical quantiles. However, quantiles’ computation is very cumbersome, hindering its use. Here, we present an analytical expression of these quantiles, leading to a fast and robust statistical test, and we derive the characteristic clusters’ size from the maxima of the Ripley’s K function. Subsequently, we analyze the spatial organization of endocytic spots at the cell membrane and we report that clathrin spots are randomly distributed while clathrin-independent spots are organized in clusters with a radius of , which suggests distinct physical mechanisms and cellular functions for each pathway. PMID:24349021
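
    The following sketch computes a naive Ripley's K estimate for a 2-D point pattern and compares it with the complete-spatial-randomness expectation K(r) = pi * r^2; the edge corrections and the analytical quantiles derived in the paper are omitted, and the point data are synthetic.

      import numpy as np

      def ripley_k(points, radii, area):
          # Naive Ripley's K estimate: K(r) = (A / n**2) * #{pairs with 0 < d < r}.
          # Edge corrections are omitted for brevity.
          n = len(points)
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          return np.array([(area / n**2) * ((d > 0) & (d < r)).sum() for r in radii])

      # Under complete spatial randomness E[K(r)] = pi * r**2, so a positive
      # excess indicates clustering at that scale.
      rng = np.random.default_rng(1)
      pts = rng.uniform(0, 10, size=(200, 2))   # illustrative spot positions
      radii = np.linspace(0.1, 2.0, 20)
      print(ripley_k(pts, radii, area=100.0) - np.pi * radii**2)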

  20. Statistical analysis of nanoparticle dosing in a dynamic cellular system

    NASA Astrophysics Data System (ADS)

    Summers, Huw D.; Rees, Paul; Holton, Mark D.; Rowan Brown, M.; Chappell, Sally C.; Smith, Paul J.; Errington, Rachel J.

    2011-03-01

    The delivery of nanoparticles into cells is important in therapeutic applications and in nanotoxicology. Nanoparticles are generally targeted to receptors on the surfaces of cells and internalized into endosomes by endocytosis, but the kinetics of the process and the way in which cell division redistributes the particles remain unclear. Here we show that the chance of success or failure of nanoparticle uptake and inheritance is random. Statistical analysis of nanoparticle-loaded endosomes indicates that particle capture is described by an over-dispersed Poisson probability distribution that is consistent with heterogeneous adsorption and internalization. Partitioning of nanoparticles in cell division is random and asymmetric, following a binomial distribution with mean probability of 0.52-0.72. These results show that cellular targeting of nanoparticles is inherently imprecise due to the randomness of nature at the molecular scale, and the statistical framework offers a way to predict nanoparticle dosage for therapy and for the study of nanotoxins.
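
    The reported statistics can be reproduced qualitatively with a short simulation: an over-dispersed Poisson (gamma-Poisson, i.e. negative binomial) uptake step followed by a binomial partition at division. All parameter values below are illustrative, with the inheritance probability chosen inside the reported 0.52-0.72 range.

      import numpy as np

      rng = np.random.default_rng(2)
      n_cells = 10000

      # Over-dispersed Poisson uptake as a gamma-Poisson (negative binomial)
      # mixture: heterogeneous adsorption gives each cell its own mean dose.
      mean_dose, dispersion = 20.0, 5.0
      cell_rates = rng.gamma(dispersion, mean_dose / dispersion, n_cells)
      dose = rng.poisson(cell_rates)

      # Division partitions each cell's particles binomially and asymmetrically;
      # 0.6 lies inside the reported 0.52-0.72 range of inheritance probabilities.
      daughter1 = rng.binomial(dose, 0.6)
      daughter2 = dose - daughter1

      print(dose.mean(), dose.var())             # variance > mean: over-dispersed
      print(daughter1.mean(), daughter2.mean())  # asymmetric inheritance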

  1. Statistical analysis of the particulation of shaped charge jets

    SciTech Connect

    Minich, R W; Baker, E L; Schwartz, A J

    1999-08-12

    A statistical analysis of shaped charge jet break-up was carried out in order to investigate the role of nonlinear instabilities leading to the particulation of the jet. Statistical methods generally used for studying fluctuations in nonlinear dynamical systems are applied to experimentally measured velocities of the individual particles. In particular, we present results suggesting deviations from Gaussian behavior in the interparticle velocity correlations, characteristic of nonlinear dynamical systems. Results are presented for two silver shaped charge jets that differ primarily in their material processing. We provide evidence that the particulation of a jet is not random, but has its origin in a deterministic dynamical process involving the nonlinear coupling of two oscillators, analogous to the underlying dynamics observed in Rayleigh-Benard convection and modeled in the return map of Curry and Yorke.

  2. A Statistical Analysis of Lunisolar-Earthquake Connections

    NASA Astrophysics Data System (ADS)

    Rüegg, Christian Michael-André

    2012-11-01

    Despite over a century of study, the relationship between lunar cycles and earthquakes remains controversial and difficult to quantitatively investigate. Perhaps as a consequence, major earthquakes around the globe are frequently followed by "prediction claims" based on lunar cycles that generate media furore and pressure scientists to provide resolute answers. The 2010-2011 Canterbury earthquakes in New Zealand were no exception; significant media attention was given to lunar derived earthquake predictions by non-scientists, even though the predictions were merely "opinions" and were not based on any statistically robust temporal or causal relationships. This thesis provides a framework for studying lunisolar earthquake temporal relationships by developing replicable statistical methodology based on peer reviewed literature. Notable in the methodology is a high accuracy ephemeris, called ECLPSE, designed specifically by the author for use on earthquake catalogs, and a model for performing phase angle analysis.

  3. Statistical analysis of effective singular values in matrix rank determination

    NASA Technical Reports Server (NTRS)

    Konstantinides, Konstantinos; Yao, Kung

    1988-01-01

    A major problem in using SVD (singular-value decomposition) as a tool in determining the effective rank of a perturbed matrix is that of distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theories of perturbations of singular values and statistical significance testing. Threshold bounds for perturbation due to finite-precision and i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. The results are applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix. Various numerical examples illustrating the usefulness of these bounds and comparisons to other previously known approaches are given.

  4. Statistical analysis of nanoparticle dosing in a dynamic cellular system.

    PubMed

    Summers, Huw D; Rees, Paul; Holton, Mark D; Brown, M Rowan; Chappell, Sally C; Smith, Paul J; Errington, Rachel J

    2011-03-01

    The delivery of nanoparticles into cells is important in therapeutic applications and in nanotoxicology. Nanoparticles are generally targeted to receptors on the surfaces of cells and internalized into endosomes by endocytosis, but the kinetics of the process and the way in which cell division redistributes the particles remain unclear. Here we show that the chance of success or failure of nanoparticle uptake and inheritance is random. Statistical analysis of nanoparticle-loaded endosomes indicates that particle capture is described by an over-dispersed Poisson probability distribution that is consistent with heterogeneous adsorption and internalization. Partitioning of nanoparticles in cell division is random and asymmetric, following a binomial distribution with mean probability of 0.52-0.72. These results show that cellular targeting of nanoparticles is inherently imprecise due to the randomness of nature at the molecular scale, and the statistical framework offers a way to predict nanoparticle dosage for therapy and for the study of nanotoxins. PMID:21258333

  5. Statistical analysis of subjective preferences for video enhancement

    NASA Astrophysics Data System (ADS)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
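
    A minimal sketch of the binary-logistic-regression alternative to Thurstone scaling, assuming statsmodels is available; the comparison design and responses are invented for illustration. Each coefficient plays the role of a perceptual-scale value for an enhancement level (one level pinned at zero for identifiability), and the fit supplies the standard errors and significance tests that Thurstone scaling lacks.

      import numpy as np
      import statsmodels.api as sm

      # Each row is one pairwise comparison between enhancement levels (level 3
      # is the reference, pinned to 0): +1 marks the first item shown, -1 the
      # second. y = 1 if the first item was preferred. Data are illustrative.
      X = np.array([[ 1, -1],   # level 1 vs level 2
                    [ 1,  0],   # level 1 vs level 3
                    [ 0,  1],   # level 2 vs level 3
                    [-1,  1],   # level 2 vs level 1
                    [ 0, -1],   # level 3 vs level 2
                    [-1,  0]])  # level 3 vs level 1
      y = np.array([1, 1, 1, 0, 0, 1])

      # Coefficients act as perceptual scale values for levels 1 and 2,
      # now with standard errors and significance tests attached.
      fit = sm.Logit(y, X).fit(disp=0)
      print(fit.params, fit.pvalues)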

  6. Noise removing in encrypted color images by statistical analysis

    NASA Astrophysics Data System (ADS)

    Islam, N.; Puech, W.

    2012-03-01

    Cryptographic techniques are used to secure confidential data from unauthorized access, but these techniques are very sensitive to noise. A single bit change in encrypted data can have a catastrophic impact on the decrypted data. This paper addresses the problem of removing bit errors in visual data which are encrypted using the AES algorithm in the CBC mode. In order to remove the noise, a method is proposed which is based on the statistical analysis of each block during the decryption. The proposed method exploits local statistics of the visual data and the confusion/diffusion properties of the encryption algorithm to remove the errors. Experimental results show that the proposed method can be used at the receiving end as a possible solution for noise removal in visual data in the encrypted domain.
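
    The error structure such a method exploits can be seen directly: in AES-CBC, a single flipped ciphertext bit garbles one whole decrypted block (cipher confusion/diffusion) and flips exactly one bit at the same offset in the next block. The sketch below assumes the pycryptodome package; the key, IV, and data are throwaway values.

      from Crypto.Cipher import AES              # assumes the pycryptodome package
      from Crypto.Random import get_random_bytes
      from Crypto.Util.Padding import pad

      key, iv = get_random_bytes(16), get_random_bytes(16)
      plain = bytes(range(64))                   # four 16-byte blocks of "visual data"

      ct = bytearray(AES.new(key, AES.MODE_CBC, iv).encrypt(pad(plain, 16)))
      ct[20] ^= 0x01                             # a single bit error in ciphertext block 1

      out = AES.new(key, AES.MODE_CBC, iv).decrypt(bytes(ct))
      # Decrypted block 1 is fully garbled, while block 2 differs from the
      # original in exactly one bit at the same offset; this local structure
      # is what a statistical per-block analysis can detect and correct.
      for i in range(4):
          print(i, out[16 * i:16 * (i + 1)] == plain[16 * i:16 * (i + 1)])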

  7. Statistical Mechanics Analysis of ATP Binding to a Multisubunit Enzyme

    NASA Astrophysics Data System (ADS)

    Zhang, Yun-Xin

    2014-10-01

    Due to inter-subunit communication, multisubunit enzymes usually hydrolyze ATP in a concerted fashion. However, so far the principle of this process remains poorly understood. In this study, from the viewpoint of statistical mechanics, a simple model is presented. In this model, we assume that the binding of ATP will change the potential of the corresponding enzyme subunit, and the degree of this change depends on the state of its adjacent subunits. The probability of the enzyme being in a given state follows the Boltzmann distribution. Although quite simple, this model fits the recent experimental data for the chaperonin TRiC/CCT well. From this model, the dominant state of TRiC/CCT can be obtained. This study provides a new way to understand biophysical processes through statistical mechanics analysis.
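
    A minimal sketch of such a model, assuming a ring of subunits that are either empty or ATP-bound, with a per-ATP binding energy and a nearest-neighbour coupling term; eps, J, and kT are illustrative parameters, not values fitted to TRiC/CCT. State probabilities follow the Boltzmann distribution.

      import itertools
      import numpy as np

      # Ring of n subunits, each empty (0) or ATP-bound (1); eps, J, kT are
      # illustrative parameters, not values fitted to TRiC/CCT.
      n, eps, J, kT = 8, -1.0, 0.5, 1.0

      def energy(state):
          e = eps * sum(state)                          # binding energy per ATP
          e += J * sum(state[i] * state[(i + 1) % n]    # nearest-neighbour coupling
                       for i in range(n))
          return e

      states = list(itertools.product([0, 1], repeat=n))
      w = np.array([np.exp(-energy(s) / kT) for s in states])
      p = w / w.sum()                                   # Boltzmann distribution

      occupancy = np.array([sum(s) for s in states])
      print(states[int(np.argmax(p))])                  # dominant binding pattern
      print((p * occupancy).sum())                      # mean number of bound ATP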

  8. Statistical analysis of the modal properties of large structural systems.

    NASA Technical Reports Server (NTRS)

    Collins, J. D.; Kennedy, B.; Hart, G. C.

    1971-01-01

    A theory is developed to predict eigenvalue and eigenvector uncertainty in large dynamic models. The uncertainty is based on physical property uncertainty and should not be confused with numerical roundoff, although the method can be extended to include the latter. The theory, when implemented on a computer, is used to analyze the uncertainties in frequencies and mode shapes based on uncertainties in mass, stiffness, modulus of elasticity, etc. The method incorporates a linear statistical model which is quite adequate for handling property uncertainties of 10% or more. The model is not limited to small systems but uses certain statistical assumptions as well as selective matrix manipulations to keep the size of all matrix operations to within the number of degrees of freedom of the system. Examples are given for two longitudinal vibration problems, and the results are supported by a Monte Carlo analysis.
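
    The Monte Carlo check mentioned at the end of the abstract can be sketched as follows for a small spring-mass chain with roughly 10% stiffness scatter, assuming SciPy; the structure and property values are illustrative, not the paper's models.

      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(3)

      def stiffness(k):
          # Stiffness matrix of a fixed-free 3-DOF spring-mass chain.
          return np.array([[k[0] + k[1], -k[1], 0.0],
                           [-k[1], k[1] + k[2], -k[2]],
                           [0.0, -k[2], k[2]]])

      M = np.diag([1.0, 1.0, 1.0])    # nominal masses, kg
      k0 = np.array([1e4, 1e4, 1e4])  # nominal stiffnesses, N/m

      # Monte Carlo scatter of natural frequencies for ~10% stiffness
      # uncertainty, the regime where a linear statistical model is adequate.
      freqs = []
      for _ in range(2000):
          k = k0 * (1 + 0.10 * rng.standard_normal(3))
          lam = eigh(stiffness(k), M, eigvals_only=True)
          freqs.append(np.sqrt(np.abs(lam)) / (2 * np.pi))
      freqs = np.array(freqs)
      print(freqs.mean(axis=0), freqs.std(axis=0))  # per-mode mean and scatter, Hz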

  9. Statistical Analysis of Human Blood Cytometries: Potential Donors and Patients

    NASA Astrophysics Data System (ADS)

    Bernal-Alvarado, J.; Segovia-Olvera, P.; Mancilla-Escobar, B. E.; Palomares, P.

    2004-09-01

    The histograms of the cell volume from human blood present valuable information for clinical evaluation. Measurements can be performed with automatic equipment and a graphical presentation of the data is available; nevertheless, a statistical and mathematical analysis of the cell volume distribution could also be useful for medical interpretation, since the numerical parameters characterizing the histograms might be correlated with healthy and patient populations. In this work, a statistical exercise was performed in order to find the most suitable model fitting the cell volume histograms. Several trial functions were tested and their parameters were tabulated. Healthy people exhibited an average cell volume of 85 femtoliters while patients had 95 femtoliters. White blood cells presented a small variation and platelets preserved their average for both populations.

  10. The Effects of Statistical Analysis Software and Calculators on Statistics Achievement

    ERIC Educational Resources Information Center

    Christmann, Edwin P.

    2009-01-01

    This study compared the effects of microcomputer-based statistical software and hand-held calculators on the statistics achievement of university males and females. The subjects, 73 graduate students enrolled in univariate statistics classes at a public comprehensive university, were randomly assigned to groups that used either microcomputer-based…

  11. Statistical analysis of pitting corrosion in condenser tubes

    SciTech Connect

    Ault, J.P.; Gehring, G.A. Jr.

    1997-12-31

    Condenser tube failure via wall penetration allows cooling water to contaminate the working fluid (steam). Contamination, especially from brackish or saltwater, will lower steam quality and thus lower overall plant efficiency. Because of the importance of minimizing leakages, power plant engineers are primarily concerned with the maximum localized corrosion in a unit rather than average corrosion values or rates. Extreme value analysis is a useful tool for evaluating the condition of condenser tubing. Extreme value statistical techniques allow the prediction of the most probable deepest pit in a given surface area based upon data acquired from a smaller surface area. Data is gathered from a physical examination of actual tubes (either in-service or from a sidestream unit) rather than small sample coupons. Three distinct applications of extreme value statistics to condenser tube evaluation are presented in this paper: (1) condition assessment of an operating condenser, (2) design data for material selection, and (3) research tool for assessing impact of various factors on condenser tube corrosion. The projections for operating units based on extreme value analysis are shown to be more useful than those made on the basis of other techniques such as eddy current or electrochemical measurements. Extreme value analysis would benefit from advances in two key areas: (1) development of an accurate and economical method for the measurement of maximum pit depths of condenser tubes in-situ would enhance the application of extreme value statistical analysis to the assessment of condenser tubing corrosion pitting and (2) development of methodologies to predict pit depth-time relationship in addition to pit depth-area relationship would be useful for modeling purposes.
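
    A minimal sketch of the extreme value extrapolation described above, assuming SciPy: fit a Gumbel distribution to the maximum pit depths from small inspection areas, then predict the most probable deepest pit over an area T times larger as the (1 - 1/T) quantile. The depth data and area ratio are illustrative.

      import numpy as np
      from scipy.stats import gumbel_r

      # Maximum pit depth (mm) from each of eight inspected areas (illustrative).
      max_depths = np.array([0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44, 0.58])

      loc, scale = gumbel_r.fit(max_depths)

      # Deepest pit expected over an area T times larger than one inspected
      # area, taken as the (1 - 1/T) quantile of the fitted distribution.
      T = 100.0
      print(gumbel_r.ppf(1 - 1 / T, loc, scale))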

  12. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    SciTech Connect

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
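
    The UCL95% computation described above reduces to a one-sided t-interval on the mean. A minimal sketch, assuming SciPy, with illustrative concentrations in place of the actual analytical results:

      import numpy as np
      from scipy.stats import t

      # Six scrape-sample results for one analyte (illustrative concentrations).
      x = np.array([1.8, 2.1, 1.6, 2.4, 2.0, 1.9])
      n = len(x)

      # One-sided upper 95% confidence limit on the mean concentration, from
      # the sample size, average, and standard deviation as described above.
      ucl95 = x.mean() + t.ppf(0.95, df=n - 1) * x.std(ddof=1) / np.sqrt(n)
      print(ucl95)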

  13. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    SciTech Connect

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  14. A Multivariate Statistical Analysis of Visibility at California Regions.

    NASA Astrophysics Data System (ADS)

    Motallebi, Nehzat

    This study summarizes the results of a comprehensive investigation of visibility in California. California is one of the few states that has promulgated air quality standards for visibility. The study was concerned not only with major metropolitan areas such as Los Angeles, but also with deterioration of visibility in the less urbanized areas of California. The relationships among visibility reduction, atmospheric pollutants, and meteorological conditions were examined by using the multivariate statistical techniques of principal component analysis and multiple linear regression analysis. The primary concern of this work was to find which of the many atmospheric constituents most effectively reduce visibility, and to determine the role of the different meteorological variables on these relationships. Another objective was to identify the major pollutant sources and transport routes which contribute to visibility degradation. In order to establish the relationship between the light scattering coefficient and particulate data, both the size distribution and the elemental composition of particulate aerosols were considered. Meanwhile, including meteorological parameters in the principal component analysis made it possible to investigate meteorological effects on the observed pollution patterns. The associations among wind direction, elemental concentration, and additional meteorological parameters were considered by using a special modification of principal component analysis. This technique can identify all of the main features, and provides reasonable source direction for particular elements. It is appropriate to note that there appeared to be no published accounts of a principal component analysis for a data set similar to that analyzed in this work. Finally, the results of the multivariate statistical analyses, multiple linear regression analysis and principal component analysis, indicate that intermediate size sulfur containing aerosols, sulfur size mode 0.6 μm < D

  15. Statistical analysis of static shape control in space structures

    NASA Technical Reports Server (NTRS)

    Burdisso, Ricardo A.; Haftka, Raphael T.

    1990-01-01

    The article addresses the problem of efficient analysis of the statistics of initial and corrected shape distortions in space structures. Two approaches for improving efficiency are considered: one is an adjoint technique for calculating distortion shapes; the second is a modal expansion of distortion shapes in terms of pseudo-vibration modes. The two techniques are applied to the problem of optimizing actuator locations on a 55 m radiometer antenna. The adjoint analysis technique is used with a discrete-variable optimization method. The modal approximation technique is coupled with a standard conjugate-gradient continuous optimization method. The agreement between the two sets of results is good, validating both the approximate analysis and optimality of the results.

  16. STATISTICS. The reusable holdout: Preserving validity in adaptive data analysis.

    PubMed

    Dwork, Cynthia; Feldman, Vitaly; Hardt, Moritz; Pitassi, Toniann; Reingold, Omer; Roth, Aaron

    2015-08-01

    Misapplication of statistical data analysis is a common cause of spurious discoveries in scientific research. Existing approaches to ensuring the validity of inferences drawn from data assume a fixed procedure to be performed, selected before the data are examined. In common practice, however, data analysis is an intrinsically adaptive process, with new analyses generated on the basis of data exploration, as well as the results of previous analyses on the same data. We demonstrate a new approach for addressing the challenges of adaptivity based on insights from privacy-preserving data analysis. As an application, we show how to safely reuse a holdout data set many times to validate the results of adaptively chosen analyses. PMID:26250683
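
    The holdout-reuse mechanism the paper proposes (Thresholdout) can be sketched as follows: answer each adaptive query with the training-set estimate unless it disagrees with the holdout estimate by more than a noisy threshold, and add noise whenever the holdout is actually consulted. The thresholds, noise scales, and query values below are illustrative, not the paper's settings.

      import numpy as np

      rng = np.random.default_rng(4)

      def thresholdout(train_vals, holdout_vals, threshold=0.04, sigma=0.01):
          # Answer with the training estimate unless it disagrees with the
          # holdout by more than a noisy threshold; add noise whenever the
          # holdout is consulted, which preserves its validity across reuse.
          answers = []
          for tr, ho in zip(train_vals, holdout_vals):
              if abs(tr - ho) > threshold + rng.laplace(0, 2 * sigma):
                  answers.append(ho + rng.laplace(0, sigma))
              else:
                  answers.append(tr)
          return answers

      # Five adaptively chosen query estimates on the training/holdout splits.
      train = [0.61, 0.55, 0.70, 0.52, 0.58]
      hold = [0.60, 0.49, 0.63, 0.51, 0.57]
      print(thresholdout(train, hold))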

  17. Data and statistical methods for analysis of trends and patterns

    SciTech Connect

    Atwood, C.L.; Gentillon, C.D.; Wilson, G.E.

    1992-11-01

    This report summarizes topics considered at a working meeting on data and statistical methods for analysis of trends and patterns in US commercial nuclear power plants. This meeting was sponsored by the Office of Analysis and Evaluation of Operational Data (AEOD) of the Nuclear Regulatory Commission (NRC). Three data sets are briefly described: Nuclear Plant Reliability Data System (NPRDS), Licensee Event Report (LER) data, and Performance Indicator data. Two types of study are emphasized: screening studies, to see if any trends or patterns appear to be present; and detailed studies, which are more concerned with checking the analysis assumptions, modeling any patterns that are present, and searching for causes. A prescription is given for a screening study, and ideas are suggested for a detailed study, when the data take any of three forms: counts of events per time, counts of events per demand, and non-event data.

  18. Managing Performance Analysis with Dynamic Statistical Projection Pursuit

    SciTech Connect

    Vetter, J.S.; Reed, D.A.

    2000-05-22

    Computer systems and applications are growing more complex. Consequently, performance analysis has become more difficult due to the complex, transient interrelationships among runtime components. To diagnose these types of performance issues, developers must use detailed instrumentation to capture a large number of performance metrics. Unfortunately, this instrumentation may actually influence the performance analysis, leading the developer to an ambiguous conclusion. In this paper, we introduce a technique for focusing a performance analysis on interesting performance metrics. This technique, called dynamic statistical projection pursuit, identifies interesting performance metrics that the monitoring system should capture across some number of processors. By reducing the number of performance metrics, projection pursuit can limit the impact of instrumentation on the performance of the target system and can reduce the volume of performance data.

  19. Teaching Statistics in Biology: Using Inquiry-Based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    ERIC Educational Resources Information Center

    Metz, Anneke M.

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly…

  20. Gis-Based Spatial Statistical Analysis of College Graduates Employment

    NASA Astrophysics Data System (ADS)

    Tang, R.

    2012-07-01

    It is urgently necessary to be aware of the distribution and employment status of college graduates for proper allocation of human resources and overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis in the distribution and employment status of college graduates based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict the employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake high and new technology development zone increased dramatically from 2004 to 2008, and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that the specialty graduates major in has an important impact on the number employed and the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of the employment status, is a potential tool for human resource development research.

  1. Statistical energy analysis of a geared rotor system

    NASA Technical Reports Server (NTRS)

    Lim, Teik C.; Singh, Rajendra

    1990-01-01

    The vibroacoustic response of a generic geared rotor system is analyzed on an order of magnitude basis utilizing an approximate statistical energy analysis method. This model includes a theoretical coupling loss factor for a generic bearing component, which properly accounts for vibration transmission through rolling element bearings. A simplified model of a NASA test stand that assumes vibratory energy flow from the gear mesh source to the casing through shafts and bearings is given as an example. Effects of dissipation loss factor and gearbox radiation efficiency models are studied by comparing predictions with NASA test results.

  2. Statistical energy analysis of complex structures, phase 2

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1980-01-01

    A method for estimating the structural vibration properties of complex systems in high frequency environments was investigated. The structure analyzed was the Materials Experiment Assembly (MEA), which is a portion of the OST-2A payload for the space transportation system. Statistical energy analysis (SEA) techniques were used to model the structure and predict the structural element response to acoustic excitation. A comparison of the initial response predictions and measured acoustic test data is presented. The conclusions indicate that the SEA predicted the response of primary structure to acoustic excitation over a wide range of frequencies, and that the contribution of mechanically induced random vibration to the total MEA response is not significant.

  3. Multi-scale statistical analysis of coronal solar activity

    DOE PAGES Beta

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-08

    Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.

  4. Data collection, computation and statistical analysis in psychophysiological experiments.

    PubMed

    Buzzi, R; Wespi, J; Zwimpfer, J

    1982-01-01

    The system was designed to allow simultaneous monitoring of eight bioelectrical signals together with the necessary event markers. The data inputs are pulse code modulated, recorded on magnetic tape, and then read into a minicomputer. The computer permits the determination of parameters for the following signals: electrocardiogram (ECG), respiration (RESP), skin conductance changes (SCC), electromyogram (EMG), plethysmogram (PLET), pulse transmission time (PTT), and electroencephalogram (EEG). These parameters are determined for time blocks of selectable duration and read into a mainframe computer for further statistical analysis. PMID:7183101

  5. Skylab 2 ground winds data reduction and statistical analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A ground winds test was conducted on the Skylab 2 spacecraft in a subsonic wind tunnel and the results were tape recorded for analysis. The data reduction system used to analyze the tapes for full scale, first and second mode bending moments, or acceleration plots versus dynamic pressure or wind velocity is explained. Portions of the Skylab 2 tape data were analyzed statistically in the form of power spectral densities, autocorrelations, and cross correlations to introduce a concept of using system response decay as a measure of linear system damping.

  6. Multi-scale statistical analysis of coronal solar activity

    NASA Astrophysics Data System (ADS)

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-01

    Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.

  7. Statistical Analysis in Genetic Studies of Mental Illnesses

    PubMed Central

    Zhang, Heping

    2011-01-01

    Identifying the risk factors for mental illnesses is of significant public health importance. Diagnosis, stigma associated with mental illnesses, comorbidity, and complex etiologies, among others, make it very challenging to study mental disorders. Genetic studies of mental illnesses date back at least a century, beginning with descriptive studies based on Mendelian laws of inheritance. A variety of study designs including twin studies, family studies, linkage analysis, and more recently, genomewide association studies have been employed to study the genetics of mental illnesses, or complex diseases in general. In this paper, I will present the challenges and methods from a statistical perspective and focus on genetic association studies. PMID:21909187

  8. Statistical Analysis of Strength Data for an Aerospace Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Neergaard, Lynn; Malone, Tina; Gentz, Steven J. (Technical Monitor)

    2000-01-01

    Aerospace vehicles are produced in limited quantities that do not always allow development of MIL-HDBK-5 A-basis design allowables. One method of examining production and composition variations is to perform 100% lot acceptance testing for aerospace Aluminum (Al) alloys. This paper discusses statistical trends seen in strength data for one Al alloy. A four-step approach reduced the data to residuals, visualized residuals as a function of time, grouped data with quantified scatter, and conducted analysis of variance (ANOVA).

  9. Statistical Analysis of Strength Data for an Aerospace Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Neergaard, L.; Malone, T.

    2001-01-01

    Aerospace vehicles are produced in limited quantities that do not always allow development of MIL-HDBK-5 A-basis design allowables. One method of examining production and composition variations is to perform 100% lot acceptance testing for aerospace Aluminum (Al) alloys. This paper discusses statistical trends seen in strength data for one Al alloy. A four-step approach reduced the data to residuals, visualized residuals as a function of time, grouped data with quantified scatter, and conducted analysis of variance (ANOVA).

  10. Statistical analysis of the 70 meter antenna surface distortions

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.; Chuang, K. L.

    1987-01-01

    Statistical analysis of surface distortions of the 70 meter NASA/JPL antenna, located at Goldstone, was performed. The purpose of this analysis is to verify whether deviations due to gravity loading can be treated as quasi-random variables with normal distribution. Histograms of the RF pathlength error distribution for several antenna elevation positions were generated. The results indicate that the deviations from the ideal antenna surface are not normally distributed. The observed density distribution for all antenna elevation angles is taller and narrower than the normal density, which results in large positive values of kurtosis and a significant amount of skewness. The skewness of the distribution changes from positive to negative as the antenna elevation changes from zenith to horizon.

  11. Statistical analysis of cascading failures in power grids

    SciTech Connect

    Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin

    2010-12-01

    We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static, resolving individual failures causally, sequentially, and in discrete time. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
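
    A minimal sketch of the quasi-static cascade in the model's simplest (DC power flow) realization: solve the DC equations, trip every line whose flow exceeds its capacity, and re-solve until no overloads remain. The 3-bus network and capacities are illustrative, and islanding is not handled.

      import numpy as np

      # Lines: (from_bus, to_bus, susceptance, capacity); injections sum to zero.
      lines = [(0, 1, 10.0, 0.8), (1, 2, 10.0, 1.0), (0, 2, 10.0, 2.0)]
      P = np.array([1.8, -0.9, -0.9])  # generation at bus 0, loads at buses 1, 2

      def dc_flows(lines, P):
          # DC power flow: B * theta = P with bus 0 as the reference bus.
          n = len(P)
          B = np.zeros((n, n))
          for i, j, b, _ in lines:
              B[i, i] += b; B[j, j] += b
              B[i, j] -= b; B[j, i] -= b
          theta = np.zeros(n)
          theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])
          return [b * (theta[i] - theta[j]) for i, j, b, _ in lines]

      # Quasi-static cascade: trip every overloaded line, then re-solve.
      while lines:
          overloads = [ln for ln, f in zip(lines, dc_flows(lines, P))
                       if abs(f) > ln[3]]
          if not overloads:
              break
          lines = [ln for ln in lines if ln not in overloads]
      print(lines)  # the lines that survive the cascade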

  12. Processes and subdivisions in diogenites, a multivariate statistical analysis

    NASA Technical Reports Server (NTRS)

    Harriott, T. A.; Hewins, R. H.

    1984-01-01

    Multivariate statistical techniques used on diogenite orthopyroxene analyses show the relationships that occur within diogenites and the two orthopyroxenite components (class I and II) in the polymict diogenite Garland. Cluster analysis shows that only Peckelsheim is similar to Garland class I (Fe-rich) and the other diogenites resemble Garland class II. The unique diogenite Y 75032 may be related to type I by fractionation. Factor analysis confirms the subdivision and shows that Fe does not correlate with the weakly incompatible elements across the entire pyroxene composition range, indicating that igneous fractionation is not the process controlling total diogenite composition variation. The occurrence of two groups of diogenites is interpreted as the result of sampling or mixing of two main sequences of orthopyroxene cumulates with slightly different compositions.

  13. Vibroacoustic optimization using a statistical energy analysis model

    NASA Astrophysics Data System (ADS)

    Culla, Antonio; D'Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia

    2016-08-01

    In this paper, an optimization technique for medium-high frequency dynamic problems based on Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILF) and coupling loss factors (CLF), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to CLF's is performed to select CLF's that are most effective on subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLF's, injected power and physical parameters are derived. The approach is applied on a typical aeronautical structure: the cabin of a helicopter.
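
    At the core of such a SEA model is a linear power balance in which the ILFs and CLFs control the subsystem energies. A two-subsystem sketch with illustrative loss factors (not helicopter-cabin values):

      import numpy as np

      # Two-subsystem SEA power balance (illustrative loss factors):
      #   omega * [(eta1 + eta12) * E1 - eta21 * E2] = P1
      #   omega * [-eta12 * E1 + (eta2 + eta21) * E2] = P2
      omega = 2 * np.pi * 1000.0     # band centre frequency, rad/s
      eta1, eta2 = 0.01, 0.02        # internal loss factors (ILF)
      eta12, eta21 = 0.004, 0.001    # coupling loss factors (CLF)
      P = np.array([1.0, 0.0])       # injected power, W (source in subsystem 1)

      A = omega * np.array([[eta1 + eta12, -eta21],
                            [-eta12, eta2 + eta21]])
      E = np.linalg.solve(A, P)
      print(E)  # subsystem energies, controlled by the ILFs and CLFs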

  14. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  15. Detection of bearing damage by statistic vibration analysis

    NASA Astrophysics Data System (ADS)

    Sikora, E. A.

    2016-04-01

    The condition of bearings, which are essential components in mechanisms, is crucial to safety. Analysis of the bearing vibration signal, which is always contaminated by certain types of noise, is a very important tool for diagnosing the mechanical condition of the bearing and its failure phenomena. In this paper, a method of rolling bearing fault detection by statistical analysis of vibration is proposed to filter out Gaussian noise contained in a raw vibration signal. The results of experiments show that the vibration signal can be significantly enhanced by application of the proposed method. Besides, the proposed method is used to analyse real acoustic signals of a bearing with inner race and outer race faults, respectively. The values of attributes are determined according to the degree of the fault. The results confirm that the periods between the transients, which represent bearing fault characteristics, can be successfully detected.

  16. Statistical analysis and correlation discovery of tumor respiratory motion.

    PubMed

    Wu, Huanmei; Sharp, Gregory C; Zhao, Qingya; Shirato, Hiroki; Jiang, Steve B

    2007-08-21

    Tumors, especially in the thorax and abdomen, are subject to respiratory motion, and understanding the structure of respiratory motion is a key component to the management and control of disease in these sites. We have applied statistical analysis and correlation discovery methods to analyze and mine tumor respiratory motion based on a finite state model of tumor motion. Aggregates (such as minimum, maximum, average and mean), histograms, percentages, linear regression and multi-round statistical analysis have been explored. The results have been represented in various formats, including tables, graphs and text description. Different graphs, for example scatter plots, clustered column figures, 100% stacked column figures and box-whisker plots, have been applied to highlight different aspects of the results. The internal tumor motion from 42 lung tumors, 30 of which have motion larger than 5 mm, has been analyzed. Results for both inter-patient and intra-patient motion characteristics, such as duration and travel distance patterns, are reported. New knowledge of patient-specific tumor motion characteristics have been discovered, such as expected correlations between properties. The discovered tumor motion characteristics will be utilized in different aspects of image-guided radiation treatment, including treatment planning, online tumor motion prediction and real-time radiation dose delivery. PMID:17671334

  17. Statistical analysis and correlation discovery of tumor respiratory motion

    NASA Astrophysics Data System (ADS)

    Wu, Huanmei; Sharp, Gregory C.; Zhao, Qingya; Shirato, Hiroki; Jiang, Steve B.

    2007-08-01

    Tumors, especially in the thorax and abdomen, are subject to respiratory motion, and understanding the structure of respiratory motion is a key component to the management and control of disease in these sites. We have applied statistical analysis and correlation discovery methods to analyze and mine tumor respiratory motion based on a finite state model of tumor motion. Aggregates (such as minimum, maximum, average and mean), histograms, percentages, linear regression and multi-round statistical analysis have been explored. The results have been represented in various formats, including tables, graphs and text description. Different graphs, for example scatter plots, clustered column figures, 100% stacked column figures and box-whisker plots, have been applied to highlight different aspects of the results. The internal tumor motion from 42 lung tumors, 30 of which have motion larger than 5 mm, has been analyzed. Results for both inter-patient and intra-patient motion characteristics, such as duration and travel distance patterns, are reported. New knowledge of patient-specific tumor motion characteristics have been discovered, such as expected correlations between properties. The discovered tumor motion characteristics will be utilized in different aspects of image-guided radiation treatment, including treatment planning, online tumor motion prediction and real-time radiation dose delivery.

  18. Analysis of ageing of amorphous thermoplastic polymers by PVT analysis

    NASA Astrophysics Data System (ADS)

    Greco, Antonio; Maffezzoli, Alfonso; Gennaro, Riccardo; Rizzo, Michele

    2012-07-01

    The aim of this work is the analysis of the ageing phenomenon occurring in amorphous thermoplastic polymers below their glass transition temperature by pressure-volume-temperature (PVT) analysis. The ageing behavior of different polymers as a function of the heating and cooling rates has been widely studied, and several works in the literature also address the effect of the applied pressure on the glass transition behavior. Another relevant aspect of the glass transition concerns ageing effects, which can likewise be influenced by the applied pressure. This is a very relevant issue, since most polymers are subjected to mechanical loading during ageing. PVT analysis was used to study the ageing of an amorphous PET copolymer (PETg) at different pressure levels. Specific volume-temperature curves measured during the cooling and heating steps were used for calculating the relaxed specific volume, showing that ageing effects increase with increasing applied pressure. The evolution of the fictive temperature as a function of time was calculated from the experimental data.

  19. Neutral dynamics with environmental noise: Age-size statistics and species lifetimes

    NASA Astrophysics Data System (ADS)

    Kessler, David; Suweis, Samir; Formentin, Marco; Shnerb, Nadav M.

    2015-08-01

    Neutral dynamics, where taxa are assumed to be demographically equivalent and their abundance is governed solely by the stochasticity of the underlying birth-death process, has proved itself as an important minimal model that accounts for many empirical datasets in genetics and ecology. However, the restriction of the model to demographic [O(√N)] noise yields relatively slow dynamics that appears to be in conflict with both short-term and long-term characteristics of the observed systems. Here we analyze two of these problems—age-size relationships and species extinction time—in the framework of a neutral theory with both demographic and environmental stochasticity. It turns out that environmentally induced variations of the demographic rates control the long-term dynamics and modify dramatically the predictions of the neutral theory with demographic noise only, yielding much better agreement with empirical data. We consider two prototypes of "zero mean" environmental noise, one which is balanced with regard to the arithmetic abundance, another balanced in the logarithmic (fitness) space, study their species lifetime statistics, and discuss their relevance to realistic models of community dynamics.
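
    A minimal simulation sketch of species lifetime under demographic plus "zero mean" environmental noise, with the environment balanced in the logarithmic (fitness) space; the time step, rates, and initial abundance are illustrative, and this is not the paper's exact model.

      import numpy as np

      rng = np.random.default_rng(5)

      def lifetime(n0=20, sigma_env=0.3, dt=0.01, max_steps=10000):
          # Birth-death dynamics whose rates are modulated by an environmental
          # fitness shift that is balanced ("zero mean") in log space.
          n = n0
          for t in range(max_steps):
              s = sigma_env * rng.standard_normal()
              births = rng.poisson(n * np.exp(s / 2) * dt)
              deaths = rng.poisson(n * np.exp(-s / 2) * dt)
              n += births - deaths
              if n <= 0:
                  return t * dt   # extinction time
          return max_steps * dt   # censored: still extant at the horizon

      times = [lifetime() for _ in range(200)]
      print(np.mean(times), np.percentile(times, [5, 50, 95]))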

  20. Neutral dynamics with environmental noise: Age-size statistics and species lifetimes.

    PubMed

    Kessler, David; Suweis, Samir; Formentin, Marco; Shnerb, Nadav M

    2015-08-01

    Neutral dynamics, where taxa are assumed to be demographically equivalent and their abundance is governed solely by the stochasticity of the underlying birth-death process, has proved itself as an important minimal model that accounts for many empirical datasets in genetics and ecology. However, the restriction of the model to demographic [O(√N)] noise yields relatively slow dynamics that appears to be in conflict with both short-term and long-term characteristics of the observed systems. Here we analyze two of these problems--age-size relationships and species extinction time--in the framework of a neutral theory with both demographic and environmental stochasticity. It turns out that environmentally induced variations of the demographic rates control the long-term dynamics and modify dramatically the predictions of the neutral theory with demographic noise only, yielding much better agreement with empirical data. We consider two prototypes of "zero mean" environmental noise, one which is balanced with regard to the arithmetic abundance, another balanced in the logarithmic (fitness) space, study their species lifetime statistics, and discuss their relevance to realistic models of community dynamics. PMID:26382447

  1. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). PMID:25913743

  2. Statistical Models and Methods for Network Meta-Analysis.

    PubMed

    Madden, L V; Piepho, H-P; Paul, P A

    2016-08-01

    Meta-analysis, the methodology for analyzing the results from multiple independent studies, has grown tremendously in popularity over the last four decades. Although most meta-analyses involve a single effect size (summary result, such as a treatment difference) from each study, there are often multiple treatments of interest across the network of studies in the analysis. Multi-treatment (or network) meta-analysis can be used for simultaneously analyzing the results from all the treatments. However, the methodology is considerably more complicated than for the analysis of a single effect size, and there have not been adequate explanations of the approach for agricultural investigations. We review the methods and models for conducting a network meta-analysis based on frequentist statistical principles, and demonstrate the procedures using a published multi-treatment plant pathology data set. A major advantage of network meta-analysis is that correlations of estimated treatment effects are automatically taken into account when an appropriate model is used. Moreover, treatment comparisons may be possible in a network meta-analysis that are not possible in a single study because all treatments of interest may not be included in any given study. We review several models that consider the study effect as either fixed or random, and show how to interpret model-fitting output. We further show how to model the effect of moderator variables (study-level characteristics) on treatment effects, and present one approach to test for the consistency of treatment effects across the network. Online supplemental files give explanations on fitting the network meta-analytical models using SAS. PMID:27111798

  3. A statistical analysis of the internal organ weights of normal Japanese people

    SciTech Connect

    Ogiu, Nobuko; Nakamura, Yuji; Ogiu, Toshiaki

    1997-03-01

    Correlation of weights of various organs with age, body weight, and/or body height was statistically analyzed using data on the Japanese physique collected by the Medico-Legal Society from Universities and Research Institutes in almost all areas of Japan. After exclusion of unsuitable individual data for statistical analysis, findings for 4,667 Japanese, aged 0-95 y, including 3,023 males and 1,644 females, were used in the present study. Analyses of age-dependent changes in weights of the brain, heart, lung, kidney, spleen, pancreas, thymus, thyroid gland and adrenal gland and also of correlations between organ weights and body height, weight, or surface area were carried out. It was concluded that organ weights in the growing generation (under 19 y) generally increased with a coefficient expressed as (body height × body weight^0.5). Because clear age-dependent changes were not observed in adults over 20 y, they were classified into 4 physical types, thin, standard, plump and obese, and the relations of organ weights with these physical types were assessed. Some organs were relatively heavier in the fat groups and lighter in thin individuals, or vice versa. 36 refs., 5 figs., 11 tabs.
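
    The growth-phase scaling reported above, organ weight increasing in proportion to (body height × body weight^0.5), can be checked with a one-parameter least-squares fit. The heights, weights, and organ weights below are invented for illustration:

      import numpy as np

      # Invented growth-phase data: height (m), body weight (kg), organ weight (g).
      height = np.array([0.9, 1.1, 1.3, 1.5, 1.6, 1.7])
      weight = np.array([13.0, 19.0, 28.0, 42.0, 52.0, 60.0])
      organ = np.array([95.0, 140.0, 200.0, 290.0, 340.0, 390.0])

      # One-parameter fit: organ weight = coef * (height * weight**0.5).
      x = height * np.sqrt(weight)
      coef, *_ = np.linalg.lstsq(x[:, None], organ, rcond=None)
      print(coef)  # grams of organ per unit of (height x weight^0.5)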

  4. Transcriptome analysis of aging mouse meibomian glands

    PubMed Central

    Parfitt, Geraint J.; Brown, Donald J.

    2016-01-01

    Purpose Dry eye disease is a common condition associated with age-related meibomian gland dysfunction (ARMGD). We have previously shown that ARMGD occurs in old mice, similar to that observed in human patients with MGD. To begin to understand the mechanism underlying ARMGD, we generated transcriptome profiles of eyelids excised from young and old mice of both sexes. Methods Male and female C57BL/6 mice were euthanized at ages of 3 months or 2 years and their lower eyelids removed, the conjunctival epithelium scraped off, and the tarsal plate, containing the meibomian glands, dissected from the overlying muscle and lid epidermis. RNA was isolated, enriched, and transcribed into cDNA and processed to generate four non-stranded libraries with distinct bar codes on each adaptor. The libraries were then sequenced and mapped to the mm10 reference genome, and expression results were gathered as reads per length of transcript in kilobases per million mapped reads (RPKM) values. Differential gene expression analyses were performed using CyberT. Results Approximately 55 million reads were generated from each library. Expression data indicated that about 15,000 genes were expressed in these tissues. Of the genes that showed more than twofold significant differences in either young or old tissue, 698 were identified as differentially expressed. According to the Gene Ontology (GO) analysis, the cellular, developmental, and metabolic processes were found to be highly represented, with Wnt function noted to be altered in the aging mouse. Conclusions The RNA sequencing data identified several signaling pathways, including fibroblast growth factor (FGF) and Wnt, that were altered in the meibomian glands of aging mice. PMID:27279727

  5. Statistical analysis of magnetotail fast flows and related magnetic disturbances

    NASA Astrophysics Data System (ADS)

    Frühauff, Dennis; Glassmeier, Karl-Heinz

    2016-04-01

    This study presents an investigation of the occurrence of fast flows in the magnetotail using the complete available data set of the THEMIS spacecraft for the years 2007 to 2015. The fast flow events (times of enhanced ion velocity) are detected through a velocity criterion, yielding a database of almost 16,000 events. First, basic statistical findings concerning velocity distributions, occurrence rates, and group structures are presented. Second, Superposed Epoch Analysis is utilized to derive average profiles of selected plasma quantities. The data reveal representative time series in the near and far tail of the Earth with typical timescales of the order of 1-2 min, corresponding to scale sizes of 3 RE. Last, related magnetic field disturbances are analyzed. It is found that the minimum variance direction is essentially confined to a plane almost perpendicular to the main flow direction while, at the same time, the maximum variance direction is aligned with the flow and background field directions. The presentation of the database and these first statistical findings will prove useful both as input for magnetohydrodynamic simulations and for theoretical considerations of fast flows.
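
    Superposed epoch analysis, as used above, reduces to aligning fixed-width windows on each event onset and averaging across events. A minimal sketch, assuming a uniformly sampled series and precomputed onset indices (all variable names are illustrative):

```python
import numpy as np

def superposed_epoch(series, event_indices, half_window):
    """Average a quantity over windows centred on each event time.

    series        : 1-D array sampled at uniform cadence
    event_indices : sample indices of detected fast-flow onsets
    half_window   : window half-width in samples
    """
    segments = []
    for i in event_indices:
        if i - half_window >= 0 and i + half_window < len(series):
            segments.append(series[i - half_window : i + half_window + 1])
    return np.mean(segments, axis=0)  # mean profile vs. epoch time

# Illustrative use: average ion-velocity profile around many event onsets
# profile = superposed_epoch(v_ion, onsets, half_window=120)
```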

  6. Statistical methods for the analysis of climate extremes

    NASA Astrophysics Data System (ADS)

    Naveau, Philippe; Nogaj, Marta; Ammann, Caspar; Yiou, Pascal; Cooley, Daniel; Jomelli, Vincent

    2005-08-01

    There is currently increasing research activity in the area of climate extremes because they represent a key manifestation of non-linear systems and have an enormous impact on economic and social human activities. Our understanding of the mean behavior of climate and its 'normal' variability has been improving significantly during the last decades. In comparison, climate extreme events have been hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws than averages. In this context, the motivation for this paper is twofold. Firstly, we recall the basic principles of Extreme Value Theory, which is used on a regular basis in finance and hydrology but has not yet had the same success in climate studies. More precisely, the theoretical distributions of maxima and large peaks are recalled. The parameters of such distributions are estimated with the maximum likelihood estimation procedure, which offers the flexibility to take explanatory variables into account in our analysis. Secondly, we detail three case studies to show that this theory can provide a solid statistical foundation, especially when assessing the uncertainty associated with extreme events in a wide range of applications linked to the study of our climate. To cite this article: P. Naveau et al., C. R. Geoscience 337 (2005).
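
    The theoretical distribution of block maxima recalled above is the generalized extreme value (GEV) distribution, whose parameters can be fitted by maximum likelihood as the authors describe. A sketch using synthetic annual maxima and `scipy`; the 100-year return level is one typical derived quantity:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_maxima = rng.gumbel(loc=30.0, scale=5.0, size=60)  # synthetic block maxima

# Maximum likelihood fit of the generalized extreme value distribution
shape, loc, scale = genextreme.fit(annual_maxima)

# 100-year return level: the quantile exceeded once per 100 blocks on average
rl_100 = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
print(shape, loc, scale, rl_100)
```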

  7. ADS-Demo Fuel Rod Performance: Multivariate Statistical Analysis

    SciTech Connect

    Calabrese, R.; Vettraino, F.; Luzzi, L.

    2004-07-01

    A forward step in the development of the Accelerator Driven System (ADS) for Pu, MA, and LLFP transmutation is the realisation of an 80 MWt ADS demo (XADS), whose basic objective is to demonstrate system feasibility. The XADS is expected to adopt the UO2-PuO2 mixed-oxide fuel already tested in sodium-cooled fast reactors such as the French SPX-1. The present multivariate statistical analysis, performed using the Transuranus code, was carried out for Normal Operation at the so-called Enhanced Nominal Conditions (120% nominal reactor power), and aimed at verifying that the fuel system complies with the stated design limits (centerline fuel temperature, cladding temperature, and damage) during the entire in-reactor lifetime. A statistical input set similar to that of the SPX and PEC fuel cases was adopted. One of the most relevant assumptions in the present calculations was 30% AISI-316 cladding thickness corrosion at EOL. The relative influence of the main fuel rod parameters on fuel centerline temperature was also evaluated. (authors)

  8. Confirmatory Factor Analysis of the Statistical Anxiety Rating Scale With Online Graduate Students.

    PubMed

    DeVaney, Thomas A

    2016-04-01

    The Statistical Anxiety Rating Scale was examined using data from a convenience sample of 450 female and 65 male students enrolled in online, graduate-level introductory statistics courses. The mean age of the students was 33.1 (SD = 8.2), and 58.3% had completed six or fewer online courses. The majority of students were enrolled in education or counseling degree programs. Confirmatory factor analysis using unweighted least squares estimation was used to test three proposed models, and alpha coefficients were used to examine internal consistency. The confirmatory factor analysis results supported the six-factor structure and indicated that proper models should include correlations among the six factors or two second-order factors (anxiety and attitude). Internal consistency estimates ranged from .82 to .95 and were consistent with values reported by previous researchers. The findings suggest that, when measuring the statistics anxiety of online students using the Statistical Anxiety Rating Scale, researchers and instructors can use scores from the individual subscales or generate two composite scores (anxiety and attitude) instead of a total score. PMID:27154380
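
    The internal consistency estimates reported above are alpha coefficients, which can be computed directly from item scores. A minimal sketch with invented Likert responses (the STARS item data are not reproduced here):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of subscale item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Illustrative 5-point Likert responses for one hypothetical subscale
scores = np.array([[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [1, 2, 2]])
print(cronbach_alpha(scores))
```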

  9. Statistical Analysis of Risk Factors in the Prebreathe Reduction Protocol

    NASA Technical Reports Server (NTRS)

    Gerth, Wayne A.; Gernhardt, Michael L.; Conkin, Johnny; Homick, Jerry L. (Technical Monitor)

    2000-01-01

    The 165 exposures from four 2-hour protocols were analyzed for correlations or trends between decompression sickness (DCS) or venous gas emboli (VGE) and variables that affect risk in the subject and astronaut populations. The assumption in this global survey is that the distributions of gender, age, body mass index, etc., are equally represented in all four tested procedures. We used Student's t-tests for comparisons between means and chi-square tests for comparisons of proportions, with p < 0.05 defining significance. The type and distribution of the 19 cases of DCS were similar to historical cases. There was no correlation of age, gender, body mass index, or fitness level with a greater incidence of DCS or VGE. However, increased age was associated with more Grade IV VGE in males. The duration and quantity of exercise during prebreathe is inversely related to the risk of DCS and VGE. The latency time for VGE was longer (103 min +/- 56 SD, n = 15) when the ergometry was done approximately 15 min into the prebreathe than when done at the start of the prebreathe (53 min +/- 31, n = 13). The order of the ergometry did not influence the overall DCS and VGE incidence. We identified variables other than those of the prebreathe procedures that influence the DCS and VGE outcome. The analysis suggests that males over 40 years have a high incidence of Grade IV VGE.
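
    The two tests named above are standard and available in `scipy`; the sketch below applies them to invented values (the actual exposure data are not reproduced), with p < 0.05 as the significance criterion:

```python
from scipy import stats

# Hypothetical values: prebreathe exercise duration (min) for DCS vs. no-DCS subjects
dcs    = [42.0, 35.5, 50.0, 38.0]
no_dcs = [55.0, 61.5, 47.0, 58.0, 52.5, 60.0]
t, p_t = stats.ttest_ind(dcs, no_dcs)

# Hypothetical 2x2 table: protocol (rows) vs. DCS outcome (columns)
table = [[5, 40], [2, 43]]
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
print(p_t, p_chi2)  # compare each against the 0.05 significance level
```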

  10. Statistical analysis of target acquisition sensor modeling experiments

    NASA Astrophysics Data System (ADS)

    Deaver, Dawne M.; Moyer, Steve

    2015-05-01

    The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize the uncertainty of the physics-based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners that recommend the number and types of test samples required to yield a statistically significant result.
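
    One common way to predict the minimum number of test subjects, as discussed above, is a two-sample power calculation. A sketch of the textbook normal-approximation formula; the effect size and standard deviation below are placeholders, not values from NVESD experiments:

```python
from scipy.stats import norm

def n_per_group(delta, sigma, alpha=0.05, power=0.8):
    """Subjects per group for a two-sample comparison of means.

    delta : smallest task-performance difference worth detecting
    sigma : assumed standard deviation of subject performance
    """
    z_a = norm.ppf(1.0 - alpha / 2.0)
    z_b = norm.ppf(power)
    return 2.0 * ((z_a + z_b) * sigma / delta) ** 2

# e.g. detect a 10% difference in probability of identification, sd 15%
print(n_per_group(delta=0.10, sigma=0.15))  # ~35 subjects per group
```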

  11. Treated cabin acoustic prediction using statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Yoerkie, Charles A.; Ingraham, Steven T.; Moore, James A.

    1987-01-01

    The application of statistical energy analysis (SEA) to the modeling and design of helicopter cabin interior noise control treatment is demonstrated. The information presented here is obtained from work sponsored at NASA Langley for the development of analytic modeling techniques and the basic understanding of cabin noise. Utility and executive interior models are developed directly from existing S-76 aircraft designs. The relative importance of panel transmission loss (TL), acoustic leakage, and absorption to the control of cabin noise is shown using the SEA modeling parameters. It is shown that the major cabin noise improvement below 1000 Hz comes from increased panel TL, while above 1000 Hz it comes from reduced acoustic leakage and increased absorption in the cabin and overhead cavities.
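
    At its core, SEA as applied above solves a linear power balance for subsystem energies given damping and coupling loss factors. A two-subsystem sketch with invented loss factors (a real cabin model would have many subsystems per frequency band):

```python
import numpy as np

# Two-subsystem SEA power balance in one frequency band:
#   P_i = omega * (eta_i * E_i + eta_ij * E_i - eta_ji * E_j)
omega = 2 * np.pi * 1000.0          # band centre frequency, rad/s
eta1, eta2 = 0.02, 0.01             # damping loss factors (assumed)
eta12, eta21 = 0.005, 0.003         # coupling loss factors (assumed)
P = np.array([1.0, 0.0])            # input power: source in subsystem 1 only, W

A = omega * np.array([
    [eta1 + eta12, -eta21],
    [-eta12,        eta2 + eta21],
])
E = np.linalg.solve(A, P)           # subsystem energies, J
print(E)  # e.g. cabin vs. overhead-cavity energy in one band
```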

  12. Statistical approach to the analysis of cell desynchronization data

    NASA Astrophysics Data System (ADS)

    Milotti, Edoardo; Del Fabbro, Alessio; Dalla Pellegrina, Chiara; Chignola, Roberto

    2008-07-01

    Experimental measurements on semi-synchronous tumor cell populations show that after a few cell cycles they desynchronize completely, and this desynchronization reflects the intercell variability of cell-cycle duration. It is important to identify the sources of randomness that desynchronize a population of cells living in a homogeneous environment: for example, being able to reduce randomness and induce synchronization would aid in targeting tumor cells with chemotherapy or radiotherapy. Here we describe a statistical approach to the analysis of the desynchronization measurements that is based on minimal modeling hypotheses, and can be derived from simple heuristics. We use the method to analyze existing desynchronization data and to draw conclusions on the randomness of cell growth and proliferation.

  13. Barcode localization with region based gradient statistical analysis

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyuan; Zhao, Yuming

    2015-03-01

    Barcodes, as a data representation method, have been adopted in a wide range of areas. Especially with the rise of smartphones and hand-held devices equipped with high-resolution cameras and substantial computation power, barcode techniques have found ever more extensive application. In industrial settings, barcode reading systems are required to be robust to blur, illumination change, pitch, rotation, and scale change. This paper presents a new approach to barcode localization based on region-based gradient statistical analysis. On this basis, four algorithms have been developed for Linear, PDF417, Stacked 1D1D, and Stacked 1D2D barcodes, respectively. Evaluated on our challenging dataset of more than 17,000 images, our methods achieve an average localization accuracy of 82.17% across 8 kinds of distortions, within an average time of 12 ms.
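
    One plausible reading of the region-based gradient statistics described above is to score image blocks by gradient magnitude and orientation concentration, since 1-D barcodes produce strong gradients sharing one dominant orientation. The sketch below is my own illustration of that idea, not the authors' algorithm:

```python
import numpy as np

def barcode_likeness(gray, block=32):
    """Score image blocks by gradient-orientation concentration.

    Blocks with high mean gradient magnitude and low orientation spread
    are candidate 1-D barcode regions.
    """
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    # Double-angle representation makes opposite gradient directions agree
    ang2 = np.exp(2j * np.arctan2(gy, gx))
    h, w = gray.shape
    scores = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            sl = (slice(i * block, (i + 1) * block),
                  slice(j * block, (j + 1) * block))
            wsum = mag[sl].sum() + 1e-9
            coherence = abs((mag[sl] * ang2[sl]).sum()) / wsum  # 1 = one orientation
            scores[i, j] = coherence * mag[sl].mean()
    return scores
```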

  14. Statistical analysis of test data for APM rod issue

    SciTech Connect

    Edwards, T.B.; Harris, S.P.; Reeve, C.P.

    1992-05-01

    The uncertainty associated with the use of the K-Reactor axial power monitors (APMs) to measure roof-top-ratios is investigated in this report. Internal heating test data acquired under both DC-flow conditions and AC-flow conditions have been analyzed. These tests were conducted to simulate gamma heating at the lower power levels planned for reactor operation. The objective of this statistical analysis is to investigate the relationship between the observed and true roof-top-ratio (RTR) values and associated uncertainties at power levels within this lower operational range. Conditional on a given, known power level, a prediction interval for the true RTR value corresponding to a new, observed RTR is given. This is done for a range of power levels. Estimates of total system uncertainty are also determined by combining the analog-to-digital converter uncertainty with the results from the test data.
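
    A prediction interval for a new observation, as described above, falls out of standard regression machinery. A sketch with invented calibration pairs (the K-Reactor test data are not reproduced), using `statsmodels`:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical calibration pairs at one power level: observed vs. true RTR
obs_rtr  = np.array([0.92, 0.94, 1.01, 1.04, 1.12, 1.14])
true_rtr = np.array([0.90, 0.95, 1.00, 1.05, 1.10, 1.15])

X = sm.add_constant(obs_rtr)
model = sm.OLS(true_rtr, X).fit()

# 95% prediction interval for the true RTR behind a new observed RTR of 1.02
x_new = [[1.0, 1.02]]  # [intercept, observed value]
frame = model.get_prediction(x_new).summary_frame(alpha=0.05)
print(frame[["mean", "obs_ci_lower", "obs_ci_upper"]])
```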

  15. The geomagnetic storms of 2015: Statistical analysis and forecasting results

    NASA Astrophysics Data System (ADS)

    Paouris, Evangelos; Gerontidou, Maria; Mavromichalaki, Helen

    2016-04-01

    The year 2015 was characterized by long geomagnetically quiet periods interrupted by many active intervals, even though it lies on the declining phase of the current solar cycle. As a result, a number of geomagnetic storms on the G1 up to G4 scale were recorded. In this work the characteristics of these geomagnetic storms, such as the scale level, the origin of the storm (CME or CIR), and the duration, have been studied. Furthermore, a statistical analysis of these events and a comparative study of the forecast and the actual geomagnetic conditions are performed using data from the NOAA space weather forecasting center and from the Athens Space Weather Forecasting Center. These forecasting centers estimate and provide every day the geomagnetic conditions for the upcoming days, giving the values of the geomagnetic index Ap. The forecast values of the Ap index for the year 2015 from these two centers and their comparison with the actual values are discussed.

  16. Dynamic Modelling and Statistical Analysis of Event Times

    PubMed Central

    Peña, Edsel A.

    2006-01-01

    This review article provides an overview of recent work in the modelling and analysis of recurrent events arising in engineering, reliability, public health, biomedical, and other areas. Recurrent event modelling possesses unique facets making it different and more difficult to handle than single event settings. For instance, the impact of an increasing number of event occurrences needs to be taken into account, the effects of covariates should be considered, potential association among the inter-event times within a unit cannot be ignored, and the effects of performed interventions after each event occurrence need to be factored in. A recent general class of models for recurrent events which simultaneously accommodates these aspects is described. Statistical inference methods for this class of models are presented and illustrated through applications to real data sets. Some existing open research problems are described. PMID:17906740

  17. Statistical analysis of honeybee survival after chronic exposure to insecticides.

    PubMed

    Dechaume Moncharmont, François-Xavier; Decourtye, Axel; Hennequet-Hantier, Christelle; Pons, Odile; Pham-Delègue, Minh-Hà

    2003-12-01

    Studies concerning long-term survival of honeybees raise the problem of the statistical analysis of mortality data. In the present study, we used a modeling approach of survival data of caged bees under chronic exposure to two pesticides (imidacloprid and deltamethrin). Our model, based on a Cox proportional hazard model, is not restricted to a specific hazard functional form, such as in parametric approaches, but takes into account multiple covariates. We consider not only the pesticide treatment but also a nuisance variable (variability between replicates). Moreover, considering the occurrence of social interactions, the model integrates the fact that bees do not die independently of each other. We demonstrate the chronic toxicity induced by imidacloprid and deltamethrin. Our results also underline the role of the replicate effect, the density-dependent effect, and their interactions with the treatment effect. None of these parameters can be neglected in the assessment of chronic toxicity of pesticides to the honeybee. PMID:14713054
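
    The Cox proportional hazards model described above can be sketched with the `lifelines` package. The records below are invented, and the shared-cage correlation structure of the actual model is not reproduced; only treatment and replicate enter as plain covariates:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical caged-bee survival records: time (days), death indicator,
# pesticide treatment flag, and replicate (cage) identifier
df = pd.DataFrame({
    "time":      [10, 12, 25, 30, 9, 11, 28, 30],
    "dead":      [1,  1,  1,  0,  1, 0,  1,  0],
    "treated":   [1,  1,  0,  0,  1, 1,  0,  0],
    "replicate": [1,  2,  1,  2,  2, 1,  2,  1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="dead")
cph.print_summary()  # hazard ratios for treatment and replicate effects
```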

  18. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides an evaluation of potential flood risk and related damage, and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard for the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis, so hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  19. Age estimation of bloodstains using smartphones and digital image analysis.

    PubMed

    Thanakiatkrai, Phuvadol; Yaodam, Alisa; Kitpipit, Thitika

    2013-12-10

    Recent studies on bloodstains have focused on determining the time since deposition of bloodstains, which can provide useful temporal information to forensic investigations. This study is the first to use smartphone cameras in combination with a truly low-cost illumination system as a tool to estimate the age of bloodstains. Bloodstains were deposited on various substrates and photographed with a smartphone camera. Three smartphones (Samsung Galaxy S Plus, Apple iPhone 4, and Apple iPad 2) were compared. The environmental effects - temperature, humidity, light exposure, and anticoagulant - on the bloodstain age estimation process were explored. The color values from the digital images were extracted and correlated with time since deposition. Magenta had the highest correlation (R^2 = 0.966) and was used in subsequent experiments. The Samsung Galaxy S Plus was the most suitable smartphone, as its magenta value decreased exponentially with increasing time and had the highest repeatability (low variation within and between pictures). The quantifiable color change observed is consistent with the well-established hemoglobin denaturation process. Using a statistical classification technique called Random Forests™, we could predict bloodstain age accurately up to 42 days with an error rate of 12%. Additionally, the ages of forty blind stains were all correctly predicted, and 83% of mock casework samples were correctly classified. No within- or between-person variations were observed (p > 0.05), while smartphone camera, temperature, humidity, and substrate color influenced the age determination process in different ways. Our technique provides a cheap, rapid, easy-to-use, and truly portable alternative to more complicated analysis using specialized equipment, e.g. spectroscopy and HPLC. No training is necessary with our method, and we envision a smartphone application that could take user inputs of environmental factors and provide an accurate estimate of bloodstain age. PMID:24314532
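
    A Random Forest classifier of the kind used above can be sketched with `scikit-learn`. The feature layout (magenta value plus environmental inputs) and all values below are assumptions for illustration, not the paper's training data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: [mean magenta value, temperature C, humidity %],
# labelled with age class (days since deposition)
X = np.array([[200, 25, 60], [180, 25, 60], [150, 30, 70],
              [120, 30, 70], [100, 25, 60], [ 90, 30, 70]])
y = np.array([0, 1, 7, 14, 28, 42])  # age class in days

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict([[140, 28, 65]]))  # predicted age class of a new stain
```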

  20. Helioseismology of pre-emerging active regions. III. Statistical analysis

    SciTech Connect

    Barnes, G.; Leka, K. D.; Braun, D. C.; Birch, A. C.

    2014-05-01

    The subsurface properties of active regions (ARs) prior to their appearance at the solar surface may shed light on the process of AR formation. Helioseismic holography has been applied to samples taken from two populations of regions on the Sun (pre-emergence and without emergence), each sample having over 100 members, that were selected to minimize systematic bias, as described in Paper I. Paper II showed that there are statistically significant signatures in the average helioseismic properties that precede the formation of an AR. This paper describes a more detailed analysis of the samples of pre-emergence regions and regions without emergence based on discriminant analysis. The property that is best able to distinguish the populations is found to be the surface magnetic field, even a day before the emergence time. However, after accounting for the correlations between the surface field and the quantities derived from helioseismology, there is still evidence of a helioseismic precursor to AR emergence that is present for at least a day prior to emergence, although the analysis presented cannot definitively determine the subsurface properties prior to emergence due to the small sample sizes.
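
    Discriminant analysis of the kind applied above can be sketched with `scikit-learn`. The features below are synthetic stand-ins for the surface-field and helioseismic quantities, with an artificial class separation imposed for illustration:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical feature rows: [surface-field proxy, two helioseismic measures]
X = np.random.default_rng(1).normal(size=(40, 3))
y = np.array([1] * 20 + [0] * 20)  # 1 = pre-emergence, 0 = no emergence
X[y == 1] += 0.8                   # impose a separation for illustration

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.score(X, y))   # in-sample discrimination accuracy
print(lda.coef_)         # which properties drive the discrimination
```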

  1. Statistical Signal Analysis for Systems with Interferenced Inputs

    NASA Technical Reports Server (NTRS)

    Bai, R. M.; Mielnicka-Pate, A. L.

    1985-01-01

    A new approach based on statistical signal analysis is introduced which overcomes the error due to input signal interference. The model analyzed is given. The input signals u1(t) and u2(t) are assumed to be unknown. The measurable signals x1(t) and x2(t) are interfered according to the frequency response functions H12(f) and H21(f). The goal of the analysis was to evaluate the power output due to each input, u1(t) and u2(t), for the case where both are applied at the same time. In addition, all frequency response functions are calculated. The interfered system is described by a set of five equations with six unknown functions. An IBM XT Personal Computer, interfaced with the FFT, was used to solve the set of equations. The software was tested on an electrical two-input, one-output system. The results were excellent. The research presented includes the analysis of the acoustic radiation from a rectangular plate with two force inputs and the sound pressure as the output signal.
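
    Under one plausible reading of the interference model above, each measured spectrum is the sum of its own source and the other source filtered by a frequency response, which makes recovery a 2 x 2 linear solve per frequency bin. A sketch under that assumption (the paper's exact equation set may differ):

```python
import numpy as np

def unmix(X1, X2, H12, H21):
    """Recover source spectra U1, U2 from interfered measurements.

    Assumes, per frequency bin, the coupling model
        X1 = U1 + H21 * U2
        X2 = U2 + H12 * U1
    All arguments are complex arrays over frequency.
    """
    det = 1.0 - H12 * H21
    U1 = (X1 - H21 * X2) / det
    U2 = (X2 - H12 * X1) / det
    return U1, U2
```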

  2. Statistical analysis of effects of measures against agricultural pollution.

    PubMed

    Sæbø, H V

    1991-01-01

    The Norwegian Government has initiated a plan to reduce agricultural pollution. One of the projects in this plan is aimed at investigating different measures in order to evaluate their effects and costs. A set of experiments has been designed to estimate the effects of measures to reduce or control the use of fertilizers and erosion. The project started in 1985. It comprises continuous measurements in two water courses in each of four counties: one test drainage area where the relevant measures were implemented at the end of 1986, and one reference area where no specific measures are carried out. A series of chemical parameters is measured together with runoff and other hydrological and meteorological data. The paper provides a preliminary analysis of the data collected in one of the counties during the period June 1985 to April 1988. It contains examples of analysis of covariance showing possible effects of the measures carried out in the test area. Natural variations in precipitation and pollution are large, making it difficult to see the effects of the measures without statistical techniques that take the multivariability of the problem into account. Some effects can be shown with analysis of covariance. However, the relatively short measurement period makes it necessary to be careful when interpreting the results. PMID:24233499
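
    The analysis of covariance mentioned above can be sketched with `statsmodels`. The data frame below is invented, with runoff as the covariate and a test/reference-area contrast before and after the 1986 measures:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical records: nutrient load by area, with runoff as covariate
df = pd.DataFrame({
    "load":   [5.1, 4.8, 6.0, 3.2, 5.5, 5.0, 6.2, 5.9],
    "runoff": [10., 9.5, 12., 6.0, 11., 10., 12.5, 12.],
    "area":   ["test", "test", "test", "test", "ref", "ref", "ref", "ref"],
    "after":  [0, 0, 1, 1, 0, 0, 1, 1],
})

fit = smf.ols("load ~ runoff + C(area) * C(after)", data=df).fit()
print(sm.stats.anova_lm(fit, typ=2))  # covariance-adjusted effect of measures
```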

  3. Classification of Malaysia aromatic rice using multivariate statistical analysis

    SciTech Connect

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.

    2015-05-15

    Aromatic rice (Oryza sativa L.) is considered the best-quality premium rice. Its varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma, and flavour. The price of aromatic rice is higher than that of ordinary rice because of the special growth conditions it requires, for instance a specific climate and soil. Presently, aromatic rice quality is identified from its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, human sensory panels have significant drawbacks: training is lengthy, and panels are prone to fatigue and inconsistency as the number of samples increases. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis and is quite costly. This paper presents the application of an in-house developed Electronic Nose (e-nose) to classify new aromatic rice varieties based on their odour. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and K-Nearest Neighbours (KNN), to classify unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify unspecified samples. Visual observation of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters. The low misclassification errors of LDA and KNN support this finding, and we may conclude that the e-nose can be successfully applied to the classification of aromatic rice varieties.
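
    The PCA + KNN classification with leave-one-out validation described above maps directly onto a `scikit-learn` pipeline. A sketch on synthetic sensor responses (the e-nose data are not reproduced here):

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical e-nose sensor-array responses (rows) for rice samples
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 8))
y = np.repeat([0, 1, 2], 10)       # three rice varieties
X += y[:, None] * 0.7              # impose class structure for illustration

clf = make_pipeline(StandardScaler(), PCA(n_components=3),
                    KNeighborsClassifier(n_neighbors=3))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(acc.mean())  # leave-one-out recognition rate
```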

  4. Statistical Analysis Of Tank 5 Floor Sample Results

    SciTech Connect

    Shine, E. P.

    2012-08-01

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs.
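
    A UCL95 on a mean, as computed in this report series, is commonly obtained from the Student's t distribution when the measurements are treated as normal. A minimal sketch with invented triplicate results (the EPA guidance also covers non-normal and censored cases, which this does not):

```python
import numpy as np
from scipy import stats

def ucl95(measurements):
    """One-sided upper 95% confidence limit on the mean (Student's t),
    one common choice for normally distributed analyte concentrations."""
    x = np.asarray(measurements, dtype=float)
    n = x.size
    return x.mean() + stats.t.ppf(0.95, n - 1) * x.std(ddof=1) / np.sqrt(n)

# Illustrative triplicate results for one analyte in one composite sample
print(ucl95([12.1, 13.4, 12.8]))
```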

  5. STATISTICAL ANALYSIS OF TANK 5 FLOOR SAMPLE RESULTS

    SciTech Connect

    Shine, E.

    2012-03-14

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, radionuclide, inorganic, and anion concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs.

  6. Statistical Analysis of Tank 5 Floor Sample Results

    SciTech Connect

    Shine, E. P.

    2013-01-31

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs.

  7. Data Analysis & Statistical Methods for Command File Errors

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

    This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve-fitting and distribution-fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained by these variables. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics identified as the critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.
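
    Error rates as a function of workload variables, as modeled above, are naturally cast as a Poisson regression with the number of files radiated as an exposure offset. The sketch below uses invented weekly records, not JPL data, and stands in for the multiple-regression approach described:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical weekly records: command file errors, files radiated,
# and subjective workload / novelty scores
df = pd.DataFrame({
    "errors":   [0, 1, 0, 2, 1, 3, 0, 1],
    "files":    [40, 55, 35, 80, 60, 95, 30, 50],
    "workload": [2, 3, 2, 5, 4, 5, 1, 3],
    "novelty":  [1, 1, 1, 3, 2, 3, 1, 2],
})

# Error *rate* model: Poisson counts with log(files) as exposure offset
fit = smf.glm("errors ~ workload + novelty", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["files"])).fit()
print(fit.summary())
```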

  8. A statistical design for testing apomictic diversification through linkage analysis.

    PubMed

    Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling

    2014-03-01

    The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remain elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels, from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from the two crosses. The EM algorithm is implemented to estimate the rate of apomixis and test its difference between the two plant populations or species serving as the parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utility of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis. PMID:23271157
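
    The EM algorithm mentioned above can be illustrated on a deliberately simplified mixture: each offspring is a maternal clone with probability a or sexual otherwise, and marker-class probabilities differ between the two origins. This toy formulation is mine, not the paper's full reciprocal-cross likelihood:

```python
import numpy as np

def em_apomixis_rate(counts, p_clone, p_sex, a=0.5, iters=200):
    """EM estimate of the apomixis rate under a simplified mixture model.

    counts  : offspring counts per observed marker class
    p_clone : class probabilities if the offspring is a maternal clone
    p_sex   : class probabilities under sexual reproduction
    """
    counts = np.asarray(counts, dtype=float)
    for _ in range(iters):
        # E-step: posterior probability each class member is apomictic
        w = a * p_clone / (a * p_clone + (1 - a) * p_sex)
        # M-step: update the apomixis rate
        a = (counts * w).sum() / counts.sum()
    return a

# Illustrative: clones always show the maternal marker class (class 0)
print(em_apomixis_rate(counts=[70, 15, 15],
                       p_clone=np.array([1.0, 0.0, 0.0]),
                       p_sex=np.array([0.25, 0.5, 0.25])))  # converges to 0.6
```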

  9. Autotasked Performance in the NAS Workload: A Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)

    1998-01-01

    A statistical analysis of the workload performance of a production-quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. UNICOS 6 releases show consistent wall-clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request fewer, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristics when applied to the data: higher-overhead jobs requesting 8 CPUs are penalized when compared to moderate-overhead jobs requesting 4 CPUs, thereby providing a charging incentive for NAS users to use autotasking in a manner that gives them significantly improved turnaround while also maintaining system throughput.

  10. A statistical method for draft tube pressure pulsation analysis

    NASA Astrophysics Data System (ADS)

    Doerfler, P. K.; Ruchonnet, N.

    2012-11-01

    Draft tube pressure pulsation (DTPP) in Francis turbines is composed of various components originating from different physical phenomena. These components may be separated because they differ in their spatial relationships and in their propagation mechanisms. The first step of such an analysis is to distinguish between so-called synchronous and asynchronous pulsations; only approximately periodic phenomena can be described in this manner. However, less regular pulsations are always present, and these become important when turbines have to operate in the far off-design range, in particular at very low load. The statistical method described here makes it possible to separate the stochastic (random) component from the two traditional 'regular' components. It works in connection with the standard technique of model testing, with several pressure signals measured in the draft tube cone. The difference between the individual signals and the averaged pressure signal, together with the coherence between the individual pressure signals, is used for the analysis. An example reveals that a generalized, non-periodic version of the asynchronous pulsation is important at low load.
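
    The separation described above starts from the fact that the synchronous component is common to all cone sensors, so subtracting the sensor average leaves the asynchronous and stochastic parts, which inter-sensor coherence can then distinguish. A minimal sketch of that first step (sensor layout and names are assumptions):

```python
import numpy as np

def decompose(pressures):
    """Split simultaneous cone-pressure signals into a synchronous part
    (common to all sensors) and per-sensor residuals.

    pressures : (n_sensors, n_samples) array from taps around the cone.
    The residuals contain the rotating (asynchronous) and stochastic
    components; the rotating pattern is coherent between taps while the
    random part is not, so coherence between residuals separates them.
    """
    sync = pressures.mean(axis=0)          # synchronous component
    residual = pressures - sync            # asynchronous + stochastic
    return sync, residual

# e.g. sync, res = decompose(np.vstack([p1, p2, p3, p4]))
# then scipy.signal.coherence(res[0], res[1], fs) to isolate the random part
```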

  11. Statistical modeling of ground motion relations for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2013-10-01

    We introduce a new approach for ground motion relations (GMR) in probabilistic seismic hazard analysis (PSHA), influenced by the extreme value theory of mathematical statistics. Therein, we understand a GMR as a random function. We derive mathematically the principle of area equivalence, wherein two alternative GMRs have an equivalent influence on the hazard if these GMRs have equivalent area functions. This includes local biases. An interpretation of the difference between these GMRs (an actual and a modeled one) as a random component leads to a general overestimation of residual variance and hazard. Besides this, we discuss important aspects of classical approaches and discover discrepancies with the state of the art of stochastics and statistics (model selection and significance, testing of distribution assumptions, extreme value statistics). We especially criticize the assumption of log-normally distributed residuals of maxima such as the peak ground acceleration (PGA). The natural distribution of its individual random component (equivalent to exp(ε0) of Joyner and Boore, Bull Seism Soc Am 83(2):469-487, 1993) is the generalized extreme value. We show by numerical investigations that the actual distribution can be hidden, and that a wrong distribution assumption can influence the PSHA as negatively as neglecting area equivalence does. Finally, we suggest an estimation concept for GMRs in PSHA with a regression-free variance estimation of the individual random component. We demonstrate the advantages of event-specific GMRs by analyzing data sets from the PEER strong motion database and estimating event-specific GMRs. Therein, the majority of the best models are based on an anisotropic point source approach. The residual variance of logarithmized PGA is significantly smaller than in previous models. We validate the estimations for the event with the largest sample by empirical area functions, which indicate the appropriate modeling of the GMR by an anisotropic point source approach.

  12. Analysis of pediatric airway morphology using statistical shape modeling.

    PubMed

    Humphries, Stephen M; Hunter, Kendall S; Shandas, Robin; Deterding, Robin R; DeBoer, Emily M

    2016-06-01

    Traditional studies of airway morphology typically focus on individual measurements or relatively simple lumped summary statistics. The purpose of this work was to use statistical shape modeling (SSM) to synthesize a skeleton model of the large bronchi of the pediatric airway tree and to test for overall airway shape differences between two populations. Airway tree anatomy was segmented from volumetric chest computed tomography of 20 control subjects and 20 subjects with cystic fibrosis (CF). Airway centerlines, particularly bifurcation points, provide landmarks for SSM. Multivariate linear and logistic regression was used to examine the relationships between airway shape variation, subject size, and disease state. Leave-one-out cross-validation was performed to test the ability to detect shape differences between control and CF groups. Simulation experiments, using tree shapes with known size and shape variations, were performed as a technical validation. Models were successfully created using SSM methods. Simulations demonstrated that the analysis process can detect shape differences between groups. In clinical data, CF status was discriminated with good accuracy (precision = 0.7, recall = 0.7) in leave-one-out cross-validation. Logistic regression modeling using all subjects showed a good fit (ROC AUC = 0.85) and revealed significant differences in SSM parameters between control and CF groups. The largest mode of shape variation was highly correlated with subject size (R = 0.95, p < 0.001). SSM methodology can be applied to identify shape differences in the airway between two populations. This method suggests that subtle shape differences exist between the CF airway and that of disease controls. PMID:26718559
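
    A point-distribution shape model of the kind used above reduces to PCA on flattened, pre-aligned landmark coordinates. A sketch assuming Procrustes-aligned bifurcation landmarks (the alignment step itself is omitted):

```python
import numpy as np

def shape_modes(landmarks):
    """Build a point-distribution shape model from bifurcation landmarks.

    landmarks : (n_subjects, n_points, 3) array of airway-centerline
                bifurcation coordinates, assumed already aligned
                (e.g. by Procrustes registration).
    Returns the mean shape and the PCA modes of shape variation.
    """
    n = landmarks.shape[0]
    flat = landmarks.reshape(n, -1)
    mean = flat.mean(axis=0)
    centered = flat - mean
    # SVD of centered data gives the principal modes of shape variation
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    variance = s ** 2 / (n - 1)
    return mean, vt, variance  # per-subject scores: centered @ vt.T
```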

  13. Constraints on Statistical Computations at 10 Months of Age: The Use of Phonological Features

    ERIC Educational Resources Information Center

    Gonzalez-Gomez, Nayeli; Nazzi, Thierry

    2015-01-01

    Recently, several studies have argued that infants capitalize on the statistical properties of natural languages to acquire the linguistic structure of their native language, but the kinds of constraints which apply to statistical computations remain largely unknown. Here we explored French-learning infants' perceptual preference for…

  14. The Importance of Understanding Statistics: An Analysis of Document Supply Statistics at Macquarie University Library

    ERIC Educational Resources Information Center

    Pearson, Kathryn

    2008-01-01

    Macquarie University Library was concerned at the length of time that elapsed between the placement of an interlibrary loan request and the satisfaction of that request. Taking advantage of improved statistical information available to them through membership of the CLIC Consortium, library staff investigated the reasons for delivery delay. This led to…

  15. On Conceptual Analysis as the Primary Qualitative Approach to Statistics Education Research in Psychology

    ERIC Educational Resources Information Center

    Petocz, Agnes; Newbery, Glenn

    2010-01-01

    Statistics education in psychology often falls disappointingly short of its goals. The increasing use of qualitative approaches in statistics education research has extended and enriched our understanding of statistical cognition processes, and thus facilitated improvements in statistical education and practices. Yet conceptual analysis, a…

  16. Statistical Analysis of Data with Non-Detectable Values

    SciTech Connect

    Frome, E.L.

    2004-08-26

    Environmental exposure measurements are, in general, positive and may be subject to left censoring, i.e., the measured value is less than a 'limit of detection'. In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. A basic problem of interest in environmental risk assessment is to determine whether the mean concentration of an analyte is less than a prescribed action level. Parametric methods, used to determine acceptable levels of exposure, are often based on a two-parameter lognormal distribution. The mean exposure level and/or an upper percentile (e.g. the 95th percentile) are used to characterize exposure levels, and upper confidence limits are needed to describe the uncertainty in these estimates. In certain situations it is of interest to estimate the probability of observing a future (or 'missed') value of a lognormal variable. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left-censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left-censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on the 95th percentile (i.e. the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known, but computational complexity has limited their use in routine data analysis with left-censored data. The recent development of the R environment for statistical data analysis and graphics has greatly reduced this barrier.
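
    The maximum likelihood method for left-censored lognormal data described above maximizes a likelihood in which detects contribute density terms and non-detects contribute CDF terms at the detection limit. A sketch with invented data (the report's R-based tooling is not reproduced):

```python
import numpy as np
from scipy import stats, optimize

def fit_left_censored_lognormal(values, detected, lod):
    """ML fit of a lognormal to data with non-detects.

    values   : measured concentrations (ignored where not detected)
    detected : boolean array, False where the result was below detection
    lod      : limit of detection for the censored observations
    """
    logs = np.log(np.where(detected, values, lod))

    def nll(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)
        ll_det = stats.norm.logpdf(logs[detected], mu, sigma).sum()
        ll_cen = stats.norm.logcdf(logs[~detected], mu, sigma).sum()
        return -(ll_det + ll_cen)

    res = optimize.minimize(nll, x0=[logs.mean(), 0.0])
    return res.x[0], np.exp(res.x[1])  # mu, sigma of log-concentration

# Illustrative data: three non-detects at LOD = 0.5
vals = np.array([1.2, 0.8, 2.5, 0.0, 0.0, 0.0, 1.7])
det  = np.array([1, 1, 1, 0, 0, 0, 1], dtype=bool)
print(fit_left_censored_lognormal(vals, det, lod=0.5))
```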

  17. Hydrogeochemical characteristics of groundwater in Latvia using multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Retike, Inga; Kalvans, Andis; Bikse, Janis; Popovs, Konrads; Babre, Alise

    2015-04-01

    The aim of this study is to determine the geochemical processes controlling trace element levels and variations in the fresh groundwater of Latvia. A database of 1,398 groundwater samples containing records of major ion chemistry, trace elements, and geological conditions was compiled and used. The accuracy of the groundwater analyses was assessed, and erroneous records were excluded prior to statistical analysis. Groundwater hydrogeochemical groups were delineated on the basis of major ion concentrations using Hierarchical Cluster Analysis (HCA) and Principal Component Analysis (PCA). The results of the PCA showed that there are three main geochemical components explaining 84% of the total variance in the data set. Component 1 explains the greatest amount of variance, 51%, with main positive loadings of Cl, Na, K, and Mg. Component 2 explains 21% of the variance, with the highest loadings of HCO3, Ca, and Mg. Component 3 shows the highest loadings of SO4 and Ca and explains 12% of the total variance. HCA was chosen for its ability to group a large number of groundwater samples into several clusters based on similar characteristics. As a result, three large groups comprising nine distinctive clusters were identified. It was possible to characterise each cluster by its depth of sampling, aquifer material, and geochemical processes: carbonate dissolution (weathering), groundwater mixing, gypsum dissolution, ion exchange, and seawater and upward saline water intrusion. Cluster 1 is the least altered infiltration water, with a very low load of dissolved salts. It is concluded that the groundwater in Cluster 5 has evolved from Cluster 1 by carbonate weathering under open-system conditions. Cluster 4 is similar to Cluster 5, yet has been affected by the reduction of sulphate and iron species. Cluster 3 is characterised by the highest loading of chloride salts, while Cluster 9 represents groundwater with the highest sulphate concentrations, resulting from gypsum dissolution. However, Cluster 8 is an intermediate
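
    The HCA step described above can be sketched with `scipy`: standardize the major-ion variables, build a Ward linkage, and cut the tree into nine clusters. The data below are synthetic placeholders for the Latvian samples:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Hypothetical major-ion concentrations (rows = samples; columns = ions
# such as Ca, Mg, Na, K, HCO3, SO4, Cl), standardized before clustering
rng = np.random.default_rng(2)
ions = zscore(rng.lognormal(size=(50, 7)), axis=0)

Z = linkage(ions, method="ward")                   # agglomerative hierarchy
clusters = fcluster(Z, t=9, criterion="maxclust")  # cut into 9 clusters
print(np.bincount(clusters)[1:])                   # samples per cluster
```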

  18. Combining statistical energy analysis and finite element analysis in RESOUND mid frequency vibroacoustic analysis

    NASA Astrophysics Data System (ADS)

    Gardner, Bryce K.; Shorter, Philip J.; Bremner, Paul G.

    2002-11-01

    At low frequencies, vibroacoustic systems exhibit a dynamic response characterized by spatially correlated motion with low modal density. These systems are typically modeled with deterministic methods. While at high frequencies, the dynamic response is characterized by weak spatial correlation and a large number of modes with high modal overlap. These systems are typically modeled with statistical methods. However many vibroacoustic systems have some regions with high modal density and some regions with low modal density. Such systems require a midfrequency solution technique. One such method has been developed based on a hybrid approach combining finite element analysis (FE) in the low modal density regions and statistical energy analysis (SEA) in the high modal density regions. This method is called RESOUND [Langley and Bremner, J. Acoust. Soc. Am. 105, 1657-1671 (1999)]. Recent developments of RESOUND have focused on predicting the appropriate dynamic interactions and mechanisms for energy flow between the FE and the SEA regions. By including these effects, RESOUND can predict the dynamic response of systems having regions with low modal densities and regions with high modal densities. This paper will provide an overview of recent developments.

  19. A Statistical Aggregation Engine for Climatology and Trend Analysis

    NASA Astrophysics Data System (ADS)

    Chapman, D. R.; Simon, T. A.; Halem, M.

    2014-12-01

    Fundamental climate data records (FCDRs) from satellite instruments often span tens to hundreds of terabytes or even petabytes in scale. These large volumes make it difficult to aggregate or summarize their climatology and climate trends. It is especially cumbersome to supply the full derivation (provenance) of these aggregate calculations. We present a lightweight and resilient software platform, Gridderama, that simplifies the calculation of climatology by exploiting the "Data-Cube" topology often present in earth observing satellite records. By using the large array storage (LAS) paradigm, Gridderama allows the analyst to more easily produce a series of aggregate climate data products at progressively coarser spatial and temporal resolutions. Furthermore, provenance tracking and extensive visualization capabilities allow the analyst to track down and correct for data problems such as missing data and outliers that may impact the scientific results. We have developed and applied Gridderama to calculate a trend analysis of 55 terabytes of AIRS Level 1b infrared radiances, and show statistically significant trends in the greenhouse gas absorption bands as observed by AIRS over the 2003-2012 decade. We will extend this calculation to show regional changes in CO2 concentration from AIRS over the 2003-2012 decade by using a neural network retrieval algorithm.

  1. Vibration transmission through rolling element bearings. IV - Statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Lim, T. C.; Singh, R.

    1992-01-01

    A theoretical broadband coupling-loss factor is developed analytically for use in the statistical energy analysis (SEA) of a shaft-bearing-plate system. The procedure is based on the solution of the boundary-value problem at the plate-bearing interface and incorporates a bearing-stiffness matrix developed by the authors. Three examples are utilized to illustrate the SEA incorporating the coupling-loss factor including: (1) a shaft-bearing-plate system; (2) a plate-cantilevered beam; and (3) a circular-shaft-bearing plate. The coupling-loss factor in the case of the thin plate-cantilevered beam is found to be more accurate than that developed by Lyon and Eichler (1964). The coupling-loss factor is described for the bearing system and extended to describe the mean-square vibratory response of a rectangular plate. The proposed techniques are of interest to the study of vibration and noise in rotating machinery such as gearboxes.

  2. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss, with each leg monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be tracked. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper; it provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
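
    A statistical optical power budget of the kind described can be sketched as a Monte Carlo sum of per-mechanism losses in decibels, from which a numerical confidence level follows. All distributions and the requirement below are invented placeholders, not SIM values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical per-mechanism losses in dB, each drawn from an assumed range
losses_db = (
    rng.uniform(0.5, 1.0, n)    # design efficiency
    + rng.normal(0.3, 0.1, n)   # element misalignment
    + rng.uniform(0.1, 0.4, n)  # diffraction
    + rng.normal(0.5, 0.2, n)   # fiber coupling
)

required_db = 2.5  # hypothetical maximum loss compatible with gauge performance
confidence = (losses_db <= required_db).mean()
print(f"P(sufficient optical power) = {confidence:.3f}")
```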

  3. Statistical methods for texture analysis applied to agronomical images

    NASA Astrophysics Data System (ADS)

    Cointault, F.; Journaux, L.; Gouton, P.

    2008-02-01

    For agronomical research institutes, field experiments are essential and provide relevant information on crops such as disease rate, yield components, and weed rate. Although generally accurate, these assessments are done manually and present numerous drawbacks, such as tedium, notably for wheat ear counting. In this case, the use of color and/or texture image processing to estimate the number of ears per square metre can be an improvement. Accordingly, different image segmentation techniques based on feature extraction have been tested using textural information with first- and higher-order statistical methods. The Run Length method gives the results closest to manual counts, with an average error of 3%. Nevertheless, a finer justification of the hypotheses made on the values of the classification and description parameters is necessary, especially for the number of classes and the size of analysis windows, through the estimation of a cluster validity index. The first results show that the mean number of classes in a wheat image is 11, which proves that our choice of 3 is not well adapted. To complete these results, we are currently analysing each of the classes previously extracted in order to gather together all the classes characterizing the ears.

  4. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint

    SciTech Connect

    Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad

    2015-12-08

    Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
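
    As a small illustration of the headline metric, here is a sketch of an NRMSE computation in Python, normalizing by plant capacity (one common convention; the paper's exact normalization is not given in the record, and the data below are invented).

        import numpy as np

        def nrmse(forecast, observed, capacity):
            """Root mean squared error normalized by plant capacity."""
            return np.sqrt(np.mean((forecast - observed) ** 2)) / capacity

        rng = np.random.default_rng(1)
        observed = np.clip(rng.normal(30, 10, size=365), 0, 51)   # toy daily output, kW
        forecast = observed + rng.normal(0, 4, size=365)          # toy day-ahead forecast
        print(f"NRMSE = {100 * nrmse(forecast, observed, 51.0):.2f}%")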

  5. [Statistical analysis of 100 recent cases of sterile couples. (1948)].

    PubMed

    Guerrero, Carlos D

    2002-12-01

    A statistical analysis of 100 recent cases of sterile couples, studied from February through December 1945, is made. The management of the study is described. Sterility causes are classified under four groups: hormonal factor; tubal factor; cervical spermatic factor; spermatic (pure) factor. Every group has two subgroups: serious and not serious. The serious case is incurable or very difficult to treat (examples: azoospermia, bilateral absence or tubal obstruction, nonovulatory menstruation). Other doctor-referred cases (difficult): 61. Doctors' own wives: 15. Serious cases: 65 with three pregnancies (4.56 per cent). Not serious cases: 24, with ten pregnancies (41.8 per cent). Total number of cases finished: 89, with 13 pregnancies (14.5 per cent). Study discontinued: eleven. The last total rate (14.5 per cent) is erroneous because there is an absolute difference between the "serious cases" and the "not serious cases." In Mexico the "sterility specialist" has many "serious cases", and for this reason the rate of successful cases is low. PMID:12661337

  6. Plutonium metal exchange program : current status and statistical analysis

    SciTech Connect

    Tandon, L.; Eglin, J. L.; Michalak, S. E.; Picard, R. R.; Temer, D. J.

    2004-01-01

    The Rocky Flats Plutonium (Pu) Metal Sample Exchange program was conducted to ensure the quality and intercomparability of measurements such as Pu assay, Pu isotopics, and impurity analyses. The Rocky Flats program was discontinued in 1989 after more than 30 years. In 2001, Los Alamos National Laboratory (LANL) reestablished the Pu Metal Exchange program. In addition to the Atomic Weapons Establishment (AWE) at Aldermaston, six Department of Energy (DOE) facilities (Argonne East, Argonne West, Livermore, Los Alamos, New Brunswick Laboratory, and Savannah River) are currently participating in the program. Plutonium metal samples are prepared and distributed to the sites for destructive measurements to determine elemental concentration, isotopic abundance, and both metallic and nonmetallic impurity levels. The program provides independent verification of analytical measurement capabilities for each participating facility and allows problems in analytical methods to be identified. The current status of the program is discussed, with emphasis on the unique statistical analysis and modeling of the data developed for the program. The discussion includes the definition of the consensus values for each analyte (in the presence and absence of anomalous and/or censored values), and interesting features of the data and the results.

  7. Utility green pricing programs: A statistical analysis of program effectiveness

    SciTech Connect

    Wiser, Ryan; Olson, Scott; Bird, Lori; Swezey, Blair

    2004-02-01

    Utility green pricing programs allow customers to support the development of renewable energy. Such programs have grown in number in recent years. The design features and effectiveness of these programs vary considerably, however, leading a variety of stakeholders to suggest specific marketing and program design features that might improve customer response and renewable energy sales. This report analyzes actual utility green pricing program data to provide further insight into which program features might help maximize both customer participation in green pricing programs and the amount of renewable energy purchased by customers in those programs. Statistical analysis is performed on both the residential and non-residential customer segments. Data come from information gathered through a questionnaire completed for 66 utility green pricing programs in early 2003. The questionnaire specifically gathered data on residential and non-residential participation, amount of renewable energy sold, program length, the type of renewable supply used, program price/cost premiums, types of consumer research and program evaluation performed, different sign-up options available, program marketing efforts, and ancillary benefits offered to participants.

  8. High Statistics Analysis of Nucleon form Factor in Lattice QCD

    NASA Astrophysics Data System (ADS)

    Shintani, Eigo; Wittig, Hartmut

    We systematically study the effect of excited-state contamination on the signal of the nucleon axial, (iso-)scalar, and tensor charges, extracted from three-point functions with various sets of source-sink separations. In order to enhance the statistics to O(10,000) measurements, we use the all-mode-averaging technique, approximating the observable with an optimized size of the local deflation field and block size in the Schwarz alternating procedure to reduce the computational cost. The numerical study is performed with source-sink separations (ts) ranging from 0.8 fm to more than 1.5 fm, with several cut-off scales (a^-1 = 3-4 GeV) and pion masses (mπ = 0.19-0.45 GeV), keeping the volume at mπL > 4, on Nf = 2 Wilson-clover fermion configurations of the Mainz-CLS group. We find that the axial-charge measurement suffers a significant effect of unsuppressed excited-state contamination below ts = 1.2 fm, even in the light-pion region, whereas the effect is small for the scalar and tensor charges. In the analysis using ts > 1.5 fm, the axial charge approaches the experimental result near the physical point.

  9. Statistical framework for phylogenomic analysis of gene family expression profiles.

    PubMed

    Gu, Xun

    2004-05-01

    Microarray technology has produced massive expression data that are invaluable for investigating the genome-wide evolutionary pattern of gene expression. To this end, phylogenetic expression analysis is highly desirable. On the basis of the Brownian process, we developed a statistical framework (called the E(0) model), assuming independent expression evolution between lineages. Several evolutionary mechanisms are integrated to characterize the pattern of expression diversity after gene duplications, including gradual drift and dramatic shift (punctuated equilibrium). When the phylogeny of a gene family is given, we show that the likelihood function follows a multivariate normal distribution, whose variance-covariance matrix is determined by the phylogenetic topology and evolutionary parameters. Maximum-likelihood methods for multiple microarray experiments are developed, and likelihood-ratio tests are designed for testing the evolutionary pattern of gene expression. To reconstruct the evolutionary trace of expression diversity after gene (or genome) duplications, we developed a Bayesian-based method that uses the posterior mean as predictor. Potential applications in evolutionary genomics are discussed. PMID:15166175
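
    To make the multivariate-normal structure concrete, here is a minimal sketch (not the E(0) implementation): under a Brownian process, the covariance of expression values between two duplicates is proportional to their shared path length from the root, and the likelihood is evaluated directly. The tree, branch lengths, rate, and observations below are all invented.

        import numpy as np
        from scipy.stats import multivariate_normal

        # Toy 3-gene family with topology ((A,B),C): A and B share a branch of
        # length 0.6 from the root; all root-to-tip lengths are scaled to 1.
        # Under Brownian motion, Cov(i, j) = sigma^2 * (shared path length from root).
        sigma2 = 1.0
        shared = np.array([[1.0, 0.6, 0.0],
                           [0.6, 1.0, 0.0],
                           [0.0, 0.0, 1.0]])
        cov = sigma2 * shared
        root_mean = 5.0                        # ancestral (root) expression level
        x = np.array([5.8, 5.5, 3.9])          # observed log-expression of the 3 genes
        logL = multivariate_normal(mean=np.full(3, root_mean), cov=cov).logpdf(x)
        print(f"log-likelihood = {logL:.3f}")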

  10. Statistical analysis of mission profile parameters of civil transport airplanes

    NASA Technical Reports Server (NTRS)

    Buxbaum, O.

    1972-01-01

    The statistical analysis of flight times as well as airplane gross weights and fuel weights of jet-powered civil transport airplanes has shown that the distributions of their frequency of occurrence per flight can be presented approximately in general form. However, before these results may be used during the project stage of an airplane to define a typical mission profile (whose parameters are assumed to occur, for example, with a probability of 50 percent), the following points have to be taken into account. Because the individual airplanes were rotated during service, the scatter between the distributions of mission profile parameters for airplanes of the same type, which were flown with similar payload, has proven to be very small. Significant deviations from the generalized distributions may occur if an operator uses one airplane preferably on one or two specific routes. Another reason for larger deviations could be that the maintenance services of the operators of the observed airplanes are not representative of other airlines. Although there are indications that this is unlikely, similar information should be obtained from other operators. Such information would improve the reliability of the data.

  11. Slow and fast solar wind - data selection and statistical analysis

    NASA Astrophysics Data System (ADS)

    Wawrzaszek, Anna; Macek, Wiesław M.; Bruno, Roberto; Echim, Marius

    2014-05-01

    In this work we consider the important problem of selecting slow and fast solar wind data measured in situ by the Ulysses spacecraft during two solar minima (1995-1997, 2007-2008) and a solar maximum (1999-2001). To recognise the different types of solar wind we use the following set of parameters: radial velocity, proton density, proton temperature, the distribution of charge states of oxygen ions, and compressibility of the magnetic field. We show how this data-selection scheme works on Ulysses data. In the next step we consider the chosen intervals of fast and slow solar wind and perform a statistical analysis of the fluctuating magnetic field components. In particular, we check the possibility of identifying the inertial range by considering the scale dependence of the third- and fourth-order scaling exponents of the structure functions. We also examine how the extent of the inertial range depends on heliographic latitude, heliocentric distance, and phase of the solar cycle. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM.
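
    The structure-function computation itself is standard; below is a sketch assuming a uniformly sampled field component, with synthetic Brownian noise standing in for Ulysses data.

        import numpy as np

        def structure_function(b, q, lags):
            """S_q(tau) = <|B(t + tau) - B(t)|^q> for each lag tau (in samples)."""
            return np.array([np.mean(np.abs(b[lag:] - b[:-lag]) ** q) for lag in lags])

        rng = np.random.default_rng(7)
        b = np.cumsum(rng.normal(size=200_000))       # toy stand-in for one B component
        lags = np.unique(np.logspace(0, 3, 20).astype(int))
        for q in (3, 4):
            s = structure_function(b, q, lags)
            # Slope of log S_q vs log tau estimates the scaling exponent zeta(q);
            # for Brownian noise zeta(q) ~ q/2.
            zeta, _ = np.polyfit(np.log(lags), np.log(s), 1)
            print(f"q={q}: zeta({q}) ~ {zeta:.2f}")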

  12. Statistical Analysis of Resistivity Anomalies Caused by Underground Caves

    NASA Astrophysics Data System (ADS)

    Frid, V.; Averbach, A.; Frid, M.; Dudkinski, D.; Liskevich, G.

    2015-05-01

    Geophysical prospecting for underground caves on a construction site is often still a challenging procedure. Estimating the likelihood level of an anomaly that has been found is frequently a mandatory requirement of a project principal, owing to the necessity of risk/safety assessment. However, a methodology for such estimation has not hitherto been developed. Aiming to put forward such a methodology, the present study (performed as part of an underground cave mapping prior to land development on the site area) consisted of the application of electrical resistivity tomography (ERT) together with statistical analysis utilized for the likelihood assessment of the underground anomalies located. The methodology was first verified via a synthetic modeling technique, applied to the in situ collected ERT data, and then cross-referenced with intrusive investigations (excavation and drilling) for data verification. The drilling/excavation results showed that underground caves can be properly discovered if the anomaly probability level is not lower than 90%. Such a probability value was shown to be consistent with the modeling results. More than 30 underground cavities were discovered on the site utilizing the methodology.

  13. Statistical shape analysis of subcortical structures using spectral matching.

    PubMed

    Shakeri, Mahsa; Lombaert, Herve; Datta, Alexandre N; Oser, Nadine; Létourneau-Guillon, Laurent; Lapointe, Laurence Vincent; Martin, Florence; Malfait, Domitille; Tucholka, Alan; Lippé, Sarah; Kadoury, Samuel

    2016-09-01

    Morphological changes of subcortical structures are often predictive of neurodevelopmental and neurodegenerative diseases, such as Alzheimer's disease and schizophrenia. Hence, methods for quantifying morphological variations in the brain anatomy, including groupwise shape analyses, are becoming increasingly important for studying neurological disorders. In this paper, a novel groupwise shape analysis approach is proposed to detect regional morphological alterations in subcortical structures between two study groups, e.g., healthy and pathological subjects. The proposed scheme extracts smoothed triangulated surface meshes from segmented binary maps, and establishes reliable point-to-point correspondences among the population of surfaces using a spectral matching method. Mean curvature features are incorporated in the matching process, in order to increase the accuracy of the established surface correspondence. The mean shapes are created as the geometric mean of all surfaces in each group, and a distance map between these shapes is used to characterize the morphological changes between the two study groups. The resulting distance map is further analyzed to check for statistically significant differences between two populations. The performance of the proposed framework is evaluated on two separate subcortical structures (hippocampus and putamen). Furthermore, the proposed methodology is validated in a clinical application for detecting abnormal subcortical shape variations in Alzheimer's disease. Experimental results show that the proposed method is comparable to state-of-the-art algorithms, has less computational cost, and is more sensitive to small morphological variations in patients with neuropathologies. PMID:27025904

  14. Statistical Analysis of the AIAA Drag Prediction Workshop CFD Solutions

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop (DPW), held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third AIAA Drag Prediction Workshop, held in June 2006, focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This report compares the transonic cruise prediction results of the second and third workshops using statistical analysis.

  15. Fluorescence correlation spectroscopy: Statistical analysis and biological applications

    NASA Astrophysics Data System (ADS)

    Saffarian, Saveez

    2002-01-01

    The experimental design and realization of an apparatus which can be used both for single-molecule fluorescence detection and for fluorescence correlation and cross-correlation spectroscopy is presented. A thorough statistical analysis of the fluorescence correlation functions, including the analysis of bias and errors based on analytical derivations, has been carried out. Using the methods developed here, the mechanism of binding and cleavage-site recognition of matrix metalloproteinases (MMP) for their substrates has been studied. We demonstrate that two of the MMP family members, Collagenase (MMP-1) and Gelatinase A (MMP-2), exhibit diffusion along their substrates; the importance of this diffusion process and its biological implications are discussed. We show through truncation mutants that the hemopexin domain of MMP-2 plays an important role in the substrate diffusion of this enzyme. Single-molecule diffusion of the collagenase MMP-1 has been observed on collagen fibrils and shown to be biased. The discovered biased diffusion would make the MMP-1 molecule an active motor, thus making it the first active motor that is not coupled to ATP hydrolysis. The possible sources of energy for this enzyme and their implications are discussed. We propose that a possible source of energy for the enzyme lies in the rearrangement of the structure of collagen fibrils. In a separate application, using the methods developed here, we have observed an intermediate in the folding process of the intestinal fatty acid binding protein (IFABP) through changes in its hydrodynamic radius; the fluctuations in the structure of IFABP in solution were also measured using FCS.
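
    The correlation function at the heart of FCS can be estimated directly from an intensity trace. A minimal sketch on invented data follows (real instruments typically use multi-tau correlators, and the analytic bias corrections discussed above are not reproduced here).

        import numpy as np

        def fcs_autocorrelation(f, max_lag):
            """G(tau) = <dF(t) dF(t + tau)> / <F>^2 for tau = 1..max_lag samples."""
            df = f - f.mean()
            denom = f.mean() ** 2
            return np.array([np.mean(df[:-lag] * df[lag:]) / denom
                             for lag in range(1, max_lag + 1)])

        rng = np.random.default_rng(3)
        # Toy intensity trace: slowly varying occupancy plus Poisson shot noise.
        occupancy = np.convolve(rng.normal(size=100_000), np.ones(50) / 50, mode="same")
        lam = np.clip(100 + 20 * occupancy, 1, None)
        trace = rng.poisson(lam).astype(float)
        g = fcs_autocorrelation(trace, max_lag=200)
        print("G(1) =", g[0], " G(200) =", g[-1])   # decays as correlations are lost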

  16. A statistical analysis of icing prediction in complex terrains

    NASA Astrophysics Data System (ADS)

    Terborg, Amanda M.

    The issue of icing has been around for decades in the aviation industry, and while notable improvements have been made in the study of the formation and process of icing, the prediction of icing events is a challenge that has yet to be completely overcome. Low-level icing prediction, particularly in complex terrain, has received less attention amid efforts to perfect the models created for in-flight icing. Over the years, however, a number of different non-model methods have been used to better refine the variables involved in low-level icing prediction. One of those methods is statistical analysis and modeling, particularly the Classification and Regression Tree (CART) techniques. These techniques examine the statistical significance of each predictor within a data set to determine various decision rules. Those rules for which the overall misclassification error is smallest are then used to construct a decision tree and can be used to create a forecast for icing events. Using adiabatically adjusted Rapid Update Cycle (RUC) interpolated sounding data, these CART techniques are used in this study to examine icing events in the White Mountains of New Hampshire, specifically on the summit of Mount Washington. The Mount Washington Observatory (MWO), which sits on the summit and is manned year round by weather observers, is no stranger to icing occurrences. In fact, the summit sees icing events from October all the way until April, and occasionally even into May. In this study, these events are examined in detail for the October 2010 to April 2011 season, and five CART models are generated for icing in general, rime icing, and glaze icing in an attempt to create one or more decision trees with high predictive accuracy. Also examined in this study for the October 2010 to April 2011 icing season is the Air Weather Service Pamphlet (AWSP) algorithm, a decision tree model currently in use by the Air Force to predict icing events.
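
    As a sketch of the CART approach (not the study's data or predictors, and not the AWSP algorithm), a decision tree can be fit to sounding-derived predictors with scikit-learn; all variables and the labeling rule below are invented.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(5)
        n = 2_000
        temp_c = rng.uniform(-25, 10, n)        # toy summit temperature, deg C
        rh = rng.uniform(40, 100, n)            # toy relative humidity, %
        wind = rng.uniform(0, 40, n)            # toy wind speed, m/s
        # Invented rule for the toy label: icing when cold, moist, and windy.
        icing = ((temp_c < 0) & (rh > 85) & (wind > 5)).astype(int)

        X = np.column_stack([temp_c, rh, wind])
        tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, icing)
        print(export_text(tree, feature_names=["temp_c", "rh", "wind"]))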

  17. A deterministic and statistical energy analysis of tyre cavity resonance noise

    NASA Astrophysics Data System (ADS)

    Mohamed, Zamri; Wang, Xu

    2016-03-01

    Tyre cavity resonance was studied using a combination of deterministic analysis and statistical energy analysis: the deterministic part was implemented using the impedance compact mobility matrix method, and the statistical part was done by the statistical energy analysis method. While the impedance compact mobility matrix method can offer a deterministic solution for the cavity pressure response and the compliant wall vibration velocity response in the low frequency range, the statistical energy analysis method can offer a statistical solution for the responses in the high frequency range. In the mid frequency range, a combination of the statistical energy analysis and deterministic analysis methods can identify system coupling characteristics. Both methods have been compared to results from commercial software packages in order to validate them. The combined analysis result has been verified by measurements on a tyre-cavity physical model. The analysis method developed in this study can be applied to other similar toroidal-shape structural-acoustic systems.

  18. Algebraic Monte Carlo procedure reduces statistical analysis time and cost factors

    NASA Technical Reports Server (NTRS)

    Africano, R. C.; Logsdon, T. S.

    1967-01-01

    Algebraic Monte Carlo procedure statistically analyzes performance parameters in large, complex systems. The individual effects of input variables can be isolated and individual input statistics can be changed without having to repeat the entire analysis.

  19. Combined statistical analysis of landslide release and propagation

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Rohmaneo, Mohammad; Chu, Hone-Jay

    2016-04-01

    Statistical methods - often coupled with stochastic concepts - are commonly employed to relate areas affected by landslides with environmental layers, and to estimate spatial landslide probabilities by applying these relationships. However, such methods only concern the release of landslides, disregarding their motion. Conceptual models for mass flow routing are used for estimating landslide travel distances and possible impact areas. Automated approaches combining release and impact probabilities are rare. The present work attempts to fill this gap by a fully automated procedure combining statistical and stochastic elements, building on the open source GRASS GIS software: (1) The landslide inventory is subset into release and deposition zones. (2) We employ a traditional statistical approach to estimate the spatial release probability of landslides. (3) We back-calculate the probability distribution of the angle of reach of the observed landslides, employing the software tool r.randomwalk. One set of random walks is routed downslope from each pixel defined as release area. Each random walk stops when leaving the observed impact area of the landslide. (4) The cumulative probability function (cdf) derived in (3) is used as input to route a set of random walks downslope from each pixel in the study area through the DEM, assigning the probability gained from the cdf to each pixel along the path (impact probability). The impact probability of a pixel is defined as the average impact probability of all sets of random walks impacting a pixel. Further, the average release probabilities of the release pixels of all sets of random walks impacting a given pixel are stored along with the area of the possible release zone. (5) We compute the zonal release probability by increasing the release probability according to the size of the release zone - the larger the zone, the larger the probability that a landslide will originate from at least one pixel within this zone. We

  20. Profile of State Prisoners under Age 18, 1985-97. Bureau of Justice Statistics Special Report.

    ERIC Educational Resources Information Center

    Strom, Kevin J.

    This report presents data on all individuals under age 18 in state prisons, whether under the original jurisdiction of the juvenile or adult criminal system. Most of the data are from the National Corrections Reporting Program. On December 31, 1997, less than 1% of inmates in state prisons were under age 18, a proportion that has remained stable…

  1. Investigating Moderator Hypotheses in Aging Research: Statistical, Methodological, and Conceptual Difficulties with Comparing Separate Regressions

    ERIC Educational Resources Information Center

    Newsom, Jason T.; Prigerson, Holly G.; Schulz, Richard; Reynolds, Charles F., III

    2003-01-01

    Many topics in aging research address questions about group differences in prediction. Such questions can be viewed in terms of interaction or moderator effects, and use of appropriate methods to test these hypotheses are necessary to arrive at accurate conclusions about age differences. This article discusses the conceptual, methodological, and…

  2. The Statistics Concept Inventory: Development and analysis of a cognitive assessment instrument in statistics

    NASA Astrophysics Data System (ADS)

    Allen, Kirk

    The Statistics Concept Inventory (SCI) is a multiple choice test designed to assess students' conceptual understanding of topics typically encountered in an introductory statistics course. This dissertation documents the development of the SCI from Fall 2002 up to Spring 2006. The first phase of the project essentially sought to answer the question: "Can you write a test to assess topics typically encountered in introductory statistics?" Book One presents the results utilized in answering this question in the affirmative. The bulk of the results present the development and evolution of the items, primarily relying on objective metrics to gauge effectiveness but also incorporating student feedback. The second phase boils down to: "Now that you have the test, what else can you do with it?" This includes an exploration of Cronbach's alpha, the most commonly used measure of test reliability in the literature. An online version of the SCI was designed, and its equivalency to the paper version is assessed. As an extra wrinkle, the online SCI asked subjects to rate their confidence in each answer. These results show a general positive trend between confidence and correct responses. However, some items buck this trend, revealing potential sources of misunderstanding, with comparisons offered to the extant statistics and probability educational research. The third phase is a re-assessment of the SCI: "Are you sure?" A factor analytic study favored a uni-dimensional structure for the SCI, although maintaining the likelihood of a deeper structure if more items can be written to tap similar topics. A shortened version of the instrument is proposed, demonstrated to be able to maintain a reliability nearly identical to that of the full instrument. Incorporating student feedback and a faculty topics survey, improvements to the items and recommendations for further research are proposed. The state of the concept inventory movement is assessed, to offer a comparison to the work presented
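
    Cronbach's alpha, the reliability measure explored in the second phase, is straightforward to compute from a subjects-by-items score matrix: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A sketch on invented binary item responses (not SCI data):

        import numpy as np

        def cronbach_alpha(scores):
            """scores: (n_subjects, k_items) matrix of item scores (e.g., 0/1)."""
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(11)
        ability = rng.normal(size=(300, 1))                  # toy latent ability
        items = (ability + rng.normal(size=(300, 30)) > 0).astype(float)  # 30 items
        print(f"alpha = {cronbach_alpha(items):.2f}")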

  3. Parallelization of the Physical-Space Statistical Analysis System (PSAS)

    NASA Technical Reports Server (NTRS)

    Larson, J. W.; Guo, J.; Lyster, P. M.

    1999-01-01

    Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128MB) run-time database of parameters used in the calculation of these operators. Implementing a message passing parallel computing paradigm into an existing yet developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. The problem of computational
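
    The record does not show the DAO's solver code; below is a generic Jacobi-preconditioned conjugate gradient sketch of the kind of iteration used for the innovation equation, run here on a random symmetric positive-definite test matrix rather than the PSAS covariance operator.

        import numpy as np

        def pcg(A, b, tol=1e-8, max_iter=500):
            """Jacobi-preconditioned CG: solves A x = b for symmetric positive-definite A."""
            m_inv = 1.0 / np.diag(A)          # Jacobi (diagonal) preconditioner
            x = np.zeros_like(b)
            r = b - A @ x
            z = m_inv * r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol * np.linalg.norm(b):
                    break
                z = m_inv * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        rng = np.random.default_rng(2)
        Q = rng.normal(size=(200, 200))
        A = Q @ Q.T + 200 * np.eye(200)       # well-conditioned SPD test matrix
        b = rng.normal(size=200)
        x = pcg(A, b)
        print("residual norm:", np.linalg.norm(A @ x - b))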

  4. Global statistical analysis of TOPEX and POSEIDON data

    NASA Astrophysics Data System (ADS)

    Le Traon, P. Y.; Stum, J.; Dorandeu, J.; Gaspar, P.; Vincent, P.

    1994-12-01

    A global statistical analysis of the first 10 months of TOPEX/POSEIDON merged geophysical data records is presented. The global crossover analysis using the Cartwright and Ray (1990) (CR) tide model and Gaspar et al. (this issue) electromagnetic bias parameterization yields a sea level RMS crossover difference of 10.05 cm, 10.15 cm, and 10.15 cm for TOPEX-TOPEX, POSEIDON-POSEIDON, and TOPEX-POSEIDON crossovers, respectively. All geophysical corrections give reductions in the crossover differences, the most significant being with respect to ocean tides, solid earth tide, and the inverse barometer effect. Based on TOPEX-POSEIDON crossovers and repeat-track differences, we estimate the relative bias between TOPEX and POSEIDON at about -15.5 +/- 1 cm. This value is dependent on the electromagnetic bias corrections used. An orbit error reduction method based on global minimization of crossover differences over one cycle yields an orbit error of about 3 cm root mean square (RMS). This is probably an upper estimate of the orbit error since the estimation absorbs other altimetric signals. The RMS crossover difference is reduced to 8.8 cm after adjustment. A repeat-track analysis is then performed using the CR tide model. In regions of high mesoscale variability, the RMS sea level variability agrees well with the Geosat results. Tidal errors are also clearly evidenced. A recent tide model (Ma et al., this issue) determined from TOPEX/POSEIDON data considerably improves the RMS sea level variability. The reduction of sea level variance is (4 cm) squared on average but can reach (8 cm) squared in the southeast Pacific, southeast Atlantic, and Indian Oceans. The RMS sea level variability thus decreases from 6 cm to only 4 cm in quiet ocean regions. The large-scale sea level variations over these first 10 months most likely show for the first time the global annual cycle of sea level. We analyze the TOPEX and POSEIDON sea level anomaly wavenumber spectral characteristics. TOPEX and

  5. Digital Natives, Digital Immigrants: An Analysis of Age and ICT Competency in Teacher Education

    ERIC Educational Resources Information Center

    Guo, Ruth Xiaoqing; Dobson, Teresa; Petrina, Stephen

    2008-01-01

    This article examines the intersection of age and ICT (information and communication technology) competency and critiques the "digital natives versus digital immigrants" argument proposed by Prensky (2001a, 2001b). Quantitative analysis was applied to a statistical data set collected in the context of a study with over 2,000 pre-service teachers…

  6. Exploratory Factor Analysis of Diagnostic and Statistical Manual, 5th Edition, Criteria for Posttraumatic Stress Disorder.

    PubMed

    McSweeney, Lauren B; Koch, Ellen I; Saules, Karen K; Jefferson, Stephen

    2016-01-01

    One change to the posttraumatic stress disorder (PTSD) nomenclature highlighted in the Diagnostic and Statistical Manual, 5th Edition (DSM-5; American Psychiatric Association, 2013) is the conceptualization of PTSD as a diagnostic category with four distinct symptom clusters. This article presents exploratory factor analysis to test the structural validity of the DSM-5 conceptualization of PTSD via an online survey that included the PTSD Checklist-5. The study utilized a sample of 113 college students from a large Midwestern university and 177 Amazon Mechanical Turk users. Participants were primarily female, Caucasian, single, and heterosexual with an average age of 32 years. Approximately 30% to 35% of participants met diagnostic criteria for PTSD based on two different scoring criteria. Results of the exploratory factor analysis revealed five distinct symptom clusters. The implications for the classification of PTSD are discussed. PMID:26669983

  7. SUBMILLIMETER NUMBER COUNTS FROM STATISTICAL ANALYSIS OF BLAST MAPS

    SciTech Connect

    Patanchon, Guillaume; Ade, Peter A. R.; Griffin, Matthew; Hargrave, Peter C.; Mauskopf, Philip; Moncelsi, Lorenzo; Pascale, Enzo; Bock, James J.; Chapin, Edward L.; Halpern, Mark; Marsden, Gaelen; Scott, Douglas; Devlin, Mark J.; Dicker, Simon R.; Klein, Jeff; Rex, Marie; Gundersen, Joshua O.; Hughes, David H.; Netterfield, Calvin B.; Olmi, Luca

    2009-12-20

    We describe the application of a statistical method to estimate submillimeter galaxy number counts from confusion-limited observations by the Balloon-borne Large Aperture Submillimeter Telescope (BLAST). Our method is based on a maximum likelihood fit to the pixel histogram, sometimes called 'P(D)', an approach which has been used before to probe faint counts, the difference being that here we advocate its use even for sources with relatively high signal-to-noise ratios. This method has an advantage over standard techniques of source extraction in providing an unbiased estimate of the counts from the bright end down to flux densities well below the confusion limit. We specifically analyze BLAST observations of a roughly 10 deg^2 map centered on the Great Observatories Origins Deep Survey South field. We provide estimates of number counts at the three BLAST wavelengths 250, 350, and 500 μm; instead of counting sources in flux bins we estimate the counts at several flux density nodes connected with power laws. We observe a generally very steep slope for the counts of about -3.7 at 250 μm, and -4.5 at 350 and 500 μm, over the range ≈0.02-0.5 Jy, breaking to a shallower slope below about 0.015 Jy at all three wavelengths. We also describe how to estimate the uncertainties and correlations in this method so that the results can be used for model-fitting. This method should be well suited for analysis of data from the Herschel satellite.

  8. 3D statistical failure analysis of monolithic dental ceramic crowns.

    PubMed

    Nasrin, Sadia; Katsube, Noriko; Seghi, Robert R; Rokhlin, Stanislav I

    2016-07-01

    For adhesively retained ceramic crown of various types, it has been clinically observed that the most catastrophic failures initiate from the cement interface as a result of radial crack formation as opposed to Hertzian contact stresses originating on the occlusal surface. In this work, a 3D failure prognosis model is developed for interface initiated failures of monolithic ceramic crowns. The surface flaw distribution parameters determined by biaxial flexural tests on ceramic plates and point-to-point variations of multi-axial stress state at the intaglio surface are obtained by finite element stress analysis. They are combined on the basis of fracture mechanics based statistical failure probability model to predict failure probability of a monolithic crown subjected to single-cycle indentation load. The proposed method is verified by prior 2D axisymmetric model and experimental data. Under conditions where the crowns are completely bonded to the tooth substrate, both high flexural stress and high interfacial shear stress are shown to occur in the wall region where the crown thickness is relatively thin while high interfacial normal tensile stress distribution is observed at the margin region. Significant impact of reduced cement modulus on these stress states is shown. While the analyses are limited to single-cycle load-to-failure tests, high interfacial normal tensile stress or high interfacial shear stress may contribute to degradation of the cement bond between ceramic and dentin. In addition, the crown failure probability is shown to be controlled by high flexural stress concentrations over a small area, and the proposed method might be of some value to detect initial crown design errors. PMID:27215334

  9. Analysis of statistical model properties from discrete nuclear structure data

    NASA Astrophysics Data System (ADS)

    Firestone, Richard B.

    2012-02-01

    Experimental M1, E1, and E2 photon strengths have been compiled from experimental data in the Evaluated Nuclear Structure Data File (ENSDF) and the Evaluated Gamma-ray Activation File (EGAF). Over 20,000 Weisskopf reduced transition probabilities were recovered from the ENSDF and EGAF databases. These transition strengths have been analyzed for their dependence on transition energies, initial and final level energies, spin/parity dependence, and nuclear deformation. ENSDF BE1W values were found to increase exponentially with energy, possibly consistent with the Axel-Brink hypothesis, although considerable excess strength was observed for transitions between 4-8 MeV. No similar energy dependence was observed in EGAF or ARC data. BM1W average values were nearly constant at all energies above 1 MeV, with substantial excess strength below 1 MeV and between 4-8 MeV. BE2W values decreased exponentially by a factor of 1000 from 0 to 16 MeV. The distribution of ENSDF transition probabilities for all multipolarities could be described by a lognormal statistical distribution. BE1W, BM1W, and BE2W strengths all increased substantially for initial transition level energies between 4-8 MeV, possibly due to the dominance of spin-flip and Pygmy resonance transitions at those excitations. Analysis of the average resonance capture data indicated no transition probability dependence on final level spins or energies between 0-3 MeV. The comparison of favored to unfavored transition probabilities for odd-A or odd-Z targets indicated only partial support for the expected branching intensity ratios, with many unfavored transitions having nearly the same strength as favored ones. Average resonance capture BE2W transition strengths generally increased with greater deformation. Analysis of ARC data suggests that there is a large E2 admixture in M1 transitions, with the mixing ratio δ ≈ 1.0. The ENSDF reduced transition strengths were considerably stronger than those derived from capture gamma ray

  10. The Coming of Age of Statistics Education in New Zealand, and Its Influence Internationally

    ERIC Educational Resources Information Center

    Forbes, Sharleen

    2014-01-01

    New Zealand has been leading the world in terms of the data handling, and in more recent years, data visualisation approach in its school statistics curriculum. In 2013, bootstrapping and randomisation were added to the senior secondary school (Ministry of Education 2012). This paper gives an historical perspective of the people and groups that…

  11. Statistical analysis of synaptic transmission: model discrimination and confidence limits.

    PubMed Central

    Stricker, C; Redman, S; Daley, D

    1994-01-01

    Procedures for discriminating between competing statistical models of synaptic transmission, and for providing confidence limits on the parameters of these models, have been developed. These procedures were tested against simulated data and were used to analyze the fluctuations in synaptic currents evoked in hippocampal neurones. All models were fitted to data using the Expectation-Maximization algorithm and a maximum likelihood criterion. Competing models were evaluated using the log-likelihood ratio (Wilks statistic). When the competing models were not nested, Monte Carlo sampling of the model used as the null hypothesis (H0) provided density functions against which H0 and the alternate model (H1) were tested. The statistic for the log-likelihood ratio was determined from the fit of H0 and H1 to these probability densities. This statistic was used to determine the significance level at which H0 could be rejected for the original data. When the competing models were nested, log-likelihood ratios and the χ² statistic were used to determine the confidence level for rejection. Once the model that provided the best statistical fit to the data was identified, many estimates for the model parameters were calculated by resampling the original data. Bootstrap techniques were then used to obtain the confidence limits of these parameters. PMID:7948672
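
    The percentile-bootstrap step for confidence limits can be shown in isolation. A sketch on an invented sample of synaptic current amplitudes (the paper's EM model fitting is not reproduced here):

        import numpy as np

        rng = np.random.default_rng(9)
        amplitudes = rng.normal(loc=-40.0, scale=8.0, size=150)  # toy currents, pA

        def bootstrap_ci(data, estimator, n_boot=5000, alpha=0.05):
            """Percentile bootstrap confidence interval for any scalar estimator."""
            stats = np.array([estimator(rng.choice(data, size=len(data), replace=True))
                              for _ in range(n_boot)])
            return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

        lo, hi = bootstrap_ci(amplitudes, np.mean)
        print(f"mean amplitude 95% CI: [{lo:.1f}, {hi:.1f}] pA")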

  12. A statistical model for the study of U-Nb aging (u)

    SciTech Connect

    Hemphill, Geralyn M; Hackenberg, Robert E

    2009-01-01

    This study was undertaken to model the aging response of U-Nb alloys in order to quantify property and lifetime predictions and uncertainties, in response to concerns that aging during long-term stockpile storage may change the microstructure and properties of U-6 wt%Nb alloy components in ways adversely affecting performance. U-6Nb has many desirable properties, but is a complex material because of its gross compositional inhomogeneity (its chemical banding spans 4-8 wt%), its metastable starting microstructure, and the fact that a variety of external factors such as temperature, stress, and gaseous species can cause aging through multiple mechanisms. The most significant aging mechanism identified in earlier studies [2007hac2] is age hardening, phenomenologically defined as increasing hardness and strength and decreasing ductility observed as a function of increasing aging time-at-temperature. The scientific fundamentals of age hardening at temperatures relevant to U-6Nb material processing (≤200 °C) and stockpile storage (≤60 °C) remain unresolved in spite of significant experimental efforts [2007hac2, 2009cla]. Equally problematic is the lack of a well-established U-6Nb component failure criterion. These limitations make the most desirable approach of property response and lifetime prediction - that based on fundamental physics - unattainable at the present time. Therefore, a semi-empirical approach was taken to model the phenomenological property evolution during aging. This enabled lifetime estimates to be made from an assumed failure criterion (derived from a manufacturing acceptance criterion) couched in terms of an age-sensitive property, namely quasi-static tensile elongation to failure. The predictions of this and other age-sensitive properties are also useful for U-6Nb component surveillance studies. Drawing upon a large body of artificial aging data obtained from nonbanded (chemically homogeneous) U-5.6Nb and U-7.7Nb material [2007hacJ] over 100

  13. χ² versus median statistics in supernova type Ia data analysis

    SciTech Connect

    Barreira, A.; Avelino, P. P.

    2011-10-15

    In this paper we compare the performances of the χ² and median likelihood analysis in the determination of cosmological constraints using type Ia supernovae data. We perform a statistical analysis using the 307 supernovae of the Union 2 compilation of the Supernova Cosmology Project and find that the χ² statistical analysis yields tighter cosmological constraints than the median statistic if only supernovae data is taken into account. We also show that when additional measurements from the cosmic microwave background and baryonic acoustic oscillations are considered, the combined cosmological constraints are not strongly dependent on whether one applies the χ² statistic or the median statistic to the supernovae data. This indicates that, when complementary information from other cosmological probes is taken into account, the performances of the χ² and median statistics are very similar, demonstrating the robustness of the statistical analysis.
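
    A toy contrast of the two statistics (invented data, no cosmological model): χ² uses the reported uncertainties to weight squared residuals, while the median statistic only asks whether each point lies above or below the model, giving a binomial count under the null.

        import numpy as np
        from scipy.stats import binom

        rng = np.random.default_rng(4)
        n = 307
        mu_true = 40.0 + rng.uniform(-2, 2, n)    # toy "true" distance moduli
        sigma = 0.2 * np.ones(n)
        mu_obs = mu_true + rng.normal(0, sigma)

        def chi2_stat(model):
            return np.sum(((mu_obs - model) / sigma) ** 2)

        def median_pvalue(model):
            """Two-sided p-value for the count of points above the model (ignores sigma)."""
            k = np.sum(mu_obs > model)
            return 2 * min(binom.cdf(k, n, 0.5), 1 - binom.cdf(k - 1, n, 0.5))

        print("chi2 at truth:", chi2_stat(mu_true))          # ~ n for a good model
        print("median-statistic p-value at truth:", median_pvalue(mu_true))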

  14. Age-Associated Changes in the Spectral and Statistical Parameters of Surface Electromyogram of Tibialis Anterior.

    PubMed

    Siddiqi, Ariba; Arjunan, Sridhar Poosapadi; Kumar, Dinesh Kant

    2016-01-01

    Age-related neuromuscular change of Tibialis Anterior (TA) is a leading cause of muscle strength decline among the elderly. This study has established the baseline for age-associated changes in sEMG of TA at different levels of voluntary contraction. We have investigated the use of Gaussianity and maximal power of the power spectral density (PSD) as suitable features to identify age-associated changes in the surface electromyogram (sEMG). Eighteen younger (20-30 years) and 18 older (60-85 years) cohorts completed two trials of isometric dorsiflexion at four different force levels between 10% and 50% of the maximal voluntary contraction. Gaussianity and maximal power of the PSD of sEMG were determined. Results show a significant increase in sEMG's maximal power of the PSD and Gaussianity with increase in force for both cohorts. It was also observed that older cohorts had higher maximal power of the PSD and lower Gaussianity. These age-related differences observed in the PSD and Gaussianity could be due to motor unit remodelling. This can be useful for noninvasive tracking of age-associated neuromuscular changes. PMID:27610379

  15. Estimating Small-area Populations by Age and Sex Using Spatial Interpolation and Statistical Inference Methods

    SciTech Connect

    Qai, Qiang; Rushton, Gerald; Bhaduri, Budhendra L; Bright, Eddie A; Coleman, Phil R

    2006-01-01

    The objective of this research is to compute population estimates by age and sex for small areas whose boundaries are different from those for which the population counts were made. In our approach, population surfaces and age-sex proportion surfaces are separately estimated. Age-sex population estimates for small areas and their confidence intervals are then computed using a binomial model with the two surfaces as inputs. The approach was implemented for Iowa using a 90 m resolution population grid (LandScan USA) and U.S. Census 2000 population. Three spatial interpolation methods, the areal weighting (AW) method, the ordinary kriging (OK) method, and a modification of the pycnophylactic method, were used on Census Tract populations to estimate the age-sex proportion surfaces. To verify the model, age-sex population estimates were computed for paired Block Groups that straddled Census Tracts and therefore were spatially misaligned with them. The pycnophylactic method and the OK method were more accurate than the AW method. The approach is general and can be used to estimate subgroup-count types of variables from information in existing administrative areas for custom-defined areas used as the spatial basis of support in other applications.
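
    The final combination step can be sketched on its own: given a small-area population total and an estimated age-sex proportion, the binomial model yields a count estimate and confidence interval. All numbers below are invented, and a normal approximation is used for brevity.

        import numpy as np
        from scipy.stats import norm

        def agesex_estimate(n_pop, p_hat, conf=0.95):
            """Binomial point estimate and CI for a subgroup count (normal approximation)."""
            count = n_pop * p_hat
            se = np.sqrt(n_pop * p_hat * (1 - p_hat))
            z = norm.ppf(0.5 + conf / 2)
            return count, (count - z * se, count + z * se)

        # Hypothetical small area: 4,200 people, estimated 3.1% are males aged 20-24.
        count, (lo, hi) = agesex_estimate(4200, 0.031)
        print(f"estimate {count:.0f}, 95% CI [{lo:.0f}, {hi:.0f}]")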

  17. Predictors of age-associated decline in maximal aerobic capacity: a comparison of four statistical models.

    PubMed

    Rosen, M J; Sorkin, J D; Goldberg, A P; Hagberg, J M; Katzel, L I

    1998-06-01

    Studies assessing changes in maximal aerobic capacity (VO2 max) associated with aging have traditionally employed the ratio of VO2 max to body weight. Log-linear, ordinary least-squares, and weighted least-squares models may avoid some of the inherent weaknesses associated with the use of ratios. In this study we used four different methods to examine the age-associated decline in VO2 max in a cross-sectional sample of 276 healthy men, aged 45-80 yr. Sixty-one of the men were aerobically trained athletes, and the remainder were sedentary. The model that accounted for the largest proportion of variance was a weighted least-squares model that included age, fat-free mass, and an indicator variable denoting exercise training status. The model accounted for 66% of the variance in VO2 max and satisfied all the important general linear model assumptions. The other approaches failed to satisfy one or more of these assumptions. The results indicated that VO2 max declines at the same rate in athletic and sedentary men (0.24 l/min or 9%/decade) and that 35% of this decline (0.08 l . min-1 . decade-1) is due to the age-associated loss of fat-free mass. PMID:9609813
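
    A weighted least-squares fit of this form is easy to express with the normal equations, (X'WX) beta = X'Wy. A sketch on invented data (coefficients and weights are illustrative, not the study's estimates; the age slope loosely echoes the reported decline of about 0.24 L/min per decade):

        import numpy as np

        rng = np.random.default_rng(8)
        n = 276
        age = rng.uniform(45, 80, n)
        ffm = rng.normal(60, 7, n)                       # fat-free mass, kg
        athlete = (rng.random(n) < 0.22).astype(float)   # training-status indicator
        vo2 = 0.05 * ffm - 0.024 * (age - 45) + 0.8 * athlete + rng.normal(0, 0.3, n)

        X = np.column_stack([np.ones(n), age, ffm, athlete])
        w = 1.0 / (0.2 + 0.01 * (age - 45))              # illustrative inverse-variance weights
        Xw = X * w[:, None]                              # rows of X scaled by weights
        beta = np.linalg.solve(Xw.T @ X, Xw.T @ vo2)     # solves (X'WX) beta = X'Wy
        print("intercept, age, FFM, training coefficients:", np.round(beta, 3))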

  18. Interfaces between statistical analysis packages and the ESRI geographic information system

    NASA Technical Reports Server (NTRS)

    Masuoka, E.

    1980-01-01

    Interfaces between ESRI's geographic information system (GIS) data files and real valued data files written to facilitate statistical analysis and display of spatially referenced multivariable data are described. An example of data analysis which utilized the GIS and the statistical analysis system is presented to illustrate the utility of combining the analytic capability of a statistical package with the data management and display features of the GIS.

  19. The Statistical Modeling of Aging and Risk of Transition Project: Data Collection and Harmonization Across 11 Longitudinal Cohort Studies of Aging, Cognition, and Dementia

    PubMed Central

    Abner, EL; Schmitt, FA; Nelson, PT; Lou, W; Wan, L; Gauriglia, R; Dodge, HH; Woltjer, RL; Yu, L; Bennett, DA; Schneider, JA; Chen, R; Masaki, K; Katz, MJ; Lipton, RB; Dickson, DW; Lim, KO; Hemmy, LS; Cairns, NJ; Grant, E; Tyas, SL; Xiong, C; Fardo, DW; Kryscio, RJ

    2015-01-01

    Longitudinal cognitive trajectories and other factors associated with mixed neuropathologies (such as Alzheimer’s disease with co-occurring cerebrovascular disease) remain incompletely understood, despite being the rule and not the exception in older populations. The Statistical Modeling of Aging and Risk of Transition study (SMART) is a consortium of 11 different high-quality longitudinal studies of aging and cognition (N=11,541 participants) established for the purpose of characterizing risk and protective factors associated with subtypes of age-associated mixed neuropathologies (N=3,001 autopsies). While brain donation was not required for participation in all SMART cohorts, most achieved substantial autopsy rates (i.e., > 50%). Moreover, the studies comprising SMART have large numbers of participants who were followed from intact cognition and transitioned to cognitive impairment and dementia, as well as participants who remained cognitively intact until death. These data provide an exciting opportunity to apply sophisticated statistical methods, like Markov processes, that require large, well-characterized samples. Thus, SMART will serve as an important resource for the field of mixed dementia epidemiology and neuropathology. PMID:25984574

  20. Untangling the chemistry of port wine aging with the use of GC-FID, multivariate statistics, and network reconstruction.

    PubMed

    Jacobson, Dan; Monforte, Ana Rita; Silva Ferreira, António César

    2013-03-13

    Chromatography separates the different components of complex mixtures and generates a fingerprint representing the chemical composition of the sample. The resulting data structure depends on the characteristics of the detector used: univariate for devices such as a flame ionization detector (FID), or multivariate for mass spectrometry (MS). This study addresses the potential use of a univariate signal for a nontargeted approach to (i) classify samples according to a given process or perturbation, (ii) evaluate the feasibility of developing a screening procedure to select candidates related to the process, and (iii) provide insight into the chemical mechanisms that are affected by the perturbation. To achieve this, it was necessary to use and develop methods for data preprocessing and visualization tools to assist an analytical chemist to view and interpret complex multidimensional data sets. Chromatograms of dichloromethane Port wine extracts were collected using GC-FID; the chromatograms were then aligned with correlation optimized warping (COW) and subsequently analyzed with multivariate statistics (MVA) by principal component analysis (PCA) and partial least-squares regression (PLS-R). Furthermore, wavelets were used for peak calling and alignment refinement, and the resulting matrix was used to perform kinetic network reconstruction via correlation networks and maximum spanning trees. Network-target correlation projections were used to screen for potential chromatographic regions/peaks related to aging mechanisms. Results from PLS between aligned chromatograms and target molecules showed high X to Y correlations of 0.91, 0.92, and 0.89 with 5-hydroxymethylfurfural (HMF) (Maillard), acetaldehyde (oxidation), and 4,5-dimethyl-(5H)-3-hydroxy-2-furanone, respectively. The context of the correlation (and therefore likely kinetic) relationships among compounds detected by GC-FID, and the relationships between target compounds within different regions of the network, can be clearly seen.
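
    The network-reconstruction step can be sketched with networkx: weight edges by the absolute pairwise correlation between peak intensities across samples, then keep the maximum spanning tree. The data below are invented and stand in for aligned GC-FID peak areas across an aging series.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(6)
        n_samples, n_peaks = 40, 8                 # toy: 40 wines, 8 GC-FID peaks
        t = np.linspace(0, 1, n_samples)           # toy aging coordinate
        peaks = np.outer(t, rng.uniform(0.5, 2, n_peaks))   # peaks driven by aging
        peaks += rng.normal(0, 0.15, size=peaks.shape)      # measurement noise

        corr = np.corrcoef(peaks.T)
        g = nx.Graph()
        for i in range(n_peaks):
            for j in range(i + 1, n_peaks):
                g.add_edge(f"peak{i}", f"peak{j}", weight=abs(corr[i, j]))
        mst = nx.maximum_spanning_tree(g, weight="weight")
        print(sorted(mst.edges(data="weight")))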

  1. Statistical Analysis of CMC Constituent and Processing Data

    NASA Technical Reports Server (NTRS)

    Fornuff, Jonathan

    2004-01-01

    In this study, SiC/SiC composites of varying architectures, utilizing a boron-nitride (BN) ..., were observed using statistical analysis software. The ultimate purpose of this study is to determine what variations in material processing can lead to the most critical changes in the material's properties. The work undertaken this summer explores, in general, the key properties needed ...

  2. Constraints on statistical computations at 10 months of age: the use of phonological features.

    PubMed

    Gonzalez-Gomez, Nayeli; Nazzi, Thierry

    2015-11-01

    Recently, several studies have argued that infants capitalize on the statistical properties of natural languages to acquire the linguistic structure of their native language, but the kinds of constraints which apply to statistical computations remain largely unknown. Here we explored French-learning infants' perceptual preference for labial-coronal (LC) words over coronal-labial (CL) words (e.g., preferring bat over tab) to determine whether this phonotactic preference is based on the acquisition of the statistical properties of the input based on a single phonological feature (i.e., place of articulation), multiple features (i.e., place and manner of articulation), or individual consonant pairs. Results from four experiments revealed that infants had a labial-coronal bias for nasal sequences (Experiment 1) and for all plosive sequences (Experiments 2 and 4) but a coronal-labial bias for all fricative sequences (Experiments 3 and 4), independently of the frequencies of individual consonant pairs. These results establish for the first time that constellations of multiple phonological features, defining broad consonant classes, constrain the early acquisition of phonotactic regularities of the native language. PMID:25530121

  3. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    NASA Technical Reports Server (NTRS)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  4. Gene Identification Algorithms Using Exploratory Statistical Analysis of Periodicity

    NASA Astrophysics Data System (ADS)

    Mukherjee, Shashi Bajaj; Sen, Pradip Kumar

    2010-10-01

    Studying periodic patterns is a natural line of attack for characterizing DNA sequences in gene identification and similar problems, yet surprisingly little substantive work has been done in this direction. This paper studies statistical properties of DNA sequences of a complete genome using a new technique. A DNA sequence is converted to a numeric sequence using various types of mappings, and the standard Fourier technique is applied to study periodicity. Distinct statistical behaviour of the periodicity parameters is found in coding and non-coding sequences, which can be used to distinguish between these parts. Here, DNA sequences of Drosophila melanogaster were analyzed with significant accuracy.
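
    A minimal sketch of the mapping-plus-Fourier idea, using a made-up sequence and a binary indicator mapping; the period-3 signal typical of protein-coding DNA would show up as power at frequency 1/3:

      # Map a DNA string to binary indicator sequences (one per base) and
      # inspect the Fourier power near frequency 1/3. Sequence is illustrative.
      import numpy as np

      seq = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA" * 8
      for base in "ACGT":
          x = np.array([1.0 if c == base else 0.0 for c in seq])
          spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
          freqs = np.fft.rfftfreq(len(x))
          k3 = np.argmin(np.abs(freqs - 1.0 / 3.0))  # bin nearest 1/3
          print(base, "power at period 3:", round(float(spectrum[k3]), 2))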

  5. Toward a comprehensive framework for the spatiotemporal statistical analysis of longitudinal shape data

    PubMed Central

    Durrleman, S.; Pennec, X.; Trouvé, A.; Braga, J.; Gerig, G.; Ayache, N.

    2013-01-01

    This paper proposes an original approach for the statistical analysis of longitudinal shape data. The proposed method allows the characterization of typical growth patterns and subject-specific shape changes in repeated time-series observations of several subjects. This can be seen as the extension of usual longitudinal statistics of scalar measurements to high-dimensional shape or image data. The method is based on the estimation of continuous subject-specific growth trajectories and the comparison of such temporal shape changes across subjects. Differences between growth trajectories are decomposed into morphological deformations, which account for shape changes independent of time, and time warps, which account for different rates of shape changes over time. Given a longitudinal shape data set, we estimate a mean growth scenario representative of the population, and the variations of this scenario both in terms of shape changes and in terms of change in growth speed. Then, intrinsic statistics are derived in the space of spatiotemporal deformations, which characterize the typical variations in shape and in growth speed within the studied population. They can be used to detect systematic developmental delays across subjects. In the context of neuroscience, we apply this method to analyze the differences in the growth of the hippocampus in children diagnosed with autism or developmental delays and in controls. Results suggest that group differences may be better characterized by a different speed of maturation rather than shape differences at a given age. In the context of anthropology, we assess the differences in the typical growth of the endocranium between chimpanzees and bonobos. We take advantage of this study to show the robustness of the method with respect to change of parameters and perturbation of the age estimates. PMID:23956495

  6. Toward a comprehensive framework for the spatiotemporal statistical analysis of longitudinal shape data.

    PubMed

    Durrleman, S; Pennec, X; Trouvé, A; Braga, J; Gerig, G; Ayache, N

    2013-05-01

    This paper proposes an original approach for the statistical analysis of longitudinal shape data. The proposed method allows the characterization of typical growth patterns and subject-specific shape changes in repeated time-series observations of several subjects. This can be seen as the extension of usual longitudinal statistics of scalar measurements to high-dimensional shape or image data. The method is based on the estimation of continuous subject-specific growth trajectories and the comparison of such temporal shape changes across subjects. Differences between growth trajectories are decomposed into morphological deformations, which account for shape changes independent of time, and time warps, which account for different rates of shape changes over time. Given a longitudinal shape data set, we estimate a mean growth scenario representative of the population, and the variations of this scenario both in terms of shape changes and in terms of change in growth speed. Then, intrinsic statistics are derived in the space of spatiotemporal deformations, which characterize the typical variations in shape and in growth speed within the studied population. They can be used to detect systematic developmental delays across subjects. In the context of neuroscience, we apply this method to analyze the differences in the growth of the hippocampus in children diagnosed with autism or developmental delays and in controls. Results suggest that group differences may be better characterized by a different speed of maturation rather than shape differences at a given age. In the context of anthropology, we assess the differences in the typical growth of the endocranium between chimpanzees and bonobos. We take advantage of this study to show the robustness of the method with respect to change of parameters and perturbation of the age estimates. PMID:23956495

  7. Statistical models and NMR analysis of polymer microstructure

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Statistical models can be used in conjunction with NMR spectroscopy to study polymer microstructure and polymerization mechanisms. Thus, Bernoullian, Markovian, and enantiomorphic-site models are well known. Many additional models have been formulated over the years for additional situations. Typica...

  8. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  9. Conventional and Newer Statistical Methods in Meta-Analysis.

    ERIC Educational Resources Information Center

    Kulik, James A.; Kulik, Chen-Lin C.

    The assumptions and consequences of applying conventional and newer statistical methods to meta-analytic data sets are reviewed. The application of the two approaches to a meta-analytic data set described by L. V. Hedges (1984) illustrates the differences. Hedges analyzed six studies of the effects of open education on student cooperation. The…

  10. Data Desk Professional: Statistical Analysis for the Macintosh.

    ERIC Educational Resources Information Center

    Wise, Steven L.; Kutish, Gerald W.

    This review of Data Desk Professional, a statistical software package for Macintosh microcomputers, includes information on: (1) cost and the amount and allocation of memory; (2) usability (documentation quality, ease of use); (3) running programs; (4) program output (quality of graphics); (5) accuracy; and (6) user services. In conclusion, it is…

  11. Statistical Power Analysis in Education Research. NCSER 2010-3006

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Rhoads, Christopher

    2010-01-01

    This paper provides a guide to calculating statistical power for the complex multilevel designs that are used in most field studies in education research. For multilevel evaluation studies in the field of education, it is important to account for the impact of clustering on the standard errors of estimates of treatment effects. Using ideas from…

  12. Did Tanzania Achieve the Second Millennium Development Goal? Statistical Analysis

    ERIC Educational Resources Information Center

    Magoti, Edwin

    2016-01-01

    …Development Goal "Achieve universal primary education", the challenges faced, along with the way forward towards achieving the fourth Sustainable Development Goal "Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all". Statistics show that Tanzania has made very promising steps…

  13. STATISTICAL ANALYSIS OF THE LOS ANGELES CATALYST STUDY DATA

    EPA Science Inventory

    This research was initiated to perform statistical analyses of the data from the Los Angeles Catalyst Study. The objective is to determine the effects of the introduction of the catalytic converter upon the atmospheric concentration levels of a number of air pollutants. This repo...

  14. State Survey on Racial and Ethnic Classifications. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Carey, Nancy; Rowand, Cassandra; Farris, Elizabeth

    The State Survey on Racial and Ethnic Classifications was conducted for the National Center for Education Statistics and the Office for Civil Rights in the U.S. Department of Education as part of the research associated with the comprehensive review of an Office of Management and Budget (OMB) directive on race and ethnic standards for federal…

  15. Spatial statistical analysis of basal stem root disease under natural field epidemic of oil palm

    NASA Astrophysics Data System (ADS)

    Kamu, Assis; Phin, Chong Khim; Seman, Idris Abu; Wan, Hoong Hak; Mun, Ho Chong

    2015-02-01

    Oil palm, scientifically known as Elaeis guineensis Jacq., is the most important commodity crop in Malaysia and has contributed greatly to the economic growth of the country. As far as disease is concerned, Basal Stem Rot (BSR), caused by Ganoderma boninense, remains the most important disease in the industry and is the most widely studied oil palm disease in Malaysia. However, there are still few studies on the spatial and temporal pattern of the disease, especially under natural field epidemic conditions in oil palm plantations. The objective of this study is to identify the spatial pattern of BSR disease under a natural field epidemic using two geospatial analytical techniques: quadrat analysis for the first-order properties of point pattern analysis and nearest-neighbor analysis (NNA) for the second-order properties. Two study sites with trees of different ages were selected; both are located in Tawau, Sabah and managed by the same company. The results showed that at least one of the point pattern analyses used, NNA (i.e., the second-order properties), confirmed that the disease pattern exhibits complete spatial randomness. This suggests that the disease does not spread from tree to tree and that palm age does not play a significant role in determining the spatial pattern of the disease. Knowledge of the spatial pattern of the disease should help disease management programs and the industry in the future, and the statistical modelling is expected to help identify the right model to estimate oil palm yield loss due to BSR disease.
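
    For illustration, a nearest-neighbor analysis of the Clark-Evans type can be sketched as follows; the coordinates, plot area, and neglect of edge effects are all assumptions for the example, not details from the study:

      # Clark-Evans nearest-neighbor test for complete spatial randomness (CSR)
      # on synthetic diseased-palm coordinates; edge effects are ignored here.
      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(1)
      area = 100.0 * 100.0                      # plot area in m^2 (assumed)
      pts = rng.uniform(0, 100, size=(200, 2))  # diseased-palm coordinates

      tree = cKDTree(pts)
      d, _ = tree.query(pts, k=2)               # k=2: nearest neighbor besides self
      r_obs = d[:, 1].mean()
      r_exp = 0.5 / np.sqrt(len(pts) / area)    # expected NN distance under CSR
      R = r_obs / r_exp                         # R ~ 1 random, <1 clustered, >1 regular
      print("Clark-Evans R =", round(R, 3))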

  16. Descriptive statistics tables from a detailed analysis of the National Human Activity Pattern Survey (NHAPS) data

    SciTech Connect

    Tsang, A.M.; Klepeis, N.E.

    1996-07-01

    Detailed results tables are presented from an unweighted statistical analysis of selected portions of the 1992--1994 National Human Activity Pattern Survey (NHAPS) data base. This survey collected data on the potential exposure of Americans to important household pollutants. Randomly selected individuals (9,386) supplied minute-by-minute diaries spanning a 24-hour day as well as follow-up questions on specific exposure types. Selected 24-hour diary locations and activities, selected regroupings of the 24-hour diary locations, activities, and smoker-present categories, and most of the follow-up question variables in the NHAPS data base were statistically analyzed across 12 subgroups (gender, age, Hispanic, education, employment, census region, day-of-week, season, asthma, angina and bronchitis/emphysema). Overall statistics were also generated for the 9,386 total respondents. Tables show descriptive statistics (including frequency distributions) of time spent and frequency of occurrence in each of 39 locations and for 22 activities (that were experienced by more than 50 respondents), along with equivalent tables for 10 regrouped locations (Residence-Indoors, Residence-Outdoors, Inside Vehicle, Near Vehicle, Other Outdoor, Office/Factory, Mall/Store, Public Building, Bar/Restaurant, Other Indoor), seven regrouped activities and smoker present. Tables of frequency distributions of time spent in exposure activities, or the frequency of occurrence of exposure activities, as determined from the follow up questions that were analyzed are also presented. Detailed indices provide page numbers for each table. An Appendix contains a condensed listing of the questionnaires (Versions A and B for adults, child-direct and child-proxy interview types), including the question number, the NHAPS data base variable name, and the verbatim question text.

  17. Cerebral glucose metabolism in corticobasal degeneration: comparison with progressive supranuclear palsy using statistical mapping analysis.

    PubMed

    Juh, Rahyeong; Pae, Chi-Un; Kim, Tae-Suk; Lee, Chang-Uk; Choe, Boyoung; Suh, Taesuk

    This study measured the cerebral glucose metabolism in patients suffering from corticobasal degeneration (CBD) and progressive supranuclear palsy (PSP). The aim was to determine if there is a different metabolic pattern using (18)F-labeled 2-deoxyglucose ((18)F-FDG) positron emission tomography (PET). The regional cerebral glucose metabolism was examined in 8 patients diagnosed clinically with CBD (mean age 69.6 +/- 7.8 years; male/female: 5/3), 8 patients with probable PSP (mean age 67.8 +/- 4.5 years; male/female: 4/4) and 22 healthy controls. The regional cerebral glucose metabolism between the three groups was compared using statistical parametric mapping (SPM) with a voxel-by-voxel approach (p < 0.001, 200-voxel level). Compared with the normal controls, asymmetry in the regional glucose metabolism was observed in the parietal, frontal and cingulate in the CBD patients. In the PSP patients, the glucose metabolism was lower in the orbitofrontal, middle frontal, cingulate, thalamus and mid-brain than their age matched normal controls. A comparison of the two patient groups demonstrated relative hypometabolism in the thalamus, the mid-brain in the PSP patients and the parietal lobe in CBD patients. These results suggest that when making a differential diagnosis of CBD and PSP, voxel-based analysis of the (18)F-FDG PET images using a SPM might be a useful tool in clinical examinations. PMID:15936506

  18. Statistical averaging of marine magnetic anomalies and the aging of oceanic crust.

    USGS Publications Warehouse

    Blakely, R.J.

    1983-01-01

    Visual comparison of Mesozoic and Cenozoic magnetic anomalies in the North Pacific suggests that older anomalies contain less short-wavelength information than younger anomalies in this area. To test this observation, magnetic profiles from the North Pacific are examined from crust of three ages: 0-2.1, 29.3-33.1, and 64.9-70.3 Ma. For each time period, at least nine profiles were analyzed by 1) calculating the power density spectrum of each profile, 2) averaging the spectra together, and 3) computing a 'recording filter' for each time period by assuming a hypothetical seafloor model. The model assumes that the top of the source is acoustic basement, the source thickness is 0.5 km, and the time scale of geomagnetic reversals is according to Ness et al. (1980). The calculated power density spectra of the three recording filters are complex in shape but show an increase of attenuation of short-wavelength information as the crust ages. These results are interpreted using a multilayer model for marine magnetic anomalies in which the upper layer, corresponding to pillow basalt of seismic layer 2A, acts as a source of noise to the magnetic anomalies. As the ocean crust ages, this noisy contribution by the pillow basalts becomes less significant to the anomalies. Consequently, magnetic sources below layer 2A must be faithful recorders of geomagnetic reversals.
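
    The profile-averaging step can be sketched as follows, with synthetic profiles standing in for the magnetic data and scipy's Welch estimator standing in for the paper's spectral method:

      # Compute each profile's power density spectrum and average across the
      # profiles of one crustal-age group. Profiles and spacing are synthetic.
      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(2)
      dx = 1.0  # sample spacing in km (assumed)
      profiles = [rng.standard_normal(1024) for _ in range(9)]

      spectra = []
      for p in profiles:
          f, Pxx = welch(p, fs=1.0 / dx, nperseg=256)
          spectra.append(Pxx)
      mean_spectrum = np.mean(spectra, axis=0)  # ensemble-averaged spectrum
      print(f[:5], mean_spectrum[:5])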

  19. A Statistical Framework for the Functional Analysis of Metagenomes

    SciTech Connect

    Sharon, Itai; Pati, Amrita; Markowitz, Victor; Pinter, Ron Y.

    2008-10-01

    Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. They present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations which can be used for removing seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.

  20. Statistical Methods for Rapid Aerothermal Analysis and Design Technology

    NASA Technical Reports Server (NTRS)

    Morgan, Carolyn; DePriest, Douglas; Thompson, Richard (Technical Monitor)

    2002-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to establish statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The research work was focused on establishing the suitable mathematical/statistical models for these purposes. It is anticipated that the resulting models can be incorporated into a software tool to provide rapid, variable-fidelity, aerothermal environments to predict heating along an arbitrary trajectory. This work will support development of an integrated design tool to perform automated thermal protection system (TPS) sizing and material selection.

  1. Statistical analysis of modeling error in structural dynamic systems

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1990-01-01

    The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.

  2. Ambiguity and nonidentifiability in the statistical analysis of neural codes.

    PubMed

    Amarasingham, Asohan; Geman, Stuart; Harrison, Matthew T

    2015-05-19

    Many experimental studies of neural coding rely on a statistical interpretation of the theoretical notion of the rate at which a neuron fires spikes. For example, neuroscientists often ask, "Does a population of neurons exhibit more synchronous spiking than one would expect from the covariability of their instantaneous firing rates?" For another example, "How much of a neuron's observed spiking variability is caused by the variability of its instantaneous firing rate, and how much is caused by spike timing variability?" However, a neuron's theoretical firing rate is not necessarily well-defined. Consequently, neuroscientific questions involving the theoretical firing rate do not have a meaning in isolation but can only be interpreted in light of additional statistical modeling choices. Ignoring this ambiguity can lead to inconsistent reasoning or wayward conclusions. We illustrate these issues with examples drawn from the neural-coding literature. PMID:25934918

  3. Common misconceptions about data analysis and statistics.

    PubMed Central

    Motulsky, Harvey J

    2015-01-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: (1) P-Hacking. This is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want. (2) Overemphasis on P values rather than on the actual size of the observed effect. (3) Overuse of statistical hypothesis testing, and being seduced by the word “significant”. (4) Overreliance on standard errors, which are often misunderstood. PMID:25692012

  4. Statistical analysis of motion contrast in optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Cheng, Yuxuan; Guo, Li; Pan, Cong; Lu, Tongtong; Hong, Tianyu; Ding, Zhihua; Li, Peng

    2015-11-01

    Optical coherence tomography angiography (Angio-OCT), mainly based on the temporal dynamics of OCT scattering signals, has found a range of potential applications in clinical and scientific research. Based on the model of random phasor sums, temporal statistics of the complex-valued OCT signals are mathematically described. Statistical distributions of the amplitude differential and complex differential Angio-OCT signals are derived. The theories are validated through the flow phantom and live animal experiments. Using the model developed, the origin of the motion contrast in Angio-OCT is mathematically explained, and the implications in the improvement of motion contrast are further discussed, including threshold determination and its residual classification error, averaging method, and scanning protocol. The proposed mathematical model of Angio-OCT signals can aid in the optimal design of the system and associated algorithms.
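
    As a toy numerical check of the random-phasor-sum model (not the authors' code), one can verify that the amplitude of a sum of many independent, uniformly phased phasors follows a Rayleigh distribution:

      # Simulate random phasor sums and fit a Rayleigh distribution to the
      # resulting amplitudes; all parameters here are illustrative.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      n_phasors, n_trials = 100, 20000
      phases = rng.uniform(0, 2 * np.pi, size=(n_trials, n_phasors))
      signal = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_phasors)
      amplitude = np.abs(signal)

      loc, scale = stats.rayleigh.fit(amplitude, floc=0)
      print("fitted Rayleigh scale:", round(scale, 3))  # ~1/sqrt(2) here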

  5. Statistical Analysis of CFD Solutions from the Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.

    2002-01-01

    A simple, graphical framework is presented for robust statistical evaluation of results obtained from N-version testing of a series of RANS CFD codes. The solutions were obtained by a variety of code developers and users for the June 2001 Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration used for the computational tests is the DLR-F4 wing-body combination, previously tested in several European wind tunnels and for which a previous N-version test had been conducted. The statistical framework is used to evaluate code results for (1) a single cruise design point, (2) drag polars and (3) drag rise. The paper concludes with a discussion of the meaning of the results, especially with respect to predictability, validation, and reporting of solutions.

  6. Statistical analysis of sparse data: Space plasma measurements

    NASA Astrophysics Data System (ADS)

    Roelof, Edmond C.

    2012-05-01

    Some operating space plasma instruments, e.g., ACE/SWICS, can have low counting rates (<1 count/sample). A novel approach has been suggested [1] that estimates counting rates (x) from "strings" of samples with (k) zeros followed by a non-zero count (n >= 1), using x' = n/(k+1) for each string. We apply Poisson statistics to obtain the ensemble-averaged expectation value of x' and its standard deviation (s.d.) as a function of the (unknown) true rate (x). We find that x' > x for all true rates (particularly for x < 1), but interestingly that the s.d. of x' can be less than the usual Poisson s.d. from (k+1) samples. We suggest a statistical theoretical "correction" for each bin rate that will, on average, compensate for this sampling bias.
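
    A quick Monte Carlo check of the string estimator x' = n/(k+1) (all parameters illustrative) reproduces the upward bias described:

      # Draw Poisson counts at a known true rate x, split them into strings of
      # k zeros followed by a nonzero count n, and compare mean(x') with x.
      import numpy as np

      rng = np.random.default_rng(4)
      x_true = 0.3                          # true counts per sample (illustrative)
      counts = rng.poisson(x_true, size=200000)

      estimates, zeros = [], 0
      for c in counts:
          if c == 0:
              zeros += 1
          else:
              estimates.append(c / (zeros + 1))
              zeros = 0

      print("true rate:", x_true, "mean of x':", round(np.mean(estimates), 3))
      # The mean of x' overestimates x, consistent with the bias noted above.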

  7. ANOVA like analysis of cancer death age

    NASA Astrophysics Data System (ADS)

    Areia, Aníbal; Mexia, João T.

    2016-06-01

    We use ANOVA to study the influence of year, sex, country and location on the average cancer death age. The data used were from the World Health Organization (WHO) files for 1999, 2003, 2007 and 2011. The locations considered were kidney, leukaemia, melanoma of skin and oesophagus; the countries were Portugal, Norway, Greece and Romania.
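
    A minimal one-way ANOVA sketch in the same spirit, with synthetic death ages standing in for the WHO data:

      # Does mean death age differ across cancer locations? One-way ANOVA on
      # synthetic placeholder samples, not WHO data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      kidney = rng.normal(70, 8, 50)
      leukaemia = rng.normal(65, 10, 50)
      melanoma = rng.normal(63, 9, 50)
      oesophagus = rng.normal(68, 7, 50)

      F, p = stats.f_oneway(kidney, leukaemia, melanoma, oesophagus)
      print(f"F = {F:.2f}, p = {p:.4f}")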

  8. Computational and Statistical Analysis of Protein Mass Spectrometry Data

    PubMed Central

    Noble, William Stafford; MacCoss, Michael J.

    2012-01-01

    High-throughput proteomics experiments involving tandem mass spectrometry produce large volumes of complex data that require sophisticated computational analyses. As such, the field offers many challenges for computational biologists. In this article, we briefly introduce some of the core computational and statistical problems in the field and then describe a variety of outstanding problems that readers of PLoS Computational Biology might be able to help solve. PMID:22291580

  9. CAG-Repeat Length and the Age of Onset in Huntington Disease (HD): A Review and Validation Study of Statistical Approaches

    PubMed Central

    Langbehn, Douglas R.; Hayden, Michael; Paulsen, Jane S.

    2011-01-01

    Background: CAG-repeat length in the gene for HD is inversely correlated with age of onset (AOO). A number of statistical models elucidating the relationship between CAG length and AOO have recently been published. In the present article, we review the published formulae and summarize essential differences in subject sources, statistical methodologies, and predictive results. We argue that unrepresentative sampling and failure to use appropriate survival analysis methodology may have substantially biased much of the literature. We also explain why the survival analysis perspective is necessary if any such model is to undergo prospective validation. Methods: We use prospective diagnostic data from the PREDICT-HD longitudinal study of CAG-expanded participants to test conditional predictions derived from two survival models of age of onset of HD. Principal Findings: A prior model of the relationship of CAG and AOO originally published by Langbehn et al. yields reasonably accurate predictions, while a similar model by Gutierrez and MacDonald substantially overestimates diagnosis risk for all but the highest-risk subjects in this sample. Conclusions/Significance: The Langbehn et al. model appears accurate enough to have substantial utility in various research contexts. We also emphasize remaining caveats, many of which are relevant for any direct application to genetic counseling. PMID:19548255

  10. Statistics Education Research in Malaysia and the Philippines: A Comparative Analysis

    ERIC Educational Resources Information Center

    Reston, Enriqueta; Krishnan, Saras; Idris, Noraini

    2014-01-01

    This paper presents a comparative analysis of statistics education research in Malaysia and the Philippines by modes of dissemination, research areas, and trends. An electronic search for published research papers in the area of statistics education from 2000-2012 yielded 20 for Malaysia and 19 for the Philippines. Analysis of these papers showed…

  11. Child Mortality in a Developing Country: A Statistical Analysis

    ERIC Educational Resources Information Center

    Uddin, Md. Jamal; Hossain, Md. Zakir; Ullah, Mohammad Ohid

    2009-01-01

    This study uses data from the "Bangladesh Demographic and Health Survey (BDHS) 1999-2000" to investigate the predictors of child (age 1-4 years) mortality in a developing country like Bangladesh. The cross-tabulation and multiple logistic regression techniques have been used to estimate the predictors of child mortality. The cross-tabulation…

  12. Condition of America's Public School Facilities, 1999. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Lewis, Laurie; Snow, Kyle; Farris, Elizabeth; Smerdon, Becky; Cronen, Stephanie; Kaplan, Jessica

    This report provides national data for 903 U.S. public elementary and secondary schools on the condition of public schools in 1999 and the costs to bring them into good condition. Additionally provided are school plans for repairs, renovations, and replacements; data on the age of public schools; and overcrowding and practices used to address…

  13. Studies in astronomical time series analysis. II - Statistical aspects of spectral analysis of unevenly spaced data

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1982-01-01

    Detection of a periodic signal hidden in noise is frequently a goal in astronomical data analysis. This paper does not introduce a new detection technique, but instead studies the reliability and efficiency of detection with the most commonly used technique, the periodogram, in the case where the observation times are unevenly spaced. This choice was made because, of the methods in current use, it appears to have the simplest statistical behavior. A modification of the classical definition of the periodogram is necessary in order to retain the simple statistical behavior of the evenly spaced case. With this modification, periodogram analysis and least-squares fitting of sine waves to the data are exactly equivalent. Certain difficulties with the use of the periodogram are less important than commonly believed in the case of detection of strictly periodic signals. In addition, the standard method for mitigating these difficulties (tapering) can be used just as well if the sampling is uneven. An analysis of the statistical significance of signal detections is presented, with examples.
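
    A minimal example of periodogram detection on unevenly spaced samples, assuming astropy is available; the modified (Lomb-Scargle) periodogram it implements is equivalent to least-squares fitting of sine waves, as noted above:

      # Detect a sinusoid hidden in noise at uneven observation times using
      # astropy's Lomb-Scargle periodogram; signal parameters are illustrative.
      import numpy as np
      from astropy.timeseries import LombScargle

      rng = np.random.default_rng(6)
      t = np.sort(rng.uniform(0, 100, 120))           # uneven observation times
      y = np.sin(2 * np.pi * 0.17 * t) + rng.standard_normal(t.size)

      frequency, power = LombScargle(t, y).autopower()
      print("peak frequency:", round(frequency[np.argmax(power)], 3))  # ~0.17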

  14. How Many Studies Do You Need? A Primer on Statistical Power for Meta-Analysis

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; Pigott, Therese D.; Rothstein, Hannah R.

    2010-01-01

    In this article, the authors outline methods for using fixed and random effects power analysis in the context of meta-analysis. Like statistical power analysis for primary studies, power analysis for meta-analysis can be done either prospectively or retrospectively and requires assumptions about parameters that are unknown. The authors provide…

  15. Statistical analysis of wing/fin buffeting response

    NASA Astrophysics Data System (ADS)

    Lee, B. H. K.

    2002-05-01

    The random nature of the aerodynamic loading on the wing and tail structures of an aircraft makes it necessary to adopt a statistical approach in the prediction of the buffeting response. This review describes a buffeting prediction technique based on rigid model pressure measurements that is commonly used in North America, and also the buffet excitation parameter technique favored by many researchers in the UK. It is shown that the two models are equivalent and have their origin based on a statistical theory of the response of a mechanical system to a random load. In formulating the model for predicting aircraft response at flight conditions using rigid model wind tunnel pressure measurements, the wing (fin) is divided into panels, and the load is computed from measured pressure fluctuations at the center of each panel. The methods used to model pressure correlation between panels are discussed. The coupling between the wing (fin) motion and the induced aerodynamics using a doublet-lattice unsteady aerodynamics code is described. The buffet excitation parameter approach to predict flight test response using wind tunnel model data is derived from the equations for the pressure model formulation. Examples of flight correlation with prediction based on wind tunnel measurements for wing and vertical tail buffeting response are presented for a number of aircraft. For rapid maneuvers inside the buffet regime, the statistical properties of the buffet load are usually non-stationary because of the short time records and difficulties in maintaining constant flight conditions. The time history of the applied load is segmented into a number of time intervals. In each time segment, the non-stationary load is represented as a product of a deterministic shaping function and a random function. Various forms of the load power spectral density that permits analytical solution of the mean square displacement and acceleration response are considered. Illustrations are given using F

  16. Statistical analysis of brain sulci based on active ribbon modeling

    NASA Astrophysics Data System (ADS)

    Barillot, Christian; Le Goualher, Georges; Hellier, Pierre; Gibaud, Bernard

    1999-05-01

    This paper presents a general statistical framework for modeling deformable objects, intended for use in digital brain atlases. We first present a numerical model of brain sulci, along with a method to characterize the high inter-individual variability of the basic cortical structures on which the description of the cerebral cortex is based. The intended applications use numerical modeling of brain sulci to assist non-linear registration of human brains through inter-individual anatomical matching, or to better compare neuro-functional recordings performed on a series of individuals. The use of these methods is illustrated with a few examples.

  17. Improving the Conduct and Reporting of Statistical Analysis in Psychology.

    PubMed

    Sijtsma, Klaas; Veldkamp, Coosje L S; Wicherts, Jelte M

    2016-03-01

    We respond to the commentaries Waldman and Lilienfeld (Psychometrika, 2015) and Wigboldus and Dotch (Psychometrika, 2015) provided in response to Sijtsma's (Sijtsma in Psychometrika, 2015) discussion article on questionable research practices. Specifically, we discuss the fear of an increased dichotomy between substantive and statistical aspects of research that may arise when the latter aspects are laid entirely in the hands of a statistician, remedies for false positives and replication failure, and the status of data exploration, and we provide a re-definition of the concept of questionable research practices. PMID:25820978

  18. Statistical analysis of several terminal area traffic collision hazard factors.

    NASA Technical Reports Server (NTRS)

    Ruetenik, J. R.

    1972-01-01

    An 11 hr sample of air traffic, comprising 584 tracks recorded at Atlanta during peak periods of August 1967, is analyzed to examine the statistical characteristics of range-guard intrusions and airspace conflicts in a terminal area. The number of intrusions (of an imaginary 3-naut mile, 500-ft range guard surrounding each aircraft) and number of conflicts (of the projected airspace for two aircraft) for a track exhibit Poisson variations with track duration. The hourly rate of intrusions follows the gas model square-law variation with traffic density, but the hourly conflict rate, contrary to the gas model, decreases with greater traffic density.

  19. Statistical Analysis of Noisy Signals Using Classification Tools

    SciTech Connect

    Thompson, Sandra E.; Heredia-Langner, Alejandro; Johnson, Timothy J.; Foster, Nancy S.; Valentine, Nancy B.; Amonette, James E.

    2005-06-04

    The potential use of chemicals, biotoxins, and biological pathogens poses a threat to military and police forces as well as the general public. Rapid identification of these agents is made difficult by the noisy nature of the signal obtained from portable, in-field sensors. In previously published articles, we created a flowchart that illustrated a method for triaging bacterial identification by combining standard statistical techniques for discrimination and identification with mid-infrared spectroscopic data. The present work documents the process of characterizing and eliminating the sources of the noise and outlines how multidisciplinary teams are necessary to accomplish that goal.

  20. Statistical analysis of multivariate atmospheric variables. [cloud cover

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.

    1979-01-01

    Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.

  1. Statistical analysis of kerf mark measurements in bone

    PubMed Central

    Wang, Yishi; van de Goot, Frank R. W.; Gerretsen, Reza R. R.

    2010-01-01

    Saw marks on bone have been routinely reported in dismemberment cases. When saw blade teeth contact bone and the bone is not completely sawed into two parts, bone fragments are removed forming a channel or kerf. Therefore, kerf width can approximate the thickness of the saw blade. The purpose of this study is to evaluate 100 saw kerf widths in bone produced by ten saw types to determine if a saw can be eliminated based on the kerf width. Five measurements were taken from each of the 100 saw kerfs to establish an average thickness for each kerf mark. Ten cuts were made on 10 sections of bovine bone, five with human-powered saws and five with mechanical-powered saws. The cuts were examined with a stereoscopic microscope utilizing digital camera measuring software. Two statistical cumulative logistic regression models were used to analyze the saw kerf data collected. In order to estimate the prediction error, repeated stratified cross-validation was applied in analyzing the kerf mark data. Based on the two statistical models used, 70–90% of the saws could be eliminated based on kerf width. PMID:20652770

  2. Statistical analysis of bankrupting and non-bankrupting stocks

    NASA Astrophysics Data System (ADS)

    Li, Qian; Wang, Fengzhong; Wei, Jianrong; Liang, Yuan; Huang, Jiping; Stanley, H. Eugene

    2012-04-01

    The recent financial crisis has caused extensive world-wide economic damage, affecting in particular those who invested in companies that eventually filed for bankruptcy. A better understanding of stocks that become bankrupt would be helpful in reducing risk in future investments. Economists have conducted extensive research on this topic, and here we ask whether statistical physics concepts and approaches may offer insights into pre-bankruptcy stock behavior. To this end, we study all 20092 stocks listed in US stock markets for the 20-year period 1989-2008, including 4223 (21 percent) that became bankrupt during that period. We find that, surprisingly, the distributions of the daily returns of those stocks that become bankrupt differ significantly from those that do not. Moreover, these differences are consistent for the entire period studied. We further study the relation between the distribution of returns and the length of time until bankruptcy, and observe that larger differences of the distribution of returns correlate with shorter time periods preceding bankruptcy. This behavior suggests that sharper fluctuations in the stock price occur when the stock is closer to bankruptcy. We also analyze the cross-correlations between the return and the trading volume, and find that stocks approaching bankruptcy tend to have larger return-volume cross-correlations than stocks that are not. Furthermore, the difference increases as bankruptcy approaches. We conclude that before a firm becomes bankrupt its stock exhibits unusual behavior that is statistically quantifiable.

  3. Statistical Analysis of the Indus Script Using n-Grams

    PubMed Central

    Yadav, Nisha; Joglekar, Hrishikesh; Rao, Rajesh P. N.; Vahia, Mayank N.; Adhikari, Ronojoy; Mahadevan, Iravatham

    2010-01-01

    The Indus script is one of the major undeciphered scripts of the ancient world. The small size of the corpus, the absence of bilingual texts, and the lack of definite knowledge of the underlying language have frustrated efforts at decipherment since the discovery of the remains of the Indus civilization. Building on previous statistical approaches, we apply the tools of statistical language processing, specifically n-gram Markov chains, to analyze the syntax of the Indus script. We find that unigrams follow a Zipf-Mandelbrot distribution. Text beginner and ender distributions are unequal, providing internal evidence for syntax. We see clear evidence of strong bigram correlations and extract significant pairs and triplets using a log-likelihood measure of association. Highly frequent pairs and triplets are not always highly significant. The model performance is evaluated using information-theoretic measures and cross-validation. The model can restore doubtfully read texts with an accuracy of about 75%. We find that a quadrigram Markov chain saturates information-theoretic measures against a held-out corpus. Our work forms the basis for the development of a stochastic grammar which may be used to explore the syntax of the Indus script in greater detail. PMID:20333254

  4. Statistical analysis of the Indus script using n-grams.

    PubMed

    Yadav, Nisha; Joglekar, Hrishikesh; Rao, Rajesh P N; Vahia, Mayank N; Adhikari, Ronojoy; Mahadevan, Iravatham

    2010-01-01

    The Indus script is one of the major undeciphered scripts of the ancient world. The small size of the corpus, the absence of bilingual texts, and the lack of definite knowledge of the underlying language have frustrated efforts at decipherment since the discovery of the remains of the Indus civilization. Building on previous statistical approaches, we apply the tools of statistical language processing, specifically n-gram Markov chains, to analyze the syntax of the Indus script. We find that unigrams follow a Zipf-Mandelbrot distribution. Text beginner and ender distributions are unequal, providing internal evidence for syntax. We see clear evidence of strong bigram correlations and extract significant pairs and triplets using a log-likelihood measure of association. Highly frequent pairs and triplets are not always highly significant. The model performance is evaluated using information-theoretic measures and cross-validation. The model can restore doubtfully read texts with an accuracy of about 75%. We find that a quadrigram Markov chain saturates information-theoretic measures against a held-out corpus. Our work forms the basis for the development of a stochastic grammar which may be used to explore the syntax of the Indus script in greater detail. PMID:20333254
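
    A toy version of the bigram-association step might look like the following; the symbol sequences are arbitrary stand-ins for inscriptions, and Dunning's log-likelihood ratio is used as the association measure:

      # Count sign bigrams in a tiny corpus and score each pair against an
      # independence null with the log-likelihood ratio (G^2) statistic.
      import math
      from collections import Counter

      texts = [["A", "B", "C"], ["A", "B"], ["C", "A", "B"], ["B", "C"]]
      bigrams = Counter()
      for t in texts:
          bigrams.update(zip(t[:-1], t[1:]))
      N = sum(bigrams.values())

      def llr(a, b):
          """Dunning log-likelihood ratio for bigram (a, b) vs. independence."""
          k11 = bigrams[(a, b)]
          k1_ = sum(v for (x, _), v in bigrams.items() if x == a)
          k_1 = sum(v for (_, y), v in bigrams.items() if y == b)
          k12, k21 = k1_ - k11, k_1 - k11
          k22 = N - k11 - k12 - k21
          def h(*ks):  # sum of k*log(k/N), skipping empty cells
              return sum(k * math.log(k / N) for k in ks if k > 0)
          return 2 * (h(k11, k12, k21, k22) - h(k1_, N - k1_) - h(k_1, N - k_1))

      for pair in sorted(bigrams, key=lambda p: -llr(*p)):
          print(pair, round(llr(*pair), 3))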

  5. Texture analysis with statistical methods for wheat ear extraction

    NASA Astrophysics Data System (ADS)

    Bakhouche, M.; Cointault, F.; Gouton, P.

    2007-01-01

    In the agronomic domain, the simplification of crop counting, necessary for yield prediction and agronomic studies, is an important project for technical institutes such as Arvalis. Although the main objective of our global project is to design a mobile robot for natural image acquisition directly in the field, Arvalis first proposed that we detect the number of wheat ears in images by image processing before counting them, which will provide the first component of the yield. In this paper we compare different texture image segmentation techniques based on feature extraction by first- and higher-order statistical methods, which were applied to our images. The extracted features are used for unsupervised pixel classification to obtain the different classes in the image; the K-means algorithm is applied before choosing a threshold to highlight the ears. Three methods have been tested in this feasibility study, with an average error of 6%. Although the quality of the detection is currently evaluated visually, automatic evaluation algorithms are being implemented. Moreover, other higher-order statistical methods will be implemented in the future, jointly with methods based on spatio-frequential transforms and specific filtering.
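
    A hedged sketch of the feature-extraction-plus-K-means step, using simple first-order statistics (local mean and standard deviation) on random stand-in data:

      # Compute first-order local texture features and cluster pixels with
      # K-means; the input image is synthetic, not a field photograph.
      import numpy as np
      from scipy.ndimage import uniform_filter
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(7)
      img = rng.random((64, 64))

      local_mean = uniform_filter(img, size=5)
      local_sq_mean = uniform_filter(img ** 2, size=5)
      local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 0))

      features = np.stack([local_mean.ravel(), local_std.ravel()], axis=1)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
      mask = labels.reshape(img.shape)  # candidate "ear" vs. background classes
      print(np.bincount(labels))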

  6. Power flow as a complement to statistical energy analysis and finite element analysis

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict response levels in an averaged form. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to form a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  7. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    PubMed

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The Megastat add-on program, which will soon be supported by MS Excel 2016, would eliminate some limitations of using statistical formulas within MS Excel. PMID:27135620

  8. Statistical mechanical modeling: Computer simulations, analysis and applications

    NASA Astrophysics Data System (ADS)

    Subramanian, Balakrishna

    This thesis describes the applications of statistical mechanical models and tools, especially computational techniques, to the study of several problems in science. We study in chapter 2 various properties of a non-equilibrium cellular automaton model, the Toom model. We obtain numerically the exponents describing the fluctuations of the interface between the two stable phases of the model. In chapter 3, we introduce a binary alloy model with three-body potentials. Unlike the usual Ising-type models with two-body interactions, this model is not symmetric in its components. We calculate the exact low temperature phase diagram using Pirogov-Sinai theory and also find the mean-field equilibrium properties of this model. We then study the kinetics of phase segregation following a quench in this model. We find that the results are very similar to those obtained for Ising-type models with pair interactions, indicating universality. In chapter 4, we discuss the statistical properties of "Contact Maps". These maps are used to represent three-dimensional structures of proteins in modeling problems. We find that this representation space has particular properties that make it a convenient choice. The maps representing native folds of proteins correspond to compact structures, which in turn correspond to maps with low degeneracy, making it easier to translate the map into the detailed 3-dimensional structure. The early stage of formation of a river network is described in chapter 5 using quasi-random spanning trees on a square lattice. We observe that the statistical properties generated by these models are quite similar (better than some of the earlier models) to the empirical laws and results presented by geologists for real river networks. Finally, in chapter 6 we present a brief note on our study of the problem of progression of heterogeneous breast tumors. We investigate some of the possible pathways of progression based on the traditional notions of DCIS (Ductal

  9. Statistical Analysis of Seismicity in the Sumatra Region

    NASA Astrophysics Data System (ADS)

    Bansal, A.; Main, I.

    2007-12-01

    We examine the effect of the great M=9.0 Boxing Day 2004 earthquake on the statistics of seismicity in the Sumatra region by dividing data from the NEIC catalogue into two time windows, before and after the earthquake. First we determine a completeness threshold of magnitude 4.5 for the whole dataset from the stability of the maximum likelihood b-value with respect to changes in the threshold. The split data sets have similar statistical sampling, with 2563 events before and 3701 after the event. Temporal clustering is first quantified broadly by the fractal dimension of the time series, found to be 0.137, 0.259 and 0.222 before, after and for the whole dataset respectively, compared to a Poisson null hypothesis of 0, indicating a significant increase in temporal clustering after the event associated with aftershocks. To quantify this further we apply the Epidemic Type Aftershock Sequence (ETAS) model. The background seismicity rate μ and the coefficient α, a measure of the efficiency of an earthquake of a given magnitude in generating aftershocks, do not change significantly when averaged over the two time periods. In contrast, the amplitude A of aftershock generation changes by a factor of 4 or so, and there is a small but statistically significant increase in the Omori decay exponent p, indicating a faster decay rate of the aftershocks after the Sumatra earthquake. The ETAS model parameters were calculated for different magnitude thresholds (4.5, 5.0, 5.5), with similar results for the different thresholds. The α value increases from near 1 to near 1.5, possibly reflecting known changes in the scaling exponent between scalar moment and magnitude with increasing magnitude. A simple relation between magnitude and the span of aftershock activity indicates that detectable aftershock activity of the Sumatra earthquake may last up to 8.7 years. Earthquakes are predominantly in the depth range 30-40 km before the mainshock and 20-30 km after, compared to a CMT centroid
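
    The completeness-and-b-value step can be illustrated with Aki's maximum-likelihood estimator on a synthetic catalogue (the threshold and binning values are assumptions for the example):

      # Aki/Utsu maximum-likelihood b-value on synthetic Gutenberg-Richter
      # magnitudes, with assumed completeness Mc = 4.5 and 0.1-unit binning.
      import numpy as np

      rng = np.random.default_rng(8)
      b_true, Mc, dM = 1.0, 4.5, 0.1
      mags = Mc + rng.exponential(scale=1 / (b_true * np.log(10)), size=5000)
      mags = np.round(mags / dM) * dM               # binned catalogue magnitudes

      # Utsu's binning correction shifts the minimum magnitude by dM/2.
      b_hat = np.log10(np.e) / (mags.mean() - (Mc - dM / 2))
      print("estimated b-value:", round(b_hat, 3))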

  10. Analysis of surface sputtering on a quantum statistical basis

    NASA Technical Reports Server (NTRS)

    Wilhelm, H. E.

    1975-01-01

    Surface sputtering is explained theoretically by means of a 3-body sputtering mechanism involving the ion and two surface atoms of the solid. By means of quantum-statistical mechanics, a formula for the sputtering ratio S(E) is derived from first principles. The theoretical sputtering rate S(E) was found experimentally to be proportional to the square of the difference between incident ion energy and the threshold energy for sputtering of surface atoms at low ion energies. Extrapolation of the theoretical sputtering formula to larger ion energies indicates that S(E) reaches a saturation value and finally decreases at high ion energies. The theoretical sputtering ratios S(E) for wolfram, tantalum, and molybdenum are compared with the corresponding experimental sputtering curves in the low energy region from threshold sputtering energy to 120 eV above the respective threshold energy. Theory and experiment are shown to be in good agreement.

  11. Statistical Analysis of Haralick Texture Features to Discriminate Lung Abnormalities.

    PubMed

    Zayed, Nourhan; Elnemr, Heba A

    2015-01-01

    The Haralick texture features are a well-known mathematical method to detect lung abnormalities and give the physician the opportunity to localize the abnormal tissue type, either lung tumor or pulmonary edema. In this paper, a statistical evaluation of the different features represents the reported performance of the proposed method. CT datasets from thirty-seven patients with either lung tumor or pulmonary edema were included in this study. The CT images are first preprocessed for noise reduction and image enhancement, followed by segmentation techniques to segment the lungs, and finally Haralick texture features to detect the type of abnormality within the lungs. In spite of the presence of low contrast and high noise in the images, the proposed algorithms give promising results in detecting lung abnormalities in most of the patients, in comparison with normal cases, and suggest that some of the features are more strongly recommended than others. PMID:26557845
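
    A sketch of Haralick-style feature extraction with scikit-image (the functions are named greycomatrix/greycoprops in older releases); the region of interest is random data standing in for a segmented lung patch:

      # Build a gray-level co-occurrence matrix and read off Haralick-style
      # properties; the quantized "CT patch" below is synthetic.
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      rng = np.random.default_rng(9)
      roi = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)  # quantized patch

      glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                          levels=64, symmetric=True, normed=True)
      for prop in ("contrast", "homogeneity", "energy", "correlation"):
          print(prop, np.round(graycoprops(glcm, prop), 3))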

  12. Statistical thermal model analysis of particle production at LHC

    NASA Astrophysics Data System (ADS)

    Karasu Uysal, A.; Vardar, N.

    2016-04-01

    A successful description of the particle ratios measured in heavy-ion collisions has been achieved in the framework of thermal models. In such a way, a large number of observables can be reproduced with a small number of parameters, namely the temperature, the baryo-chemical potential and a factor measuring the degree of strangeness saturation. The comparison of experimental data with the model estimations has made it possible to determine the thermodynamic parameters of strongly interacting matter at the chemical freeze-out temperature. The detailed study of hadron and meson production, including resonances, using the statistical-thermal model is discussed. Their ratios are compared with the existing experimental data, and predictions are made for pp and heavy-ion collisions at RHIC and LHC energies.

  13. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  14. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicles (RLVs) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  15. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
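
    For readers unfamiliar with the underlying algorithm, the sketch below implements generic sum-product LBP on a small pairwise MRF; it is the textbook method the record builds on, not the authors' replica cluster variation analysis. All node potentials and couplings are hypothetical.

```python
import numpy as np

def loopy_bp(n_nodes, edges, phi, psi, n_iters=50):
    """Sum-product loopy belief propagation on a pairwise MRF.

    phi : (n_nodes, q) local evidence; psi : dict mapping each undirected
    edge (i, j) to a (q, q) compatibility table with rows indexing node i.
    Returns (n_nodes, q) approximate marginals (beliefs)."""
    q = phi.shape[1]
    nbrs = {i: set() for i in range(n_nodes)}
    for i, j in edges:
        nbrs[i].add(j)
        nbrs[j].add(i)
    msg = {}
    for i, j in edges:
        msg[(i, j)] = np.full(q, 1.0 / q)
        msg[(j, i)] = np.full(q, 1.0 / q)

    def table(i, j):  # orient the psi table so rows index node i
        return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

    for _ in range(n_iters):
        new = {}
        for i, j in msg:
            prod = phi[i].copy()
            for k in nbrs[i] - {j}:
                prod *= msg[(k, i)]
            out = table(i, j).T @ prod          # marginalize over states of i
            new[(i, j)] = out / out.sum()
        msg = new

    beliefs = phi.copy()
    for i in range(n_nodes):
        for k in nbrs[i]:
            beliefs[i] *= msg[(k, i)]
    return beliefs / beliefs.sum(axis=1, keepdims=True)

# Tiny example: a 3-cycle with ferromagnetic couplings and strong evidence on
# node 0; LBP pulls nodes 1 and 2 toward the same state.
couple = np.exp(0.5 * np.array([[1.0, -1.0], [-1.0, 1.0]]))
phi = np.array([[0.9, 0.1], [0.5, 0.5], [0.5, 0.5]])
edges = [(0, 1), (1, 2), (0, 2)]
print(loopy_bp(3, edges, phi, {e: couple for e in edges}))
```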

  16. Statistical analysis of Nomao customer votes for spots of France

    NASA Astrophysics Data System (ADS)

    Pálovics, Róbert; Daróczy, Bálint; Benczúr, András; Pap, Julia; Ermann, Leonardo; Phan, Samuel; Chepelianskii, Alexei D.; Shepelyansky, Dima L.

    2015-08-01

    We investigate the statistical properties of customer votes for spots of France collected by the startup company Nomao. The frequencies of votes per spot and per customer are characterized by a power-law distribution which remains stable on a time scale of a decade, even as the number of votes varies by almost two orders of magnitude. Using computer science methods, we explore the spectrum and the eigenvalues of a matrix containing user ratings of geolocalized items. Eigenvalues map nicely to large towns and regions, but show a certain level of instability as we modify the interpretation of the underlying matrix. We evaluate imputation strategies that provide improved prediction performance by reaching geographically smooth eigenvectors. We point out possible links between the distribution of votes and the phenomenon of self-organized criticality.
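
    The power-law claim above is normally backed by a maximum-likelihood exponent rather than a histogram slope. A minimal sketch using the continuous MLE of Clauset, Shalizi and Newman (the vote counts here are synthetic, not Nomao data):

```python
import numpy as np

def powerlaw_alpha(x, x_min):
    """Continuous MLE of a power-law exponent (Clauset, Shalizi & Newman 2009):
    alpha = 1 + n / sum(ln(x_i / x_min)) over the tail x_i >= x_min."""
    tail = x[x >= x_min]
    return 1.0 + tail.size / np.log(tail / x_min).sum()

# Hypothetical vote counts per spot: a Pareto sample whose density exponent
# is alpha = 2.5 (np.random.pareto(a) + 1 has p(x) ~ x^-(a+1)).
rng = np.random.default_rng(2)
votes = rng.pareto(1.5, size=20000) + 1.0
print(f"estimated alpha = {powerlaw_alpha(votes, x_min=1.0):.2f}")  # ~2.5
```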

  17. A Statistical Analysis of Exoplanets in Their Habitable Zones

    NASA Astrophysics Data System (ADS)

    Adams, Arthur; Kane, S. R.

    2014-01-01

    The Kepler mission has detected a wealth of planets through planetary transits since its launch in 2009. An important step in the continued study of exoplanets is to characterize planets based on their orbital properties and compositions. As the Kepler mission has progressed the data sensitivity to planetary transits at longer orbital periods has increased. This allows for an enhanced probability of detecting planets which lie in the Habitable Zones (HZs) of their host stars. We present the results of statistical analyses of Kepler planetary candidates to study the percentage of orbital time spent in the HZ as a function of planetary parameters, including planetary mass, radius, and orbital eccentricity. We compare these results to the confirmed exoplanet population.

  19. New Statistical Methods for the Analysis of the Cratering on Venus

    NASA Astrophysics Data System (ADS)

    Xie, M.; Smrekar, S. E.; Handcock, M. S.

    2014-12-01

    The sparse crater population (~1000 craters) on Venus is the most important clue for determining the planet's surface age and aids in understanding its geologic history. What processes (volcanism, tectonism, weathering, etc.) modify the total impact crater population? Are the processes regional or global in occurrence? The heated debate on these questions points to the need for better approaches. We present new statistical methods for the analysis of crater locations and characteristics. Specifically: 1) We produce a map of crater density and of the proportion of no-halo craters (inferred to be modified) by using generalized additive models and smoothing splines with a spherical spline basis set. Based on this map, we are able to predict the probability that a crater has no halo, given that there is a crater at that point. We also obtain a continuous representation of the ratio of craters with no halo as a function of crater density. This approach allows us to look for regions that appear to have experienced more or less modification, and are thus potentially older or younger. 2) We examine the randomness or clustering of the distributions of craters by type (e.g., dark floored, intermediate). For example, for dark-floored craters we consider two hypotheses: i) the dark-floored craters are randomly distributed on the surface; ii) the dark-floored craters are random given the locations of the crater population. Instead of using only a single measure such as the average nearest-neighbor distance, we use the probability density function of these distances and compare it to complete spatial randomness to get the relative probability density function. This function gives us a clearer picture of how and where the nearest-neighbor distances differ from complete spatial randomness. We also conduct statistical tests of these hypotheses. Confidence intervals with specified global coverage are constructed. Software to reproduce the methods is available in the open source statistics
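
    The relative probability density idea in point 2) can be illustrated in a few lines: compute nearest-neighbour distances for the observed points and divide their histogram by one built from simulated complete spatial randomness (CSR) on the sphere. The sketch below uses synthetic uniform points in place of the real crater catalogue:

```python
import numpy as np

def nn_angles(lonlat):
    """Nearest-neighbour great-circle distances (radians) for points on a sphere."""
    lon, lat = np.radians(lonlat[:, 0]), np.radians(lonlat[:, 1])
    xyz = np.column_stack([np.cos(lat) * np.cos(lon),
                           np.cos(lat) * np.sin(lon),
                           np.sin(lat)])
    cosd = np.clip(xyz @ xyz.T, -1.0, 1.0)
    np.fill_diagonal(cosd, -1.0)                # exclude self-matches
    return np.arccos(cosd.max(axis=1))

def csr_sample(n, rng):
    """Complete spatial randomness on the sphere: uniform lon, sin-uniform lat."""
    lon = rng.uniform(-180.0, 180.0, n)
    lat = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n)))
    return np.column_stack([lon, lat])

rng = np.random.default_rng(3)
craters = csr_sample(1000, rng)                 # stand-in for ~1000 Venus craters
obs = nn_angles(craters)
ref = np.concatenate([nn_angles(csr_sample(1000, rng)) for _ in range(50)])

# Relative PDF: observed nearest-neighbour distances against the CSR ensemble.
bins = np.linspace(0.0, 2.0 * obs.max(), 30)
h_obs, _ = np.histogram(obs, bins=bins, density=True)
h_ref, _ = np.histogram(ref, bins=bins, density=True)
rel = np.divide(h_obs, h_ref, out=np.zeros_like(h_obs), where=h_ref > 0)
print(np.round(rel, 2))                         # ~1 everywhere => consistent with CSR
```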

  20. Uncovering the Formation of Ultracompact Dwarf Galaxies by Multivariate Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Tanuka; Sharina, Margarita; Davoust, Emmanuel; De, Tuli; Chattopadhyay, Asis Kumar

    2012-05-01

    We present a statistical analysis of the properties of a large sample of dynamically hot old stellar systems, from globular clusters (GCs) to giant ellipticals, which was performed in order to investigate the origin of ultracompact dwarf galaxies (UCDs). The data were mostly drawn from Forbes et al. We recalculated some of the effective radii, computed mean surface brightnesses and mass-to-light ratios, and estimated ages and metallicities. We completed the sample with GCs of M31. We used a multivariate statistical technique (K-Means clustering), together with a new algorithm (Gap Statistics) for finding the optimum number of homogeneous sub-groups in the sample, using a total of six parameters (absolute magnitude, effective radius, virial mass-to-light ratio, stellar mass-to-light ratio, and metallicity). We found six groups. FK1 and FK5 are composed of high- and low-mass elliptical galaxies, respectively. FK3 and FK6 are composed of high-metallicity and low-metallicity objects, respectively, and both include GCs and UCDs. Two very small groups, FK2 and FK4, are composed of Local Group dwarf spheroidals. Our groups differ in their mean masses and virial mass-to-light ratios. The relations between these two parameters are also different for the various groups. The probability density distributions of metallicity for the four groups of galaxies are similar to those of the GCs and UCDs. The brightest low-metallicity GCs and UCDs tend to follow the mass-metallicity relation like elliptical galaxies. The objects of FK3 are more metal-rich per unit effective luminosity density than high-mass ellipticals.
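
    A minimal sketch of the K-Means plus gap-statistic procedure named above (scikit-learn assumed; the data, and the simple argmax decision rule in place of the full one-standard-error rule, are illustrative only):

```python
import numpy as np
from sklearn.cluster import KMeans

def gap_statistic(X, k_max=8, n_ref=20, seed=0):
    """Gap statistic (Tibshirani et al. 2001): log within-cluster dispersion of
    the data compared with reference sets drawn uniformly over the data's box."""
    rng = np.random.default_rng(seed)
    lo, hi = X.min(axis=0), X.max(axis=0)
    gaps = []
    for k in range(1, k_max + 1):
        w = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X).inertia_
        w_ref = [KMeans(n_clusters=k, n_init=10, random_state=seed)
                 .fit(rng.uniform(lo, hi, X.shape)).inertia_
                 for _ in range(n_ref)]
        gaps.append(np.mean(np.log(w_ref)) - np.log(w))
    return np.array(gaps)

# Hypothetical six-parameter sample with three intrinsic groups.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(c, 0.3, (80, 6)) for c in (0.0, 2.0, 4.0)])
print("optimum number of groups =", int(np.argmax(gap_statistic(X))) + 1)
```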

  1. Statistical Analysis of GPS Vertical Uplift Rates in Southern California

    NASA Astrophysics Data System (ADS)

    Howell, S. M.; Smith-Konter, B. R.; Frazer, L. N.; Tong, X.; Sandwell, D. T.

    2014-12-01

    Variations in crustal surface velocities obtained from GPS stations provide key constraints on physical models that predict surface deformation in response to earthquake cycle loading processes. Vertical GPS velocities, however, are highly susceptible to short-scale (<10s of km) variations in both magnitude and direction induced by local changes in water storage, pore pressure, precipitation, and water runoff. These short-wavelength spatial variations both dominate and contaminate vertical GPS velocity measurements and often mask coherent long-wavelength deformation signals. Because of these complications, vertical GPS velocities, like those provided by EarthScope's Plate Boundary Observatory (PBO), have traditionally been omitted from crustal deformation models. Here we attempt to overcome these obstacles by first eliminating GPS velocities influenced by non-tectonic deformation sources based on high-resolution InSAR data. Second, we employ model selection, a statistical technique that provides an objective and robust estimate of the velocity field that best describes the regional signal without overfitting the highly variable short-wavelength noise. Spline-based interpolation techniques are also used to corroborate these models. We compare these results to published physical models that simulate 3D viscoelastic earthquake cycle deformation and find that the statistical PBO vertical velocity model is in good agreement (0.55 mm/yr residual) with physical model predictions of vertical deformation in Southern California. We also utilize sources of disagreement as a tool for improving our physical model and to further inspect non-tectonic sources of deformation. Moreover, these results suggest that vertical GPS velocities can be used as additional physical model constraints, leading to a better understanding of faulting parameters that are critical to seismic hazard analyses.

  2. Statistical analysis of imperfection effect on cylindrical buckling response

    NASA Astrophysics Data System (ADS)

    Ismail, M. S.; Purbolaksono, J.; Muhammad, N.; Andriyana, A.; Liew, H. L.

    2015-12-01

    It is widely reported that no efficient guidelines for modelling imperfections in composite structures are available. In response, this work evaluates the imperfection factors of an axially compressed Carbon Fibre Reinforced Polymer (CFRP) cylinder with different ply angles through finite element (FE) analysis. The sensitivity of the imperfection factors was analysed using a design-of-experiments (factorial design) approach. The analysis identified three critical factors to which the buckling load is sensitive. Furthermore, an empirical equation is proposed for each type of cylinder. Critical buckling loads estimated by the empirical equations show good agreement with the FE analysis. The design-of-experiments methodology is useful in identifying the parameters that govern a structure's imperfection tolerance.

  3. Statistical Continuum Mechanics Analysis of Effective Elastic Properties in Solid Oxide Fuel Cell Glass–Ceramic Seal Material

    SciTech Connect

    Milhans, Jacqueline; Li, Dongsheng; Khaleel, Mohammad A.; Sun, Xin; Garmestani, Hamid

    2010-09-01

    A full statistical analysis of the microstructure of the glass–ceramic solid oxide fuel cell (SOFC) seal material G18 is performed to calculate its elastic properties. Predictions are made for samples aged for 4 h and 1000 h, giving different crystallinity levels. The microstructure of the glass–ceramic G18 is characterized by a correlation function for each individual phase. Predicted results are compared with the Voigt and Reuss bounds in this study. The weak-contrast analysis results in elastic modulus predictions between the upper and lower bounds but closer to the upper bound.

  4. Paleotempestological chronology developed from gas ion source AMS analysis of carbonates determined through real-time Bayesian statistical approach

    NASA Astrophysics Data System (ADS)

    Wallace, D. J.; Rosenheim, B. E.; Roberts, M. L.; Burton, J. R.; Donnelly, J. P.; Woodruff, J. D.

    2014-12-01

    Is a small quantity of high-precision ages more robust than a higher quantity of lower-precision ages for sediment core chronologies? AMS Radiocarbon ages have been available to researchers for several decades now, and precision of the technique has continued to improve. Analysis and time cost is high, though, and projects are often limited in terms of the number of dates that can be used to develop a chronology. The Gas Ion Source at the National Ocean Sciences Accelerator Mass Spectrometry Facility (NOSAMS), while providing lower-precision (uncertainty of order 100 14C y for a sample), is significantly less expensive and far less time consuming than conventional age dating and offers the unique opportunity for large amounts of ages. Here we couple two approaches, one analytical and one statistical, to investigate the utility of an age model comprised of these lower-precision ages for paleotempestology. We use a gas ion source interfaced to a gas-bench type device to generate radiocarbon dates approximately every 5 minutes while determining the order of sample analysis using the published Bayesian accumulation histories for deposits (Bacon). During two day-long sessions, several dates were obtained from carbonate shells in living position in a sediment core comprised of sapropel gel from Mangrove Lake, Bermuda. Samples were prepared where large shells were available, and the order of analysis was determined by the depth with the highest uncertainty according to Bacon. We present the results of these analyses as well as a prognosis for a future where such age models can be constructed from many dates that are quickly obtained relative to conventional radiocarbon dates. This technique currently is limited to carbonates, but development of a system for organic material dating is underway. We will demonstrate the extent to which sacrificing some analytical precision in favor of more dates improves age models.

  5. Use of Brain MRI Atlases to Determine Boundaries of Age-Related Pathology: The Importance of Statistical Method

    PubMed Central

    Dickie, David Alexander; Job, Dominic E.; Gonzalez, David Rodriguez; Shenkin, Susan D.; Wardlaw, Joanna M.

    2015-01-01

    Introduction: Neurodegenerative disease diagnoses may be supported by the comparison of an individual patient’s brain magnetic resonance image (MRI) with a voxel-based atlas of normal brain MRI. Most current brain MRI atlases are of young to middle-aged adults and parametric, e.g., mean ± standard deviation (SD); these atlases require data to be Gaussian. Brain MRI data, e.g., grey matter (GM) proportion images, from normal older subjects are apparently not Gaussian. We created a nonparametric and a parametric atlas of the normal limits of GM proportions in older subjects and compared their classifications of GM proportions in Alzheimer’s disease (AD) patients. Methods: Using publicly available brain MRI from 138 normal subjects and 138 subjects diagnosed with AD (all 55–90 years), we created: a mean ± SD atlas to estimate parametrically the percentile ranks and limits of normal ageing GM; and, separately, a nonparametric, rank order-based GM atlas from the same normal ageing subjects. GM images from AD patients were then classified with respect to each atlas to determine the effect statistical distributions had on classifications of proportions of GM in AD patients. Results: The parametric atlas often defined the lower normal limit of the proportion of GM to be negative (which does not make sense physiologically, as the lowest possible proportion is zero). Because of this, for approximately half of the AD subjects, 25–45% of voxels were classified as normal when compared to the parametric atlas, but were classified as abnormal when compared to the nonparametric atlas. These voxels were mainly concentrated in the frontal and occipital lobes. Discussion: To our knowledge, we have presented the first nonparametric brain MRI atlas. In conditions where there is increasing variability in brain structure, such as in old age, nonparametric brain MRI atlases may represent the limits of normal brain structure more accurately than parametric approaches. Therefore, we
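
    The statistical point is easy to demonstrate on synthetic data: for skewed proportions, a mean-minus-1.96-SD lower limit can go negative while a rank-based percentile cannot, which changes which patient voxels get flagged. A sketch (NumPy, with hypothetical data dimensions matching the record):

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical voxel data: grey-matter proportions for 138 normal older
# subjects at 1000 voxels, skewed rather than Gaussian.
gm = rng.beta(2.0, 8.0, size=(138, 1000))

# Parametric lower limit (mean - 1.96 SD) can fall below zero for skewed data;
# the rank-based 2.5th percentile cannot.
lo_param = gm.mean(axis=0) - 1.96 * gm.std(axis=0)
lo_rank = np.percentile(gm, 2.5, axis=0)
print("negative parametric limits:", int((lo_param < 0).sum()), "of 1000 voxels")
print("negative rank-based limits:", int((lo_rank < 0).sum()), "of 1000 voxels")

# A hypothetical patient is flagged abnormal wherever the value is below the limit.
patient = rng.beta(1.5, 10.0, size=1000)
print("abnormal voxels (parametric): ", int((patient < lo_param).sum()))
print("abnormal voxels (rank-based): ", int((patient < lo_rank).sum()))
```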

  6. New Statistical Approach to the Analysis of Hierarchical Data

    NASA Astrophysics Data System (ADS)

    Neuman, S. P.; Guadagnini, A.; Riva, M.

    2014-12-01

    Many variables possess a hierarchical structure reflected in how their increments vary in space and/or time. Quite commonly the increments (a) fluctuate in a highly irregular manner; (b) possess symmetric, non-Gaussian frequency distributions characterized by heavy tails that often decay with separation distance or lag; (c) exhibit nonlinear power-law scaling of sample structure functions in a midrange of lags, with breakdown in such scaling at small and large lags; (d) show extended power-law scaling (ESS) at all lags; and (e) display nonlinear scaling of power-law exponent with order of sample structure function. Some interpret this to imply that the variables are multifractal, which explains neither breakdowns in power-law scaling nor ESS. We offer an alternative interpretation consistent with all above phenomena. It views data as samples from stationary, anisotropic sub-Gaussian random fields subordinated to truncated fractional Brownian motion (tfBm) or truncated fractional Gaussian noise (tfGn). The fields are scaled Gaussian mixtures with random variances. Truncation of fBm and fGn entails filtering out components below data measurement or resolution scale and above domain scale. Our novel interpretation of the data allows us to obtain maximum likelihood estimates of all parameters characterizing the underlying truncated sub-Gaussian fields. These parameters in turn make it possible to downscale or upscale all statistical moments to situations entailing smaller or larger measurement or resolution and sampling scales, respectively. They also allow one to perform conditional or unconditional Monte Carlo simulations of random field realizations corresponding to these scales. Aspects of our approach are illustrated on field and laboratory measured porous and fractured rock permeabilities, as well as soil texture characteristics and neural network estimates of unsaturated hydraulic parameters in a deep vadose zone near Phoenix, Arizona. We also use our approach

  7. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their highdimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  8. [Discrimination of bamboo using FTIR spectroscopy and statistical analysis].

    PubMed

    Li, Lun; Liu, Gang; Zhang, Chuan-Yun; Ou, Quan-Hong; Zhang, Li; Zhao, Xing-Xiang

    2013-12-01

    Fourier transform infrared (FTIR) spectroscopy combined with principal component analysis (PCA) and hierarchical cluster analysis (HCA) was used to identify and classify bamboo leaves. FTIR spectra of fifty-four bamboo leaf samples belonging to six species were obtained. The results showed that the infrared spectra of bamboo leaves are similar, composed mainly of the bands of polysaccharides, protein, and lipids. The original spectra exhibit minor differences in the region of 1800-700 cm-1; the second-derivative spectra show apparent differences in the same region. Principal component analysis and hierarchical cluster analysis were performed on the second-derivative infrared spectra in the range from 1800 to 700 cm-1. The leaf samples were separated into six groups with an accuracy of 98% using the first three principal components, and with 100% accuracy using the third and fourth principal components. Hierarchical cluster analysis correctly clustered the bamboo leaf samples. This demonstrates that Fourier transform infrared spectroscopy combined with PCA and HCA can be used to discriminate bamboo at the species level with only a tiny leaf sample. PMID:24611374
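
    A minimal sketch of the pipeline described above (second-derivative spectra, then PCA, then hierarchical clustering) on synthetic spectra; SciPy and scikit-learn are assumed, since the record does not name its software, and the band shapes are hypothetical:

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
# Hypothetical spectra: 54 samples x 1101 points over 1800-700 cm-1, built
# from two species-like templates plus noise.
wn = np.linspace(1800.0, 700.0, 1101)
band = np.exp(-((wn - 1650.0) / 40.0) ** 2)          # amide-I-like band
spectra = np.vstack([s * band + rng.normal(0.0, 0.01, wn.size)
                     for s in [1.0] * 27 + [1.3] * 27])

# Second-derivative (Savitzky-Golay) spectra sharpen overlapping bands.
d2 = savgol_filter(spectra, window_length=15, polyorder=3, deriv=2, axis=1)

scores = PCA(n_components=4).fit_transform(d2)
labels = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print(labels)                                        # two recovered groups
```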

  9. On the Statistical Analysis of X-ray Polarization Measurements

    NASA Technical Reports Server (NTRS)

    Strohmayer, T. E.; Kallman, T. R.

    2013-01-01

    In many polarimetry applications, including observations in the X-ray band, the measurement of a polarization signal can be reduced to the detection and quantification of a deviation from uniformity of a distribution of measured angles of the form α + β cos²(φ − φ₀) (0 < φ < π). We explore the statistics of such polarization measurements using both Monte Carlo simulations as well as analytic calculations based on the appropriate probability distributions. We derive relations for the number of counts required to reach a given detection level (parameterized by β, the "number of σ's" of the measurement) appropriate for measuring the modulation amplitude α by itself (single interesting parameter case) or jointly with the position angle φ (two interesting parameters case). We show that for the former case, when the intrinsic amplitude is equal to the well-known minimum detectable polarization (MDP), it is, on average, detected at the 3σ level. For the latter case, when one requires a joint measurement at the same confidence level, more counts are needed, by a factor of approximately 2.2, than required to achieve the MDP level. We find that the position angle uncertainty at 1σ confidence is well described by the relation σ_φ = 28.5°/β.
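
    The amplitude and position-angle estimators, and the MDP scaling quoted above, can be checked by simulation. A sketch (the Stokes-style estimators and the ideal modulation factor of 1 are standard simplifications, not taken from the record):

```python
import numpy as np

def draw_angles(n, a, phi0, rng):
    """Rejection-sample angles from pdf ∝ 1 + a*cos(2*(phi - phi0)) on (0, pi);
    this is the alpha + beta*cos^2 form rewritten with modulation amplitude a."""
    out = np.empty(0)
    while out.size < n:
        phi = rng.uniform(0.0, np.pi, 2 * n)
        keep = rng.uniform(0.0, 1.0 + a, phi.size) < 1.0 + a * np.cos(2 * (phi - phi0))
        out = np.concatenate([out, phi[keep]])
    return out[:n]

def estimate(phi):
    """Stokes-style estimators of modulation amplitude and position angle."""
    q = 2.0 * np.mean(np.cos(2 * phi))
    u = 2.0 * np.mean(np.sin(2 * phi))
    return np.hypot(q, u), 0.5 * np.arctan2(u, q)

rng = np.random.default_rng(7)
n, a_true = 20000, 0.10
amp, ang = estimate(draw_angles(n, a_true, 0.3, rng))
print(f"a_hat = {amp:.3f}, phi0_hat = {ang:.3f} rad")
# MDP at 99% confidence for an ideal polarimeter (modulation factor 1):
print(f"MDP99 = {4.29 / np.sqrt(n):.3f}")
```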

  10. Statistical analysis of properties of dwarf novae outbursts

    NASA Astrophysics Data System (ADS)

    Otulakowska-Hypka, Magdalena; Olech, Arkadiusz; Patterson, Joseph

    2016-08-01

    We present a statistical study of all measurable photometric features of a large sample of dwarf novae during their outbursts and superoutbursts. We used all accessible photometric data for all our objects to make the study as complete and up to date as possible. Our aim was to check correlations between these photometric features in order to constrain theoretical models which try to explain the nature of dwarf novae outbursts. We managed to confirm a few of the known correlations, that is the Stolz and Schoembs relation, the Bailey relation for long outbursts above the period gap, the relations between the cycle and supercycle lengths, amplitudes of normal and superoutbursts, amplitude and duration of superoutbursts, outburst duration and orbital period, outburst duration and mass ratio for short and normal outbursts, as well as the relation between the rise and decline rates of superoutbursts. However, we question the existence of the Kukarkin-Parenago relation but we found an analogous relation for superoutbursts. We also failed to find one presumed relation between outburst duration and mass ratio for superoutbursts. This study should help to direct theoretical work dedicated to dwarf novae.

  11. Tool for Statistical Analysis and Display of Landing Sites

    NASA Technical Reports Server (NTRS)

    Wawrzyniak, Geoffrey; Kennedy, Brian; Knocke, Philip; Michel, John

    2006-01-01

    MarsLS is a software tool for analyzing the statistical dispersion of spacecraft landing sites and displaying the results of its analyses. Originally intended for the Mars Exploration Rover (MER) mission, MarsLS is also applicable to landing sites on Earth and non-MER sites on Mars. MarsLS is a collection of interdependent MATLAB scripts that utilize the MATLAB graphical-user-interface environment to display landing-site data on calibrated image-maps of the Martian or other terrain. The landing-site data comprise latitude/longitude pairs generated by Monte Carlo runs of other computer programs that simulate entry, descent, and landing. Using these data, MarsLS can compute a landing-site ellipse, a standard means of depicting the area within which the spacecraft can be expected to land with a given probability. MarsLS incorporates several features for the user's convenience, including capabilities for drawing lines and ellipses, overlaying kilometer or latitude/longitude grids, drawing and/or specifying lines and/or points, entering notes, defining and/or displaying polygons to indicate hazards or areas of interest, and evaluating hazardous and/or scientifically interesting areas. As part of such an evaluation, MarsLS can compute the probability of landing in a specified polygonal area.
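
    The landing-site ellipse computation can be sketched as follows: fit a Gaussian to the Monte Carlo lat/lon cloud and scale its covariance eigenvectors to the requested probability content. This is a generic reconstruction, not MarsLS code, and the landing dispersion numbers are hypothetical:

```python
import numpy as np
from scipy.stats import chi2

def landing_ellipse(lon, lat, prob=0.99):
    """Ellipse containing `prob` of Gaussian-distributed landing points.
    Returns (center, semi-axes in degrees with major first, orientation in rad)."""
    pts = np.column_stack([lon, lat])
    center = pts.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(pts, rowvar=False))
    scale = chi2.ppf(prob, df=2)                    # Mahalanobis radius squared
    semi_axes = np.sqrt(scale * evals)[::-1]
    angle = np.arctan2(evecs[1, -1], evecs[0, -1])  # direction of the major axis
    return center, semi_axes, angle

# Hypothetical Monte Carlo entry/descent/landing output (degrees lon, lat).
rng = np.random.default_rng(8)
lon = rng.normal(175.40, 0.30, 2000)
lat = rng.normal(-14.60, 0.08, 2000) + 0.1 * (lon - 175.40)   # mild correlation
c, axes, ang = landing_ellipse(lon, lat)
print(f"center = {c}, semi-axes (deg) = {axes}, tilt = {np.degrees(ang):.1f} deg")
```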

  12. Statistically optimal analysis of samples from multiple equilibrium states

    PubMed Central

    Shirts, Michael R.; Chodera, John D.

    2008-01-01

    We present a new estimator for computing free energy differences and thermodynamic expectations as well as their uncertainties from samples obtained from multiple equilibrium states via either simulation or experiment. The estimator, which we call the multistate Bennett acceptance ratio estimator (MBAR) because it reduces to the Bennett acceptance ratio estimator (BAR) when only two states are considered, has significant advantages over multiple histogram reweighting methods for combining data from multiple states. It does not require the sampled energy range to be discretized to produce histograms, eliminating bias due to energy binning and significantly reducing the time complexity of computing a solution to the estimating equations in many cases. Additionally, an estimate of the statistical uncertainty is provided for all estimated quantities. In the large sample limit, MBAR is unbiased and has the lowest variance of any known estimator for making use of equilibrium data collected from multiple states. We illustrate this method by producing a highly precise estimate of the potential of mean force for a DNA hairpin system, combining data from multiple optical tweezer measurements under constant force bias. PMID:19045004
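
    The MBAR fixed-point equations are compact enough to sketch directly (the authors distribute a full implementation as the pymbar package; the toy two-state harmonic system below is illustrative only):

```python
import numpy as np

def mbar_free_energies(u_kn, N_k, tol=1e-10, max_iter=10000):
    """Self-consistent MBAR estimate of dimensionless free energies f_k.

    u_kn[k, n]: reduced potential of pooled sample n evaluated in state k;
    N_k[k]: number of samples drawn from state k. Iterates the fixed point
    f_i = -ln sum_n exp(-u_in) / sum_k N_k exp(f_k - u_kn)."""
    f = np.zeros(u_kn.shape[0])
    for _ in range(max_iter):
        log_den = np.logaddexp.reduce(
            np.log(N_k)[:, None] + f[:, None] - u_kn, axis=0)
        f_new = -np.logaddexp.reduce(-u_kn - log_den[None, :], axis=1)
        f_new -= f_new[0]                       # fix the arbitrary offset
        if np.max(np.abs(f_new - f)) < tol:
            break
        f = f_new
    return f

# Two harmonic states with shifted minima and equal widths: the exact
# free energy difference is zero, which the estimate should recover.
rng = np.random.default_rng(9)
x = np.concatenate([rng.normal(0.0, 1.0, 1000),    # samples from U0 = x^2/2
                    rng.normal(1.0, 1.0, 1000)])   # samples from U1 = (x-1)^2/2
u_kn = np.vstack([x ** 2 / 2.0, (x - 1.0) ** 2 / 2.0])
print(mbar_free_energies(u_kn, np.array([1000, 1000])))   # ~ [0, 0]
```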

  13. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language-processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic, or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
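
    A toy version of the pipeline — token counts, LDA topic mixtures, then an anomaly score — can be sketched with scikit-learn (an assumed toolchain; the log lines and the distance-based score are hypothetical stand-ins for the paper's risk model):

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical network-log "documents": one tokenized flow record per line.
logs = [
    "dns query internal resolver small payload day",
    "http get internal wiki small payload day",
    "https post external host large payload night",
    "https post external host large payload night",
    "dns query internal resolver small payload day",
]

counts = CountVectorizer().fit_transform(logs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
doc_topics = lda.transform(counts)

# Crude risk score: distance of each record's topic mixture from the
# corpus-average mixture (a stand-in for the paper's risk-based scoring).
score = np.abs(doc_topics - doc_topics.mean(axis=0)).sum(axis=1)
print(np.round(score, 2))          # higher = more anomalous
```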

  15. Statistical analysis of shard and canister glass correlation test

    SciTech Connect

    Pulsipher, B.

    1990-12-01

    The vitrification facility at West Valley, New York will be used to incorporate nuclear waste into a vitrified waste form. Waste Acceptance Preliminary Specifications (WAPS) will be used to determine the acceptability of the waste form product. These specifications require chemical characterization of the waste form produced. West Valley Nuclear Services (WVNS) intends to characterize canister contents by obtaining shard samples from the top of the canisters prior to final sealing. A study was conducted to determine whether shard samples taken from the top of canisters filled with vitrified nuclear waste could be considered representative and therefore used to characterize the elemental composition of the entire canister contents. Three canisters produced during the SF-12 melter run conducted at WVNS were thoroughly sampled by core drilling at several axial and radial locations and by obtaining shard samples from the top of the canisters. Chemical analyses were performed and the resulting data were statistically analyzed by Pacific Northwest Laboratory (PNL). If one can assume that the process controls employed by WVNS during the SF-12 run are representative of those to be employed during future melter runs, shard samples can be used to characterize the canister contents. However, if batch-to-batch variations cannot be controlled to the acceptable levels observed from the SF-12 data, the representativeness of shard samples will be in question. The estimates of process and within-canister variations provided herein will prove valuable in determining the required frequency and number of shard samples to meet waste form qualification objectives.

  16. Structure in gamma ray burst time profiles: Statistical Analysis 1

    NASA Technical Reports Server (NTRS)

    Lestrade, John Patrick

    1992-01-01

    Since its launch on April 5, 1991, the Burst And Transient Source Experiment (BATSE) has observed and recorded over 500 gamma-ray bursts (GRBs). The analysis of the time profiles of these bursts has proven to be difficult. Attempts to find periodicities through Fourier analysis have been fruitless except in one celebrated case. Our goal is to be able to quantify the structure of the observed time profiles. Before applying this formalism to bursts, we tested it on profiles composed of random Poissonian noise. This paper is a report of those preliminary results.

  17. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.

  18. Statistical analysis of geodetic networks for detecting regional events

    NASA Technical Reports Server (NTRS)

    Granat, Robert

    2004-01-01

    We present an application of hidden Markov models (HMMs) to analysis of geodetic time series in Southern California. Our model fitting method uses a regularized version of the deterministic annealing expectation-maximization algorithm to ensure that model solutions are both robust and of high quality.
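
    A minimal stand-in for this analysis (the record uses a regularized deterministic-annealing EM; the sketch below instead uses the off-the-shelf hmmlearn package and a synthetic displacement series) shows how an HMM segments a time series into states:

```python
import numpy as np
from hmmlearn import hmm

# Hypothetical daily GPS displacement series (mm): background noise with a
# transient offset episode standing in for a regional event.
rng = np.random.default_rng(10)
series = np.concatenate([rng.normal(0.0, 1.0, 300),
                         rng.normal(5.0, 1.0, 100),
                         rng.normal(0.0, 1.0, 300)])
X = series.reshape(-1, 1)

model = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                        n_iter=200, random_state=0)
model.fit(X)
states = model.predict(X)
print("state changes at indices:", np.flatnonzero(np.diff(states)) + 1)
```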

  19. The Patterns of Teacher Compensation. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Chambers, Jay; Bobbitt, Sharon A.

    This report presents information regarding the patterns of variation in the salaries paid to public and private school teachers in relation to various personal and job characteristics. Specifically, the analysis examines the relationship between compensation and variables such as public/private schools, gender, race/ethnic background, school level…

  20. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as a set of expansion functions for putting both sources in comparative form. The theory was developed for three general cases of dimensional symmetry: a nonsymmetric whole model compared with a nonsymmetric whole structural test; a symmetric analytical portion compared with a symmetric experimental portion; and a symmetric analytical portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is demonstrated on small classical structures.
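
    The record does not give its correlation statistic explicitly; a standard measure for comparing test and analysis mode shapes is the modal assurance criterion (MAC), sketched here as an illustrative stand-in on hypothetical mode shapes:

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal Assurance Criterion between an analytical mode-shape matrix
    phi_a (dof x n modes) and an experimental one phi_e (dof x m modes);
    entries near 1 flag well-correlated analysis/test mode pairs."""
    num = np.abs(phi_a.T @ phi_e) ** 2
    den = np.outer((phi_a * phi_a).sum(axis=0), (phi_e * phi_e).sum(axis=0))
    return num / den

# Hypothetical case: three analytical modes on 10 dof, two measured modes
# that are noisy versions of the first two analytical modes.
rng = np.random.default_rng(11)
phi_a = rng.normal(size=(10, 3))
phi_e = phi_a[:, :2] + rng.normal(0.0, 0.1, (10, 2))
print(np.round(mac(phi_a, phi_e), 2))
```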

  1. Use of Q-Type Factor Analysis with the Aged.

    ERIC Educational Resources Information Center

    Kleban, Morton H.; And Others

    This paper explores Q-Factor Analysis as a method of organizing data on a large array of variables to describe a group of aged Ss. Forty-seven males, specially selected for their good health (Mean Age: 71.5; SD: 4.8) were measured on 550 biological and behavioral variables. A Q-Factor Analysis was calculated, using a S by variable matrix, which is…

  2. Measuring the Success of an Academic Development Programme: A Statistical Analysis

    ERIC Educational Resources Information Center

    Smith, L. C.

    2009-01-01

    This study uses statistical analysis to estimate the impact of first-year academic development courses in microeconomics, statistics, accountancy, and information systems, offered by the University of Cape Town's Commerce Academic Development Programme, on students' graduation performance relative to that achieved by mainstream students. The data…

  3. Statistical Analysis in Evaluation Research: Tools for Investigating Problems in VR Performance.

    ERIC Educational Resources Information Center

    Dodson, Richard; Kogan, Deborah, Ed.

    This report reviews the ways in which statistical analysis can be used as a tool by vocational rehabilitation program managers to investigate the causes of problematic performance and generate strategies for corrective action. Two types of data collection are noted: operational studies and statistical data studies. Descriptions follow of two…

  4. A new statistic for the analysis of circular data in gamma-ray astronomy

    NASA Technical Reports Server (NTRS)

    Protheroe, R. J.

    1985-01-01

    A new statistic is proposed for the analysis of circular data. The statistic is designed specifically for situations where a test of uniformity is required which is powerful against alternatives in which a small fraction of the observations is grouped in a small range of directions, or phases.
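
    For context, the classical baseline such a statistic is judged against is the Rayleigh test, which is powerful against a single broad concentration but, as the sketch below illustrates, only marginally sensitive when a small fraction of events is tightly grouped — exactly the alternative the proposed statistic targets:

```python
import numpy as np

def rayleigh_test(phases):
    """Classical Rayleigh test of uniformity for circular data (radians).
    Returns the mean resultant length R-bar and an approximate p-value
    (exponential formula with the small-sample correction of Wilkie 1983)."""
    n = phases.size
    rbar = np.hypot(np.cos(phases).sum(), np.sin(phases).sum()) / n
    z = n * rbar ** 2
    p = np.exp(-z) * (1.0 + (2.0 * z - z ** 2) / (4.0 * n))
    return rbar, p

# Hypothetical arrival phases: 95% uniform background plus 5% pulsed events
# concentrated in a narrow phase window.
rng = np.random.default_rng(12)
phases = np.concatenate([rng.uniform(0.0, 2.0 * np.pi, 950),
                         rng.normal(1.0, 0.05, 50) % (2.0 * np.pi)])
print(rayleigh_test(phases))
```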

  5. Comparing Methods for Item Analysis: The Impact of Different Item-Selection Statistics on Test Difficulty

    ERIC Educational Resources Information Center

    Jones, Andrew T.

    2011-01-01

    Practitioners often depend on item analysis to select items for exam forms and have a variety of options available to them. These include the point-biserial correlation, the agreement statistic, the B index, and the phi coefficient. Although research has demonstrated that these statistics can be useful for item selection, no research as of yet has…

  6. Statistical and Scientometric Analysis of International Research in Geographical and Environmental Education

    ERIC Educational Resources Information Center

    Papadimitriou, Fivos; Kidman, Gillian

    2012-01-01

    Certain statistical and scientometric features of articles published in the journal "International Research in Geographical and Environmental Education" (IRGEE) are examined in this paper for the period 1992-2009 by applying nonparametric statistics and Shannon's entropy (diversity) formula. The main findings of this analysis are: (a) after 2004,…

  7. ON THE STATISTICAL ANALYSIS OF X-RAY POLARIZATION MEASUREMENTS

    SciTech Connect

    Strohmayer, T. E.; Kallman, T. R.

    2013-08-20

    In many polarimetry applications, including observations in the X-ray band, the measurement of a polarization signal can be reduced to the detection and quantification of a deviation from uniformity of a distribution of measured angles of the form A + B cos²(φ − φ₀) (0 < φ < π). We explore the statistics of such polarization measurements using Monte Carlo simulations and χ² fitting methods. We compare our results to those derived using the traditional probability density used to characterize polarization measurements and quantify how they deviate as the intrinsic modulation amplitude grows. We derive relations for the number of counts required to reach a given detection level (parameterized by β, the "number of σ's" of the measurement) appropriate for measuring the modulation amplitude a by itself (single interesting parameter case) or jointly with the position angle φ (two interesting parameters case). We show that for the former case, when the intrinsic amplitude is equal to the well-known minimum detectable polarization (MDP), it is, on average, detected at the 3σ level. For the latter case, when one requires a joint measurement at the same confidence level, more counts are needed than were required to achieve the MDP level. This additional factor is amplitude-dependent, but is ≈2.2 for intrinsic amplitudes less than about 20%. It decreases slowly with amplitude and is ≈1.8 when the amplitude is 50%. We find that the position angle uncertainty at 1σ confidence is well described by the relation σ_φ = 28.5°/β.

  8. Using the statistical analysis method to assess the landslide susceptibility

    NASA Astrophysics Data System (ADS)

    Chan, Hsun-Chuan; Chen, Bo-An; Wen, Yo-Ting

    2015-04-01

    This study assessed the landslide susceptibility in the Jing-Shan River upstream watershed, central Taiwan. The landslide inventories from typhoons Toraji in 2001, Mindulle in 2004, Kalmaegi and Sinlaku in 2008, and Morakot in 2009, and from the 0719 rainfall event in 2011, established by the Taiwan Central Geological Survey, were used as landslide data. Landslide susceptibility was assessed using different statistical methods: logistic regression, the instability index method, and the support vector machine (SVM). After evaluation, elevation, slope, slope aspect, lithology, terrain roughness, slope roughness, plan curvature, profile curvature, total curvature, and average rainfall were chosen as the landslide factors. The validity of the three established models was further examined using the receiver operating characteristic (ROC) curve. The logistic regression results showed that terrain roughness and slope roughness had the strongest influence on the susceptibility value, while the instability index method pointed to terrain roughness and lithology. The instability index method may, however, underestimate susceptibility near the river side, and it raises a potential issue concerning the number of factor classes: too many classes may cause an excessive coefficient of variation for a factor, while too few may group a large range of nearby cells into the same susceptibility level. Finally, the ROC curve was used to discriminate among the three models. SVM is the preferred method for assessing landslide susceptibility, and it performs comparably to logistic regression in recognizing the medium-high and high susceptibility levels.
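
    The model comparison step can be sketched generically with scikit-learn (an assumed toolchain; the factor matrix and labels below are synthetic stand-ins for the landslide inventory grids):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical grid cells: columns stand for factors such as slope, terrain
# roughness, and rainfall; y = 1 where a landslide was inventoried.
rng = np.random.default_rng(13)
X = rng.normal(size=(2000, 5))
logit = X @ np.array([1.2, 0.8, 0.0, 0.4, 0.0]) - 1.0
y = (rng.uniform(size=2000) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("SVM", SVC(probability=True, random_state=0))]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: ROC AUC = {auc:.3f}")
```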

  9. Characterization of Nuclear Fuel using Multivariate Statistical Analysis

    SciTech Connect

    Robel, M; Robel, M; Robel, M; Kristo, M J; Kristo, M J

    2007-11-27

    Various combinations of reactor type and fuel composition have been characterized using principal components analysis (PCA) of the concentrations of nine U and Pu isotopes in the fuel as a function of burnup. The use of PCA allows the reduction of the 9-dimensional data (isotopic concentrations) into a 3-dimensional approximation, giving a visual representation of the changes in nuclear fuel composition with burnup. Real-world variation in the concentrations of ²³⁴U and ²³⁶U in the fresh (unirradiated) fuel was accounted for. The effects of reprocessing were also simulated. The results suggest that, even after reprocessing, Pu isotopes can be used to determine both the type of reactor and the initial fuel composition with good discrimination. Finally, partial least squares discriminant analysis (PLSDA) was investigated as a substitute for PCA. Our results suggest that PLSDA is a better tool for this application, where separation between known classes is most important.

  10. Statistical theory and methodology for remote sensing data analysis

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1974-01-01

    A model is developed for the evaluation of acreages (proportions) of different crop types over a geographical area using a classification approach, and methods for estimating the crop acreages are given. In estimating the acreage of a specific crop type such as wheat, it is suggested to treat the problem as a two-crop problem, wheat vs. non-wheat, since this simplifies the estimation problem considerably. The error analysis and the sample size problem are investigated for the two-crop approach. Certain numerical results for sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large-area crop acreage inventory, a sampling scheme is suggested for acquiring sample data, and the problem of crop acreage estimation and the error analysis is discussed.

  11. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  12. Prevalence of atopic dermatitis in Korea: analysis by using national statistics.

    PubMed

    Yu, Jung-Seok; Lee, Chang-Jong; Lee, Ho-Seok; Kim, Jihyun; Han, Youngshin; Ahn, Kangmo; Lee, Sang-Il

    2012-06-01

    We attempted to investigate the prevalence of atopic dermatitis (AD) in Korea by using national statistics. Data on AD patients who received medical services at least once a year from 2003 through 2008 were collected from the health insurance research team of the National Health Insurance Corporation. Data on estimated populations during the same period were obtained from Statistics Korea. In 2008, the prevalence of AD was 26.5% in those aged 12-23 months and decreased substantially to 7.6% at age 6 yr, 3.4% at age 12 yr, and 2.4% at age 18 yr. In males, the prevalence was higher than in females until 2 yr of age, while the opposite was observed in children aged 2 yr or older. In children aged less than 24 months, the prevalence of AD increased from 19.8% to 23.8% between 2003 and 2008, while the prevalence showed no increase in the older age group. In conclusion, the prevalence of AD in 2008 peaked during infancy at up to 26.5% and decreased thereafter. Our findings also suggest that the increasing prevalence of AD in children less than 24 months might be responsible for the recent increase in the prevalence of AD in Korean children. PMID:22690101

  13. A new statistical analysis of rare earth element diffusion data in garnet

    NASA Astrophysics Data System (ADS)

    Chu, X.; Ague, J. J.

    2015-12-01

    The incorporation of rare earth elements (REE) in garnet, Sm and Lu in particular, links garnet chemical zoning to absolute age determinations. The application of REE-based geochronology depends critically on the diffusion behaviors of the parent and daughter isotopes. Previous experimental studies on REE diffusion in garnet, however, exhibit significant discrepancies that impact interpretations of garnet Sm/Nd and Lu/Hf ages.We present a new statistical framework to analyze diffusion data for REE using an Arrhenius relationship that accounts for oxygen fugacity, cation radius and garnet unit-cell dimensions [1]. Our approach is based on Bayesian statistics and is implemented by the Markov chain Monte Carlo method. A similar approach has been recently applied to model diffusion of divalent cations in garnet [2]. The analysis incorporates recent data [3] in addition to the data compilation in ref. [1]. We also include the inter-run bias that helps reconcile the discrepancies among data sets. This additional term estimates the reproducibility and other experimental variabilities not explicitly incorporated in the Arrhenius relationship [2] (e.g., compositional dependence [3] and water content).The fitted Arrhenius relationships are consistent with the models in ref. [3], as well as refs. [1]&[4] at high temperatures. Down-temperature extrapolation leads to >0.5 order of magnitude faster diffusion coefficients than in refs. [1]&[4] at <750 °C. The predicted diffusion coefficients are significantly slower than ref. [5]. The fast diffusion [5] was supported by a field test of the Pikwitonei Granulite—the garnet Sm/Nd age postdates the metamorphic peak (750 °C) by ~30 Myr [6], suggesting considerable resetting of the Sm/Nd system during cooling. However, the Pikwitonei Granulite is a recently recognized UHT terrane with peak temperature exceeding 900 °C [7]. The revised closure temperature (~730 °C) is consistent with our new diffusion model.[1] Carlson (2012) Am
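
    A minimal sketch of a Bayesian Arrhenius fit of this kind — a flat-prior Metropolis sampler over (ln D0, Ea) — is given below; the temperatures, uncertainties, and priors are hypothetical, and the full analysis in the record also models oxygen fugacity, cation radius, unit-cell dimensions, and inter-run bias:

```python
import numpy as np

R = 8.314  # J/(mol K)

def log_posterior(theta, inv_t, ln_d, sigma):
    """Gaussian log-likelihood for the Arrhenius law ln D = ln D0 - Ea/(R*T),
    with flat priors over a broad physical range (an illustrative choice)."""
    ln_d0, ea = theta                              # ea in J/mol
    if not (-60.0 < ln_d0 < 0.0 and 1.0e4 < ea < 1.0e6):
        return -np.inf
    resid = ln_d - (ln_d0 - ea / R * inv_t)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Hypothetical runs: four temperatures, measured ln D with an inter-run
# scatter term folded into sigma.
rng = np.random.default_rng(14)
inv_t = 1.0 / np.array([1473.0, 1373.0, 1273.0, 1173.0])
ln_d = (-10.0 - 3.0e5 / R * inv_t) + rng.normal(0.0, 0.3, 4)
sigma = np.full(4, 0.3)

# Minimal Metropolis sampler over (ln D0, Ea).
theta = np.array([-10.0, 3.0e5])
lp = log_posterior(theta, inv_t, ln_d, sigma)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.5, 1.0e4])
    lp_prop = log_posterior(prop, inv_t, ln_d, sigma)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])
print("ln D0 =", round(chain[:, 0].mean(), 2),
      " Ea =", round(chain[:, 1].mean() / 1e3, 1), "kJ/mol")
```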

  14. Space Shuttle Columbia Aging Wiring Failure Analysis

    NASA Technical Reports Server (NTRS)

    McDaniels, Steven J.

    2005-01-01

    A Space Shuttle Columbia main engine controller 14 AWG wire short circuited during the launch of STS-93. Post-flight examination divulged that the wire had electrically arced against the head of a nearby bolt. More extensive inspection revealed additional damage to the subject wire, and to other wires as well from the mid-body of Columbia. The shorted wire was to have been constructed from nickel-plated copper conductors surrounded by the polyimide insulation Kapton, top-coated with an aromatic polyimide resin. The wires were analyzed via scanning electron microscope (SEM), energy dispersive X-Ray spectroscopy (EDX), and electron spectroscopy for chemical analysis (ESCA); differential scanning calorimetry (DSC) and thermal gravimetric analysis (TGA) were performed on the polyimide. Exemplar testing under laboratory conditions was performed to replicate the mechanical damage characteristics evident on the failed wires. The exemplar testing included a step test, where, as the name implies, a person stepped on a simulated wire bundle that rested upon a bolt head. Likewise, a shear test that forced a bolt head and a torque tip against a wire was performed to attempt to damage the insulation and conductor. Additionally, a vibration test was performed to determine if a wire bundle would abrade when vibrated against the head of a bolt. Also, an abrasion test was undertaken to determine if the polyimide of the wire could be damaged by rubbing against convolex helical tubing. Finally, an impact test was performed to ascertain if the use of the tubing would protect the wire from the strike of a foreign object.

  15. Statistical analysis of ionosphere parameters and atmospheric pressure correlations

    NASA Astrophysics Data System (ADS)

    Voloskov, Dmitriy; Bochkarev, Vladimir; Maslennikova, Yulia; Zagidullin, Bulat

    Ionosphere parameters such as total electron content (TEC) and Doppler frequency shift characterize the influence of the ionosphere on signal propagation, and therefore information about these parameters is important for radio communication tasks. Meteorological effects such as atmospheric pressure variations can influence ionosphere parameters. This work is dedicated to the analysis of correlations between meteorological and ionosphere parameters. NCEP/NCAR reanalysis meteorological maps, Jet Propulsion Laboratory (JPL) global TEC maps, and data from the Doppler phase goniometric complex “Spectr” were analysed. Data for 2009-2011 were investigated. Coherent oscillations with periods of 29-32 and 4 days were detected in atmospheric pressure and Doppler frequency shift variations.

  16. Statistical magnetic anomalies from satellite measurements for geologic analysis

    NASA Technical Reports Server (NTRS)

    Goyal, H. K.; Vonfrese, R. R. B.; Hinze, W. J.

    1985-01-01

    The errors of numerically averaging satellite magnetic anomaly data for geologic analysis are investigated using orbital anomaly simulations of crustal magnetic sources by Gauss-Legendre quadrature integration. These simulations suggest that numerical averaging errors constitute small and relatively minor contributions to the total error-budget of higher orbital estimates (approx. 400 km), whereas for lower orbital estimates the error of averaging may increase substantially. Least-squares collocation is also investigated as an alternative to numerical averaging and found to produce substantially more accurate anomaly estimates as the elevation of prediction is decreased towards the crustal sources.

  17. Introducing Statistics to Geography Students: The Case for Exploratory Data Analysis.

    ERIC Educational Resources Information Center

    Burn, Christopher R.; Fox, Michael F.

    1986-01-01

    Exploratory data analysis (EDA) gives students a feel for the data being considered. Four applications of EDA are discussed: the use of displays, resistant statistics, transformations, and smoothing. (RM)

  18. STATISTICAL METHODOLOGY FOR THE SIMULTANEOUS ANALYSIS OF MULTIPLE TYPES OF OUTCOMES IN NONLINEAR THRESHOLD MODELS.

    EPA Science Inventory

    Multiple outcomes are often measured on each experimental unit in toxicology experiments. These multiple observations typically imply the existence of correlation between endpoints, and a statistical analysis that incorporates it may result in improved inference. When both disc...

  19. A Comparative Study of Normalization Methods Used in Statistical Analysis of Oligonucleotide Microarray Data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Normalization methods used in the statistical analysis of oligonucleotide microarray data were evaluated. The oligonucleotide microarray is considered an efficient analytical tool for analyzing thousands of genes simultaneously in a single experiment. However, systematic variation in microarray, ori...

  20. A critique of the usefulness of inferential statistics in applied behavior analysis

    PubMed Central

    Hopkins, B. L.; Cole, Brian L.; Mason, Tina L.

    1998-01-01

    Researchers continue to recommend that applied behavior analysts use inferential statistics in making decisions about effects of independent variables on dependent variables. In many other approaches to behavioral science, inferential statistics are the primary means for deciding the importance of effects. Several possible uses of inferential statistics are considered. Rather than being an objective means for making decisions about effects, as is often claimed, inferential statistics are shown to be subjective. It is argued that the use of inferential statistics adds nothing to the complex and admittedly subjective nonstatistical methods that are often employed in applied behavior analysis. Attacks on inferential statistics that are being made, perhaps with increasing frequency, by those who are not behavior analysts, are discussed. These attackers are calling for banning the use of inferential statistics in research publications and commonly recommend that behavioral scientists should switch to using statistics aimed at interval estimation or the method of confidence intervals. Interval estimation is shown to be contrary to the fundamental assumption of behavior analysis that only individuals behave. It is recommended that authors who wish to publish the results of inferential statistics be asked to justify them as a means for helping us to identify any ways in which they may be useful. PMID:22478304

  1. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
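
    As an illustration of the general approach (not the authors' exact procedure), the sketch below compares two samples of comparison scores with a likelihood-ratio test, assuming for simplicity that the scores in each sample are approximately normal; all values and names are hypothetical.

      import numpy as np
      from scipy import stats

      def log_lik_normal(x, mu, sigma):
          return np.sum(stats.norm.logpdf(x, mu, sigma))

      def lr_test(lab_vs_field, lab_vs_lab):
          """Likelihood-ratio test: do the two samples of comparison scores
          share one normal distribution (H0) or each have their own (H1)?"""
          pooled = np.concatenate([lab_vs_field, lab_vs_lab])
          ll0 = log_lik_normal(pooled, pooled.mean(), pooled.std())
          ll1 = (log_lik_normal(lab_vs_field, lab_vs_field.mean(), lab_vs_field.std())
                 + log_lik_normal(lab_vs_lab, lab_vs_lab.mean(), lab_vs_lab.std()))
          lam = 2.0 * (ll1 - ll0)                # likelihood-ratio statistic
          return lam, stats.chi2.sf(lam, df=2)   # 2 extra free parameters under H1

      lab_vs_field = np.array([0.41, 0.38, 0.45, 0.52, 0.36])  # hypothetical scores
      lab_vs_lab   = np.array([0.71, 0.65, 0.69, 0.74, 0.68])
      print(lr_test(lab_vs_field, lab_vs_lab))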

  2. Statistical Analysis of Shear Wave Speed in the Uterine Cervix

    PubMed Central

    Carlson, Lindsey C.; Feltovich, Helen; Palmeri, Mark L.; del Rio, Alejandro Muñoz; Hall, Timothy J.

    2014-01-01

    Although cervical softening is critical in pregnancy, there currently is no objective method for assessing the softness of the cervix. Shear wave speed (SWS) estimation is a noninvasive tool used to measure tissue mechanical properties such as stiffness. The goal of this study was to determine the spatial variability and assess the ability of SWS to classify ripened vs. unripened tissue samples. Ex vivo human hysterectomy samples (n = 22) were collected, a subset (n = 13) were ripened. SWS estimates were made at 4–5 locations along the length of the canal on both anterior and posterior halves. A linear mixed model was used for a robust multivariate analysis. Receiver operating characteristic (ROC) analysis and the area under the ROC curve (AUC) were calculated to describe the utility of SWS to classify ripened vs. unripened tissue samples. Results showed that all variables used in the linear mixed model were significant (p<0.05). Estimates at the mid location for the unripened group were 3.45 ± 0.95 m/s (anterior) and 3.56 ± 0.92 m/s (posterior), and 2.11 ± 0.45 m/s (anterior) and 2.68 ± 0.57 m/s (posterior) for the ripened (p < 0.001). The AUCs were 0.91 and 0.84 for anterior and posterior, respectively, suggesting SWS estimates may be useful for quantifying cervical softening. PMID:25392863
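
    A minimal sketch of the classification step: given SWS estimates and ripened/unripened labels (all values below are hypothetical), scikit-learn produces the ROC curve and AUC directly.

      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      sws    = np.array([3.4, 3.6, 3.1, 3.5, 2.2, 2.0, 2.7, 2.4])  # m/s
      labels = np.array([1,   1,   1,   1,   0,   0,   0,   0])    # 1 = unripened

      auc = roc_auc_score(labels, sws)      # higher SWS -> stiffer, unripened
      fpr, tpr, thresholds = roc_curve(labels, sws)
      print(f"AUC = {auc:.2f}")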

  3. Statistical analysis of shear wave speed in the uterine cervix.

    PubMed

    Carlson, Lindsey C; Feltovich, Helen; Palmeri, Mark L; del Rio, Alejandro Muñoz; Hall, Timothy J

    2014-10-01

    Although cervical softening is critical in pregnancy, there currently is no objective method for assessing the softness of the cervix. Shear wave speed (SWS) estimation is a noninvasive tool used to measure tissue mechanical properties such as stiffness. The goal of this study was to determine the spatial variability and assess the ability of SWS to classify ripened versus unripened tissue samples. Ex vivo human hysterectomy samples (n = 22) were collected; a subset (n = 13) were ripened. SWS estimates were made at 4 to 5 locations along the length of the canal on both anterior and posterior halves. A linear mixed model was used for a robust multivariate analysis. Receiver operating characteristic (ROC) analysis and the area under the ROC curve (AUC) were calculated to describe the utility of SWS to classify ripened versus unripened tissue samples. Results showed that all variables used in the linear mixed model were significant ( p < 0.05). Estimates at the mid location for the unripened group were 3.45 ± 0.95 m/s (anterior) and 3.56 ± 0.92 m/s (posterior), and 2.11 ± 0.45 m/s (anterior) and 2.68 ± 0.57 m/s (posterior) for the ripened ( p < 0.001). The AUCs were 0.91 and 0.84 for anterior and posterior, respectively, suggesting that SWS estimates may be useful for quantifying cervical softening. PMID:25392863

  4. Analysis of compressive fracture in rock using statistical techniques

    SciTech Connect

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.

  5. New acquisition techniques and statistical analysis of bubble size distributions

    NASA Astrophysics Data System (ADS)

    Proussevitch, A.; Sahagian, D.

    2005-12-01

    Various approaches have been taken to solve the long-standing problem of determining size distributions of objects embedded in an opaque medium. In the case of vesicles in volcanic rocks, the most reliable technique is 3-D imagery by computed X-ray tomography. However, this method is expensive, requires intensive computational resources, and is thus not always available to an investigator. As a cheaper alternative, 2-D cross-sectional data are commonly available, but they require stereological analysis for 3-D conversion. Stereology for spherical bubbles is quite robust, but elongated non-spherical bubbles require complicated conversion approaches and large observed populations. We have revised computational schemes for applying non-spherical stereology to practical analysis of bubble size distributions. The basic idea of this new approach is to exclude from the conversion those classes (bins) of non-spherical bubbles whose cross-section probability distribution exceeds a maximum value that depends on the mean aspect ratio. Thus, in contrast to traditional stereological techniques, larger bubbles are "predicted" from the rest of the population. As a proof of principle, we have compared distributions so obtained with direct 3-D imagery (X-ray tomography) for non-spherical bubbles from the same samples of vesicular basalts collected from the Colorado Plateau. The results of the comparison demonstrate that in cases where X-ray tomography is impractical, stereology can be used with reasonable reliability, even for non-spherical vesicles.

  6. Meta-analysis for Discovering Rare-Variant Associations: Statistical Methods and Software Programs

    PubMed Central

    Tang, Zheng-Zheng; Lin, Dan-Yu

    2015-01-01

    There is heightened interest in using next-generation sequencing technologies to identify rare variants that influence complex human diseases and traits. Meta-analysis is essential to this endeavor because large sample sizes are required for detecting associations with rare variants. In this article, we provide a comprehensive overview of statistical methods for meta-analysis of sequencing studies for discovering rare-variant associations. Specifically, we discuss the calculation of relevant summary statistics from participating studies, the construction of gene-level association tests, the choice of transformation for quantitative traits, the use of fixed-effects versus random-effects models, and the removal of shadow association signals through conditional analysis. We also show that meta-analysis based on properly calculated summary statistics is as powerful as joint analysis of individual-participant data. In addition, we demonstrate the performance of different meta-analysis methods by using both simulated and empirical data. We then compare four major software packages for meta-analysis of rare-variant associations—MASS, RAREMETAL, MetaSKAT, and seqMeta—in terms of the underlying statistical methodology, analysis pipeline, and software interface. Finally, we present PreMeta, a software interface that integrates the four meta-analysis packages and allows a consortium to combine otherwise incompatible summary statistics. PMID:26094574

  7. Statistical analysis of the ambiguities in the asteroid period determinations

    NASA Astrophysics Data System (ADS)

    Butkiewicz, M.; Kwiatkowski, T.; Bartczak, P.; Dudziński, G.

    2014-07-01

    A synodic period of an asteroid can be derived from its lightcurve by standard methods like Fourier-series fitting. A problem arises when the observations cover less than a full lightcurve and/or suffer from a high level of noise. Also, long gaps between individual lightcurves create an ambiguity in the cycle count, which leads to aliases. Excluding binary systems and objects with non-principal-axis rotation, the rotation period is usually identical to the period of the second Fourier harmonic of the lightcurve. There are cases, however, where it may be connected with the 1st, 3rd, or 4th harmonic, and it is difficult to choose among them when searching for the period. To help remove such uncertainties we analysed asteroid lightcurves for a range of shapes and observing/illuminating geometries. We simulated them using a modified internal code from the ISAM service (Marciniak et al. 2012, A&A 545, A131). In our computations, shapes of asteroids were modeled as Gaussian random spheres (Muinonen 1998, A&A, 332, 1087). A combination of Lommel-Seeliger and Lambert scattering laws was assumed. For each of the 100 shapes, we randomly selected 1000 positions of the spin axis, systematically changing the solar phase angle with a step of 5°. For each lightcurve, we determined its peak-to-peak amplitude, fitted the 6th-order Fourier series and derived the amplitudes of its harmonics. Instead of the number of the lightcurve extrema, which in many cases is subjective, we characterized each lightcurve by the order of the highest-amplitude Fourier harmonic. The goal of our simulations was to derive statistically significant conclusions (based on the underlying assumptions) about the dominance of different harmonics in the lightcurves of the specified amplitude and phase angle. The results, presented in the Figure, can be used in individual cases to estimate the probability that the obtained lightcurve is dominated by a specified Fourier harmonic. Some of the
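
    The harmonic bookkeeping can be reproduced with an ordinary least-squares Fourier fit. The sketch below (hypothetical data, not the ISAM code) fits a 6th-order series to a phased lightcurve and reports the highest-amplitude harmonic.

      import numpy as np

      def dominant_harmonic(phase, flux, order=6):
          """Fit a Fourier series to a phased lightcurve (phase in [0, 1))
          and return the index of the highest-amplitude harmonic."""
          cols = [np.ones_like(phase)]
          for k in range(1, order + 1):
              cols += [np.cos(2 * np.pi * k * phase), np.sin(2 * np.pi * k * phase)]
          A = np.column_stack(cols)
          coef, *_ = np.linalg.lstsq(A, flux, rcond=None)
          amps = np.hypot(coef[1::2], coef[2::2])   # amplitude of each harmonic
          return int(np.argmax(amps)) + 1, amps

      rng = np.random.default_rng(0)
      phase = rng.uniform(0, 1, 200)
      flux = 1.0 + 0.3 * np.cos(4 * np.pi * phase) + rng.normal(0, 0.02, 200)
      print(dominant_harmonic(phase, flux))   # dominated by the 2nd harmonic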

  8. An improved method for statistical analysis of raw accelerator mass spectrometry data

    SciTech Connect

    Gutjahr, A.; Phillips, F.; Kubik, P.W.; Elmore, D.

    1987-01-01

    Hierarchical statistical analysis is an appropriate method for statistical treatment of raw accelerator mass spectrometry (AMS) data. Using Monte Carlo simulations we show that this method yields more accurate estimates of isotope ratios and analytical uncertainty than the generally used propagation of errors approach. The hierarchical analysis is also useful in design of experiments because it can be used to identify sources of variability. 8 refs., 2 figs.

  9. Comparability of mixed IC₅₀ data - a statistical analysis.

    PubMed

    Kalliokoski, Tuomo; Kramer, Christian; Vulpetti, Anna; Gedeck, Peter

    2013-01-01

    The biochemical half maximal inhibitory concentration (IC50) is the most commonly used metric for on-target activity in lead optimization. It is used to guide lead optimization and to build large-scale chemogenomics, off-target activity, and toxicity models based on public data. However, the use of public biochemical IC50 data is problematic, because the values are assay specific and comparable only under certain conditions. For large-scale analysis it is not feasible to check each data entry manually, and it is very tempting to mix all available IC50 values from public databases even if assay information is not reported. As previously reported for Ki database analysis, we first analyzed the types of errors, the redundancy, and the variability that can be found in the ChEMBL IC50 database. To assess the variability of IC50 data independently measured in two different labs, we searched ChEMBL for identical protein-ligand systems with at least ten IC50 measurements against the same target. As an insufficient number of cases of this type was available, the variability of IC50 data was instead assessed by comparing all pairs of independent IC50 measurements on identical protein-ligand systems. The standard deviation of IC50 data is only 25% larger than the standard deviation of Ki data, suggesting that mixing IC50 data from different assays, even without knowing the details of the assay conditions, adds only a moderate amount of noise to the overall data. The standard deviation of public ChEMBL IC50 data was, as expected, greater than the standard deviation of in-house intra-laboratory/inter-day IC50 data. Augmenting mixed public IC50 data with public Ki data does not deteriorate the quality of the mixed IC50 data, provided the Ki values are corrected by an offset. For a broad dataset such as the ChEMBL database, a Ki-IC50 conversion factor of 2 was found to be the most reasonable. PMID:23613770
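
    The core variability estimate can be reproduced in a few lines: if the same protein-ligand system is measured independently twice, the standard deviation of the paired differences overestimates the per-measurement noise by a factor of sqrt(2). A minimal sketch with hypothetical pIC50 values:

      import numpy as np

      # Hypothetical pIC50 pairs: two independent measurements of the
      # same protein-ligand system reported by different assays/labs.
      assay_a = np.array([6.1, 7.3, 5.8, 8.0, 6.6, 7.1])
      assay_b = np.array([6.4, 7.0, 6.1, 7.7, 6.9, 7.4])

      diff = assay_a - assay_b
      # The difference of two independent measurements has SD sigma * sqrt(2).
      sigma_single = diff.std(ddof=1) / np.sqrt(2)
      print(f"estimated per-measurement SD: {sigma_single:.2f} pIC50 units")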

  10. Accounting for Multiple Sources of Uncertainty in the Statistical Analysis of Holocene Sea Levels

    NASA Astrophysics Data System (ADS)

    Cahill, N.; Parnell, A. C.; Kemp, A.; Horton, B.

    2014-12-01

    We perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level change, to determine when modern rates of rise began and to observe how these rates have evolved over time. Many current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and it can ignore uncertainties that arise as part of the data collection exercise. This can lead to overconfidence in the sea-level trends being characterized. The proposed model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level are changing over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, the model is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method allows for the estimation of the rate process with full consideration of all sources of uncertainty. The model captures the continuous and dynamic evolution of sea-level change and results show that modern rates of rise are consistently increasing. Analysis of a global tide-gauge record (Church and White, 2011) indicated that the rate of sea-level rise increased continuously since 1880 AD and is currently 1.9 mm/yr (95% credible interval of 1.84 to 2.03 mm/yr). Applying the model to a proxy reconstruction from North Carolina (Kemp et al., 2011) indicated that the mean rate of rise in this locality since the middle of the 19th century (current rate of 2.44 mm/yr with a 95% credible interval of 1.91 to 3.01 mm/yr) is unprecedented in at least the last 2000 years.
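
    A toy version of the model's key construction, placing the GP prior on the rate and treating each observed sea level as the integral of the rate up to the observation time. It assumes a squared-exponential kernel, a fixed noise level, no age errors and no GIA term (all simplifications relative to the paper), and all numbers below are hypothetical.

      import numpy as np

      def rbf(a, b, ell=300.0, var=1.0):
          return var * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

      t_grid = np.linspace(0.0, 2000.0, 400)           # years AD
      dt = t_grid[1] - t_grid[0]
      K = rbf(t_grid, t_grid)                          # prior cov of the rate

      t_obs = np.array([500.0, 1000.0, 1500.0, 1900.0, 1950.0, 2000.0])
      y_obs = np.array([0.10, 0.22, 0.35, 0.55, 0.66, 0.80])  # hypothetical, m
      sigma = 0.02                                             # obs noise, m

      # Row i of A integrates the rate from time 0 up to t_obs[i] (rectangle rule).
      A = (t_grid[None, :] <= t_obs[:, None]) * dt

      C = A @ K @ A.T + sigma**2 * np.eye(len(t_obs))  # cov of observed levels
      w = np.linalg.solve(C, y_obs)
      rate_post_mean = K @ A.T @ w                     # posterior mean of the rate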

  11. Statistical analysis of the temporal properties of BL Lacertae

    NASA Astrophysics Data System (ADS)

    Guo, Yu Cheng; Hu, Shao Ming; Li, Yu Tong; Chen, Xu

    2016-08-01

    A comprehensive temporal analysis has been performed on optical light curves of BL Lacertae in the B, V and R bands. The light curves were denoised by Gaussian smoothing and decomposed into individual flares using an exponential profile. The asymmetry, duration, peak flux and equivalent energy output of flares were measured and the frequency distributions presented. Most optical flares of BL Lacertae are highly symmetric, with a weak tendency towards gradual rises and rapid decays. The distribution of flare durations is not random, but consistent with a gamma distribution. Peak fluxes and energy outputs of flares all follow a log-normal distribution. A positive correlation is detected between flare durations and peak fluxes. The temporal properties of BL Lacertae provide evidence of the stochastic magnetohydrodynamic process in the accretion disc and jet. The results presented here can serve as constraints on physical models attempting to interpret blazar variations.
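
    The distributional claims can be checked with standard maximum-likelihood fits; a brief sketch with hypothetical flare measurements (a gamma fit for durations, a log-normal fit for peak fluxes, each followed by a one-sample KS test):

      import numpy as np
      from scipy import stats

      durations = np.array([3.1, 5.4, 2.2, 7.9, 4.3, 6.1, 3.8, 5.0])     # days
      peaks = np.array([12.1, 25.3, 9.8, 40.2, 18.7, 30.1, 15.5, 22.0])  # mJy

      a, loc, scale = stats.gamma.fit(durations, floc=0)    # gamma MLE
      s, loc2, scale2 = stats.lognorm.fit(peaks, floc=0)    # log-normal MLE

      # Goodness of fit via one-sample Kolmogorov-Smirnov tests
      print(stats.kstest(durations, "gamma", args=(a, loc, scale)))
      print(stats.kstest(peaks, "lognorm", args=(s, loc2, scale2)))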

  12. STATISTICAL ANALYSIS OF THE VERY QUIET SUN MAGNETISM

    SciTech Connect

    Martinez Gonzalez, M. J.; Manso Sainz, R.; Asensio Ramos, A.

    2010-03-10

    The behavior of the observed polarization amplitudes with spatial resolution is a strong constraint on the nature and organization of solar magnetic fields below the resolution limit. We study the polarization of the very quiet Sun at different spatial resolutions using ground- and space-based observations. It is shown that 80% of the observed polarization signals do not change with spatial resolution, suggesting that, observationally, the very quiet Sun magnetism remains the same despite the high spatial resolution of space-based observations. Our analysis also reveals a cascade of spatial scales for the magnetic field within the resolution element. It is manifest that the Zeeman effect is sensitive to the microturbulent field usually associated with Hanle diagnostics. This demonstrates that Zeeman and Hanle studies show complementary perspectives of the same magnetism.

  13. Statistical Analysis of Temple Orientation in Ancient India

    NASA Astrophysics Data System (ADS)

    Aller, Alba; Belmonte, Juan Antonio

    2015-05-01

    The great diversity of religions that have been followed in India for over 3000 years is the reason why there are hundreds of temples built to worship dozens of different divinities. In this work, more than one hundred temples geographically distributed over the whole Indian land have been analyzed, obtaining remarkable results. For this purpose, a deep analysis of the main deities who are worshipped in each of them, as well as of the different dynasties (or cultures) who built them has also been conducted. As a result, we have found that the main axes of the temples dedicated to Shiva seem to be oriented to the east cardinal point while those temples dedicated to Vishnu would be oriented to both the east and west cardinal points. To explain these cardinal directions we propose to look back to the origins of Hinduism. Besides these cardinal orientations, clear solar orientations have also been found, especially at the equinoctial declination.

  14. Ordinary chondrites - Multivariate statistical analysis of trace element contents

    NASA Technical Reports Server (NTRS)

    Lipschutz, Michael E.; Samuels, Stephen M.

    1991-01-01

    The contents of mobile trace elements (Co, Au, Sb, Ga, Se, Rb, Cs, Te, Bi, Ag, In, Tl, Zn, and Cd) in Antarctic and non-Antarctic populations of H4-6 and L4-6 chondrites, were compared using standard multivariate discriminant functions borrowed from linear discriminant analysis and logistic regression. A nonstandard randomization-simulation method was developed, making it possible to carry out probability assignments on a distribution-free basis. Compositional differences were found both between the Antarctic and non-Antarctic H4-6 chondrite populations and between two L4-6 chondrite populations. It is shown that, for various types of meteorites (in particular, for the H4-6 chondrites), the Antarctic/non-Antarctic compositional difference is due to preterrestrial differences in the genesis of their parent materials.
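
    In the same distribution-free spirit as the randomization-simulation method described above (though not identical to it), a permutation test assigns a probability to an observed compositional difference without distributional assumptions; a minimal sketch with hypothetical trace-element values:

      import numpy as np

      rng = np.random.default_rng(0)

      def permutation_p(group_a, group_b, n_perm=10_000):
          """Distribution-free p-value for a difference in group means."""
          observed = abs(group_a.mean() - group_b.mean())
          pooled = np.concatenate([group_a, group_b])
          n_a, count = len(group_a), 0
          for _ in range(n_perm):
              rng.shuffle(pooled)
              if abs(pooled[:n_a].mean() - pooled[n_a:].mean()) >= observed:
                  count += 1
          return (count + 1) / (n_perm + 1)

      antarctic     = np.array([1.2, 0.9, 1.4, 1.1, 1.3])  # hypothetical ppb
      non_antarctic = np.array([0.8, 0.7, 1.0, 0.6, 0.9])
      print(permutation_p(antarctic, non_antarctic))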

  15. Spectral reflectance of surface soils - A statistical analysis

    NASA Technical Reports Server (NTRS)

    Crouse, K. R.; Henninger, D. L.; Thompson, D. R.

    1983-01-01

    The relationship of the physical and chemical properties of soils to their spectral reflectance as measured at six wavebands of Thematic Mapper (TM) aboard NASA's Landsat-4 satellite was examined. The results of performing regressions of over 20 soil properties on the six TM bands indicated that organic matter, water, clay, cation exchange capacity, and calcium were the properties most readily predicted from TM data. The middle infrared bands, bands 5 and 7, were the best bands for predicting soil properties, and the near infrared band, band 4, was nearly as good. Clustering 234 soil samples on the TM bands and characterizing the clusters on the basis of soil properties revealed several clear relationships between properties and reflectance. Discriminant analysis found organic matter, fine sand, base saturation, sand, extractable acidity, and water to be significant in discriminating among clusters.

  16. Statistical Analysis on Temporal Properties of BL Lacertae

    NASA Astrophysics Data System (ADS)

    Guo, Yu Cheng; Hu, Shao Ming; Li, Yu Tong; Chen, Xu

    2016-04-01

    A comprehensive temporal analysis has been performed on optical light curves of BL Lacertae in the B, V and R bands. The light curves were denoised by Gaussian smoothing and decomposed into individual flares using an exponential profile. The asymmetry, duration, peak flux and equivalent energy output of flares were measured and the frequency distributions are presented. Most optical flares of BL Lacertae are highly symmetric, with a weak tendency towards gradual rises and rapid decays. The distribution of flare durations is not random but consistent with a gamma distribution. Peak fluxes and energy outputs of flares all follow a lognormal distribution. A positive correlation is detected between flare durations and peak fluxes. The temporal properties of BL Lacertae provide evidence of the stochastic magnetohydrodynamic process in the accretion disk and jet. The results presented here can serve as constraints on physical models attempting to interpret blazar variations.

  17. Statistical Analysis of Factors Affecting Child Mortality in Pakistan.

    PubMed

    Ahmed, Zoya; Kamal, Asifa; Kamal, Asma

    2016-06-01

    Child mortality is a composite indicator reflecting a country's economic, social, and environmental conditions as well as its healthcare services and their delivery. Globally, Pakistan has the third highest burden of fetal, maternal, and child mortality. Factors affecting child mortality in Pakistan are investigated using binary logistic regression analysis. Region, mother's education, birth order, preceding birth interval (the period between the previous child birth and the index child birth), size of child at birth, breastfeeding, and family size were found to be significantly associated with child mortality in Pakistan. Child mortality decreased as the mother's education level, preceding birth interval, size of child at birth, and family size increased. Child mortality was found to be significantly higher in Balochistan than in other regions. Child mortality was low for low birth orders. Child survival was significantly higher for children who were breastfed compared with those who were not. PMID:27354000
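
    A minimal sketch of such a binary logistic regression (hypothetical, coarsely encoded covariates; the real analysis uses survey data and more careful coding):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # columns: mother_edu_years, birth_order, preceding_interval_months,
      #          birth_size_score, breastfed (0/1)
      X = np.array([[0, 4, 12, 1, 0], [8, 2, 30, 2, 1], [12, 1, 36, 3, 1],
                    [0, 6, 10, 1, 0], [5, 3, 24, 2, 1], [10, 2, 28, 3, 1]])
      y = np.array([1, 0, 0, 1, 0, 0])      # 1 = child died before age five

      model = LogisticRegression().fit(X, y)
      odds_ratios = np.exp(model.coef_[0])  # per-unit effect on the odds
      print(dict(zip(["edu", "order", "interval", "size", "breastfed"],
                     odds_ratios.round(2))))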

  18. GIS application on spatial landslide analysis using statistical based models

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis on Penang Island, Malaysia. Landslide locations within the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.

  19. Statistical Analysis of Acoustic Wave Parameters Near Solar Active Regions

    NASA Astrophysics Data System (ADS)

    Rabello-Soares, M. Cristina; Bogart, Richard S.; Scherrer, Philip H.

    2016-08-01

    In order to quantify the influence of magnetic fields on acoustic mode parameters and flows in and around active regions, we analyze the differences in the parameters in magnetically quiet regions near an active region (which we call “nearby regions”), compared with those of quiet regions at the same disk locations for which there are no neighboring active regions. We also compare the mode parameters in active regions with those in comparably located quiet regions. Our analysis is based on ring-diagram analysis of all active regions observed by the Helioseismic and Magnetic Imager (HMI) over almost five years. We find that the frequency at which the mode amplitude changes from attenuation to amplification in the quiet nearby regions is around 4.2 mHz, in contrast to the active regions, for which it is about 5.1 mHz. This amplitude enhancement (the “acoustic halo effect”) is as large as that observed in the active regions, and has a very weak dependence on the wave propagation direction. The mode energy difference in nearby regions also changes from a deficit to an excess at around 4.2 mHz, but averages to zero over all modes. The frequency difference in nearby regions increases with increasing frequency until a point at which the frequency shifts turn over sharply, as in active regions. However, this turnover occurs around 4.9 mHz, which is significantly below the acoustic cutoff frequency. Inverting the horizontal flow parameters in the direction of the neighboring active regions, we find flows that are consistent with a model of the thermal energy flow being blocked directly below the active region.

  20. Remote Compositional Analysis: The Coming of Age

    NASA Astrophysics Data System (ADS)

    McCord, T. B.

    2002-12-01

    Remote mineralogical analysis of planetary surfaces was attempted more than a century ago. This involved spectroscopy of regions on, mostly, the lunar surface, using groundbased telescopes, and of rocks and minerals in the laboratory. However, it was not until the 1960s that science and technology developed to the point of allowing reflectance spectroscopy to become a quantitative technique. Some of us were lucky enough to appear on this scene, young and energetic and with supporting funds available, to take advantage of these advances to further the knowledge of molecules and minerals in the solar system. Electronic light detectors became available and the near-IR portion of the spectrum was quantitatively accessed, so that specific absorption bands could be detected in the reflectance spectrum, begging interpretation. At the same time, the physics of the interaction of light and minerals was becoming much better understood, allowing interpretation. Geochemists and geologists became interested and helped place these discoveries in the context of solar system science. Major successes resulted mostly from a few scientists who achieved some expertise in all three areas. This allowed identification of many minerals and their crystal state using the reflectance spectra. The early emphasis was on the Moon because of its proximity to Earth and the Apollo Program. Reflectance spectra of the Moon were obtained in the late 60s and early 70s that showed absorption features, and these features were interpreted, for example, to suggest a basaltic composition for the maria with high titanium content in some places. The Apollo Program produced samples and their reflectance spectra were measured in the laboratory. Comparisons with telescopic measurements indicated very good agreement and confirmed remote mineralogical interpretations. With confidence gained, we proceeded to explore the mineralogy of the Moon and derived interpretations therefrom. This success gave us confidence to

  1. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si due to its smaller cells were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and a statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years in the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels) and a safety map were generated using the field-measured data. FMECA, a statistical reliability tool that uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study of PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly
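
    The FMECA ranking itself reduces to a small calculation: each failure mode gets severity, occurrence and detection ratings, and the Risk Priority Number is their product. A sketch with hypothetical ratings:

      # Hypothetical PV failure modes: (severity, occurrence, detection), each 1-10
      modes = {
          "encapsulant discoloration": (4, 8, 3),
          "solder bond fatigue":       (7, 5, 6),
          "glass breakage":            (9, 2, 2),
      }
      rpn = {m: s * o * d for m, (s, o, d) in modes.items()}
      for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
          print(f"{mode:28s} RPN = {score}")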

  2. Orthogonal separations: Comparison of orthogonality metrics by statistical analysis.

    PubMed

    Schure, Mark R; Davis, Joe M

    2015-10-01

    Twenty orthogonality metrics (OMs) derived from convex hull, information theory, fractal dimension, correlation coefficients, nearest neighbor distances and bin-density techniques were calculated from a diverse group of 47 experimental two-dimensional (2D) chromatograms. These chromatograms comprise two datasets; one dataset is a collection of 2D chromatograms from Peter Carr's laboratory at the University of Minnesota, and the other dataset is based on pairs of one-dimensional chromatograms previously published by Martin Gilar and coworkers (Waters Corp.). The chromatograms were pooled to make a third or combined dataset. Cross-correlation results suggest that specific OMs are correlated within families of nearest neighbor methods, correlation coefficients and the information theory methods. Principal component analysis of the OMs shows that none of the OMs stands out as clearly better at explaining the data variance than any other OM. Principal component analysis of individual chromatograms shows that different OMs favor certain chromatograms. The chromatograms exhibit a range of quality, as subjectively graded by nine experts experienced in 2D chromatography. The subjective (grading) evaluations were taken at two intervals per expert and demonstrated excellent consistency for each expert. Excellent agreement for both very good and very bad chromatograms was seen across the range of experts. However, evaluation uncertainty increased for chromatograms that were judged as average to mediocre. The grades were converted to numbers (percentages) for numerical computations. The percentages were correlated with OMs to establish good OMs for evaluating the quality of 2D chromatograms. Certain metrics correlate better than others. However, these results are not consistent across all chromatograms examined. Most of the nearest neighbor methods were observed to correlate poorly with the percentages. However, one method, devised by Clark and Evans, appeared to work

  3. Wheat signature modeling and analysis for improved training statistics

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Malila, W. A.; Cicone, R. C.; Gleason, J. M.

    1976-01-01

    The author has identified the following significant results. The spectral, spatial, and temporal characteristics of wheat and other signatures in LANDSAT multispectral scanner data were examined through empirical analysis and simulation. Irrigation patterns varied widely within Kansas; 88 percent of wheat acreage in Finney was irrigated, compared with 24 percent in Morton and less than 3 percent across the western two-thirds of the state. The irrigation practice was definitely correlated with the observed spectral response; wheat variety differences produced observable spectral differences due to leaf coloration and different dates of maturation. Between-field differences were generally greater than within-field differences, and boundary pixels produced spectral features distinct from those within field centers. Multiclass boundary pixels contributed much of the observed bias in proportion estimates. The variability between signatures obtained by different draws of training data decreased as the sample size became larger; also, the resulting signatures became more robust and the particular decision threshold value became less important.

  4. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) predictions and experimental results. The SEA predictions were made using the commercial software program VA One 2009 from the ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  5. Q-Type Factor Analysis of Healthy Aged Men.

    ERIC Educational Resources Information Center

    Kleban, Morton H.

    Q-type factor analysis was used to re-analyze baseline data collected in 1957, on 47 men aged 65-91. Q-type analysis is the use of factor methods to study persons rather than tests. Although 550 variables were originally studied involving psychiatry, medicine, cerebral metabolism and chemistry, personality, audiometry, dichotic and diotic memory,…

  6. Processing and statistical analysis of soil-root images

    NASA Astrophysics Data System (ADS)

    Razavi, Bahar S.; Hoang, Duyen; Kuzyakov, Yakov

    2016-04-01

    The importance of hotspots such as the rhizosphere, the small soil volume that surrounds and is influenced by plant roots, calls for spatially explicit methods to visualize the distribution of microbial activities in this active site (Kuzyakov and Blagodatskaya, 2015). The zymography technique has previously been adapted to visualize the spatial dynamics of enzyme activities in the rhizosphere (Spohn and Kuzyakov, 2014). Following further development of soil zymography, to obtain a higher resolution of enzyme activities, we aimed to 1) quantify the images, and 2) determine whether the pattern (e.g. the distribution of hotspots in space) is clumped (aggregated) or regular (dispersed). To this end, we incubated soil-filled rhizoboxes with maize (Zea mays L.) and without maize (control box) for two weeks. In situ soil zymography was applied to visualize the enzymatic activity of β-glucosidase and phosphatase at the soil-root interface. The spatial resolution of the fluorescent images was improved by direct application of a substrate-saturated membrane to the soil-root system. Furthermore, we applied spatial point pattern analysis to determine whether the distribution of hotspots in space is clumped (aggregated) or regular (dispersed). Our results demonstrated that the distribution of hotspots in the rhizosphere is clumped (aggregated), whereas the control box without a plant showed a regular (dispersed) pattern. These patterns were similar in all three replicates and for both enzymes. We conclude that improved zymography is a promising in situ technique to identify, analyze, visualize and quantify the spatial distribution of enzyme activities in the rhizosphere. Moreover, such differing patterns should be considered in assessments and modeling of rhizosphere extension and the corresponding effects on soil properties and functions. Key words: rhizosphere, spatial point pattern, enzyme activity, zymography, maize.
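
    One common way to test clumped versus dispersed point patterns is the Clark-Evans nearest-neighbour index (one option among spatial point pattern tools; the abstract does not specify which statistic was used). A sketch with hypothetical hotspot coordinates:

      import numpy as np
      from scipy.spatial import cKDTree

      def clark_evans(points, area):
          """R < 1: clumped; R ~ 1: random; R > 1: dispersed."""
          d, _ = cKDTree(points).query(points, k=2)  # k=2: nearest *other* point
          observed = d[:, 1].mean()
          expected = 0.5 / np.sqrt(len(points) / area)  # CSR expectation
          return observed / expected

      rng = np.random.default_rng(1)
      pts = rng.uniform(0, 10, size=(200, 2))   # hypothetical hotspot coords, cm
      print(clark_evans(pts, area=100.0))       # close to 1 for a random pattern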

  7. Aging Chart: a community resource for rapid exploratory pathway analysis of age-related processes

    PubMed Central

    Moskalev, Alexey; Zhikrivetskaya, Svetlana; Shaposhnikov, Mikhail; Dobrovolskaya, Evgenia; Gurinovich, Roman; Kuryan, Oleg; Pashuk, Aleksandr; Jellen, Leslie C.; Aliper, Alex; Peregudov, Alex; Zhavoronkov, Alex

    2016-01-01

    Aging research is a multi-disciplinary field encompassing knowledge from many areas of basic, applied and clinical research. Age-related processes occur on molecular, cellular, tissue, organ, system, organismal and even psychological levels, trigger the onset of multiple debilitating diseases and lead to a loss of function, and there is a need for a unified knowledge repository designed to track, analyze and visualize the cause and effect relationships and interactions between the many elements and processes on all levels. Aging Chart (http://agingchart.org/) is a new, community-curated collection of aging pathways and knowledge that provides a platform for rapid exploratory analysis. Building on an initial content base constructed by a team of experts from peer-reviewed literature, users can integrate new data into biological pathway diagrams for a visible, intuitive, top-down framework of aging processes that fosters knowledge-building and collaboration. As the body of knowledge in aging research is rapidly increasing, an open visual encyclopedia of aging processes will be useful to both the new entrants and experts in the field. PMID:26602690

  8. Aging Chart: a community resource for rapid exploratory pathway analysis of age-related processes.

    PubMed

    Moskalev, Alexey; Zhikrivetskaya, Svetlana; Shaposhnikov, Mikhail; Dobrovolskaya, Evgenia; Gurinovich, Roman; Kuryan, Oleg; Pashuk, Aleksandr; Jellen, Leslie C; Aliper, Alex; Peregudov, Alex; Zhavoronkov, Alex

    2016-01-01

    Aging research is a multi-disciplinary field encompassing knowledge from many areas of basic, applied and clinical research. Age-related processes occur on molecular, cellular, tissue, organ, system, organismal and even psychological levels, trigger the onset of multiple debilitating diseases and lead to a loss of function, and there is a need for a unified knowledge repository designed to track, analyze and visualize the cause and effect relationships and interactions between the many elements and processes on all levels. Aging Chart (http://agingchart.org/) is a new, community-curated collection of aging pathways and knowledge that provides a platform for rapid exploratory analysis. Building on an initial content base constructed by a team of experts from peer-reviewed literature, users can integrate new data into biological pathway diagrams for a visible, intuitive, top-down framework of aging processes that fosters knowledge-building and collaboration. As the body of knowledge in aging research is rapidly increasing, an open visual encyclopedia of aging processes will be useful to both the new entrants and experts in the field. PMID:26602690

  9. Feasibility of voxel-based statistical analysis method for myocardial PET

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Paik, Chang H.; Kim, Kyeong Min; Moo Lim, Sang

    2014-09-01

    Although statistical parametric mapping (SPM) analysis is widely used in neuroimaging studies, to the best of our knowledge it had not previously been applied to myocardial PET data analysis. In this study, we developed a voxel-based statistical analysis method for myocardial PET which provides statistical comparison results between groups in image space. PET emission data of normal and myocardial-infarction rats were acquired. For the SPM analysis, a rat heart template was created. In addition, individual PET data were spatially normalized and smoothed. Two-sample t-tests were performed to identify the myocardial infarct region. The developed SPM method was compared with conventional ROI methods. Myocardial glucose metabolism was decreased in the lateral wall of the left ventricle. In the ROI analysis, the mean value of the lateral wall was decreased by 29%. The newly developed SPM method for myocardial PET could provide quantitative information in myocardial PET studies.
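
    The voxel-wise comparison at the heart of the method can be sketched in a few lines (simulated volumes; a real SPM analysis also corrects for multiple comparisons across voxels):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # (subjects, x, y, z): spatially normalized, smoothed uptake volumes
      normal  = rng.normal(1.0, 0.1, size=(8, 16, 16, 8))
      infarct = rng.normal(1.0, 0.1, size=(8, 16, 16, 8))
      infarct[:, 10:14, 4:8, :] -= 0.3    # simulated lateral-wall deficit

      t, p = stats.ttest_ind(normal, infarct, axis=0)  # one test per voxel
      print((p < 0.001).sum(), "voxels flagged at p < 0.001 (uncorrected)")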

  10. Tutorial: survival analysis--a statistic for clinical, efficacy, and theoretical applications.

    PubMed

    Gruber, F A

    1999-04-01

    Current demands for increased research attention to therapeutic efficacy and efficiency, together with calls for improved developmental models, require the analysis of longitudinal outcome data. Statistical treatment of longitudinal speech and language data is difficult, but there is a family of statistical techniques in common use in medicine, actuarial science, manufacturing, and sociology that has not been used in speech or language research. Survival analysis is introduced as a method that avoids many of the statistical problems of other techniques because it treats time as the outcome. In survival analysis, probabilities are calculated not just for groups but also for individuals in a group. This is a major advantage for clinical work. This paper provides a basic introduction to nonparametric and semiparametric survival analysis using speech outcomes as examples. A brief discussion of potential conflicts between actuarial analysis and clinical intuition is also provided. PMID:10229458
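
    The nonparametric workhorse here is the Kaplan-Meier estimator; a self-contained sketch (hypothetical times, in months, with 1 = outcome achieved and 0 = censored):

      import numpy as np

      def kaplan_meier(time, event):
          """Return event times and the survival curve S(t)."""
          time, event = np.asarray(time, float), np.asarray(event, int)
          order = np.argsort(time)
          time, event = time[order], event[order]
          at_risk, s = len(time), 1.0
          times, surv = [], []
          for t in np.unique(time):
              d = int(np.sum((time == t) & (event == 1)))   # events at t
              if d > 0:
                  s *= 1.0 - d / at_risk
                  times.append(t)
                  surv.append(s)
              at_risk -= int(np.sum(time == t))  # events and censored leave risk set
          return np.array(times), np.array(surv)

      t = [3, 5, 5, 8, 12, 12, 15, 20]
      e = [1, 1, 0, 1,  1,  0,  0,  1]
      print(kaplan_meier(t, e))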

  11. Childhood autism in India: A case-control study using tract-based spatial statistics analysis

    PubMed Central

    Assis, Zarina Abdul; Bagepally, Bhavani Shankara; Saini, Jitender; Srinath, Shoba; Bharath, Rose Dawn; Naidu, Purushotham R.; Gupta, Arun Kumar

    2015-01-01

    Context: Autism is a serious behavioral disorder among young children that now occurs at epidemic rates in developing countries like India. We have used tract-based spatial statistics (TBSS) of diffusion tensor imaging (DTI) measures to investigate the microstructure of the primary neurocircuitry involved in autism spectrum disorders compared with typically developing children. Objective: To evaluate the various white matter tracts in Indian autistic children as compared to controls using TBSS. Materials and Methods: Prospective, case-control, voxel-based, whole-brain DTI analysis using TBSS was performed. The study included 19 autistic children (mean age 8.7 years ± 3.84, 16 males and 3 females) and 34 controls (mean age 12.38 ± 3.76, all males). Fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD), and axial diffusivity (AD) values were used as outcome variables. Results: Compared to the control group, TBSS demonstrated multiple areas of markedly reduced FA involving multiple long white matter tracts, the entire corpus callosum, bilateral posterior thalami, and bilateral optic tracts (OTs). Notably, there were no voxels where FA was significantly increased in the autism group. Increased RD was also noted in these regions, suggesting an underlying myelination defect. MD was elevated in many of the projection and association fibers, and notably in the OTs. There were no significant changes in AD in these regions, indicating no significant axonal injury. There was no significant correlation between FA values and the Childhood Autism Rating Scale. Conclusion: This is a first-of-its-kind study evaluating DTI findings in autistic children in India. In our study, DTI has shown a significant fault in the underlying intricate brain wiring system in autism. The OT abnormality is a novel finding and needs further research. PMID:26600581

  12. Tract-based spatial statistics analysis of white matter changes in children with anisometropic amblyopia.

    PubMed

    Li, Qian; Zhai, Liying; Jiang, Qinying; Qin, Wen; Li, Qingji; Yin, Xiaohui; Guo, Mingxia

    2015-06-15

    Amblyopia is a neurological disorder of vision that follows abnormal binocular interaction or visual deprivation during early life. Previous studies have reported multiple functional or structural cortical alterations. Although white matter has also been studied, it remains unclear which fasciculi are affected by amblyopia. In the present study, tract-based spatial statistics analysis was applied to diffusion tensor imaging (DTI) to investigate potential diffusion changes of neural tracts in anisometropic amblyopia. Fractional anisotropy (FA) values were calculated and compared between 20 amblyopic children and 18 healthy age-matched controls. In contrast to the controls, significant decreases in FA values were found in the right optic radiation (OR), left inferior longitudinal fasciculus/inferior fronto-occipital fasciculus (ILF/IFO) and right superior longitudinal fasciculus (SLF) in the amblyopia group. Furthermore, FA values of these identified tracts showed a positive correlation with visual acuity. It can be inferred that abnormal visual input not only hinders the development of the OR, but also impairs fasciculi associated with the dorsal and ventral visual pathways, which may be responsible for the amblyopic deficiency in object discrimination and stereopsis. Increased FA was detected in the right posterior part of the corpus callosum (CC) with a medium effect size, which may be due to a compensation effect. DTI with subsequent measurement of FA is a useful tool for investigating neuronal tract involvement in amblyopia. PMID:25899779

  13. [Aging at home with telecare in Spain. A discourse analysis].

    PubMed

    Aceros, Juan C; Cavalcante, Maria Tereza Leal; Domènech, Miquel

    2016-08-01

    Care for the elderly is increasingly turning to forms of community care and home care, and telecare is one such emergent modality of care. This article explores the meanings that older people give to the experience of staying at home in later life using telecare. Discourse analysis is used to examine a set of focus groups and interviews with telecare users from different cities of Catalonia (Spain). The outcomes include three interpretative repertoires that we called "Aging at home", "normal aging" and "unsafe aging". For each repertoire we examine how the permanence of older people in their homes is accounted for, and what role telecare plays in that experience. PMID:27557015

  14. Performance analysis of morphological component analysis (MCA) method for mammograms using some statistical features

    NASA Astrophysics Data System (ADS)

    Gardezi, Syed Jamal Safdar; Faye, Ibrahima; Kamel, Nidal; Eltoukhy, Mohamed Meselhy; Hussain, Muhammad

    2014-10-01

    Early detection of breast cancer helps reduce mortality rates. Mammography is a very useful tool in breast cancer detection, but it is very difficult to separate different morphological features in mammographic images. In this study, the Morphological Component Analysis (MCA) method is used to extract different morphological aspects of mammographic images while effectively preserving the morphological characteristics of regions. MCA decomposes the mammogram into a piecewise-smooth part and a texture part using the Local Discrete Cosine Transform (LDCT) and the Curvelet Transform via wrapping (CURVwrap). In this study, a simple performance comparison was made using some statistical features for the original image versus the piecewise-smooth part obtained from the MCA decomposition. The results show that MCA suppresses the structural noise and blood vessels in the mammogram and enhances the performance of mass detection.

  15. Application of multivariate statistical methods to the analysis of ancient Turkish potsherds

    SciTech Connect

    Martin, R.C.

    1986-01-01

    Three hundred ancient Turkish potsherds were analyzed by instrumental neutron activation analysis, and the resulting data analyzed by several techniques of multivariate statistical analysis, some only recently developed. The programs AGCLUS, MASLOC, and SIMCA were sequentially employed to characterize and group the samples by type of pottery and site of excavation. Comparison of the statistical analyses by each method provided archaeological insight into the site/type relationships of the samples and ultimately evidence relevant to the commercial relations between the ancient communities and specialization of pottery production over time. The techniques used for statistical analysis were found to be of significant potential utility in the future analysis of other archaeometric data sets. 25 refs., 33 figs.

  16. VOStat: A Virtual Observatory Web Service for Statistical Analysis of Astronomical Data

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric; Chakraborty, A.; Babu, G.

    2013-01-01

    VOStat (http://vostat.org), is a VO-compliant Web service giving astronomers access to a suite of statistical procedures in a user-friendly Web environment. It uses R (http://www.r-project.org), the largest public domain statistical software environment with >4000 add-on packages. Data input is by user upload, URL, or SAMP interaction with other VO tools. Outputs include plots, tabular results and R scripts. VOStat implements ~60 statistical functions, only a tiny portion of the full R capabilities. These include density estimation (smoothing), hypothesis tests, regression (linear, local, quantile, robust), multivariate analysis (regression, principal components, hierarchical clustering, normal mixture models), spatial analysis (autocorrelation, k-NN, Ripley's K, Voronoi tests), directional data, survival analysis (Kaplan-Meier estimator, two-sample tests, Lynden-Bell-Woodroofe estimator), and time series analysis (autocorrelation, autoregressive models, periodogram).

  17. Analysis of Variance with Summary Statistics in Microsoft® Excel®

    ERIC Educational Resources Information Center

    Larson, David A.; Hsu, Ko-Cheng

    2010-01-01

    Students regularly are asked to solve Single Factor Analysis of Variance problems given only the sample summary statistics (number of observations per category, category means, and corresponding category standard deviations). Most undergraduate students today use Excel for data analysis of this type. However, Excel, like all other statistical…
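
    For reference, the summary-statistics computation is straightforward; a sketch in Python rather than Excel, assuming k groups with per-group sizes, means, and standard deviations (all numbers hypothetical):

      import numpy as np
      from scipy import stats

      def anova_from_summary(n, mean, sd):
          """One-way ANOVA F-test from per-group summary statistics."""
          n, mean, sd = map(np.asarray, (n, mean, sd))
          k, N = len(n), n.sum()
          grand = np.sum(n * mean) / N
          ms_between = np.sum(n * (mean - grand)**2) / (k - 1)   # between-group MS
          ms_within = np.sum((n - 1) * sd**2) / (N - k)          # within-group MS
          F = ms_between / ms_within
          return F, stats.f.sf(F, k - 1, N - k)

      print(anova_from_summary([12, 15, 10], [5.1, 6.3, 4.8], [1.2, 1.4, 1.1]))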

  18. Nationwide statistical analysis of myeloid malignancies in Korea: incidence and survival rate from 1999 to 2012

    PubMed Central

    Park, Eun-Hye; Lee, Hyewon; Won, Young-Joo; Ju, Hee Young; Oh, Chang-Mo; Ingabire, Cecile; Kong, Hyun-Joo; Park, Byung-Kiu; Yoon, Ju Young; Eom, Hyeon-Seok; Lee, Eunyoung

    2015-01-01

    Background Large-scale epidemiologic analysis of hematologic malignancies is helpful for understanding trends in incidence and survival. Methods The Korea Central Cancer Registry (KCCR) updated the nationwide analysis of the incidence and survival of myeloid malignancies, from the Korean National Cancer Incidence Database between 1999 and 2012. Myeloid malignancies were classified based on the International Classification of Diseases for Oncology, 3rd edition (ICD-O-3). Results Overall, 3,771 cases of myeloid disease, accounting for 1.7% of all cancers, were identified in 2012. The highest incidence of myeloid malignancies was observed in the 70s age group, and male predominance was noted (1.3:1). Acute myeloid leukemia (AML) was the most frequent subtype, followed by myeloproliferative neoplasms (MPN), myelodysplastic syndrome (MDS) and MDS/MPN: age-standardized incidence rates (ASR) in 2012 for each disease were 2.02, 1.95, 1.13, and 0.12 per 100,000 persons, respectively. The ASR for all myeloid malignancies increased from 3.31 in 1999 to 5.70 in 2012, with an annual percentage change (APC) of 5.4%. The five-year relative survival rate (RS) for myeloid malignancies has gradually improved over the past decades. RS changed from 26.3% to 34.8% in AML, specifically from 51.6% to 69.6% in acute promyelocytic leukemia (APL) and from 23.8% to 29.9% in non-APL AML, between 1996-2000 and 2008-2012. RS also increased from 81.8% to 87.1% in MPN, with a significant improvement in CML (from 74.5% to 85.5%), and from 27.3% to 31.7% in MDS/MPN between 2001-2005 and 2008-2012. However, there was no survival improvement in MDS during the study period (45.6% in 2001-2005 vs. 44.4% in 2008-2012). Conclusion This report updates the nationwide statistical analysis of myeloid malignancies since 2008, showing increasing incidence and improving trends in survival. PMID:26770948
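
    Age-standardized rates like the ASRs quoted above are weighted averages of age-specific rates over a standard population; a minimal sketch with hypothetical numbers:

      # Hypothetical age-specific incidence rates (per 100,000) and the
      # corresponding standard-population weights (weights sum to 1).
      rates   = [0.4, 0.9, 2.5, 7.8, 19.3]
      weights = [0.35, 0.25, 0.20, 0.15, 0.05]

      asr = sum(r * w for r, w in zip(rates, weights))
      print(f"age-standardized rate = {asr:.2f} per 100,000")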

  19. RooStatsCms: A tool for analysis modelling, combination and statistical studies

    NASA Astrophysics Data System (ADS)

    Piparo, D.; Schott, G.; Quast, G.

    2010-04-01

    RooStatsCms is an object oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in literature implemented as classes, whose design is oriented to the execution of multiple CPU intensive jobs on batch systems or on the Grid.

  20. Root dentine transparency: age determination of human teeth using computerized densitometric analysis.

    PubMed

    Drusini, A; Calliari, I; Volpe, A

    1991-05-01

    Root dentine transparency (RDT) was used to estimate the ages of human subjects from 152 intact teeth. Teeth were from 134 subjects, both historical and recent, of known age and sex. The aims of this work are 1) to compare two methods of using RDT to estimate age; 2) to test the applicability of the regression formulae for estimating age obtained from a recent sample on an historical sample; and 3) to estimate the suitability of RDT to determine age at death of 100-year-old skeletons. RDT was measured by two techniques: 1) computerized densitometric analysis and 2) vernier caliper. Age estimations based on computerized densitometric analysis were no more accurate than were those determined by caliper measurement; both give a predictive success of +/- 5 years in about 45-48% of cases for premolars. The television-based digitization system has some disadvantages: It is expensive, not portable, and requires some training to use. However, it furnishes a more standardized method, a rapid graphic illustration of the results, and an immediate storage of statistical information for future use. PMID:1853940

  1. Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began, and to observe how these rates have been changing over time. Many of the current methods use simple linear regression to estimate rates. This is often inappropriate: it is too rigid and can ignore uncertainties that arise as part of the data collection exercise, leading to overconfidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea level are changing over time); the likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and allows direct estimation of the rate process with full consideration of all sources of uncertainty. Analysis of tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible. The model captures the continuous and dynamic evolution of sea-level change and results show that not only are modern sea levels rising but that the rates
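
    The sketch below illustrates the general idea on synthetic data, with one deliberate simplification: the Gaussian process is placed on the level itself and the rate is obtained by differentiating the posterior mean, whereas the paper's model puts the prior directly on the rate process and integrates it.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic proxy record: sea level (mm) with an accelerating rise plus noise.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(1800, 2000, 80))[:, None]
level = 0.005 * (t.ravel() - 1800) ** 2 + rng.normal(0, 5, 80)

# GP on the level process (simplified; the paper's prior is on the rate).
gp = GaussianProcessRegressor(kernel=RBF(50.0) + WhiteKernel(25.0),
                              normalize_y=True).fit(t, level)

grid = np.linspace(1800, 2000, 201)[:, None]
mean = gp.predict(grid)
rate = np.gradient(mean, grid.ravel())  # mm/yr, finite-difference rate
print(f"estimated modern rate: {rate[-1]:.2f} mm/yr")
```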

  2. Statistical-fluctuation analysis for quantum key distribution with consideration of after-pulse contributions

    NASA Astrophysics Data System (ADS)

    Li, Hongxin; Jiang, Haodong; Gao, Ming; Ma, Zhi; Ma, Chuangui; Wang, Wei

    2015-12-01

    The statistical fluctuation problem is a critical factor in all quantum key distribution (QKD) protocols under finite-key conditions. Current statistical fluctuation analyses are mainly based on independent random samples; however, this precondition cannot always be satisfied because of different choices of samples and actual parameters. As a result, proper statistical fluctuation methods are required to solve this problem. Taking the after-pulse contributions into consideration, this paper gives the expression for the secure key rate and the mathematical model for statistical fluctuations, focusing on a decoy-state QKD protocol [Z.-C. Wei et al., Sci. Rep. 3, 2453 (2013), 10.1038/srep02453] with a biased basis choice. On this basis, a classified analysis of statistical fluctuation is presented according to the mutual relationship between random samples. First, for independent identical relations, a deviation comparison is made between the law of large numbers and standard error analysis. Second, a sufficient condition is given under which the Chernoff bound achieves a better result than Hoeffding's inequality based on only independent relations. Third, by constructing a proper martingale, a stringent way is proposed to deal with issues arising from dependent random samples by making use of Azuma's inequality. In numerical optimization, the impact on the secure key rate, the comparison of secure key rates, and the respective deviations under the various kinds of statistical fluctuation analysis are depicted.
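
    To see why the choice of concentration inequality matters, the sketch below compares the absolute deviations guaranteed by Hoeffding's inequality and by a multiplicative Chernoff bound for the empirical mean of i.i.d. Bernoulli trials; the sample size, failure probability, and detection probabilities are hypothetical, chosen only to show the small-mean regime relevant to decoy-state QKD.

```python
import numpy as np

# One-sided deviation bounds on the empirical mean of N i.i.d. Bernoulli(mu)
# trials at failure probability eps. Hoeffding: exp(-2*N*t^2) <= eps.
# Chernoff upper tail: exp(-d^2 * mu * N / 3) <= eps, absolute deviation mu*d.
N, eps = 1e10, 1e-10
mu = np.logspace(-6, -1, 6)  # hypothetical mean detection probabilities

hoeffding = np.sqrt(np.log(1 / eps) / (2 * N)) * np.ones_like(mu)
chernoff = np.sqrt(3 * mu * np.log(1 / eps) / N)

for m, h, c in zip(mu, hoeffding, chernoff):
    print(f"mu={m:.0e}  Hoeffding={h:.2e}  Chernoff={c:.2e}")
# For small mu the Chernoff deviation is the tighter of the two.
```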

  3. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  4. The incidence of cervical spondylosis decreases with aging in the elderly, and increases with aging in the young and adult population: a hospital-based clinical analysis

    PubMed Central

    Wang, Chuanling; Tian, Fuming; Zhou, Yingjun; He, Wenbo; Cai, Zhiyou

    2016-01-01

    Background and purpose Cervical spondylosis is well accepted as a common degenerative change in the cervical spine, and compelling evidence shows that its incidence increases with age. However, the relationship between age and the incidence of cervical spondylosis remains obscure, and more clinical data are needed to characterize it. Methods In the case-controlled study reported here, a retrospective clinical analysis of 1,276 cases of cervical spondylosis was conducted. We analyzed the general clinical data, the relationship between age and the incidence of cervical spondylosis, and the relationship between age-related risk factors and the incidence of cervical spondylosis. A chi-square test was used to analyze the associations between different variables; statistical significance was defined as a P-value of less than 0.05. Results The imaging examination demonstrated the most prominent characteristic features of cervical spondylosis: bulge or herniation at C3-C4, C4-C5, and C5-C6. The incidence of cervical spondylosis increased with age before age 50 years and decreased with age after age 50 years, especially after 60 years. The occurrence rate of bulge or herniation at C3-C4, C4-C5, C5-C6, and C6-C7 followed the same pattern. Moreover, the incidence of hyperosteogeny and spinal stenosis increased with age before age 60 years and decreased thereafter, although there was no obvious change in calcification. Age-related risk factors, such as hypertension, hyperlipidemia, diabetes, cerebral infarct, cardiovascular diseases, smoking, and drinking, showed no relationship with the incidence of cervical spondylosis. Conclusion A decreasing proportion of cervical spondylosis with aging occurs in the elderly, while the proportion of
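
    A minimal sketch of the kind of association test described (chi-square on a contingency table, significance at P < 0.05); the counts are illustrative, not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = hypertension yes/no,
# columns = cervical spondylosis yes/no (illustrative counts only).
table = [[320, 180],
         [456, 320]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")  # association if p < 0.05
```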

  5. The linear statistical d.c. model of GaAs MESFET using factor analysis

    NASA Astrophysics Data System (ADS)

    Dobrzanski, Lech

    1995-02-01

    The linear statistical model of the GaAs MESFET's current generator is obtained by means of factor analysis. Three different MESFET deterministic models are taken into account in the analysis: the Statz model (ST), the Materka-type model (MT), and a new proprietary model of a MESFET with an implanted channel (PLD). It is shown that statistical models obtained using factor analysis provide excellent generation of the multidimensional random variable representing the drain current of the MESFET. The method of implementation of the statistical model in the SPICE program is presented. It is proved that even for a strongly limited number of Monte Carlo analysis runs in that program, the statistical models considered in each case (ST, MT and PLD) enable good reconstruction of the empirical factor structure. The empirical correlation matrix of model parameters is not reconstructed exactly by statistical modelling, but the values of the correlation matrix elements obtained from simulated data are within the confidence intervals for the small sample. This paper argues that a formal approach to statistical modelling using factor analysis is the right path to follow, in spite of the fact that CAD systems (PSpice [MicroSim Corp.], Microwave Harmonica [Compact Software]) are not properly designed for the generation of multidimensional random variables. It is obvious that further progress in the implementation of statistical methods in CAD software is required. Furthermore, a new approach to the MESFET's d.c. model is presented: the separate functions describing the linear and saturated regions of the MESFET output characteristics are combined in a single equation. This way of modelling is particularly suitable for transistors with an implanted channel.
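
    A minimal sketch of the underlying idea: fit a factor model to correlated parameter measurements, then draw new parameter vectors through the latent factors so that Monte Carlo samples reproduce the empirical correlation structure. The data and dimensions below are hypothetical, not one of the ST/MT/PLD models from the paper.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical correlated MESFET parameter measurements (3 parameters,
# 2 latent factors); illustrative data only.
rng = np.random.default_rng(2)
latent = rng.normal(size=(500, 2))
params = latent @ rng.normal(size=(2, 3)) + rng.normal(0, 0.1, (500, 3))

fa = FactorAnalysis(n_components=2).fit(params)

# Generate new parameter vectors: sample latent factors ~ N(0, I),
# map through the loadings, and add independent specific noise.
z = rng.normal(size=(10000, 2))
noise = rng.normal(size=(10000, 3)) * np.sqrt(fa.noise_variance_)
samples = z @ fa.components_ + fa.mean_ + noise

print(np.corrcoef(params.T).round(2))   # empirical correlations
print(np.corrcoef(samples.T).round(2))  # reproduced by the factor model
```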

  6. Demographic analysis from summaries of an age-structured population

    USGS Publications Warehouse

    Link, W.A.; Royle, J. Andrew; Hatfield, J.S.

    2003-01-01

    Demographic analyses of age-structured populations typically rely on life history data for individuals, or when individual animals are not identified, on information about the numbers of individuals in each age class through time. While it is usually difficult to determine the age class of a randomly encountered individual, it is often the case that the individual can be readily and reliably assigned to one of a set of age classes. For example, it is often possible to distinguish first-year from older birds. In such cases, the population age structure can be regarded as a latent variable governed by a process prior, and the data as summaries of this latent structure. In this article, we consider the problem of uncovering the latent structure and estimating process parameters from summaries of age class information. We present a demographic analysis for the critically endangered migratory population of whooping cranes (Grus americana), based only on counts of first-year birds and of older birds. We estimate age and year-specific survival rates. We address the controversial issue of whether management action on the breeding grounds has influenced recruitment, relating recruitment rates to the number of seventh-year and older birds, and examining the pattern of variation through time in this rate.

  7. Linguistic Analysis of the Human Heartbeat Using Frequency and Rank Order Statistics

    NASA Astrophysics Data System (ADS)

    Yang, Albert C.-C.; Hseu, Shu-Shya; Yien, Huey-Wen; Goldberger, Ary L.; Peng, C.-K.

    2003-03-01

    Complex physiologic signals may carry unique dynamical signatures that are related to their underlying mechanisms. We present a method based on rank order statistics of symbolic sequences to investigate the profile of different types of physiologic dynamics. We apply this method to heart rate fluctuations, the output of a central physiologic control system. The method robustly discriminates patterns generated from healthy and pathologic states, as well as aging. Furthermore, we observe increased randomness in the heartbeat time series with physiologic aging and pathologic states and also uncover nonrandom patterns in the ventricular response to atrial fibrillation.
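
    A minimal sketch of rank-order statistics on symbolic sequences, assuming the common up/down symbolization of successive interval differences; the two series are synthetic stand-ins for structured and random-like heartbeat dynamics, not clinical data.

```python
import numpy as np
from collections import Counter

def rank_order_profile(rr, word_len=3):
    """Map a series to binary symbols (1 = interval increased), count words
    of length `word_len`, and return word frequencies sorted by rank."""
    symbols = (np.diff(rr) > 0).astype(int)
    words = ["".join(map(str, symbols[i:i + word_len]))
             for i in range(len(symbols) - word_len + 1)]
    counts = np.array(sorted(Counter(words).values(), reverse=True))
    return counts / counts.sum()

rng = np.random.default_rng(3)
structured = 0.8 + 0.1 * np.sin(np.arange(1000) / 10) + rng.normal(0, 0.01, 1000)
random_like = 0.8 + rng.normal(0, 0.05, 1000)   # "increased randomness"

for name, series in [("structured", structured), ("random-like", random_like)]:
    p = rank_order_profile(series)
    entropy = -(p * np.log2(p)).sum()
    print(f"{name}: word entropy = {entropy:.2f} bits")
```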

  8. Dietary restriction of rodents decreases aging rate without affecting initial mortality rate -- a meta-analysis.

    PubMed

    Simons, Mirre J P; Koch, Wouter; Verhulst, Simon

    2013-06-01

    Dietary restriction (DR) extends lifespan in multiple species from various taxa. This effect can arise in two distinct but not mutually exclusive ways: through a change in aging rate and/or in vulnerability to the aging process (i.e. initial mortality rate). When DR affects vulnerability, this lowers mortality instantly, whereas a change in aging rate will gradually lower mortality risk over time. Unraveling how DR extends lifespan is of interest because it may guide us toward understanding the mechanism(s) mediating lifespan extension and also has practical implications for the application of DR. We reanalyzed published survival data from 82 pairs of survival curves from DR experiments in rats and mice by fitting Gompertz and Gompertz-Makeham models; the addition of the Makeham parameter has been reported to improve the estimation of the Gompertz parameters. Both models separate the initial mortality rate (vulnerability) from an age-dependent increase in mortality (aging rate). We subjected the obtained Gompertz parameters to a meta-analysis and find that DR reduced aging rate without affecting vulnerability. The latter contrasts with the conclusion of a recent analysis of a largely overlapping data set, and we show how the earlier finding is due to a statistical artifact. Our analysis indicates that the biology underlying the life-extending effect of DR in rodents likely involves attenuated accumulation of damage, which contrasts with the acute effect of DR on mortality reported for Drosophila. Moreover, our findings show that the often-reported correlation between aging rate and vulnerability does not preclude changing the aging rate without simultaneously affecting vulnerability. PMID:23438200
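
    For concreteness, here is a sketch of fitting the Gompertz model by maximum likelihood, where the hazard h(t) = a*exp(b*t) separates the initial mortality rate a (vulnerability) from the aging rate b; the ages at death are simulated, and the Makeham term is omitted for brevity.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gompertz  # used only to simulate ages at death

# Gompertz log-likelihood: log h(t) - H(t) = log(a) + b*t - (a/b)(e^{bt} - 1).
def neg_loglik(theta, t):
    a, b = np.exp(theta)  # log-parametrization keeps a, b > 0
    return -np.sum(np.log(a) + b * t - (a / b) * np.expm1(b * t))

rng = np.random.default_rng(4)
# SciPy's gompertz uses shape c = a/b and scale = 1/b; here a=0.003, b=0.15.
ages = gompertz.rvs(c=0.02, scale=1 / 0.15, size=300, random_state=rng)

fit = minimize(neg_loglik, x0=[-5.0, -2.0], args=(ages,), method="Nelder-Mead")
a_hat, b_hat = np.exp(fit.x)
print(f"initial mortality a = {a_hat:.4f}, aging rate b = {b_hat:.3f}")
```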

  9. A roadmap for the genetic analysis of renal aging.

    PubMed

    Noordmans, Gerda A; Hillebrands, Jan-Luuk; van Goor, Harry; Korstanje, Ron

    2015-10-01

    Several studies show evidence for the genetic basis of renal disease, which renders some individuals more prone than others to accelerated renal aging. Studying the genetics of renal aging can help us to identify genes involved in this process and to unravel the underlying pathways. First, this opinion article will give an overview of the phenotypes that can be observed in age-related kidney disease. Accurate phenotyping is essential in performing genetic analysis. For kidney aging, this could include both functional and structural changes. Subsequently, this article reviews the studies that report on candidate genes associated with renal aging in humans and mice. Several loci or candidate genes have been found associated with kidney disease, but identification of the specific genetic variants involved has proven to be difficult. CUBN, UMOD, and SHROOM3 were identified by human GWAS as being associated with albuminuria, kidney function, and chronic kidney disease (CKD). These are promising examples of genes that could be involved in renal aging, and were further mechanistically evaluated in animal models. Eventually, we will provide approaches for performing genetic analysis. We should leverage the power of mouse models, as testing in humans is limited. Mouse and other animal models can be used to explain the underlying biological mechanisms of genes and loci identified by human GWAS. Furthermore, mouse models can be used to identify genetic variants associated with age-associated histological changes, of which Far2, Wisp2, and Esrrg are examples. A new outbred mouse population with high genetic diversity will facilitate the identification of genes associated with renal aging by enabling high-resolution genetic mapping while also allowing the control of environmental factors, and by enabling access to renal tissues at specific time points for histology, proteomics, and gene expression. PMID:26219736

  11. Error Analysis of Terrestrial Laser Scanning Data by Means of Spherical Statistics and 3D Graphs

    PubMed Central

    Cuartero, Aurora; Armesto, Julia; Rodríguez, Pablo G.; Arias, Pedro

    2010-01-01

    This paper presents a complete analysis of the positional errors of terrestrial laser scanning (TLS) data based on spherical statistics and 3D graphs. Spherical statistics are preferred because of the 3D vectorial nature of the spatial error. Error vectors have three metric elements (one module and two angles) that were analyzed by spherical statistics. A case study is presented and discussed in detail. Errors were calculated using 53 check points (CP), whose coordinates were measured by a digitizer with submillimetre accuracy. The positional accuracy was analyzed by both the conventional method (modular error analysis) and the proposed method (angular error analysis) using 3D graphics and numerical spherical statistics. Two packages in the R programming language were written to produce the graphics automatically. The results indicate that the proposed method is advantageous as it offers a more complete analysis of positional accuracy: the angular error component, the uniformity of the vector distribution, and error isotropy, in addition to the modular error component given by linear statistics. PMID:22163461
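
    A minimal sketch of the spherical-statistics view: decompose 3D error vectors into a module and two angles, and summarize directional clustering with the mean resultant length. The check-point errors below are simulated, not the paper's data.

```python
import numpy as np

# Hypothetical 3D positional errors at 53 check points (dx, dy, dz in mm).
rng = np.random.default_rng(5)
err = rng.normal([0.5, 0.2, -0.1], 0.3, size=(53, 3))

modules = np.linalg.norm(err, axis=1)                 # modular component
unit = err / modules[:, None]                         # direction cosines
theta = np.degrees(np.arccos(unit[:, 2]))             # angle from +Z axis
phi = np.degrees(np.arctan2(unit[:, 1], unit[:, 0]))  # longitude angle

# Mean resultant length R in [0, 1]: near 1 = tightly clustered
# (anisotropic) error directions, near 0 = nearly uniform on the sphere.
R_bar = np.linalg.norm(unit.sum(axis=0)) / len(unit)
print(f"mean module = {modules.mean():.3f} mm, R-bar = {R_bar:.3f}")
```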

  12. On the importance of statistics in breath analysis - Hope or curse?

    PubMed Central

    Eckel, Sandrah P.; Baumbach, Jan; Hauschild, Anne-Christin

    2014-01-01

    As we saw at the 2013 Breath Analysis Summit, breath analysis is a rapidly evolving field. Increasingly sophisticated technology is producing huge amounts of complex data. A major barrier now faced by the breath research community is the analysis of these data. Emerging breath data require sophisticated, modern statistical methods to allow for a careful and robust deduction of real-world conclusions. PMID:24565974

  13. Statistical analysis of interaction between lake seepage rates and groundwater and lake levels

    NASA Astrophysics Data System (ADS)

    Ala-aho, P.; Rossi, P. M.; Klöve, B.

    2012-04-01

    In Finland, the main sources of groundwater are the esker deposits from the last ice age. Small lakes imbedded in the aquifer, with no outlets or inlets, are typically found in eskers. Some lakes at the Rokua esker, in Northern Finland, have been suffering from changes in water stage and quality. A possible permanent decline of the water level has raised considerable concern, as the area is also used for recreation and tourism, and the rare biotypes supported by the oligotrophic lakes could be endangered by the decline. Drainage of peatlands located in the discharge zone of the aquifer is a possible threat to the lakes and the whole aquifer: drainage can lower the aquifer water table, which can affect groundwater-lake interaction. The aim of this study was to understand in more detail the interaction of the aquifer and the lake systems, so that potential causes of the lake-level variations can be better understood and managed. In-depth understanding of the hydrogeological system also provides a foundation for studying the nutrient input that affects lake ecosystems. A small lake imbedded in the Rokua esker aquifer was studied in detail. Direct measurements of the seepage rate between the lake and the aquifer were carried out using seepage meters. Seepage was measured at six locations on eight occasions during May 2010 - November 2010. Precipitation was recorded with a tipping-bucket rain gauge adjacent to the lake. Lake stage and groundwater levels from three piezometers were registered at an hourly interval using pressure probes. Statistical methods were applied to examine the relationships between the seepage measurements, lake and groundwater levels, and precipitation. Distinct areas of inseepage and outseepage were distinguished with the seepage-meter measurements, and seepage rates showed only little variation within individual measurement locations. Nevertheless, the analysis revealed statistically significant correlation of seepage rate variation in four

  14. Differential Expression Analysis for RNA-Seq: An Overview of Statistical Methods and Computational Software

    PubMed Central

    Huang, Huei-Chung; Niu, Yi; Qin, Li-Xuan

    2015-01-01

    Deep sequencing has recently emerged as a powerful alternative to microarrays for the high-throughput profiling of gene expression. In order to account for the discrete nature of RNA sequencing data, new statistical methods and computational tools have been developed for the analysis of differential expression to identify genes that are relevant to a disease such as cancer. In this paper, it is thus timely to provide an overview of these analysis methods and tools. For readers with statistical background, we also review the parameter estimation algorithms and hypothesis testing strategies used in these methods. PMID:26688660

  15. Development of a statistical sampling method for uncertainty analysis with SCALE

    SciTech Connect

    Williams, M.; Wiarda, D.; Smith, H.; Jessee, M. A.; Rearden, B. T.; Zwermann, W.; Klein, M.; Pautz, A.; Krzykacz-Hausmann, B.; Gallner, L.

    2012-07-01

    A new statistical sampling sequence called Sampler has been developed for the SCALE code system. Random values for the input multigroup cross sections are determined by using the XSUSA program to sample uncertainty data provided in the SCALE covariance library. Using these samples, Sampler computes perturbed self-shielded cross sections and propagates the perturbed nuclear data through any specified SCALE analysis sequence, including those for criticality safety, lattice physics with depletion, and shielding calculations. Statistical analysis of the output distributions provides uncertainties and correlations in the desired responses. (authors)
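
    A minimal sketch of the sampling workflow, assuming a toy response function in place of a SCALE sequence: draw correlated input perturbations from a covariance matrix, evaluate the model on each sample, and summarize the output distribution. All numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)
mean = np.array([1.00, 0.50, 0.10])            # nominal multigroup data
cov = np.array([[4e-4, 1e-4, 0.0],
                [1e-4, 9e-4, 0.0],
                [0.0,  0.0,  1e-4]])           # hypothetical covariance

# Random samples of the input data, propagated through a stand-in model.
x = rng.multivariate_normal(mean, cov, size=1000)
response = 0.9 + 0.10 * x[:, 0] - 0.05 * x[:, 1] + 0.02 * x[:, 2]

print(f"response = {response.mean():.5f} +/- {response.std(ddof=1):.5f}")
print("input/output correlations:",
      [round(np.corrcoef(x[:, i], response)[0, 1], 2) for i in range(3)])
```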

  16. A Comparative Review of Sensitivity and Uncertainty Analysis of Large-Scale Systems - II: Statistical Methods

    SciTech Connect

    Cacuci, Dan G.; Ionescu-Bujor, Mihaela

    2004-07-15

    Part II of this review paper highlights the salient features of the most popular statistical methods currently used for local and global sensitivity and uncertainty analysis of both large-scale computational models and indirect experimental measurements. These statistical procedures represent sampling-based methods (random sampling, stratified importance sampling, and Latin Hypercube sampling), first- and second-order reliability algorithms (FORM and SORM, respectively), variance-based methods (correlation ratio-based methods, the Fourier Amplitude Sensitivity Test, and the Sobol Method), and screening design methods (classical one-at-a-time experiments, global one-at-a-time design methods, systematic fractional replicate designs, and sequential bifurcation designs). It is emphasized that all statistical uncertainty and sensitivity analysis procedures first commence with the 'uncertainty analysis' stage and only subsequently proceed to the 'sensitivity analysis' stage; this path is the exact reverse of the conceptual path underlying the methods of deterministic sensitivity and uncertainty analysis, where the sensitivities are determined prior to using them for uncertainty analysis. By comparison to deterministic methods, statistical methods for uncertainty and sensitivity analysis are relatively easier to develop and use but cannot yield exact values of the local sensitivities. Furthermore, current statistical methods have two major inherent drawbacks: (1) since many thousands of simulations are needed to obtain reliable results, statistical methods are at best expensive (for small systems) or, at worst, impracticable (e.g., for large time-dependent systems); and (2) since the response sensitivities and parameter uncertainties are inherently and inseparably amalgamated in the results produced by these methods, improvements in parameter uncertainties cannot be directly propagated to improve response uncertainties; rather, the entire set of simulations and

  17. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    PubMed

    Ribes, Delphine; Parafita, Julia; Charrier, Rémi; Magara, Fulvio; Magistretti, Pierre J; Thiran, Jean-Philippe

    2010-01-01

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool. PMID:21124830

  18. Long-term Statistical Analysis of the Simultaneity of Forbush Decrease Events at Middle Latitudes

    NASA Astrophysics Data System (ADS)

    Lee, Seongsuk; Oh, Suyeon; Yi, Yu; Evenson, Paul; Jee, Geonhwa; Choi, Hwajin

    2015-03-01

    Forbush Decreases (FD) are transient, sudden reductions of cosmic ray (CR) intensity lasting a few days to a week. Such events are observed globally using ground neutron monitors (NMs). Most studies of FD events indicate that an FD event is observed simultaneously at NM stations located all over the Earth. However, using statistical analysis, previous researchers verified that while FD events can occur simultaneously, in some cases they occur non-simultaneously. Previous studies confirmed the statistical reality of non-simultaneous FD events and the mechanism by which they occur, using data from high-latitude and middle-latitude NM stations. In this study, we used long-term data (1971-2006) from middle-latitude NM stations (Irkutsk, Climax, and Jungfraujoch) to enhance statistical reliability. According to the results of this analysis, the variation of cosmic ray intensity during the main phase is statistically significantly larger for simultaneous FD events than for non-simultaneous ones. Moreover, the distributions of main-phase onset time show statistically significant differences: while the onset times of simultaneous FDs are distributed evenly over 24-hour intervals (day and night), those of non-simultaneous FDs are mostly distributed over 12-hour intervals, in daytime. Thus, the existence of the two kinds of FD events, distinguished by their statistical properties, was verified based on data from middle-latitude NM stations.

  19. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
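
    A simplified ROS sketch for a single detection limit (the method evaluated in the paper generalizes the plotting positions to multiple limits); the values are illustrative.

```python
import numpy as np
from scipy.stats import norm

# Detected concentrations plus 5 values censored below the detection limit.
obs = np.array([0.5, 0.7, 1.2, 2.0, 3.5, 5.1, 8.0])
n_cens = 5
n = len(obs) + n_cens

# Plotting positions for the full sample; with one detection limit the
# detects occupy the upper ranks and the censored values the lower ranks.
pp_all = np.arange(1, n + 1) / (n + 1)
pp_censored, pp_detected = pp_all[:n_cens], pp_all[n_cens:]

# Regress log(detected) on normal quantiles, then impute the censored values.
slope, intercept = np.polyfit(norm.ppf(pp_detected), np.log(np.sort(obs)), 1)
imputed = np.exp(intercept + slope * norm.ppf(pp_censored))

full = np.concatenate([imputed, obs])
print(f"ROS mean = {full.mean():.2f}, std = {full.std(ddof=1):.2f}")
```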

  20. Geochemical and statistical analysis of toxic elements in tsunami deposits occurred at March 11, 2011

    NASA Astrophysics Data System (ADS)

    Komai, T.; Kuwatani, T.; Kawabe, Y.; Hara, J.; Okada, M.

    2013-12-01

    Huge amounts of tsunami deposits remained after the large earthquake and tsunami of March 11, 2011. This event raised the possibility of environmental pollution, particularly of the soil and sediments around the coastal areas of eastern Japan. Therefore, a geochemical survey and investigation of soil contamination risk was carried out to clarify the risk level caused by the tsunami event and its deposits. First, more than 200 soil and sediment sampling points were selected on the basis of tsunami hazard and topographic features. Samples were analyzed by chemical and physical methods to build a database for evaluating the environmental risk. Various kinds of tsunami deposits were observed in the coastal areas; some are sandy sediments and others are muddy with a high clay content. The chemical analysis showed that some deposits contain slightly elevated contents of arsenic and lead, but most are similar in composition to normal subsurface soils. Environmental risk assessment using our GERAS system indicated that tsunami deposits sampled around northern Miyagi and Iwate prefectures have a relatively higher risk level; in these cases some form of risk management is necessary for their storage and utilization. The remaining deposits and soils can be safely used for reconstruction activities because their risk level is acceptable. From the analysis of the physical properties of the deposits, a database was developed for particle-size distribution, soil and clay components, and organic-matter content. The biological effects and aging trends of tsunami deposits containing sulfide minerals were clarified through precise investigation by a long-term testing method. The authors also conduct a statistical analysis of the elements in tsunami deposits using an original sparse-modeling technique, in which the discrimination between tsunami deposits and

  1. Menarche age in Iran: A meta-analysis

    PubMed Central

    Bahrami, Nasim; Soleimani, Mohammad Ali; Chan, Yiong Huak; Ghojazadeh, Morteza; Mirmiran, Parvin

    2014-01-01

    Background: Research shows that the age at menarche, an essential element of the reproductive health of women, decreased through the 19th and 20th centuries and shows huge variation across countries. Numerous studies performed in Iran report a range of ages at menarche. This meta-analysis therefore aimed to determine the overall mean age at menarche of girls in Iran. Materials and Methods: All relevant studies were reviewed using sensitive, standard keywords in databases covering 1950 to 2013. Two raters screened a total of 1088 articles against the inclusion criteria of this study, and forty-seven studies were selected for the meta-analysis. Cochran's test was used to assess the homogeneity of the samples (tau-squared). The mean age at menarche of girls in Iran, with its 95% confidence interval (CI) from the random-effects model, is reported. Results: The homogeneity assumption held for the 47 reviewed studies (tau-squared = 0.00). The mean menarche age of Iranian girls from the random-effects model was 12.81 years (95% CI: 12.56–13.06). Conclusions: The mean age at menarche in Iran was lower than in some developed European countries such as Switzerland, Sweden, and Denmark, higher than reported in countries such as Greece and Italy, and similar to the values obtained in the United States of America and Colombia. The lower age at menarche in Iran may be largely attributed to changes in the lifestyle and diet of children. PMID:25400670
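
    A minimal sketch of random-effects pooling in the DerSimonian-Laird style, one standard way to produce the kind of pooled mean and 95% CI reported above; the study means and standard errors are hypothetical, not the 47 Iranian studies.

```python
import numpy as np

# Hypothetical menarche-age data: one (mean, SE) pair per study.
y = np.array([12.6, 13.1, 12.9, 12.5, 13.0])   # study means (years)
se = np.array([0.10, 0.15, 0.08, 0.12, 0.20])  # their standard errors

w = 1 / se**2
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)                    # Cochran's Q
tau2 = max(0.0, (Q - (len(y) - 1)) /
           (np.sum(w) - np.sum(w**2) / np.sum(w)))    # between-study variance

w_star = 1 / (se**2 + tau2)                           # random-effects weights
pooled = np.sum(w_star * y) / np.sum(w_star)
se_pooled = 1 / np.sqrt(np.sum(w_star))
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled mean = {pooled:.2f} (95% CI: {lo:.2f}-{hi:.2f}), tau^2 = {tau2:.4f}")
```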

  2. The statistical analysis techniques to support the NGNP fuel performance experiments

    NASA Astrophysics Data System (ADS)

    Pham, Binh T.; Einerson, Jeffrey J.

    2013-10-01

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
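
    As a flavor of the control-charting technique mentioned, here is a minimal Shewhart-chart sketch that flags a simulated thermocouple drift; the data and limits are illustrative, since the NGNP Data Management and Analysis System itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
readings = rng.normal(1100.0, 4.0, 200)   # degC, in-control baseline
readings[150:] -= 25.0                    # simulated thermocouple drift

# Control limits from an in-control reference window.
center = readings[:100].mean()
sigma = readings[:100].std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

alarms = np.where((readings < lcl) | (readings > ucl))[0]
print(f"first out-of-control reading at index {alarms[0]}" if alarms.size
      else "no alarms")
```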

  4. Linearised and non-linearised isotherm models optimization analysis by error functions and statistical means

    PubMed Central

    2014-01-01

    In adsorption studies, describing the sorption process and identifying the best-fitting isotherm model are key steps in testing the theoretical hypothesis. Hence, numerous statistical analyses have been used to assess the agreement between experimental equilibrium adsorption values and predicted equilibrium values. In the present study, several statistical measures were used to evaluate the fitness of the adsorption isotherm models: the Pearson correlation, the coefficient of determination, and the chi-square test. An ANOVA test was carried out to evaluate the significance of the various error functions, and the coefficient of dispersion was evaluated for both linearised and non-linearised models. The adsorption of phenol onto natural soil (local name: Kalathur soil) was carried out in batch mode at 30 ± 2 °C. To estimate the isotherm parameters and obtain a holistic view of the analysis, the linear and non-linear forms of the isotherm models were compared. The results revealed which of the error functions and statistical measures best identified the best-fitting isotherm. PMID:25018878
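
    A sketch of the linear-versus-non-linear comparison for one common isotherm (the Langmuir model), with R-squared and chi-square as the error functions; the equilibrium data are synthetic, not the Kalathur-soil phenol measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Langmuir isotherm: qe = qm*KL*Ce / (1 + KL*Ce).
def langmuir(Ce, qm, KL):
    return qm * KL * Ce / (1 + KL * Ce)

rng = np.random.default_rng(8)
Ce = np.linspace(5, 100, 10)
qe = langmuir(Ce, qm=8.0, KL=0.05) * (1 + rng.normal(0, 0.03, 10))

# Non-linear least squares on the original form.
(qm_nl, KL_nl), _ = curve_fit(langmuir, Ce, qe, p0=[5.0, 0.01])

# Linearised form Ce/qe = Ce/qm + 1/(qm*KL): slope = 1/qm.
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qm_lin, KL_lin = 1 / slope, slope / intercept

for name, qm, KL in [("non-linear", qm_nl, KL_nl), ("linearised", qm_lin, KL_lin)]:
    resid = qe - langmuir(Ce, qm, KL)
    r2 = 1 - resid @ resid / np.sum((qe - qe.mean()) ** 2)
    chi2 = np.sum(resid**2 / langmuir(Ce, qm, KL))
    print(f"{name}: qm={qm:.2f}, KL={KL:.4f}, R^2={r2:.4f}, chi^2={chi2:.4f}")
```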

  5. The Grenoble Analysis Toolkit (GreAT)-A statistical analysis framework

    NASA Astrophysics Data System (ADS)

    Putze, A.; Derome, L.

    2014-12-01

    The field of astroparticle physics is currently the focus of prolific scientific activity. In the last decade, this field has undergone significant developments thanks to several experimental results from CREAM, PAMELA, Fermi, and H.E.S.S. Moreover, the next generation of instruments, such as AMS-02 (launched on 16 May 2011) and CTA, will undoubtedly facilitate more sensitive and precise measurements of the cosmic-ray and γ-ray fluxes. To fully exploit the wealth of high-precision data generated by these experiments, robust and efficient statistical tools, such as Markov Chain Monte Carlo algorithms or evolutionary algorithms able to handle the complexity of joint parameter spaces and datasets, are necessary for a phenomenological interpretation. The Grenoble Analysis Toolkit (GreAT) is a user-friendly, modular, object-oriented framework in C++ that samples a user-defined parameter space with a pre-defined or user-defined algorithm. The functionality of GreAT is presented in the context of cosmic-ray physics, where the boron-to-carbon (B/C) ratio is used to constrain cosmic-ray propagation models.

  6. Temporal statistical analysis of laser speckle images and its application to retinal blood-flow imaging.

    PubMed

    Cheng, Haiying; Yan, Yumei; Duong, Timothy Q

    2008-07-01

    Temporal statistical analysis of laser speckle imaging (TS-LSI) preserves the original spatial resolution, in contrast to conventional spatial statistical analysis (SS-LSI). Concerns have been raised regarding the temporal independence of TS-LSI signals and their insensitivity to stationary-speckle contamination. Our results from flow phantoms and in vivo rat retinas demonstrate that TS-LSI signals are temporally statistically independent and that TS-LSI minimizes stationary-speckle contamination. The latter is because stationary speckle is "non-random" and thus non-ergodic: the temporal average of stationary speckle need not equal its spatial ensemble average. TS-LSI detects blood flow in smaller blood vessels and is less susceptible to stationary-speckle artifacts. PMID:18607429
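
    A minimal sketch of the temporal statistic itself: the per-pixel speckle contrast K = sigma_t / mean_t over a frame stack, which preserves full spatial resolution, unlike the sliding-window spatial statistics of conventional LSI. The image stack is synthetic.

```python
import numpy as np

# Synthetic speckle stack of 64 frames, 128x128 pixels (axis 0 = time).
rng = np.random.default_rng(9)
frames = rng.gamma(shape=4.0, scale=25.0, size=(64, 128, 128))

mean_t = frames.mean(axis=0)
std_t = frames.std(axis=0, ddof=1)
K = std_t / mean_t     # lower K = faster flow (more temporal blurring)

print(f"temporal contrast: median K = {np.median(K):.3f}")
```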

  7. Landing Site Dispersion Analysis and Statistical Assessment for the Mars Phoenix Lander

    NASA Technical Reports Server (NTRS)

    Bonfiglio, Eugene P.; Adams, Douglas; Craig, Lynn; Spencer, David A.; Strauss, William; Seelos, Frank P.; Seelos, Kimberly D.; Arvidson, Ray; Heet, Tabatha

    2008-01-01

    The Mars Phoenix Lander launched on August 4, 2007 and successfully landed on Mars 10 months later, on May 25, 2008. Landing ellipse predictions and hazard maps were key in selecting safe surface targets for Phoenix. Hazard maps were based on terrain slopes, geomorphology maps, and automated rock counts from MRO's High Resolution Imaging Science Experiment (HiRISE) images. The expected landing dispersion that led to the selection of Phoenix's surface target is discussed, as well as the actual landing dispersion predictions determined during operations in the weeks, days, and hours before landing. A statistical assessment of these dispersions is performed, comparing the actual landing-safety probabilities to criteria levied by the project. Also discussed are applications of this statistical analysis that were used by the Phoenix project, including verifying the effectiveness of a pre-planned maneuver menu and calculating the probability of future maneuvers.

  8. Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    2010-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.

  9. Risk assessment of coal production: an information system user's manual. [SAS (Statistical Analysis System) format

    SciTech Connect

    Watson, A.P.; Birchfield, T.E.; Fore, C.S.

    1982-10-01

    A specialized information system comprising all US domestic coal mine and processing plant injuries as reported to the Mine Safety and Health Administration of the US Department of Labor for the years 1975 through 1980 has been developed at Oak Ridge National Laboratory (ORNL) for online and batch users. The data are stored in two principal datasets: (1) annual summaries of accidental injuries and fatalities in both surface and underground bituminous and anthracite mines, as well as information on injuries suffered by workers employed in coal-processing (blending, crushing, etc.) facilities; and (2) annual summaries of employment (person-hours, number of individuals) and production (tons) of each domestic mine or processing facility for which the US Department of Labor has granted an operating permit. There are currently more than 232 000 records available online to interested users. Data are recorded for the following variables: county, state, date of injury, sex of victim, age at time of accident, degree of injury, occupation title at time of injury, activity during injury, location of accident, type of coal, type of mine, type of mining machine, type of accident, source and nature of injury, part of body injured, total mine experience, experience at current mine and job title held at time of injury, and number of days away from work or number of days restricted or charged due to the injury. As these values are organized by FIPS (Federal Information Processing Standards) county code for each reporting facility, compilations may be made on a subregional or substate basis. The datasets have been established in SAS (Statistical Analysis System) format and are readily manipulated by SAS routines available at ORNL. Several appendices are included in the manual to provide the user with a detailed description of all the codes available for data retrieval. Sample retrieval sessions are also incorporated.

  10. A statistical method for estimating rates of soil development and ages of geologic deposits: A design for soil-chronosequence studies

    USGS Publications Warehouse

    Switzer, P.; Harden, J.W.; Mark, R.K.

    1988-01-01

    A statistical method for estimating rates of soil development in a given region based on calibration from a series of dated soils is used to estimate ages of soils in the same region that are not dated directly. The method is designed specifically to account for sampling procedures and uncertainties that are inherent in soil studies. Soil variation and measurement error, uncertainties in calibration dates and their relation to the age of the soil, and the limited number of dated soils are all considered. Maximum likelihood (ML) is employed to estimate a parametric linear calibration curve, relating soil development to time or age on suitably transformed scales. Soil variation on a geomorphic surface of a certain age is characterized by replicate sampling of soils on each surface; such variation is assumed to have a Gaussian distribution. The age of a geomorphic surface is described by older and younger bounds. This technique allows age uncertainty to be characterized by either a Gaussian distribution or by a triangular distribution using minimum, best-estimate, and maximum ages. The calibration curve is taken to be linear after suitable (in certain cases logarithmic) transformations, if required, of the soil parameter and age variables. Soil variability, measurement error, and departures from linearity are described in a combined fashion using Gaussian distributions with variances particular to each sampled geomorphic surface and the number of sample replicates. Uncertainty in age of a geomorphic surface used for calibration is described using three parameters by one of two methods. In the first method, upper and lower ages are specified together with a coverage probability; this specification is converted to a Gaussian distribution with the appropriate mean and variance. In the second method, "absolute" older and younger ages are specified together with a most probable age; this specification is converted to an asymmetric triangular distribution with mode at the
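
    A deliberately simplified sketch of the calibration-and-inversion idea: regress a soil development index on log age for dated surfaces, then invert the fitted line to date an undated soil. The paper's maximum-likelihood treatment of age uncertainty and replicate variance is reduced here to ordinary least squares on replicate means, and all numbers are hypothetical.

```python
import numpy as np

# Calibration surfaces: known ages (yr) and replicate soil-index measurements.
ages = np.array([1e3, 5e3, 2e4, 1e5, 5e5])
index = np.array([[0.9, 1.1], [1.6, 1.4], [2.1, 2.3],
                  [3.0, 2.8], [3.9, 4.1]])

# Linear calibration on a log-transformed age scale.
x = np.log10(ages)
y = index.mean(axis=1)
slope, intercept = np.polyfit(x, y, 1)

# Inverse prediction: date an undated soil from its index value.
unknown = 2.5
est_age = 10 ** ((unknown - intercept) / slope)
print(f"estimated age of undated soil: {est_age:.0f} yr")
```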

  11. Age inclusive services or separate old age and working age services? A historical analysis from the formative years of old age psychiatry c.1940–1989

    PubMed Central

    Hilton, Claire

    2015-01-01

    The Equality Act 2010 made it unlawful to discriminate in the provision of services on the grounds of age. This legislation is open to interpretation, but it is affecting the way older people’s services are defined and provided. Historical evidence indicates that, since the 1940s, apart from those working in dedicated old age services, most psychiatrists were unenthusiastic about working with mentally unwell older people and unsupportive of those who chose to do so. A historical analysis might shed light on current dilemmas about ‘all age’ or ‘old age’ services and inform decision-making on future mental health services. PMID:26191440

  12. Product analysis for polyethylene degradation by radiation and thermal ageing

    NASA Astrophysics Data System (ADS)

    Sugimoto, Masaki; Shimada, Akihiko; Kudoh, Hisaaki; Tamura, Kiyotoshi; Seguchi, Tadao

    2013-01-01

    The oxidation products formed in crosslinked polyethylene cable insulation during thermal and radiation ageing were analyzed by FTIR-ATR. The products comprised carboxylic acid, carboxylic ester, and carboxylic anhydride under all ageing conditions. The relative yields of carboxylic ester and carboxylic anhydride increased with temperature for both radiation and thermal ageing. The carboxylic acid was the primary oxidation product; the ester and anhydride were secondary products formed by thermally induced reactions of the carboxylic acids. The carboxylic acid can be produced by chain scission at any temperature, followed by oxidation of the free radicals formed in the polyethylene. The results of the analysis led to the formulation of a new oxidation mechanism, different from the chain reactions via peroxy radicals and peroxides.

  13. The School to Work Transition of Indigenous Australians: A Review of the Literature and Statistical Analysis.

    ERIC Educational Resources Information Center

    Long, Mike; Frigo, Tracey; Batten, Margaret

    This report describes the current educational and employment situation of Australian Indigenous youth in terms of their pathways from school to work. A literature review and analysis of statistical data identify barriers to successful transition from school to work, including forms of teaching, curriculum, and assessment that pose greater…

  14. Can Percentiles Replace Raw Scores in the Statistical Analysis of Test Data?

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Zumbo, Bruno D.

    2005-01-01

    Educational and psychological testing textbooks typically warn of the inappropriateness of performing arithmetic operations and statistical analysis on percentiles instead of raw scores. This seems inconsistent with the well-established finding that transforming scores to ranks and using nonparametric methods often improves the validity and power…
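
    A quick demonstration of the rank-transform argument on synthetic heavy-tailed data: compare a t-test on the raw scores with a rank-based test on the same samples. The data and shift are illustrative only.

```python
import numpy as np
from scipy.stats import ttest_ind, mannwhitneyu

rng = np.random.default_rng(10)
a = rng.standard_cauchy(30)           # heavy-tailed "raw scores"
b = rng.standard_cauchy(30) + 1.0     # same distribution, shifted

print(f"t-test on raw scores: p = {ttest_ind(a, b).pvalue:.3f}")
print(f"Mann-Whitney (ranks): p = "
      f"{mannwhitneyu(a, b, alternative='two-sided').pvalue:.3f}")
```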

  15. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    ERIC Educational Resources Information Center

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence, and a histogram. The merits of each method are compared. (Contains 7…

  16. 1977-78 Cost Analysis for Florida Schools and Districts. Statistical Report. Series 79-01.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Public Schools.

    This statistical report describes some of the cost analysis information available from computer reports produced by the Florida Department of Education. It reproduces examples of Florida school and school district financial data that can be used by state, district, and school-level administrators as they analyze program costs and expenditures. The…

  17. Relationships between Association of Research Libraries (ARL) Statistics and Bibliometric Indicators: A Principal Components Analysis

    ERIC Educational Resources Information Center

    Hendrix, Dean

    2010-01-01

    This study analyzed 2005-2006 Web of Science bibliometric data from institutions belonging to the Association of Research Libraries (ARL) and corresponding ARL statistics to find any associations between indicators from the two data sets. Principal components analysis on 36 variables from 103 universities revealed obvious associations between…
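
    A minimal PCA sketch of the kind of analysis described: standardize the indicator matrix, extract components, and inspect explained variance and loadings. The matrix is a random stand-in of the same shape (103 universities by 36 variables), not the actual ARL / Web of Science data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
X = rng.normal(size=(103, 36))
X[:, 1] = X[:, 0] + rng.normal(0, 0.3, 103)   # build in one association

pca = PCA(n_components=5).fit(StandardScaler().fit_transform(X))
print("variance explained:", pca.explained_variance_ratio_.round(2))
print("PC1 loadings on correlated pair:", pca.components_[0, :2].round(2))
```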

  18. Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination

    NASA Astrophysics Data System (ADS)

    Chiesa, D.; Previtali, E.; Sisti, M.

    2014-04-01

    In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). To evaluate the neutron flux spectrum, subdivided into energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with a Bayesian statistical analysis that includes the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the statistical model of the problem and solve it. The energy-group fluxes and their uncertainties are then determined with great accuracy, and the correlations between the groups are analyzed. Finally, the dependence of the results on the choice of prior distribution and on the group cross-section data is investigated to confirm the reliability of the analysis.
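
    To make the linear-system structure concrete, here is a sketch of the unfolding problem under the simplifying assumption of a Gaussian prior and likelihood, where the posterior is available in closed form (the paper instead uses hierarchical MCMC); all numbers are hypothetical.

```python
import numpy as np

# Unfolding problem R = A @ phi: grouped cross sections A map the group
# fluxes phi to the measured activation rates R (hypothetical values).
A = np.array([[5.0, 0.5, 0.1],
              [1.0, 2.0, 0.3],
              [0.2, 0.8, 3.0],
              [0.5, 1.5, 1.0]])            # (isotopes x energy groups)
phi_true = np.array([2.0, 1.0, 0.5])
rng = np.random.default_rng(12)
R = A @ phi_true + rng.normal(0, 0.05, 4)  # noisy activation rates

Sigma_inv = np.eye(4) / 0.05**2            # measurement precision
prior_mean = np.ones(3)                    # a priori flux guess
prior_prec = np.eye(3) / 10.0**2           # weak Gaussian prior

# Conjugate-Gaussian posterior: closed-form mean and covariance.
post_cov = np.linalg.inv(prior_prec + A.T @ Sigma_inv @ A)
post_mean = post_cov @ (prior_prec @ prior_mean + A.T @ Sigma_inv @ R)
print("posterior group fluxes:", post_mean.round(3))
print("posterior std:", np.sqrt(np.diag(post_cov)).round(3))
```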

  19. Whatever Happened to Exploratory Data Analysis? An Evaluation of Behavioral Science Statistics Textbooks.

    ERIC Educational Resources Information Center

    Curtis, Deborah A.; Araki, Cheri J.

    The purpose of this research was to analyze recent statistics textbooks in the behavioral sciences in terms of their coverage of exploratory data analysis (EDA) philosophy and techniques. Twenty popular texts were analyzed. EDA philosophy was not addressed in the vast majority of texts. Only three texts had an entire chapter on EDA. None of the…

  20. Audience Diversion Due to Cable Television: A Statistical Analysis of New Data.

    ERIC Educational Resources Information Center

    Park, Rolla Edward

    A statistical analysis of new data suggests that television broadcasting will continue to prosper, despite increasing competition from cable television carrying distant signals. Data on cable and non-cable audiences in 121 counties with well defined signal choice support generalized least squares estimates of two models: total audience and…