Science.gov

Sample records for advanced statistical analysis

  1. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  2. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2010-12-01

    later in this section. 2) San Luis Obispo. Extracted features were also provided for MTADS EM61, MTADS magnetics, EM61 cart, and TEMTADS data sets from... subsequent training of statistical classifiers using these features. Results of discrimination studies at Camp Sibert and San Luis Obispo have shown... Comparison of classification performance: Figures 10 through 13 show receiver operating characteristics for data sets acquired at San Luis Obispo. Subplot...

  3. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  4. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
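
    The D-Optimal designs described in this abstract maximize the determinant of the information matrix det(XᵀX). A toy brute-force sketch (the candidate grid and quadratic response model here are illustrative assumptions, not the paper's actual test matrix):

```python
from itertools import combinations

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def information_det(design):
    """det(X^T X) for a quadratic model y = b0 + b1*x + b2*x^2."""
    rows = [[1.0, x, x*x] for x in design]
    xtx = [[sum(r[i]*r[j] for r in rows) for j in range(3)] for i in range(3)]
    return det3(xtx)

# Candidate test conditions on [0, 1]; exhaustively pick the best 3-point design
candidates = [round(0.1*k, 1) for k in range(11)]
best = max(combinations(candidates, 3), key=information_det)
print(best)  # (0.0, 0.5, 1.0)
```

    For a quadratic model the D-optimal points turn out to be the two endpoints plus the midpoint; production DOE software uses exchange algorithms rather than exhaustive search once the candidate set grows.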

  5. Dynamic statistical optimization of GNSS radio occultation bending angles: advanced algorithm and performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-08-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
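
    The statistical optimization step blends the observed profile with a background profile, weighted by the inverse of their error (co)variances. A minimal single-level, diagonal-covariance sketch with made-up numbers (the actual algorithm operates on full covariance matrices over whole profiles):

```python
def optimize_bending_angle(a_obs, a_bg, var_obs, var_bg):
    """Optimal linear combination of one observed and one background bending
    angle, assuming uncorrelated Gaussian errors:
      a_opt = a_bg + K * (a_obs - a_bg),  K = var_bg / (var_bg + var_obs)."""
    gain = var_bg / (var_bg + var_obs)
    a_opt = a_bg + gain * (a_obs - a_bg)
    var_opt = (1.0 - gain) * var_bg  # posterior variance < min(var_obs, var_bg)
    return a_opt, var_opt

# Illustrative values: a noisy high-altitude observation, a tighter background
a_opt, var_opt = optimize_bending_angle(a_obs=105.0, a_bg=100.0,
                                        var_obs=4.0, var_bg=1.0)
print(a_opt, var_opt)  # 101.0 0.8
```

    The combined variance (0.8) is smaller than either input variance, which is why more realistic covariance estimates translate directly into lower random errors in the initialized profiles.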

  6. Dynamic statistical optimization of GNSS radio occultation bending angles: an advanced algorithm and its performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-01-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS) based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAMP and COSMIC measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction in random errors (standard deviations) of optimized bending angles down to about two-thirds of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.

  7. Intermediate/Advanced Research Design and Statistics

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.

  8. Mathematical and statistical analysis

    NASA Technical Reports Server (NTRS)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  9. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  10. The ERP PCA Toolkit: an open source program for advanced statistical analysis of event-related potential data.

    PubMed

    Dien, Joseph

    2010-03-15

    This article presents an open source Matlab program, the ERP PCA (EP) Toolkit, for facilitating the multivariate decomposition and analysis of event-related potential data. This program is intended to supplement existing ERP analysis programs by providing functions for conducting artifact correction, robust averaging, referencing and baseline correction, data editing and visualization, principal components analysis, and robust inferential statistical analysis. This program subserves three major goals: (1) optimizing analysis of noisy data, such as clinical or developmental; (2) facilitating the multivariate decomposition of ERP data into its constituent components; (3) increasing the transparency of analysis operations by providing direct visualization of the corresponding waveforms.
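
    Robust averaging limits the influence of outlier trials on the ERP average. One simple robust estimator is a trimmed mean across trials (a sketch only, with invented voltages; the EP Toolkit's actual routines are more sophisticated):

```python
def trimmed_mean(values, trim=0.2):
    """Mean after discarding the lowest and highest `trim` fraction of values."""
    v = sorted(values)
    k = int(len(v) * trim)
    kept = v[k:len(v) - k]
    return sum(kept) / len(kept)

# Voltage at one time point across 10 trials; one trial holds a large artifact
trials = [2.0, 1.9, 2.1, 1.5, 2.0, 40.0, 1.9, 2.4, 2.1, 1.8]
print(round(trimmed_mean(trials), 2))  # 2.0 -- the artifact is trimmed away
```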

  11. Statistical Energy Analysis Program

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.

    1985-01-01

    Statistical Energy Analysis (SEA) is a powerful tool for estimating the high-frequency vibration spectra of complex structural systems, and it has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.

  12. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  13. Analysis of the two-dimensional turbulence in pure electron plasmas by means of advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Romé, M.; Lepreti, F.; Maero, G.; Pozzoli, R.; Vecchio, A.; Carbone, V.

    2013-03-01

    Highly magnetized, pure electron plasmas confined in a Penning-Malmberg trap allow one to perform experiments on the two-dimensional (2D) fluid dynamics under conditions where non-ideal effects are almost negligible. Recent results on the freely decaying 2D turbulence obtained from experiments with electron plasmas performed in the Penning-Malmberg trap ELTRAP are presented. The analysis has been applied to experimental sequences with different types of initial density distributions. The dynamical properties of the system have been investigated by means of wavelet transforms and Proper Orthogonal Decomposition (POD). The wavelet analysis shows that most of the enstrophy is contained at spatial scales corresponding to the typical size of the persistent vortices in the 2D electron plasma flow. The POD analysis allows one to identify the coherent structures which give the dominant contribution to the plasma evolution. The statistical properties of the turbulence have been investigated by means of Probability Density Functions (PDFs) and structure functions of spatial vorticity increments. The analysis evidences how the shape and evolution of the dominant coherent structures and the intermittency properties of the turbulence strongly depend on the initial conditions for the electron density.
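
    POD identifies coherent structures as eigenvectors of the data covariance matrix, ordered by the energy they carry. A minimal power-iteration sketch on a toy 2x2 covariance (illustrative only, not the ELTRAP analysis):

```python
import math

def dominant_mode(cov, iters=200):
    """Leading eigenvalue/eigenvector of a symmetric 2x2 matrix, by power iteration."""
    v = [1.0, 0.0]
    for _ in range(iters):
        w = [cov[0][0]*v[0] + cov[0][1]*v[1],
             cov[1][0]*v[0] + cov[1][1]*v[1]]
        norm = math.hypot(w[0], w[1])
        v = [w[0] / norm, w[1] / norm]
    # Rayleigh quotient gives the corresponding eigenvalue ("modal energy")
    av = [cov[0][0]*v[0] + cov[0][1]*v[1],
          cov[1][0]*v[0] + cov[1][1]*v[1]]
    lam = v[0]*av[0] + v[1]*av[1]
    return lam, v

cov = [[2.0, 1.0], [1.0, 2.0]]  # toy covariance between two density probes
lam, mode = dominant_mode(cov)
print(round(lam, 6))  # 3.0
```

    Real POD on a field of measurements does the same thing at scale, usually via the singular value decomposition of the snapshot matrix.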

  14. Advance Report of Final Mortality Statistics, 1985.

    ERIC Educational Resources Information Center

    Monthly Vital Statistics Report, 1987

    1987-01-01

    This document presents mortality statistics for 1985 for the entire United States. Data analysis and discussion of these factors is included: death and death rates; death rates by age, sex, and race; expectation of life at birth and at specified ages; causes of death; infant mortality; and maternal mortality. Highlights reported include: (1) the…

  15. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
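
    The mediation analysis mentioned here is often estimated as a product of coefficients: path a (effect of treatment X on mediator M) times path b (effect of M on outcome Y, adjusting for X). A sketch with illustrative noiseless data (not a substitute for the standard-error machinery the paper surveys):

```python
def cov(u, v):
    """Sample covariance."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

def mediated_effect(X, M, Y):
    """Product-of-coefficients indirect effect a*b.
    a: OLS slope of M on X; b: slope of Y on M adjusting for X (2-predictor OLS)."""
    a = cov(X, M) / cov(X, X)
    sxx, sxm, smm = cov(X, X), cov(X, M), cov(M, M)
    sxy, smy = cov(X, Y), cov(M, Y)
    b = (sxx * smy - sxm * sxy) / (sxx * smm - sxm * sxm)
    return a * b

# Data generated as M = 2X + e, Y = X + 3M: true indirect effect is 2*3 = 6
X = [0.0, 1.0, 2.0, 3.0]
M = [1.0, 1.0, 3.0, 7.0]
Y = [3.0, 4.0, 11.0, 24.0]
print(round(mediated_effect(X, M, Y), 6))  # 6.0
```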

  16. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J.

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.

  17. Deterministic and Advanced Statistical Modeling of Wind-Driven Sea

    DTIC Science & Technology

    2015-07-06

    Technical Report, 01/09/2010 - 06/07/2015. Deterministic and advanced statistical modeling of wind-driven sea. Vladimir Zakharov, Andrei Pushkarev, Waves and Solitons LLC... Development of accurate and fast advanced statistical and dynamical nonlinear models of ocean surface waves, based on first physical principles, which will...

  18. Statistical data analysis

    SciTech Connect

    Hahn, A.A.

    1994-11-01

    The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques.

  19. Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge

    ERIC Educational Resources Information Center

    Haines, Brenna

    2015-01-01

    The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…

  20. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  1. Enhanced bio-manufacturing through advanced multivariate statistical technologies.

    PubMed

    Martin, E B; Morris, A J

    2002-11-13

    The paper describes the interrogation of data from a reaction vessel producing an active pharmaceutical ingredient (API), using advanced multivariate statistical techniques. Due to the limited number of batches available, data augmentation was used to increase the number of batches, thereby enabling the extraction of more subtle process behaviour from the data. A second methodology investigated was multi-group modelling. This allowed between-cluster variability to be removed, allowing attention to focus on within-process variability. The paper describes how the different approaches enabled a better understanding of the factors causing the onset of impurity formation, as well as demonstrating the power of multivariate statistical data analysis techniques to provide an enhanced understanding of the process.

  2. A Hierarchical Statistic Methodology for Advanced Memory System Evaluation

    SciTech Connect

    Sun, X.-J.; He, D.; Cameron, K.W.; Luo, Y.

    1999-04-12

    Advances in technology have resulted in a widening of the gap between computing speed and memory access time. Data access time has become increasingly important for computer system design. Various hierarchical memory architectures have been developed, but the performance of these advanced memory systems varies with applications and problem sizes, and an optimal cost/performance design still eludes researchers. In this study, the authors introduce an evaluation methodology for advanced memory systems. This methodology is based on statistical factorial analysis and performance scalability analysis. It is twofold: it first determines the impact of memory systems and application programs on overall performance; it then identifies the bottleneck in a memory hierarchy and provides cost/performance comparisons via scalability analysis. Different memory systems can be compared in terms of mean performance or scalability over a range of codes and problem sizes. Experimental testing has been performed extensively on the Department of Energy's Accelerated Strategic Computing Initiative (ASCI) machines and benchmarks available at the Los Alamos National Laboratory to validate this newly proposed methodology. Experimental and analytical results show that this methodology is simple and effective. It is a practical tool for memory system evaluation and design. Its extension to general architectural evaluation and to parallel computer systems is possible and should be further explored.
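
    The factorial-analysis half of the methodology quantifies how much each factor moves the response. A minimal 2x2 factorial sketch with invented runtimes (the real study uses ASCI machines and benchmarks, not these numbers):

```python
# Runtimes (seconds, illustrative) for a 2x2 factorial design;
# keys are (cache level, problem size)
runs = {
    ("low", "small"): 10.0, ("low", "large"): 26.0,
    ("high", "small"): 6.0, ("high", "large"): 14.0,
}

def main_effect(runs, factor_index, high, low):
    """Mean response at a factor's high level minus mean at its low level."""
    hi = [t for k, t in runs.items() if k[factor_index] == high]
    lo = [t for k, t in runs.items() if k[factor_index] == low]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

cache_effect = main_effect(runs, 0, "high", "low")
problem_effect = main_effect(runs, 1, "large", "small")
print(cache_effect, problem_effect)  # -8.0 12.0
```

    In this toy example problem size dominates the cache effect, which is exactly the kind of conclusion the factorial decomposition is designed to make explicit.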

  3. Advanced Algorithms and Statistics for MOS Surveys

    NASA Astrophysics Data System (ADS)

    Bolton, A. S.

    2016-10-01

    This paper presents an individual view on the current state of computational data processing and statistics for inference and discovery in multi-object spectroscopic surveys, supplemented by a historical perspective and a few present-day applications. It is more op-ed than review, and hopefully more readable as a result.

  4. AMOVA ["Accumulative Manifold Validation Analysis"]: An Advanced Statistical Methodology Designed to Measure and Test the Validity, Reliability, and Overall Efficacy of Inquiry-Based Psychometric Instruments

    ERIC Educational Resources Information Center

    Osler, James Edward, II

    2015-01-01

    This monograph provides an epistemological rationale for the Accumulative Manifold Validation Analysis [also referred to by the acronym "AMOVA"] statistical methodology designed to test psychometric instruments. This form of inquiry is a form of mathematical optimization in the discipline of linear stochastic modelling. AMOVA is an in-depth…

  5. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will perform an ANOVA to check its significance.
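
    Several of these calculations map directly onto Python's statistics module; a sketch of the "Normal Distribution Estimates" step (the value corresponding to a cumulative probability, given a sample mean and standard deviation; the data below are illustrative):

```python
from statistics import NormalDist, mean, stdev

data = [9.2, 10.1, 9.8, 10.4, 9.9, 10.6, 9.5, 10.3]  # illustrative sample
m, s = mean(data), stdev(data)

# Value below which 97.5% of the fitted normal population falls
x975 = NormalDist(mu=m, sigma=s).inv_cdf(0.975)
print(round(m, 3), round(x975, 3))  # 9.975 10.899
```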

  6. Statistical Analysis of RNA Backbone

    PubMed Central

    Hershkovitz, Eli; Sapiro, Guillermo; Tannenbaum, Allen; Williams, Loren Dean

    2009-01-01

    Local conformation is an important determinant of RNA catalysis and binding. The analysis of RNA conformation is particularly difficult due to the large number of degrees of freedom (torsion angles) per residue. Proteins, by comparison, have many fewer degrees of freedom per residue. In this work, we use and extend classical tools from statistics and signal processing to search for clusters in RNA conformational space. Results are reported both for scalar analysis, where each torsion angle is separately studied, and for vectorial analysis, where several angles are simultaneously clustered. Adapting techniques from vector quantization and clustering to the RNA structure, we find torsion angle clusters and RNA conformational motifs. We validate the technique using well-known conformational motifs, showing that the simultaneous study of the total torsion angle space leads to results consistent with known motifs reported in the literature and also to the finding of new ones. PMID:17048391

  7. Entropy in statistical energy analysis.

    PubMed

    Le Bot, Alain

    2009-03-01

    In this paper, the second principle of thermodynamics is discussed in the framework of statistical energy analysis (SEA). It is shown that the "vibrational entropy" and the "vibrational temperature" of sub-systems depend only on the vibrational energy and the number of resonant modes. A SEA system can be described as a thermodynamic system slightly out of equilibrium. In the steady-state condition, the entropy exchanged with the exterior by sources and dissipation exactly balances the production of entropy by irreversible processes at the interfaces between SEA sub-systems.

  8. Statistical analysis of pyroshock data

    NASA Astrophysics Data System (ADS)

    Hughes, William O.

    2002-05-01

    The sample size of aerospace pyroshock test data is typically small. This often forces the engineer to make assumptions on its population distribution and to use conservative margins or methodologies in determining shock specifications. For example, the maximum expected environment is often derived by adding 3-6 dB to the maximum envelope of a limited amount of shock data. The recent availability of a large amount of pyroshock test data has allowed a rare statistical analysis to be performed. Findings and procedures from this analysis will be explained, including information on population distributions, procedures to properly combine families of test data, and methods of deriving appropriate shock specifications for a multipoint shock source.
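
    The dB margin mentioned above acts multiplicatively on amplitude: adding k dB scales the envelope by 10^(k/20). A quick sketch with illustrative envelope values:

```python
def add_db_margin(envelope, margin_db):
    """Scale a shock spectrum envelope by an amplitude-dB margin."""
    factor = 10.0 ** (margin_db / 20.0)
    return [g * factor for g in envelope]

envelope_g = [100.0, 300.0, 1000.0]      # maximum envelope of test data, in g
spec_g = add_db_margin(envelope_g, 6.0)  # +6 dB is roughly a factor of 2
print([round(g, 1) for g in spec_g])     # [199.5, 598.6, 1995.3]
```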

  9. Statistical Analysis of Protein Ensembles

    NASA Astrophysics Data System (ADS)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data piles up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially as the vast majority of the currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.

  10. New advanced tools for combined ULF wave analysis of multipoint space-borne and ground observations: application to single event and statistical studies

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Papadimitriou, C.; Daglis, I. A.; Georgiou, M.; Giamini, S. A.

    2013-12-01

    In the past decade, a critical mass of high-quality scientific data on the electric and magnetic fields in the Earth's magnetosphere and topside ionosphere has been progressively collected. This data pool will be further enriched by the measurements of the upcoming ESA/Swarm mission, a constellation of three satellites in three different polar orbits between 400 and 550 km altitude, which is expected to be launched in November 2013. New analysis tools that can cope with measurements of various spacecraft at various regions of the magnetosphere and in the topside ionosphere as well as ground stations will effectively enhance the scientific exploitation of the accumulated data. Here, we report on a new suite of algorithms based on a combination of wavelet spectral methods and artificial neural network techniques and demonstrate the applicability of our recently developed analysis tools both for individual case studies and statistical studies of ultra-low frequency (ULF) waves. First, we provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz) and Pc4-5 (1-22 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA and GIMA magnetometer networks. Then, we perform a statistical study of Pc3 wave events observed by CHAMP for the full decade (2001-2010) of the satellite vector magnetic data: the creation of a database of such events enabled us to derive valuable statistics for many important physical properties relating to the spatio-temporal location of these waves, the wave power and frequency, as well as other parameters and their correlation with solar wind conditions, magnetospheric indices, electron density data, ring current decay

  11. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  12. Archaeological applications of laser-induced breakdown spectroscopy: an example from the Coso Volcanic Field, California, using advanced statistical signal processing analysis

    SciTech Connect

    Remus, Jeremiah J.; Gottfried, Jennifer L.; Harmon, Russell S.; Draucker, Anne; Baron, Dirk; Yohe, Robert

    2010-05-01

    of the classifier setup considered in this study include the training/testing routine (a 27-fold leave-one-sample-out setup versus a simple split of the data into separate sets for training and evaluation), the number of latent variables used in the regression model, and whether PLSDA operating on the entire broadband LIBS spectrum is superior to that using only a selected subset of LIBS emission lines. The results point to the robustness of the PLSDA technique and suggest that LIBS analysis combined with the appropriate statistical signal processing has the potential to be a useful tool for chemical analysis of archaeological artifacts and geological specimens.

  13. Statistical Analysis of Tsunami Variability

    NASA Astrophysics Data System (ADS)

    Zolezzi, Francesca; Del Giudice, Tania; Traverso, Chiara; Valfrè, Giulio; Poggi, Pamela; Parker, Eric J.

    2010-05-01

    similar to that seen in ground motion attenuation correlations used for seismic hazard assessment. The second issue was intra-event variability. This refers to the differences in tsunami wave run-up along a section of coast during a single event. Intra-event variability was investigated directly by considering field observations. The tsunami events used in the statistical evaluation were selected on the basis of the completeness and reliability of the available data. Tsunamis considered for the analysis included the recent and well-surveyed Boxing Day 2004 tsunami (Great Indian Ocean Tsunami), Java 2006, Okushiri 1993, Kocaeli 1999, Messina 1908, and a case study of several historic events in Hawaii. Basic statistical analysis was performed on the field observations from these tsunamis. For events with very wide survey regions, the run-up heights were grouped in order to maintain a homogeneous distance from the source. Where more than one survey was available for a given event, the original datasets were maintained separately to avoid combining non-homogeneous data. The observed run-up measurements were used to evaluate the minimum, maximum, average, standard deviation and coefficient of variation for each data set. The minimum coefficient of variation was 0.12, measured for the 2004 Boxing Day tsunami at Nias Island (7 data points), while the maximum was 0.98 for the Okushiri 1993 event (93 data points). The average coefficient of variation is of the order of 0.45.
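
    The coefficient of variation reported above is simply the standard deviation divided by the mean of the run-up observations. A sketch with illustrative run-up heights (not the survey data):

```python
from statistics import mean, stdev

def runup_summary(heights_m):
    """Min, max, mean, standard deviation and coefficient of variation."""
    m, s = mean(heights_m), stdev(heights_m)
    return {"min": min(heights_m), "max": max(heights_m),
            "mean": m, "std": s, "cv": s / m}

heights = [2.0, 4.0, 4.0, 6.0]  # illustrative run-up measurements, metres
stats = runup_summary(heights)
print(round(stats["cv"], 3))  # 0.408
```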

  14. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1992-01-01

    Asymptotic Modal Analysis (AMA) is a method which is used to model linear dynamical systems with many participating modes. The AMA method was originally developed to show the relationship between statistical energy analysis (SEA) and classical modal analysis (CMA). In the limit of a large number of modes of a vibrating system, the classical modal analysis result can be shown to be equivalent to the statistical energy analysis result. As the CMA result evolves into the SEA result, a number of systematic assumptions are made. Most of these assumptions are based upon the supposition that the number of modes approaches infinity. It is for this reason that the term 'asymptotic' is used. AMA is the asymptotic result of taking the limit of CMA as the number of modes approaches infinity. AMA refers to any of the intermediate results between CMA and SEA, as well as the SEA result which is derived from CMA. The main advantage of the AMA method is that individual modal characteristics are not required in the model or computations. By contrast, CMA requires that each modal parameter be evaluated at each frequency. In the latter, contributions from each mode are computed and the final answer is obtained by summing over all the modes in the particular band of interest. AMA evaluates modal parameters only at their center frequency and does not sum the individual contributions from each mode in order to obtain a final result. The method is similar to SEA in this respect. However, SEA is only capable of obtaining spatial averages or means, as it is a statistical method. Since AMA is systematically derived from CMA, it can obtain local spatial information as well.

  15. Evaluation of Long-term Outcomes of Correction of Severe Blepharoptosis with Advancement of External Levator Muscle Complex: Descriptive Statistical Analysis of the Results

    PubMed Central

    INNOCENTI, ALESSANDRO; MORI, FRANCESCO; MELITA, DARIO; DREASSI, EMANUELA; CIANCIO, FRANCESCO; INNOCENTI, MARCO

    2017-01-01

    Aim: Evaluation of long-term results after aponeurotic blepharoptosis correction with external levator muscle complex advancement. Patients and Methods: We carried out a retrospective study with medical record review of 20 patients (40 eyes) affected by bilateral moderate-to-severe aponeurotic ptosis who underwent primary surgery between January 2010 and December 2013. Criteria for outcome evaluation included 3-year postoperative follow-up of upper margin reflex distance (uMRD) and symmetry. Results: 3-year postoperative follow-up showed 17 (85%) cases of successful correction of ptosis, and three cases (15%) showed partial success. Two eyes showed hypocorrection, while one eye was overcorrected. Symmetry was maintained in all patients except for the oldest. Conclusion: External superior levator advancement is an effective procedure for moderate and severe aponeurotic blepharoptosis correction, and establishes good long-term eyelid position and symmetry. PMID:28064228

  16. Multivariate statistical analysis of wildfires in Portugal

    NASA Astrophysics Data System (ADS)

    Costa, Ricardo; Caramelo, Liliana; Pereira, Mário

    2013-04-01

    Several studies demonstrate that wildfires in Portugal present high temporal and spatial variability as well as cluster behavior (Pereira et al., 2005, 2011). This study aims to contribute to the characterization of the fire regime in Portugal with a multivariate statistical analysis of the time series of number of fires and area burned in Portugal during the 1980-2009 period. The data used in the analysis is an extended version of the Rural Fire Portuguese Database (PRFD) (Pereira et al., 2011), provided by the National Forest Authority (Autoridade Florestal Nacional, AFN), the Portuguese Forest Service, which includes information for more than 500,000 fire records. There are many advanced techniques for examining the relationships among multiple time series at the same time (e.g., canonical correlation analysis, principal components analysis, factor analysis, path analysis, multiple analysis of variance, clustering systems). This study compares and discusses the results obtained with these different techniques. Pereira, M.G., Trigo, R.M., DaCamara, C.C., Pereira, J.M.C., Leite, S.M., 2005: "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology. 129, 11-25. Pereira, M. G., Malamud, B. D., Trigo, R. M., and Alves, P. I.: The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, doi:10.5194/nhess-11-3343-2011, 2011. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692, the project FLAIR (PTDC/AAC-AMB/104702/2008) and the EU 7th Framework Program through FUME (contract number 243888).

  17. Recent Advances in Morphological Cell Image Analysis

    PubMed Central

    Chen, Shengyong; Zhao, Mingzhu; Wu, Guang; Yao, Chunyan; Zhang, Jianwei

    2012-01-01

    This paper summarizes recent advances in image processing methods for morphological cell analysis. The topic of morphological analysis has received much attention with the increasing demands in both bioinformatics and biomedical applications. Among the many factors that affect the diagnosis of a disease, morphological cell analysis and statistics have made great contributions to a doctor's findings. Morphological cell analysis addresses cellular shape, cellular regularity, classification, statistics, diagnosis, and so forth. In the last 20 years, about 1000 publications have reported the use of morphological cell analysis in biomedical research. Relevant solutions encompass a rather wide application area, such as cell clump segmentation, morphological characteristic extraction, 3D reconstruction, abnormal cell identification, and statistical analysis. These reports are summarized in this paper to enable easy referral to suitable methods for practical solutions. Representative contributions and future research trends are also addressed. PMID:22272215

  18. Statistical Power in Meta-Analysis

    ERIC Educational Resources Information Center

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  19. Towards a Judgement-Based Statistical Analysis

    ERIC Educational Resources Information Center

    Gorard, Stephen

    2006-01-01

    There is a misconception among social scientists that statistical analysis is somehow a technical, essentially objective, process of decision-making, whereas other forms of data analysis are judgement-based, subjective and far from technical. This paper focuses on the former part of the misconception, showing, rather, that statistical analysis…

  20. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1988-01-01

    Statistical Energy Analysis (SEA) is defined by considering the asymptotic limit of Classical Modal Analysis, an approach called Asymptotic Modal Analysis (AMA). The general approach is described for both structural and acoustical systems. The theoretical foundation is presented for structural systems, and experimental verification is presented for a structural plate responding to a random force. Work accomplished subsequent to the grant initiation focusses on the acoustic response of an interior cavity (i.e., an aircraft or spacecraft fuselage) with a portion of the wall vibrating in a large number of structural modes. First results were presented at the ASME Winter Annual Meeting in December, 1987, and accepted for publication in the Journal of Vibration, Acoustics, Stress and Reliability in Design. It is shown that asymptotically as the number of acoustic modes excited becomes large, the pressure level in the cavity becomes uniform except at the cavity boundaries. However, the mean square pressure at the cavity corner, edge and wall is, respectively, 8, 4, and 2 times the value in the cavity interior. Also it is shown that when the portion of the wall which is vibrating is near a cavity corner or edge, the response is significantly higher.

  1. Statistical analysis of histopathological endpoints.

    PubMed

    Green, John W; Springer, Timothy A; Saulnier, Amy N; Swintek, Joe

    2014-05-01

    Histopathological assessments of fish from aquatic ecotoxicology studies are being performed with increasing frequency. Aquatic ecotoxicology studies performed for submission to regulatory agencies are usually conducted with multiple subjects (e.g., fish) in each of multiple vessels (replicates) within a water control and within each of several concentrations of a test substance. A number of histopathological endpoints are evaluated in each fish, and a severity score is generally recorded for each endpoint. The severity scores are often recorded using a nonquantitative scale of 0 to 4, with 0 indicating no effect, 1 indicating minimal effect, through 4 for severe effect. Statistical methods often used to analyze these scores suffer from several shortcomings: computing average scores as though scores were quantitative values, considering only the frequency of abnormality while ignoring severity, ignoring any concentration-response trend, and ignoring the possible correlation between responses of individuals within test vessels. A new test, the Rao-Scott Cochran-Armitage by Slices (RSCABS), is proposed that incorporates the replicate vessel experimental design and the biological expectation that the severity of the effect tends to increase with increasing doses or concentrations, while retaining the individual subject scores and taking into account the severity as well as frequency of scores. A power simulation and examples demonstrate the performance of the test. R-based software has been developed to carry out this test and is available free of charge at www.epa.gov/med/Prods_Pubs/rscabs.htm. The SAS-based RSCABS software is available from the first and third authors.
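    RSCABS builds on the Cochran-Armitage trend test, applied "by slices" of the severity scale. As a minimal sketch of the underlying trend statistic only (not the EPA's RSCABS implementation, which additionally accounts for replicate vessels via the Rao-Scott adjustment), the following computes the one-sided Cochran-Armitage Z statistic for hypothetical counts of fish at or above a severity threshold:

```python
import math

def cochran_armitage(affected, totals, scores=None):
    """Cochran-Armitage test for a monotone trend in proportions.

    affected[i]: subjects at or above a severity threshold in group i
    totals[i]:   subjects in concentration group i
    scores[i]:   dose scores (defaults to 0, 1, 2, ...)
    Returns the one-sided Z statistic (positive = increasing trend).
    """
    k = len(totals)
    d = scores if scores is not None else list(range(k))
    N = sum(totals)
    pbar = sum(affected) / N
    t = sum(d[i] * (affected[i] - totals[i] * pbar) for i in range(k))
    var = pbar * (1 - pbar) * (
        sum(d[i] ** 2 * totals[i] for i in range(k))
        - sum(d[i] * totals[i] for i in range(k)) ** 2 / N
    )
    return t / math.sqrt(var)

# Hypothetical study: 4 concentrations, 20 fish each, rising incidence
z = cochran_armitage([1, 3, 8, 14], [20, 20, 20, 20])
```

    Slicing repeats this test once per severity cut-point (score >= 1, >= 2, ...), which is how severity information is retained rather than collapsed into a single abnormal/normal frequency.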

  2. Statistical Analysis of DWPF ARG-1 Data

    SciTech Connect

    Harris, S.P.

    2001-03-02

    A statistical analysis of analytical results for ARG-1, an Analytical Reference Glass, blanks, and the associated calibration and bench standards has been completed. These statistics provide a means for DWPF to review the performance of their laboratory as well as identify areas of improvement.

  3. Explorations in Statistics: The Analysis of Change

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas; Williams, Calvin L.

    2015-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This tenth installment of "Explorations in Statistics" explores the analysis of a potential change in some physiological response. As researchers, we often express absolute change as percent change so we can…

  4. Statistical analysis of trypanosomes' motility

    NASA Astrophysics Data System (ADS)

    Zaburdaev, Vasily; Uppaluri, Sravanti; Pfohl, Thomas; Engstler, Markus; Stark, Holger; Friedrich, Rudolf

    2010-03-01

    The trypanosome is a parasite that causes sleeping sickness. The way it moves in the blood stream and penetrates various obstacles is an area of active research. Our goal was to investigate free trypanosome motion in a planar geometry. Our analysis of trypanosome trajectories reveals that there are two correlation times: one is associated with the fast motion of its body, and the second with the slower rotational diffusion of the trypanosome as a point object. We propose a system of Langevin equations to model such motion. One of its peculiarities is the presence of multiplicative noise, predicting a higher level of noise for a higher velocity of the trypanosome. Theoretical and numerical results give a comprehensive description of the experimental data, such as the mean squared displacement, velocity distribution, and auto-correlation function.
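    A toy simulation in the spirit of the model described can be sketched as follows. The specific equations here (a speed process whose noise term scales with the speed itself, plus slow rotational diffusion of the heading) are illustrative assumptions, not the authors' Langevin system; the point is only that such a model yields a mean squared displacement that grows with lag time.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n = 0.01, 20000
v0, tau, sigma = 1.0, 0.5, 0.3   # mean speed, relaxation time, noise strength
Dr = 0.2                          # rotational diffusion constant

v = np.empty(n)
theta = np.empty(n)
v[0], theta[0] = v0, 0.0
for i in range(1, n):
    # speed: relaxation to v0 with multiplicative noise (noise grows with v)
    v[i] = (v[i-1] - (v[i-1] - v0) / tau * dt
            + sigma * v[i-1] * np.sqrt(dt) * rng.standard_normal())
    # heading: slow rotational diffusion
    theta[i] = theta[i-1] + np.sqrt(2 * Dr * dt) * rng.standard_normal()

x = np.cumsum(v * np.cos(theta)) * dt
y = np.cumsum(v * np.sin(theta)) * dt

def msd(x, y, lags):
    """Mean squared displacement at the given lag indices."""
    return np.array([np.mean((x[l:] - x[:-l]) ** 2 + (y[l:] - y[:-l]) ** 2)
                     for l in lags])

m = msd(x, y, [10, 100, 1000])
```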

  5. Statistical analysis principles for Omics data.

    PubMed

    Dunkler, Daniela; Sánchez-Cabo, Fátima; Heinze, Georg

    2011-01-01

    In Omics experiments, typically thousands of hypotheses are tested simultaneously, each based on very few independent replicates. Traditional tests like the t-test were shown to perform poorly with this new type of data. Furthermore, the simultaneous consideration of many hypotheses, each prone to a decision error, requires powerful adjustments for this multiple testing situation. After a general introduction to statistical testing, we present the moderated t-statistic, the SAM statistic, and the RankProduct statistic, which have been developed to evaluate hypotheses in typical Omics experiments. We also provide an introduction to the multiple testing problem and discuss some state-of-the-art procedures to address this issue. The presented test statistics are subjected to a comparative analysis of a microarray experiment comparing tissue samples of two groups of tumors. All calculations can be done using the freely available statistical software R. Accompanying commented code is available at: http://www.meduniwien.ac.at/msi/biometrie/MIMB.
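    As an illustration of the multiple testing adjustments discussed, here is a minimal sketch of the Benjamini-Hochberg false discovery rate procedure, one standard approach to the problem (shown here in Python rather than the chapter's R code; the p-values are a textbook-style illustrative set):

```python
import numpy as np

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg adjusted p-values (controls the FDR)."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)   # p_(i) * m / i
    # enforce monotonicity from the largest p-value down
    ranked = np.minimum.accumulate(ranked[::-1])[::-1]
    adjusted = np.empty(m)
    adjusted[order] = np.clip(ranked, 0, 1)
    return adjusted

adj = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205])
```

    Hypotheses whose adjusted p-value falls below the chosen FDR level (e.g. 0.05) are rejected.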

  6. Singularity analysis and robust neighborhood statistics

    NASA Astrophysics Data System (ADS)

    Zuo, Renguang

    2015-04-01

    Neighborhood statistics, computed from data within small neighborhoods, have the advantages of revealing more detailed local structures and spatial variations of spatial patterns, and they provide less biased information than global statistics. However, neighborhood statistics are influenced by the size of the neighborhood. Singularity analysis can be regarded as a type of robust neighborhood statistics: it measures the gradient of relative change within small neighborhoods. The value of the singularity index at a location z depends little on the element concentration at that location itself, but rather on the changes around z. From the viewpoint of multifractal theory, the singularity index is independent of the size of the neighborhood. Singularity analysis is a powerful tool for identifying geochemical and geophysical anomalies in mineral exploration. Recent studies have demonstrated that singularity analysis can detect weak geochemical anomalies related to mineralization that are attenuated by the decaying and masking effects of covers.
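    A minimal sketch of how a local singularity index can be estimated, assuming the common window-based definition in which the window mass scales as mu(r) ∝ r^alpha, with alpha obtained as a log-log slope. The gridded field here is hypothetical; for a constant (non-singular) 2D field the index equals 2 exactly:

```python
import numpy as np

def singularity_index(grid, i, j, radii=(1, 2, 3, 4)):
    """Estimate the local singularity index alpha at cell (i, j).

    Uses mu(r) = (mean concentration in window) * (2r+1)^2 and fits
    the slope of log mu against log window size; alpha = slope.
    """
    sizes, masses = [], []
    for r in radii:
        window = grid[i - r:i + r + 1, j - r:j + r + 1]
        size = 2 * r + 1
        sizes.append(size)
        masses.append(window.mean() * size ** 2)
    slope, _ = np.polyfit(np.log(sizes), np.log(masses), 1)
    return slope

# A constant (non-singular) field has alpha = 2
flat = np.ones((11, 11))
alpha = singularity_index(flat, 5, 5)
```

    Values of alpha below 2 indicate local enrichment (a positive singularity), which is what flags weak anomalies under cover.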

  7. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The problem of small sample sizes encountered in the analysis of space-flight data is examined. Because of the limited amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on considerations needed to choose the most appropriate test for a given type of analysis.
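    A hedged sketch of the test-selection logic the abstract describes: check the normality assumption first, then choose a parametric or nonparametric two-sample test accordingly. The data and the particular Shapiro-Wilk / Mann-Whitney pairing are illustrative choices, not taken from the original report:

```python
import numpy as np
from scipy import stats

def compare_small_samples(a, b, alpha=0.05):
    """Compare two small samples, checking the normality assumption first."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    normal = (stats.shapiro(a).pvalue > alpha) and (stats.shapiro(b).pvalue > alpha)
    if normal:
        # parametric: two-sample t-test
        test, res = "t-test", stats.ttest_ind(a, b)
    else:
        # nonparametric fallback when normality is doubtful
        test, res = "Mann-Whitney U", stats.mannwhitneyu(a, b, alternative="two-sided")
    return test, res.pvalue

# Hypothetical measurements from two flight conditions
test_used, p = compare_small_samples([5.1, 4.8, 5.3, 5.0, 4.9],
                                     [5.6, 5.9, 5.7, 6.0, 5.8])
```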

  8. Schmidt decomposition and multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Bogdanov, Yu. I.; Bogdanova, N. A.; Fastovets, D. V.; Luckichev, V. F.

    2016-12-01

    A new method of multivariate data analysis, based on the complement of a classical probability distribution to a quantum state and on the Schmidt decomposition, is presented. We consider the application of the Schmidt formalism to problems of statistical correlation analysis. The correlation of photons in the output channels of a beam splitter, when the input photon statistics are given by a compound Poisson distribution, is examined. The developed formalism allows us to analyze multidimensional systems, and we have obtained analytical formulas for the Schmidt decomposition of multivariate Gaussian states. It is shown that the mathematical tools of quantum mechanics can significantly improve classical statistical analysis. The presented formalism is a natural approach for the analysis of both classical and quantum multivariate systems and can be applied in various tasks associated with the study of dependences.
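    Numerically, the Schmidt decomposition of a bipartite pure state reduces to a singular value decomposition of the reshaped amplitude matrix. A minimal numpy sketch (the Bell state is a standard textbook illustration, not an example from the paper):

```python
import numpy as np

def schmidt_decompose(psi, dim_a, dim_b):
    """Schmidt decomposition of a bipartite pure state vector."""
    c = np.asarray(psi, complex).reshape(dim_a, dim_b)
    u, s, vh = np.linalg.svd(c)
    return s, u, vh   # Schmidt coefficients and the two orthonormal bases

# Bell state (|00> + |11>)/sqrt(2): two equal Schmidt coefficients
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
coeffs, _, _ = schmidt_decompose(bell, 2, 2)
```

    The squared Schmidt coefficients sum to one and quantify the correlation (entanglement) between the two subsystems: a single nonzero coefficient means no correlation, equal coefficients mean maximal correlation.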

  9. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  10. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
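    A minimal sketch of a two-parameter Weibull fit to fracture-strength data, using synthetic strengths in place of the A357-T6 measurements; the location parameter is fixed at zero to obtain the two-parameter form:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic fracture strengths (MPa) drawn from a known Weibull law
shape_true, scale_true = 8.0, 300.0
strengths = scale_true * rng.weibull(shape_true, size=500)

# Two-parameter maximum-likelihood fit: location fixed at zero
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
```

    The fitted shape parameter (Weibull modulus) measures the scatter of strengths, and the scale parameter is the characteristic strength at which 63.2% of specimens have failed; both feed directly into the kind of probabilistic design methodology the abstract describes.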

  12. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data; techniques for statistical decomposition and analysis of the data; and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.

  13. Advances in sequence analysis.

    PubMed

    Califano, A

    2001-06-01

    In its early days, the entire field of computational biology revolved almost entirely around biological sequence analysis. Over the past few years, however, a number of new non-sequence-based areas of investigation have become mainstream, from the analysis of gene expression data from microarrays, to whole-genome association discovery, and to the reverse engineering of gene regulatory pathways. Nonetheless, with the completion of private and public efforts to map the human genome, as well as those of other organisms, sequence data continue to be a veritable mother lode of valuable biological information that can be mined in a variety of contexts. Furthermore, the integration of sequence data with a variety of alternative information is providing valuable and fundamentally new insight into biological processes, as well as an array of new computational methodologies for the analysis of biological data.

  14. Advanced Economic Analysis

    NASA Technical Reports Server (NTRS)

    Greenberg, Marc W.; Laing, William

    2013-01-01

    An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.

  15. Statistical Analysis Experiment for Freshman Chemistry Lab.

    ERIC Educational Resources Information Center

    Salzsieder, John C.

    1995-01-01

    Describes a laboratory experiment dissolving zinc from galvanized nails in which data can be gathered very quickly for statistical analysis. The data have sufficient significant figures and the experiment yields a nice distribution of random errors. Freshman students can gain an appreciation of the relationships between random error, number of…

  16. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  17. Bayesian Statistics for Biological Data: Pedigree Analysis

    ERIC Educational Resources Information Center

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    The use of Bayes' formula is applied to the biological problem of pedigree analysis to show that the Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First year college students of biology can be introduced to the Bayesian statistics.

  18. MICROARRAY DATA ANALYSIS USING MULTIPLE STATISTICAL MODELS

    EPA Science Inventory

    Microarray Data Analysis Using Multiple Statistical Models

    Wenjun Bao1, Judith E. Schmid1, Amber K. Goetz1, Ming Ouyang2, William J. Welsh2,Andrew I. Brooks3,4, ChiYi Chu3,Mitsunori Ogihara3,4, Yinhe Cheng5, David J. Dix1. 1National Health and Environmental Effects Researc...

  19. Comments: Statistical Analysis for Multisite Trials

    ERIC Educational Resources Information Center

    Bloom, Howard S.

    2012-01-01

    In this article, the author shares his comments on statistical analysis for multisite trials, and focuses on the contribution of Stephen Raudenbush, Sean Reardon, and Takako Nomi to future research. Raudenbush, Reardon, and Nomi provide a major contribution to future research on variation in program impacts by showing how to use multisite trials…

  20. Statistical analysis of extreme river flows

    NASA Astrophysics Data System (ADS)

    Mateus, Ayana; Caeiro, Frederico; Gomes, Dora Prata; Sequeira, Inês J.

    2016-12-01

    Floods are recurrent events that can have a catastrophic impact. In this work we are interested in the analysis of a data set of gauged daily flows from the Whiteadder Water river, Scotland. Using statistical techniques based on extreme value theory, we estimate several extreme value parameters, including extreme quantiles and return periods of high levels.
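    A hedged sketch of the extreme-value workflow described: fit a generalized extreme value (GEV) distribution to annual-maximum flows and read off the 100-year return level as the quantile exceeded with probability 1/100 each year. The data here are synthetic, not the Whiteadder Water record:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic annual-maximum flows (m^3/s); Gumbel-like for illustration
annual_max = stats.genextreme.rvs(c=0.0, loc=100, scale=20, size=200,
                                  random_state=rng)

# Maximum-likelihood GEV fit (shape, location, scale)
c, loc, scale = stats.genextreme.fit(annual_max)

# T-year return level = quantile exceeded with probability 1/T per year
return_level_100 = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)
```

    In practice the block-maxima sample is far shorter than 200 years, so confidence intervals around such return levels are wide and should always be reported alongside the point estimate.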

  1. Statistical Methods in Algorithm Design and Analysis.

    ERIC Educational Resources Information Center

    Weide, Bruce W.

    The use of statistical methods in the design and analysis of discrete algorithms is explored. The introductory chapter contains a literature survey and background material on probability theory. In Chapter 2, probabilistic approximation algorithms are discussed with the goal of exposing and correcting some oversights in previous work. Chapter 3…

  2. Source apportionment advances using polar plots of bivariate correlation and regression statistics

    NASA Astrophysics Data System (ADS)

    Grange, Stuart K.; Lewis, Alastair C.; Carslaw, David C.

    2016-11-01

    This paper outlines the development of enhanced bivariate polar plots that allow the concentrations of two pollutants to be compared using pair-wise statistics for exploring the sources of atmospheric pollutants. The new method combines bivariate polar plots, which provide source characteristic information, with pair-wise statistics that provide information on how two pollutants are related to one another. The pair-wise statistics implemented include weighted Pearson correlation and slope from two linear regression methods. The development uses a Gaussian kernel to locally weight the statistical calculations on a wind speed-direction surface together with variable-scaling. Example applications of the enhanced polar plots are presented by using routine air quality data for two monitoring sites in London, United Kingdom for a single year (2013). The London examples demonstrate that the combination of bivariate polar plots, correlation, and regression techniques can offer considerable insight into air pollution source characteristics, which would be missed if only scatter plots and mean polar plots were used for analysis. Specifically, using correlation and slopes as pair-wise statistics, long-range transport processes were isolated and black carbon (BC) contributions to PM2.5 for a kerbside monitoring location were quantified. Wider applications and future advancements are also discussed.
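    The central computation is a locally weighted Pearson correlation evaluated on the wind speed-direction surface. A minimal sketch of the two pieces (the kernel bandwidths and data below are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def weighted_pearson(x, y, w):
    """Pearson correlation of x and y under observation weights w."""
    w = np.asarray(w, float) / np.sum(w)
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    sx = np.sqrt(np.sum(w * (x - mx) ** 2))
    sy = np.sqrt(np.sum(w * (y - my) ** 2))
    return cov / (sx * sy)

def gaussian_weights(ws, wd, ws0, wd0, h_ws=1.0, h_wd=10.0):
    """Gaussian kernel weights centred on one wind speed/direction cell."""
    ddir = (wd - wd0 + 180) % 360 - 180          # wrap the direction difference
    return np.exp(-0.5 * ((ws - ws0) / h_ws) ** 2 - 0.5 * (ddir / h_wd) ** 2)

# Perfectly linearly related pollutants give r = 1 regardless of weighting
x = np.array([1.0, 2.0, 3.0, 4.0])
w = gaussian_weights(np.array([2.0, 3.0, 4.0, 5.0]),
                     np.array([350.0, 10.0, 20.0, 30.0]), 3.0, 10.0)
r = weighted_pearson(x, 2 * x + 1, w)
```

    Repeating this over a grid of (ws0, wd0) cells produces the pair-wise statistic surface that is then rendered as an enhanced polar plot.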

  3. Explorations in statistics: the analysis of change.

    PubMed

    Curran-Everett, Douglas; Williams, Calvin L

    2015-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This tenth installment of Explorations in Statistics explores the analysis of a potential change in some physiological response. As researchers, we often express absolute change as percent change so we can account for different initial values of the response. But this creates a problem: percent change is really just a ratio, and a ratio is infamous for its ability to mislead. This means we may fail to find a group difference that does exist, or we may find a group difference that does not exist. What kind of an approach to science is that? In contrast, analysis of covariance is versatile: it can accommodate an analysis of the relationship between absolute change and initial value when percent change is useless.
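    The analysis-of-covariance alternative the authors recommend can be sketched as an ordinary linear model of the final response on group and initial value, so the group effect is estimated while adjusting for where each subject started. The data are simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
group = np.repeat([0, 1], n)                  # control vs treatment
initial = rng.normal(100, 10, 2 * n)          # baseline response
# True model: response rises by 5 units in the treated group
final = initial + 5 * group + rng.normal(0, 2, 2 * n)

# ANCOVA as a linear model: final ~ intercept + group + initial
X = np.column_stack([np.ones(2 * n), group, initial])
beta, *_ = np.linalg.lstsq(X, final, rcond=None)
group_effect = beta[1]                        # adjusted treatment effect
```

    Unlike percent change, this estimate does not depend on forming a ratio with the initial value, so it avoids the distortions the abstract warns about.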

  4. Statistical Analysis of Iberian Peninsula Megaliths Orientations

    NASA Astrophysics Data System (ADS)

    González-García, A. C.

    2009-08-01

    Megalithic monuments have been intensively surveyed and studied from the archaeoastronomical point of view in the past decades. We have orientation measurements for over one thousand megalithic burial monuments in the Iberian Peninsula, from several different periods. These data, however, still lack a sound interpretation. A way to classify and begin to understand such orientations is by means of statistical analysis of the data. A first attempt is made with simple statistical variables and a mere comparison between the different areas. In order to minimise the subjectivity of the process, a further, more sophisticated analysis is performed. Some interesting results linking orientation and geographical location will be presented. Finally, I will present some models comparing the orientation of the megaliths in the Iberian Peninsula with the rising of the sun and the moon at several times of the year.
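    Orientation measurements are directional data, so the basic summaries use circular statistics rather than ordinary means. A minimal sketch with hypothetical azimuths (the Rayleigh statistic n·R² tests the null hypothesis of uniformly distributed orientations):

```python
import numpy as np

def circular_summary(azimuths_deg):
    """Mean direction, mean resultant length, and Rayleigh statistic."""
    a = np.radians(np.asarray(azimuths_deg, float))
    C, S = np.mean(np.cos(a)), np.mean(np.sin(a))
    mean_dir = np.degrees(np.arctan2(S, C)) % 360
    R = np.hypot(C, S)              # 1 = tightly clustered, 0 = uniform
    rayleigh_z = a.size * R ** 2    # large z -> reject uniformity
    return mean_dir, R, rayleigh_z

# Hypothetical monument azimuths clustered around due east (90 degrees)
mean_dir, R, z = circular_summary([85, 92, 88, 95, 90, 87, 93, 91])
```

    Averaging azimuths arithmetically would fail near the 0/360 wrap-around, which is why the cosine/sine decomposition is used.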

  5. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, the evaluation of clusters in finance, and, more recently, the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.

  6. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control.
This paper demonstrates the existence

  7. Statistical Tolerance and Clearance Analysis for Assembly

    NASA Technical Reports Server (NTRS)

    Lee, S.; Yi, C.

    1996-01-01

    Tolerance is inevitable because manufacturing exactly equal parts is known to be impossible. Furthermore, the specification of tolerances is an integral part of product design, since tolerances directly affect the assemblability, functionality, manufacturability, and cost effectiveness of a product. In this paper, we present a statistical tolerance and clearance analysis for assembly. Our proposed work is expected to make the following contributions: (i) to help designers evaluate products for assemblability, (ii) to provide a new perspective on tolerance problems, and (iii) to provide a tolerance analysis tool that can be incorporated into a CAD or solid modeling system.
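
    The paper's own formulation is not given in the abstract, but the core idea of statistical (rather than worst-case) stack-up can be sketched as a Monte Carlo clearance analysis; the part dimensions, tolerances, and the ±3-sigma normality assumption below are all hypothetical:

```python
import random
import statistics

# Hypothetical 1-D assembly: three parts stacked end-to-end inside a
# housing; clearance = housing size - sum of part sizes.
random.seed(42)
parts = [(20.0, 0.05), (35.0, 0.08), (15.0, 0.04)]  # (nominal, +/- tol)
housing = (70.4, 0.10)

def sample(nominal, tol):
    # Assume each tolerance band spans +/- 3 sigma of a normal process.
    return random.gauss(nominal, tol / 3.0)

clearances = []
for _ in range(100_000):
    stack = sum(sample(n, t) for n, t in parts)
    clearances.append(sample(*housing) - stack)

mean_c = statistics.fmean(clearances)
sd_c = statistics.stdev(clearances)
p_interference = sum(c < 0 for c in clearances) / len(clearances)
print(f"mean clearance {mean_c:.3f}, sd {sd_c:.3f}, "
      f"P(interference) {p_interference:.4f}")
```

    Unlike a worst-case sum of tolerances, the statistical view yields a clearance distribution, from which assemblability can be quoted as a probability.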

  8. Statistical inference to advance network models in epidemiology.

    PubMed

    Welch, David; Bansal, Shweta; Hunter, David R

    2011-03-01

    Contact networks are playing an increasingly important role in the study of epidemiology. Most of the existing work in this area has focused on considering the effect of underlying network structure on epidemic dynamics by using tools from probability theory and computer simulation. This work has provided much insight on the role that heterogeneity in host contact patterns plays on infectious disease dynamics. Despite the important understanding afforded by the probability and simulation paradigm, this approach does not directly address important questions about the structure of contact networks such as what is the best network model for a particular mode of disease transmission, how parameter values of a given model should be estimated, or how precisely the data allow us to estimate these parameter values. We argue that these questions are best answered within a statistical framework and discuss the role of statistical inference in estimating contact networks from epidemiological data.

  9. Statistical analysis of Contact Angle Hysteresis

    NASA Astrophysics Data System (ADS)

    Janardan, Nachiketa; Panchagnula, Mahesh

    2015-11-01

    We present the results of a new statistical approach to determining Contact Angle Hysteresis (CAH) by studying the nature of the triple line. A statistical distribution of local contact angles on a random three-dimensional drop is used as the basis for this approach. Drops with randomly shaped triple lines but of fixed volumes were deposited on a substrate and their triple line shapes were extracted by imaging. Using a solution developed by Prabhala et al. (Langmuir, 2010), the complete three dimensional shape of the sessile drop was generated. A distribution of the local contact angles for several such drops but of the same liquid-substrate pairs is generated. This distribution is a result of several microscopic advancing and receding processes along the triple line. This distribution is used to yield an approximation of the CAH associated with the substrate. This is then compared with measurements of CAH by means of a liquid infusion-withdrawal experiment. Static measurements are shown to be sufficient to measure quasistatic contact angle hysteresis of a substrate. The approach also points towards the relationship between microscopic triple line contortions and CAH.

  10. Statistical analysis of sleep spindle occurrences.

    PubMed

    Panas, Dagmara; Malinowska, Urszula; Piotrowski, Tadeusz; Żygierewicz, Jarosław; Suffczyński, Piotr

    2013-01-01

    Spindles - a hallmark of stage II sleep - are a transient oscillatory phenomenon in the EEG believed to reflect thalamocortical activity contributing to unresponsiveness during sleep. Currently spindles are often classified into two classes: fast spindles, with a frequency of around 14 Hz, occurring in the centro-parietal region; and slow spindles, with a frequency of around 12 Hz, prevalent in the frontal region. Here we aim to establish whether the spindle generation process also exhibits spatial heterogeneity. Electroencephalographic recordings from 20 subjects were automatically scanned to detect spindles and the time occurrences of spindles were used for statistical analysis. Gamma distribution parameters were fit to each inter-spindle interval distribution, and a modified Wald-Wolfowitz lag-1 correlation test was applied. Results indicate that not all spindles are generated by the same statistical process, but this dissociation is not spindle-type specific. Although this dissociation is not topographically specific, a single generator for all spindle types appears unlikely.
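
    A minimal sketch of the interval analysis described above, assuming gamma-distributed inter-spindle intervals and a method-of-moments fit (the study's exact fitting procedure may differ); the intervals are simulated, not EEG data:

```python
import random
import statistics

# Simulate inter-spindle intervals (s) from a gamma process and recover
# shape/scale by the method of moments. Shape k ~ 1 suggests a
# Poisson-like generator; k > 1 suggests more regular spindle timing.
random.seed(0)
k_true, theta_true = 2.0, 5.0
intervals = [random.gammavariate(k_true, theta_true) for _ in range(5000)]

m = statistics.fmean(intervals)
v = statistics.variance(intervals)
k_hat = m * m / v      # shape estimate
theta_hat = v / m      # scale estimate
print(f"shape ~= {k_hat:.2f}, scale ~= {theta_hat:.2f}")
```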

  11. Apparatus for statistical time-series analysis of electrical signals

    NASA Technical Reports Server (NTRS)

    Stewart, C. H. (Inventor)

    1973-01-01

    An apparatus for performing statistical time-series analysis of complex electrical signal waveforms, permitting prompt and accurate determination of statistical characteristics of the signal is presented.

  12. Advances in total scattering analysis

    SciTech Connect

    Proffen, Thomas E; Kim, Hyunjeong

    2008-01-01

    In recent years the analysis of the total scattering pattern has become an invaluable tool to study disordered crystalline and nanocrystalline materials. Traditional crystallographic structure determination is based on Bragg intensities and yields the long-range average atomic structure. By including diffuse scattering in the analysis, the local and medium-range atomic structure can be unravelled. Here we give an overview of recent experimental advances, using both X-ray and neutron scattering, as well as current trends in the modelling of total scattering data.

  13. Statistical Hot Channel Analysis for the NBSR

    SciTech Connect

    Cuadra A.; Baek J.

    2014-05-27

    A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
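
    As a toy illustration (not the NBSR thermal-hydraulic model), the CDF construction can be sketched by propagating hypothetical hot channel uncertainties through a nominal CHFR and reading a percentile off the sampled distribution; every number below is invented:

```python
import random

# Propagate multiplicative uncertainties on heat flux and flow through
# a nominal critical heat flux ratio and build an empirical CDF.
random.seed(1)
nominal_chfr = 2.0
samples = []
for _ in range(50_000):
    f_flux = random.gauss(1.0, 0.08)   # hypothetical heat-flux factor
    f_flow = random.gauss(1.0, 0.05)   # hypothetical flow factor
    samples.append(nominal_chfr * f_flow / f_flux)

samples.sort()
p5 = samples[int(0.05 * len(samples))]  # 5th percentile of the CDF
print(f"95% of sampled CHFR values exceed {p5:.2f}")
```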

  14. The statistical multifragmentation model: Origins and recent advances

    NASA Astrophysics Data System (ADS)

    Donangelo, R.; Souza, S. R.

    2016-07-01

    We review the Statistical Multifragmentation Model (SMM), which considers a generalization of the liquid-drop model for hot nuclei and allows one to calculate thermodynamic quantities characterizing the nuclear ensemble at the disassembly stage. We show how to determine probabilities of definite partitions of finite nuclei and how to determine, through Monte Carlo calculations, observables such as the caloric curve, multiplicity distributions, and heat capacity, among others. Some experimental measurements of the caloric curve confirmed the SMM predictions made over 10 years earlier, leading to a surge of interest in the model. However, the experimental determination of the fragmentation temperatures relies on the yields of different isotopic species, which were not correctly calculated in the schematic liquid-drop picture employed in the SMM. This led to a series of improvements in the SMM, in particular a more careful choice of nuclear masses and energy densities, especially for the lighter nuclei. With these improvements the SMM is able to make quantitative determinations of isotope production. We show the application of the SMM to the production of exotic nuclei through multifragmentation. These preliminary calculations demonstrate the need for a careful choice of the system size and excitation energy to attain maximum yields.

  15. Advanced statistics: applying statistical process control techniques to emergency medicine: a primer for providers.

    PubMed

    Callahan, Charles D; Griffen, David L

    2003-08-01

    Emergency medicine faces unique challenges in the effort to improve efficiency and effectiveness. Increased patient volumes, decreased emergency department (ED) supply, and an increased emphasis on the ED as a diagnostic center have contributed to poor customer satisfaction and process failures such as diversion/bypass. Statistical process control (SPC) techniques developed in industry offer an empirically based means to understand our work processes and manage by fact. Emphasizing that meaningful quality improvement can occur only when it is exercised by "front-line" providers, this primer presents robust yet accessible SPC concepts and techniques for use in today's ED.
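
    As a sketch of one of the SPC techniques such a primer would cover, an individuals (XmR) chart flags special-cause variation using the standard center line ± 2.66 × mean moving range limits; the ED timing data here are invented:

```python
import statistics

# Individuals (XmR) control chart on, e.g., daily median door-to-doctor
# times in minutes (illustrative data, not from the paper).
times = [42, 38, 45, 41, 39, 44, 40, 43, 37, 46, 41, 39, 62, 40, 42]

center = statistics.fmean(times)
moving_ranges = [abs(b - a) for a, b in zip(times, times[1:])]
mr_bar = statistics.fmean(moving_ranges)
ucl = center + 2.66 * mr_bar   # upper control limit
lcl = center - 2.66 * mr_bar   # lower control limit

out_of_control = [x for x in times if not (lcl <= x <= ucl)]
print(f"CL={center:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}, signals={out_of_control}")
```

    A point outside the limits (here the 62-minute day) signals a special cause worth investigating, rather than ordinary day-to-day variation.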

  16. Advanced Numerical Methods for Computing Statistical Quantities of Interest

    DTIC Science & Technology

    2014-07-10

    coefficients, forcing terms, and initial conditions was analyzed. The input data were assumed to depend on a finite number of random variables. Unlike... 89, 2012, 1269-1280. We considered the Musiela equation of forward rates; this is a hyperbolic stochastic partial differential equation. A weak... ZHANG AND M. GUNZBURGER, Error analysis of stochastic collocation method for parabolic partial differential equations with random input data; SIAM Journal

  17. Raman spectroscopy coupled with advanced statistics for differentiating menstrual and peripheral blood.

    PubMed

    Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; Lednev, Igor K

    2014-01-01

    Body fluids are a common and important type of forensic evidence. In particular, the identification of menstrual blood stains is often a key step during the investigation of rape cases. Here, we report on the application of near-infrared Raman microspectroscopy for differentiating menstrual blood from peripheral blood. We observed that the menstrual and peripheral blood samples have similar but distinct Raman spectra. Advanced statistical analysis of the multiple Raman spectra that were automatically (Raman mapping) acquired from the 40 dried blood stains (20 donors for each group) allowed us to build a classification model with maximum (100%) sensitivity and specificity. We also demonstrated that despite certain common constituents, menstrual blood can be readily distinguished from vaginal fluid. All of the classification models were verified using cross-validation methods. The proposed method overcomes the problems associated with currently used biochemical methods, which are destructive, time-consuming and expensive.

  18. Statistical Analysis of Sleep Spindle Occurrences

    PubMed Central

    Panas, Dagmara; Malinowska, Urszula; Piotrowski, Tadeusz; Żygierewicz, Jarosław; Suffczyński, Piotr

    2013-01-01

    Spindles - a hallmark of stage II sleep - are a transient oscillatory phenomenon in the EEG believed to reflect thalamocortical activity contributing to unresponsiveness during sleep. Currently spindles are often classified into two classes: fast spindles, with a frequency of around 14 Hz, occurring in the centro-parietal region; and slow spindles, with a frequency of around 12 Hz, prevalent in the frontal region. Here we aim to establish whether the spindle generation process also exhibits spatial heterogeneity. Electroencephalographic recordings from 20 subjects were automatically scanned to detect spindles and the time occurrences of spindles were used for statistical analysis. Gamma distribution parameters were fit to each inter-spindle interval distribution, and a modified Wald-Wolfowitz lag-1 correlation test was applied. Results indicate that not all spindles are generated by the same statistical process, but this dissociation is not spindle-type specific. Although this dissociation is not topographically specific, a single generator for all spindle types appears unlikely. PMID:23560045

  19. Analysis of Advanced Rotorcraft Configurations

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2000-01-01

    Advanced rotorcraft configurations are being investigated with the objectives of identifying vehicles that are larger, quieter, and faster than current-generation rotorcraft. A large rotorcraft, carrying perhaps 150 passengers, could do much to alleviate airport capacity limitations, and a quiet rotorcraft is essential for community acceptance of the benefits of VTOL operations. A fast, long-range, long-endurance rotorcraft, notably the tilt-rotor configuration, will improve rotorcraft economics through productivity increases. A major part of the investigation of advanced rotorcraft configurations consists of conducting comprehensive analyses of vehicle behavior for the purpose of assessing vehicle potential and feasibility, as well as to establish the analytical models required to support the vehicle development. The analytical work of FY99 included applications to tilt-rotor aircraft. Tilt Rotor Aeroacoustic Model (TRAM) wind tunnel measurements are being compared with calculations performed by using the comprehensive analysis tool (Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD 11)). The objective is to establish the wing and wake aerodynamic models that are required for tilt-rotor analysis and design. The TRAM test in the German-Dutch Wind Tunnel (DNW) produced extensive measurements. This is the first test to encompass air loads, performance, and structural load measurements on tilt rotors, as well as acoustic and flow visualization data. The correlation of measurements and calculations includes helicopter-mode operation (performance, air loads, and blade structural loads), hover (performance and air loads), and airplane-mode operation (performance).

  20. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  1. An R package for statistical provenance analysis

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter; Resentini, Alberto; Garzanti, Eduardo

    2016-05-01

    This paper introduces provenance, a software package within the statistical programming environment R, which aims to facilitate the visualisation and interpretation of large amounts of sedimentary provenance data, including mineralogical, petrographic, chemical and isotopic provenance proxies, or any combination of these. provenance comprises functions to: (a) calculate the sample size required to achieve a given detection limit; (b) plot distributional data such as detrital zircon U-Pb age spectra as Cumulative Age Distributions (CADs) or adaptive Kernel Density Estimates (KDEs); (c) plot compositional data as pie charts or ternary diagrams; (d) correct the effects of hydraulic sorting on sandstone petrography and heavy mineral composition; (e) assess the settling equivalence of detrital minerals and grain-size dependence of sediment composition; (f) quantify the dissimilarity between distributional data using the Kolmogorov-Smirnov and Sircombe-Hazelton distances, or between compositional data using the Aitchison and Bray-Curtis distances; (g) interpret multi-sample datasets by means of (classical and nonmetric) Multidimensional Scaling (MDS) and Principal Component Analysis (PCA); and (h) simplify the interpretation of multi-method datasets by means of Generalised Procrustes Analysis (GPA) and 3-way MDS. All these tools can be accessed through an intuitive query-based user interface, which does not require knowledge of the R programming language. provenance is free software released under the GPL-2 licence and will be further expanded based on user feedback.
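
    The Kolmogorov-Smirnov dissimilarity used for distributional data is simply the maximum vertical gap between two empirical CDFs; a stdlib-only sketch (in Python rather than R, with made-up detrital ages in Ma) is:

```python
# Maximum vertical distance between two empirical CDFs.
import bisect

def ks_distance(a, b):
    a, b = sorted(a), sorted(b)
    def ecdf(xs, x):
        # Fraction of the sample <= x.
        return bisect.bisect_right(xs, x) / len(xs)
    grid = sorted(set(a) | set(b))
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in grid)

# Hypothetical detrital zircon U-Pb ages (Ma) from two samples.
sample1 = [210, 220, 230, 450, 460, 470, 480, 1020, 1050]
sample2 = [215, 225, 900, 950, 1000, 1040, 1060, 1080]
print(f"KS distance = {ks_distance(sample1, sample2):.3f}")
```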

  2. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
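
    The procedure as described can be sketched directly, with illustrative data, a hypothetical window size, and the usual large-sample normal approximation standing in for the abstract's Monte Carlo normalization of U to Z:

```python
import math

def mann_whitney_u(x, y):
    # U = number of (x_i, y_j) pairs with x_i < y_j, counting ties as 0.5.
    return sum(1.0 if xi < yj else 0.5 if xi == yj else 0.0
               for xi in x for yj in y)

def u_to_z(u, n1, n2):
    # Normal approximation to the null distribution of U.
    mean_u = n1 * n2 / 2.0
    sd_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (u - mean_u) / sd_u

# Series with a step change; compare each window with the next one.
series = [3, 4, 3, 5, 4, 3, 4, 9, 10, 11, 9, 10, 11, 10]
w = 5  # hypothetical window size
z_scores = [u_to_z(mann_whitney_u(series[i:i + w], series[i + w:i + 2 * w]), w, w)
            for i in range(len(series) - 2 * w + 1)]
print([round(z, 2) for z in z_scores])
```

    A sustained run of large |Z| values marks the change point in the series.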

  3. Statistically Optimal Approximations of Astronomical Signals: Implications to Classification and Advanced Study of Variable Stars

    NASA Astrophysics Data System (ADS)

    Andronov, I. L.; Chinarova, L. L.; Kudashkina, L. S.; Marsakova, V. I.; Tkachenko, M. G.

    2016-06-01

    We have elaborated a set of new algorithms and programs for advanced time series analysis of (generally) multi-component multi-channel observations with irregularly spaced times of observations, which is a common case for large photometric surveys. Our earlier methods for periodogram, scalegram, wavelet, and autocorrelation analysis, as well as "running" or "sub-interval" local approximations, were reviewed in 2003ASPC..292..391A. For an approximation of the phase light curves of nearly-periodic pulsating stars, we use a Trigonometric Polynomial (TP) fit of the statistically optimal degree and initial period improvement using differential corrections (1994OAP.....7...49A). For the determination of parameters of "characteristic points" (minima, maxima, crossings of some constant value, etc.) we use a set of methods reviewed in 2005ASPC..335...37A. Results of the analysis of the catalogs compiled using these programs are presented in 2014AASP....4....3A. For more complicated signals, we use "phenomenological approximations" with "special shapes" based on functions defined on sub-intervals rather than on the complete interval. E.g., for the Algol-type stars we developed the NAV ("New Algol Variable") algorithm (2012Ap.....55..536A, 2012arXiv1212.6707A, 2015JASS...32..127A), which was compared to common methods of Trigonometric Polynomial (TP) fit or local Algebraic Polynomial (A) fit of a fixed or (alternately) statistically optimal degree. The method allows one to determine the minimal set of parameters required for the "General Catalogue of Variable Stars", as well as an extended set of phenomenological and astrophysical parameters which may be used for classification. In total, more than 1900 variable stars have been studied in our group using these methods in the framework of the "Inter-Longitude Astronomy" campaign (2010OAP....23....8A) and the "Ukrainian Virtual Observatory" project (2012KPCB...28...85V).

  4. Statistical Analysis of Bus Networks in India

    PubMed Central

    2016-01-01

    In this paper, we model the bus networks of six major Indian cities as graphs in L-space, and evaluate their various statistical properties. While airline and railway networks have been extensively studied, a comprehensive study on the structure and growth of bus networks is lacking. In India, where bus transport plays an important role in day-to-day commutation, it is of significant interest to analyze its topological structure and answer basic questions on its evolution, growth, robustness and resiliency. Although the common feature of small-world property is observed, our analysis reveals a wide spectrum of network topologies arising due to significant variation in the degree-distribution patterns in the networks. We also observe that these networks, although robust and resilient to random attacks, are particularly degree-sensitive. Unlike real-world networks such as the Internet, WWW and airline networks, which are virtual, bus networks are physically constrained. Our findings therefore throw light on the evolution of such geographically constrained networks, which will help us in designing more efficient bus networks in the future. PMID:27992590

  5. Statistical Analysis of Bus Networks in India.

    PubMed

    Chatterjee, Atanu; Manohar, Manju; Ramadurai, Gitakrishnan

    2016-01-01

    In this paper, we model the bus networks of six major Indian cities as graphs in L-space, and evaluate their various statistical properties. While airline and railway networks have been extensively studied, a comprehensive study on the structure and growth of bus networks is lacking. In India, where bus transport plays an important role in day-to-day commutation, it is of significant interest to analyze its topological structure and answer basic questions on its evolution, growth, robustness and resiliency. Although the common feature of small-world property is observed, our analysis reveals a wide spectrum of network topologies arising due to significant variation in the degree-distribution patterns in the networks. We also observe that these networks, although robust and resilient to random attacks, are particularly degree-sensitive. Unlike real-world networks such as the Internet, WWW and airline networks, which are virtual, bus networks are physically constrained. Our findings therefore throw light on the evolution of such geographically constrained networks, which will help us in designing more efficient bus networks in the future.

  6. Statistical Power Analysis of Rehabilitation Counseling Research.

    ERIC Educational Resources Information Center

    Kosciulek, John F.; Szymanski, Edna Mora

    1993-01-01

    Provided initial assessment of the statistical power of rehabilitation counseling research published in selected rehabilitation journals. From 5 relevant journals, found 32 articles that contained statistical tests that could be power analyzed. Findings indicated that rehabilitation counselor researchers had little chance of finding small…

  7. Web-Based Statistical Sampling and Analysis

    ERIC Educational Resources Information Center

    Quinn, Anne; Larson, Karen

    2016-01-01

    Consistent with the Common Core State Standards for Mathematics (CCSSI 2010), the authors write that they have asked students to do statistics projects with real data. To obtain real data, their students use the free Web-based app, Census at School, created by the American Statistical Association (ASA) to help promote civic awareness among school…

  8. Reducing Anxiety and Increasing Self-Efficacy within an Advanced Graduate Psychology Statistics Course

    ERIC Educational Resources Information Center

    McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley

    2015-01-01

    In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…

  9. Using the Student Research Project to Integrate Macroeconomics and Statistics in an Advanced Cost Accounting Course

    ERIC Educational Resources Information Center

    Hassan, Mahamood M.; Schwartz, Bill N.

    2014-01-01

    This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…

  10. Analyzing Planck and low redshift data sets with advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Eifler, Tim

    The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late-time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool for cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations among the individual probes that are included. For the multi

  11. Classification of human colonic tissues using FTIR spectra and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Zwielly, A.; Argov, S.; Salman, A.; Bogomolny, E.; Mordechai, S.

    2010-04-01

    One of the major public health hazards is colon cancer. There is a great necessity to develop new methods for the early detection of cancer. If colon cancer is detected and treated early, a cure rate of more than 90% can be achieved. In this study we used FTIR microscopy (MSP), which has shown good potential over the last 20 years in the fields of medical diagnostics and early detection of abnormal tissues. A large database of FTIR microscopic spectra was acquired from 230 human colonic biopsies. Five different subgroups were included in our database: normal and cancer tissues, as well as three stages of benign colonic polyps, namely mild, moderate and severe polyps, which are precursors of carcinoma. In this study we applied advanced mathematical and statistical techniques, including principal component analysis (PCA) and linear discriminant analysis (LDA), to human colonic FTIR spectra in order to differentiate among the mentioned subgroups' tissues. Good classification accuracy between the normal, polyp and cancer groups was achieved, with an approximately 85% success rate. Our results showed that there is great potential for developing FTIR microspectroscopy as a simple, reagent-free, viable tool for the early detection of colon cancer, in particular of the early stages of premalignancy among benign colonic polyps.

  12. [Statistical models for spatial analysis in parasitology].

    PubMed

    Biggeri, A; Catelan, D; Dreassi, E; Lagazio, C; Cringoli, G

    2004-06-01

    The simplest way to study the spatial pattern of a disease is the geographical representation of its cases (or some indicators of them) over a map. Maps based on raw data are generally "wrong" since they do not take sampling errors into consideration. Indeed, the observed differences between areas (or points on the map) are not directly interpretable, as they derive from the composition of true, structural differences and of the noise deriving from the sampling process. This problem is well known in human epidemiology, and several solutions have been proposed to filter the signal from the noise. These statistical methods are usually referred to as Disease Mapping. In geographical analysis a first goal is to evaluate the statistical significance of the heterogeneity between areas (or points). If the test indicates rejection of the hypothesis of homogeneity, the following task is to study the spatial pattern of the disease. The spatial variability of risk is usually decomposed into two terms: a spatially structured (clustering) term and a non-spatially structured (heterogeneity) one. The heterogeneity term reflects spatial variability due to intrinsic characteristics of the sampling units (e.g. hygienic conditions of farms), while the clustering term models the association due to proximity between sampling units, which usually depends on ecological conditions that vary over the study area and affect in a similar way breeding farms that are close to each other. Hierarchical Bayesian models are the main tool for making inferences about the clustering and heterogeneity components. The results are based on the marginal posterior distributions of the parameters of the model, which are approximated by Markov chain Monte Carlo methods. Different models can be defined depending on the terms that are considered, namely a model with only the clustering term, a model with only the heterogeneity term, and a model where both are included.
Model selection criteria based on a compromise between

  13. On statistical approaches to climate change analysis

    NASA Astrophysics Data System (ADS)

    Lee, Terry Chun Kit

    Evidence for a human contribution to climatic changes during the past century is accumulating rapidly. Given the strength of the evidence, it seems natural to ask whether forcing projections can be used to forecast climate change. A Bayesian method for post-processing forced climate model simulations that produces probabilistic hindcasts of inter-decadal temperature changes on large spatial scales is proposed. Hindcasts produced for the last two decades of the 20th century are shown to be skillful. The suggestion that skillful decadal forecasts can be produced on large regional scales by exploiting the response to anthropogenic forcing provides additional evidence that anthropogenic change in the composition of the atmosphere has influenced our climate. In the absence of large negative volcanic forcing on the climate system (which cannot presently be forecast), the global mean temperature for the decade 2000-2009 is predicted to lie above the 1970-1999 normal with probability 0.94. The global mean temperature anomaly for this decade relative to 1970-1999 is predicted to be 0.35°C (5-95% confidence range: 0.21°C--0.48°C). Reconstruction of temperature variability of the past centuries using climate proxy data can also provide important information on the role of anthropogenic forcing in the observed 20th century warming. A state-space model approach that allows incorporation of additional non-temperature information, such as the estimated response to external forcing, to reconstruct historical temperature is proposed. An advantage of this approach is that it permits simultaneous reconstruction and detection analysis as well as future projection. A difficulty in using this approach is that estimation of several unknown state-space model parameters is required. To take advantage of the data structure in the reconstruction problem, the existing parameter estimation approach is modified, resulting in two new estimation approaches. The competing estimation approaches

  14. Modern Statistical Methods for GLAST Event Analysis

    SciTech Connect

    Morris, Robin D.; Cohen-Tanugi, Johann; /SLAC /KIPAC, Menlo Park

    2007-04-10

    We describe a statistical reconstruction methodology for the GLAST LAT. The methodology incorporates in detail the statistics of the interactions of photons and charged particles with the tungsten layers in the LAT, and uses the scattering distributions to compute the full probability distribution over the energy and direction of the incident photons. It uses model selection methods to estimate the probabilities of the possible geometrical configurations of the particles produced in the detector, and numerical marginalization over the energy loss and scattering angles at each layer. Preliminary results show that it can improve on the tracker-only energy estimates for muons and electrons incident on the LAT.

  15. Statistical Analysis of Refractivity in UAE

    NASA Astrophysics Data System (ADS)

    Al-Ansari, Kifah; Al-Mal, Abdulhadi Abu; Kamel, Rami

    2007-07-01

    This paper presents the results of the refractivity statistics in the UAE (United Arab Emirates) for a period of 14 years (1990-2003). Six sites have been considered using meteorological surface data (Abu Dhabi, Dubai, Sharjah, Al-Ain, Ras Al-Kaimah, and Al-Fujairah). Upper air (radiosonde) data were available at one site only, Abu Dhabi airport, which has been considered for the refractivity gradient statistics. Monthly and yearly averages are obtained for the two parameters, refractivity and refractivity gradient. Cumulative distributions are also provided.

  16. Statistical modeling and analysis of laser-evoked potentials of electrocorticogram recordings from awake humans.

    PubMed

    Chen, Zhe; Ohara, Shinji; Cao, Jianting; Vialatte, François; Lenz, Fred A; Cichocki, Andrzej

    2007-01-01

This article is devoted to statistical modeling and analysis of electrocorticogram (ECoG) signals induced by painful cutaneous laser stimuli, recorded from implanted electrodes in awake humans. Specifically, with the statistical tools of factor analysis and independent component analysis, the pain-induced laser-evoked potentials (LEPs) were extracted and investigated under different controlled conditions. With the help of wavelet analysis, quantitative and qualitative analyses were conducted on the LEPs' attributes of power, amplitude, and latency, in both averaged and single-trial experiments. Statistical hypothesis tests were also applied in various experimental setups. The experimental results reported herein confirm previous findings in the neurophysiology literature. In addition, single-trial analysis revealed many new observations that may be of interest to neuroscientists and clinical neurophysiologists. These promising results provide convincing evidence that advanced signal processing and statistical analysis may open new avenues for future studies of ECoG and other relevant biomedical recordings.

  17. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  18. Critical analysis of adsorption data statistically

    NASA Astrophysics Data System (ADS)

    Kaushal, Achla; Singh, S. K.

    2016-09-01

Statistics provides tools for presenting, computing, and critically analysing experimental data. A variety of statistical tests can be used to assess the significance and validity of experimental results. In the present study, adsorption onto mango leaf powder was used to remove zinc ions from a contaminated aqueous solution. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test, and chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment, and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of the calculated and tabulated values of t and χ2 supported the data collected from the experiment, and this has been shown on probability charts. The K value for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725; both are <1, indicating favourable isotherms. Karl Pearson's correlation coefficients for the Langmuir and Freundlich adsorption isotherms were 0.99 and 0.95 respectively, showing a high degree of correlation between the variables. This validates the data obtained for the adsorption of zinc ions from the contaminated aqueous solution with the help of mango leaf powder.
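The Langmuir and Freundlich isotherm constants reported above are typically obtained from linearized least-squares fits. A minimal sketch follows; the equilibrium data here are hypothetical, not the paper's measurements:

```python
import math

def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical equilibrium data: Ce (mg/L) and qe (mg/g)
Ce = [2.0, 5.0, 10.0, 20.0, 40.0]
qe = [1.5, 2.8, 4.1, 5.3, 6.2]

# Langmuir linearisation: Ce/qe = Ce/qm + 1/(qm*K)
s, b = linfit(Ce, [c / q for c, q in zip(Ce, qe)])
qm, K = 1.0 / s, s / b

# Freundlich linearisation: ln qe = ln Kf + (1/n) ln Ce
s2, b2 = linfit([math.log(c) for c in Ce], [math.log(q) for q in qe])
n_inv, Kf = s2, math.exp(b2)
print(round(qm, 2), round(K, 3), round(n_inv, 3))
```

A fitted `1/n` below 1 indicates a favourable Freundlich isotherm, mirroring the criterion the abstract applies to its own constants.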

  19. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions, and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of ash in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms performing calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity incorporating grey-scale binning of the SEM image was developed. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is

  20. A statistical package for computing time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Brownlow, J.

    1978-01-01

    The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
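The SPA workflow above (trend removal, then time- and frequency-domain characterization) can be sketched with a least-squares detrend followed by a naive DFT periodogram; the test signal and all names here are illustrative, not part of the SPA program:

```python
import math

def detrend(x):
    """Remove the least-squares linear trend (and mean) from a series."""
    n = len(x)
    t = list(range(n))
    mt, mx = (n - 1) / 2, sum(x) / n
    sxy = sum((ti - mt) * (xi - mx) for ti, xi in zip(t, x))
    stt = sum((ti - mt) ** 2 for ti in t)
    slope = sxy / stt
    return [xi - (mx + slope * (ti - mt)) for ti, xi in zip(t, x)]

def periodogram(x):
    """Naive DFT periodogram for frequency-domain characterization."""
    n = len(x)
    return [abs(sum(x[k] * complex(math.cos(2 * math.pi * f * k / n),
                                   -math.sin(2 * math.pi * f * k / n))
                    for k in range(n))) ** 2 / n
            for f in range(n // 2 + 1)]

# Sinusoid (4 cycles over 64 samples) plus a linear trend:
# the spectral peak should survive detrending
sig = [0.05 * k + math.sin(2 * math.pi * 4 * k / 64) for k in range(64)]
p = periodogram(detrend(sig))
print(p.index(max(p)))
```

In practice an FFT would replace the O(n²) DFT loop, but the structure of the analysis is the same.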

  1. Statistical Analysis of Random Number Generators

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi; Gäbler, Markus

    2011-01-01

In many applications, for example cryptography and Monte Carlo simulation, there is a need for random numbers. Any procedure, algorithm, or device intended to produce such numbers is called a random number generator (RNG). What makes a good RNG? This paper gives an overview of empirical testing of the statistical properties of the sequences produced by RNGs and of special software packages designed for that purpose. We also present the results of applying a particular test suite, TestU01, to a family of RNGs currently being developed at the Centro Interdipartimentale Vito Volterra (CIVV), Roma, Italy.
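The simplest member of the family of empirical tests such suites apply is a Pearson chi-square test of uniformity. A minimal sketch, using Python's built-in generator rather than TestU01 itself:

```python
import random

def chi_square_uniform(samples, bins=10):
    """Pearson chi-square statistic for uniformity of samples in [0, 1)."""
    counts = [0] * bins
    for u in samples:
        counts[min(int(u * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(12345)
stat = chi_square_uniform([rng.random() for _ in range(10000)])
# With 9 degrees of freedom the 99% critical value is about 21.7;
# a healthy generator should usually fall well below it, while a
# degenerate one (e.g. a constant output) scores enormously.
print(round(stat, 2))
```

Real test suites such as TestU01 chain dozens of far more sensitive tests (serial correlation, gap, birthday spacings, and so on); this shows only the common statistical skeleton.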

  2. Statistical analysis of life history calendar data.

    PubMed

    Eerola, Mervi; Helske, Satu

    2016-04-01

The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis and the model-free data mining method, sequence analysis. In event history analysis, we estimate the cumulative prediction probabilities of life events over the entire trajectory rather than transition hazards. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns, while event history analysis is needed for causal inquiries.

  3. Structure-based statistical analysis of transmembrane helices.

    PubMed

    Baeza-Delgado, Carlos; Marti-Renom, Marc A; Mingarro, Ismael

    2013-03-01

Recent advances in determination of the high-resolution structure of membrane proteins now enable analysis of the main features of amino acids in transmembrane (TM) segments in comparison with amino acids in water-soluble helices. In this work, we conducted a large-scale analysis of the prevalent locations of amino acids by using a data set of 170 structures of integral membrane proteins obtained from the MPtopo database and 930 structures of water-soluble helical proteins obtained from the Protein Data Bank. Large hydrophobic amino acids (Leu, Val, Ile, and Phe) plus Gly were clearly prevalent in TM helices whereas polar amino acids (Glu, Lys, Asp, Arg, and Gln) were less frequent in this type of helix. The distribution of amino acids along TM helices was also examined. As expected, hydrophobic and slightly polar amino acids are commonly found in the hydrophobic core of the membrane whereas aromatic (Trp and Tyr), Pro, and the hydrophilic amino acids (Asn, His, and Gln) occur more frequently in the interface regions. Charged amino acids are also statistically prevalent outside the hydrophobic core of the membrane, and whereas acidic amino acids are frequently found at both cytoplasmic and extra-cytoplasmic interfaces, basic amino acids cluster at the cytoplasmic interface. These results strongly support the experimentally demonstrated biased distribution of positively charged amino acids (that is, the so-called positive-inside rule) with structural data.

  4. Accuracy Evaluation of a Mobile Mapping System with Advanced Statistical Methods

    NASA Astrophysics Data System (ADS)

    Toschi, I.; Rodríguez-Gonzálvez, P.; Remondino, F.; Minto, S.; Orlandini, S.; Fuller, A.

    2015-02-01

This paper discusses a methodology to evaluate the precision and the accuracy of a commercial Mobile Mapping System (MMS) with advanced statistical methods. So far, the metric potential of this emerging mapping technology has been studied in only a few papers, which generally assume that errors follow a normal distribution. In fact, this hypothesis should be carefully verified in advance, in order to test how well classic Gaussian statistics can adapt to datasets that are usually affected by asymmetrical gross errors. The workflow adopted in this study relies on a Gaussian assessment, followed by an outlier filtering process. Finally, non-parametric statistical models are applied in order to achieve a robust estimation of the error dispersion. Among the different MMSs available on the market, the latest solution provided by RIEGL is tested here, i.e. the VMX-450 Mobile Laser Scanning System. The test area is the historic city centre of Trento (Italy), selected in order to assess the system performance in a challenging historic urban scenario. Reference measures are derived from photogrammetric and Terrestrial Laser Scanning (TLS) surveys. All datasets show a large lack of symmetry, leading to the conclusion that the standard normal parameters are not adequate to assess this type of data. The use of non-normal statistics thus gives a more appropriate description of the data and yields results that meet the quoted a priori errors.
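The robust, non-parametric error measures alluded to above are commonly the median (for systematic error) and the normalized median absolute deviation, NMAD (for dispersion). A minimal sketch with hypothetical check-point errors, not the paper's data:

```python
import statistics

def robust_accuracy(errors):
    """Median and NMAD (1.4826 * median absolute deviation): the
    non-parametric counterparts of mean and standard deviation,
    resistant to asymmetrical gross errors."""
    med = statistics.median(errors)
    nmad = 1.4826 * statistics.median(abs(e - med) for e in errors)
    return med, nmad

# Hypothetical check-point errors (m) with one gross outlier
errors = [0.02, -0.01, 0.03, 0.00, -0.02, 0.01, 0.95]
med, nmad = robust_accuracy(errors)
mean = statistics.mean(errors)
print(round(med, 3), round(nmad, 3), round(mean, 3))
```

The single outlier drags the mean an order of magnitude away from the bulk of the errors, while the median and NMAD barely move; this is why non-normal statistics describe such datasets better.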

  5. Statistical analysis of low level atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Tieleman, H. W.; Chen, W. W. L.

    1974-01-01

    The statistical properties of low-level wind-turbulence data were obtained with the model 1080 total vector anemometer and the model 1296 dual split-film anemometer, both manufactured by Thermo Systems Incorporated. The data obtained from the above fast-response probes were compared with the results obtained from a pair of Gill propeller anemometers. The digitized time series representing the three velocity components and the temperature were each divided into a number of blocks, the length of which depended on the lowest frequency of interest and also on the storage capacity of the available computer. A moving-average and differencing high-pass filter was used to remove the trend and the low frequency components in the time series. The calculated results for each of the anemometers used are represented in graphical or tabulated form.
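The moving-average high-pass filter described above can be sketched by subtracting a centered moving average from each sample; the window length and signal here are illustrative:

```python
def highpass_moving_average(x, window):
    """High-pass filter: subtract a centered moving average, removing
    the trend and low-frequency components of a time series."""
    h = window // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - h), min(len(x), i + h + 1)
        out.append(x[i] - sum(x[lo:hi]) / (hi - lo))
    return out

# A constant (zero-frequency) signal is removed entirely
print(highpass_moving_average([5.0] * 8, 4))
```

In the reported analysis this kind of filter was combined with differencing and applied per block, with the block length set by the lowest frequency of interest.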

  6. Advanced materials: Information and analysis needs

    SciTech Connect

    Curlee, T.R.; Das, S.; Lee, R.; Trumble, D.

    1990-09-01

    This report presents the findings of a study to identify the types of information and analysis that are needed for advanced materials. The project was sponsored by the US Bureau of Mines (BOM). It includes a conceptual description of information needs for advanced materials and the development and implementation of a questionnaire on the same subject. This report identifies twelve fundamental differences between advanced and traditional materials and discusses the implications of these differences for data and analysis needs. Advanced and traditional materials differ significantly in terms of physical and chemical properties. Advanced material properties can be customized more easily. The production of advanced materials may differ from traditional materials in terms of inputs, the importance of by-products, the importance of different processing steps (especially fabrication), and scale economies. The potential for change in advanced materials characteristics and markets is greater and is derived from the marriage of radically different materials and processes. In addition to the conceptual study, a questionnaire was developed and implemented to assess the opinions of people who are likely users of BOM information on advanced materials. The results of the questionnaire, which was sent to about 1000 people, generally confirm the propositions set forth in the conceptual part of the study. The results also provide data on the categories of advanced materials and the types of information that are of greatest interest to potential users. 32 refs., 1 fig., 12 tabs.

  7. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  8. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  9. Comparative analysis of positive and negative attitudes toward statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

Many statistics lecturers and statistics education researchers are interested in their students' attitudes toward statistics during the statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content. Students with negative attitudes toward statistics may feel depressed, especially in group assignments, are at risk of failure, are often highly emotional, and may struggle to move forward. Therefore, this study investigates students' attitudes towards learning statistics. Six latent constructs were used to measure students' attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Towards Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP). The respondents were students taking the applied statistics course from different faculties. The analysis found the questionnaire acceptable, and the proposed relationships among the constructs were investigated. Students showed full effort to master the statistics course, found it enjoyable, were confident in their intellectual capacity, and held more positive than negative attitudes towards learning statistics. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest, and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.

  10. Statistical analysis of plasmaspheric EMIC waves

    NASA Astrophysics Data System (ADS)

    Kato, Y.; Miyoshi, Y.; Sakaguchi, K.; Kasahara, Y.; Keika, K.; Shoji, M.; Kitamura, N.; Hasegawa, S.; Kumamoto, A.; Shiokawa, K.

    2014-12-01

Electromagnetic ion cyclotron (EMIC) waves in the inner magnetosphere are important because they cause pitch angle scattering of ring current ions as well as of relativistic electrons in the radiation belts. Although the spatial distributions of EMIC waves have been investigated by several spacecraft, such as CRRES, THEMIS, and AMPTE/CCE, there have been few studies of plasmaspheric EMIC waves. We statistically investigate EMIC wave data from the Akebono/VLF measurements. The plasmaspheric EMIC waves tend to be distributed at a lower L-shell region (L~2) than the slot region. There is no significant MLT dependence, in contrast to EMIC waves outside the plasmapause. The plasmaspheric EMIC wave frequencies depend on the equatorial cyclotron frequency, suggesting that the plasmaspheric EMIC waves do not propagate from higher L-shells but are generated near the magnetic equator of the same L-shell. This result is consistent with the dependence of the resonance energy. Using the in-situ thermal plasma density measured by the Akebono satellite, we estimate the resonance energy of energetic ions; the resonance energies of the plasmaspheric EMIC waves are a few tens of keV to ~1 MeV. The results indicate that ring current and radiation belt ions may contribute to the generation of the plasmaspheric EMIC waves.

  11. Tsallis statistics in reliability analysis: Theory and methods

    NASA Astrophysics Data System (ADS)

    Zhang, Fode; Shi, Yimin; Keung Tony Ng, Hon; Wang, Ruibing

    2016-10-01

Tsallis statistics, which is based on a non-additive entropy characterized by an index q, is a very useful tool in physics and statistical mechanics. This paper presents an application of Tsallis statistics in reliability analysis. We first introduce the q-gamma and incomplete q-gamma functions as q-generalizations of their classical counterparts. Then, three statistical distributions commonly used in reliability analysis are introduced in Tsallis statistics, and the corresponding reliability characteristics, including the reliability function, hazard function, cumulative hazard function, and mean time to failure, are investigated. In addition, we study statistical inference based on censored reliability data. Specifically, we investigate the point and interval estimation of the model parameters of the q-exponential distribution based on the maximum likelihood method. Simulated and real-life datasets are used to illustrate the methodologies discussed in this paper. Finally, some concluding remarks are provided.
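A q-exponential lifetime model of the kind studied above can be sketched as follows; the parameterization here (R(t) = e_q(-λt)^(2-q), with e_q the Tsallis q-exponential) is one common convention, assumed rather than taken from the paper:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential e_q(x); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def reliability(t, lam, q):
    """Reliability (survival) function of a q-exponential lifetime model:
    R(t) = e_q(-lam*t)**(2-q), in one common parameterization."""
    return q_exp(-lam * t, q) ** (2.0 - q)

# q > 1 gives a heavier tail than the classical exponential model
print(round(reliability(2.0, 0.5, 1.2), 4),
      round(reliability(2.0, 0.5, 1.0), 4))
```

Setting q = 1 recovers the classical exponential reliability function R(t) = exp(-λt), which is the sense in which Tsallis statistics generalizes the standard model.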

  12. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that readers can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  13. Statistical analysis of fixed income market

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Grilli, Luca; Vergni, Davide

    2002-05-01

We present cross and time series analysis of price fluctuations in the US Treasury fixed income market. Bonds have been classified according to a suitable metric based on the correlation among them. The classification shows how the correlation among fixed income securities depends strongly on their maturity. We also study the structure of price fluctuations for single time series.
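A correlation-based classification of this kind typically uses the standard econophysics distance d = sqrt(2(1 - ρ)), which maps correlation to a metric; the exact metric used in the paper is not specified, so this is an illustrative sketch:

```python
import math

def corr(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def corr_distance(x, y):
    """d = sqrt(2(1 - rho)): perfectly correlated series sit at distance 0,
    uncorrelated at sqrt(2), perfectly anti-correlated at 2."""
    return math.sqrt(max(0.0, 2.0 * (1.0 - corr(x, y))))

a = [1.0, 2.0, 3.0, 4.0]
b = [2.0, 4.0, 6.0, 8.0]   # perfectly correlated with a
c = [4.0, 3.0, 2.0, 1.0]   # perfectly anti-correlated with a
print(round(corr_distance(a, b), 6), round(corr_distance(a, c), 6))
```

Because d satisfies the triangle inequality, securities can then be clustered (e.g. into a minimal spanning tree) by maturity or sector.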

  14. Component outage data analysis methods. Volume 2: Basic statistical methods

    NASA Astrophysics Data System (ADS)

    Marshall, J. A.; Mazumdar, M.; McCutchan, D. A.

    1981-08-01

    Statistical methods for analyzing outage data on major power system components such as generating units, transmission lines, and transformers are identified. The analysis methods produce outage statistics from component failure and repair data that help in understanding the failure causes and failure modes of various types of components. Methods for forecasting outage statistics for those components used in the evaluation of system reliability are emphasized.

  15. Statistical Analysis of Multiple Choice Testing

    DTIC Science & Technology

    2001-04-01

the question to help determine poor distractors (incorrect answers). However, Attali and Fraenkel show that while it is sound to use the Rpbis… heavily on question difficulty. Attali and Fraenkel say that the Biserial is usually preferred as a criterion measure for the correct alternative…

  16. Internet Data Analysis for the Undergraduate Statistics Curriculum

    ERIC Educational Resources Information Center

    Sanchez, Juana; He, Yan

    2005-01-01

    Statistics textbooks for undergraduates have not caught up with the enormous amount of analysis of Internet data that is taking place these days. Case studies that use Web server log data or Internet network traffic data are rare in undergraduate Statistics education. And yet these data provide numerous examples of skewed and bimodal…

  17. Attitudes and Achievement in Statistics: A Meta-Analysis Study

    ERIC Educational Resources Information Center

    Emmioglu, Esma; Capa-Aydin, Yesim

    2012-01-01

    This study examined the relationships among statistics achievement and four components of attitudes toward statistics (Cognitive Competence, Affect, Value, and Difficulty) as assessed by the SATS. Meta-analysis results revealed that the size of relationships differed by the geographical region in which the studies were conducted as well as by the…

  18. Guidelines for Statistical Analysis of Percentage of Syllables Stuttered Data

    ERIC Educational Resources Information Center

    Jones, Mark; Onslow, Mark; Packman, Ann; Gebski, Val

    2006-01-01

    Purpose: The purpose of this study was to develop guidelines for the statistical analysis of percentage of syllables stuttered (%SS) data in stuttering research. Method; Data on %SS from various independent sources were used to develop a statistical model to describe this type of data. On the basis of this model, %SS data were simulated with…

  19. Explorations in Statistics: The Analysis of Ratios and Normalized Data

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2013-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…

  20. A Realistic Experimental Design and Statistical Analysis Project

    ERIC Educational Resources Information Center

    Muske, Kenneth R.; Myers, John A.

    2007-01-01

    A realistic applied chemical engineering experimental design and statistical analysis project is documented in this article. This project has been implemented as part of the professional development and applied statistics courses at Villanova University over the past five years. The novel aspects of this project are that the students are given a…

  1. LHC Olympics: Advanced Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Armour, Kyle; Larkoski, Andrew; Gray, Amanda; Ventura, Dan; Walsh, Jon; Schabinger, Rob

    2006-05-01

The LHC Olympics is a series of workshops aimed at encouraging theorists and experimentalists to prepare for the soon-to-be-online Large Hadron Collider in Geneva, Switzerland. One aspect of the LHC Olympics program consists of the study of simulated data sets which represent various possible new physics signals as they would be seen in LHC detectors. Through this exercise, LHC Olympians learn the phenomenology of possible new physics models and gain experience in analyzing LHC data. Additionally, the LHC Olympics encourages discussion between theorists and experimentalists, and through this collaboration new techniques can be developed. The University of Washington LHC Olympics group consists of several first-year graduate and senior undergraduate students in both theoretical and experimental particle physics. Presented here is a discussion of some of the more advanced techniques used and the recent results of one such LHC Olympics study.

  2. System statistical reliability model and analysis

    NASA Technical Reports Server (NTRS)

    Lekach, V. S.; Rood, H.

    1973-01-01

    A digital computer code was developed to simulate the time-dependent behavior of the 5-kwe reactor thermoelectric system. The code was used to determine lifetime sensitivity coefficients for a number of system design parameters, such as thermoelectric module efficiency and degradation rate, radiator absorptivity and emissivity, fuel element barrier defect constant, beginning-of-life reactivity, etc. A probability distribution (mean and standard deviation) was estimated for each of these design parameters. Then, error analysis was used to obtain a probability distribution for the system lifetime (mean = 7.7 years, standard deviation = 1.1 years). From this, the probability that the system will achieve the design goal of 5 years lifetime is 0.993. This value represents an estimate of the degradation reliability of the system.
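The final step of the error analysis described above (a Gaussian lifetime with mean 7.7 years and standard deviation 1.1 years, giving probability 0.993 of meeting the 5-year goal) can be reproduced directly with Python's `statistics.NormalDist`; the Gaussian form is assumed, as in the abstract:

```python
from statistics import NormalDist

# Lifetime distribution from the quoted error analysis:
# mean 7.7 years, standard deviation 1.1 years (assumed Gaussian)
life = NormalDist(mu=7.7, sigma=1.1)

# Probability of exceeding the 5-year design goal
p_goal = 1.0 - life.cdf(5.0)
print(round(p_goal, 3))   # ~0.993, matching the reported reliability
```

This is a one-line sanity check of the reported degradation reliability, not a re-derivation of the underlying simulation.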

  3. A statistical analysis of UK financial networks

    NASA Astrophysics Data System (ADS)

    Chu, J.; Nadarajah, S.

    2017-04-01

In recent years, with a growing interest in big or large datasets, there has been a rise in the application of large graphs and networks to financial big data. Much of this research has focused on the construction and analysis of the network structure of stock markets, based on the relationships between stock prices. Motivated by Boginski et al. (2005), who studied the characteristics of a network structure of the US stock market, we construct network graphs of the UK stock market using the same method. We fit four distributions to the degree density of the vertices from these graphs, the Pareto I, Fréchet, lognormal, and generalised Pareto distributions, and assess the goodness of fit. Our results show that the degree density of the complements of the market graphs, constructed using a negative threshold value close to zero, can be fitted well with the Fréchet and lognormal distributions.
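The market-graph construction in the spirit of Boginski et al. connects two instruments whenever their return correlation exceeds a threshold; the degrees of the resulting graph are then what the distributions are fitted to. A minimal sketch with a toy, hypothetical correlation matrix:

```python
from itertools import combinations

def market_graph_degrees(corr, threshold):
    """Connect nodes i and j when corr[i][j] >= threshold;
    return the degree of each node in the resulting market graph."""
    n = len(corr)
    deg = [0] * n
    for i, j in combinations(range(n), 2):
        if corr[i][j] >= threshold:
            deg[i] += 1
            deg[j] += 1
    return deg

# Toy 4x4 correlation matrix (hypothetical, symmetric)
C = [[1.0, 0.8, 0.1, 0.7],
     [0.8, 1.0, 0.2, 0.6],
     [0.1, 0.2, 1.0, 0.3],
     [0.7, 0.6, 0.3, 1.0]]
print(market_graph_degrees(C, 0.5))
```

The degree density (the histogram of these degrees, normalized) is the quantity to which the Pareto, Fréchet, lognormal, and generalised Pareto candidates would be fitted.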

  4. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    NASA Astrophysics Data System (ADS)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-06-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  5. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

Each generation of high energy physics experiments is grander in scale than the previous: more powerful, more complex, and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods, and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness, and clarity.

  6. Statistical analysis of litter experiments in teratology

    SciTech Connect

    Williams, R.; Buschbom, R.L.

    1982-11-01

    Teratological data is binary response data (each fetus is either affected or not) in which the responses within a litter are usually not independent. As a result, the litter should be taken as the experimental unit. For each litter, its size, n, and the number of fetuses, x, possessing the effect of interest are recorded. The ratio p = x/n is then the basic data generated by the experiment. There are currently three general approaches to the analysis of teratological data: nonparametric, transformation followed by t-test or ANOVA, and parametric. The first two are currently in wide use by practitioners while the third is relatively new to the field. These first two also appear to possess comparable power levels while maintaining the nominal level of significance. When transformations are employed, care must be exercised to check that the transformed data has the required properties. Since the data is often highly asymmetric, there may be no transformation which renders the data nearly normal. The parametric procedures, including the beta-binomial model, offer the possibility of increased power.
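
    The litter-as-experimental-unit idea can be illustrated with a minimal sketch: compute the per-litter proportion p = x/n and compare groups with one of the nonparametric approaches named above (a Mann-Whitney U test on the litter proportions). All counts below are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical litter data: n = litter size, x = affected fetuses per litter.
control_n = np.array([10, 12, 9, 11, 10, 13, 8, 10])
control_x = np.array([1, 0, 2, 1, 0, 1, 1, 2])
treated_n = np.array([9, 11, 10, 12, 10, 9, 11, 10])
treated_x = np.array([3, 4, 2, 5, 3, 2, 4, 3])

# The litter is the experimental unit: the per-litter proportion p = x/n is the datum.
control_p = control_x / control_n
treated_p = treated_x / treated_n

# Nonparametric approach: Mann-Whitney U test on the litter proportions avoids
# distributional assumptions on the (often highly asymmetric) p values.
u_stat, p_value = stats.mannwhitneyu(control_p, treated_p, alternative='two-sided')
print(u_stat, p_value)
```

    A parametric alternative would model x directly with a beta-binomial likelihood, which is the route the abstract notes may offer increased power.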

  7. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    ERIC Educational Resources Information Center

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…

  8. Advanced nuclear energy analysis technology.

    SciTech Connect

    Gauntt, Randall O.; Murata, Kenneth K.; Romero, Vicente José; Young, Michael Francis; Rochau, Gary Eugene

    2004-05-01

    A two-year effort focused on applying ASCI technology developed for the analysis of weapons systems to the state-of-the-art accident analysis of a nuclear reactor system was proposed. The Sandia SIERRA parallel computing platform for ASCI codes includes high-fidelity thermal, fluids, and structural codes whose coupling through SIERRA can be specifically tailored to the particular problem at hand to analyze complex multiphysics problems. Presently, however, the suite lacks several physics modules unique to the analysis of nuclear reactors. The NRC MELCOR code, not presently part of SIERRA, was developed to analyze severe accidents in present-technology reactor systems. We attempted to: (1) evaluate the SIERRA code suite for its current applicability to the analysis of next generation nuclear reactors, and the feasibility of implementing MELCOR models into the SIERRA suite, (2) examine the possibility of augmenting ASCI codes or alternatives by coupling to the MELCOR code, or portions thereof, to address physics particular to nuclear reactor issues, especially those facing next generation reactor designs, and (3) apply the coupled code set to a demonstration problem involving a nuclear reactor system. We were successful in completing the first two in sufficient detail to determine that an extensive demonstration problem was not feasible at this time. In the future, completion of this research would demonstrate the feasibility of performing high fidelity and rapid analyses of safety and design issues needed to support the development of next generation power reactor systems.

  9. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.

  10. Advanced Signal Analysis for Forensic Applications of Ground Penetrating Radar

    SciTech Connect

    Steven Koppenjan; Matthew Streeton; Hua Lee; Michael Lee; Sashi Ono

    2004-06-01

    Ground penetrating radar (GPR) systems have traditionally been used to image subsurface objects. The main focus of this paper is to evaluate an advanced signal analysis technique. Instead of compiling spatial data for the analysis, this technique conducts object recognition procedures based on spectral statistics. The identification feature of an object type is formed from the training vectors by a singular-value decomposition procedure. To illustrate its capability, this procedure is applied to experimental data and compared to the performance of the neural-network approach.

  11. The statistical analysis techniques to support the NGNP fuel performance experiments

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson

    2013-10-01

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
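
    The control-charting technique named above amounts to flagging readings that fall outside limits estimated from an in-control baseline. The thermocouple readings and three-sigma Shewhart limits below are illustrative assumptions, not AGR experiment data.

```python
import numpy as np

# Hypothetical thermocouple readings; control charting flags excursions beyond
# mean +/- 3 standard deviations estimated from an in-control baseline period.
baseline = np.array([1000.2, 999.8, 1000.5, 1000.1, 999.6, 1000.3, 999.9, 1000.0])
mu, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma   # upper / lower control limits

readings = np.array([1000.1, 999.7, 1003.5, 1000.2])  # third value: possible failure
alarms = (readings > ucl) | (readings < lcl)
print(alarms)
```

    In the paper's setting, such alarms would be cross-checked against the correlation and regression analyses before a thermocouple is declared failed.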

  12. The statistical analysis techniques to support the NGNP fuel performance experiments

    NASA Astrophysics Data System (ADS)

    Pham, Binh T.; Einerson, Jeffrey J.

    2013-10-01

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.

  13. Advances in Barkhausen noise analysis

    NASA Astrophysics Data System (ADS)

    Meyendorf, Norbert; Hillmann, Susanne; Cikalova, Ulana; Schreiber, Juergen

    2014-03-01

    The magnetic Barkhausen noise technique is a well-suited method for the characterization of ferromagnetic materials. The Barkhausen effect arises from the interaction between the magnetic structure and the microstructure of a material, and is sensitive to stresses and microstructure-related mechanical properties. Barkhausen noise is a complex signal that provides a large amount of information, for example the frequency spectrum, amplitude, RMS value, dependence on magnetic field strength and magnetization frequency, and fractal behavior. Although this technique has a lot of potential, it is not commonly used in nondestructive material testing. Large sensors and complex calibration procedures have made the method impractical for many applications. However, research has progressed in recent years: new sensor designs have been developed and evaluated, new algorithms have been developed to simplify the calibration and measurement procedures, and analyses of additional material properties have been introduced.

  14. Analysis of Coastal Dunes: A Remote Sensing and Statistical Approach.

    ERIC Educational Resources Information Center

    Jones, J. Richard

    1985-01-01

    Remote sensing analysis and statistical methods were used to analyze the coastal dunes of Plum Island, Massachusetts. The research methodology used provides an example of a student project for remote sensing, geomorphology, or spatial analysis courses at the university level. (RM)

  15. Data explorer: a prototype expert system for statistical analysis.

    PubMed Central

    Aliferis, C.; Chao, E.; Cooper, G. F.

    1993-01-01

    The inadequate analysis of medical research data, due mainly to the unavailability of local statistical expertise, seriously jeopardizes the quality of new medical knowledge. Data Explorer is a prototype Expert System that builds on the versatility and power of existing statistical software, to provide automatic analyses and interpretation of medical data. The system draws much of its power by using belief network methods in place of more traditional, but difficult to automate, classical multivariate statistical techniques. Data Explorer identifies statistically significant relationships among variables, and using power-size analysis, belief network inference/learning and various explanatory techniques helps the user understand the importance of the findings. Finally the system can be used as a tool for the automatic development of predictive/diagnostic models from patient databases. PMID:8130501

  16. A Divergence Statistics Extension to VTK for Performance Analysis

    SciTech Connect

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], in which we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order, and auto-correlative statistics engines which we developed within the Visualization Toolkit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
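
    A minimal example of a divergence statistic in the sense described, here the Kullback-Leibler divergence between an empirical histogram and an "ideal" distribution, might look like the following. This is a generic sketch (in Python rather than the engine's C++), not the VTK engine's API, and KL divergence is only akin to a distance: it is non-negative and zero iff the distributions match, but it is not symmetric.

```python
import numpy as np

def kl_divergence(observed, theoretical):
    """Discrete Kullback-Leibler divergence D(P || Q) in nats."""
    p = np.asarray(observed, dtype=float)
    q = np.asarray(theoretical, dtype=float)
    p, q = p / p.sum(), q / q.sum()   # normalize counts to probabilities
    mask = p > 0                      # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Observed histogram vs. an "ideal" uniform distribution.
counts = np.array([18, 22, 25, 20, 15])
uniform = np.ones(5)
print(kl_divergence(counts, uniform))
print(kl_divergence(uniform, uniform))
```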

  17. Advanced Software Methods for Physics Analysis

    NASA Astrophysics Data System (ADS)

    Lista, L.

    2006-01-01

    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each single experiment, and the level of complexity of each single analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the former from BaBar experience with the migration to a new Analysis Model with the definition of a new model for the Event Data Store, the latter about a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming.

  18. Statistical analysis in dBASE-compatible databases.

    PubMed

    Hauer-Jensen, M

    1991-01-01

    Database management in clinical and experimental research often requires statistical analysis of the data in addition to the usual functions for storing, organizing, manipulating and reporting. With most database systems, transfer of data to a dedicated statistics package is a relatively simple task. However, many statistics programs lack the powerful features found in database management software. dBASE IV and compatible programs are currently among the most widely used database management programs. d4STAT is a utility program for dBASE, containing a collection of statistical functions and tests for data stored in the dBASE file format. By using d4STAT, statistical calculations may be performed directly on the data stored in the database without having to exit dBASE IV or export data. Record selection and variable transformations are performed in memory, thus obviating the need for creating new variables or data files. The current version of the program contains routines for descriptive statistics, paired and unpaired t-tests, correlation, linear regression, frequency tables, the Mann-Whitney U-test, the Wilcoxon signed rank test, a time-saving procedure for counting observations according to user-specified selection criteria, survival analysis (product limit estimate analysis, log-rank test, and graphics), and the normal, t, and chi-squared distribution functions.

  19. Fisher statistics for analysis of diffusion tensor directional information.

    PubMed

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample, and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and the associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; in the hilus, however, significant (p<0.0005) differences were found that robustly confirmed observations suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation.
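
    The descriptive side of the Fisher approach (the mean direction vector and a concentration estimate from the resultant length) can be sketched as follows. The direction vectors are hypothetical, and the kappa estimator shown is the standard large-concentration approximation, not necessarily the authors' exact formulation.

```python
import numpy as np

def fisher_mean_direction(vectors):
    """Mean direction, resultant length R, and concentration kappa for unit vectors."""
    v = np.asarray(vectors, dtype=float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)  # ensure unit length
    resultant = v.sum(axis=0)
    R = float(np.linalg.norm(resultant))   # resultant length, 0 <= R <= n
    mean_dir = resultant / R
    n = len(v)
    kappa = (n - 1) / (n - R)              # standard estimate; large when clustered
    return mean_dir, R, kappa

# Hypothetical principal-eigenvector directions from one ROI, tightly clustered.
dirs = np.array([[0.99, 0.10, 0.00],
                 [0.98, 0.15, 0.05],
                 [0.97, 0.20, -0.05],
                 [0.995, 0.05, 0.02]])
mean_dir, R, kappa = fisher_mean_direction(dirs)
print(mean_dir, R, kappa)
```

    Group comparisons as in the paper would then feed the per-group R and kappa values into Watson's F-statistic.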

  20. Advanced Placement: Model Policy Components. Policy Analysis

    ERIC Educational Resources Information Center

    Zinth, Jennifer

    2016-01-01

    Advanced Placement (AP), launched in 1955 by the College Board as a program to offer gifted high school students the opportunity to complete entry-level college coursework, has since expanded to encourage a broader array of students to tackle challenging content. This Education Commission of the State's Policy Analysis identifies key components of…

  1. Statistical mechanics analysis of thresholding 1-bit compressed sensing

    NASA Astrophysics Data System (ADS)

    Xu, Yingying; Kabashima, Yoshiyuki

    2016-08-01

    The one-bit compressed sensing framework aims to reconstruct a sparse signal by only using the sign information of its linear measurements. To compensate for the loss of scale information, past studies in the area have proposed recovering the signal by imposing an additional constraint on the l2-norm of the signal. Recently, an alternative strategy that captures scale information by introducing a threshold parameter to the quantization process was advanced. In this paper, we analyze the typical behavior of thresholding 1-bit compressed sensing utilizing the replica method of statistical mechanics, so as to gain insight into properly setting the threshold value. Our result shows that, statistically, fixing the threshold at an optimally tuned constant value yields better performance than varying it randomly. Unfortunately, the optimal threshold value depends on the statistical properties of the target signal, which may not be known in advance. In order to handle this inconvenience, we develop a heuristic that adaptively tunes the threshold parameter based on the frequency of positive (or negative) values in the binary outputs. Numerical experiments show that the heuristic exhibits satisfactory performance while incurring low computational cost.
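
    The flavor of the adaptive heuristic (tuning the threshold from the frequency of positive values in the binary outputs) can be sketched, under illustrative assumptions about the signal and measurement model, as a bisection on the threshold until the sign measurements are balanced. This is a toy illustration, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sparse target signal and Gaussian measurement matrix.
n, m, k = 100, 400, 10
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(n)

def positive_fraction(tau):
    """Fraction of +1 outputs when measurements are quantized at threshold tau."""
    return float(np.mean(np.sign(A @ x - tau) > 0))

# Bisect on tau until the binary outputs are roughly balanced (half positive),
# mimicking threshold tuning driven by the frequency of positive signs.
lo, hi = -3.0, 3.0
for _ in range(50):
    mid = (lo + hi) / 2
    if positive_fraction(mid) > 0.5:
        lo = mid
    else:
        hi = mid
tau = (lo + hi) / 2
print(tau, positive_fraction(tau))
```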

  2. Statistical inference in behavior analysis: Friend or foe?

    PubMed Central

    Baron, Alan

    1999-01-01

    Behavior analysts are undecided about the proper role to be played by inferential statistics in behavioral research. The traditional view, as expressed in Sidman's Tactics of Scientific Research (1960), was that inferential statistics has no place within a science that focuses on the steady-state behavior of individual organisms. Despite this admonition, there have been steady inroads of statistical techniques into behavior analysis since then, as evidenced by publications in the Journal of the Experimental Analysis of Behavior. The issues raised by these developments were considered at a panel held at the 24th annual convention of the Association for Behavior Analysis, Orlando, Florida (May, 1998). The proceedings are reported in this and the following articles. PMID:22478323

  3. Statistical inference in behavior analysis: Experimental control is better

    PubMed Central

    Perone, Michael

    1999-01-01

    Statistical inference promises automatic, objective, reliable assessments of data, independent of the skills or biases of the investigator, whereas the single-subject methods favored by behavior analysts often are said to rely too much on the investigator's subjective impressions, particularly in the visual analysis of data. In fact, conventional statistical methods are difficult to apply correctly, even by experts, and the underlying logic of null-hypothesis testing has drawn criticism since its inception. By comparison, single-subject methods foster direct, continuous interaction between investigator and subject and development of strong forms of experimental control that obviate the need for statistical inference. Treatment effects are demonstrated in experimental designs that incorporate replication within and between subjects, and the visual analysis of data is adequate when integrated into such designs. Thus, single-subject methods are ideal for shaping—and maintaining—the kind of experimental practices that will ensure the continued success of behavior analysis. PMID:22478328

  4. Advanced Interval Management: A Benefit Analysis

    NASA Technical Reports Server (NTRS)

    Timer, Sebastian; Peters, Mark

    2016-01-01

    This document is the final report for the NASA Langley Research Center (LaRC)- sponsored task order 'Possible Benefits for Advanced Interval Management Operations.' Under this research project, Architecture Technology Corporation performed an analysis to determine the maximum potential benefit to be gained if specific Advanced Interval Management (AIM) operations were implemented in the National Airspace System (NAS). The motivation for this research is to guide NASA decision-making on which Interval Management (IM) applications offer the most potential benefit and warrant further research.

  5. Advanced statistical process control: controlling sub-0.18-μm lithography and other processes

    NASA Astrophysics Data System (ADS)

    Zeidler, Amit; Veenstra, Klaas-Jelle; Zavecz, Terrence E.

    2001-08-01

    access of the analysis to include the external variables involved in CMP, deposition, etc. We then applied yield analysis methods to identify the significant lithography-external process variables from the history of lots, subsequently adding the identified process variables to the signatures database and to the PPC calculations. With these improvements, the authors anticipate a 50% improvement of the process window. This improvement results in a significant reduction of rework and improved yield, depending on process demands and equipment configuration. A statistical theory that explains the PPC is then presented. This theory can be used to simulate a general PPC application. In conclusion, the PPC concept is not limited to lithography or semiconductors; in fact, it is applicable to any production process that is signature biased (e.g., the chemical or automotive industries). The requirements for PPC are large-scale data collection, a controllable process that is not too expensive to tune for every lot, and the ability to employ feedback calculations. PPC is a major change in the process management approach and therefore will first be employed where the need is high and the return on investment is fast. The best industry to start with is semiconductors, and the most likely process area to start with is lithography.

  6. Adaptive strategy for the statistical analysis of connectomes.

    PubMed

    Meskaldji, Djalel Eddine; Ottet, Marie-Christine; Cammoun, Leila; Hagmann, Patric; Meuli, Reto; Eliez, Stephan; Thiran, Jean Philippe; Morgenthaler, Stephan

    2011-01-01

    We study an adaptive statistical approach to analyze brain networks represented by brain connection matrices of interregional connectivity (connectomes). Our approach is at a middle level between a global analysis and an analysis of single connections, in that it considers subnetworks of the global brain network. These subnetworks represent either the inter-connectivity between two brain anatomical regions or the intra-connectivity within the same brain anatomical region. An appropriate summary statistic, one that characterizes a meaningful feature of the subnetwork, is evaluated. Based on this summary statistic, a statistical test is performed to derive the corresponding p-value. The reformulation of the problem in this way reduces the number of statistical tests in an orderly fashion based on our understanding of the problem. Considering the global testing problem, the p-values are corrected to control the rate of false discoveries. Finally, the procedure is followed by a local investigation within the significant subnetworks. We contrast this strategy with the one based on individual measures in terms of power. We show that this strategy has great potential, in particular in cases where the subnetworks are well defined and the summary statistics are properly chosen. As an application example, we compare structural brain connection matrices of two groups of subjects with a 22q11.2 deletion syndrome, distinguished by their IQ scores.
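
    The multiple-testing step (correcting one p-value per subnetwork to control the rate of false discoveries) can be illustrated with the Benjamini-Hochberg procedure. The p-values below are hypothetical, and BH is one common choice of FDR control rather than necessarily the authors' exact method.

```python
import numpy as np

def benjamini_hochberg(p_values, alpha=0.05):
    """Return a boolean mask of discoveries under Benjamini-Hochberg FDR control."""
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, m + 1) / m      # step-up critical values
    passed = p[order] <= thresholds
    # Reject the k smallest p-values, where k is the largest index still passing.
    k = int(np.max(np.nonzero(passed)[0])) + 1 if passed.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

# Hypothetical p-values, one per subnetwork summary statistic.
p_vals = [0.001, 0.008, 0.039, 0.041, 0.2, 0.6, 0.9]
print(benjamini_hochberg(p_vals))
```

    Subnetworks whose entries come back True would then receive the local, connection-level follow-up described above.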

  7. Data analysis using the Gnu R system for statistical computation

    SciTech Connect

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  8. Improving statistical analysis of matched case-control studies.

    PubMed

    Conway, Aaron; Rolley, John X; Fulbrook, Paul; Page, Karen; Thompson, David R

    2013-06-01

    Matched case-control research designs can be useful because matching can increase power due to reduced variability between subjects. However, inappropriate statistical analysis of matched data could result in a change in the strength of association between the dependent and independent variables or a change in the significance of the findings. We sought to ascertain whether matched case-control studies published in the nursing literature utilized appropriate statistical analyses. Of 41 articles identified that met the inclusion criteria, 31 (76%) used an inappropriate statistical test for comparing data derived from case subjects and their matched controls. In response to this finding, we developed an algorithm to support decision-making regarding statistical tests for matched case-control studies.
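
    For the simplest matched design, paired binary exposure data from 1:1 matched pairs, an appropriate analysis is McNemar's test, which conditions on the discordant pairs instead of treating cases and controls as independent samples. A minimal sketch with hypothetical counts, using the exact binomial form of the test:

```python
from scipy import stats

# Hypothetical counts from matched case-control pairs:
# b = pairs where only the case was exposed, c = pairs where only the control was.
b, c = 25, 10

# McNemar's exact test: under H0 the discordant pairs split 50/50,
# so b follows Binomial(b + c, 0.5).
result = stats.binomtest(b, b + c, 0.5)
print(result.pvalue)
```

    An unpaired chi-squared test on the same data, the kind of inappropriate analysis the review found in most articles, would ignore the matching and can misstate both the strength and the significance of the association.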

  9. Monte Carlo Simulations in Statistical Physics -- From Basic Principles to Advanced Applications

    NASA Astrophysics Data System (ADS)

    Janke, Wolfhard

    2013-08-01

    This chapter starts with an overview of Monte Carlo computer simulation methodologies which are illustrated for the simple case of the Ising model. After reviewing importance sampling schemes based on Markov chains and standard local update rules (Metropolis, Glauber, heat-bath), nonlocal cluster-update algorithms are explained which drastically reduce the problem of critical slowing down at second-order phase transitions and thus improve the performance of simulations. How this can be quantified is explained in the section on statistical error analyses of simulation data including the effect of temporal correlations and autocorrelation times. Histogram reweighting methods are explained in the next section. Eventually, more advanced generalized ensemble methods (simulated and parallel tempering, multicanonical ensemble, Wang-Landau method) are discussed which are particularly important for simulations of first-order phase transitions and, in general, of systems with rare-event states. The setup of scaling and finite-size scaling analyses is the content of the following section. The chapter concludes with two advanced applications to complex physical systems. The first example deals with a quenched, diluted ferromagnet, and in the second application we consider the adsorption properties of macromolecules such as polymers and proteins to solid substrates. Such systems often require especially tailored algorithms for their efficient and successful simulation.
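
    The standard local Metropolis update for the 2D Ising model, the starting point of the chapter, can be sketched as follows. The lattice size, temperature, and run length are illustrative; the run starts from the ordered state so that a short simulation stays equilibrated below the critical temperature.

```python
import numpy as np

rng = np.random.default_rng(2)

# Metropolis single-spin-flip dynamics for the 2D Ising model (J = k_B = 1).
L, T, sweeps = 16, 1.5, 200
spins = np.ones((L, L), dtype=int)   # ordered start (all spins up)

def sweep(spins, T):
    for _ in range(spins.size):
        i, j = rng.integers(L), rng.integers(L)
        # Sum of the four nearest neighbours with periodic boundaries.
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb    # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1        # accept the flip (Metropolis criterion)

for _ in range(sweeps):
    sweep(spins, T)

# Below the critical temperature (about 2.27) the magnetization stays large.
m = abs(spins.mean())
print(m)
```

    Near the critical point this local update suffers exactly the critical slowing down discussed in the chapter, which is what the nonlocal cluster algorithms are designed to cure.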

  10. Statistics and bioinformatics in nutritional sciences: analysis of complex data in the era of systems biology.

    PubMed

    Fu, Wenjiang J; Stromberg, Arnold J; Viele, Kert; Carroll, Raymond J; Wu, Guoyao

    2010-07-01

    Over the past 2 decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have the advanced methodologies for the analysis of DNA, RNA, protein, low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (Type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlighted various sources of experimental variations in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provided guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine growth retardation).
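
    The sample size calculation mentioned among the general terms can be illustrated for a two-group comparison of means using the normal approximation; the effect size, power, and alpha below are conventional illustrative choices, and an exact t-based calculation would give a slightly larger n.

```python
import math
from scipy import stats

def sample_size_two_groups(effect_size, alpha=0.05, power=0.8):
    """Per-group sample size for comparing two means (normal approximation)."""
    z_alpha = stats.norm.ppf(1 - alpha / 2)   # two-sided Type I error quantile
    z_beta = stats.norm.ppf(power)            # Type II error = 1 - power
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return math.ceil(n)

# A medium standardized effect (Cohen's d = 0.5) at 80% power, alpha = 0.05:
print(sample_size_two_groups(0.5))
```

    Halving the detectable effect size roughly quadruples the required n, which is why power analysis belongs at the design stage rather than after the data are collected.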

  11. Statistics and bioinformatics in nutritional sciences: analysis of complex data in the era of systems biology⋆

    PubMed Central

    Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao

    2009-01-01

    Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have advanced methodologies for the analysis of DNA, RNA, proteins, and low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlight various sources of experimental variations in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provide guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine growth retardation). PMID:20233650

  12. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2012-08-01

    this problem with a fingerprinting algorithm that inverts for target location and orientation while holding polarizations fixed at their library values...Cross-Domain Multitask Learning with Latent Probit Mod- els,” Proc. Int. Conf. Machine Learning (ICML), 2012 L. Beran, S.D. Billings and D. Oldenburg

  13. PREFACE: Advanced many-body and statistical methods in mesoscopic systems

    NASA Astrophysics Data System (ADS)

    Anghel, Dragos Victor; Sabin Delion, Doru; Sorin Paraoanu, Gheorghe

    2012-02-01

    It has increasingly been realized in recent times that the borders separating various subfields of physics are largely artificial. This is the case for nanoscale physics, physics of lower-dimensional systems and nuclear physics, where the advanced techniques of many-body theory developed in recent years could provide a unifying framework for these disciplines under the general name of mesoscopic physics. Other fields, such as quantum optics and quantum information, are increasingly using related methods. The 6-day conference 'Advanced many-body and statistical methods in mesoscopic systems' that took place in Constanta, Romania, between 27 June and 2 July 2011 was, we believe, a successful attempt at bridging an impressive list of topical research areas: foundations of quantum physics, equilibrium and non-equilibrium quantum statistics/fractional statistics, quantum transport, phases and phase transitions in mesoscopic systems/superfluidity and superconductivity, quantum electromechanical systems, quantum dissipation, dephasing, noise and decoherence, quantum information, spin systems and their dynamics, fundamental symmetries in mesoscopic systems, phase transitions, exactly solvable methods for mesoscopic systems, various extensions of the random phase approximation, open quantum systems, clustering, decay and fission modes and systematic versus random behaviour of nuclear spectra. This event brought together participants from seventeen countries and five continents. Each of the participants brought considerable expertise in his/her field of research and, at the same time, was exposed to the newest results and methods coming from the other, seemingly remote, disciplines. The talks touched on subjects that are at the forefront of topical research areas and we hope that the resulting cross-fertilization of ideas will lead to new, interesting results from which everybody will benefit.
We are grateful for the financial and organizational support from IFIN-HH, Ovidius

  14. A Kernel of Truth: Statistical Advances in Polygenic Variance Component Models for Complex Human Pedigrees

    PubMed Central

    Blangero, John; Diego, Vincent P.; Dyer, Thomas D.; Almeida, Marcio; Peralta, Juan; Kent, Jack W.; Williams, Jeff T.; Almasy, Laura; Göring, Harald H. H.

    2014-01-01

    Statistical genetic analysis of quantitative traits in large pedigrees is a formidable computational task due to the necessity of taking the non-independence among relatives into account. With the growing awareness that rare sequence variants may be important in human quantitative variation, heritability and association study designs involving large pedigrees will increase in frequency due to the greater chance of observing multiple copies of rare variants amongst related individuals. Therefore, it is important to have statistical genetic test procedures that utilize all available information for extracting evidence regarding genetic association. Optimal testing for marker/phenotype association involves the exact calculation of the likelihood ratio statistic which requires the repeated inversion of potentially large matrices. In a whole genome sequence association context, such computation may be prohibitive. Toward this end, we have developed a rapid and efficient eigensimplification of the likelihood that makes analysis of family data commensurate with the analysis of a comparable sample of unrelated individuals. Our theoretical results which are based on a spectral representation of the likelihood yield simple exact expressions for the expected likelihood ratio test statistic (ELRT) for pedigrees of arbitrary size and complexity. For heritability, the ELRT is: −∑_i ln[1 + ĥ²(λ_gi − 1)], where ĥ² and λ_gi are, respectively, the heritability and the eigenvalues of the pedigree-derived genetic relationship kernel (GRK). For association analysis of sequence variants, the ELRT is given by ELRT[h_q² > 0 : unrelateds] − (ELRT[h_t² > 0 : pedigrees] − ELRT[h_r² > 0 : pedigrees]), where h_t², h_q², and h_r² are the total, quantitative trait nucleotide, and residual heritabilities, respectively. Using these results, fast and accurate analytical power analyses are possible, eliminating the need for computer simulation. Additional benefits of eigensimplification include a simple method for
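The heritability expression above can be evaluated directly once the eigenvalues of the GRK are known. A minimal sketch (the function name and example values are illustrative, not from the paper):

```python
import numpy as np

def expected_lrt_heritability(h2, eigenvalues):
    """ELRT = -sum_i ln[1 + h2*(lambda_gi - 1)], the abstract's spectral
    expression; h2 is the heritability, eigenvalues are those of the
    pedigree-derived genetic relationship kernel (GRK)."""
    lam = np.asarray(eigenvalues, dtype=float)
    return -np.sum(np.log1p(h2 * (lam - 1.0)))

# For unrelated individuals the GRK is the identity (all eigenvalues 1),
# so the expected statistic is 0: relatedness is what carries the signal.
elrt_unrelated = expected_lrt_heritability(0.5, np.ones(100))
elrt_related = expected_lrt_heritability(0.5, [2.0, 0.5, 0.5, 1.0, 1.0])
```

Since ln is concave and the GRK eigenvalues average to one, the sum of logs is non-positive and the ELRT is non-negative, growing with the spread of the eigenvalues.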

  15. A novel statistic for genome-wide interaction analysis.

    PubMed

    Wu, Xuesen; Dong, Hua; Luo, Li; Zhu, Yun; Peng, Gang; Reveille, John D; Xiong, Momiao

    2010-09-23

    Although great progress in genome-wide association studies (GWAS) has been made, the significant SNP associations identified by GWAS account for only a few percent of the genetic variance, leading many to question where and how we can find the missing heritability. There is increasing interest in genome-wide interaction analysis as a possible source of finding heritability unexplained by current GWAS. However, the existing statistics for testing interaction have low power for genome-wide interaction analysis. To meet the challenges raised by genome-wide interaction analysis, we have developed a novel statistic for testing interaction between two loci (either linked or unlinked). The null distribution and the type I error rates of the new statistic for testing interaction are validated using simulations. Extensive power studies show that the developed statistic has much higher power to detect interaction than classical logistic regression. The results identified 44 and 211 pairs of SNPs showing significant evidence of interactions with FDR < 0.001 and 0.001 < FDR < 0.05, respectively. Genome-wide interaction analysis is a valuable tool for finding remaining missing heritability unexplained by the current GWAS, and the developed novel statistic is able to detect significant interactions between SNPs across the genome. Real data analysis showed that the results of genome-wide interaction analysis can be replicated in two independent studies.
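The classical logistic-regression interaction test that the authors use as their power baseline can be sketched as a 1-df likelihood-ratio test. The IRLS fitter, the additive genotype coding, and the simulated effect sizes below are illustrative, not the paper's implementation:

```python
import numpy as np
from scipy.stats import chi2

def logistic_loglik(X, y, max_iter=50):
    """Fit logistic regression by Newton's method (IRLS) and return
    the maximised log-likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        # Newton step: beta += (X' W X)^-1 X' (y - p)
        step = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
        beta += step
        if np.max(np.abs(step)) < 1e-8:
            break
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

def interaction_lrt_p(g1, g2, y):
    """Likelihood-ratio test for a g1 x g2 interaction on disease status y
    (genotypes coded 0/1/2 with additive main effects), 1 df."""
    ones = np.ones_like(y, dtype=float)
    X0 = np.column_stack([ones, g1, g2])            # main effects only
    X1 = np.column_stack([ones, g1, g2, g1 * g2])   # plus interaction
    lrt = 2.0 * (logistic_loglik(X1, y) - logistic_loglik(X0, y))
    return chi2.sf(max(lrt, 0.0), df=1)

rng = np.random.default_rng(1)
g1 = rng.binomial(2, 0.3, size=2000).astype(float)
g2 = rng.binomial(2, 0.3, size=2000).astype(float)
logit = -1.0 + 1.5 * g1 * g2                  # a strong pure interaction
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit))).astype(float)
p_value = interaction_lrt_p(g1, g2, y)        # should be very small here
```

Running this test for every SNP pair across a genome is exactly the computational and multiple-testing burden that motivates faster specialized statistics.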

  16. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-07-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data to statistical methods and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can be checked for any data set easily. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a data set is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology, we chose two large data sets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose the incorporation of this analysis into future investigations to check the objectivity of results achieved by different working groups and guarantee the meaningfulness of the interpretation.

  17. Introduction to Statistics and Data Analysis With Computer Applications I.

    ERIC Educational Resources Information Center

    Morris, Carl; Rolph, John

    This document consists of unrevised lecture notes for the first half of a 20-week in-house graduate course at Rand Corporation. The chapter headings are: (1) Histograms and descriptive statistics; (2) Measures of dispersion, distance and goodness of fit; (3) Using JOSS for data analysis; (4) Binomial distribution and normal approximation; (5)…

  18. AMA Statistical Information Based Analysis of a Compressive Imaging System

    NASA Astrophysics Data System (ADS)

    Hope, D.; Prasad, S.

    Recent advances in optics and instrumentation have dramatically increased the amount of data, both spatial and spectral, that can be obtained about a target scene. The volume of the acquired data can and, in fact, often does far exceed the amount of intrinsic information present in the scene. In such cases, the large volume of data alone can impede the analysis and extraction of relevant information about the scene. One approach to overcoming this impedance mismatch between the volume of data and the intrinsic information in the scene that the data are supposed to convey is compressive sensing. Compressive sensing exploits the fact that most signals of interest, such as image scenes, possess natural correlations in their physical structure. These correlations, which can occur spatially as well as spectrally, can suggest a more natural sparse basis for compressing and representing the scene than standard pixels or voxels. A compressive sensing system attempts to acquire and encode the scene in this sparse basis, while preserving all relevant information in the scene. One criterion for assessing the content, acquisition, and processing of information in the image scene is Shannon information. This metric describes fundamental limits on encoding and reliably transmitting information about a source, such as an image scene. In this framework, successful encoding of the image requires an optimal choice of a sparse basis, while losses of information during transmission occur due to a finite system response and measurement noise. An information source can be represented by a certain class of image scenes, e.g., those that have a common morphology. The ability to associate the recorded image with the correct member of the class that produced the image depends on the amount of Shannon information in the acquired data. In this manner, one can analyze the performance of a compressive imaging system for a specific class or ensemble of image scenes. We present such an information

  19. A statistical model for iTRAQ data analysis.

    PubMed

    Hill, Elizabeth G; Schwacke, John H; Comte-Walters, Susana; Slate, Elizabeth H; Oberg, Ann L; Eckel-Passow, Jeanette E; Therneau, Terry M; Schey, Kevin L

    2008-08-01

    We describe biological and experimental factors that induce variability in reporter ion peak areas obtained from iTRAQ experiments. We demonstrate how these factors can be incorporated into a statistical model for use in evaluating differential protein expression and highlight the benefits of using analysis of variance to quantify fold change. We demonstrate the model's utility based on an analysis of iTRAQ data derived from a spike-in study.
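The ANOVA-based fold-change idea can be illustrated with a stripped-down case: with one measurement per peptide per condition, a model with peptide and condition effects reduces to a paired analysis of per-peptide log-ratios. The synthetic numbers below are illustrative; the paper's full model additionally handles multiple experiments and a richer error structure:

```python
import numpy as np
from scipy import stats

def anova_fold_change(log2_a, log2_b):
    """log2 fold change between conditions with a peptide (block) effect
    removed: with one observation per peptide per condition, the two-way
    ANOVA estimate is the mean of the per-peptide log-ratios."""
    d = np.asarray(log2_b, float) - np.asarray(log2_a, float)
    t, p = stats.ttest_1samp(d, 0.0)
    return d.mean(), p

rng = np.random.default_rng(5)
peptide = rng.normal(14.0, 1.0, 10)                 # peptide-specific baselines
cond_a = peptide + rng.normal(0.0, 0.15, 10)        # condition A log2 areas
cond_b = peptide + 1.0 + rng.normal(0.0, 0.15, 10)  # condition B: true log2 FC = 1
log2_fc, p = anova_fold_change(cond_a, cond_b)
```

Blocking on peptide removes the large peptide-to-peptide ionization differences from the error term, which is the main benefit of the ANOVA formulation over naively averaging raw areas.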

  20. A Computer Aided Statistical Covariance Program for Missile System Analysis

    DTIC Science & Technology

    1974-04-01

    ENGINEERING RESEARCH, OKLAHOMA STATE UNIVERSITY. A COMPUTER AIDED STATISTICAL COVARIANCE PROGRAM FOR MISSILE SYSTEM ANALYSIS. U. S. Army Missile...ORGANIZATION NAME AND ADDRESS 10. PROGRAM ELEMENT, PROJECT, TASK AREA & WORK UNIT NUMBERS: Office of Engineering Rsch, Oklahoma State Univ...ANALYSIS by James R. Rowland and V. M. Gupta, School of Electrical Engineering. Approved for public release; distribution unlimited. Office of Engineering

  1. HistFitter software framework for statistical data analysis

    NASA Astrophysics Data System (ADS)

    Baak, M.; Besjes, G. J.; Côté, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-04-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface.

  2. Advanced Climate Analysis and Long Range Forecasting

    DTIC Science & Technology

    2014-09-30

    1 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Advanced Climate Analysis and Long Range Forecasting...project is to improve the long range and climate support provided by the U.S. Naval Oceanography Enterprise (NOe) for planning, conducting, and...months, several seasons, several years). The primary transition focus is on improving the long range and climate support capabilities of the Fleet

  3. Statistical Analysis of CMC Constituent and Processing Data

    NASA Technical Reports Server (NTRS)

    Fornuff, Jonathan

    2004-01-01

    observed using statistical analysis software. The ultimate purpose of this study is to determine what variations in material processing can lead to the most critical changes in the material's properties. The work I have taken part in this summer explores, in general, the key properties needed. In this study, SiC/SiC composites of varying architectures, utilizing a boron-nitride (BN)

  4. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    SciTech Connect

    Reed, J.K.

    1999-10-20

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and "expert" data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) Development of a groundwater quality baseline prior to remediation startup, (2) Targeting of constituents for removal from the RCRA GWPS, (3) Targeting of constituents for removal from the UIC permit, (4) Targeting of constituents for reduced, (5) Targeting of monitoring wells not producing representative samples, (6) Reduction in statistical evaluation, and (7) Identification of contamination from other facilities.

  5. Feature-Based Statistical Analysis of Combustion Simulation Data

    SciTech Connect

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. 
We highlight the utility of this new framework for combustion

  6. SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS

    NASA Technical Reports Server (NTRS)

    Brownlow, J. D.

    1994-01-01

    The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval
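The basic time- and frequency-domain estimates that SPA computes can be sketched with NumPy; the test signal, window choice, and sampling rate below are illustrative (SPA itself is a standalone NASA package):

```python
import numpy as np

rng = np.random.default_rng(42)
dt = 0.01                                   # sampling interval, seconds
t = np.arange(0, 10, dt)
# A 5 Hz tone buried in Gaussian noise stands in for measured data.
x = np.sin(2 * np.pi * 5 * t) + 0.5 * rng.standard_normal(t.size)

# Time-domain estimates, as in SPA's time-domain analysis.
mean, var = x.mean(), x.var(ddof=1)
rms = np.sqrt(np.mean(x ** 2))

# Frequency-domain estimate: a Hamming-windowed periodogram.
w = np.hamming(x.size)
X = np.fft.rfft((x - mean) * w)
freqs = np.fft.rfftfreq(x.size, d=dt)
psd = np.abs(X) ** 2 / (np.sum(w ** 2) / dt)  # one-sided PSD estimate
peak_hz = freqs[np.argmax(psd)]               # should sit near 5 Hz
```

Windowing trades frequency resolution for reduced spectral leakage, which is why SPA offers a menu of windows (Hann-Tukey, Hamming, Bartlett, moving average) rather than a single choice.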

  7. Advanced Fuel Cycle Economic Sensitivity Analysis

    SciTech Connect

    David Shropshire; Kent Williams; J.D. Smith; Brent Boore

    2006-12-01

    A fuel cycle economic analysis was performed on four fuel cycles to provide a baseline for initial cost comparison using the Gen IV Economic Modeling Work Group G4 ECON spreadsheet model, Decision Programming Language software, the 2006 Advanced Fuel Cycle Cost Basis report, industry cost data, international papers, and nuclear power cost studies from MIT, Harvard, and the University of Chicago. The analysis developed and compared the fuel cycle cost component of the total cost of energy for a wide range of fuel cycles including once-through, thermal with fast recycle, continuous fast recycle, and thermal recycle.

  8. Statistical model applied to motor evoked potentials analysis.

    PubMed

    Ma, Ying; Thakor, Nitish V; Jia, Xiaofeng

    2011-01-01

    Motor evoked potentials (MEPs) convey information regarding the functional integrity of the descending motor pathways. Absence of the MEP has been used as a neurophysiological marker to suggest cortico-spinal abnormalities in the operating room. Due to their high variability and sensitivity, detailed quantitative studies of MEPs are lacking. This paper applies a statistical method to characterize MEPs by estimating the number of motor units and single motor unit potential amplitudes. A clearly increasing trend of single motor unit potential amplitudes in the MEPs after each pulse of the stimulation pulse train is revealed by this method. This statistical method eliminates the effects of anesthesia, and provides an objective assessment of MEPs. Consequently this statistical method has high potential to be useful in future quantitative MEPs analysis.

  9. Adiyaman wind potential and statistical analysis, in Turkey

    NASA Astrophysics Data System (ADS)

    Sogukpinar, Haci; Bozkurt, Ismail

    2017-02-01

    In this study, the wind potential of Adiyaman is analyzed statistically and the installed wind capacity across Turkey is summarized. One year of experimental data was obtained for the district of Adiyaman. The data were taken from the two major stations of Sincik and Kahta, which determine the wind potential of Adiyaman. Measurements at 10 m height are used for the statistical analysis. From the data obtained, monthly average wind speeds are calculated and statistical analyses are performed using the Weibull, Gamma, and Log-normal distributions. Data from the Sincik station represent the windy part of Adiyaman, so the average wind speed there is higher; Kahta represents the less windy part, with a lower average wind speed. This study shows that the Gamma distribution provides the best fit to the measurements.
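Fitting and comparing the three candidate distributions can be sketched with SciPy. The simulated wind speeds below merely stand in for the station measurements, and the shape/scale values are illustrative:

```python
from scipy import stats

# Hypothetical hourly mean wind speeds (m/s): a Weibull sample with
# shape 2 and scale 6 stands in for the station data.
wind = stats.weibull_min.rvs(c=2.0, scale=6.0, size=2000, random_state=7)

# Fit the three candidate families used in the study (location fixed at 0).
k, _, lam = stats.weibull_min.fit(wind, floc=0)
a_g, _, scale_g = stats.gamma.fit(wind, floc=0)
s_ln, _, scale_ln = stats.lognorm.fit(wind, floc=0)

# Compare goodness of fit by Kolmogorov-Smirnov distance (smaller = better).
d_w = stats.kstest(wind, 'weibull_min', args=(k, 0, lam)).statistic
d_g = stats.kstest(wind, 'gamma', args=(a_g, 0, scale_g)).statistic
d_ln = stats.kstest(wind, 'lognorm', args=(s_ln, 0, scale_ln)).statistic
```

With real station data the same comparison, run per month or per station, is what lets a study conclude that one family (here, Gamma in the paper's case) fits best.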

  10. Geographic analysis of forest health indicators using spatial scan statistics.

    PubMed

    Coulston, John W; Riitters, Kurt H

    2003-06-01

    Geographically explicit analysis tools are needed to assess forest health indicators that are measured over large regions. Spatial scan statistics can be used to detect spatial or spatiotemporal clusters of forests representing hotspots of extreme indicator values. This paper demonstrates the approach through analyses of forest fragmentation indicators in the southeastern United States and insect and pathogen indicators in the Pacific Northwest United States. The scan statistic detected four spatial clusters of fragmented forest including a hotspot in the Piedmont and Coastal Plain region. Three recurring clusters of insect and pathogen occurrence were found in the Pacific Northwest. Spatial scan statistics are a powerful new tool that can be used to identify potential forest health problems.
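The Bernoulli-model scan statistic underlying such analyses can be sketched on a toy grid. Square windows stand in for Kulldorff's circular windows, and the case/population grids are illustrative; a forest-health application would use plot locations and indicator exceedances:

```python
import numpy as np

def bernoulli_scan_llr(cases_in, n_in, cases_tot, n_tot):
    """Kulldorff-style log-likelihood ratio for a Bernoulli scan window;
    nonzero only when the inside rate exceeds the outside rate."""
    c, n, C, N = cases_in, n_in, cases_tot, n_tot
    if c / n <= (C - c) / (N - n):
        return 0.0
    def ll(k, m):  # log-likelihood of k successes in m trials at rate k/m
        out = 0.0
        if 0 < k:
            out += k * np.log(k / m)
        if k < m:
            out += (m - k) * np.log(1 - k / m)
        return out
    return ll(c, n) + ll(C - c, N - n) - ll(C, N)

def scan_grid(cases, pop, radii=(1, 2, 3)):
    """Scan square windows of several radii over a square grid of case
    counts and populations; return the best LLR and its window centre."""
    C, N = cases.sum(), pop.sum()
    best = (0.0, None)
    R = cases.shape[0]
    for r in radii:
        for i in range(R):
            for j in range(R):
                sl = (slice(max(i - r, 0), i + r + 1),
                      slice(max(j - r, 0), j + r + 1))
                c, n = cases[sl].sum(), pop[sl].sum()
                if 0 < n < N:
                    llr = bernoulli_scan_llr(c, n, C, N)
                    if llr > best[0]:
                        best = (llr, (i, j))
    return best

pop = np.full((10, 10), 100.0)
cases = np.full((10, 10), 5.0)
cases[1:4, 1:4] = 30.0        # an implanted hotspot block
best_llr, centre = scan_grid(cases, pop)
```

In practice the significance of the best window is judged by Monte Carlo: the maximum LLR is recomputed on many datasets simulated under the null, and the observed maximum is ranked among them.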

  11. ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)

    EPA Science Inventory

    The availability of geographically indexed health and population data, with advances in computing, geographical information systems and statistical methodology, have opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...

  12. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  13. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  14. Analysis of the chaotic maps generating different statistical distributions

    NASA Astrophysics Data System (ADS)

    Lawnik, M.

    2015-09-01

    An analysis of chaotic maps that enable the generation of numbers from given statistical distributions is presented. The analyzed chaotic maps have the form x_{k+1} = F^{-1}(U(F(x_k))), where F is the cumulative distribution function, U is the skew tent map, and F^{-1} is the inverse of F. The analysis is illustrated with the example of a chaotic map for the standard normal distribution, in view of its computational efficiency and accuracy. The analysis indicates that the method does not always generate values from the given distribution.
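The recipe x_{k+1} = F^{-1}(U(F(x_k))) can be implemented directly. A sketch with F the standard normal CDF; the skew parameter, starting point, and sample size are illustrative choices:

```python
import numpy as np
from scipy.stats import norm

def skew_tent(u, a=0.7):
    """Skew tent map on [0, 1]; it preserves the uniform distribution."""
    return np.where(u < a, u / a, (1 - u) / (1 - a))

def chaotic_normal_series(x0, n, a=0.7):
    """Iterate x_{k+1} = F^{-1}(U(F(x_k))) with F the standard normal CDF
    and U the skew tent map, as in the abstract."""
    xs = np.empty(n)
    x = x0
    for k in range(n):
        x = norm.ppf(skew_tent(norm.cdf(x), a))
        xs[k] = x
    return xs

xs = chaotic_normal_series(0.3, 10000)
```

Because the skew tent map leaves the uniform distribution invariant, pushing the orbit through F and F^{-1} should yield samples with distribution F; the paper's caveat is that finite precision and the round trip through the CDF and its inverse can degrade this in practice.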

  15. Wavelet analysis in ecology and epidemiology: impact of statistical tests

    PubMed Central

    Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario

    2014-01-01

    Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the ‘beta-surrogate’ method. PMID:24284892
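
    As a rough illustration of one of the null models discussed here, the following sketch builds a red-noise (AR(1)) surrogate that matches a series' variance and lag-1 autocorrelation; the estimator and simulation details are generic assumptions, not the authors' exact procedure:

```python
import random
import statistics

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a sequence."""
    m = statistics.fmean(x)
    num = sum((a - m) * (b - m) for a, b in zip(x[:-1], x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

def red_noise_surrogate(x, rng=random):
    """AR(1) ('red noise') surrogate matching the series' variance and
    lag-1 autocorrelation, a common null model for wavelet significance tests."""
    phi = lag1_autocorr(x)
    sd = statistics.pstdev(x)
    eps_sd = sd * (1.0 - phi * phi) ** 0.5  # innovation std. dev.
    y = [rng.gauss(0.0, sd)]
    for _ in range(len(x) - 1):
        y.append(phi * y[-1] + rng.gauss(0.0, eps_sd))
    return y
```

    Repeating the wavelet computation on many such surrogates yields the null distribution against which the observed wavelet quantities are compared.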

  16. Wavelet analysis in ecology and epidemiology: impact of statistical tests.

    PubMed

    Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario

    2014-02-06

    Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the 'beta-surrogate' method.

  17. Aspects of design and statistical analysis in the Comet assay.

    PubMed

    Wiklund, Stig Johan; Agurell, Eva

    2003-03-01

    Some aspects of the statistical design and analysis of the Comet (single cell gel electrophoresis) assay have been evaluated by means of a simulation study. The tail length and tail moment were selected for the quantification of DNA migration. Results from the simulation study showed that the choice of measure to summarize the cells on each slide is extremely important in order to facilitate an efficient analysis. For tail moment, the mean of log transformed data is clearly superior to the other evaluated measures, whereas using the mean of raw data without transformation can lead to very inefficient analyses. The 90th percentile, capturing the upper tail of the distribution, performs well for the tail length, with a slight improvement obtained by applying a log transformation prior to calculations. Furthermore, the simulation study has been used to assess the appropriateness of some models for statistical analysis and to address the issue of design (i.e. number of cultures or animals in each group, number of slides per animal/culture and number of cells scored per slide). Combining the results from the simulations with practical experience from the pharmaceutical industry, we conclude the paper by providing concise recommendations regarding the design and statistical analysis in the Comet assay.
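
    The choice of per-slide summary measure can be sketched as follows; the lognormal cell-level model and slide size are hypothetical stand-ins, not the simulation design of the study:

```python
import math
import random

def slide_summaries(cells):
    """Candidate per-slide summaries for cell-level migration data:
    mean of raw values, mean of log-transformed values, 90th percentile."""
    return {
        "mean_raw": sum(cells) / len(cells),
        "mean_log": sum(math.log(c) for c in cells) / len(cells),
        "p90": sorted(cells)[int(0.9 * len(cells))],
    }

rng = random.Random(1)
# Hypothetical skewed (lognormal) tail moments for one slide of 50 cells.
cells = [math.exp(rng.gauss(0.0, 1.0)) for _ in range(50)]
summ = slide_summaries(cells)
```

    For strongly right-skewed data, the raw mean is dominated by a few extreme cells, which is why log-based summaries tend to yield more efficient analyses.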

  18. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-03-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other measurements in physical volcanology, the density and porosity of juvenile clasts are some of the most frequently used characteristics to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using the statistical tools presented here, the meaningfulness of a conclusion can easily be checked for any dataset. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a dataset is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology we chose two large datasets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose adding this analysis to future investigations to check the objectivity of results achieved by different working groups and to guarantee the meaningfulness of the interpretation.
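
    Graphical statistics in the spirit of grain-size analysis can be sketched as below; the median and Inman-style sorting shown are generic choices and not necessarily the exact parameters used by the authors:

```python
def percentile(sorted_vals, q):
    """Linear-interpolation percentile (q in [0, 100]) of pre-sorted data."""
    idx = q / 100.0 * (len(sorted_vals) - 1)
    lo, frac = int(idx), idx - int(idx)
    hi = min(lo + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] * (1.0 - frac) + sorted_vals[hi] * frac

def graphical_stats(densities):
    """Median and Inman-style sorting (half the P16-P84 spread) for a
    clast-density dataset, analogous to graphical grain-size statistics."""
    s = sorted(densities)
    return {
        "median": percentile(s, 50),
        "sorting": (percentile(s, 84) - percentile(s, 16)) / 2.0,
    }
```

    Tracking how these statistics change as clasts are added is one simple way to judge whether a dataset is large enough for reproducible results.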

  19. Statistical analysis of single-trial Granger causality spectra.

    PubMed

    Brovelli, Andrea

    2012-01-01

    Granger causality analysis is becoming central for the analysis of interactions between neural populations and oscillatory networks. However, it is currently unclear whether single-trial estimates of Granger causality spectra can be used reliably to assess directional influence. We addressed this issue by combining single-trial Granger causality spectra with statistical inference based on general linear models. The approach was assessed on synthetic and neurophysiological data. Synthetic bivariate data was generated using two autoregressive processes with unidirectional coupling. We simulated two hypothetical experimental conditions: the first mimicked a constant and unidirectional coupling, whereas the second modelled a linear increase in coupling across trials. The statistical analysis of single-trial Granger causality spectra, based on t-tests and linear regression, successfully recovered the underlying pattern of directional influence. In addition, we characterised the minimum number of trials and coupling strengths required for significant detection of directionality. Finally, we demonstrated the relevance for neurophysiology by analysing two local field potentials (LFPs) simultaneously recorded from the prefrontal and premotor cortices of a macaque monkey performing a conditional visuomotor task. Our results suggest that the combination of single-trial Granger causality spectra and statistical inference provides a valuable tool for the analysis of large-scale cortical networks and brain connectivity.
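
    A minimal sketch of the synthetic setup described here: two AR(1) processes with unidirectional coupling, where an OLS fit recovers the x → y influence. The coefficients and trial length below are illustrative assumptions, not the paper's exact simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def coupled_trial(n=500, c=0.4):
    """One trial of two AR(1) processes with unidirectional coupling x -> y."""
    x = np.zeros(n)
    y = np.zeros(n)
    for t in range(1, n):
        x[t] = 0.5 * x[t - 1] + rng.standard_normal()
        y[t] = 0.5 * y[t - 1] + c * x[t - 1] + rng.standard_normal()
    return x, y

x, y = coupled_trial()
# OLS fit y_t ~ y_{t-1} + x_{t-1}: the coefficient on x_{t-1} should be near c,
# while the reverse regression would find no x <- y influence.
X = np.column_stack([y[:-1], x[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
```

    Collecting such estimates over many trials is what makes t-tests and linear regression across trials possible.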

  20. STATISTICAL ANALYSIS OF THE HEAVY NEUTRAL ATOMS MEASURED BY IBEX

    SciTech Connect

    Park, Jeewoo; Kucharek, Harald; Möbius, Eberhard; Galli, André; Livadiotis, George; Fuselier, Steve A.; McComas, David J.

    2015-10-15

    We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years from 2009 to 2011. The interstellar neutral (ISN) O and Ne gas flow was found in the first-year heavy neutral map at 601 eV and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to get a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to exclude background from areas where the heavy neutral signal is statistically significant. These methods also allow the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O and Ne gas flow. We call this emission the extended tail. It may be an imprint of the secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.
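
    The signal-to-noise filtering idea can be sketched for Poisson-distributed map counts as follows; the uniform background level and the 3-sigma threshold are assumptions for illustration, not the paper's actual procedure:

```python
import math

def snr_mask(counts, background, threshold=3.0):
    """Flag map pixels whose counts exceed the background by `threshold`
    standard deviations, assuming Poisson statistics (sigma ~ sqrt(counts))."""
    mask = []
    for c in counts:
        sigma = math.sqrt(max(c, 1.0))  # guard against zero-count pixels
        mask.append((c - background) / sigma >= threshold)
    return mask
```

    Pixels failing the cut are treated as background; the surviving pixels form the statistically significant emission structures.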

  1. Statistical Analysis of the Heavy Neutral Atoms Measured by IBEX

    NASA Astrophysics Data System (ADS)

    Park, Jeewoo; Kucharek, Harald; Möbius, Eberhard; Galli, André; Livadiotis, George; Fuselier, Steve A.; McComas, David J.

    2015-10-01

    We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years from 2009 to 2011. The interstellar neutral (ISN) O and Ne gas flow was found in the first-year heavy neutral map at 601 eV and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to get a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to exclude background from areas where the heavy neutral signal is statistically significant. These methods also allow the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O and Ne gas flow. We call this emission the extended tail. It may be an imprint of the secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.

  2. Advances in statistical methods to map quantitative trait loci in outbred populations.

    PubMed

    Hoeschele, I; Uimari, P; Grignola, F E; Zhang, Q; Gage, K M

    1997-11-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown.
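
    The regression of squared phenotypic differences on identity-by-descent proportions (the Haseman-Elston idea mentioned above) can be sketched on synthetic sib-pair data; all numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sib pairs: IBD sharing pi in {0, 0.5, 1} at a marker, and
# squared phenotypic differences that shrink as sharing at a linked QTL rises.
n = 400
pi = rng.choice([0.0, 0.5, 1.0], size=n)
sq_diff = 2.0 - 1.5 * pi + 0.3 * rng.standard_normal(n)

# Haseman-Elston-style regression: a significantly negative slope on pi
# is evidence for a QTL linked to the marker.
X = np.column_stack([np.ones(n), pi])
(intercept, slope), *_ = np.linalg.lstsq(X, sq_diff, rcond=None)
```

    As the abstract notes, the test-statistic distribution for such simple regressions can then be obtained by permuting the data rather than relying on asymptotic theory.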

  3. Advances in Statistical Methods to Map Quantitative Trait Loci in Outbred Populations

    PubMed Central

    Hoeschele, I.; Uimari, P.; Grignola, F. E.; Zhang, Q.; Gage, K. M.

    1997-01-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown. PMID:9383084

  4. Advanced Power Plant Development and Analysis Methodologies

    SciTech Connect

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  5. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and build on the need for more investigation into online teaching and learning in specific content (Ferdig et al., 2009; DiPietro,…

  6. A statistical analysis of mesoscale rainfall as a random cascade

    NASA Technical Reports Server (NTRS)

    Gupta, Vijay K.; Waymire, Edward C.

    1993-01-01

    The paper considers the random cascade theory for spatial rainfall. Particular attention is given to the following four areas: (1) the relationship of the random cascade theory of rainfall to the simple scaling and the hierarchical cluster-point-process theories, (2) the mathematical foundations for some of the formalisms commonly applied in the development of statistical cascade theory, (3) the empirical evidence for a random cascade theory of rainfall, and (4) the use of data to estimate parameters and to make statistical inferences within this theoretical framework. An analysis of space-time rainfall data is presented. Cascade simulations are carried out to provide a comparison with the methods of analysis that are applied to the rainfall data.
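
    A discrete multiplicative cascade of the kind analyzed here can be simulated in a few lines; the uniform mean-one generator weights and binary branching are illustrative assumptions, not the paper's specific generator:

```python
import random

def random_cascade(levels, rng=random, branching=2):
    """Discrete multiplicative cascade: start from one cell of unit mass and
    repeatedly split each cell into `branching` parts, multiplying by i.i.d.
    non-negative weights with mean 1 so mass is conserved in expectation."""
    masses = [1.0]
    for _ in range(levels):
        nxt = []
        for m in masses:
            for _ in range(branching):
                w = rng.uniform(0.0, 2.0)  # mean-one generator weight (assumed)
                nxt.append(m * w / branching)
        masses = nxt
    return masses
```

    Moments of the resulting small-scale masses across many realizations are what cascade parameter estimation works from.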

  7. Three-parameter probability distribution density for statistical image analysis

    NASA Astrophysics Data System (ADS)

    Schau, H. C.

    1980-01-01

    Statistical analysis of 2-D image data or data gathered from a scanning radiometer requires that both the non-Gaussian nature and the finite sample size of the process be considered. To aid the statistical analysis of these data, a higher-moment description density function has been defined, and its parameters have been identified with the estimated moments of the data. It is shown that the first two moments may be computed from a knowledge of the Wiener spectrum, whereas all higher moments require the complex spatial frequency spectrum. Parameter identification is carried out for a three-parameter density function and applied to a scene in the IR region, 8-14 microns. Results indicate that a three-parameter distribution density generally provides different probabilities than does a two-parameter Gaussian description if maximum entropy (minimum bias) forms are sought.

  8. Collagen morphology and texture analysis: from statistics to classification

    PubMed Central

    Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.

    2013-01-01

    In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerosis arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen re-modeling, such as in skin disorders, different types of fibrosis and muscular-skeletal diseases affecting ligaments and cartilage. PMID:23846580
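
    The GLCM texture features named above (here, contrast and homogeneity for horizontal neighbor pairs at four gray levels) can be computed directly; this is a minimal sketch, not the study's processing pipeline:

```python
import numpy as np

def glcm_features(img, levels=4):
    """Gray-level co-occurrence matrix for horizontal neighbor pairs, plus
    the contrast and homogeneity texture features derived from it."""
    glcm = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[a, b] += 1
    glcm /= glcm.sum()  # normalize counts to co-occurrence probabilities
    i, j = np.indices(glcm.shape)
    return {
        "contrast": float((glcm * (i - j) ** 2).sum()),
        "homogeneity": float((glcm / (1.0 + np.abs(i - j))).sum()),
    }

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
feats = glcm_features(img)
```

    Feature vectors of this kind, combined with first-order statistics, are what feed the multi-group classifier.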

  9. Statistical Analysis of speckle noise reduction techniques for echocardiographic Images

    NASA Astrophysics Data System (ADS)

    Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar

    2011-12-01

    Echocardiography is a safe, easy and fast technology for diagnosing cardiac diseases. As in other ultrasound images, these images also contain speckle noise. In some cases this speckle noise is useful, such as in motion detection, but in general noise removal is required for better analysis of the image and proper diagnosis. Different adaptive and anisotropic filters are included in the statistical analysis. Statistical parameters such as signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR) and root mean square error (RMSE) are calculated for performance measurement. Another important aspect is that blurring may occur during speckle noise removal, so it is preferred that the filter also be able to enhance edges while removing noise.
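
    The performance metrics listed above can be computed as follows; treating one image as the noise-free reference, and an 8-bit peak value, are assumptions of this sketch:

```python
import math

def quality_metrics(clean, denoised, peak=255.0):
    """RMSE, SNR and PSNR (in dB) between a reference image and a filtered
    one, with pixels given as flat sequences of equal length."""
    n = len(clean)
    mse = sum((c - d) ** 2 for c, d in zip(clean, denoised)) / n
    rmse = math.sqrt(mse)
    signal_power = sum(c * c for c in clean) / n
    return {
        "rmse": rmse,
        "snr_db": 10.0 * math.log10(signal_power / mse),
        "psnr_db": 10.0 * math.log10(peak ** 2 / mse),
    }
```

    Higher SNR/PSNR and lower RMSE indicate better noise removal, though none of these metrics captures the edge blurring mentioned above.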

  10. Collagen morphology and texture analysis: from statistics to classification.

    PubMed

    Mostaço-Guidolin, Leila B; Ko, Alex C-T; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G

    2013-01-01

    In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerosis arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen re-modeling, such as in skin disorders, different types of fibrosis and muscular-skeletal diseases affecting ligaments and cartilage.

  11. Collagen morphology and texture analysis: from statistics to classification

    NASA Astrophysics Data System (ADS)

    Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.

    2013-07-01

    In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerosis arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen re-modeling, such as in skin disorders, different types of fibrosis and muscular-skeletal diseases affecting ligaments and cartilage.

  12. Live-site UXO classification studies using advanced EMI and statistical models

    NASA Astrophysics Data System (ADS)

    Shamatava, I.; Shubitidze, F.; Fernandez, J. P.; Bijamov, A.; Barrowes, B. E.; O'Neill, K.

    2011-06-01

    In this paper we present the inversion and classification performance of the advanced EMI inversion, processing and discrimination schemes developed by our group when applied to the ESTCP Live-Site UXO Discrimination Study carried out at the former Camp Butner in North Carolina. The advanced models combine: 1) the joint diagonalization (JD) algorithm to estimate the number of potential anomalies from the measured data without inversion, 2) the ortho-normalized volume magnetic source (ONVMS) to represent targets' EMI responses and extract their intrinsic "feature vectors," and 3) the Gaussian mixture algorithm to classify buried objects as targets of interest or not starting from the extracted discrimination features. The studies are conducted using cued datasets collected with the next-generation TEMTADS and MetalMapper (MM) sensor systems. For the cued TEMTADS datasets we first estimate the data quality and the number of targets contributing to each signal using the JD technique. Once we know the number of targets we proceed to invert the data using a standard non-linear optimization technique in order to determine intrinsic parameters such as the total ONVMS for each potential target. Finally we classify the targets using a library-matching technique. The MetalMapper data are all inverted as multi-target scenarios, and the resulting intrinsic parameters are grouped using an unsupervised Gaussian mixture approach. The potential targets of interest are a 37-mm projectile, an M48 fuze, and a 105-mm projectile. During the analysis we requested the ground truth for a few selected anomalies to assist in the classification task. Our results were scored independently by the Institute for Defense Analyses, who revealed that our advanced models produce superb classification when starting from either TEMTADS or MM cued datasets.

  13. Statistical Signal Models and Algorithms for Image Analysis

    DTIC Science & Technology

    1984-10-25

    In this report, two-dimensional stochastic linear models are used in developing algorithms for image analysis such as classification, segmentation, and object detection in images characterized by textured backgrounds. These models generate two-dimensional random processes as outputs to which statistical inference procedures can naturally be applied. A common thread throughout our algorithms is the interpretation of the inference procedures in terms of linear prediction

  14. Statistical Analysis of the Exchange Rate of Bitcoin

    PubMed Central

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate. PMID:26222702
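
    The log-returns that the candidate distributions are fitted to, together with a simple heavy-tail diagnostic, can be computed as below; the excess-kurtosis check is a generic motivation for heavy-tailed fits such as the generalized hyperbolic, not the paper's fitting method:

```python
import math

def log_returns(prices):
    """Log-returns r_t = ln(p_t / p_{t-1}) from a price series."""
    return [math.log(b / a) for a, b in zip(prices[:-1], prices[1:])]

def excess_kurtosis(r):
    """Sample excess kurtosis; values well above 0 (the Gaussian value)
    point toward heavy-tailed parametric families."""
    n = len(r)
    m = sum(r) / n
    m2 = sum((x - m) ** 2 for x in r) / n
    m4 = sum((x - m) ** 4 for x in r) / n
    return m4 / (m2 * m2) - 3.0
```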

  15. Statistical Analysis of the Exchange Rate of Bitcoin.

    PubMed

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.

  16. The statistical analysis of multivariate serological frequency data.

    PubMed

    Reyment, Richard A

    2005-11-01

    Data occurring in the form of frequencies are common in genetics, for example in serology. Examples are provided by the AB0 group, the Rhesus group, and also DNA data. The statistical analysis of tables of frequencies is carried out using the available methods of multivariate analysis, usually with three principal aims: to seek meaningful relationships between the components of a data set, to examine relationships between the populations from which the data have been obtained, and to bring about a reduction in dimensionality. This latter aim is usually realized by means of bivariate scatter diagrams using scores computed from a multivariate analysis. The multivariate statistical analysis of tables of frequencies cannot safely be carried out by standard multivariate procedures because such data represent compositions and are therefore embedded in simplex space, a subspace of full space. Appropriate procedures for simplex space are compared and contrasted with simple standard methods of multivariate analysis ("raw" principal component analysis). The study shows that the differences between a log-ratio model and a simple logarithmic transformation of proportions may not be very great, particularly as regards graphical ordinations, but important discrepancies do occur. The divergencies between logarithmically based analyses and raw data are, however, great. Published data on Rhesus alleles observed for Italian populations are used to exemplify the subject.
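
    The log-ratio approach for compositional frequency data can be sketched with the centred log-ratio (clr) transform; this is one standard simplex-space method, not necessarily the exact model compared in the paper:

```python
import math

def clr(freqs):
    """Centred log-ratio transform of a composition (e.g. allele
    frequencies): log of each part divided by the geometric mean, mapping
    the simplex into a real space where standard multivariate methods apply."""
    logs = [math.log(f) for f in freqs]
    gmean_log = sum(logs) / len(logs)
    return [l - gmean_log for l in logs]
```

    The transformed vector always sums to zero, which is the constraint that distinguishes the log-ratio model from a simple logarithmic transformation of the proportions.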

  17. Defects and statistical degradation analysis of photovoltaic power plants

    NASA Astrophysics Data System (ADS)

    Sundarajan, Prasanna

    As photovoltaic (PV) power plants age in the field, the PV modules degrade and generate visible and invisible defects. A defect and statistical degradation rate analysis of PV power plants is presented in this two-part thesis. The first part of the thesis deals with the defect analysis and the second part deals with the statistical degradation rate analysis. In the first part, a detailed analysis of the performance or financial risk related to each defect found in multiple PV power plants across various climatic regions of the USA is presented by assigning a risk priority number (RPN). The RPN for all the defects in each PV plant is determined based on two databases: a degradation rate database and a defect rate database. In this analysis it is determined that the RPN for each plant is dictated by the technology type (crystalline silicon or thin-film), climate and age. PV modules aged between 3 and 19 years in four different climates (hot-dry, hot-humid, cold-dry and temperate) are investigated in this study. In the second part, a statistical degradation analysis is performed to determine whether the degradation rates are linear for the crystalline silicon technologies in power plants exposed to a hot-dry climate. This linearity analysis is performed using data obtained through two methods: the current-voltage method and the metered kWh method. For the current-voltage method, the annual power degradation data of hundreds of individual modules in six crystalline silicon power plants of different ages are used. For the metered kWh method, a residual plot analysis using Winters' statistical method is performed for two crystalline silicon plants of different ages. The metered kWh data typically consist of signal and noise components. Smoothers remove the noise component from the data by taking the average of the current and the previous observations. Once this is done, a residual plot analysis of the error component is
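
    The smoothing step described above (averaging the current observation with previous ones) can be sketched with simple exponential smoothing; the smoothing constant is an assumption, and Winters' full method adds trend and seasonal terms not shown here:

```python
def exp_smooth(series, alpha=0.3):
    """Simple exponential smoothing: each smoothed value is a weighted
    average of the current observation and the previous smoothed value."""
    smoothed = [series[0]]
    for x in series[1:]:
        smoothed.append(alpha * x + (1.0 - alpha) * smoothed[-1])
    return smoothed

def residuals(series, alpha=0.3):
    """Noise component left after smoothing, the input to residual-plot
    checks of degradation linearity."""
    return [x - s for x, s in zip(series, exp_smooth(series, alpha))]
```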

  18. Statistical analysis of the seasonal variation in demographic data.

    PubMed

    Fellman, J; Eriksson, A W

    2000-10-01

    There has been little agreement as to whether reproduction or similar demographic events occur seasonally and, especially, whether there is any universal seasonal pattern. One reason is that the seasonal pattern may vary in different populations and at different times. Another reason is that different statistical methods have been used. Every statistical model is based on certain assumed conditions and hence is designed to identify specific components of the seasonal pattern. Therefore, the statistical method applied should be chosen with due consideration. In this study we present, develop, and compare different statistical methods for the study of seasonal variation. Furthermore, we stress that the methods are applicable for the analysis of many kinds of demographic data. The first approaches in the literature were based on monthly frequencies, on the simple sine curve, and on the approximation that the months are of equal length. Later, "the population at risk" and the fact that the months have different lengths were considered. Under these later assumptions the targets of the statistical analyses are the rates. In this study we present and generalize the earlier models. Furthermore, we use trigonometric regression methods. The trigonometric regression model in its simplest form corresponds to the sine curve. We compare the regression methods with the earlier models and reanalyze some data. Our results show that models for rates eliminate the disturbing effects of the varying length of the months, including the effect of leap years, and of the seasonal pattern of the population at risk. Therefore, they give the purest analysis of the seasonal pattern of the demographic data in question, e.g., rates of general births, twin maternities, neural tube defects, and mortality. Our main finding is that the trigonometric regression methods are more flexible and easier to handle than the earlier methods, particularly when the data differ from the simple sine curve.
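
    The trigonometric regression model in its simplest form (the sine-curve model in matrix form) can be fitted by ordinary least squares; the synthetic monthly rates below are illustrative:

```python
import numpy as np

def fit_seasonal(rates, period=12):
    """Fit rate_t = a + b*cos(2*pi*t/period) + c*sin(2*pi*t/period)
    by ordinary least squares; returns (a, b, c)."""
    t = np.arange(len(rates))
    w = 2.0 * np.pi * t / period
    X = np.column_stack([np.ones_like(w), np.cos(w), np.sin(w)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(rates, dtype=float), rcond=None)
    return coef

# Synthetic monthly rates with a known seasonal component.
t = np.arange(24)
rates = 10.0 + 2.0 * np.cos(2.0 * np.pi * t / 12.0)
a, b, c = fit_seasonal(rates)
```

    Because the model is fitted to rates rather than raw monthly counts, the varying lengths of months and the size of the population at risk are already accounted for.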

  19. [Advanced data analysis and visualization for clinical laboratory].

    PubMed

    Inada, Masanori; Yoneyama, Akiko

    2011-01-01

    This paper describes visualization techniques that help identify hidden structures in clinical laboratory data. The visualization of data is helpful for a rapid and better understanding of the characteristics of data sets. Various charts help the user identify trends in data. Scatter plots help prevent misinterpretations due to invalid data by identifying outliers. The representation of experimental data in figures is always useful for communicating results to others. Currently, flexible methods such as smoothing methods and latent structure analysis are available owing to advances in hardware and software. Principal component analysis, a well-known technique for reducing multidimensional data sets, can be carried out on a personal computer. These methods could lead to advanced visualization in exploratory data analysis. In this paper, we present 3 examples in order to introduce advanced data analysis. In the first example, a smoothing spline was fitted to a time series from a control chart that is not in a state of statistical control. The trend line was clearly extracted from the daily measurements of the control samples. In the second example, principal component analysis was used to identify a new diagnostic indicator for Graves' disease. The multi-dimensional data obtained from patients were reduced to lower dimensions, and the principal components thus obtained summarized the variation in the data set. In the final example, a latent structure analysis for a Gaussian mixture model was used to draw complex density functions suitable for actual laboratory data. As a result, 5 clusters were extracted. The mixed density function of these clusters represented the data distribution graphically. The methods used in the above examples make the creation of complicated models for clinical laboratories simpler and more flexible.
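    The trend-extraction step in the first example would normally use a smoothing spline (e.g., from SciPy or R). As a dependency-free, hedged stand-in, a centered moving average illustrates the same idea of pulling a trend line out of daily control measurements; the function name is hypothetical:

```python
def moving_average_trend(values, window=7):
    """Centered moving-average smoother: a simple stand-in for the
    smoothing spline used to extract a trend from daily control-sample
    measurements. Window shrinks near the series edges."""
    if window % 2 == 0:
        raise ValueError("window must be odd for a centered average")
    half = window // 2
    trend = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        trend.append(sum(values[lo:hi]) / (hi - lo))
    return trend
```

    A spline would additionally choose its smoothness by cross-validation; the moving average trades that flexibility for transparency.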

  20. Advancing the Science of Spatial Neglect Rehabilitation: An Improved Statistical Approach with Mixed Linear Modeling

    PubMed Central

    Goedert, Kelly M.; Boston, Raymond C.; Barrett, A. M.

    2013-01-01

    Valid research on neglect rehabilitation demands a statistical approach commensurate with the characteristics of neglect rehabilitation data: neglect arises from impairment in distinct brain networks, leading to large between-subject variability in baseline symptoms and recovery trajectories. Studies enrolling medically ill, disabled patients may suffer from missing and unbalanced data and from small sample sizes. Finally, assessment of rehabilitation requires a description of continuous recovery trajectories. Unfortunately, the statistical method currently employed in most studies of neglect treatment [repeated measures analysis of variance (ANOVA), rANOVA] does not address these issues well. Here we review an alternative, mixed linear modeling (MLM), that is more appropriate for assessing change over time. MLM better accounts for between-subject heterogeneity in baseline neglect severity and in recovery trajectory. MLM does not require complete or balanced data, nor does it make strict assumptions regarding the data structure. Furthermore, because MLM better models between-subject heterogeneity, it often results in increased power to observe treatment effects with smaller samples. After reviewing current practices in the field, and the assumptions of rANOVA, we provide an introduction to MLM. We review its assumptions, uses, advantages, and disadvantages. Using real and simulated data, we illustrate how MLM may improve the ability to detect effects of treatment over ANOVA, particularly with the small samples typical of neglect research. Furthermore, our simulation analyses result in recommendations for the design of future rehabilitation studies. Because between-subject heterogeneity is one important reason why studies of neglect treatments often yield conflicting results, employing statistical procedures that model this heterogeneity more accurately will increase the efficiency of our efforts to find treatments to improve the lives of individuals with neglect. PMID
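    In practice MLM is fitted with a dedicated package (e.g., statsmodels' MixedLM or R's lme4). To keep a dependency-free illustration of the core idea, that each subject gets their own trajectory, the sketch below uses a two-stage "summary statistics" approach: fit each subject separately, then average the slopes. This is a simpler cousin of MLM, not the paper's method, and all names are hypothetical:

```python
def subject_slopes(times, scores):
    """Ordinary least-squares slope and intercept for one subject's
    recovery trajectory: slope = cov(t, y) / var(t)."""
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(scores) / n
    cov = sum((t - tbar) * (y - ybar) for t, y in zip(times, scores))
    var = sum((t - tbar) ** 2 for t in times)
    slope = cov / var
    return slope, ybar - slope * tbar

def mean_recovery_rate(subjects):
    """Two-stage estimate: fit each subject separately, then average the
    slopes. Like MLM (and unlike rANOVA), this tolerates subjects with
    different, unbalanced visit schedules."""
    slopes = [subject_slopes(t, y)[0] for t, y in subjects]
    return sum(slopes) / len(slopes)
```

    Full MLM improves on this two-stage version by weighting subjects by the precision of their individual fits and by borrowing strength across subjects.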

  1. Spectral Analysis of B Stars: An Application of Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2012-12-01

    To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
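    Grid-based Bayesian fitting of the kind described reduces, for Gaussian pixel noise and a flat prior, to weighting each grid model by exp(-chi2/2) and normalizing. The toy one-parameter sketch below is illustrative only; it is not the authors' pipeline and does not use real TLUSTY grids:

```python
import math

def grid_posterior(observed, sigma, model_grid):
    """Posterior over a grid of model spectra, assuming independent
    Gaussian pixel noise and a flat prior: P(theta | data) ~ exp(-chi2/2).
    model_grid is a list of (theta, model_fluxes) pairs."""
    logls = []
    for theta, model in model_grid:
        chi2 = sum(((o - m) / sigma) ** 2 for o, m in zip(observed, model))
        logls.append((theta, -0.5 * chi2))
    peak = max(l for _, l in logls)  # subtract the max for numerical stability
    weights = [(t, math.exp(l - peak)) for t, l in logls]
    z = sum(w for _, w in weights)
    return [(t, w / z) for t, w in weights]
```

    Using every available line simply extends the chi-square sum over all fitted pixels, which is what tightens the posterior relative to fitting lines one at a time.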

  2. Coping, Stress, and Job Satisfaction as Predictors of Advanced Placement Statistics Teachers' Intention to Leave the Field

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.

    2010-01-01

    This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…

  3. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroki; Aizawa, Yoji

    2017-02-01

    The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes in different magnitude levels. It is known so far that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients; one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results well reproduce all numerical data in our analysis, where several common features or invariant aspects are clearly observed. Especially in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and in the case of non-stationary (moving time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: actually the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.

  4. "I am Not a Statistic": Identities of African American Males in Advanced Science Courses

    NASA Astrophysics Data System (ADS)

    Johnson, Diane Wynn

    The United States Bureau of Labor Statistics (2010) expects new industries to generate approximately 2.7 million jobs in science and technology by the year 2018, and there is concern as to whether there will be enough trained individuals to fill these positions. A tremendous resource remains untapped: African American students, especially African American males (National Science Foundation, 2009). Historically, African American males have been omitted from the so-called science pipeline. Fewer African American males pursue a science discipline due, in part, to limiting factors they experience in school and at home (Ogbu, 2004). This is a case study of African American males who are enrolled in advanced science courses at a predominantly African American (84%) urban high school. Guided by expectancy-value theory (EVT) of achievement-related results (Eccles, 2009; Eccles et al., 1983), twelve African American male students in two advanced science courses were observed in their science classrooms weekly, participated in an in-depth interview, developed a presentation to share with students enrolled in a tenth grade science course, responded to an open-ended identity questionnaire, and were surveyed about their perceptions of school. Additionally, the students' teachers and seven of the students' parents were interviewed. The interview data analyses highlighted the important role of supportive parents (key socializers) who had high expectations for their sons and who pushed them academically. The students clearly attributed their enrollment in advanced science courses to their high regard for their science teachers, which included positive relationships, hands-on learning in class, and an inviting and encouraging learning environment. Additionally, other family members and coaches played important roles in these young men's lives. Students' PowerPoint(c) presentations to younger high school students on why they should take advanced science courses highlighted these

  5. Identification of fungal phytopathogens using Fourier transform infrared-attenuated total reflection spectroscopy and advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Salman, Ahmad; Lapidot, Itshak; Pomerantz, Ami; Tsror, Leah; Shufan, Elad; Moreh, Raymond; Mordechai, Shaul; Huleihel, Mahmoud

    2012-01-01

    The early diagnosis of phytopathogens is of great importance; it could prevent large economic losses due to crops damaged by fungal diseases, and prevent unnecessary soil fumigation or the use of fungicides and bactericides and thus prevent considerable environmental pollution. In this study, 18 isolates of three different fungal genera were investigated: six isolates of Colletotrichum coccodes, six isolates of Verticillium dahliae and six isolates of Fusarium oxysporum. Our main goal was to differentiate these fungal samples at the level of isolates, based on their infrared absorption spectra obtained using the Fourier transform infrared-attenuated total reflection (FTIR-ATR) sampling technique. Advanced statistical and mathematical methods, namely principal component analysis (PCA), linear discriminant analysis (LDA), and k-means clustering, were applied to the spectra after manipulation. Our results showed significant spectral differences between the various fungal genera examined. The use of k-means enabled classification between the genera with 94.5% accuracy, whereas the use of PCA [3 principal components (PCs)] and LDA achieved a 99.7% success rate. However, at the level of isolates, the best differentiation results were obtained using PCA (9 PCs) and LDA for the lower wavenumber region (800-1775 cm-1), with identification success rates of 87%, 85.5%, and 94.5% for Colletotrichum, Fusarium, and Verticillium strains, respectively.
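    The PCA-then-LDA pipeline described above can be sketched compactly: project the spectra onto the leading principal components, then compute Fisher's discriminant direction on the scores. The example below uses synthetic two-class "spectra" as a stand-in for real FTIR-ATR data; it is illustrative only, and all data and names are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "spectra": two genera differing in two correlated absorbance bands
# (an illustrative stand-in for real FTIR-ATR spectra).
a = rng.normal([1.0, 0.5, 0.0], 0.1, size=(40, 3))
b = rng.normal([0.6, 0.9, 0.0], 0.1, size=(40, 3))
X = np.vstack([a, b])
y = np.array([0] * 40 + [1] * 40)

# PCA: project onto the leading eigenvectors of the covariance matrix.
Xc = X - X.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(Xc.T))
pcs = Xc @ evecs[:, ::-1][:, :2]  # two leading components

# Fisher LDA on the PCA scores: w = Sw^(-1) (m1 - m0).
m0, m1 = pcs[y == 0].mean(axis=0), pcs[y == 1].mean(axis=0)
Sw = np.cov(pcs[y == 0].T) + np.cov(pcs[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
scores = pcs @ w
threshold = 0.5 * (scores[y == 0].mean() + scores[y == 1].mean())
pred = (scores > threshold).astype(int)
accuracy = (pred == y).mean()
```

    Reducing to a handful of PCs before LDA, as the paper does with 3 and 9 components, stabilizes the within-class scatter matrix when there are far more wavenumbers than samples.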

  6. The Effects of Statistical Analysis Software and Calculators on Statistics Achievement

    ERIC Educational Resources Information Center

    Christmann, Edwin P.

    2009-01-01

    This study compared the effects of microcomputer-based statistical software and hand-held calculators on the statistics achievement of university males and females. The subjects, 73 graduate students enrolled in univariate statistics classes at a public comprehensive university, were randomly assigned to groups that used either microcomputer-based…

  7. Turbo recognition: a statistical approach to layout analysis

    NASA Astrophysics Data System (ADS)

    Tokuyasu, Taku A.; Chou, Philip A.

    2000-12-01

    Turbo recognition (TR) is a communication theory approach to the analysis of rectangular layouts, in the spirit of Document Image Decoding. The TR algorithm, inspired by turbo decoding, is based on a generative model of image production, in which two grammars are used simultaneously to describe structure in orthogonal (horizontal and vertical) directions. This enables TR to strictly embody non-local constraints that cannot be taken into account by local statistical methods. This basis in finite-state grammars also allows TR to be quickly retargeted to new domains. We illustrate some of the capabilities of TR with two examples involving realistic images. While TR, like turbo decoding, is not guaranteed to recover the statistically optimal solution, we present an experiment that demonstrates its ability to produce optimal or near-optimal results on a simple yet nontrivial example, the recovery of a filled rectangle in the midst of noise. Unlike methods such as stochastic context-free grammars and exhaustive search, which are often intractable beyond small images, turbo recognition scales linearly with image size, suggesting TR as an efficient yet near-optimal approach to statistical layout analysis.

  8. Agriculture, population growth, and statistical analysis of the radiocarbon record.

    PubMed

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L

    2016-01-26

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.
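    The long-term rate quoted above (0.04% per year) is, in essence, the slope of log population proxy against calendar time. A hedged sketch of that log-linear fit (hypothetical names; the actual analysis of summed radiocarbon-date probabilities involves calibration and taphonomic corrections not shown here):

```python
import math

def annual_growth_rate(years, counts):
    """Log-linear least-squares fit of N(t) = N0 * exp(r * t): the slope of
    ln(N) against calendar year estimates the long-term annual growth
    rate r of the population proxy."""
    logs = [math.log(c) for c in counts]
    n = len(years)
    tbar = sum(years) / n
    lbar = sum(logs) / n
    return sum((t - tbar) * (l - lbar) for t, l in zip(years, logs)) \
        / sum((t - tbar) ** 2 for t in years)
```

    A fitted r of 0.0004 corresponds to the paper's 0.04% annual growth.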

  9. Agriculture, population growth, and statistical analysis of the radiocarbon record

    PubMed Central

    Zahid, H. Jabran; Robinson, Erick; Kelly, Robert L.

    2016-01-01

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide. PMID:26699457

  10. Introduction to statistical methods for microRNA analysis.

    PubMed

    Zararsiz, Gökmen; Coşgun, Erdal

    2014-01-01

    MicroRNA profiling is an important task for investigating miRNA functions, and recent technologies such as microarray, single nucleotide polymorphism (SNP), quantitative real-time PCR (qPCR), and next-generation sequencing (NGS) have played a major role in miRNA analysis. In this chapter, we give an overview of statistical approaches for gene expression, SNP, qPCR, and NGS data, including preliminary analyses (pre-processing, differential expression, classification, clustering, exploration of interactions, and the use of ontologies). Our goal is to outline the key approaches with a brief discussion of problems and avenues for their solutions, and to give some examples of real-world use. Readers will be able to understand the different data formats (expression levels, sequences, etc.) and will be able to choose appropriate methods for their own research and applications. We also give brief notes on the most popular tools/packages for statistical genetic analysis. This chapter aims to serve as a brief introduction to different kinds of statistical methods and also provides an extensive source of references.
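    Differential-expression analyses of the kind surveyed here test thousands of miRNAs at once, so multiple-testing control is routine. The standard Benjamini-Hochberg step-up procedure is short enough to sketch directly (an illustrative implementation, not tied to any specific package mentioned in the chapter):

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Benjamini-Hochberg step-up procedure controlling the false discovery
    rate: reject the hypotheses with the k smallest p-values, where k is
    the largest index with p_(k) <= (k/m) * q. Returns one reject flag
    per input p-value, in the original order."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * q:
            k = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k:
            reject[i] = True
    return reject
```

    Controlling the FDR rather than the family-wise error rate is the usual compromise in expression studies, where some false positives are tolerable in exchange for power.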

  11. Self-Contained Statistical Analysis of Gene Sets

    PubMed Central

    Cannon, Judy L.; Ricoy, Ulises M.; Johnson, Christopher

    2016-01-01

    Microarrays are a powerful tool for studying differential gene expression. However, lists of many differentially expressed genes are often generated, and unraveling meaningful biological processes from the lists can be challenging. For this reason, investigators have sought to quantify the statistical probability of compiled gene sets rather than individual genes. The gene sets typically are organized around a biological theme or pathway. We compute correlations between different gene set tests and elect to use Fisher’s self-contained method for gene set analysis. We improve Fisher’s differential expression analysis of a gene set by limiting the p-value of an individual gene within the gene set to prevent a small percentage of genes from determining the statistical significance of the entire set. In addition, we also compute dependencies among genes within the set to determine which genes are statistically linked. The method is applied to T-ALL (T-lineage Acute Lymphoblastic Leukemia) to identify differentially expressed gene sets between T-ALL and normal patients and T-ALL and AML (Acute Myeloid Leukemia) patients. PMID:27711232
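    Fisher's self-contained method with the p-value limit described above can be sketched in a few lines. For an even number of degrees of freedom the chi-square survival function has an exact finite sum, so no statistics library is needed; the floor value below is an arbitrary illustration, not the threshold used in the paper:

```python
import math

def chi2_sf_even(x, df):
    """Exact chi-square survival function for even df:
    sf(x) = exp(-x/2) * sum_{k < df/2} (x/2)^k / k!"""
    h = x / 2.0
    term, total = 1.0, 1.0
    for k in range(1, df // 2):
        term *= h / k
        total += term
    return math.exp(-h) * total

def fisher_truncated(pvalues, floor=1e-4):
    """Fisher's combined probability test with each gene's p-value floored,
    so that no single extreme gene can dominate the significance of the
    whole gene set. The statistic -2 * sum(ln p_i) is chi-square with
    2n degrees of freedom under the null."""
    stat = -2.0 * sum(math.log(max(p, floor)) for p in pvalues)
    return chi2_sf_even(stat, 2 * len(pvalues))
```

    With a single input p-value and no flooring, the combined test reduces to that p-value itself, a convenient sanity check.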

  12. The NIRS Analysis Package: noise reduction and statistical inference.

    PubMed

    Fekete, Tomer; Rubin, Denis; Carlson, Joshua M; Mujica-Parodi, Lilianne R

    2011-01-01

    Near infrared spectroscopy (NIRS) is a non-invasive optical imaging technique that can be used to measure cortical hemodynamic responses to specific stimuli or tasks. While analyses of NIRS data are normally adapted from established fMRI techniques, there are nevertheless substantial differences between the two modalities. Here, we investigate the impact of NIRS-specific noise (e.g., systemic physiological fluctuations, motion-related artifacts, and serial autocorrelations) upon the validity of statistical inference within the framework of the general linear model. We present a comprehensive framework for noise reduction and statistical inference, which is custom-tailored to the noise characteristics of NIRS. These methods have been implemented in a public domain Matlab toolbox, the NIRS Analysis Package (NAP). Finally, we validate NAP using both simulated and actual data, showing marked improvement in the detection power and reliability of NIRS.

  13. A Statistical Analysis of Lunisolar-Earthquake Connections

    NASA Astrophysics Data System (ADS)

    Rüegg, Christian Michael-André

    2012-11-01

    Despite over a century of study, the relationship between lunar cycles and earthquakes remains controversial and difficult to quantitatively investigate. Perhaps as a consequence, major earthquakes around the globe are frequently followed by "prediction claims" based on lunar cycles that generate media furore and pressure scientists to provide resolute answers. The 2010-2011 Canterbury earthquakes in New Zealand were no exception; significant media attention was given to lunar-derived earthquake predictions by non-scientists, even though the predictions were merely "opinions" and were not based on any statistically robust temporal or causal relationships. This thesis provides a framework for studying lunisolar earthquake temporal relationships by developing replicable statistical methodology based on peer-reviewed literature. Notable in the methodology is a high-accuracy ephemeris, called ECLPSE, designed specifically by the author for use on earthquake catalogs, and a model for performing phase angle analysis.
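    Phase angle analysis of the kind mentioned commonly begins with a test of whether event phases (e.g., earthquake times mapped to lunar phase angles) are uniformly distributed. The Rayleigh test is the standard starting point; this sketch is a generic illustration and is not taken from the thesis:

```python
import math

def rayleigh_test(angles):
    """Rayleigh test for uniformity of phase angles (in radians). Returns
    the mean resultant length R and the large-sample approximate p-value
    exp(-n * R^2) for the null hypothesis that phases are uniform on the
    circle. Small p suggests phase clustering."""
    n = len(angles)
    c = sum(math.cos(a) for a in angles) / n
    s = sum(math.sin(a) for a in angles) / n
    r = math.hypot(c, s)
    return r, math.exp(-n * r * r)
```

    A statistically robust lunisolar claim would need a small Rayleigh p-value that survives corrections for catalog incompleteness and aftershock clustering.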

  14. Spatial statistical analysis of tree deaths using airborne digital imagery

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Mei; Baddeley, Adrian; Wallace, Jeremy; Canci, Michael

    2013-04-01

    High resolution digital airborne imagery offers unprecedented opportunities for observation and monitoring of vegetation, providing the potential to identify, locate and track individual vegetation objects over time. Analytical tools are required to quantify relevant information. In this paper, locations of trees over a large area of native woodland vegetation were identified using morphological image analysis techniques. Methods of spatial point process statistics were then applied to estimate the spatially-varying tree death risk, and to show that it is significantly non-uniform. [Tree deaths over the area were detected in our previous work (Wallace et al., 2008).] The study area is a major source of ground water for the city of Perth, and the work was motivated by the need to understand and quantify vegetation changes in the context of water extraction and drying climate. The influence of hydrological variables on tree death risk was investigated using spatial statistics (graphical exploratory methods, spatial point pattern modelling and diagnostics).

  15. Statistical analysis of effective singular values in matrix rank determination

    NASA Technical Reports Server (NTRS)

    Konstantinides, Konstantinos; Yao, Kung

    1988-01-01

    A major problem in using SVD (singular-value decomposition) as a tool in determining the effective rank of a perturbed matrix is that of distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theories of perturbations of singular values and statistical significance testing. Threshold bounds for perturbation due to finite-precision and i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. The results are applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix. Various numerical examples illustrating the usefulness of these bounds and comparisons to other previously known approaches are given.
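    The core idea, counting only those singular values that rise above a noise-dependent threshold, can be sketched with a simplified bound. The threshold below is an assumed Marchenko-Pastur-type bound with an ad hoc safety factor, not the paper's derived confidence regions:

```python
import numpy as np

def effective_rank(A, noise_std):
    """Count singular values exceeding a noise threshold. Here we use an
    assumed bound of Marchenko-Pastur type with a 10% safety margin:
    tau = 1.1 * noise_std * (sqrt(m) + sqrt(n)). The paper's confidence
    regions refine this with an explicit significance level."""
    m, n = A.shape
    tau = 1.1 * noise_std * (np.sqrt(m) + np.sqrt(n))
    s = np.linalg.svd(A, compute_uv=False)
    return int(np.sum(s > tau))

rng = np.random.default_rng(0)
U = rng.normal(size=(50, 2))
V = rng.normal(size=(2, 40))
clean = U @ V                                      # rank-2 signal
noisy = clean + rng.normal(scale=0.05, size=clean.shape)
```

    Although the noisy matrix is numerically full rank, the thresholded count recovers the underlying rank of 2, which is exactly the distinction the paper's confidence regions formalize.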

  16. Statistical analysis of subjective preferences for video enhancement

    NASA Astrophysics Data System (ADS)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
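    The paper advocates binary logistic regression, but classical Thurstone Case V scaling itself is compact enough to sketch with only the standard library; the function name is hypothetical and the proportions below are invented:

```python
from statistics import NormalDist

def thurstone_scale(p):
    """Thurstone Case V scaling of pairwise preference proportions.
    p[i][j] is the fraction of comparisons in which item i was preferred
    to item j (with p[i][i] = 0.5). Each item's scale value is the mean
    z-score of its win proportions; the scale has an arbitrary origin,
    so only differences between values are meaningful."""
    z = NormalDist().inv_cdf
    n = len(p)
    return [sum(z(p[i][j]) for j in range(n)) / n for i in range(n)]
```

    As the abstract notes, this yields an ordering on a perceptual scale but no significance statement; logistic regression supplies the standard errors that Thurstone scaling lacks.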

  17. Statistical Analysis of Human Blood Cytometries: Potential Donors and Patients

    NASA Astrophysics Data System (ADS)

    Bernal-Alvarado, J.; Segovia-Olvera, P.; Mancilla-Escobar, B. E.; Palomares, P.

    2004-09-01

    The histograms of the cell volume from human blood present valuable information for clinical evaluation. Measurements can be performed with automatic equipment and a graphical presentation of the data is available; nevertheless, a statistical and mathematical analysis of the cell volume distribution could also be useful for medical interpretation, inasmuch as the numerical parameters characterizing the histograms might be correlated with healthy and patient populations. In this work, a statistical exercise was performed in order to find the most suitable model fitting the cell volume histograms. Several trial functions were tested and their parameters were tabulated. Healthy people exhibited an average cell volume of 85 femtoliters while patients had 95 femtoliters. White blood cells presented small variation, and platelets preserved their average for both populations.

  18. Statistical analysis of the particulation of shaped charge jets

    SciTech Connect

    Minich, R. W.; Baker, E. L.; Schwartz, A. J.

    1999-08-12

    A statistical analysis of shaped charge jet break-up was carried out in order to investigate the role of nonlinear instabilities leading to the particulation of the jet. Statistical methods generally used for studying fluctuations in nonlinear dynamical systems are applied to experimentally measured velocities of the individual particles. In particular, we present results suggesting non-Gaussian behavior in the interparticle velocity correlations, characteristic of nonlinear dynamical systems. Results are presented for two silver shaped charge jets that differ primarily in their material processing. We provide evidence that the particulation of a jet is not random, but has its origin in a deterministic dynamical process involving the nonlinear coupling of two oscillators, analogous to the underlying dynamics observed in Rayleigh-Bénard convection and modeled in the return map of Curry and Yorke.

  19. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    SciTech Connect

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
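    The UCL95% described here is the standard one-sided Student-t upper confidence limit on a mean: UCL = mean + t * s / sqrt(n). A hedged sketch (illustrative, not the report's calculation; the data below are invented, and the hard-coded t value applies only to n = 6):

```python
import math

def ucl95(results, t_crit=2.015):
    """One-sided upper 95% confidence limit on a mean concentration:
    UCL95 = mean + t * s / sqrt(n). The default t_crit = 2.015 is the
    one-sided 95th-percentile Student t value for 5 degrees of freedom
    (n = 6 samples); substitute the appropriate value for other n."""
    n = len(results)
    mean = sum(results) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in results) / (n - 1))
    return mean + t_crit * s / math.sqrt(n)
```

    Pooling all six scrape samples, as the report does, matters precisely because both a larger n and a smaller t critical value tighten this limit.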

  20. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    SciTech Connect

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL{sub 95%}) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL{sub 95%}) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL{sub 95%} was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  1. Managing Performance Analysis with Dynamic Statistical Projection Pursuit

    SciTech Connect

    Vetter, J.S.; Reed, D.A.

    2000-05-22

    Computer systems and applications are growing more complex. Consequently, performance analysis has become more difficult due to the complex, transient interrelationships among runtime components. To diagnose these types of performance issues, developers must use detailed instrumentation to capture a large number of performance metrics. Unfortunately, this instrumentation may actually influence the performance analysis, leading the developer to an ambiguous conclusion. In this paper, we introduce a technique for focusing a performance analysis on interesting performance metrics. This technique, called dynamic statistical projection pursuit, identifies interesting performance metrics that the monitoring system should capture across some number of processors. By reducing the number of performance metrics, projection pursuit can limit the impact of instrumentation on the performance of the target system and can reduce the volume of performance data.

  2. Statistical analysis of static shape control in space structures

    NASA Technical Reports Server (NTRS)

    Burdisso, Ricardo A.; Haftka, Raphael T.

    1990-01-01

    The article addresses the problem of efficient analysis of the statistics of initial and corrected shape distortions in space structures. Two approaches for improving efficiency are considered. One is an adjoint technique for calculating distortion shapes; the second is a modal expansion of distortion shapes in terms of pseudo-vibration modes. The two techniques are applied to the problem of optimizing actuator locations on a 55 m radiometer antenna. The adjoint analysis technique is used with a discrete-variable optimization method. The modal approximation technique is coupled with a standard conjugate-gradient continuous optimization method. The agreement between the two sets of results is good, validating both the approximate analysis and the optimality of the results.

  3. Forensic discrimination of dyed hair color: II. Multivariate statistical analysis.

    PubMed

    Barrett, Julie A; Siegel, Jay A; Goodpaster, John V

    2011-01-01

    This research is intended to assess the ability of UV-visible microspectrophotometry to successfully discriminate the color of dyed hair. Fifty-five red hair dyes were analyzed and evaluated using multivariate statistical techniques including agglomerative hierarchical clustering (AHC), principal component analysis (PCA), and discriminant analysis (DA). The spectra were grouped into three classes, which were visually consistent with different shades of red. A two-dimensional PCA observations plot was constructed, describing 78.6% of the overall variance. The wavelength regions associated with the absorbance of hair and dye were highly correlated. Principal components were selected to represent 95% of the overall variance for analysis with DA. A classification accuracy of 89% was observed for the comprehensive dye set, while external validation using 20 of the dyes resulted in a prediction accuracy of 75%. Significant color loss from successive washing of hair samples was estimated to occur within 3 weeks of dye application.
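
    The PCA step reported here (a two-dimensional plot describing 78.6% of the overall variance) amounts to computing the variance explained by the leading principal components. A sketch with plain NumPy, using synthetic stand-in "spectra" rather than the measured dye data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "spectra": 55 dyes x 200 wavelengths, driven by two latent
# color components plus measurement noise (stand-ins for real data).
scores = rng.normal(size=(55, 2))
loadings = rng.normal(size=(2, 200))
spectra = scores @ loadings + 0.1 * rng.normal(size=(55, 200))

X = spectra - spectra.mean(axis=0)            # mean-center each wavelength
_, s, _ = np.linalg.svd(X, full_matrices=False)
explained = s**2 / (s**2).sum()               # variance explained per PC
print(explained[:2].sum())                    # share captured by a 2-D plot
```

    With two strong latent components, the first two PCs capture most of the variance, mirroring how the study selected enough components to represent 95% of the variance before discriminant analysis.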

  4. Data and statistical methods for analysis of trends and patterns

    SciTech Connect

    Atwood, C.L.; Gentillon, C.D.; Wilson, G.E.

    1992-11-01

    This report summarizes topics considered at a working meeting on data and statistical methods for analysis of trends and patterns in US commercial nuclear power plants. This meeting was sponsored by the Office of Analysis and Evaluation of Operational Data (AEOD) of the Nuclear Regulatory Commission (NRC). Three data sets are briefly described: Nuclear Plant Reliability Data System (NPRDS), Licensee Event Report (LER) data, and Performance Indicator data. Two types of study are emphasized: screening studies, to see if any trends or patterns appear to be present; and detailed studies, which are more concerned with checking the analysis assumptions, modeling any patterns that are present, and searching for causes. A prescription is given for a screening study, and ideas are suggested for a detailed study, when the data take any of three forms: counts of events per time, counts of events per demand, and non-event data.
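
    For the first data form, counts of events per time, a screening study can start with a chi-square goodness-of-fit test of a constant event rate (one common screening device, not necessarily the report's exact prescription). The yearly counts below are hypothetical:

```python
from scipy import stats

# Hypothetical yearly event counts at one plant (illustrative only):
counts = [4, 6, 3, 8, 10, 12]

# Under a constant-rate (homogeneous Poisson) hypothesis every year has
# the same expected count; the chi-square goodness-of-fit test screens
# for departures such as a trend.
chi2, p = stats.chisquare(counts)
print(chi2, p)   # a small p would flag the series for a detailed study
```

    A series that fails the screen would then move to a detailed study, where assumptions are checked and any pattern is modeled explicitly.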

  5. GIS-Based Spatial Statistical Analysis of College Graduates' Employment

    NASA Astrophysics Data System (ADS)

    Tang, R.

    2012-07-01

    Awareness of the distribution and employment status of college graduates is urgently needed for the proper allocation of human resources and the overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis to characterize the distribution and employment status of college graduates, based on 2004-2008 data from the Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and stepwise multiple linear regression via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the number of enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that a graduate's field of specialization has an important impact on the number employed and on the number of graduates engaged in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.

  6. Statistical energy analysis of complex structures, phase 2

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1980-01-01

    A method for estimating the structural vibration properties of complex systems in high frequency environments was investigated. The structure analyzed was the Materials Experiment Assembly (MEA), which is a portion of the OST-2A payload for the space transportation system. Statistical energy analysis (SEA) techniques were used to model the structure and predict the structural element response to acoustic excitation. A comparison of the initial response predictions and measured acoustic test data is presented. The conclusions indicate that SEA predicted the response of the primary structure to acoustic excitation over a wide range of frequencies, and that the contribution of mechanically induced random vibration to the total MEA response is not significant.

  7. Feature statistic analysis of ultrasound images of liver cancer

    NASA Astrophysics Data System (ADS)

    Huang, Shuqin; Ding, Mingyue; Zhang, Songgeng

    2007-12-01

    In this paper, a feature analysis of liver ultrasound images, covering normal liver, liver cancer (especially hepatocellular carcinoma, HCC) and other hepatopathies, is discussed. According to its classification, primary hepatocellular carcinoma is divided into four types. Fifteen features derived from first-order gray-level statistics, the gray-level co-occurrence matrix (GLCM), and the gray-level run-length matrix (GLRLM) are extracted. Experiments for the discrimination of each type of HCC, normal liver, fatty liver, angioma and hepatic abscess have been conducted, and features that can potentially discriminate among them are identified.
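
    One of the GLCM texture features used in this kind of discrimination, contrast, can be computed directly in NumPy. A minimal sketch with two synthetic test images (not ultrasound data):

```python
import numpy as np

def glcm_contrast(img, levels=8):
    """Contrast feature from a gray-level co-occurrence matrix built for
    horizontal neighbors at distance 1 (a minimal sketch)."""
    # Quantize the image to a small number of gray levels:
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1                        # count neighbor-level pairs
    glcm /= glcm.sum()                         # normalize to probabilities
    idx = np.arange(levels)
    return float(((idx[:, None] - idx[None, :]) ** 2 * glcm).sum())

rng = np.random.default_rng(4)
smooth = np.tile(np.linspace(0, 255, 64), (64, 1))   # smooth gradient
noisy = rng.integers(0, 256, size=(64, 64))          # speckle-like texture
print(glcm_contrast(smooth), glcm_contrast(noisy))   # low vs. high contrast
```

    Smooth tissue-like regions concentrate the co-occurrence mass near the diagonal (low contrast), while speckled regions spread it out (high contrast), which is what makes such features usable for tissue discrimination.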

  8. Statistical Analysis of Strength Data for an Aerospace Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Neergaard, Lynn; Malone, Tina; Gentz, Steven J. (Technical Monitor)

    2000-01-01

    Aerospace vehicles are produced in limited quantities that do not always allow development of MIL-HDBK-5 A-basis design allowables. One method of examining production and composition variations is to perform 100% lot acceptance testing for aerospace Aluminum (Al) alloys. This paper discusses statistical trends seen in strength data for one Al alloy. A four-step approach reduced the data to residuals, visualized residuals as a function of time, grouped data with quantified scatter, and conducted analysis of variance (ANOVA).

  9. Statistical Analysis of Strength Data for an Aerospace Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Neergaard, L.; Malone, T.

    2001-01-01

    Aerospace vehicles are produced in limited quantities that do not always allow development of MIL-HDBK-5 A-basis design allowables. One method of examining production and composition variations is to perform 100% lot acceptance testing for aerospace Aluminum (Al) alloys. This paper discusses statistical trends seen in strength data for one Al alloy. A four-step approach reduced the data to residuals, visualized residuals as a function of time, grouped data with quantified scatter, and conducted analysis of variance (ANOVA).
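
    The final step of the four-step approach, analysis of variance across grouped data, can be sketched with SciPy's one-way ANOVA. The lot residuals below are simulated placeholders, not the alloy data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical strength residuals (ksi) for three production lots; the
# third lot is simulated with a shifted mean.
lot_a = rng.normal(0.0, 1.0, 20)
lot_b = rng.normal(0.0, 1.0, 20)
lot_c = rng.normal(1.5, 1.0, 20)

f_stat, p = stats.f_oneway(lot_a, lot_b, lot_c)
print(f_stat, p)   # a small p indicates lot-to-lot mean differences
```

    Working with residuals (strength minus the fitted trend) before the ANOVA, as the paper's approach does, prevents a time trend from masquerading as a lot effect.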

  10. Statistical shape analysis for face movement manifold modeling

    NASA Astrophysics Data System (ADS)

    Wang, Xiaokan; Mao, Xia; Caleanu, Catalin-Daniel; Ishizuka, Mitsuru

    2012-03-01

    Inter-frame information for analyzing the human face movement manifold is modeled using statistical shape theory. Using the principles of Riemannian geometry, we map a sequence of face shapes to a unified tangent space and obtain a curve corresponding to the face movement. The experimental results show that the face movement sequence forms a trajectory in a complex tangent space. Furthermore, the extent and type of facial expression can be depicted as the range and direction of the curve. This represents a novel approach to face movement classification using shape-based analysis.

  11. Multi-scale statistical analysis of coronal solar activity

    DOE PAGES

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-08

    Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.

  12. Statistical analysis of personal radiofrequency electromagnetic field measurements with nondetects.

    PubMed

    Röösli, Martin; Frei, Patrizia; Mohler, Evelyn; Braun-Fahrländer, Charlotte; Bürgi, Alfred; Fröhlich, Jürg; Neubauer, Georg; Theis, Gaston; Egger, Matthias

    2008-09-01

    Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
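
    A minimal regression-on-order-statistics sketch (one detection limit, lognormal assumption; the exposure values are invented) shows how nondetects can be imputed from the fitted distribution rather than substituted with the detection limit:

```python
import numpy as np
from scipy import stats

def robust_ros_mean(detects, n_nondetects):
    """Minimal regression-on-order-statistics sketch: one detection limit
    and a lognormal assumption. Detected values are kept as observed;
    nondetects are imputed from a regression of log(detects) on normal
    scores, and all values are then averaged ("robust" ROS)."""
    detects = np.sort(np.asarray(detects, dtype=float))
    n = detects.size + n_nondetects
    pp = np.arange(1, n + 1) / (n + 1.0)       # Weibull plotting positions
    z = stats.norm.ppf(pp)                     # normal scores
    # Fit log(value) ~ z using the detected (upper) order statistics:
    slope, intercept = np.polyfit(z[n_nondetects:], np.log(detects), 1)
    imputed = np.exp(intercept + slope * z[:n_nondetects])
    return float(np.concatenate([imputed, detects]).mean())

vals = [0.8, 1.1, 1.5, 2.0, 3.2]               # detected exposures (invented)
m = robust_ros_mean(vals, n_nondetects=5)      # five values below the limit
print(m)
```

    Because the imputed values fall below the detected ones, the ROS mean sits below the naive detected-only mean, illustrating how substitution at the detection limit overstates the contribution of minor sources.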

  13. A statistical analysis of eruptive activity on Mount Etna, Sicily

    NASA Astrophysics Data System (ADS)

    Smethurst, Lucy; James, Mike R.; Pinkerton, Harry; Tawn, Jonathan A.

    2009-10-01

    A rigorous analysis of the timing and location of flank eruptions of Mount Etna on Sicily is important for the creation of hazard maps of the densely populated area surrounding the volcano. In this paper, we analyse the temporal, volumetric and spatial data on eruptive activity on Etna. Our analyses are based on the two most recent and robust historical data catalogues of flank eruption activity on Etna, with one from 1669 to 2008 and the other from 1610 to 2008. We use standard statistical methodology and modelling techniques, though a number of features are new to the analysis of eruption data. Our temporal analysis reveals that flank eruptions on Mount Etna between 1610 and 2008 follow an inhomogeneous Poisson process, with intensity of eruptions increasing nearly linearly since the mid-1900s. Our temporal analysis reveals no evidence of cyclicity over this period. An analysis of volumetric lava flow rates shows a marked increase in activity since 1971. This increase, which coincides with the formation of the Southeast Crater (SEC), appears to be related to increased activity on and around the SEC. This has significant implications for hazard analysis on Etna.
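
    One standard check for an intensity that increases over time, as found here since the mid-1900s, is the Laplace trend test for a point process (shown for illustration; not necessarily the authors' exact method). The eruption times below are hypothetical:

```python
import numpy as np
from scipy import stats

def laplace_trend_test(event_times, T):
    """Laplace test for trend in a point process observed on [0, T]:
    a large positive U suggests an increasing event intensity."""
    t = np.asarray(event_times, dtype=float)
    n = t.size
    u = (t.mean() - T / 2.0) / (T / np.sqrt(12.0 * n))
    p = stats.norm.sf(u)           # one-sided p-value for an increasing rate
    return u, p

# Hypothetical eruption times (years into a 100-year window), clustered late:
times = [55, 62, 70, 74, 81, 85, 90, 93, 97, 99]
u, p = laplace_trend_test(times, T=100)
print(u, p)
```

    Under a homogeneous Poisson process the event times are uniform on [0, T], so their mean is near T/2; times clustered late in the record push U positive and reject constancy in favor of an increasing rate.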

  14. Detection of bearing damage by statistic vibration analysis

    NASA Astrophysics Data System (ADS)

    Sikora, E. A.

    2016-04-01

    The condition of bearings, which are essential components in mechanisms, is crucial to safety. Analysis of the bearing vibration signal, which is always contaminated by certain types of noise, is an important basis for diagnosing the mechanical condition of the bearing and detecting mechanical failure. In this paper, a method of rolling-bearing fault detection by statistical analysis of vibration is proposed to filter out the Gaussian noise contained in a raw vibration signal. The results of experiments show that the vibration signal can be significantly enhanced by application of the proposed method. In addition, the proposed method is used to analyse real acoustic signals of a bearing with inner-race and outer-race faults, respectively. The values of the attributes are determined according to the degree of the fault. The results confirm that the periods between the transients, which represent bearing fault characteristics, can be successfully detected.

  15. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system for particle transport through matter, widely used in several experimental areas, from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  16. Vibroacoustic optimization using a statistical energy analysis model

    NASA Astrophysics Data System (ADS)

    Culla, Antonio; D'Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia

    2016-08-01

    In this paper, an optimization technique for medium-high frequency dynamic problems based on the Statistical Energy Analysis (SEA) method is presented. In a SEA model, the subsystem energies are controlled by internal loss factors (ILFs) and coupling loss factors (CLFs), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to the CLFs is performed to select the CLFs that most affect the subsystem energies. Since the injected power depends not only on the external loads but also on the physical parameters of the subsystems, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLFs, injected power and physical parameters are derived. The approach is applied to a typical aeronautical structure: the cabin of a helicopter.

  17. Statistical analysis of cascading failures in power grids

    SciTech Connect

    Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin

    2010-12-01

    We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for the automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load-shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which trigger cascading failures of loads, generators and lines. Our model is quasi-static, with a causal, discrete-time, sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between the average numbers of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.

  18. Image analysis and statistical inference in neuroimaging with R.

    PubMed

    Tabelow, K; Clayden, J D; de Micheaux, P Lafaye; Polzehl, J; Schmid, V J; Whitcher, B

    2011-04-15

    R is a language and environment for statistical computing and graphics. It can be considered an alternative implementation of the S language developed in the 1970s and 1980s for data analysis and graphics (Becker and Chambers, 1984; Becker et al., 1988). The R language is part of the GNU project and offers versions that compile and run on almost every major operating system currently available. We highlight several R packages built specifically for the analysis of neuroimaging data in the context of functional MRI, diffusion tensor imaging, and dynamic contrast-enhanced MRI. We review their methodology and give an overview of their capabilities for neuroimaging. In addition we summarize some of the current activities in the area of neuroimaging software development in R.

  19. Processes and subdivisions in diogenites, a multivariate statistical analysis

    NASA Technical Reports Server (NTRS)

    Harriott, T. A.; Hewins, R. H.

    1984-01-01

    Multivariate statistical techniques used on diogenite orthopyroxene analyses show the relationships that occur within diogenites and the two orthopyroxenite components (class I and II) in the polymict diogenite Garland. Cluster analysis shows that only Peckelsheim is similar to Garland class I (Fe-rich) and the other diogenites resemble Garland class II. The unique diogenite Y 75032 may be related to type I by fractionation. Factor analysis confirms the subdivision and shows that Fe does not correlate with the weakly incompatible elements across the entire pyroxene composition range, indicating that igneous fractionation is not the process controlling total diogenite composition variation. The occurrence of two groups of diogenites is interpreted as the result of sampling or mixing of two main sequences of orthopyroxene cumulates with slightly different compositions.

  20. Statistical analysis of the 70 meter antenna surface distortions

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.; Chuang, K. L.

    1987-01-01

    Statistical analysis of surface distortions of the 70 meter NASA/JPL antenna, located at Goldstone, was performed. The purpose of this analysis is to verify whether deviations due to gravity loading can be treated as quasi-random variables with normal distribution. Histograms of the RF pathlength error distribution for several antenna elevation positions were generated. The results indicate that the deviations from the ideal antenna surface are not normally distributed. The observed density distribution for all antenna elevation angles is taller and narrower than the normal density, which results in large positive values of kurtosis and a significant amount of skewness. The skewness of the distribution changes from positive to negative as the antenna elevation changes from zenith to horizon.
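
    The two departures from normality reported here, positive kurtosis (a taller, narrower density) and skewness, are straightforward to quantify with SciPy. The "pathlength errors" below are simulated to have exactly that character; they are not the antenna data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated "pathlength errors": a narrow core plus a one-sided tail,
# giving a taller-than-normal, skewed distribution.
errors = np.concatenate([rng.normal(0.0, 0.2, 900),
                         rng.exponential(1.0, 100)])

print(stats.skew(errors))       # positive: right-skewed
print(stats.kurtosis(errors))   # positive excess kurtosis: tall, heavy-tailed
```

    For a normal distribution both statistics are near zero, so large values like these are the numerical counterpart of the histogram shapes described in the study.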

  1. Statistical analysis of magnetically soft particles in magnetorheological elastomers

    NASA Astrophysics Data System (ADS)

    Gundermann, T.; Cremer, P.; Löwen, H.; Menzel, A. M.; Odenbach, S.

    2017-04-01

    The physical properties of magnetorheological elastomers (MRE) are a complex issue and can be influenced and controlled in many ways, e.g. by applying a magnetic field, by external mechanical stimuli, or by an electric potential. In general, the response of MRE materials to these stimuli depends crucially on the distribution of the magnetic particles inside the elastomer. Specific knowledge of the interactions between particles or particle clusters is highly relevant for understanding the macroscopic rheological properties and provides an important input for theoretical calculations. In order to gain better insight into the correlation between macroscopic effects and microstructure, and to generate a database for theoretical analysis, x-ray micro-computed tomography (X-μCT) investigations were carried out as a basis for a statistical analysis of the particle configurations. Different MREs with iron-powder contents of 2–15 wt% (0.27–2.3 vol%) and different arrangements of the particles inside the matrix were prepared. The X-μCT results were processed with image-analysis software to extract the geometrical properties of the particles with and without the influence of an external magnetic field. Pair correlation functions for the positions of the particles inside the elastomer were calculated to statistically characterize the distributions of the particles in the samples.
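
    A pair correlation (radial distribution) function of the sort computed from the X-μCT particle positions can be sketched as follows, using random points in a periodic box; for an uncorrelated (Poisson) sample g(r) is close to 1 at all separations, while particle chains or clusters would produce peaks:

```python
import numpy as np

def pair_correlation(points, box, r_edges):
    """Radial distribution function g(r) for points in a periodic cubic
    box (a minimal sketch; real tomography data needs edge handling)."""
    points = np.asarray(points, float)
    n = points.shape[0]
    diff = points[:, None, :] - points[None, :, :]
    diff -= box * np.round(diff / box)          # minimum-image convention
    d = np.sqrt((diff ** 2).sum(-1))
    d = d[np.triu_indices(n, k=1)]              # unique pairs only
    counts, _ = np.histogram(d, bins=r_edges)
    r_lo, r_hi = r_edges[:-1], r_edges[1:]
    shell_vol = 4.0 / 3.0 * np.pi * (r_hi ** 3 - r_lo ** 3)
    density = n / box ** 3
    ideal = 0.5 * n * density * shell_vol       # pairs expected for ideal gas
    return counts / ideal

rng = np.random.default_rng(2)
pts = rng.uniform(0.0, 10.0, size=(500, 3))     # uncorrelated particle positions
g = pair_correlation(pts, box=10.0, r_edges=np.linspace(0.5, 4.0, 8))
print(g)
```

    Deviations of g(r) from 1, measured before and after applying a magnetic field, are what quantify field-induced restructuring of the particle microstructure.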

  2. Statistical analysis of a dynamic model for dietary contaminant exposure.

    PubMed

    Bertail, P; Clémençon, S; Tressou, J

    2010-03-01

    This paper is devoted to the statistical analysis of a stochastic model introduced in [P. Bertail, S. Clémençon, and J. Tressou, A storage model with random release rate for modelling exposure to food contaminants, Math. Biosci. Eng. 35 (1) (2008), pp. 35-60] for describing the phenomenon of exposure to a certain food contaminant. In this modelling, the temporal evolution of the contamination exposure is entirely determined by the accumulation phenomenon due to successive dietary intakes and the pharmacokinetics governing the elimination process between intakes, in such a way that the exposure dynamic through time is described as a piecewise deterministic Markov process. Paths of the contamination exposure process are scarcely observable in practice, therefore intensive computer simulation methods are crucial for estimating the time-dependent or steady-state features of the process. Here we consider simulation estimators based on consumption and contamination data and investigate how to construct accurate bootstrap confidence intervals (CI) for certain quantities of considerable importance from the epidemiology viewpoint. Special attention is also paid to the problem of computing the probability of certain rare events related to the exposure process path arising in dietary risk analysis using multilevel splitting or importance sampling (IS) techniques. Applications of these statistical methods to a collection of data sets related to dietary methyl mercury contamination are discussed thoroughly.

  3. Design and contents of an advanced distance-based statistics course for a PhD in nursing program.

    PubMed

    Azuero, Andres; Wilbanks, Bryan; Pryor, Erica

    2013-01-01

    Doctoral nursing students and researchers are expected to understand, critique, and conduct research that uses advanced quantitative methodology. The authors describe the design and contents of a distance-based course in multivariate statistics for PhD students in nursing and health administration, compare the design to recommendations found in the literature for distance-based statistics education, and compare the course contents to a tabulation of the methodologies used in a sample of recently published quantitative dissertations in nursing. The authors conclude with a discussion based on these comparisons as well as with experiences in course implementation and directions for future course development.

  4. Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996. Statistics in Brief.

    ERIC Educational Resources Information Center

    Heaviside, Sheila; And Others

    The "Survey of Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996" collected information from 911 regular United States public elementary and secondary schools regarding the availability and use of advanced telecommunications, and in particular, access to the Internet, plans to obtain Internet access, use of…

  5. Advanced Technology Lifecycle Analysis System (ATLAS)

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Mankins, John C.

    2004-01-01

    Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and on parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and the supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts in system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprising missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric cost model to determine the costs. The integrator also estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. ATLAS in no way competes with a CEE; instead, it complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions.
When the analyst is

  6. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for the detection of differentially expressed proteins (DEPs). Current data analysis platforms typically rely on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study showing that the peptide-level analysis of EBprot provides better receiver operating characteristics and more accurate estimation of the false discovery rate than methods based on protein-level ratios. We also demonstrate the superior classification performance of peptide-level EBprot analysis on a spike-in dataset. To illustrate the wide applicability of EBprot to different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time-course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/).

  7. Statistically advanced, self-similar, radial probability density functions of atmospheric and under-expanded hydrogen jets

    NASA Astrophysics Data System (ADS)

    Ruggles, Adam J.

    2015-11-01

    This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the literature-established second order) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitates a degree of statistical convergence typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments, demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thus estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing. 
Comparisons between global moments from the data and moments calculated using the proposed model show excellent

  8. Freedom from the Tyranny of the Campus Main-Frame: Handling the Statistical Analysis of a 10-year Survey Research Study with a Personal Computer.

    ERIC Educational Resources Information Center

    Hickman, Linda J.

    Technological advances in microcomputer hardware and software, including size of memory and increasingly more sophisticated statistical application packages, create a new era in educational research. The alternative to costly main-frame computer data processing and statistical analysis is explored in this paper. In the first section, typical…

  9. Statistical analysis of dynamic sequences for functional imaging

    NASA Astrophysics Data System (ADS)

    Kao, Chien-Min; Chen, Chin-Tu; Wernick, Miles N.

    2000-04-01

    Factor analysis of medical image sequences (FAMIS), which concerns the simultaneous identification of homogeneous regions (factor images) and of the characteristic temporal variations (factors) inside these regions from a temporal sequence of images by statistical analysis, is one of the major challenges in medical imaging. In this research, we contribute to this important area by proposing a two-step approach. First, we study the use of the noise-adjusted principal component (NAPC) analysis developed by Lee et al. for identifying the characteristic temporal variations in dynamic scans acquired by PET and MRI. NAPC allows us to effectively reject data noise and substantially reduce data dimension based on signal-to-noise ratio considerations. Subsequently, a simple spatial analysis based on the criteria of minimum spatial overlapping and non-negativity of the factor images is applied for extraction of the factors and factor images. In our simulation study, preliminary results indicate that the proposed approach can accurately identify the factor images. However, the factors are not completely separated.
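
    A minimal numpy sketch of the noise-adjusted idea, with entirely synthetic data (the factor curves, image size, and noise level below are invented for illustration): whiten the sequence by the assumed noise level, then count eigenvalues rising above the unit noise floor.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical dynamic sequence: 500 pixels x 40 time frames,
# two temporal factors plus white noise of known (estimated) level.
t = np.linspace(0, 1, 40)
factors = np.stack([np.exp(-3 * t), 1 - np.exp(-3 * t)])   # (2, 40)
weights = rng.random((500, 2))                              # factor images
noise_sigma = 0.05
data = weights @ factors + noise_sigma * rng.standard_normal((500, 40))

# Noise-adjusted PCA: whiten by the noise covariance (here sigma^2 * I),
# so pure-noise eigenvalues cluster near 1; keep components well above that.
whitened = data / noise_sigma
cov = np.cov(whitened, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)
n_signal = int(np.sum(eigvals > 2.0))  # threshold above the noise floor (~1)
print(n_signal)  # expect 2 significant components
```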

  10. Advanced techniques in current signature analysis

    NASA Astrophysics Data System (ADS)

    Smith, S. F.; Castleberry, K. N.

    1992-02-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (greater than 1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable 'smart' CSA instrumentation in the next several years.

  11. Statistical Scalability Analysis of Communication Operations in Distributed Applications

    SciTech Connect

    Vetter, J S; McCracken, M O

    2001-02-27

    Current trends in high performance computing suggest that users will soon have widespread access to clusters of multiprocessors with hundreds, if not thousands, of processors. This unprecedented degree of parallelism will undoubtedly expose scalability limitations in existing applications, where scalability is the ability of a parallel algorithm on a parallel architecture to effectively utilize an increasing number of processors. Users will need precise and automated techniques for detecting the cause of limited scalability. This paper addresses this dilemma. First, we argue that users face numerous challenges in understanding application scalability: managing substantial amounts of experiment data, extracting useful trends from this data, and reconciling performance information with their application's design. Second, we propose a solution to automate this data analysis problem by applying fundamental statistical techniques to scalability experiment data. Finally, we evaluate our operational prototype on several applications, and show that statistical techniques offer an effective strategy for assessing application scalability. In particular, we find that non-parametric correlation of the number of tasks to the ratio of the time for individual communication operations to overall communication time provides a reliable measure for identifying communication operations that scale poorly.
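
    The correlation measure described in the last sentence can be sketched with scipy; the operation names and timing fractions below are hypothetical, not from the paper's experiments.

```python
from scipy.stats import spearmanr

# Hypothetical scalability experiment: number of tasks vs. fraction of
# total communication time spent in each operation.
tasks = [16, 32, 64, 128, 256, 512]
allreduce_frac = [0.08, 0.11, 0.16, 0.24, 0.35, 0.49]  # grows with scale
barrier_frac = [0.10, 0.09, 0.11, 0.10, 0.09, 0.10]    # roughly flat

# Non-parametric (Spearman rank) correlation of task count vs. time ratio.
rho_bad, _ = spearmanr(tasks, allreduce_frac)
rho_ok, _ = spearmanr(tasks, barrier_frac)
print(rho_bad)  # 1.0: the fraction grows monotonically -> scales poorly
```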

  12. Statistical methods for the analysis of climate extremes

    NASA Astrophysics Data System (ADS)

    Naveau, Philippe; Nogaj, Marta; Ammann, Caspar; Yiou, Pascal; Cooley, Daniel; Jomelli, Vincent

    2005-08-01

    Currently there is increasing research activity in the area of climate extremes because they represent a key manifestation of non-linear systems and have an enormous impact on economic and social human activities. Our understanding of the mean behavior of climate and its 'normal' variability has improved significantly during the last decades. In comparison, climate extreme events have been hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws than averages. In this context, the motivation for this paper is twofold. Firstly, we recall the basic principles of Extreme Value Theory, which is used on a regular basis in finance and hydrology but does not yet enjoy the same success in climate studies. More precisely, the theoretical distributions of maxima and large peaks are recalled. The parameters of such distributions are estimated with the maximum likelihood estimation procedure, which offers the flexibility to take into account explanatory variables in our analysis. Secondly, we detail three case studies to show that this theory can provide a solid statistical foundation, especially when assessing the uncertainty associated with extreme events in a wide range of applications linked to the study of our climate. To cite this article: P. Naveau et al., C. R. Geoscience 337 (2005).
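
    As a toy illustration of the maximum-likelihood step (synthetic data, not one of the paper's case studies): fit a generalized extreme value (GEV) distribution to simulated block maxima and compute a return level. Note that scipy's `genextreme` parameterizes the shape as c = -ξ relative to the common convention.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical block maxima (e.g. annual maximum temperatures), drawn from
# a known GEV so the fit can be checked: shape c=-0.1, loc=30, scale=2.
annual_max = stats.genextreme.rvs(c=-0.1, loc=30.0, scale=2.0,
                                  size=200, random_state=rng)

# Maximum-likelihood fit of the GEV distribution to the maxima.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_max)

# 100-year return level: the quantile exceeded with probability 1/100.
return_level_100 = stats.genextreme.ppf(1 - 1/100, c_hat, loc_hat, scale_hat)
print(round(loc_hat, 1), round(scale_hat, 1))
```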

  13. Log-Normality and Multifractal Analysis of Flame Surface Statistics

    NASA Astrophysics Data System (ADS)

    Saha, Abhishek; Chaudhuri, Swetaprovo; Law, Chung K.

    2013-11-01

    The turbulent flame surface is typically highly wrinkled and folded at a multitude of scales controlled by various flame properties. It is useful if the information contained in this complex geometry can be projected onto a simpler regular geometry for the use of spectral, wavelet or multifractal analyses. Here we investigate local flame surface statistics of a turbulent flame expanding under constant pressure. First the statistics of the local length ratio are experimentally obtained from high-speed Mie scattering images. For a spherically expanding flame, the length ratio on the measurement plane, at predefined equiangular sectors, is defined as the ratio of the actual flame length to the length of a circular arc of radius equal to the average radius of the flame. Assuming isotropic distribution of such flame segments we convolute suitable forms of the length-ratio probability distribution functions (pdfs) to arrive at corresponding area-ratio pdfs. Both pdfs are found to be near log-normally distributed and show self-similar behavior with increasing radius. Near log-normality and the rather intermittent behavior of the flame-length ratio suggest similarity with dissipation-rate quantities, which motivates multifractal analysis. Currently at Indian Institute of Science, India.
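
    A hedged sketch of the log-normality check, with invented length-ratio samples standing in for the Mie-scattering measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical flame length-ratio samples; the wrinkled surface is modelled
# here as log-normal with median 1.2 and log-standard-deviation 0.25.
length_ratio = rng.lognormal(mean=np.log(1.2), sigma=0.25, size=5000)

# Fit a two-parameter log-normal (location fixed at 0) and read back
# the median (scale) and log-standard-deviation (shape).
shape, loc, scale = stats.lognorm.fit(length_ratio, floc=0.0)
print(round(scale, 2), round(shape, 2))  # ~1.2 and ~0.25
```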

  14. Constraining cosmology with shear peak statistics: tomographic analysis

    NASA Astrophysics Data System (ADS)

    Martinet, Nicolas; Bartlett, James G.; Kiessling, Alina; Sartoris, Barbara

    2015-09-01

    The abundance of peaks in weak gravitational lensing maps is a potentially powerful cosmological tool, complementary to measurements of the shear power spectrum. We study peaks detected directly in shear maps, rather than convergence maps, an approach that has the advantage of working directly with the observable quantity, the galaxy ellipticity catalog. Using large numbers of numerical simulations to accurately predict the abundance of peaks and their covariance, we quantify the cosmological constraints attainable by a large-area survey similar to that expected from the Euclid mission, focusing on the density parameter, Ωm, and on the power spectrum normalization, σ8, for illustration. We present a tomographic peak counting method that improves the conditional (marginal) constraints by a factor of 1.2 (2) over those from a two-dimensional (i.e., non-tomographic) peak-count analysis. We find that peak statistics provide constraints an order of magnitude less accurate than those from the cluster sample in the ideal situation of a perfectly known observable-mass relation; however, when the scaling relation is not known a priori, the shear-peak constraints are twice as strong and orthogonal to the cluster constraints, highlighting the value of using both clusters and shear-peak statistics.

  15. Statistical analysis of cascaded PLC-based PMD compensator

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Wang, Lei; Wu, Xingkun

    2005-01-01

    The planar lightwave circuit (PLC) on silicon substrate offers a promising on-chip integrated solution to polarization-mode dispersion (PMD) compensation for long-haul high-speed communications. A novel cascaded PLC-based PMD compensator is proposed in this paper and a detailed statistical analysis of PMD generated by cascaded PLC circuits is presented. Using Gisin and Pellaux's approach, the distributions of first-order PMD produced by various multiple-stage PLC circuits were obtained by Monte Carlo simulation with respect to the phase shift introduced by heating elements in the circuits. The generated PMD was compared with a standard Maxwell distribution and with that of a 12-stage nonlinear-crystal-based PMD compensator. It was found that a 3-stage cascaded PLC circuit yields a performance close to that of the crystal-based PMD compensator, with a significant reduction in package size and enhanced stability.

  16. A Computer Program for Statistically-Based Decision Analysis

    PubMed Central

    Polaschek, Jeanette X.; Lenert, Leslie A.; Garber, Alan M.

    1990-01-01

    The majority of patients with coronary artery disease do not fall into the well-defined populations from randomized clinical trials. Observational databases contain a rich source of information that could be used by practicing physicians to evaluate treatment alternatives for their patients. We describe a computer system, the CABG Kibitzer, which uses an integrated approach to evaluate the treatment alternatives for CAD patients. We combine a statistical multivariate model for calculating survival advantages with decision analysis (DA) techniques for assessing patient preferences and sensitivity analysis, to create one tool that physicians find easy to use in daily clinical practice. The development of tools of this kind is a necessary step in making the data of outcome studies accessible to practicing physicians.

  17. Higher order statistical moment application for solar PV potential analysis

    NASA Astrophysics Data System (ADS)

    Basri, Mohd Juhari Mat; Abdullah, Samizee; Azrulhisham, Engku Ahmad; Harun, Khairulezuan

    2016-10-01

    Solar photovoltaic energy could serve as an alternative to fossil fuels, which are depleting and contribute to global warming. However, this renewable energy source is highly variable and intermittent. Knowledge of the energy potential of a site is therefore very important before building a solar photovoltaic power generation system. Here, the application of a higher-order statistical moment model is analyzed using data collected from a 5 MW grid-connected photovoltaic system. Given the dynamic changes of skewness and kurtosis of the AC power and solar irradiance distributions of the solar farm, the Pearson system, in which a probability distribution is selected by matching its theoretical moments to the empirical moments of the data, is well suited for this purpose. Taking advantage of the Pearson system implementation in MATLAB, software has been developed to assist in data processing, distribution fitting, and potential analysis for future projection of AC power and solar irradiance availability.
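
    A minimal sketch of the moment-matching idea. The data here are synthetic gamma-distributed "AC power" samples, not the farm's measurements; the first four moments computed below are exactly what the Pearson system (e.g. MATLAB's `pearsrnd`) matches when selecting a distribution family.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical AC-power samples from a PV farm (kW), skewed by cloud cover;
# gamma(k=3, theta=800) has mean 2400 and skewness 2/sqrt(3) ~ 1.15.
power = rng.gamma(shape=3.0, scale=800.0, size=10000)

# The first four moments drive the choice of Pearson family.
mean = power.mean()
var = power.var(ddof=1)
skew = stats.skew(power)
kurt = stats.kurtosis(power, fisher=False)  # Pearson (non-excess) kurtosis
print(round(mean), round(skew, 2))
```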

  18. Statistical uncertainty analysis of radon transport in nonisothermal, unsaturated soils

    SciTech Connect

    Holford, D.J.; Owczarski, P.C.; Gee, G.W.; Freeman, H.D.

    1990-10-01

    To accurately predict radon fluxes from soils to the atmosphere, we must know more than the radium content of the soil. Radon flux from soil is affected not only by soil properties, but also by meteorological factors such as air pressure and temperature changes at the soil surface, as well as the infiltration of rainwater. Natural variations in meteorological factors and soil properties contribute to uncertainty in subsurface model predictions of radon flux, which, when coupled with a building transport model, will also add uncertainty to predictions of radon concentrations in homes. A statistical uncertainty analysis using our Rn3D finite-element numerical model was conducted to assess the relative importance of the meteorological factors and soil properties affecting radon transport. 10 refs., 10 figs., 3 tabs.

  19. Spectral signature verification using statistical analysis and text mining

    NASA Astrophysics Data System (ADS)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representative of many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum is validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum is qualitatively analyzed using lexical analysis text mining. This technique analyzes the syntax of the meta-data to reveal local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security (text encryption/decryption), biomedical, and marketing applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high and low quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need of an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
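
    The SAM comparison reduces to the angle between two spectra treated as vectors; a self-contained sketch with made-up spectra (the reflectance values are arbitrary):

```python
import numpy as np

def spectral_angle(test, reference):
    """Spectral angle mapper (SAM): angle in radians between two spectra."""
    cos = np.dot(test, reference) / (
        np.linalg.norm(test) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Hypothetical spectra: the mean of an ideal population, a scaled copy
# (SAM is insensitive to overall illumination), and a distorted one.
mean_spectrum = np.array([0.2, 0.4, 0.6, 0.5, 0.3])
scaled = 2.0 * mean_spectrum
distorted = np.array([0.6, 0.1, 0.2, 0.5, 0.4])

print(spectral_angle(scaled, mean_spectrum))     # ~0: same shape
print(spectral_angle(distorted, mean_spectrum))  # noticeably larger angle
```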

  20. A statistical analysis of the daily streamflow hydrograph

    NASA Astrophysics Data System (ADS)

    Kavvas, M. L.; Delleur, J. W.

    1984-03-01

    In this study a periodic statistical analysis of daily streamflow data in Indiana, U.S.A., was performed to gain some new insight into the stochastic structure which describes the daily streamflow process. This analysis was performed by the periodic mean and covariance functions of the daily streamflows, by the time- and peak-discharge-dependent recession limb of the daily streamflow hydrograph, by the time- and discharge-exceedance-level (DEL)-dependent probability distribution of the hydrograph peak interarrival time, and by the time-dependent probability distribution of the time to peak discharge. Some new statistical estimators were developed and used in this study. In general features, this study has shown that: (a) the persistence properties of daily flows depend on the storage state of the basin at the specified time origin of the flow process; (b) the daily streamflow process is time irreversible; (c) the probability distribution of the daily hydrograph peak interarrival time depends both on the occurrence time of the peak from which the interarrival time originates and on the discharge exceedance level; and (d) if the daily streamflow process is modeled as the release from a linear watershed storage, this release should depend on the state of the storage and on the time of the release, as the persistence properties and the recession limb decay rates were observed to change with the state of the watershed storage and time. Therefore, a time-varying reservoir system needs to be considered if the daily streamflow process is to be modeled as the release from a linear watershed storage.

  1. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency can cause flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  2. Statistical analysis and modelling of small satellite reliability

    NASA Astrophysics Data System (ADS)

    Guo, Jian; Monas, Liora; Gill, Eberhard

    2014-05-01

    This paper attempts to characterize failure behaviour of small satellites through statistical analysis of actual in-orbit failures. A unique Small Satellite Anomalies Database comprising empirical failure data of 222 small satellites has been developed. A nonparametric analysis of the failure data has been implemented by means of a Kaplan-Meier estimation. An innovative modelling method, i.e. Bayesian theory in combination with Markov Chain Monte Carlo (MCMC) simulations, has been proposed to model the reliability of small satellites. An extensive parametric analysis using the Bayesian/MCMC method has been performed to fit a Weibull distribution to the data. The influence of several characteristics such as the design lifetime, mass, launch year, mission type and the type of satellite developers on the reliability has been analyzed. The results clearly show the infant mortality of small satellites. Compared with the classical maximum-likelihood estimation methods, the proposed Bayesian/MCMC method results in better fitting Weibull models and is especially suitable for reliability modelling where only very limited failures are observed.
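
    A compact sketch of the Kaplan-Meier (product-limit) estimator with invented lifetimes (the real database has 222 satellites); ties between failures and censorings at the same time are not handled specially, for brevity.

```python
import numpy as np

# Hypothetical small-satellite lifetimes (years); censored entries (0) are
# satellites still operating at the end of the observation window.
times    = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])
observed = np.array([1,   1,   0,   1,   0,   1,   0,   0])  # 1 = failure

def kaplan_meier(times, observed):
    """Product-limit estimate S(t) at each failure time."""
    order = np.argsort(times)
    times, observed = times[order], observed[order]
    n_at_risk = len(times)
    surv, points = 1.0, []
    for t, d in zip(times, observed):
        if d:  # failure: multiply survival by (1 - failures / at-risk)
            surv *= 1.0 - 1.0 / n_at_risk
            points.append((t, surv))
        n_at_risk -= 1  # failed or censored: no longer at risk
    return points

for t, s in kaplan_meier(times, observed):
    print(t, round(s, 3))
```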

  3. Helioseismology of pre-emerging active regions. III. Statistical analysis

    SciTech Connect

    Barnes, G.; Leka, K. D.; Braun, D. C.; Birch, A. C.

    2014-05-01

    The subsurface properties of active regions (ARs) prior to their appearance at the solar surface may shed light on the process of AR formation. Helioseismic holography has been applied to samples taken from two populations of regions on the Sun (pre-emergence and without emergence), each sample having over 100 members, that were selected to minimize systematic bias, as described in Paper I. Paper II showed that there are statistically significant signatures in the average helioseismic properties that precede the formation of an AR. This paper describes a more detailed analysis of the samples of pre-emergence regions and regions without emergence based on discriminant analysis. The property that is best able to distinguish the populations is found to be the surface magnetic field, even a day before the emergence time. However, after accounting for the correlations between the surface field and the quantities derived from helioseismology, there is still evidence of a helioseismic precursor to AR emergence that is present for at least a day prior to emergence, although the analysis presented cannot definitively determine the subsurface properties prior to emergence due to the small sample sizes.
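
    The discriminant-analysis step can be illustrated with a two-feature Fisher linear discriminant on synthetic populations; the feature values below are invented stand-ins, not the helioseismic or magnetic measurements.

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical features for two populations: pre-emergence regions vs.
# regions without emergence (two features per region).
pre  = rng.multivariate_normal([1.0, 0.5], [[0.2, 0.05], [0.05, 0.2]], 150)
none = rng.multivariate_normal([0.2, 0.1], [[0.2, 0.05], [0.05, 0.2]], 150)

# Fisher linear discriminant: w = Sw^-1 (mu1 - mu0), threshold at midpoint.
mu1, mu0 = pre.mean(0), none.mean(0)
Sw = np.cov(pre, rowvar=False) + np.cov(none, rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = w @ (mu1 + mu0) / 2

# In-sample classification accuracy of the discriminant.
acc = (np.sum(pre @ w > threshold) + np.sum(none @ w <= threshold)) / 300
print(round(acc, 2))
```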

  4. Advanced Coal Wind Hybrid: Economic Analysis

    SciTech Connect

    Phadke, Amol; Goldman, Charles; Larson, Doug; Carr, Tom; Rath, Larry; Balash, Peter; Yih-Huei, Wan

    2008-11-28

    Growing concern over climate change is prompting new thinking about the technologies used to generate electricity. In the future, it is possible that new government policies on greenhouse gas emissions may favor electric generation technology options that release zero or low levels of carbon emissions. The Western U.S. has abundant wind and coal resources. In a world with carbon constraints, the future of coal for new electrical generation is likely to depend on the development and successful application of new clean coal technologies with near zero carbon emissions. This scoping study explores the economic and technical feasibility of combining wind farms with advanced coal generation facilities and operating them as a single generation complex in the Western US. The key questions examined are whether an advanced coal-wind hybrid (ACWH) facility provides sufficient advantages through improvements to the utilization of transmission lines and the capability to firm up variable wind generation for delivery to load centers to compete effectively with other supply-side alternatives in terms of project economics and emissions footprint. The study was conducted by an Analysis Team that consists of staff from the Lawrence Berkeley National Laboratory (LBNL), National Energy Technology Laboratory (NETL), National Renewable Energy Laboratory (NREL), and Western Interstate Energy Board (WIEB). We conducted a screening level analysis of the economic competitiveness and technical feasibility of ACWH generation options located in Wyoming that would supply electricity to load centers in California, Arizona or Nevada. Figure ES-1 is a simple stylized representation of the configuration of the ACWH options. The ACWH consists of a 3,000 MW coal gasification combined cycle power plant equipped with carbon capture and sequestration (G+CC+CCS plant), a fuel production or syngas storage facility, and a 1,500 MW wind plant. The ACWH project is connected to load centers by a 3,000 MW

  5. Classification of Malaysia aromatic rice using multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.

    2015-05-01

    Aromatic rice (Oryza sativa L.) is considered the best quality premium rice. These varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice due to the special growth conditions it requires, for instance specific climate and soil. Presently, aromatic rice quality is identified by using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, human sensory panels have significant drawbacks: lengthy training time, proneness to fatigue as the number of samples increases, and inconsistency. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis, and is quite costly. This paper presents the application of an in-house developed Electronic Nose (e-nose) to classify new aromatic rice varieties. The e-nose is used to classify the variety of aromatic rice based on the samples' odour. The samples were taken from several rice varieties. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify the unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to perform recognition and classification of the unspecified samples. Visual observation of the PCA and LDA plots shows that the instrument was able to separate the samples into different clusters accordingly. The results of LDA and KNN, with low misclassification error, support the above findings and we may conclude that the e-nose is successfully applied to the classification of the aromatic rice varieties.
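
    A minimal numpy sketch of the KNN step with leave-one-out validation; the sensor responses below are invented stand-ins for e-nose readings, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical e-nose readings: 3 rice varieties x 20 samples x 4 sensors,
# each variety clustered around its own sensor-response centre.
centres = np.array([[1.0, 0.2, 0.5, 0.9],
                    [0.3, 1.1, 0.4, 0.2],
                    [0.6, 0.5, 1.2, 0.1]])
X = np.vstack([c + 0.08 * rng.standard_normal((20, 4)) for c in centres])
y = np.repeat([0, 1, 2], 20)

def knn_loo_error(X, y, k=3):
    """Leave-one-out misclassification rate for a k-NN classifier."""
    errors = 0
    for i in range(len(X)):
        dists = np.linalg.norm(X - X[i], axis=1)
        dists[i] = np.inf                      # leave the sample out
        nearest = y[np.argsort(dists)[:k]]
        pred = np.bincount(nearest).argmax()   # majority vote
        errors += pred != y[i]
    return errors / len(X)

print(knn_loo_error(X, y))  # well-separated clusters -> near 0
```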

  7. Statistical analysis of nonlinear dynamical systems using differential geometric sampling methods.

    PubMed

    Calderhead, Ben; Girolami, Mark

    2011-12-06

    Mechanistic models based on systems of nonlinear differential equations can help provide a quantitative understanding of complex physical or biological phenomena. The use of such models to describe nonlinear interactions in molecular biology has a long history; however, it is only recently that advances in computing have allowed these models to be set within a statistical framework, further increasing their usefulness and binding modelling and experimental approaches more tightly together. A probabilistic approach to modelling allows us to quantify uncertainty in both the model parameters and the model predictions, as well as in the model hypotheses themselves. In this paper, the Bayesian approach to statistical inference is adopted and we examine the significant challenges that arise when performing inference over nonlinear ordinary differential equation models describing cell signalling pathways and enzymatic circadian control; in particular, we address the difficulties arising owing to strong nonlinear correlation structures, high dimensionality and non-identifiability of parameters. We demonstrate how recently introduced differential geometric Markov chain Monte Carlo methodology alleviates many of these issues by making proposals based on local sensitivity information, which ultimately allows us to perform effective statistical analysis. Along the way, we highlight the deep link between the sensitivity analysis of such dynamic system models and the underlying Riemannian geometry of the induced posterior probability distributions.
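
    To see why correlation structure matters for sampling, here is a plain random-walk Metropolis sketch on a strongly correlated synthetic "posterior" (a stand-in for the ODE-parameter posteriors; the geometric methods in the paper adapt the proposal to exactly this kind of local structure, which a fixed isotropic proposal handles inefficiently).

```python
import numpy as np

rng = np.random.default_rng(7)
# Strongly correlated 2-D Gaussian target, playing the role of a posterior
# over two non-identifiable-ish model parameters.
cov = np.array([[1.0, 0.95], [0.95, 1.0]])
prec = np.linalg.inv(cov)
log_post = lambda x: -0.5 * x @ prec @ x

# Random-walk Metropolis with a fixed isotropic proposal.
x = np.zeros(2)
samples = []
for _ in range(20000):
    prop = x + 0.5 * rng.standard_normal(2)
    if np.log(rng.random()) < log_post(prop) - log_post(x):
        x = prop  # accept
    samples.append(x)
samples = np.array(samples)

corr = np.corrcoef(samples.T)[0, 1]
print(round(corr, 2))  # recovers the strong parameter correlation
```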

  8. Microcomputers: Statistical Analysis Software. Evaluation Guide Number 5.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    This guide discusses six sets of features to examine when purchasing a microcomputer-based statistics program: hardware requirements; data management; data processing; statistical procedures; printing; and documentation. While the current statistical packages have several negative features, they are cost saving and convenient for small to moderate…

  9. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1].
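
    The positive-semidefiniteness requirement can be checked directly; the sketch below builds a Gaussian (RBF) kernel over made-up genotype-like data (the subject count, marker counts, and bandwidth are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical genotype-like data: 10 subjects x 50 markers (counts 0/1/2).
G = rng.integers(0, 3, size=(10, 50)).astype(float)

def rbf_kernel(X, gamma=0.01):
    """Gaussian (RBF) similarity: K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

K = rbf_kernel(G)
# A valid kernel must yield a positive semidefinite matrix over all pairs.
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() >= -1e-8)  # True: K is (numerically) PSD
```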

  10. Statistical Analysis of Tank 5 Floor Sample Results

    SciTech Connect

    Shine, E. P.

    2013-01-31

Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements
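The report's headline statistic can be sketched directly: a one-sided 95% upper confidence limit (UCL95) on a mean from n replicate measurements via the classical t-interval (the EPA guidance also covers non-normal alternatives). The triplicate concentrations below are hypothetical, not values from the report.

```python
import math
import statistics

def ucl95(values, t_crit):
    """t-based one-sided 95% upper confidence limit on the mean."""
    n = len(values)
    mean = statistics.mean(values)
    sd = statistics.stdev(values)          # sample standard deviation
    return mean + t_crit * sd / math.sqrt(n)

# Hypothetical triplicate analyte concentrations; t_{0.95, df=2} = 2.920.
conc = [10.2, 9.8, 10.6]
upper = ucl95(conc, 2.920)
```

The same computation is repeated per analyte; analytes with measurements below their MDCs need the censored-data methods the report groups separately.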

  11. STATISTICAL ANALYSIS OF TANK 5 FLOOR SAMPLE RESULTS

    SciTech Connect

    Shine, E.

    2012-03-14

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, radionuclide, inorganic, and anion concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their

  12. Statistical Analysis Of Tank 5 Floor Sample Results

    SciTech Connect

    Shine, E. P.

    2012-08-01

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements

  13. A statistical design for testing apomictic diversification through linkage analysis.

    PubMed

    Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling

    2014-03-01

The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remain elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from two crosses. The EM algorithm is implemented to estimate the rate of apomixis and test its difference between two plant populations or species as the parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utility and usefulness of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis.

  14. Autotasked Performance in the NAS Workload: A Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)

    1998-01-01

A statistical analysis of the workload performance of a production quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure that was designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. UNICOS 6 releases show consistent wall clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request less, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristics when applied to the data: higher overhead jobs requesting 8 CPUs are penalized when compared to moderate overhead jobs requesting 4 CPUs, thereby providing a charging incentive to NAS users to use autotasking in a manner that provides them with significantly improved turnaround while also maintaining system throughput.

  15. Statistical analysis of plasma thermograms measured by differential scanning calorimetry.

    PubMed

    Fish, Daniel J; Brewood, Greg P; Kim, Jong Sung; Garbett, Nichola C; Chaires, Jonathan B; Benight, Albert S

    2010-11-01

Melting curves of human plasma measured by differential scanning calorimetry (DSC), known as thermograms, have the potential to markedly impact diagnosis of human diseases. A general statistical methodology is developed to analyze and classify DSC thermograms. Analysis of an acquired thermogram involves comparison with a database of empirical reference thermograms from clinically characterized diseases. Two parameters, a distance metric, P, and correlation coefficient, r, are combined to produce a 'similarity metric,' ρ, which can be used to classify unknown thermograms into pre-characterized categories. Simulated thermograms known to lie within or fall outside of the 90% quantile range around a median reference are also analyzed. Results verify the utility of the methods and establish the apparent dynamic range of the metric ρ. Methods are then applied to data obtained from a collection of plasma samples from patients clinically diagnosed with SLE (lupus). High correspondence is found between curve shapes and values of the metric ρ. In a final application, an elementary classification rule is implemented to successfully analyze and classify unlabeled thermograms. These methods constitute a set of powerful yet easy-to-implement tools for quantitative classification, analysis and interpretation of DSC plasma melting curves.
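The abstract combines a distance P and a correlation r into one similarity metric ρ; the exact combination rule is not given there, so the blend below (high r and low P push ρ toward 1) is only an illustrative stand-in, and the synthetic Gaussian "melting peaks" are hypothetical curves, not DSC data.

```python
import numpy as np

def similarity(curve, reference):
    # Illustrative rho: correlation rewards matching shape, distance penalizes offset.
    P = np.mean(np.abs(curve - reference))      # a simple distance metric
    r = np.corrcoef(curve, reference)[0, 1]     # shape correlation
    return r / (1.0 + P)

T = np.linspace(45, 90, 200)                    # temperature axis (deg C)
ref = np.exp(-0.5 * ((T - 63) / 3.0) ** 2)      # reference "melting peak"
same = np.exp(-0.5 * ((T - 63.5) / 3.0) ** 2)   # nearly the same shape
diff = np.exp(-0.5 * ((T - 80) / 2.0) ** 2)     # shifted peak, different class

rho_same = similarity(same, ref)
rho_diff = similarity(diff, ref)
```

Classification then reduces to assigning an unknown thermogram to the reference category with the largest ρ, mirroring the elementary rule the paper applies.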

  16. Statistical Power Flow Analysis of an Imperfect Ribbed Cylinder

    NASA Astrophysics Data System (ADS)

    Blakemore, M.; Woodhouse, J.; Hardie, D. J. W.

    1999-05-01

    Prediction of the noise transmitted from machinery and flow sources on a submarine to the sonar arrays poses a complex problem. Vibrations in the pressure hull provide the main transmission mechanism. The pressure hull is characterised by a very large number of modes over the frequency range of interest (at least 100,000) and by high modal overlap, both of which place its analysis beyond the scope of finite element or boundary element methods. A method for calculating the transmission is presented, which is broadly based on Statistical Energy Analysis, but extended in two important ways: (1) a novel subsystem breakdown which exploits the particular geometry of a submarine pressure hull; (2) explicit modelling of energy density variation within a subsystem due to damping. The method takes account of fluid-structure interaction, the underlying pass/stop band characteristics resulting from the near-periodicity of the pressure hull construction, the effect of vibration isolators such as bulkheads, and the cumulative effect of irregularities (e.g., attachments and penetrations).

  17. FTree query construction for virtual screening: a statistical analysis

    NASA Astrophysics Data System (ADS)

    Gerlach, Christof; Broughton, Howard; Zaliani, Andrea

    2008-02-01

    FTrees (FT) is a known chemoinformatic tool able to condense molecular descriptions into a graph object and to search for actives in large databases using graph similarity. The query graph is classically derived from a known active molecule, or a set of actives, for which a similar compound has to be found. Recently, FT similarity has been extended to fragment space, widening its capabilities. If a user were able to build a knowledge-based FT query from information other than a known active structure, the similarity search could be combined with other, normally separate, fields like de-novo design or pharmacophore searches. With this aim in mind, we performed a comprehensive analysis of several databases in terms of FT description and provide a basic statistical analysis of the FT spaces so far at hand. Vendors' catalogue collections and MDDR as a source of potential or known "actives", respectively, have been used. With the results reported herein, a set of ranges, mean values and standard deviations for several query parameters are presented in order to set a reference guide for the users. Applications on how to use this information in FT query building are also provided, using a newly built 3D-pharmacophore from 57 5HT-1F agonists and a published one which was used for virtual screening for tRNA-guanine transglycosylase (TGT) inhibitors.

  18. Higher order statistical frequency domain decomposition for operational modal analysis

    NASA Astrophysics Data System (ADS)

    Nita, G. M.; Mahgoub, M. A.; Sharyatpanahi, S. G.; Cretu, N. C.; El-Fouly, T. M.

    2017-02-01

Experimental methods based on modal analysis under ambient vibrational excitation are often employed to detect structural damage in mechanical systems. Many such frequency domain methods, such as Basic Frequency Domain (BFD), Frequency Domain Decomposition (FDD), or Enhanced Frequency Domain Decomposition (EFDD), use as a first step a Fast Fourier Transform (FFT) estimate of the power spectral density (PSD) associated with the response of the system. In this study it is shown that higher order statistical estimators such as Spectral Kurtosis (SK) and Sample to Model Ratio (SMR) may be successfully employed not only to more reliably discriminate the response of the system against the ambient noise fluctuations, but also to better identify and separate contributions from closely spaced individual modes. It is shown that a SMR-based Maximum Likelihood curve fitting algorithm may improve the accuracy of the spectral shape and location of the individual modes and, when combined with the SK analysis, it provides efficient means to categorize such individual spectral components according to their temporal dynamics as coherent or incoherent system responses to unknown ambient excitations.
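A common spectral kurtosis estimator over M averaged FFT segments is SK = ((M+1)/(M-1)) * (M * Σ P² / (Σ P)² - 1) per frequency bin, with P = |X|²: for pure Gaussian noise SK is near 1 in every bin, while a steady coherent tone drives SK in its bin toward 0, which is how coherent responses get flagged. The sketch below (synthetic signal, hypothetical parameters) demonstrates that behavior.

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 256, 128                                  # segments, samples per segment
t = np.arange(N)
x = rng.normal(size=(M, N))                      # ambient Gaussian "noise" response
x += 3.0 * np.sin(2 * np.pi * 16 * t / N)        # steady coherent tone in bin 16

P = np.abs(np.fft.rfft(x, axis=1)) ** 2          # per-segment power spectra
S1 = P.sum(axis=0)
S2 = (P ** 2).sum(axis=0)
sk = ((M + 1) / (M - 1)) * (M * S2 / S1 ** 2 - 1)  # SK per frequency bin
```

Bins dominated by noise cluster around SK = 1, while the tone's bin sits far below it, separating coherent from incoherent spectral components as described in the abstract.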

  19. Data Analysis & Statistical Methods for Command File Errors

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve fitting and distribution fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained by these variables. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics bore out as critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.
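The multiple-regression step can be sketched generically: fit error counts against candidate drivers by ordinary least squares and report how much of the variability they explain (R²). The variable names and simulated data below are hypothetical, not the JPL mission's dataset.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120
files_radiated = rng.integers(5, 50, n).astype(float)   # files radiated per period
workload = rng.uniform(0.0, 1.0, n)                     # subjective workload estimate
novelty = rng.uniform(0.0, 1.0, n)                      # operational novelty estimate
# Synthetic "errors": driven by files and workload, novelty is a red herring.
errors = 0.05 * files_radiated + 2.0 * workload + rng.normal(0.0, 0.3, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), files_radiated, workload, novelty])
beta, *_ = np.linalg.lstsq(X, errors, rcond=None)

resid = errors - X @ beta
r2 = 1.0 - resid.var() / errors.var()   # fraction of variability explained
```

The fitted coefficients play the role of the "critical drivers," and comparing observed rates against X @ beta gives the theoretically expected rate the paper's model supplies to project management.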

  20. RNA STRAND: The RNA Secondary Structure and Statistical Analysis Database

    PubMed Central

    Andronescu, Mirela; Bereg, Vera; Hoos, Holger H; Condon, Anne

    2008-01-01

    Background The ability to access, search and analyse secondary structures of a large set of known RNA molecules is very important for deriving improved RNA energy models, for evaluating computational predictions of RNA secondary structures and for a better understanding of RNA folding. Currently there is no database that can easily provide these capabilities for almost all RNA molecules with known secondary structures. Results In this paper we describe RNA STRAND – the RNA secondary STRucture and statistical ANalysis Database, a curated database containing known secondary structures of any type and organism. Our new database provides a wide collection of known RNA secondary structures drawn from public databases, searchable and downloadable in a common format. Comprehensive statistical information on the secondary structures in our database is provided using the RNA Secondary Structure Analyser, a new tool we have developed to analyse RNA secondary structures. The information thus obtained is valuable for understanding to which extent and with which probability certain structural motifs can appear. We outline several ways in which the data provided in RNA STRAND can facilitate research on RNA structure, including the improvement of RNA energy models and evaluation of secondary structure prediction programs. In order to keep up-to-date with new RNA secondary structure experiments, we offer the necessary tools to add solved RNA secondary structures to our database and invite researchers to contribute to RNA STRAND. Conclusion RNA STRAND is a carefully assembled database of trusted RNA secondary structures, with easy on-line tools for searching, analyzing and downloading user selected entries, and is publicly available at . PMID:18700982

  1. Statistical analyses of the magnet data for the advanced photon source storage ring magnets

    SciTech Connect

    Kim, S.H.; Carnegie, D.W.; Doose, C.; Hogrefe, R.; Kim, K.; Merl, R.

    1995-05-01

The statistics of the measured magnetic data of 80 dipole, 400 quadrupole, and 280 sextupole magnets of conventional resistive designs for the APS storage ring are summarized. In order to accommodate the vacuum chamber, the curved dipole has a C-type cross section and the quadrupole and sextupole cross sections have 180° and 120° symmetries, respectively. The data statistics include the integrated main fields, multipole coefficients, magnetic and mechanical axes, and roll angles of the main fields. The average and rms values of the measured magnet data meet the storage ring requirements.

  2. Advanced Materials and Solids Analysis Research Core (AMSARC)

    EPA Science Inventory

    The Advanced Materials and Solids Analysis Research Core (AMSARC), centered at the U.S. Environmental Protection Agency's (EPA) Andrew W. Breidenbach Environmental Research Center in Cincinnati, Ohio, is the foundation for the Agency's solids and surfaces analysis capabilities. ...

  3. Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a Quasi-Experiment

    ERIC Educational Resources Information Center

    Touchton, Michael

    2015-01-01

    I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…

  4. Use of statistical analysis in the biomedical informatics literature.

    PubMed

    Scotch, Matthew; Duggal, Mona; Brandt, Cynthia; Lin, Zhenqui; Shiffman, Richard

    2010-01-01

Statistics is an essential aspect of biomedical informatics. To examine the use of statistics in informatics research, a literature review of recent articles in two high-impact-factor biomedical informatics journals, the Journal of the American Medical Informatics Association (JAMIA) and the International Journal of Medical Informatics, was conducted. The use of statistical methods in each paper was examined. Articles of original investigations from 2000 to 2007 were reviewed. For each journal, the statistical methods used were categorized as: descriptive, elementary, multivariable, other regression, machine learning, and other statistics. For both journals, descriptive statistics were most often used. Elementary statistics such as t tests, chi-square, and Wilcoxon tests were much more frequent in JAMIA, while machine learning approaches such as decision trees and support vector machines were similar in occurrence across the journals. Also, the use of diagnostic statistics such as sensitivity, specificity, precision, and recall was more frequent in JAMIA. These results highlight the use of statistics in informatics and the need for biomedical informatics scientists to have, as a minimum, proficiency in descriptive and elementary statistics.

  5. Advanced probabilistic risk analysis using RAVEN and RELAP-7

    SciTech Connect

    Rabiti, Cristian; Alfonsi, Andrea; Mandelli, Diego; Cogliati, Joshua; Kinoshita, Robert

    2014-06-01

RAVEN, under the support of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program [1], is advancing its capability to perform statistical analyses of stochastic dynamic systems. This is aligned with its mission to provide the tools needed by the Risk Informed Safety Margin Characterization (RISMC) path-lead [2] under the Department of Energy (DOE) Light Water Reactor Sustainability program [3]. In particular, this task is focused on the synergistic development with the RELAP-7 [4] code to advance the state of the art on the safety analysis of nuclear power plants (NPP). The investigation of the probabilistic evolution of accident scenarios for a complex system such as a nuclear power plant is not a trivial challenge. The complexity of the system to be modeled leads to demanding computational requirements even to simulate one of the many possible evolutions of an accident scenario (tens of CPU-hours). At the same time, the probabilistic analysis requires thousands of runs to investigate outcomes characterized by low probability and severe consequence (tail problem). The milestone reported in June of 2013 [5] described the capability of RAVEN to implement complex control logic and provide an adequate support for the exploration of the probabilistic space using a Monte Carlo sampling strategy. Unfortunately the Monte Carlo approach is ineffective with a problem of this complexity. In the following year of development, the RAVEN code has been extended with more sophisticated sampling strategies (grids, Latin Hypercube, and adaptive sampling). This milestone report illustrates the effectiveness of those methodologies in performing the assessment of the probability of core damage following the onset of a Station Black Out (SBO) situation in a boiling water reactor (BWR). The first part of the report provides an overview of the available probabilistic analysis capabilities, ranging from the different types of distributions available, possible sampling
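The sampling idea can be sketched independently of RAVEN's own API: Latin Hypercube sampling splits each input dimension into n equal-probability bins and draws exactly one point per bin, so even a small number of runs covers the whole range of every parameter, which is what makes it more effective than plain Monte Carlo when each run costs tens of CPU-hours. The code below is a generic illustration, not RAVEN's implementation.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    # One random permutation of the n bins per dimension, then jitter each
    # sample uniformly inside its assigned 1/n-wide bin.
    cells = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (cells + rng.uniform(size=(n, d))) / n

rng = np.random.default_rng(4)
n = 16
pts = latin_hypercube(n, 2, rng)

# Stratification check: every 1/n bin in every dimension holds exactly one point
# (plain Monte Carlo with only n draws almost never achieves this).
covered = all(
    len(set(np.floor(pts[:, j] * n).astype(int))) == n for j in range(2)
)
```

Grids and adaptive sampling refine the same idea further by concentrating runs where the low-probability, severe-consequence outcomes live.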

  6. On Conceptual Analysis as the Primary Qualitative Approach to Statistics Education Research in Psychology

    ERIC Educational Resources Information Center

    Petocz, Agnes; Newbery, Glenn

    2010-01-01

    Statistics education in psychology often falls disappointingly short of its goals. The increasing use of qualitative approaches in statistics education research has extended and enriched our understanding of statistical cognition processes, and thus facilitated improvements in statistical education and practices. Yet conceptual analysis, a…

  7. Baseline Industry Analysis, Advance Ceramics Industry

    DTIC Science & Technology

    1993-04-01

Commerce, Department of Defense, and the National Critical Technologies Panel. Advanced ceramics, which include ceramic matrix composites, are found in... ceramics and materials industry being identified as a National Critical Technology, Commerce Emerging Technology, and Defense Critical Technology. There is... total procurement cost in advanced systems, and as much as ten percent of the electronics portion of those weapons. Ceramic capacitors are almost as

  8. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint

    SciTech Connect

    Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad

    2015-12-08

Uncertainties associated with solar forecasts present challenges to maintain grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
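The study's central error metric is simple to state: NRMSE is the root mean squared forecast error divided by a normalization constant, here taken as plant capacity (51 kW); other conventions normalize by the mean or range of the observations. The power values below are hypothetical, not the Vermont plant's data.

```python
import numpy as np

def nrmse(forecast, actual, norm):
    """Normalized root mean squared error: RMSE divided by a reference scale."""
    return np.sqrt(np.mean((forecast - actual) ** 2)) / norm

# Hypothetical hourly power output (kW) for a 51-kW plant.
actual = np.array([0.0, 12.0, 30.0, 45.0, 38.0, 20.0, 5.0])
forecast = np.array([0.0, 10.0, 33.0, 42.0, 40.0, 18.0, 6.0])

err = nrmse(forecast, actual, norm=51.0)   # dimensionless, comparable across models
```

Computing this per forecast model is what lets the study rank the ensemble against the three NWP models on a common scale.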

  9. Statistical analysis of mission profile parameters of civil transport airplanes

    NASA Technical Reports Server (NTRS)

    Buxbaum, O.

    1972-01-01

The statistical analysis of flight times as well as airplane gross weights and fuel weights of jet-powered civil transport airplanes has shown that the distributions of their frequency of occurrence per flight can be presented approximately in general form. However, before these results may be used during the project stage of an airplane for defining a typical mission profile (the parameters of which are assumed to occur, for example, with a probability of 50 percent), the following points have to be taken into account. Because the individual airplanes were rotated during service, the scatter between the distributions of mission profile parameters for airplanes of the same type, which were flown with similar payload, has proven to be very small. Significant deviations from the generalized distributions may occur if an operator uses one airplane preferably on one or two specific routes. Another reason for larger deviations could be that the maintenance services of the operators of the observed airplanes are not representative of other airlines. Although there are indications that this is unlikely, similar information should be obtained from other operators. Such information would improve the reliability of the data.

  10. Statistical Analysis of Resistivity Anomalies Caused by Underground Caves

    NASA Astrophysics Data System (ADS)

    Frid, V.; Averbach, A.; Frid, M.; Dudkinski, D.; Liskevich, G.

    2017-03-01

Geophysical prospecting of underground caves on a construction site is often still a challenging procedure. Estimating the likelihood level of an anomaly found is frequently a mandatory requirement of a project principal, owing to the necessity of risk/safety assessment. However, the methodology of such estimation has not hitherto been developed. Aiming to put forward such a methodology, the present study (performed as part of an underground cave mapping prior to land development on the site area) applied electrical resistivity tomography (ERT) together with statistical analysis for the likelihood assessment of the underground anomalies located. The methodology was first verified via a synthetic modeling technique, applied to the in situ collected ERT data, and then cross-referenced with intrusive investigations (excavation and drilling) for data verification. The drilling/excavation results showed that underground caves can be properly discovered if the anomaly probability level is not lower than 90%. Such a probability value was shown to be consistent with the modeling results. More than 30 underground cavities were discovered on the site utilizing the methodology.

  11. Statistical Analysis of the AIAA Drag Prediction Workshop CFD Solutions

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop (DPW), held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third AIAA Drag Prediction Workshop, held in June 2006, focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This report compares the transonic cruise prediction results of the second and third workshops using statistical analysis.

  12. Metrology optical power budgeting in SIM using statistical analysis techniques

    NASA Astrophysics Data System (ADS)

    Kuan, Gary M.

    2008-07-01

The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be tracked. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.

  13. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis

    SciTech Connect

    Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad

    2015-10-02

    Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded the smallest NRMSE for all forecast parameters. The NRMSE of the ensemble model's solar irradiance forecasts was 28.10% lower than that of the best of the three NWP models. Further, the sensitivity analysis indicated that errors in the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
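
The NRMSE metric used above is straightforward to compute. The sketch below normalizes by plant capacity (one common convention; the study's exact normalization may differ) and uses made-up hourly values rather than the study's data.

```python
import numpy as np

def nrmse(forecast, observed, capacity_kw=51.0):
    """Root-mean-squared error normalized by plant capacity (assumed convention)."""
    err = np.asarray(forecast, float) - np.asarray(observed, float)
    return float(np.sqrt(np.mean(err ** 2)) / capacity_kw)

# Illustrative hourly power values (kW); not the study's data.
observed = np.array([0.0, 5.2, 18.0, 33.5, 41.0, 36.2, 20.1, 4.0])
nwp      = np.array([0.0, 8.0, 14.5, 38.0, 45.5, 30.0, 25.0, 2.0])
ensemble = np.array([0.0, 6.0, 16.5, 35.0, 43.0, 34.0, 22.0, 3.5])

print(f"NWP NRMSE:      {nrmse(nwp, observed):.4f}")
print(f"Ensemble NRMSE: {nrmse(ensemble, observed):.4f}")
```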

  14. Parameterization of 3D brain structures for statistical shape analysis

    NASA Astrophysics Data System (ADS)

    Zhu, Litao; Jiang, Tianzi

    2004-05-01

    Statistical Shape Analysis (SSA) is a powerful tool for noninvasive studies of pathophysiology and diagnosis of brain diseases. It also provides a shape constraint for the segmentation of brain structures. There are two key problems in SSA: the representation of shapes and their alignment. The widely used parameterized representations are obtained by preserving angles or areas, and the alignments of shapes are achieved by rotating the parameter net. However, representations preserving angles or areas do not really guarantee the anatomical correspondence of brain structures. In this paper, we incorporate shape-based landmarks into the parameterization of banana-like 3D brain structures to address this problem. First, we obtain the triangulated surface of the object and extract two landmarks from the mesh, i.e. the ends of the banana-like object. Then the surface is parameterized by creating a continuous and bijective mapping from the surface to a spherical surface based on a heat conduction model. The correspondence of shapes is achieved by mapping the two landmarks to the north and south poles of the sphere and using an extracted origin orientation to select the dateline during parameterization. We apply our approach to the parameterization of the lateral ventricle, and a multi-resolution shape representation is obtained by using the Discrete Fourier Transform.

  15. Statistical shape analysis of subcortical structures using spectral matching.

    PubMed

    Shakeri, Mahsa; Lombaert, Herve; Datta, Alexandre N; Oser, Nadine; Létourneau-Guillon, Laurent; Lapointe, Laurence Vincent; Martin, Florence; Malfait, Domitille; Tucholka, Alan; Lippé, Sarah; Kadoury, Samuel

    2016-09-01

    Morphological changes of subcortical structures are often predictive of neurodevelopmental and neurodegenerative diseases, such as Alzheimer's disease and schizophrenia. Hence, methods for quantifying morphological variations in the brain anatomy, including groupwise shape analyses, are becoming increasingly important for studying neurological disorders. In this paper, a novel groupwise shape analysis approach is proposed to detect regional morphological alterations in subcortical structures between two study groups, e.g., healthy and pathological subjects. The proposed scheme extracts smoothed triangulated surface meshes from segmented binary maps, and establishes reliable point-to-point correspondences among the population of surfaces using a spectral matching method. Mean curvature features are incorporated in the matching process, in order to increase the accuracy of the established surface correspondence. The mean shapes are created as the geometric mean of all surfaces in each group, and a distance map between these shapes is used to characterize the morphological changes between the two study groups. The resulting distance map is further analyzed to check for statistically significant differences between two populations. The performance of the proposed framework is evaluated on two separate subcortical structures (hippocampus and putamen). Furthermore, the proposed methodology is validated in a clinical application for detecting abnormal subcortical shape variations in Alzheimer's disease. Experimental results show that the proposed method is comparable to state-of-the-art algorithms, has less computational cost, and is more sensitive to small morphological variations in patients with neuropathologies.

  16. High Statistics Analysis of Nucleon form Factor in Lattice QCD

    NASA Astrophysics Data System (ADS)

    Shintani, Eigo; Wittig, Hartmut

    We systematically study the effect of excited-state contamination on the signals of the nucleon axial, (iso-)scalar, and tensor charges, extracted from three-point functions with various sets of source-sink separations. To reach statistics of O(10,000) measurements at reduced computational cost, we use the all-mode-averaging technique, approximating the observable with an optimized size of the local deflation field and block size of the Schwarz alternating procedure. The numerical study covers source-sink separations (ts) from 0.8 fm to more than 1.5 fm at several cutoff scales (a^(-1) = 3-4 GeV) and pion masses (mπ = 0.19-0.45 GeV), keeping the volume at mπL > 4, on Nf = 2 Wilson-clover fermion configurations of the Mainz-CLS group. We find that in the measurement of the axial charge a significant effect of unsuppressed excited-state contamination appears below ts = 1.2 fm, even in the light-pion region, whereas the effect is small for the scalar and tensor charges. In the analysis using ts > 1.5 fm, the axial charge approaches the experimental result near the physical point.

  17. Statistical analysis of barrier isolator/glovebox glove failure.

    PubMed

    Park, Young H; Pines, E; Ofouku, M; Cournoyer, M E

    2007-01-01

    In response to new, stricter safety requirements set out by the federal government, compounding pharmacists are investigating applications and processes appropriate for their facilities. One application, currently used by many industries, was developed by Los Alamos National Laboratory for defense work. A barrier isolator or "glovebox" is a containment device that allows work within a sealed space while providing protection for people and the environment. Though knowledge of glovebox use and maintenance has grown, unplanned breaches (e.g., glove failures) remain a concern. Recognizing that effective maintenance procedures can minimize breaches, we analyzed data drawn from glove failure records of Los Alamos National Laboratory's Nuclear Materials Technology Division to evaluate the current inventory strategy in light of the actual performance of the various types of gloves. This report includes a description of the statistical methods employed. The results of our analysis pinpointed the most frequently occurring causes of glove failure and revealed a significant mismatch between the current glove replacement schedule and the much shorter period within which gloves actually fail. We concluded that, to minimize unplanned breaches, either the replacement period needs to be adjusted or the causes of failure eliminated or reduced.

  18. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry-based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be tracked. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.

  19. Interval versions of statistical techniques with applications to environmental analysis, bioinformatics, and privacy in statistical databases

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Starks, Scott A.; Xiang, Gang; Beck, Jan; Kandathi, Raj; Nayak, Asis; Ferson, Scott; Hajagos, Janos

    2007-02-01

    In many areas of science and engineering, it is desirable to estimate statistical characteristics (mean, variance, covariance, etc.) under interval uncertainty. For example, we may want to use the measured values x(t) of a pollution level in a lake at different moments of time to estimate the average pollution level; however, we do not know the exact values x(t)--e.g., if one of the measurement results is 0, this simply means that the actual (unknown) value of x(t) can be anywhere between 0 and the detection limit (DL). We must, therefore, modify the existing statistical algorithms to process such interval data. Such a modification is also necessary to process data from statistical databases, where, in order to maintain privacy, we only keep interval ranges instead of the actual numeric data (e.g., a salary range instead of the actual salary). Most resulting computational problems are NP-hard--which means, crudely speaking, that in general, no computationally efficient algorithm can solve all particular cases of the corresponding problem. In this paper, we overview practical situations in which computationally efficient algorithms exist: e.g., situations when measurements are very accurate, or when all the measurements are done with one (or few) instruments. As a case study, we consider a practical problem from bioinformatics: to discover the genetic difference between the cancer cells and the healthy cells, we must process the measurements results and find the concentrations c and h of a given gene in cancer and in healthy cells. This is a particular case of a general situation in which, to estimate states or parameters which are not directly accessible by measurements, we must solve a system of equations in which coefficients are only known with interval uncertainty. We show that in general, this problem is NP-hard, and we describe new efficient algorithms for solving this problem in practically important situations.
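
For the simplest cases, the range of a statistic under interval uncertainty can be computed directly. The sketch below treats a nondetect as the interval [0, DL], computes the exact range of the mean, and bounds the population variance: the maximum by vertex enumeration (exact because variance is convex, though exponential in n, so only for small samples), the minimum by clipping every interval toward a common value m on a fine scan. This is a toy illustration, not the paper's efficient algorithms.

```python
import itertools
import numpy as np

def pvar(x):
    """Population variance."""
    x = np.asarray(x, float)
    return float(np.mean((x - x.mean()) ** 2))

def interval_mean(lo, hi):
    """Exact range of the sample mean when each x_i lies in [lo_i, hi_i]."""
    return float(np.mean(lo)), float(np.mean(hi))

def interval_variance(lo, hi):
    """Range of the population variance over the box [lo_1,hi_1] x ... x [lo_n,hi_n].
    Max: a convex function attains its maximum at a vertex of the box.
    Min: approximated by pulling every interval toward a common value m."""
    vmax = max(pvar(v) for v in itertools.product(*zip(lo, hi)))
    vmin = min(pvar(np.clip(m, lo, hi))
               for m in np.linspace(min(lo), max(hi), 2001))
    return vmin, vmax

# One nondetect (anywhere between 0 and DL = 0.4) plus two exact readings.
lo = [0.0, 0.6, 0.9]
hi = [0.4, 0.6, 0.9]
m_lo, m_hi = interval_mean(lo, hi)
vmin, vmax = interval_variance(lo, hi)
print((m_lo, m_hi), (vmin, vmax))
```

The exponential cost of the vertex enumeration is exactly the NP-hardness the paper discusses; its contribution is identifying practical cases where efficient algorithms exist instead.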

  20. Bayesian Analysis of Order-Statistics Models for Ranking Data.

    ERIC Educational Resources Information Center

    Yu, Philip L. H.

    2000-01-01

    Studied the order-statistics models, extending the usual normal order-statistics model into one in which the underlying random variables followed a multivariate normal distribution. Used a Bayesian approach and the Gibbs sampling technique. Applied the proposed method to analyze presidential election data from the American Psychological…

  1. The Higher Education System in Israel: Statistical Abstract and Analysis.

    ERIC Educational Resources Information Center

    Herskovic, Shlomo

    This edition of a statistical abstract published every few years on the higher education system in Israel presents the most recent data available through 1990-91. The data were gathered through the cooperation of the Central Bureau of Statistics and institutions of higher education. Chapter 1 presents a summary of principal findings covering the…

  2. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    ERIC Educational Resources Information Center

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics for environmental and biological sciences students through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  3. Combined statistical analysis of landslide release and propagation

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Rohmaneo, Mohammad; Chu, Hone-Jay

    2016-04-01

    Statistical methods - often coupled with stochastic concepts - are commonly employed to relate areas affected by landslides with environmental layers, and to estimate spatial landslide probabilities by applying these relationships. However, such methods only concern the release of landslides, disregarding their motion. Conceptual models for mass flow routing are used for estimating landslide travel distances and possible impact areas. Automated approaches combining release and impact probabilities are rare. The present work attempts to fill this gap by a fully automated procedure combining statistical and stochastic elements, building on the open source GRASS GIS software: (1) The landslide inventory is subset into release and deposition zones. (2) We employ a traditional statistical approach to estimate the spatial release probability of landslides. (3) We back-calculate the probability distribution of the angle of reach of the observed landslides, employing the software tool r.randomwalk. One set of random walks is routed downslope from each pixel defined as release area. Each random walk stops when leaving the observed impact area of the landslide. (4) The cumulative probability function (cdf) derived in (3) is used as input to route a set of random walks downslope from each pixel in the study area through the DEM, assigning the probability gained from the cdf to each pixel along the path (impact probability). The impact probability of a pixel is defined as the average impact probability of all sets of random walks impacting a pixel. Further, the average release probabilities of the release pixels of all sets of random walks impacting a given pixel are stored along with the area of the possible release zone. (5) We compute the zonal release probability by increasing the release probability according to the size of the release zone - the larger the zone, the larger the probability that a landslide will originate from at least one pixel within this zone. We
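
A toy illustration of the random-walk routing idea (not the r.randomwalk implementation): walks are routed downslope on a small DEM until they reach a pit, and the angle of reach is recorded for each walk; collecting such angles over an inventory yields the empirical distribution used in step (3). The DEM and cell size below are made up.

```python
import math
import random

random.seed(1)
# Toy DEM (elevations in m); cell size 10 m. Illustrative only.
dem = [
    [100, 98, 96, 95],
    [ 99, 97, 95, 93],
    [ 98, 96, 94, 91],
    [ 97, 95, 92, 90],
]
CELL = 10.0

def lower_neighbors(r, c):
    """All 8-connected neighbours strictly lower than cell (r, c)."""
    out = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr or dc) and 0 <= rr < len(dem) and 0 <= cc < len(dem[0]) \
                    and dem[rr][cc] < dem[r][c]:
                out.append((rr, cc))
    return out

def random_walk(r0, c0):
    """Route one walk downslope to a pit; return its angle of reach (degrees)."""
    r, c = r0, c0
    while (nbrs := lower_neighbors(r, c)):
        r, c = random.choice(nbrs)          # random downslope step
    drop = dem[r0][c0] - dem[r][c]
    dist = CELL * math.hypot(r - r0, c - c0)
    return math.degrees(math.atan2(drop, dist)) if dist else 0.0

angles = [random_walk(0, 0) for _ in range(200)]
```

On a real inventory the list of angles would be turned into the cumulative probability function that r.randomwalk then uses to assign impact probabilities along each path.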

  4. Parallelization of the Physical-Space Statistical Analysis System (PSAS)

    NASA Technical Reports Server (NTRS)

    Larson, J. W.; Guo, J.; Lyster, P. M.

    1999-01-01

    Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128MB) run-time database of parameters used in the calculation of these operators. Implementing a message passing parallel computing paradigm into an existing yet developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. The problem of computational
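
The preconditioned CG solver at the heart of the innovation equation can be sketched generically; the system below is a random symmetric positive-definite stand-in for the innovation matrix, with a simple Jacobi (diagonal) preconditioner rather than PSAS's actual factored-operator formulation.

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, maxiter=500):
    """Jacobi-preconditioned conjugate gradients for SPD A x = b."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv_diag * r          # apply diagonal preconditioner
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

rng = np.random.default_rng(0)
Q = rng.standard_normal((50, 50))
A = Q @ Q.T + 50 * np.eye(50)   # SPD stand-in for the innovation matrix
b = rng.standard_normal(50)
x = pcg(A, b, 1.0 / np.diag(A))
```

In PSAS the matrix-vector products `A @ p` would instead be evaluated as a successive product of sparse factored operators, which is where the icosahedral partitioning pays off.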

  5. A comparison of InVivoStat with other statistical software packages for analysis of data generated from animal experiments.

    PubMed

    Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T

    2012-08-01

    InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.

  6. New advances in methodology for statistical tests useful in geostatistical studies

    SciTech Connect

    Borgman, L.E.

    1988-05-01

    Methodology for statistical procedures to perform tests of hypothesis pertaining to various aspects of geostatistical investigations has been slow in developing. The correlated nature of the data precludes most classical tests and makes the design of new tests difficult. Recent studies have led to modifications of the classical t test which allow for the intercorrelation. In addition, results for certain nonparametric tests have been obtained. The conclusions of these studies provide a variety of new tools for the geostatistician in deciding questions on significant differences and magnitudes.
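
One widely used correction in this spirit (not Borgman's exact procedure, which is not reproduced here) replaces the sample size in the classical t test with an effective sample size derived from an AR(1) model of the correlation:

```python
import numpy as np

def effective_n(x):
    """Effective sample size under an AR(1) model of the serial/spatial correlation."""
    x = np.asarray(x, float) - np.mean(x)
    rho = float(x[:-1] @ x[1:]) / float(x @ x)   # lag-1 autocorrelation estimate
    rho = min(max(rho, -0.99), 0.99)
    return len(x) * (1 - rho) / (1 + rho)

def corrected_t(x, mu0=0.0):
    """One-sample t statistic with n replaced by the effective sample size."""
    x = np.asarray(x, float)
    n_eff = effective_n(x)
    t = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n_eff))
    return t, n_eff

# AR(1) series with strong positive correlation: n_eff falls far below n.
rng = np.random.default_rng(5)
x = np.zeros(500)
for i in range(1, 500):
    x[i] = 0.8 * x[i - 1] + rng.standard_normal()
t, n_eff = corrected_t(x)
```

Ignoring the correlation (using n = 500 directly) would badly overstate the significance of any difference, which is precisely the problem such modified tests address.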

  7. A Complex Approach to UXO Discrimination: Combining Advanced EMI Forward Models and Statistical Signal Processing

    DTIC Science & Technology

    2012-01-01

    [Excerpt] …assume that the NSMS can be approximated by a series of expansion functions F_m(θ), so that the NSMS is expressed as a sum of the F_m(θ) over m = 1, …, M (Eq. 31)… The quantity measured by a receiver coil is the electromotive force, given by the negative of the time derivative of the secondary magnetic flux through the coil. Since the… A support vector machine learns from data: when fed a series… (MM-1572 Final Report, Sky Research, Inc., January 2012)

  8. A deterministic and statistical energy analysis of tyre cavity resonance noise

    NASA Astrophysics Data System (ADS)

    Mohamed, Zamri; Wang, Xu

    2016-03-01

    Tyre cavity resonance was studied using a combination of deterministic analysis and statistical energy analysis: the deterministic part was implemented with the impedance compact mobility matrix method, and the statistical part with the statistical energy analysis method. While the impedance compact mobility matrix method can offer a deterministic solution for the cavity pressure response and the compliant wall vibration velocity response in the low frequency range, the statistical energy analysis method can offer a statistical solution for the responses in the high frequency range. In the mid frequency range, a combination of the statistical energy analysis and deterministic analysis methods can identify system coupling characteristics. Both methods have been compared against commercial software packages in order to validate the results. The combined analysis result has been verified by measurements on a tyre-cavity physical model. The analysis method developed in this study can be applied to other similar toroidal structural-acoustic systems.
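
For two coupled subsystems, the statistical-energy-analysis part reduces to a small linear power-balance system for the subsystem energies. The loss factors, band frequency, and injected power below are illustrative assumptions for a structure-cavity pair, not values from the study.

```python
import numpy as np

# Two-subsystem SEA power balance at band-centre frequency omega:
#   P_i = omega * (eta_i * E_i + eta_ij * E_i - eta_ji * E_j)
omega = 2 * np.pi * 400.0          # rad/s, 400 Hz band (assumed)
eta1, eta2 = 0.05, 0.01            # damping loss factors: structure, cavity (assumed)
eta12, eta21 = 0.004, 0.002        # coupling loss factors (assumed)
P = np.array([1.0, 0.0])           # watts injected: structure driven, cavity not

L = omega * np.array([
    [eta1 + eta12, -eta21],
    [-eta12,       eta2 + eta21],
])
E = np.linalg.solve(L, P)          # subsystem energies (J)
print(E)
```

The same balance generalizes to any number of subsystems, which is what makes SEA tractable in the high-frequency range where deterministic modal models become impractical.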

  9. SUBMILLIMETER NUMBER COUNTS FROM STATISTICAL ANALYSIS OF BLAST MAPS

    SciTech Connect

    Patanchon, Guillaume; Ade, Peter A. R.; Griffin, Matthew; Hargrave, Peter C.; Mauskopf, Philip; Moncelsi, Lorenzo; Pascale, Enzo; Bock, James J.; Chapin, Edward L.; Halpern, Mark; Marsden, Gaelen; Scott, Douglas; Devlin, Mark J.; Dicker, Simon R.; Klein, Jeff; Rex, Marie; Gundersen, Joshua O.; Hughes, David H.; Netterfield, Calvin B.; Olmi, Luca

    2009-12-20

    We describe the application of a statistical method to estimate submillimeter galaxy number counts from confusion-limited observations by the Balloon-borne Large Aperture Submillimeter Telescope (BLAST). Our method is based on a maximum likelihood fit to the pixel histogram, sometimes called 'P(D)', an approach which has been used before to probe faint counts, the difference being that here we advocate its use even for sources with relatively high signal-to-noise ratios. This method has an advantage over standard techniques of source extraction in providing an unbiased estimate of the counts from the bright end down to flux densities well below the confusion limit. We specifically analyze BLAST observations of a roughly 10 deg^2 map centered on the Great Observatories Origins Deep Survey South field. We provide estimates of number counts at the three BLAST wavelengths 250, 350, and 500 μm; instead of counting sources in flux bins we estimate the counts at several flux density nodes connected with power laws. We observe a generally very steep slope for the counts of about -3.7 at 250 μm, and -4.5 at 350 and 500 μm, over the range ≈0.02-0.5 Jy, breaking to a shallower slope below about 0.015 Jy at all three wavelengths. We also describe how to estimate the uncertainties and correlations in this method so that the results can be used for model-fitting. This method should be well suited for analysis of data from the Herschel satellite.
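
The essence of a P(D) fit is maximizing the likelihood of the map's pixel histogram under a counts model. The sketch below uses the simplest possible model, all sources at one flux with Poisson counts per beam plus Gaussian noise, rather than the paper's power-law flux-density nodes; all numbers are synthetic.

```python
import math
import numpy as np

rng = np.random.default_rng(2)
S, sigma, mu_true, npix = 1.0, 0.3, 0.5, 20000
counts = rng.poisson(mu_true, npix)
d = counts * S + rng.normal(0, sigma, npix)   # simulated confusion-limited map

def loglike(mu, kmax=12):
    """Log-likelihood of the pixel values: Poisson mixture of Gaussians."""
    ks = np.arange(kmax + 1)
    w = np.exp(-mu) * mu ** ks / np.array([math.factorial(k) for k in ks])
    pdf = np.zeros_like(d)
    for k, wk in zip(ks, w):
        pdf += wk * np.exp(-0.5 * ((d - k * S) / sigma) ** 2) \
                  / (sigma * math.sqrt(2 * math.pi))
    return float(np.sum(np.log(pdf)))

# Maximum likelihood over a grid of source densities per beam.
grid = np.linspace(0.1, 1.5, 57)
mu_hat = grid[np.argmax([loglike(m) for m in grid])]
```

No individual source is ever extracted: the source density is inferred from the histogram shape alone, which is why the method works below the confusion limit.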

  10. 3D statistical failure analysis of monolithic dental ceramic crowns.

    PubMed

    Nasrin, Sadia; Katsube, Noriko; Seghi, Robert R; Rokhlin, Stanislav I

    2016-07-05

    For adhesively retained ceramic crowns of various types, it has been clinically observed that the most catastrophic failures initiate from the cement interface as a result of radial crack formation, as opposed to Hertzian contact stresses originating on the occlusal surface. In this work, a 3D failure prognosis model is developed for interface-initiated failures of monolithic ceramic crowns. The surface flaw distribution parameters determined by biaxial flexural tests on ceramic plates and point-to-point variations of the multi-axial stress state at the intaglio surface are obtained by finite element stress analysis. They are combined on the basis of a fracture mechanics based statistical failure probability model to predict the failure probability of a monolithic crown subjected to a single-cycle indentation load. The proposed method is verified against a prior 2D axisymmetric model and experimental data. Under conditions where the crowns are completely bonded to the tooth substrate, both high flexural stress and high interfacial shear stress are shown to occur in the wall region where the crown thickness is relatively thin, while a high interfacial normal tensile stress distribution is observed at the margin region. A significant impact of reduced cement modulus on these stress states is shown. While the analyses are limited to single-cycle load-to-failure tests, high interfacial normal tensile stress or high interfacial shear stress may contribute to degradation of the cement bond between ceramic and dentin. In addition, the crown failure probability is shown to be controlled by high flexural stress concentrations over a small area, and the proposed method might be of some value to detect initial crown design errors.
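
A fracture-mechanics-based statistical failure model of this kind typically combines a Weibull surface-flaw distribution with the computed stress field element by element. A minimal sketch, with assumed Weibull parameters and made-up FEA-like output (not the paper's data or exact formulation):

```python
import numpy as np

# Weibull surface-flaw statistics (parameters as would come from biaxial
# flexural tests on plates; the numbers here are assumptions).
m, sigma0 = 10.0, 120.0   # Weibull modulus; characteristic strength, MPa

def failure_probability(stresses_mpa, areas_mm2, a0_mm2=1.0):
    """Pf = 1 - exp(-sum_i (sigma_i/sigma0)^m * A_i/A0) over surface elements."""
    s = np.clip(np.asarray(stresses_mpa, float), 0.0, None)  # tension only
    risk = np.sum((s / sigma0) ** m * np.asarray(areas_mm2, float) / a0_mm2)
    return float(1.0 - np.exp(-risk))

# Illustrative intaglio-surface elements: a thin wall region at high stress.
stresses = [40, 60, 85, 110, 95]       # first-principal stress, MPa
areas    = [2.0, 2.0, 1.5, 0.5, 1.0]   # element area, mm^2
pf = failure_probability(stresses, areas)
```

With a large Weibull modulus the risk sum is dominated by the few highest-stress elements, mirroring the paper's observation that failure probability is controlled by stress concentrations over a small area.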

  11. Application of the Statistical ICA Technique in the DANCE Data Analysis

    NASA Astrophysics Data System (ADS)

    Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration

    2015-10-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the total energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to separate contributions to the Esum spectra from different isotopes with similar Q-values. Recently we have tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate the (n, γ) reaction yields of the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present results of the application of ICA algorithms and their modification for the DANCE experimental data analysis. This research is supported by the U. S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
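
ICA itself can be demonstrated in a few lines: the toy sketch below implements a symmetric FastICA iteration (tanh nonlinearity) on two artificial mixed sources. It is a generic illustration of the technique, not the DANCE analysis code.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
t = np.linspace(0, 8, n)
s1 = np.sign(np.sin(3 * t))            # square-wave source
s2 = rng.uniform(-1, 1, n)             # uniform-noise source
A = np.array([[1.0, 0.6], [0.4, 1.0]]) # unknown mixing matrix
X = A @ np.vstack([s1, s2])            # observed mixtures

# Whitening: rotate and scale the data to unit covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Symmetric FastICA fixed-point iteration with the tanh nonlinearity.
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W = G @ Z.T / n - np.diag((1 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)        # symmetric decorrelation: W <- (WW^T)^(-1/2) W
    W = U @ Vt

R = W @ Z                              # recovered components (up to order/sign/scale)

def acorr(a, b):
    return abs(np.corrcoef(a, b)[0, 1])

m1 = max(acorr(R[0], s1), acorr(R[1], s1))
m2 = max(acorr(R[0], s2), acorr(R[1], s2))
```

Each recovered component correlates strongly with one original source, which is the property exploited when separating overlapping isotope yields in the Esum spectra.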

  12. Statistical Design, Models and Analysis for the Job Change Framework.

    ERIC Educational Resources Information Center

    Gleser, Leon Jay

    1990-01-01

    Proposes statistical methodology for testing Loughead and Black's "job change thermostat." Discusses choice of target population; relationship between job satisfaction and values, perceptions, and opportunities; and determinants of job change. (SK)

  13. Automatic Derivation of Statistical Data Analysis Algorithms: Planetary Nebulae and Beyond

    NASA Astrophysics Data System (ADS)

    Fischer, Bernd; Hajian, Arsen; Knuth, Kevin; Schumann, Johann

    2004-04-01

    AUTOBAYES is a fully automatic program synthesis system for the data analysis domain. Its input is a declarative problem description in the form of a statistical model; its output is documented and optimized C/C++ code. The synthesis process relies on the combination of three key techniques. Bayesian networks are used as a compact internal representation mechanism which enables problem decompositions and guides the algorithm derivation. Program schemas are used as independently composable building blocks for the algorithm construction; they can encapsulate advanced algorithms and data structures. A symbolic-algebraic system is used to find closed-form solutions for problems and emerging subproblems. In this paper, we describe the application of AUTOBAYES to the analysis of planetary nebulae images taken by the Hubble Space Telescope. We explain the system architecture, and present in detail the automatic derivation of the scientists' original analysis as well as a refined analysis using clustering models. This study demonstrates that AUTOBAYES is now mature enough so that it can be applied to realistic scientific data analysis tasks.

  14. Unbiased statistical analysis for multi-stage proteomic search strategies.

    PubMed

    Everett, Logan J; Bierl, Charlene; Master, Stephen R

    2010-02-05

    "Multi-stage" search strategies have become widely accepted for peptide identification and are implemented in a number of available software packages. We describe limitations of these strategies for validation and decoy-based statistical analyses and demonstrate these limitations using a set of control sample spectra. We propose a solution that corrects the statistical deficiencies and describe its implementation using the open-source software X!Tandem.

  15. Recent advances in statistical methods for the estimation of sediment and nutrient transport in rivers

    NASA Astrophysics Data System (ADS)

    Colin, T. A.

    1995-07-01

    This paper reviews advances in methods for estimating fluvial transport of suspended sediment and nutrients. Research from the past four years, mostly dealing with estimating monthly and annual loads, is emphasized. However, because this topic has not appeared in previous IUGG reports, some research prior to 1990 is included. The motivation for studying sediment transport has shifted during the past few decades. In addition to its role in filling reservoirs and channels, sediment is increasingly recognized as an important part of fluvial ecosystems and estuarine wetlands. Many groups want information about sediment transport [Bollman, 1992]: Scientists trying to understand benthic biology and catchment hydrology; citizens and policy-makers concerned about environmental impacts (e.g. impacts of logging [Beschta, 1978] or snow-fences [Sturges, 1992]); government regulators considering the effectiveness of programs to protect in-stream habitat and downstream waterbodies; and resource managers seeking to restore wetlands.

  16. Statistical analysis of synaptic transmission: model discrimination and confidence limits.

    PubMed Central

    Stricker, C; Redman, S; Daley, D

    1994-01-01

    Procedures for discriminating between competing statistical models of synaptic transmission, and for providing confidence limits on the parameters of these models, have been developed. These procedures were tested against simulated data and were used to analyze the fluctuations in synaptic currents evoked in hippocampal neurones. All models were fitted to data using the Expectation-Maximization algorithm and a maximum likelihood criterion. Competing models were evaluated using the log-likelihood ratio (Wilks statistic). When the competing models were not nested, Monte Carlo sampling of the model used as the null hypothesis (H0) provided density functions against which H0 and the alternate model (H1) were tested. The statistic for the log-likelihood ratio was determined from the fit of H0 and H1 to these probability densities. This statistic was used to determine the significance level at which H0 could be rejected for the original data. When the competing models were nested, log-likelihood ratios and the χ2 statistic were used to determine the confidence level for rejection. Once the model that provided the best statistical fit to the data was identified, many estimates for the model parameters were calculated by resampling the original data. Bootstrap techniques were then used to obtain the confidence limits of these parameters. PMID:7948672
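
The resampling step described at the end can be sketched directly: draw bootstrap replicates of the data, recompute the parameter estimate for each, and take percentiles as confidence limits. The data below are synthetic stand-ins, not the paper's recordings.

```python
import numpy as np

rng = np.random.default_rng(4)
# Illustrative evoked-amplitude sample (pA); synthetic, for demonstration only.
amps = rng.normal(120.0, 25.0, 200)

def bootstrap_ci(data, stat=np.mean, nboot=5000, alpha=0.05):
    """Percentile bootstrap confidence interval for a statistic."""
    stats = np.array([
        stat(rng.choice(data, size=len(data), replace=True))
        for _ in range(nboot)
    ])
    lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return float(lo), float(hi)

lo, hi = bootstrap_ci(amps)
```

The same resampling loop works unchanged for parameters of a fitted quantal model: `stat` simply becomes "fit the model, return the parameter," at the cost of one model fit per replicate.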

  17. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogenous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, to both save money and recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. The systematic use of statistical tools was employed to analyze historical data, including linear and partial correlations applied to the data and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.

  18. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate degree programs in systems engineering and the computing science disciplines (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI models imply the use of analytical, statistical, and quantitative management techniques, together with process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gaps between the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization

  19. Proposed neutron activation analysis facilities in the Advanced Neutron Source

    SciTech Connect

    Robinson, L.; Dyer, F.F.; Emery, J.F.

    1990-01-01

    A number of analytical chemistry experimental facilities are being proposed for the Advanced Neutron Source. Experimental capabilities will include gamma-ray analysis and neutron depth profiling. This paper describes the various systems proposed and some of their important characteristics.

  20. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S environments and infrastructure.

  1. Advanced Fingerprint Analysis Project Fingerprint Constituents

    SciTech Connect

    GM Mong; CE Petersen; TRW Clauss

    1999-10-29

    The work described in this report was focused on generating fundamental data on fingerprint components which will be used to develop advanced forensic techniques to enhance fluorescent detection, and visualization of latent fingerprints. Chemical components of sweat gland secretions are well documented in the medical literature and many chemical techniques are available to develop latent prints, but there have been no systematic forensic studies of fingerprint sweat components or of the chemical and physical changes these substances undergo over time.

  2. Advanced Trending Analysis/EDS Data Program.

    DTIC Science & Technology

    1982-01-01

    Fault Detection and Isolation (TEFDI) Program, SCT was to use the Advanced Trend...For a detailed discussion of the algorithm and its underlying theory, the reader is directed to SCT's Turbine Engine Fault Detection and Isolation (TEFDI) Program Final Report, scheduled for release in early 1982. 2. DISCUSSION OF RESULTS -

  3. Advanced nuclear rocket engine mission analysis

    SciTech Connect

    Ramsthaler, J.; Farbman, G.; Sulmeisters, T.; Buden, D.; Harris, P.

    1987-12-01

    The use of a derivative of the NERVA engine developed from 1955 to 1973 was evaluated for potential application to Air Force orbital transfer and maneuvering missions in the time period 1995 to 2020. The NERVA stage was found to have lower life cycle costs (LCC) than an advanced chemical stage for performing low earth orbit (LEO) to geosynchronous orbit (GEO) missions at any level of activity greater than three missions per year. It had lower life cycle costs than a high-performance nuclear electric engine at any level of LEO-to-GEO mission activity. An examination of all unmanned orbital transfer and maneuvering missions from the Space Transportation Architecture Study (STAS 111-3) indicated a LCC advantage for the NERVA stage over the advanced chemical stage of fifteen million dollars. The cost advantage accrued from both the orbital transfer and the maneuvering missions. Parametric analyses showed that the specific impulse of the NERVA stage and the cost of delivering material to low earth orbit were the most significant factors in the LCC advantage over the chemical stage. Lower development costs and a higher thrust gave the NERVA engine an LCC advantage over the nuclear electric stage. An examination of technical data from the Rover/NERVA program indicated that development of the NERVA stage has a low technical risk, and the potential for high reliability and safe operation. The data indicated the NERVA engine had a great flexibility which would permit a single stage to perform all Air Force missions.

  4. Improved equilibrium reconstructions by advanced statistical weighting of the internal magnetic measurements.

    PubMed

    Murari, A; Gelfusa, M; Peluso, E; Gaudio, P; Mazon, D; Hawkes, N; Point, G; Alper, B; Eich, T

    2014-12-01

    In a Tokamak the configuration of the magnetic fields remains the key element to improve performance and to maximise the scientific exploitation of the device. On the other hand, the quality of the reconstructed fields depends crucially on the measurements available. Traditionally, in the least squares minimisation phase of the algorithms used to obtain the magnetic field topology, all the diagnostics are given the same weights, apart from a corrective factor taking into account the error bars. This assumption unduly penalises complex diagnostics, such as polarimetry, which have a limited number of highly significant measurements. A completely new method to choose the weights to be given to the internal measurements of the magnetic fields, for improved equilibrium reconstructions, is presented in this paper. The approach is based on various statistical indicators applied to the residuals, the differences between the actual measurements and their estimates from the reconstructed equilibrium. The potential of the method is exemplified using the measurements of the Faraday rotation derived from the JET polarimeter. The results indicate quite clearly that the weights have to be determined carefully, since an inappropriate choice can have significant repercussions on the quality of the magnetic reconstruction both in the edge and in the core. These results confirm the limitations of the assumption that all the diagnostics have to be given the same weight, irrespective of the number of measurements they provide and the region of the plasma they probe.
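The weighting idea can be illustrated with a toy linear inversion; everything below (the response matrix, channel error bars, weights) is hypothetical, and the actual equilibrium codes solve a far larger nonlinear problem:

```python
import numpy as np

# Hypothetical linear measurement model m = G p + noise, where p are
# equilibrium parameters and each diagnostic channel i has error bar sigma_i.
rng = np.random.default_rng(1)
G = rng.normal(size=(40, 3))            # assumed geometry/response matrix
p_true = np.array([1.0, -0.5, 2.0])
sigma = np.linspace(0.1, 1.0, 40)       # per-channel error bars
m = G @ p_true + rng.normal(scale=sigma)

# Weighted least squares: minimise || W (G p - m) ||^2 with W = diag(1/sigma).
# The paper's point is that these weights need not stop at 1/sigma: statistical
# indicators on the residuals can motivate further per-diagnostic reweighting.
W = np.diag(1.0 / sigma)
p_hat, *_ = np.linalg.lstsq(W @ G, W @ m, rcond=None)
residuals = m - G @ p_hat               # residuals drive the weight diagnostics
```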

  5. Statistical analysis of 1D HRR target features

    NASA Astrophysics Data System (ADS)

    Gross, David C.; Schmitz, James L.; Williams, Robert L.

    2000-08-01

    Automatic target recognition (ATR) and feature-aided tracking (FAT) algorithms that use one-dimensional (1-D) high range resolution (HRR) profiles require unique or distinguishable target features. This paper explores the use of statistical measures to quantify the separability and stability of ground target features found in HRR profiles. Measures of stability, such as the mean and variance, can be used to determine the stability of a target feature as a function of the target aspect and elevation angle. Statistical measures of feature predictability and separability, such as the Fisher and Bhattacharyya measures, demonstrate the capability to adequately predict the desired target feature over a specified aspect angular region. These statistical measures for separability and stability are explained in detail and their usefulness is demonstrated with measured HRR data.
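For scalar features with roughly Gaussian class statistics, the two separability measures named above have simple closed forms; the sketch below uses synthetic feature samples, not measured HRR data:

```python
import numpy as np

def fisher_ratio(x1, x2):
    """Fisher discriminant ratio for a scalar feature from two classes."""
    return (x1.mean() - x2.mean()) ** 2 / (x1.var() + x2.var())

def bhattacharyya_gauss(x1, x2):
    """Bhattacharyya distance between 1-D Gaussian models of the two classes."""
    m1, m2, v1, v2 = x1.mean(), x2.mean(), x1.var(), x2.var()
    return (0.25 * (m1 - m2) ** 2 / (v1 + v2)
            + 0.5 * np.log((v1 + v2) / (2.0 * np.sqrt(v1 * v2))))

rng = np.random.default_rng(2)
feat_a = rng.normal(10.0, 1.0, 500)   # hypothetical HRR feature, target A
feat_b = rng.normal(13.0, 1.5, 500)   # same feature, target B
```

Larger values of either measure indicate a feature that separates the two targets better; evaluating them over aspect-angle bins gives the stability-versus-aspect picture the abstract describes.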

  6. Statistical parameters and analysis of local contrast gloss.

    PubMed

    Oksman, Antti; Juuti, Mikko; Peiponen, Kai-Erik

    2008-08-04

    Recently, we introduced a sensor for the detection of local contrast gloss (or luster) of products. This is a new development in contrast gloss measurement, since contrast gloss has previously been measured only over a macroscopic area. Consequently, no statistical parameters yet exist for the classification of contrast gloss, as they do, e.g., for the classification of surface roughness. In this study, we define novel statistical parameters for the diffuse component and contrast gloss obtained by the sensor for the detection of local contrast gloss. As an example, we apply these statistical parameters, together with measured specular gloss, diffuse-component, and contrast gloss maps, to the characterization of prints.

  7. Image analysis in medical imaging: recent advances in selected examples.

    PubMed

    Dougherty, G

    2010-01-01

    Medical imaging has developed into one of the most important fields within scientific imaging due to the rapid and continuing progress in computerised medical image visualisation and advances in analysis methods and computer-aided diagnosis. Several research applications are selected to illustrate the advances in image analysis algorithms and visualisation. Recent results, including previously unpublished data, are presented to illustrate the challenges and ongoing developments.

  8. Gene Identification Algorithms Using Exploratory Statistical Analysis of Periodicity

    NASA Astrophysics Data System (ADS)

    Mukherjee, Shashi Bajaj; Sen, Pradip Kumar

    2010-10-01

    Studying periodic patterns is a natural line of attack for recognizing DNA sequences in gene identification and similar problems, yet surprisingly little significant work has been done in this direction. This paper studies statistical properties of complete-genome DNA sequences using a new technique. A DNA sequence is converted to a numeric sequence using various types of mappings, and standard Fourier techniques are applied to study the periodicity. Distinct statistical behaviour of the periodicity parameters is found in coding and non-coding sequences, which can be used to distinguish between these parts. Here, DNA sequences of Drosophila melanogaster were analyzed with significant accuracy.
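A minimal sketch of the mapping-plus-Fourier idea: binary indicator mappings are one common choice (the paper considers several mappings), and coding DNA typically shows a spectral peak at period 3. The sequence below is artificial, not genomic data:

```python
import numpy as np

def periodicity_spectrum(seq):
    """Map a DNA string to four binary indicator sequences (one per base),
    FFT each, and sum the power spectra. Coding regions typically show a
    peak at the period-3 component, index k = N/3."""
    N = len(seq)
    power = np.zeros(N)
    for base in "ACGT":
        u = np.array([1.0 if s == base else 0.0 for s in seq])
        power += np.abs(np.fft.fft(u)) ** 2
    return power

# Toy sequence with an exact period-3 pattern (hypothetical example)
seq = "ATG" * 100
p = periodicity_spectrum(seq)
k3 = len(seq) // 3   # index of the period-3 component
```

For this perfectly periodic toy sequence, the spectrum is zero at every non-trivial frequency except the period-3 component, which is why the peak-at-`k3` test below holds exactly.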

  9. Interfaces between statistical analysis packages and the ESRI geographic information system

    NASA Technical Reports Server (NTRS)

    Masuoka, E.

    1980-01-01

    Interfaces between ESRI's geographic information system (GIS) data files and real valued data files written to facilitate statistical analysis and display of spatially referenced multivariable data are described. An example of data analysis which utilized the GIS and the statistical analysis system is presented to illustrate the utility of combining the analytic capability of a statistical package with the data management and display features of the GIS.

  10. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances

    PubMed Central

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume per minute in a state of intense exercise. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, many studies have been conducted in recent years to predict the VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, and cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machines, multilayer perceptrons, general regression neural networks, and multiple linear regression. The purpose of this study is to give a detailed overview of the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of the various VO2max prediction models reported in the related literature in terms of two well-known metrics, namely, the multiple correlation coefficient (R) and the standard error of estimate. The survey results reveal that, with respect to the regression methods used to develop prediction models, the support vector machine in general shows better performance than other methods, whereas multiple linear regression exhibits the worst performance. PMID:26346869

  11. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances.

    PubMed

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume per minute in a state of intense exercise. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, many studies have been conducted in recent years to predict the VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, and cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machines, multilayer perceptrons, general regression neural networks, and multiple linear regression. The purpose of this study is to give a detailed overview of the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of the various VO2max prediction models reported in the related literature in terms of two well-known metrics, namely, the multiple correlation coefficient (R) and the standard error of estimate. The survey results reveal that, with respect to the regression methods used to develop prediction models, the support vector machine in general shows better performance than other methods, whereas multiple linear regression exhibits the worst performance.
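The two reporting metrics named in the abstract, R and the standard error of estimate (SEE), are easy to compute for any fitted model. The sketch below fits a plain multiple linear regression to synthetic, hypothetical predictor data (the surveyed studies use real participant measurements, and many use SVM or neural-network regressors instead):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical predictor set: age, BMI, self-reported activity score
n = 150
X = np.column_stack([rng.uniform(18, 60, n),    # age (years)
                     rng.uniform(18, 35, n),    # BMI
                     rng.uniform(0, 10, n)])    # activity score
# Synthetic response with known coefficients plus noise (illustrative only)
vo2max = 60 - 0.3 * X[:, 0] - 0.5 * X[:, 1] + 1.2 * X[:, 2] + rng.normal(0, 3, n)

A = np.column_stack([np.ones(n), X])            # design matrix with intercept
beta, *_ = np.linalg.lstsq(A, vo2max, rcond=None)
pred = A @ beta

# Multiple correlation coefficient (R) and standard error of estimate (SEE)
R = np.corrcoef(pred, vo2max)[0, 1]
SEE = np.sqrt(np.sum((vo2max - pred) ** 2) / (n - A.shape[1]))
```

Reporting both matters: R measures how well predictions track the measured values, while SEE expresses the typical prediction error in the units of VO2max itself.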

  12. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    NASA Technical Reports Server (NTRS)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  13. The Power of Statistical Tests for Moderators in Meta-Analysis

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Pigott, Therese D.

    2004-01-01

    Calculation of the statistical power of statistical tests is important in planning and interpreting the results of research studies, including meta-analyses. It is particularly important in moderator analyses in meta-analysis, which are often used as sensitivity analyses to rule out moderator effects but also may have low statistical power. This…
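As background for the abstract's concern, the power of a two-sided z-type test depends only on the ratio of the effect to its standard error. A minimal sketch (not the authors' formulation, which covers several meta-analytic moderator test statistics) at a fixed alpha of 0.05:

```python
import math

def z_test_power(effect, se):
    """Power of a two-sided z-test at alpha = 0.05 for an effect estimated
    with standard error se: 1 - Phi(z_crit - lam) + Phi(-z_crit - lam),
    where lam = |effect| / se."""
    z_crit = 1.959963984540054            # Phi^{-1}(0.975)
    lam = abs(effect) / se
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return 1.0 - Phi(z_crit - lam) + Phi(-z_crit - lam)

# The classic ~80% power figure is recovered when effect/se is about 2.8
power = z_test_power(2.8, 1.0)
```

Because the standard error of a moderator contrast shrinks only with the number of *studies* in each subgroup, not the number of participants, such tests are often underpowered even in large meta-analyses.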

  14. Outliers in Statistical Analysis: Basic Methods of Detection and Accommodation.

    ERIC Educational Resources Information Center

    Jacobs, Robert

    Researchers are often faced with the prospect of dealing with observations within a given data set that are unexpected in terms of their great distance from the concentration of observations. For their potential to influence the mean disproportionately, thus affecting many statistical analyses, outlying observations require special care on the…

  15. Data Desk Professional: Statistical Analysis for the Macintosh.

    ERIC Educational Resources Information Center

    Wise, Steven L.; Kutish, Gerald W.

    This review of Data Desk Professional, a statistical software package for Macintosh microcomputers, includes information on: (1) cost and the amount and allocation of memory; (2) usability (documentation quality, ease of use); (3) running programs; (4) program output (quality of graphics); (5) accuracy; and (6) user services. In conclusion, it is…

  16. Private School Universe Survey, 1991-92. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Broughman, Stephen; And Others

    This report on the private school universe, a data collection system developed by the National Center for Education Statistics, presents data on schools with grades kindergarten through 12 by school size, school level, religious orientation, geographical region, and program emphasis. Numbers of students and teachers are reported in the same…

  17. Did Tanzania Achieve the Second Millennium Development Goal? Statistical Analysis

    ERIC Educational Resources Information Center

    Magoti, Edwin

    2016-01-01

    Development Goal "Achieve universal primary education", the challenges faced, along with the way forward towards achieving the fourth Sustainable Development Goal "Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all". Statistics show that Tanzania has made very promising steps…

  18. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  19. Advanced tracking systems design and analysis

    NASA Technical Reports Server (NTRS)

    Potash, R.; Floyd, L.; Jacobsen, A.; Cunningham, K.; Kapoor, A.; Kwadrat, C.; Radel, J.; Mccarthy, J.

    1989-01-01

    The results of an assessment of several types of high-accuracy tracking systems proposed to track the spacecraft in the National Aeronautics and Space Administration (NASA) Advanced Tracking and Data Relay Satellite System (ATDRSS) are summarized. Tracking systems based on the use of interferometry and ranging are investigated. For each system, the top-level system design and operations concept are provided. A comparative system assessment is presented in terms of orbit determination performance, ATDRSS impacts, life-cycle cost, and technological risk.

  20. Advanced surface design for logistics analysis

    NASA Astrophysics Data System (ADS)

    Brown, Tim R.; Hansen, Scott D.

    The development of anthropometric arm/hand and tool models and their manipulation in a large system model for maintenance simulation are discussed. The use of Advanced Surface Design and s-fig technology in anthropometrics, together with three-dimensional graphics simulation tools, is found to achieve a good balance between model manipulation speed and model accuracy. The present second-generation models are shown to be twice as fast to manipulate as the first-generation b-surf models, to be easier to manipulate into various configurations, and to more closely approximate human contours.

  1. Research on the integrative strategy of spatial statistical analysis of GIS

    NASA Astrophysics Data System (ADS)

    Xie, Zhong; Han, Qi Juan; Wu, Liang

    2008-12-01

    At present, spatial social and natural phenomena are studied with both GIS techniques and statistical methods. However, many complex practical applications restrict these research methods: the data models and technologies employed are highly localized. This paper first sums up the requirements of spatial statistical analysis. On the basis of these requirements, universal spatial statistical models are transformed into function tools in a statistical GIS system. A pyramidal structure of three layers is put forward, making it feasible to combine the techniques of spatial data management, search, and visualization in GIS with the data-processing methods of statistical analysis. The result is an integrative statistical GIS environment for the management, analysis, application, and decision support of spatial statistical information.

  2. Statistical analysis of modeling error in structural dynamic systems

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1990-01-01

    The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.

  3. Ambiguity and nonidentifiability in the statistical analysis of neural codes.

    PubMed

    Amarasingham, Asohan; Geman, Stuart; Harrison, Matthew T

    2015-05-19

    Many experimental studies of neural coding rely on a statistical interpretation of the theoretical notion of the rate at which a neuron fires spikes. For example, neuroscientists often ask, "Does a population of neurons exhibit more synchronous spiking than one would expect from the covariability of their instantaneous firing rates?" For another example, "How much of a neuron's observed spiking variability is caused by the variability of its instantaneous firing rate, and how much is caused by spike timing variability?" However, a neuron's theoretical firing rate is not necessarily well-defined. Consequently, neuroscientific questions involving the theoretical firing rate do not have a meaning in isolation but can only be interpreted in light of additional statistical modeling choices. Ignoring this ambiguity can lead to inconsistent reasoning or wayward conclusions. We illustrate these issues with examples drawn from the neural-coding literature.

  4. Statistical analysis of the lithospheric magnetic anomaly data

    NASA Astrophysics Data System (ADS)

    Pavon-Carrasco, Fco Javier; de Santis, Angelo; Ferraccioli, Fausto; Catalán, Manuel; Ishihara, Takemi

    2013-04-01

    Different analyses carried out on the lithospheric magnetic anomaly data from the GEODAS DVD v5.0.10 database (World Digital Magnetic Anomaly Map, WDMAM) show that the data distribution is not Gaussian, but Laplacian. Although this behaviour has been pointed out in previous works (e.g., Walker and Jackson, Geophys. J. Int., 143, 799-808, 2000), no explanation of this statistical property of the magnetic anomalies has been given. In this work, we perform different statistical tests to confirm that the lithospheric magnetic anomaly data indeed follow a Laplacian distribution, and we also give a possible interpretation of this behaviour, providing a model of magnetization which depends on the variation of the geomagnetic field and on both induced and remanent magnetizations in the terrestrial lithosphere.
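One simple way to pose the Gaussian-versus-Laplacian question is to compare the maximized log-likelihoods of the two fits (the paper applies a range of formal statistical tests). The sketch below uses synthetic anomaly values, not the GEODAS/WDMAM data:

```python
import numpy as np

def gauss_loglik(x):
    """Maximized Gaussian log-likelihood (MLE: mean and std)."""
    mu, sd = x.mean(), x.std()
    return -0.5 * len(x) * np.log(2 * np.pi * sd ** 2) \
           - np.sum((x - mu) ** 2) / (2 * sd ** 2)

def laplace_loglik(x):
    """Maximized Laplace log-likelihood (MLE: median and mean |deviation|)."""
    mu = np.median(x)
    b = np.mean(np.abs(x - mu))
    return -len(x) * np.log(2 * b) - np.sum(np.abs(x - mu)) / b

rng = np.random.default_rng(4)
anomalies = rng.laplace(0.0, 100.0, 5000)   # synthetic anomaly values (nT)
# On heavy-tailed, Laplace-distributed data the Laplacian fit wins
better_laplace = laplace_loglik(anomalies) > gauss_loglik(anomalies)
```

On real anomaly data, the same comparison (or an information criterion built from these log-likelihoods) quantifies how strongly the Laplacian model is preferred over the Gaussian one.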

  5. Statistical Analysis of CFD Solutions from the Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.

    2002-01-01

    A simple, graphical framework is presented for robust statistical evaluation of results obtained from N-Version testing of a series of RANS CFD codes. The solutions were obtained by a variety of code developers and users for the June 2001 Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration used for the computational tests is the DLR-F4 wing-body combination previously tested in several European wind tunnels and for which a previous N-Version test had been conducted. The statistical framework is used to evaluate code results for (1) a single cruise design point, (2) drag polars and (3) drag rise. The paper concludes with a discussion of the meaning of the results, especially with respect to predictability, validation, and reporting of solutions.

  6. Ambiguity and nonidentifiability in the statistical analysis of neural codes

    PubMed Central

    Amarasingham, Asohan; Geman, Stuart; Harrison, Matthew T.

    2015-01-01

    Many experimental studies of neural coding rely on a statistical interpretation of the theoretical notion of the rate at which a neuron fires spikes. For example, neuroscientists often ask, “Does a population of neurons exhibit more synchronous spiking than one would expect from the covariability of their instantaneous firing rates?” For another example, “How much of a neuron’s observed spiking variability is caused by the variability of its instantaneous firing rate, and how much is caused by spike timing variability?” However, a neuron’s theoretical firing rate is not necessarily well-defined. Consequently, neuroscientific questions involving the theoretical firing rate do not have a meaning in isolation but can only be interpreted in light of additional statistical modeling choices. Ignoring this ambiguity can lead to inconsistent reasoning or wayward conclusions. We illustrate these issues with examples drawn from the neural-coding literature. PMID:25934918

  7. Statistical analysis of motion contrast in optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Cheng, Yuxuan; Guo, Li; Pan, Cong; Lu, Tongtong; Hong, Tianyu; Ding, Zhihua; Li, Peng

    2015-11-01

    Optical coherence tomography angiography (Angio-OCT), mainly based on the temporal dynamics of OCT scattering signals, has found a range of potential applications in clinical and scientific research. Based on the model of random phasor sums, temporal statistics of the complex-valued OCT signals are mathematically described. Statistical distributions of the amplitude differential and complex differential Angio-OCT signals are derived. The theories are validated through the flow phantom and live animal experiments. Using the model developed, the origin of the motion contrast in Angio-OCT is mathematically explained, and the implications in the improvement of motion contrast are further discussed, including threshold determination and its residual classification error, averaging method, and scanning protocol. The proposed mathematical model of Angio-OCT signals can aid in the optimal design of the system and associated algorithms.

  8. Statistical Methods for Rapid Aerothermal Analysis and Design Technology

    NASA Technical Reports Server (NTRS)

    Morgan, Carolyn; DePriest, Douglas; Thompson, Richard (Technical Monitor)

    2002-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to establish statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The research work was focused on establishing the suitable mathematical/statistical models for these purposes. It is anticipated that the resulting models can be incorporated into a software tool to provide rapid, variable-fidelity, aerothermal environments to predict heating along an arbitrary trajectory. This work will support development of an integrated design tool to perform automated thermal protection system (TPS) sizing and material selection.

  9. A Statistical Framework for the Functional Analysis of Metagenomes

    SciTech Connect

    Sharon, Itai; Pati, Amrita; Markowitz, Victor; Pinter, Ron Y.

    2008-10-01

    Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. The authors present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimates, which can be used for removing seemingly unreliable measurements. They tested the method on a wide range of datasets, including simulated genomes and real WGS data from whole-genome sequencing projects. Results suggest that the framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.
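The Lander-Waterman theory referenced above models per-base read coverage as Poisson. A minimal sketch of the resulting expected-coverage estimate, with illustrative numbers (the paper builds gene family frequency estimators on top of this model, which is not reproduced here):

```python
import math

def expected_coverage_fraction(n_reads, read_len, genome_len):
    """Lander-Waterman expected fraction of a genome covered by random
    shotgun reads. Per-base coverage is Poisson with mean c = N*L/G,
    so P(base covered) = 1 - e^{-c}."""
    c = n_reads * read_len / genome_len
    return 1.0 - math.exp(-c)

# Illustrative numbers: 50,000 reads of 700 bp over a 5 Mb genome (c = 7)
frac = expected_coverage_fraction(n_reads=50000, read_len=700, genome_len=5e6)
```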

  10. A statistical analysis of hard X-Ray solar flares

    NASA Technical Reports Server (NTRS)

    Pearce, G.; Rowe, A. K.; Yeung, J.

    1993-01-01

    In this study we perform a statistical analysis of 8319 X-ray solar flares observed with the Hard X-Ray Burst Spectrometer (HXRBS) on the Solar Maximum Mission (SMM) satellite. The events are examined in terms of their durations, maximum intensities, and intensity profiles. It is concluded that there is no evidence for a correlation between flare intensity, flare duration, and flare asymmetry. However, we do find evidence for a rapid fall-off in the number of short-duration events.
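A duration-intensity correlation check of the kind reported here can be illustrated with a plain Pearson coefficient; the record does not specify the paper's exact statistic, so this is only an illustrative stand-in:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)
```

A value near zero for (duration, intensity) pairs would be consistent with the "no correlation" conclusion above.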

  11. Statistical analysis of mammalian pre-mRNA splicing sites.

    PubMed Central

    Gelfand, M S

    1989-01-01

    222 donor and 222 acceptor (including 206 pairs) non-homologous splicing sites were studied. Well-known features of these sites were confirmed and some novel observations were made: (1) a cCAGGGag signal in the (-60)-(-58) region of acceptor sites; (2) strong complementarity between the (-69)-(-55) and (-36)-(-22) regions of some of the acceptor sites; and (3) a small but statistically significant correlation between the discrimination energies of corresponding donor and acceptor sites. PMID:2528123

  12. Computational and Statistical Analysis of Protein Mass Spectrometry Data

    PubMed Central

    Noble, William Stafford; MacCoss, Michael J.

    2012-01-01

    High-throughput proteomics experiments involving tandem mass spectrometry produce large volumes of complex data that require sophisticated computational analyses. As such, the field offers many challenges for computational biologists. In this article, we briefly introduce some of the core computational and statistical problems in the field and then describe a variety of outstanding problems that readers of PLoS Computational Biology might be able to help solve. PMID:22291580

  13. Application of multivariate statistical analysis to STEM X-ray spectral images: interfacial analysis in microelectronics.

    PubMed

    Kotula, Paul G; Keenan, Michael R

    2006-12-01

    Multivariate statistical analysis methods have been applied to scanning transmission electron microscopy (STEM) energy-dispersive X-ray spectral images. The particular application of the multivariate curve resolution (MCR) technique provides a high spectral contrast view of the raw spectral image. The power of this approach is demonstrated with a microelectronics failure analysis. Specifically, an unexpected component describing a chemical contaminant was found, as well as a component consistent with a foil thickness change associated with the focused ion beam specimen preparation process. The MCR solution is compared with a conventional analysis of the same spectral image data set.

  14. Recent Advances in Anthocyanin Analysis and Characterization

    PubMed Central

    Welch, Cara R.; Wu, Qingli; Simon, James E.

    2009-01-01

    Anthocyanins are a class of polyphenols responsible for the orange, red, purple and blue colors of many fruits, vegetables, grains, flowers and other plants. Anthocyanin consumption has been linked to protection against many chronic diseases, and these compounds possess strong antioxidant properties leading to a variety of health benefits. In this review, we examine advances in the chemical profiling of natural anthocyanins in plant and biological matrices using various chromatographic separations (HPLC and CE) coupled with different detection systems (UV, MS and NMR). An overview of anthocyanin chemistry, prevalence in plants, biosynthesis and metabolism, bioactivities and health properties, sample preparation and phytochemical investigations is given, while the major focus is a comparison of the advantages and disadvantages of each analytical technique. PMID:19946465

  15. Analysis of an advanced technology subsonic turbofan incorporating revolutionary materials

    NASA Technical Reports Server (NTRS)

    Knip, Gerald, Jr.

    1987-01-01

    Successful implementation of revolutionary composite materials in an advanced turbofan offers the possibility of further improvements in engine performance and thrust-to-weight ratio relative to current metallic materials. The present analysis determines the approximate engine cycle and configuration for an early 21st century subsonic turbofan incorporating all-composite materials. The advanced engine is evaluated relative to a current-technology baseline engine in terms of its potential fuel savings for an intercontinental quadjet having a design range of 5500 nmi and a payload of 500 passengers. The resultant near-optimum, uncooled, two-spool, advanced engine has an overall pressure ratio of 87, a bypass ratio of 18, a geared fan, and a turbine rotor inlet temperature of 3085 R. These improvements result in a 33-percent fuel saving for the specified mission. Various advanced composite materials are used throughout the engine. For example, advanced polymer composite materials are used for the fan and the low pressure compressor (LPC).

  16. Statistical Analysis of the Nonhomogeneity Detector for Non-Gaussian Interference Backgrounds

    DTIC Science & Technology

    2005-06-01

    This journal article (June 2005) presents a statistical analysis of the nonhomogeneity detector for non-Gaussian interference backgrounds. The non-Gaussian interference scenario is assumed to be modeled by a spherically invariant random process (SIRP).

  17. ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization

    SciTech Connect

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; /CERN /CERN

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece of these analysis tools is the set of histogram classes which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally…

  18. Inferring Species Richness and Turnover by Statistical Multiresolution Texture Analysis of Satellite Imagery

    DTIC Science & Technology

    2012-10-24

    This work considers time series similarity measures for classification and change detection of ecosystem dynamics, uses entropy for estimating species richness, and introduces a method based on statistical wavelet multiresolution texture analysis to quantitatively assess species richness and turnover from satellite imagery.

  19. Statistics Education Research in Malaysia and the Philippines: A Comparative Analysis

    ERIC Educational Resources Information Center

    Reston, Enriqueta; Krishnan, Saras; Idris, Noraini

    2014-01-01

    This paper presents a comparative analysis of statistics education research in Malaysia and the Philippines by modes of dissemination, research areas, and trends. An electronic search for published research papers in the area of statistics education from 2000-2012 yielded 20 for Malaysia and 19 for the Philippines. Analysis of these papers showed…

  20. APPLICATION OF STATISTICAL ENERGY ANALYSIS TO VIBRATIONS OF MULTI-PANEL STRUCTURES.

    DTIC Science & Technology

    … cylindrical shell are compared with predictions obtained from statistical energy analysis. Generally good agreement is observed. The flow of mechanical … the coefficients of proportionality between power flow and average modal energy difference, which one must know in order to apply statistical energy analysis. …

  1. Linkage analysis of systolic blood pressure: a score statistic and computer implementation

    PubMed Central

    Wang, Kai; Peng, Yingwei

    2003-01-01

    A genome-wide linkage analysis was conducted on systolic blood pressure using a score statistic. The randomly selected Replicate 34 of the simulated data was used. The score statistic was applied to the sibships derived from the general pedigrees. An add-on R program to GENEHUNTER was developed for this analysis and is freely available. PMID:14975145

  2. Fatigue Crack Propagation: Probabilistic Modeling and Statistical Analysis.

    DTIC Science & Technology

    1988-03-23

    School of Physics "Enrico Fermi" (1986), eds. D. V. Lindley and C. A. Clarotti, Amsterdam: North Holland (with Morris H. DeGroot). An accelerated life … Festschrift in Honor of Ingram Olkin, 1988, editors Jim Press and Leon Jay Gleser (with Morris H. DeGroot and Maria J. Bayarri), New York: Springer-Verlag … Technical Report 389, Department of Statistics, Ohio State University (with Morris H. DeGroot). In this paper, the concepts of comparison of experiments in the context …

  3. Statistical analysis of multivariate atmospheric variables. [cloud cover

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.

    1979-01-01

    Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate data to near-normality; (5) a test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) a test of fit for continuous distributions based upon the generalized minimum chi-square; (7) the effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
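Item (8), generation of random variates from specified distributions, is classically done by inverse-transform sampling; a minimal sketch, using the exponential distribution as the worked example (the function names are illustrative):

```python
import math
import random

def sample_inverse_cdf(inv_cdf, n, rng):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then inv_cdf(U)
    follows the distribution whose CDF was inverted."""
    return [inv_cdf(rng.random()) for _ in range(n)]

# Example: exponential with rate lam has CDF F(x) = 1 - exp(-lam*x),
# so the inverse CDF is F^{-1}(u) = -ln(1 - u) / lam.
lam = 2.0
rng = random.Random(42)
draws = sample_inverse_cdf(lambda u: -math.log(1.0 - u) / lam, 100_000, rng)
```

The same pattern works for any distribution with a computable inverse CDF.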

  4. Statistical analysis of epidemiologic data of pregnancy outcomes

    SciTech Connect

    Butler, W.J.; Kalasinski, L.A.

    1989-02-01

    In this paper, a generalized logistic regression model for correlated observations is used to analyze epidemiologic data on the frequency of spontaneous abortion among a group of women office workers. The results are compared to those obtained from the use of the standard logistic regression model that assumes statistical independence among all the pregnancies contributed by one woman. In this example, the correlation among pregnancies from the same woman is fairly small and did not have a substantial impact on the magnitude of estimates of parameters of the model. This is due at least partly to the small average number of pregnancies contributed by each woman.
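The standard logistic regression model that assumes independence among observations can be fit by Newton-Raphson; a minimal single-covariate sketch (illustrative only; it does not implement the correlated-observations extension used in the paper):

```python
import math

def logistic_mle(xs, ys, iters=25):
    """Fit P(y=1|x) = 1/(1+exp(-(b0 + b1*x))) by Newton-Raphson, treating
    all observations as independent. Single covariate, for illustration."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            w = p * (1.0 - p)
            g0 += y - p          # gradient terms: X^T (y - p)
            g1 += (y - p) * x
            h00 += w             # observed information: X^T W X
            h01 += w * x
            h11 += w * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1
```

Accounting for within-woman correlation would replace the independence likelihood above with, e.g., a generalized estimating equations formulation.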

  5. JAWS data collection, analysis highlights, and microburst statistics

    NASA Technical Reports Server (NTRS)

    Mccarthy, J.; Roberts, R.; Schreiber, W.

    1983-01-01

    Organization, equipment, and the current status of the Joint Airport Weather Studies project, initiated in relation to the microburst phenomenon, are summarized. Some data collection techniques and preliminary statistics on microburst events recorded by Doppler radar are discussed as well. Radar studies show that microbursts occur much more often than expected, with the majority of the events being potentially dangerous to landing or departing aircraft. Seventy events were registered, with differential velocities ranging from 10 to 48 m/s; headwind/tailwind velocity differentials over 20 m/s are considered seriously hazardous. It is noted that a correlation is yet to be established between the velocity differential and incoherent radar reflectivity.

  6. Statistical Analysis of Noisy Signals Using Classification Tools

    SciTech Connect

    Thompson, Sandra E.; Heredia-Langner, Alejandro; Johnson, Timothy J.; Foster, Nancy S.; Valentine, Nancy B.; Amonette, James E.

    2005-06-04

    The potential use of chemical agents, biotoxins, and biological pathogens is a threat to military and police forces as well as the general public. Rapid identification of these agents is made difficult by the noisy nature of the signals obtained from portable, in-field sensors. In previously published articles, we created a flowchart that illustrated a method for triaging bacterial identification by combining standard statistical techniques for discrimination and identification with mid-infrared spectroscopic data. The present work documents the process of characterizing and eliminating the sources of the noise and outlines how multidisciplinary teams are necessary to accomplish that goal.

  7. Statistical distributions of potential interest in ultrasound speckle analysis.

    PubMed

    Nadarajah, Saralees

    2007-05-21

    Compound statistical modelling of the uncompressed envelope of the backscattered signal has received much interest recently. In this note, a comprehensive collection of models is derived for the uncompressed envelope of the backscattered signal by compounding the Nakagami distribution with 13 flexible families. The corresponding estimation procedures are derived by the method of moments and the method of maximum likelihood. The sensitivity of the models to their various parameters is examined. It is expected that this work could serve as a useful reference and lead to improved modelling of the uncompressed envelope of the backscattered signal.
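Compounding the Nakagami distribution can be illustrated by letting its scale parameter itself be random; here the gamma mixing distribution and all parameter values are illustrative choices for the sketch, not taken from the paper:

```python
import math
import random

def nakagami(rng, m, omega):
    """Nakagami(m, omega) draw: the square root of a Gamma(m, omega/m)
    variate, so that E[X^2] = omega."""
    return math.sqrt(rng.gammavariate(m, omega / m))

def compound_nakagami(rng, m, n, scale_shape=3.0, scale_mean=1.0):
    """Compound model: the Nakagami scale omega is itself random (here
    gamma-distributed with mean scale_mean), in the spirit of compound
    speckle models. The gamma mixing law is an assumption of this sketch."""
    draws = []
    for _ in range(n):
        omega = rng.gammavariate(scale_shape, scale_mean / scale_shape)
        draws.append(nakagami(rng, m, omega))
    return draws
```

Moment-based estimation would then match sample moments of such draws against the compound model's analytic moments.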

  8. Analysis of Bidirectional Associative Memory using Self-consistent Signal to Noise Analysis and Statistical Neurodynamics

    NASA Astrophysics Data System (ADS)

    Shouno, Hayaru; Kido, Shoji; Okada, Masato

    2004-09-01

    Bidirectional associative memory (BAM) is a kind of artificial neural network used to memorize and retrieve heterogeneous pattern pairs. Many efforts have been made to improve BAM from the viewpoint of computer application, but few theoretical studies have been done. We investigated the theoretical characteristics of BAM using a framework of statistical-mechanical analysis. To investigate the equilibrium state of BAM, we applied self-consistent signal-to-noise analysis (SCSNA) and obtained macroscopic parameter equations and the relative capacity. Moreover, to investigate not only the equilibrium state but also the retrieval process of reaching the equilibrium state, we applied statistical neurodynamics to the update rule of BAM and obtained evolution equations for the macroscopic parameters. These evolution equations are consistent with the results of SCSNA in the equilibrium state.

  9. Statistical analysis of bankrupting and non-bankrupting stocks

    NASA Astrophysics Data System (ADS)

    Li, Qian; Wang, Fengzhong; Wei, Jianrong; Liang, Yuan; Huang, Jiping; Stanley, H. Eugene

    2012-04-01

    The recent financial crisis has caused extensive world-wide economic damage, affecting in particular those who invested in companies that eventually filed for bankruptcy. A better understanding of stocks that become bankrupt would be helpful in reducing risk in future investments. Economists have conducted extensive research on this topic, and here we ask whether statistical physics concepts and approaches may offer insights into pre-bankruptcy stock behavior. To this end, we study all 20092 stocks listed in US stock markets for the 20-year period 1989-2008, including 4223 (21 percent) that became bankrupt during that period. We find that, surprisingly, the distributions of the daily returns of those stocks that become bankrupt differ significantly from those that do not. Moreover, these differences are consistent for the entire period studied. We further study the relation between the distribution of returns and the length of time until bankruptcy, and observe that larger differences of the distribution of returns correlate with shorter time periods preceding bankruptcy. This behavior suggests that sharper fluctuations in the stock price occur when the stock is closer to bankruptcy. We also analyze the cross-correlations between the return and the trading volume, and find that stocks approaching bankruptcy tend to have larger return-volume cross-correlations than stocks that are not. Furthermore, the difference increases as bankruptcy approaches. We conclude that before a firm becomes bankrupt its stock exhibits unusual behavior that is statistically quantifiable.
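Comparing return distributions of bankrupt versus non-bankrupt stocks calls for a distributional distance; a two-sample Kolmogorov-Smirnov statistic is one common, simple choice (this record does not specify the paper's own comparison method, so this is a stand-in):

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap between the
    two empirical CDFs. Ties are advanced through jointly so that identical
    samples give exactly zero."""
    a, b = sorted(a), sorted(b)
    na, nb = len(a), len(b)
    i = j = 0
    d = 0.0
    while i < na and j < nb:
        x = min(a[i], b[j])
        while i < na and a[i] == x:
            i += 1
        while j < nb and b[j] == x:
            j += 1
        d = max(d, abs(i / na - j / nb))
    return d
```

A larger statistic between the two groups' daily-return samples would quantify the "significantly different distributions" finding.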

  10. Advances in microfluidics for environmental analysis.

    PubMed

    Jokerst, Jana C; Emory, Jason M; Henry, Charles S

    2012-01-07

    During the past few years, a growing number of groups have recognized the utility of microfluidic devices for environmental analysis. Microfluidic devices offer a number of advantages and in many respects are ideally suited to environmental analyses. Challenges faced in environmental monitoring, including the ability to handle complex and highly variable sample matrices, continue to drive growth and research. Additionally, the need to operate for days to months in the field requires further development of robust, integrated microfluidic systems. This review examines recently published literature on the applications of microfluidic systems for environmental analysis and provides insight into the future direction of the field.

  11. Studies in astronomical time series analysis. II - Statistical aspects of spectral analysis of unevenly spaced data

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1982-01-01

    Detection of a periodic signal hidden in noise is frequently a goal in astronomical data analysis. This paper does not introduce a new detection technique, but instead studies the reliability and efficiency of detection with the most commonly used technique, the periodogram, in the case where the observation times are unevenly spaced. This choice was made because, of the methods in current use, it appears to have the simplest statistical behavior. A modification of the classical definition of the periodogram is necessary in order to retain the simple statistical behavior of the evenly spaced case. With this modification, periodogram analysis and least-squares fitting of sine waves to the data are exactly equivalent. Certain difficulties with the use of the periodogram are less important than commonly believed in the case of detection of strictly periodic signals. In addition, the standard method for mitigating these difficulties (tapering) can be used just as well if the sampling is uneven. An analysis of the statistical significance of signal detections is presented, with examples.
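The modified periodogram described here, now known as the Lomb-Scargle periodogram, is compact enough to sketch directly; this follows the standard formula with the tau offset that makes it equivalent to least-squares fitting of sine waves:

```python
import math
import random

def lomb_scargle(t, x, freqs):
    """Scargle's modified periodogram for unevenly sampled data. The tau
    phase shift makes the result identical to a least-squares sinusoid fit
    at each trial frequency."""
    n = len(x)
    xm = sum(x) / n
    xc = [v - xm for v in x]          # work with mean-subtracted data
    powers = []
    for f in freqs:
        w = 2.0 * math.pi * f
        s2 = sum(math.sin(2.0 * w * ti) for ti in t)
        c2 = sum(math.cos(2.0 * w * ti) for ti in t)
        tau = math.atan2(s2, c2) / (2.0 * w)
        cs = [math.cos(w * (ti - tau)) for ti in t]
        sn = [math.sin(w * (ti - tau)) for ti in t]
        ct = sum(c * v for c, v in zip(cs, xc))
        st = sum(s * v for s, v in zip(sn, xc))
        cc = sum(c * c for c in cs)
        ss = sum(s * s for s in sn)
        powers.append(0.5 * (ct * ct / cc + st * st / ss))
    return powers
```

For a pure sinusoid sampled at uneven times, the power peaks sharply at the true frequency.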

  12. Studies in astronomical time series analysis. II - Statistical aspects of spectral analysis of unevenly spaced data

    NASA Astrophysics Data System (ADS)

    Scargle, J. D.

    1982-12-01

    Detection of a periodic signal hidden in noise is frequently a goal in astronomical data analysis. This paper does not introduce a new detection technique, but instead studies the reliability and efficiency of detection with the most commonly used technique, the periodogram, in the case where the observation times are unevenly spaced. This choice was made because, of the methods in current use, it appears to have the simplest statistical behavior. A modification of the classical definition of the periodogram is necessary in order to retain the simple statistical behavior of the evenly spaced case. With this modification, periodogram analysis and least-squares fitting of sine waves to the data are exactly equivalent. Certain difficulties with the use of the periodogram are less important than commonly believed in the case of detection of strictly periodic signals. In addition, the standard method for mitigating these difficulties (tapering) can be used just as well if the sampling is uneven. An analysis of the statistical significance of signal detections is presented, with examples.

  13. Advanced Durability Analysis. Volume 1. Analytical Methods

    DTIC Science & Technology

    1987-07-31

    … for microstructural behavior. This approach for representing the IFQ, when properly used, can provide reasonable durability analysis results for … equivalent initial flaw size distribution (EIFSD) function. Engineering principles rather than mechanistic-based theories for microstructural behavior are … accurate EIFS distribution and a service crack growth behavior. The determinations of EIFS distribution have been described in detail previously. In this …

  14. Modeling and analysis of advanced binary cycles

    SciTech Connect

    Gawlik, K.

    1997-12-31

    A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows for incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied spanning the range of 265°F to 375°F. A variety of isobutane and propane based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.

  15. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  16. Analysis of surface sputtering on a quantum statistical basis

    NASA Technical Reports Server (NTRS)

    Wilhelm, H. E.

    1975-01-01

    Surface sputtering is explained theoretically by means of a 3-body sputtering mechanism involving the ion and two surface atoms of the solid. By means of quantum-statistical mechanics, a formula for the sputtering ratio S(E) is derived from first principles. At low ion energies, the theoretical sputtering ratio S(E) is proportional to the square of the difference between the incident ion energy and the threshold energy for sputtering of surface atoms, in agreement with experiment. Extrapolation of the theoretical sputtering formula to larger ion energies indicates that S(E) reaches a saturation value and finally decreases at high ion energies. The theoretical sputtering ratios S(E) for wolfram, tantalum, and molybdenum are compared with the corresponding experimental sputtering curves in the low-energy region from the threshold sputtering energy to 120 eV above the respective threshold energy. Theory and experiment are shown to be in good agreement.
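The near-threshold quadratic law stated in the abstract is easy to express directly; the constant k is material-dependent (the value used in the test is hypothetical), and the high-energy saturation and decline are deliberately not modeled in this sketch:

```python
def sputtering_ratio(E, E_th, k):
    """Near-threshold law from the abstract: S(E) = k * (E - E_th)**2 for
    ion energies E above the threshold E_th, and zero below it. Valid only
    in the low-energy regime; saturation at high E is not modeled here."""
    return k * (E - E_th) ** 2 if E > E_th else 0.0
```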

  17. Statistical multi-site fatigue damage analysis model

    NASA Astrophysics Data System (ADS)

    Wang, G. S.

    1995-02-01

    A statistical model has been developed to evaluate fatigue damage at multiple sites in complex joints based on coupon test data and fracture mechanics methods. The model is similar to the USAF model, but is modified by introducing a failure criterion and a probability of fatal crack occurrence to account for the multiple-site damage phenomenon. NDI techniques have been incorporated in the model, which can be used to evaluate structural reliability, the detectability of fatigue damage (cracks), and the risk of failure based on NDI results taken from samples. A practical example is provided for rivet fasteners and bolted fasteners. It is shown that the model can be used even when based on conventional S-N coupon experiments, provided that further fractographic inspections are made for cracks on the broken surfaces of specimens.

  18. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase is focused on assessing the performance of these models to accurately predict the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  19. Statistical methods for the geographical analysis of rare diseases.

    PubMed

    Gómez-Rubio, Virgilio; López-Quílez, Antonio

    2010-01-01

    In this chapter we provide a summary of different methods for the detection of disease clusters. First of all, we give a summary of methods for computing estimates of the relative risk. These estimates provide smoothed values of the relative risks that can account for its spatial variation. Some methods for assessing spatial autocorrelation and general clustering are also discussed to test for significant spatial variation of the risk. In order to find the actual location of the clusters, scan methods are introduced. The spatial scan statistic is discussed as well as its extension by means of Generalised Linear Models that allows for the inclusion of covariates and cluster effects. In this context, zero-inflated models are introduced to account for the high number of zeros that appear when studying rare diseases. Finally, two applications of these methods are shown using data of Systemic Lupus Erythematosus in Spain and brain cancer in Navarre (Spain).
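The spatial scan statistic mentioned above evaluates, for each candidate zone, a log-likelihood ratio comparing disease risk inside and outside the zone. A minimal Poisson-model sketch over contiguous 1-D windows (a stand-in for real spatial zones; the data in the test are invented):

```python
import math

def poisson_llr(c, e, C):
    """Poisson log-likelihood ratio for a zone with c observed and e
    expected cases out of C total cases; zero unless the zone's observed
    count exceeds its expectation."""
    if c <= e or C <= c or C <= e:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

def scan_contiguous(cases, expected):
    """Scan every contiguous window of areas and return (best LLR,
    (start, end)) for the most anomalous window. Real scan statistics use
    circular or irregular spatial zones; contiguous windows keep the sketch
    one-dimensional."""
    C = sum(cases)
    best = (0.0, None)
    n = len(cases)
    for i in range(n):
        c = e = 0.0
        for j in range(i, n):
            c += cases[j]
            e += expected[j]
            llr = poisson_llr(c, e, C)
            if llr > best[0]:
                best = (llr, (i, j))
    return best
```

In practice the maximum LLR is compared against a Monte Carlo null distribution to obtain a p-value.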

  20. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
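Sum-product loopy belief propagation on a binary pairwise MRF can be sketched compactly; on a tree-structured graph (as in the test below) it reproduces exact marginals, which makes a convenient correctness check. This is generic LBP, not the replica cluster variation analysis of the paper:

```python
def loopy_bp(n, unary, pairwise, edges, iters=50):
    """Sum-product LBP on a binary pairwise MRF.
    unary: {node: [phi(0), phi(1)]}; pairwise: {(a, b): 2x2 psi matrix};
    edges: list of (a, b). Returns normalized per-node beliefs."""
    msgs = {}
    nbrs = {i: [] for i in range(n)}
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
        msgs[(a, b)] = [1.0, 1.0]
        msgs[(b, a)] = [1.0, 1.0]

    def psi(i, j, xi, xj):
        # look up the pairwise potential in whichever orientation it was given
        return pairwise[(i, j)][xi][xj] if (i, j) in pairwise else pairwise[(j, i)][xj][xi]

    for _ in range(iters):
        new = {}
        for (i, j) in msgs:
            out = [0.0, 0.0]
            for xj in (0, 1):
                for xi in (0, 1):
                    prod = unary[i][xi]
                    for k in nbrs[i]:
                        if k != j:
                            prod *= msgs[(k, i)][xi]
                    out[xj] += psi(i, j, xi, xj) * prod
            z = out[0] + out[1]
            new[(i, j)] = [out[0] / z, out[1] / z]
        msgs = new

    beliefs = []
    for i in range(n):
        b = [unary[i][0], unary[i][1]]
        for k in nbrs[i]:
            b = [b[x] * msgs[(k, i)][x] for x in (0, 1)]
        z = b[0] + b[1]
        beliefs.append([b[0] / z, b[1] / z])
    return beliefs
```

The quenched-average analysis in the paper studies how such messages behave when the unary fields are drawn at random.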

  1. Statistical analysis of Nomao customer votes for spots of France

    NASA Astrophysics Data System (ADS)

    Pálovics, Róbert; Daróczy, Bálint; Benczúr, András; Pap, Julia; Ermann, Leonardo; Phan, Samuel; Chepelianskii, Alexei D.; Shepelyansky, Dima L.

    2015-08-01

    We investigate the statistical properties of customer votes for spots of France collected by the startup company Nomao. The frequencies of votes per spot and per customer are characterized by a power-law distribution which remains stable on a time scale of a decade when the number of votes varies by almost two orders of magnitude. Using computer science methods we explore the spectrum and the eigenvalues of a matrix containing user ratings of geolocalized items. Eigenvalues map nicely to large towns and regions but show a certain level of instability as we modify the interpretation of the underlying matrix. We evaluate imputation strategies that provide improved prediction performance by reaching geographically smooth eigenvectors. We point out possible links between the distribution of votes and the phenomenon of self-organized criticality.
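A power-law exponent for such vote-frequency distributions can be estimated with the standard continuous maximum-likelihood (Hill-type) estimator; the sample generation below is synthetic, not Nomao data:

```python
import math
import random

def powerlaw_mle(xs, xmin=1.0):
    """Continuous power-law exponent MLE for samples x >= xmin:
    gamma_hat = 1 + n / sum(ln(x / xmin)), where the density is
    p(x) proportional to x**(-gamma)."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

rng = random.Random(1)
# paretovariate(a) draws from density a / x**(a + 1) on x >= 1,
# i.e. a power law with exponent gamma = a + 1 (here gamma = 2.5).
samples = [rng.paretovariate(1.5) for _ in range(5000)]
```

With a few thousand samples the estimate lands close to the true exponent.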

  2. Statistical analysis of a global photochemical model of the atmosphere

    NASA Astrophysics Data System (ADS)

    Frol'Kis, V. A.; Karol', I. L.; Kiselev, A. A.; Ozolin, Yu. E.; Zubov, V. A.

    2007-08-01

    This is a study of the sensitivity of model results (the atmospheric content of the main gas constituents and the radiative characteristics of the atmosphere) to errors in the emissions of a number of atmospheric gaseous pollutants. The groups of model variables most dependent on these errors are selected. Two variants of emissions are considered: one without their evolution and the other with their variation according to the IPCC scenario. The estimates are made on the basis of standard statistical methods for the results obtained with the detailed one-dimensional radiative-photochemical model of the Main Geophysical Observatory (MGO). Some approaches to such estimations with models of higher complexity, and to the solution of the inverse problem (i.e., the estimation of the necessary accuracy of external model parameters for obtaining a given accuracy of model results), are outlined.

  3. Advancing Usability Evaluation through Human Reliability Analysis

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2005-07-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety-critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at a usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis for heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
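    The quantitative step described above (probabilistic modifiers applied to heuristics) can be sketched as a simple multiplicative model. This is an illustrative stand-in, not the actual SPAR-H procedure: the heuristic names, the nominal probability, and the multiplier values are all hypothetical.

```python
# Hypothetical sketch of a SPAR-H-style usability error probability (UEP):
# a nominal error probability is scaled by multipliers assigned to
# usability heuristics acting as performance shaping factors (PSFs).
# All names and values below are illustrative, not from the paper.

NOMINAL_ERROR_PROBABILITY = 0.01

def usability_error_probability(psf_multipliers):
    """Scale the nominal probability by the product of PSF multipliers,
    capping the result at 1.0 as SPAR-H-style methods do."""
    uep = NOMINAL_ERROR_PROBABILITY
    for multiplier in psf_multipliers.values():
        uep *= multiplier
    return min(uep, 1.0)

# Example: two violated heuristics worsen performance (multiplier > 1),
# one well-supported heuristic improves it (multiplier < 1).
psfs = {
    "visibility_of_system_status": 5.0,   # violated
    "error_prevention": 2.0,              # violated
    "match_with_real_world": 0.5,         # well supported
}
uep = usability_error_probability(psfs)
print(f"UEP = {uep:.3f}")
```

    The resulting UEPs, ranked against a consequence matrix, would drive the prioritization of usability issues mentioned above.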

  4. Progress in Advanced Spectral Analysis of Radioxenon

    SciTech Connect

    Haas, Derek A.; Schrom, Brian T.; Cooper, Matthew W.; Ely, James H.; Flory, Adam E.; Hayes, James C.; Heimbigner, Tom R.; McIntyre, Justin I.; Saunders, Danielle L.; Suckow, Thomas J.

    2010-09-21

    Improvements to a Java-based software package developed at Pacific Northwest National Laboratory (PNNL) for the display and analysis of radioxenon spectra acquired by the International Monitoring System (IMS) are described here. The current version of the Radioxenon JavaViewer implements the region of interest (ROI) method for analysis of beta-gamma coincidence data. Upgrades to the Radioxenon JavaViewer will include routines to analyze high-purity germanium detector (HPGe) data, the Standard Spectrum Method to analyze beta-gamma coincidence data, and calibration routines to characterize beta-gamma coincidence detectors. These upgrades are currently under development; their status and initial results will be presented. Implementation of these routines into the JavaViewer and subsequent release is planned for FY 2011-2012.

  5. Multivariate Statistical Analysis of MSL APXS Bulk Geochemical Data

    NASA Astrophysics Data System (ADS)

    Hamilton, V. E.; Edwards, C. S.; Thompson, L. M.; Schmidt, M. E.

    2014-12-01

    We apply cluster and factor analyses to bulk chemical data for 130 soil and rock samples measured by the Alpha Particle X-ray Spectrometer (APXS) on the Mars Science Laboratory (MSL) rover Curiosity through sol 650. Multivariate approaches such as principal components analysis (PCA), cluster analysis, and factor analysis complement more traditional approaches (e.g., Harker diagrams), with the advantage of simultaneously examining the relationships between multiple variables for large numbers of samples. Principal components analysis has been applied with success to APXS, Pancam, and Mössbauer data from the Mars Exploration Rovers. Factor analysis and cluster analysis have been applied with success to thermal infrared (TIR) spectral data of Mars. Cluster analyses group the input data by similarity, where there are a number of different methods for defining similarity (hierarchical, density, distribution, etc.). For example, without any assumptions about the chemical contributions of surface dust, preliminary hierarchical and K-means cluster analyses clearly distinguish the physically adjacent rock targets Windjana and Stephen as distinct from lithologies observed prior to Curiosity's arrival at The Kimberley. In addition, they are separated from each other, consistent with chemical trends observed in variation diagrams but without requiring assumptions about chemical relationships. We will discuss the variation in cluster analysis results as a function of clustering method and pre-processing (e.g., log transformation, correction for dust cover) and implications for interpreting chemical data. Factor analysis shares some similarities with PCA, and examines the variability among observed components of a dataset so as to reveal variations attributable to unobserved components. Factor analysis has been used to extract the TIR spectra of components that are typically observed in mixtures and only rarely in isolation; there is the potential for similar
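    As a rough illustration of the clustering step described above, the sketch below runs K-means on synthetic oxide-abundance rows. The "lithologies," oxide columns, and values are invented stand-ins, not APXS data; K-means is just one of the clustering choices the abstract mentions.

```python
# Minimal sketch: cluster samples (rows) described by oxide wt% (columns).
# Synthetic stand-in data, NOT actual APXS measurements.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Two synthetic "lithologies": one K-rich, one Mg-rich (columns: SiO2, K2O, MgO).
group_a = rng.normal([45.0, 4.0, 5.0], 0.5, size=(20, 3))
group_b = rng.normal([45.0, 0.5, 12.0], 0.5, size=(20, 3))
X = np.vstack([group_a, group_b])

# Standardizing first keeps high-abundance oxides from dominating distances.
X_std = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_std)

# The two synthetic lithologies should land in different clusters.
print(labels[:20], labels[20:])
```

    Hierarchical clustering (e.g., scipy.cluster.hierarchy) could be substituted for K-means to mirror the abstract's comparison of clustering methods.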

  6. Statistical analysis of 2AFC contrast threshold measurements

    NASA Astrophysics Data System (ADS)

    Tchou, Philip; Flynn, Michael

    2005-04-01

    Most prior 2AFC experiments have been designed using a small number of signal strengths with many scenes for each strength. Percent correct is then computed for each level and fit to the assumed psychometric function. However, this introduces error because the signal strengths of individual responses are shifted. An alternative approach is to compute the statistical likelihood as a function of the threshold and width of the psychometric response curve. The best fit is then determined by finding the threshold and width that maximize the likelihood. In this paper, we discuss a method for analyzing 2AFC observer responses using maximum likelihood estimation (MLE) techniques. The logit model is used to represent the psychometric function and derive the likelihood. A conjugate gradient search algorithm is then used to find the maximum likelihood. The method is illustrated using human observer results from a previous study while statistical characteristics of the method are examined using simulated response data. The human observer results show that the psychometric function varies between observers and from test to test. The simulations show that the variance of the threshold and width exhibit a 1/Nobs relationship (σ = 1.5201·Nobs^(−0.5236)), where Nobs is the number of observations made in a 2AFC test, ranging from 10 to 30000. The variance of the human observer data was in close agreement with the simulations. These results indicate that the method is robust over a wide range of observations and can be used to predict human responses. The results of the simulations also suggest how to minimize error in future studies.
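    The likelihood-maximization scheme described above can be sketched as follows, assuming a logit psychometric function rising from chance (0.5) in a 2AFC task. The data are simulated, and the optimizer choice (Nelder-Mead) is illustrative rather than the authors' conjugate gradient search.

```python
# Sketch of MLE fitting of a 2AFC logit psychometric function: each trial
# has its own signal strength, and threshold/width are found by maximizing
# the likelihood directly rather than binning percent-correct.
# Simulated data; parameter values are illustrative.
import numpy as np
from scipy.optimize import minimize

def p_correct(x, threshold, width):
    # 2AFC: performance rises from chance (0.5) to 1.0.
    return 0.5 + 0.5 / (1.0 + np.exp(-(x - threshold) / width))

rng = np.random.default_rng(1)
true_threshold, true_width = 2.0, 0.5
x = rng.uniform(0.0, 4.0, size=3000)               # per-trial signal strengths
correct = rng.random(3000) < p_correct(x, true_threshold, true_width)

def neg_log_likelihood(params):
    t, w = params
    p = np.clip(p_correct(x, t, w), 1e-9, 1 - 1e-9)
    return -np.sum(np.where(correct, np.log(p), np.log(1 - p)))

fit = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
t_hat, w_hat = fit.x
print(f"threshold ~ {t_hat:.2f}, width ~ {w_hat:.2f}")
```

    Repeating this simulation for varying trial counts is how the σ versus Nobs relationship quoted above could be reproduced.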

  7. Advanced CMOS Radiation Effects Testing and Analysis

    NASA Technical Reports Server (NTRS)

    Pellish, J. A.; Marshall, P. W.; Rodbell, K. P.; Gordon, M. S.; LaBel, K. A.; Schwank, J. R.; Dodds, N. A.; Castaneda, C. M.; Berg, M. D.; Kim, H. S.; Phan, A. M.; Seidleck, C. M.

    2014-01-01

    Presentation at the annual NASA Electronic Parts and Packaging (NEPP) Program Electronic Technology Workshop (ETW). The material includes an update of progress in this NEPP task area over the past year, which includes testing, evaluation, and analysis of radiation effects data on the IBM 32 nm silicon-on-insulator (SOI) complementary metal oxide semiconductor (CMOS) process. The testing was conducted using test vehicles supplied directly by IBM.

  8. Power flow as a complement to statistical energy analysis and finite element analysis

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict an average response level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to provide a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  9. Statistical analysis of imperfection effect on cylindrical buckling response

    NASA Astrophysics Data System (ADS)

    Ismail, M. S.; Purbolaksono, J.; Muhammad, N.; Andriyana, A.; Liew, H. L.

    2015-12-01

    It is widely reported that no efficient guidelines are available for modelling imperfections in composite structures. In response, this work evaluates the imperfection factors of axially compressed Carbon Fibre Reinforced Polymer (CFRP) cylinders with different ply angles through finite element (FE) analysis. The sensitivity of the imperfection factors was analysed using a design-of-experiment (factorial design) approach. The analysis identified three critical factors to which the buckling load is sensitive. Furthermore, an empirical equation is proposed for each type of cylinder. Critical buckling loads estimated by the empirical equations showed good agreement with the FE analysis. The design-of-experiment methodology is useful in identifying the parameters that govern a structure's imperfection tolerance.

  10. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    PubMed

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The Megastat add-on program, which is expected to be supported by MS Excel 2016 soon, would eliminate some limitations of using statistical formulas within MS Excel.
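    For contrast with the blank-cell limitation noted above, dedicated analysis environments handle missing values explicitly. The pandas sketch below is one illustrative stand-in for such a tool (not something the article evaluates): blank cells become NaN and are skipped by default, with no manual cell selection.

```python
# Illustration of explicit missing-value handling: pandas excludes NaN
# ("blank") cells from summary statistics automatically, whereas MS Excel
# requires manual cell selection or prior data cleaning.
import numpy as np
import pandas as pd

measurements = pd.Series([5.1, np.nan, 4.8, 5.3, np.nan, 5.0])
mean_value = measurements.mean()     # NaNs excluded automatically
n_used = measurements.count()        # number of non-blank cells
print(mean_value, n_used)
```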

  11. Advanced Risk Analysis for High-Performing Organizations

    DTIC Science & Technology

    2006-01-01

    The operational environment for many types of organizations is changing. Changes in operational environments are driving the need for advanced risk analysis techniques. Many types of risk prevalent in today's operational environments (e.g., event risks, inherited risk) are not readily identified using traditional risk analysis techniques. Mission Assurance Analysis Protocol (MAAP) is one technique that high performers can use to identify and mitigate the risks arising from operational complexity.

  12. Metabolic systems analysis to advance algal biotechnology.

    PubMed

    Schmidt, Brian J; Lin-Schmidt, Xiefan; Chamberlin, Austin; Salehi-Ashtiani, Kourosh; Papin, Jason A

    2010-07-01

    Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Network analysis can direct microbial development efforts towards successful strategies and enable quantitative fine-tuning of the network for optimal product yields while maintaining the robustness of the production microbe. Metabolic modeling yields insights into microbial function, guides experiments by generating testable hypotheses, and enables the refinement of knowledge on the specific organism. While the application of such analytical approaches to algal systems is limited to date, metabolic network analysis can improve understanding of algal metabolic systems and play an important role in expediting the adoption of new biofuel technologies.

  13. Advances in Mössbauer data analysis

    NASA Astrophysics Data System (ADS)

    de Souza, Paulo A.

    1998-08-01

    The Mössbauer community has generated a huge amount of data in several fields of human knowledge since Rudolf Mössbauer's first publication. Interlaboratory measurements of the same substance may result in minor differences in the Mössbauer parameters (MP) of isomer shift, quadrupole splitting and internal magnetic field. Therefore, a conventional data bank of published MP will be of limited help in the identification of substances. An exact-match data bank search cannot differentiate values of Mössbauer parameters that lie within the experimental errors (e.g., IS = 0.22 mm/s versus IS = 0.23 mm/s), although physically both values may be considered the same. An artificial neural network (ANN), in contrast, is able to identify a substance and its crystalline structure from measured MP, and slight variations do not represent an obstacle for the ANN identification. Another barrier to the popularization of Mössbauer spectroscopy as an analytical technique is the absence of fully automated equipment, since the analysis of a Mössbauer spectrum is normally time-consuming and requires a specialist. In this work, the fitting process of a Mössbauer spectrum was completely automated through the use of genetic algorithms and fuzzy logic. Both software and hardware systems were implemented, resulting in a fully automated Mössbauer data analysis system. The developed system will be presented.
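    The evolutionary fitting described above can be approximated with an off-the-shelf optimizer. The sketch below fits a single Lorentzian absorption line to a synthetic velocity spectrum using SciPy's differential_evolution, a stand-in for the authors' genetic algorithm; all line parameters and bounds are invented.

```python
# Hedged sketch: evolutionary fitting of a synthetic Mössbauer-style
# spectrum. differential_evolution stands in for the genetic algorithm;
# the single-Lorentzian model and all parameter values are illustrative.
import numpy as np
from scipy.optimize import differential_evolution

velocity = np.linspace(-4.0, 4.0, 200)          # mm/s

def lorentzian_dip(v, center, width, depth, baseline):
    # Absorption line: counts dip below the baseline at the line center.
    return baseline - depth * (width / 2) ** 2 / ((v - center) ** 2 + (width / 2) ** 2)

rng = np.random.default_rng(2)
true = (0.23, 0.3, 5000.0, 100000.0)            # center plays the role of an isomer shift
counts = lorentzian_dip(velocity, *true) + rng.normal(0, 50, velocity.size)

def chi_squared(params):
    return np.sum((counts - lorentzian_dip(velocity, *params)) ** 2)

result = differential_evolution(
    chi_squared,
    bounds=[(-1, 1), (0.05, 1.0), (1000, 10000), (90000, 110000)],
    seed=3,
)
center, width, depth, baseline = result.x
print(f"fitted line center ~ {center:.2f} mm/s")
```

    A real system would fit multi-line (doublet/sextet) models and feed the fitted MP into the ANN identification stage described above.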

  14. New Statistical Approach to the Analysis of Hierarchical Data

    NASA Astrophysics Data System (ADS)

    Neuman, S. P.; Guadagnini, A.; Riva, M.

    2014-12-01

    Many variables possess a hierarchical structure reflected in how their increments vary in space and/or time. Quite commonly the increments (a) fluctuate in a highly irregular manner; (b) possess symmetric, non-Gaussian frequency distributions characterized by heavy tails that often decay with separation distance or lag; (c) exhibit nonlinear power-law scaling of sample structure functions in a midrange of lags, with breakdown in such scaling at small and large lags; (d) show extended power-law scaling (ESS) at all lags; and (e) display nonlinear scaling of power-law exponent with order of sample structure function. Some interpret this to imply that the variables are multifractal, which explains neither breakdowns in power-law scaling nor ESS. We offer an alternative interpretation consistent with all above phenomena. It views data as samples from stationary, anisotropic sub-Gaussian random fields subordinated to truncated fractional Brownian motion (tfBm) or truncated fractional Gaussian noise (tfGn). The fields are scaled Gaussian mixtures with random variances. Truncation of fBm and fGn entails filtering out components below data measurement or resolution scale and above domain scale. Our novel interpretation of the data allows us to obtain maximum likelihood estimates of all parameters characterizing the underlying truncated sub-Gaussian fields. These parameters in turn make it possible to downscale or upscale all statistical moments to situations entailing smaller or larger measurement or resolution and sampling scales, respectively. They also allow one to perform conditional or unconditional Monte Carlo simulations of random field realizations corresponding to these scales. Aspects of our approach are illustrated on field and laboratory measured porous and fractured rock permeabilities, as well as soil texture characteristics and neural network estimates of unsaturated hydraulic parameters in a deep vadose zone near Phoenix, Arizona. 
We also use our approach
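    As a minimal illustration of the sample structure functions mentioned above, the sketch below computes S_q(r) = mean(|X(t+r) − X(t)|^q) and estimates its power-law slope versus lag. Ordinary Brownian motion (H = 0.5) is used as a stand-in signal rather than the paper's truncated sub-Gaussian fields, so the exponent should come out close to q/2.

```python
# Sketch: sample structure functions and their log-log scaling exponent.
# Brownian motion is an illustrative stand-in signal; for it, the
# exponent xi(q) of S_q(r) ~ r^xi(q) is q*H with H = 0.5.
import numpy as np

rng = np.random.default_rng(4)
x = np.cumsum(rng.normal(size=200_000))       # Brownian-motion surrogate

def structure_function(x, lag, q):
    increments = x[lag:] - x[:-lag]
    return np.mean(np.abs(increments) ** q)

lags = np.array([1, 2, 4, 8, 16, 32, 64])
q = 2.0
s_q = np.array([structure_function(x, r, q) for r in lags])

# Slope of log S_q vs log r in the power-law midrange estimates xi(q).
xi_q = np.polyfit(np.log(lags), np.log(s_q), 1)[0]
print(f"xi({q}) ~ {xi_q:.2f}  (expected ~ {q / 2})")
```

    Repeating this for several q and checking whether xi(q) is linear or nonlinear in q is the kind of diagnostic the abstract's discussion of power-law scaling refers to.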

  15. Detection of viruses via statistical gene expression analysis.

    PubMed

    Chen, Minhua; Carlson, David; Zaas, Aimee; Woods, Christopher W; Ginsburg, Geoffrey S; Hero, Alfred; Lucas, Joseph; Carin, Lawrence

    2011-03-01

    We develop a new Bayesian construction of the elastic net (ENet), with variational Bayesian analysis. This modeling framework is motivated by the analysis of gene expression data for viruses, with a focus on H3N2 and H1N1 influenza, as well as rhinovirus and RSV (respiratory syncytial virus). Our objective is to understand the biological pathways responsible for the host response to such viruses, with the ultimate objective of developing a clinical test to distinguish subjects infected by such viruses from subjects whose symptoms have other causes (e.g., bacteria). In addition to analyzing these new datasets, we provide a detailed analysis of the Bayesian ENet and compare it to related models.
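    The Bayesian ENet itself is beyond a short example, but the underlying elastic net penalty (combined L1 and L2 shrinkage) is easy to illustrate. The sketch below uses scikit-learn's non-Bayesian ElasticNet on synthetic "expression" data with a handful of informative genes; all dimensions and values are invented.

```python
# Sketch of elastic net variable selection on synthetic expression-like
# data: a few truly informative "genes" among many irrelevant ones.
# This is the penalized (non-Bayesian) ENet, a stand-in for the paper's
# variational Bayesian construction.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(5)
n_samples, n_genes = 100, 50
X = rng.normal(size=(n_samples, n_genes))
true_coef = np.zeros(n_genes)
true_coef[:5] = [3.0, -2.5, 2.0, -1.5, 1.0]   # 5 informative genes
y = X @ true_coef + rng.normal(0, 0.5, n_samples)

model = ElasticNet(alpha=0.1, l1_ratio=0.7).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(f"{selected.size} genes kept out of {n_genes}")
```

    The L1 part drives sparsity (pathway/gene selection) while the L2 part stabilizes correlated predictors; the Bayesian version additionally yields posterior uncertainty on the coefficients.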

  16. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.

  17. Advanced stability analysis for laminar flow control

    NASA Technical Reports Server (NTRS)

    Orszag, S. A.

    1981-01-01

    Five classes of problems are addressed: (1) the extension of the SALLY stability analysis code to the full eighth order compressible stability equations for three dimensional boundary layer; (2) a comparison of methods for prediction of transition using SALLY for incompressible flows; (3) a study of instability and transition in rotating disk flows in which the effects of Coriolis forces and streamline curvature are included; (4) a new linear three dimensional instability mechanism that predicts Reynolds numbers for transition to turbulence in planar shear flows in good agreement with experiment; and (5) a study of the stability of finite amplitude disturbances in axisymmetric pipe flow showing the stability of this flow to all nonlinear axisymmetric disturbances.

  18. Performance analysis of advanced spacecraft TPS

    NASA Technical Reports Server (NTRS)

    Pitts, William C.

    1987-01-01

    This analysis of the feasibility of using metal hydrides in the thermal protection system of cryogenic tanks in space was based on the heat capacity of ice as the phase change material (PCM). It was found that with ice the thermal protection system weight could be reduced by, at most, about 20 percent relative to an all LI-900 insulation. For this concept to be viable, a metal hydride with considerably more heat capacity than water would be required; none were found. Special metal hydrides have been developed for hydrogen fuel storage applications, and it may be possible to do the same for the current application. Until that appears promising, further effort on this feasibility study does not seem warranted.

  19. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. The limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
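    A toy sketch of the LDA idea above: log events are treated as documents over a token vocabulary, and LDA recovers latent activity "topics" against which unusual events can be scored. The tokens, corpus, and model sizes below are all invented for illustration; real deployments operate on far richer network logs.

```python
# Toy sketch: LDA over "network log" documents built from event tokens.
# Corpus, vocabulary, and topic count are illustrative stand-ins.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

normal_logs = [
    "dns_query port_53 host_internal",
    "http_get port_80 host_internal",
    "dns_query port_53 host_internal http_get",
] * 10

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(normal_logs)

# Learn two latent activity "topics" over the token vocabulary.
lda = LatentDirichletAllocation(n_components=2, random_state=6).fit(X)
doc_topics = lda.transform(X)      # per-event topic mixture; rows sum to 1

# Events whose topic mixture (or likelihood under the model) departs
# strongly from the bulk would be candidates for analyst review.
print(doc_topics.shape)
```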

  20. Performance analysis of LVQ algorithms: a statistical physics approach.

    PubMed

    Ghosh, Anarta; Biehl, Michael; Hammer, Barbara

    2006-01-01

    Learning vector quantization (LVQ) constitutes a powerful and intuitive method for adaptive nearest prototype classification. However, original LVQ has been introduced based on heuristics and numerous modifications exist to achieve better convergence and stability. Recently, a mathematical foundation by means of a cost function has been proposed which, as a limiting case, yields a learning rule similar to classical LVQ2.1. It also motivates a modification which shows better stability. However, the exact dynamics as well as the generalization ability of many LVQ algorithms have not been thoroughly investigated so far. Using concepts from statistical physics and the theory of on-line learning, we present a mathematical framework to analyse the performance of different LVQ algorithms in a typical scenario in terms of their dynamics, sensitivity to initial conditions, and generalization ability. Significant differences in the algorithmic stability and generalization ability can be found already for slightly different variants of LVQ. We study five LVQ algorithms in detail: Kohonen's original LVQ1, unsupervised vector quantization (VQ), a mixture of VQ and LVQ, LVQ2.1, and a variant of LVQ which is based on a cost function. Surprisingly, basic LVQ1 shows very good performance in terms of stability, asymptotic generalization ability, and robustness to initializations and model parameters which, in many cases, is superior to recent alternative proposals.
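    The basic LVQ1 update analyzed above has a compact form: the winning (nearest) prototype is attracted to a correctly labeled sample and repelled from a mislabeled one. The sketch below is a minimal illustration on synthetic two-class data; the learning rate and epoch count are chosen arbitrarily.

```python
# Minimal LVQ1 sketch: move the nearest prototype toward a correctly
# classified sample, away from a misclassified one. Synthetic data;
# hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

prototypes = np.array([[-1.0, -1.0], [1.0, 1.0]])   # one prototype per class
proto_labels = np.array([0, 1])
eta = 0.05                                          # learning rate

for epoch in range(20):
    for xi, yi in zip(X, y):
        j = np.argmin(np.linalg.norm(prototypes - xi, axis=1))  # winner
        sign = 1.0 if proto_labels[j] == yi else -1.0           # attract / repel
        prototypes[j] += sign * eta * (xi - prototypes[j])

pred = proto_labels[np.argmin(
    np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2), axis=1)]
accuracy = np.mean(pred == y)
print(f"training accuracy = {accuracy:.2f}")
```

    The on-line, sample-by-sample character of this update is exactly what makes the statistical-physics analysis of the learning dynamics tractable.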

  1. On the Statistical Analysis of X-ray Polarization Measurements

    NASA Technical Reports Server (NTRS)

    Strohmayer, T. E.; Kallman, T. R.

    2013-01-01

    In many polarimetry applications, including observations in the X-ray band, the measurement of a polarization signal can be reduced to the detection and quantification of a deviation from uniformity of a distribution of measured angles of the form α + β cos²(φ − φ₀), with 0 < φ < π. We explore the statistics of such polarization measurements using both Monte Carlo simulations as well as analytic calculations based on the appropriate probability distributions. We derive relations for the number of counts required to reach a given detection level (parameterized by β, the "number of sigmas" of the measurement) appropriate for measuring the modulation amplitude α by itself (single interesting parameter case) or jointly with the position angle φ₀ (two interesting parameters case). We show that for the former case, when the intrinsic amplitude is equal to the well-known minimum detectable polarization (MDP), it is, on average, detected at the 3σ level. For the latter case, when one requires a joint measurement at the same confidence level, more counts are needed, by a factor of approximately 2.2, than are required to achieve the MDP level. We find that the position angle uncertainty at 1σ confidence is well described by the relation σ(φ₀) = 28.5°/β.
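    The Monte Carlo side of such an analysis can be sketched by drawing event angles from a modulated distribution and recovering the amplitude and position angle from Stokes-like sums. The amplitude, angle, and event count below are illustrative, and the estimator is a standard construction rather than the authors' exact procedure.

```python
# Sketch: simulate angles phi in [0, pi) from a distribution proportional
# to 1 + a*cos(2*(phi - phi0)) (equivalent, up to constants, to the
# alpha + beta*cos^2 form) and recover (a, phi0). Values are illustrative.
import numpy as np

rng = np.random.default_rng(8)
a_true, phi0_true = 0.3, np.pi / 4          # modulation amplitude, position angle
n_events = 200_000

# Rejection sampling against a flat proposal with envelope 1 + a.
phi = rng.uniform(0, np.pi, 2 * n_events)
accept = rng.uniform(0, 1 + a_true, phi.size) < 1 + a_true * np.cos(2 * (phi - phi0_true))
phi = phi[accept][:n_events]

# Stokes-like estimators: E[2*cos(2*phi)] = a*cos(2*phi0), similarly for sin.
q = 2.0 * np.mean(np.cos(2 * phi))
u = 2.0 * np.mean(np.sin(2 * phi))
a_hat = np.hypot(q, u)
phi0_hat = 0.5 * np.arctan2(u, q)
print(f"a ~ {a_hat:.3f}, phi0 ~ {np.degrees(phi0_hat):.1f} deg")
```

    Repeating the simulation at many event counts traces out how detection significance and position angle uncertainty scale with counts, the relations the abstract derives analytically.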

  2. A statistical mechanics analysis of the set covering problem

    NASA Astrophysics Data System (ADS)

    Fontanari, J. F.

    1996-02-01

    The dependence of the average cost of the optimal solution of the set covering problem on the density of 1's of the incidence matrix and on the number of constraints (P) is investigated in the limit where the number of items (N) goes to infinity. The annealed approximation is employed to study two stochastic models: the constant density model, where the elements of the incidence matrix are statistically independent random variables, and the Karp model, where the rows of the incidence matrix possess the same number of 1's. Lower bounds for the average optimal cost are presented in the case that P scales with ln N and the density is of order 1, as well as in the case that P scales linearly with N and the density is of order 1/N. It is shown that in the case that P scales with exp N and the density is of order 1 the annealed approximation yields exact results for both models.
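    The annealed calculation above has no short code analogue, but the combinatorial problem itself is easy to state: cover all constraints (rows) using as few items (columns) as possible. The classic greedy heuristic below is a common baseline, not part of the paper's analysis, and the instance is invented.

```python
# Greedy set cover: repeatedly pick the subset covering the most
# still-uncovered elements. A standard baseline heuristic for the
# problem the statistical-mechanics analysis studies; toy instance.
def greedy_set_cover(universe, subsets):
    """Pick, at each step, the subset covering the most uncovered elements."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(subsets, key=lambda s: len(uncovered & s))
        if not (uncovered & best):
            raise ValueError("universe cannot be covered")
        chosen.append(best)
        uncovered -= best
    return chosen

universe = range(1, 8)
subsets = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}, {1, 5}, {6, 7}]
cover = greedy_set_cover(universe, subsets)
print(cover)
```

    Averaging the cost of such solutions over random incidence matrices (constant density or Karp-style rows) is the empirical counterpart of the annealed averages computed in the paper.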

  3. Statistical analysis of geomagnetic storm driver and intensity

    NASA Astrophysics Data System (ADS)

    Katus, R. M.; Liemohn, M. W.

    2013-05-01

    Geomagnetic storms are investigated statistically with respect to the solar wind driver and the intensity of the events. The Hot Electron and Ion Drift Integrator (HEIDI) model was used to simulate all of the intense storms (minimum Dst < −100 nT) from solar cycle 23 (1996-2005). Four different configurations of HEIDI were used to investigate the outer boundary condition and the electric field description. The storms are classified as coronal mass ejection (CME)-driven or corotating interaction region (CIR)-driven events and binned by the magnitude of the minimum Dst. The simulation results, along with solar wind and geomagnetic data sets, are then analyzed along a normalized epoch timeline. The average behavior of each storm type and the corresponding HEIDI configurations are presented and discussed. It is found that while the self-consistent electric field better reproduces the stronger CME-driven storms, the Volland-Stern electric field does well in reproducing the results for CIR-driven events.

  4. SEDA: A software package for the Statistical Earthquake Data Analysis

    PubMed Central

    Lombardi, A. M.

    2017-01-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is research reproducibility, a growing movement among scientists that is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over the graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences, and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package. PMID:28290482

  5. Statistical analysis of shear cracks on rock surfaces

    NASA Astrophysics Data System (ADS)

    Åström, J. A.

    2007-04-01

    A set of 3873 cracks on exposed granite rock surfaces are analyzed in order to investigate possible fracture mechanisms. The fracture patterns are compared with the Mohr-Coulomb and the Roscoe fracture models, which can be combined into a single fracture scheme. A third model for comparison is based on interacting 'penny-shaped' micro cracks introduced by Healy et al. [Nature 439, 64 (2006)]. The former models predict a bimodal fracture angle distribution, with two narrow peaks separated by 60°-90° symmetrically on both sides of the direction of the largest principal stress, while the latter predicts a single broader peak in the same direction with standard deviation in the range 15°-20°. The crack length distributions seem consistent with numerical simulation, whereas the fracture patterns are Euclidean rather than fractal. The statistical analyses indicate that none of the models fully describe the fracture patterns. It seems that natural shear fractures easily become a complex combination of different fracture mechanisms.

  6. Statistical analysis of vibration-induced bone and joint damages.

    PubMed

    Schenk, T

    1995-01-01

    Vibration-induced damage to bones and joints remains an occupational disease for which knowledge about causal and moderating factors and the resulting damage is insufficient. For a better understanding of these relationships, retrospective analyses of already acknowledged occupational diseases may also be used. Detailed records of 203 occupational diseases acknowledged between 1970 and 1979 in the building industry and the building material industry of the GDR form the basis of the investigations described here. The data were gathered from the original documents of the occupational diseases and scaled in cooperation between an industrial engineer and an industrial physician. For the purposes of this investigation, a distinction is made between data describing the conditions of the workplace (e.g. material, tools and posture), the exposure parameters (e.g. beginning of exposure and latency period) and the disease (e.g. anamnestic and radiological data). These data were prepared for use with sophisticated computerized statistical methods. The following analyses were carried out: investigation of the connections between the several characteristics that describe the occupational disease (health damage), including a comparison of the severity of the damage at the individual joints; investigation of the side dependence of the damage; investigation of the influence of the age at the beginning of exposure and the age at acknowledgement of the occupational disease, and thereby of the exposure duration; and investigation of the effect of different occupational and exposure conditions.

  7. SEDA: A software package for the Statistical Earthquake Data Analysis.

    PubMed

    Lombardi, A M

    2017-03-14

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is research reproducibility, a growing movement among scientists that is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over the graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences, and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.

  8. Statistical threshold for nonlinear Granger Causality in motor intention analysis.

    PubMed

    Liu, MengTing; Kuo, Ching-Chang; Chiu, Alan W L

    2011-01-01

    Directed influence between multiple channel signal measurements is important for understanding large dynamic systems. This research investigates a method to analyze large, complex multi-variable systems using a directional flow measure to extract relevant information about the functional connectivity between different units in the system. The directional flow measure was computed through nonlinear Granger Causality (GC), which is based on nonlinear predictive models using radial basis functions (RBF). In order to extract relevant information from the causality map, we propose a threshold method, set up through a spatial statistical process, in which only the top 20% of causality pathways are shown. We applied this approach to a brain-computer interface (BCI) application to decode the different intended arm-reaching movements (left, right and forward) using 128 surface electroencephalography (EEG) electrodes. We also evaluated the importance of selecting an appropriate radius for the region of interest and found that the directions of causal influence of active brain regions were unique with respect to the intended direction.
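    The top-20% thresholding step can be sketched as a simple percentile cut over the off-diagonal entries of a causality map. The random matrix below is only a stand-in for the RBF-based nonlinear GC values, which are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16                               # hypothetical number of EEG channels
gc = rng.random((n, n))              # stand-in for a pairwise causality map
np.fill_diagonal(gc, 0.0)            # no self-causation

# Keep only the strongest 20% of directed pathways (80th-percentile cut)
offdiag = gc[~np.eye(n, dtype=bool)]
threshold = np.percentile(offdiag, 80)
mask = gc > threshold

print(mask.sum(), "of", offdiag.size, "pathways retained")
```

In a real analysis the retained `mask` would define the directed connectivity graph passed on to the decoding stage.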

  9. Statistical analysis of AFE GN&C aeropass performance

    NASA Technical Reports Server (NTRS)

    Chang, Ho-Pen; French, Raymond A.

    1990-01-01

    Performance of the guidance, navigation, and control (GN&C) system used on the Aeroassist Flight Experiment (AFE) spacecraft has been studied with Monte Carlo techniques. The performance of the AFE GN&C is investigated with a 6-DOF numerical dynamic model which includes a Global Reference Atmospheric Model (GRAM) and a gravitational model with oblateness corrections. The study considers all the uncertainties due to the environment and the system itself. In the AFE's aeropass phase, perturbations on the system performance are caused by an error space which has over 20 dimensions of the correlated/uncorrelated error sources. The goal of this study is to determine, in a statistical sense, how much flight path angle error can be tolerated at entry interface (EI) and still have acceptable delta-V capability at exit to position the AFE spacecraft for recovery. Assuming there is fuel available to produce 380 ft/sec of delta-V at atmospheric exit, a 3-sigma standard deviation in flight path angle error of 0.04 degrees at EI would result in a 98-percent probability of mission success.
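    The Monte Carlo logic of such a study can be sketched as below. The linear delta-V cost model is purely hypothetical (the study used a full 6-DOF simulation with GRAM and an oblateness-corrected gravity model); only the sampling-and-counting structure carries over:

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 0.04 / 3.0                      # 3-sigma entry angle error of 0.04 deg
gamma_err = rng.normal(0.0, sigma, 100_000)

# Hypothetical cost model (NOT the AFE 6-DOF dynamics): the delta-V needed
# at atmospheric exit grows linearly with the entry angle error magnitude.
dv_needed = 150.0 + 7400.0 * np.abs(gamma_err)   # ft/s, illustrative only

p_success = np.mean(dv_needed <= 380.0)          # 380 ft/s budget at exit
print(f"estimated mission-success probability: {p_success:.3f}")
```

With these illustrative coefficients the estimate lands near the 98-percent figure quoted in the abstract, but that agreement is by construction, not a reproduction of the AFE analysis.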

  10. SEDA: A software package for the Statistical Earthquake Data Analysis

    NASA Astrophysics Data System (ADS)

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to interact easily with the application, and a computational core of Fortran codes that guarantees maximum speed. The primary factor driving the development of SEDA is research reproducibility, a growing movement among scientists that is strongly encouraged by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over graphic appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a consistent set of ETAS tools, allowing the estimation of parameters, the testing of models against data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper; more specific details on the software are presented in the manual accompanying the program package.

  11. The Patterns of Teacher Compensation. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Chambers, Jay; Bobbitt, Sharon A.

    This report presents information regarding the patterns of variation in the salaries paid to public and private school teachers in relation to various personal and job characteristics. Specifically, the analysis examines the relationship between compensation and variables such as public/private schools, gender, race/ethnic background, school level…

  12. Open Access Publishing Trend Analysis: Statistics beyond the Perception

    ERIC Educational Resources Information Center

    Poltronieri, Elisabetta; Bravo, Elena; Curti, Moreno; Ferri, Maurizio; Mancini, Cristina

    2016-01-01

    Introduction: The purpose of this analysis was twofold: to track the number of open access journals acquiring impact factor, and to investigate the distribution of subject categories pertaining to these journals. As a case study, journals in which the researchers of the National Institute of Health (Istituto Superiore di Sanità) in Italy have…

  13. Statistical Performance Analysis of Data-Driven Neural Models.

    PubMed

    Freestone, Dean R; Layton, Kelvin J; Kuhlmann, Levin; Cook, Mark J

    2017-02-01

    Data-driven model-based analysis of electrophysiological data is an emerging technique for understanding the mechanisms of seizures. Model-based analysis enables tracking of hidden brain states that are represented by the dynamics of neural mass models. Neural mass models describe the mean firing rates and mean membrane potentials of populations of neurons. Various neural mass models exist with different levels of complexity and realism. An ideal data-driven model-based analysis framework will incorporate the most realistic model possible, enabling accurate imaging of the physiological variables. However, models must be sufficiently parsimonious to enable tracking of important variables using data. This paper provides a tool to inform the realism-versus-parsimony trade-off: the Bayesian Cramer-Rao (lower) bound (BCRB). We demonstrate how the BCRB can be used to assess the feasibility of using various popular neural mass models to track epilepsy-related dynamics via stochastic filtering methods. A series of simulations show how optimal state estimates relate to measurement noise, model error and initial state uncertainty. We also demonstrate that state estimation accuracy will vary between seizure-like and normal rhythms. The performance of the extended Kalman filter (EKF) is assessed against the BCRB. This work lays a foundation for assessing the feasibility of model-based analysis. We discuss how the framework can be used to design experiments to better understand epilepsy.

  14. Granger causality--statistical analysis under a configural perspective.

    PubMed

    von Eye, Alexander; Wiedermann, Wolfgang; Mun, Eun-Young

    2014-03-01

    The concept of Granger causality can be used to examine putative causal relations between two series of scores. Based on regression models, it is asked whether one series can be considered the cause for the second series. In this article, we propose extending the pool of methods available for testing hypotheses that are compatible with Granger causation by adopting a configural perspective. This perspective allows researchers to assume that effects exist for specific categories only or for specific sectors of the data space, but not for other categories or sectors. Configural Frequency Analysis (CFA) is proposed as the method of analysis from a configural perspective. CFA base models are derived for the exploratory analysis of Granger causation. These models are specified so that they parallel the regression models used for variable-oriented analysis of hypotheses of Granger causation. An example from the development of aggression in adolescence is used. The example shows that only one pattern of change in aggressive impulses over time Granger-causes change in physical aggression against peers.

  15. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  16. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. Active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication, based on both an analytical derivation and empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  17. Advances in carbonate exploration and reservoir analysis

    USGS Publications Warehouse

    Garland, J.; Neilson, J.; Laubach, S.E.; Whidden, Katherine J.

    2012-01-01

    The development of innovative techniques and concepts, and the emergence of new plays in carbonate rocks are creating a resurgence of oil and gas discoveries worldwide. The maturity of a basin and the application of exploration concepts have a fundamental influence on exploration strategies. Exploration success often occurs in underexplored basins by applying existing established geological concepts. This approach is commonly undertaken when new basins ‘open up’ owing to previous political upheavals. The strategy of using new techniques in a proven mature area is particularly appropriate when dealing with unconventional resources (heavy oil, bitumen, stranded gas), while the application of new play concepts (such as lacustrine carbonates) to new areas (i.e. ultra-deep South Atlantic basins) epitomizes frontier exploration. Many low-matrix-porosity hydrocarbon reservoirs are productive because permeability is controlled by fractures and faults. Understanding basic fracture properties is critical in reducing geological risk and therefore reducing well costs and increasing well recovery. The advent of resource plays in carbonate rocks, and the long-standing recognition of naturally fractured carbonate reservoirs means that new fracture and fault analysis and prediction techniques and concepts are essential.

  18. Analysis of helicopter downwash/frigate airwake interaction using statistically designed experiments

    NASA Astrophysics Data System (ADS)

    Nacakli, Yavuz

    A research program to investigate helicopter downwash/frigate airwake interaction has been initiated using a statistically robust experimental program featuring Design of Experiments. Engineering analysis of the helicopter/frigate interface is complicated by the fact that the two flowfields become inherently coupled as separation distance decreases. The final objective of this work is to develop experimental methods to determine when computer simulations need to include the effects of a coupled flowfield rather than using a simplified representation that superposes the velocity fields of the individual flowfields. The work presented was performed in the Old Dominion University Low Speed Wind Tunnel using a simplified 1/50 scale frigate waterline model and a traverse-mounted powered rotor with thrust measurement. Particle Image Velocimetry (PIV) velocity surveys were used with rotor thrust coefficient measurements at locations of identified interaction to help understand the underlying flow physics. Initially, PIV surveys of the frigate model landing deck in isolation and of the rotor in isolation were performed to provide a baseline flow understanding. Next, a designed experiment was devised yielding a response model for thrust coefficient as a function of vertical and longitudinal distance from the hangar door (base of the step), both with and without the rotor. This first experiment showed that the thrust coefficient could be measured with enough precision to identify changes due to location at an advance ratio of 0.075 (V∞ = 5.14 m/s and Ω = 5000 rpm). A second designed experiment determined the practical spatial resolution for mapping the thrust coefficient response along the frigate's longitudinal center plane. Finally, a third designed experiment directly compared rotor thrust measurements between airwake and no-airwake cases and successfully identified regions that differed with statistical significance. Lastly, a qualitative comparison study was performed to

  19. Using the statistical analysis method to assess the landslide susceptibility

    NASA Astrophysics Data System (ADS)

    Chan, Hsun-Chuan; Chen, Bo-An; Wen, Yo-Ting

    2015-04-01

    This study assessed the landslide susceptibility of the Jing-Shan River upstream watershed in central Taiwan. The landslide inventories for typhoons Toraji in 2001, Mindulle in 2004, Kalmaegi and Sinlaku in 2008, Morakot in 2009, and the 0719 rainfall event in 2011, established by the Taiwan Central Geological Survey, were used as landslide data. Landslide susceptibility was assessed using three statistical methods: logistic regression, the instability index method and the support vector machine (SVM). After evaluation, elevation, slope, slope aspect, lithology, terrain roughness, slope roughness, plan curvature, profile curvature, total curvature and average rainfall were chosen as the landslide factors. The validity of the three models was then examined using the receiver operating characteristic (ROC) curve. Logistic regression showed that terrain roughness and slope roughness had the strongest impact on the susceptibility value, whereas the instability index method showed that terrain roughness and lithology did. The instability index method may, however, underestimate susceptibility near the river side, and it raises a potential issue concerning the number of factor classes: too many classes may cause excessive variation in the factor coefficients, while too few may assign a large range of nearby cells to the same susceptibility level. Finally, the ROC curves were used to discriminate among the three models: SVM proved the preferred method for assessing landslide susceptibility, performing close to logistic regression in recognizing the medium-high and high susceptibility levels.
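    The ROC-based model comparison can be sketched with a rank-statistic AUC. The scores below are synthetic stand-ins for the susceptibility values produced by two fitted models; only the comparison machinery is illustrated:

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U) statistic."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, bool)
    ranks = np.argsort(np.argsort(scores)) + 1       # 1-based ranks
    n_pos, n_neg = labels.sum(), (~labels).sum()
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy susceptibility scores: landslide cells (label 1) score higher on average
rng = np.random.default_rng(1)
labels = np.r_[np.ones(200, bool), np.zeros(200, bool)]
strong = np.r_[rng.normal(1.0, 1, 200), rng.normal(0.0, 1, 200)]  # SVM-like
weak   = np.r_[rng.normal(0.3, 1, 200), rng.normal(0.0, 1, 200)]  # weaker model
print(auc(strong, labels), ">", auc(weak, labels))
```

A higher AUC means the model ranks landslide cells above non-landslide cells more consistently, which is exactly the discrimination criterion used in the study.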

  20. On the Statistical Analysis of X-Ray Polarization Measurements

    NASA Astrophysics Data System (ADS)

    Strohmayer, T. E.; Kallman, T. R.

    2013-08-01

    In many polarimetry applications, including observations in the X-ray band, the measurement of a polarization signal can be reduced to the detection and quantification of a deviation from uniformity in a distribution of measured angles of the form A + B cos²(φ − φ₀) (0 < φ < π). We explore the statistics of such polarization measurements using Monte Carlo simulations and χ² fitting methods. We compare our results to those derived using the traditional probability density used to characterize polarization measurements and quantify how they deviate as the intrinsic modulation amplitude grows. We derive relations for the number of counts required to reach a given detection level (parameterized by β, the "number of σ's" of the measurement) appropriate for measuring the modulation amplitude a by itself (the single-interesting-parameter case) or jointly with the position angle φ₀ (the two-interesting-parameters case). We show that for the former case, when the intrinsic amplitude is equal to the well-known minimum detectable polarization (MDP), it is, on average, detected at the 3σ level. For the latter case, when one requires a joint measurement at the same confidence level, more counts are needed than were required to achieve the MDP level. This additional factor is amplitude-dependent but is ≈2.2 for intrinsic amplitudes less than about 20%; it decreases slowly with amplitude and is ≈1.8 when the amplitude is 50%. We find that the position angle uncertainty at 1σ confidence is well described by the relation σ_φ = 28.5°/β.
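    A Monte Carlo simulation of such a modulated angle distribution might look like the sketch below. Since cos²x = (1 + cos 2x)/2, the density can equivalently be written f(φ) ∝ 1 + a·cos 2(φ − φ₀); the sketch draws angles from that form and recovers the amplitude with a simple Fourier estimator rather than the paper's χ² fits:

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_angles(n, a, phi0=0.0):
    """Rejection-sample n angles from f(phi) ∝ 1 + a*cos(2*(phi-phi0)), 0 <= phi < pi."""
    out = np.empty(0)
    while out.size < n:
        phi = rng.uniform(0, np.pi, 4 * n)
        keep = rng.uniform(0, 1 + a, phi.size) < 1 + a * np.cos(2 * (phi - phi0))
        out = np.concatenate([out, phi[keep]])
    return out[:n]

def amplitude_estimate(phi):
    """Fourier estimator of the modulation amplitude a:
    E[cos 2phi] = (a/2) cos 2phi0 and E[sin 2phi] = (a/2) sin 2phi0."""
    c, s = np.cos(2 * phi).mean(), np.sin(2 * phi).mean()
    return 2.0 * np.hypot(c, s)

phi = sample_angles(100_000, a=0.2)
print(amplitude_estimate(phi))     # close to the true amplitude 0.2
```

Repeating such simulations at different count levels is the mechanism by which detection thresholds like the MDP are calibrated.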

  1. Advances in the environmental analysis of polychlorinated naphthalenes and toxaphene.

    PubMed

    Kucklick, John R; Helm, Paul A

    2006-10-01

    Recent advances in the analysis of the chlorinated environmental pollutants polychlorinated naphthalenes (PCNs) and toxaphene are highlighted in this review. Method improvements have been realized for PCNs over the past decade in isomer-specific quantification, peak resolution, and the availability of mass-labeled standards. Toxaphene method advancements include the application of new capillary gas chromatographic (GC) stationary phases, mass spectrometry (MS), especially ion trap MS, and the availability of Standard Reference Materials that are value-assigned for total toxaphene and selected congener concentrations. An area of promise for the separation of complex mixtures such as PCNs and toxaphene is the development of multidimensional GC techniques. The need for continued advancements and efficiencies in the analysis of contaminants such as PCNs and toxaphene remains as monitoring requirements for these compound classes are established under international agreements.

  2. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  3. Characterization and detection of Vero cells infected with Herpes Simplex Virus type 1 using Raman spectroscopy and advanced statistical methods.

    PubMed

    Salman, A; Shufan, E; Zeiri, L; Huleihel, M

    2014-07-01

    Herpes viruses are involved in a variety of human disorders. Herpes Simplex Virus type 1 (HSV-1) is the most common among the herpes viruses and is primarily involved in human cutaneous disorders. Although the symptoms of infection by this virus are usually minimal, in some cases HSV-1 might cause serious infections in the eyes and the brain leading to blindness and even death. A drug, acyclovir, is available to counter this virus. The drug is most effective when used during the early stages of the infection, which makes early detection and identification of these viral infections highly important for successful treatment. In the present study we evaluated the potential of Raman spectroscopy as a sensitive, rapid, and reliable method for the detection and identification of HSV-1 viral infections in cell cultures. Raman spectroscopy followed by advanced statistical methods enabled us, with a sensitivity approaching 100%, to differentiate between a control group of Vero cells and another group of Vero cells that had been infected with HSV-1. Cell sites that were "rich in membrane" gave the best results in differentiating between the two categories. The major changes were observed in the 1195-1726 cm⁻¹ range of the Raman spectrum. The features in this range are attributed mainly to proteins, lipids, and nucleic acids.

  4. Statistical theory and methodology for remote sensing data analysis

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1974-01-01

    A model is developed for the evaluation of acreages (proportions) of different crop types over a geographical area using a classification approach, and methods for estimating the crop acreages are given. In estimating the acreage of a specific crop type such as wheat, it is suggested to treat the problem as a two-crop problem, wheat vs. non-wheat, since this simplifies the estimation problem considerably. The error analysis and the sample-size problem are investigated for the two-crop approach. Numerical results on sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large-area crop acreage inventory, a sampling scheme for acquiring sample data is suggested, and the problems of crop acreage estimation and error analysis are discussed.

  5. Stalked protozoa identification by image analysis and multivariable statistical techniques.

    PubMed

    Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2008-06-01

    Protozoa are considered good indicators of treatment quality in activated sludge systems, as they are sensitive to physical, chemical and operational processes. It is therefore possible to correlate the predominance of certain species or groups with several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants, determining geometrical, morphological and signature data with subsequent processing by discriminant analysis and neural network techniques. Geometrical descriptors were found to be responsible for the best identification ability, and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence to establish their presence in wastewater treatment plants.

  6. Characterization of Nuclear Fuel using Multivariate Statistical Analysis

    SciTech Connect

    Robel, M; Kristo, M J

    2007-11-27

    Various combinations of reactor type and fuel composition have been characterized using principal component analysis (PCA) of the concentrations of 9 U and Pu isotopes in the fuel as a function of burnup. The use of PCA allows the reduction of the 9-dimensional data (isotopic concentrations) into a 3-dimensional approximation, giving a visual representation of the changes in nuclear fuel composition with burnup. Real-world variation in the concentrations of ²³⁴U and ²³⁶U in the fresh (unirradiated) fuel was accounted for. The effects of reprocessing were also simulated. The results suggest that, even after reprocessing, Pu isotopes can be used to determine both the type of reactor and the initial fuel composition with good discrimination. Finally, partial least squares discriminant analysis (PLSDA) was investigated as a substitute for PCA. Our results suggest that PLSDA is a better tool for this application, where separation between known classes is most important.
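    The 9-D to 3-D reduction can be sketched with a plain SVD-based PCA. The random matrix below is only a stand-in for the simulated isotopic concentrations described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for measured concentrations of 9 U/Pu isotopes in 60 fuel samples;
# the real data would come from the burnup simulations described in the abstract.
X = rng.normal(size=(60, 9)) @ rng.normal(size=(9, 9))

# PCA via SVD of the mean-centered data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T                 # 3-D approximation of the 9-D data
explained = (S[:3] ** 2).sum() / (S ** 2).sum()
print(scores.shape, f"variance captured: {explained:.2f}")
```

Plotting the three score columns against burnup is what yields the visual representation of compositional change mentioned in the abstract.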

  7. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  8. Multivariate statistical analysis of Raman images of a pharmaceutical tablet.

    PubMed

    Lin, Haisheng; Marjanović, Ognjen; Lennox, Barry; Šašić, Slobodan; Clegg, Ian M

    2012-03-01

    This paper describes the application of principal component analysis (PCA) and independent component analysis (ICA) to identify the reference spectra of a pharmaceutical tablet's constituent compounds from Raman spectroscopic data. The analysis shows, first with a simulated data set and then with data collected from a pharmaceutical tablet, that both PCA and ICA are able to identify most of the features present in the reference spectra of the constituent compounds. However, the results suggest that the ICA method may be more appropriate when attempting to identify unknown reference spectra from a sample. The resulting PCA and ICA models are subsequently used to estimate the relative concentrations of the constituent compounds and to produce spatial distribution images of the analyzed tablet. These images provide a visual representation of the spatial distribution of the constituent compounds throughout the tablet. Images associated with the ICA scores are found to be more informative and not as affected by measurement noise as the PCA based score images. The paper concludes with a discussion of the future work that needs to be undertaken for ICA to gain wider acceptance in the applied spectroscopy community.

  9. Analysis of tensile bond strengths using Weibull statistics.

    PubMed

    Burrow, Michael F; Thomas, David; Swain, Mike V; Tyas, Martin J

    2004-09-01

    Tensile strength tests of restorative resins bonded to dentin, and the resultant strengths of interfaces between the two, exhibit wide variability. Many variables can affect test results, including specimen preparation and storage, test rig design and experimental technique. However, the more fundamental source of variability, that associated with the brittle nature of the materials, has received little attention. This paper analyzes results from micro-tensile tests on unfilled resins and adhesive bonds between restorative resin composite and dentin in terms of reliability using the Weibull probability of failure method. Results for the tensile strengths of Scotchbond Multipurpose Adhesive (3M) and Clearfil LB Bond (Kuraray) bonding resins showed Weibull moduli (m) of 6.17 (95% confidence interval, 5.25-7.19) and 5.01 (95% confidence interval, 4.23-5.8). Analysis of results for micro-tensile tests on bond strengths to dentin gave moduli between 1.81 (Clearfil Liner Bond 2V) and 4.99 (Gluma One Bond, Kulzer). Material systems with m in this range do not have a well-defined strength. The Weibull approach also enables the size dependence of the strength to be estimated. An example where the bonding area was changed from 3.1 to 1.1 mm diameter is shown. Weibull analysis provides a method for determining the reliability of strength measurements in the analysis of data from bond strength and tensile tests on dental restorative materials.
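    The Weibull modulus m can be estimated by median-rank regression, sketched below on synthetic strength data. The median-rank plotting position used here is one common convention; others exist, and real bond-strength data would replace the synthetic sample:

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m by linear regression of
    ln(-ln(1-F)) on ln(strength), using median-rank failure probabilities."""
    s = np.sort(np.asarray(strengths, float))
    n = s.size
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)     # median-rank estimator
    y = np.log(-np.log(1.0 - F))
    m, _ = np.polyfit(np.log(s), y, 1)              # slope is the modulus
    return m

# Synthetic bond strengths drawn from a Weibull distribution with m = 5
rng = np.random.default_rng(3)
sample = 30.0 * rng.weibull(5.0, size=500)
print(weibull_modulus(sample))     # should recover a modulus close to 5
```

A low fitted modulus (m around 2, as for some of the dentin bonds above) signals a widely scattered strength distribution, which is why such systems have no well-defined single strength value.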

  10. Carbohydrate Structure Database: tools for statistical analysis of bacterial, plant and fungal glycomes

    PubMed Central

    Egorova, K.S.; Kondakova, A.N.; Toukach, Ph.V.

    2015-01-01

    Carbohydrates are biological building blocks participating in diverse and crucial processes at both the cellular and the organism level. They protect individual cells, establish intracellular interactions, take part in the immune reaction and participate in many other processes. Glycosylation is considered one of the most important modifications of proteins and other biologically active molecules. Still, the data on the enzymatic machinery involved in carbohydrate synthesis and processing are scattered, and progress in its study is hindered by the vast bulk of accumulated genetic information that is not supported by any experimental evidence for the functions of the proteins encoded by these genes. In this article, we present novel instruments for the statistical analysis of glycomes in taxa. These tools may be helpful for investigating carbohydrate-related enzymatic activities in various groups of organisms and for comparing their carbohydrate content. The instruments are developed on the Carbohydrate Structure Database (CSDB) platform and are available freely on the CSDB web-site at http://csdb.glycoscience.ru. Database URL: http://csdb.glycoscience.ru PMID:26337239

  11. Process characterization and statistical analysis of oxide CMP on a silicon wafer with sparse data

    NASA Astrophysics Data System (ADS)

    Bukkapatnam, S. T. S.; Rao, P. K.; Lih, W.-C.; Chandrasekaran, N.; Komanduri, R.

    2007-09-01

    Continuous advancements in the chemical mechanical planarization (CMP) process, such as new polishing pads, slurry materials and abrasive particles, necessitate optimization of the key process input parameters for maximum material removal rate (MRR) and/or minimum within-wafer non-uniformity (WIWNU) using sparse experimental results. In this investigation, a methodology is proposed for developing process models and optimizing input parameters (both main and interaction parameters) for maximum MRR and minimum WIWNU. This approach is equally applicable to polishing other materials, such as copper, dielectrics and low-k materials. Complex relationships exist between several machine-specific and material-specific input parameters and the output performance variables, chiefly MRR and WIWNU. However, only a few of the input parameters are changed on a regular basis; hence, only those subsets of relationships need to be considered for optimizing the CMP process. Here, the CMP process was characterized for polishing a thin layer of silicon dioxide on top of a silicon wafer. Statistical analysis of the experimental data was performed to obtain the order of significance of the input variables (machine and material parameters and their interactions). Both linear and logarithmic regression models were developed and used to determine optimum process conditions for maximizing MRR and minimizing WIWNU. While the main input parameters were responsible for maximum MRR, interaction parameters were found to be responsible for minimizing WIWNU. This may vary for different materials and polishing environments.

  12. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    NASA Astrophysics Data System (ADS)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2009-12-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks — e.g. data mining in HEP — by using PROOF, which will take care of optimally

  13. Comparing Methods for Item Analysis: The Impact of Different Item-Selection Statistics on Test Difficulty

    ERIC Educational Resources Information Center

    Jones, Andrew T.

    2011-01-01

    Practitioners often depend on item analysis to select items for exam forms and have a variety of options available to them. These include the point-biserial correlation, the agreement statistic, the B index, and the phi coefficient. Although research has demonstrated that these statistics can be useful for item selection, no research as of yet has…

  14. A new statistic for the analysis of circular data in gamma-ray astronomy

    NASA Technical Reports Server (NTRS)

    Protheroe, R. J.

    1985-01-01

    A new statistic is proposed for the analysis of circular data. The statistic is designed specifically for situations where a test of uniformity is required which is powerful against alternatives in which a small fraction of the observations is grouped in a small range of directions, or phases.
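
    The abstract does not give the new statistic's form, but the classical Rayleigh test for uniformity of circular data, the baseline against which such statistics are usually compared, can be sketched as follows (the phase lists are hypothetical):

```python
import math

def rayleigh_statistic(phases):
    """Rayleigh test statistic for uniformity of circular data.

    Phases are in [0, 1), i.e. fractions of a cycle; returns n * R^2,
    where R is the mean resultant length. Small values are consistent
    with uniformity; large values indicate clustering of phases.
    """
    n = len(phases)
    c = sum(math.cos(2 * math.pi * p) for p in phases)
    s = sum(math.sin(2 * math.pi * p) for p in phases)
    return (c * c + s * s) / n

# Uniformly spread phases give a near-zero statistic; phases grouped in
# a small range (the alternative targeted by the paper) give a large one.
uniform = [i / 10 for i in range(10)]
clustered = [0.02, 0.03, 0.05, 0.96, 0.5, 0.1, 0.07, 0.0, 0.99, 0.04]
print(rayleigh_statistic(uniform), rayleigh_statistic(clustered))
```

    The Rayleigh test loses power when only a small fraction of events is grouped, which is precisely the situation the proposed statistic is designed to handle.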

  15. Statistical analysis of 59 inspected SSME HPFTP turbine blades (uncracked and cracked)

    NASA Technical Reports Server (NTRS)

    Wheeler, John T.

    1987-01-01

    The numerical results of a statistical analysis of test data for Space Shuttle Main Engine high-pressure fuel turbopump second-stage turbine blades, including some with cracks, are presented. Several statistical methods were applied to the test data to determine whether frequency variations differ between the uncracked and cracked blades.

  16. Statistical and Scientometric Analysis of International Research in Geographical and Environmental Education

    ERIC Educational Resources Information Center

    Papadimitriou, Fivos; Kidman, Gillian

    2012-01-01

    Certain statistical and scientometric features of articles published in the journal "International Research in Geographical and Environmental Education" (IRGEE) are examined in this paper for the period 1992-2009 by applying nonparametric statistics and Shannon's entropy (diversity) formula. The main findings of this analysis are: (a) after 2004,…

  17. School Library Media Centers: 1993-94. Statistical Analysis Report, August 1998.

    ERIC Educational Resources Information Center

    Chaney, Bradford; Williams, Jeffrey

    This statistical analysis report from the National Center for Education Statistics examines the current state of school libraries in the United States and how they have changed. The primary source of data in this report is the 1993-94 Library Survey, the first federally sponsored survey of library media centers and head librarians in elementary…

  18. Polybrominated Diphenyl Ethers in Dryer Lint: An Advanced Analysis Laboratory

    ERIC Educational Resources Information Center

    Thompson, Robert Q.

    2008-01-01

    An advanced analytical chemistry laboratory experiment is described that involves environmental analysis and gas chromatography-mass spectrometry. Students analyze lint from clothes dryers for traces of flame retardant chemicals, polybrominated diphenylethers (PBDEs), compounds receiving much attention recently. In a typical experiment, ng/g…

  19. Advanced GIS Exercise: Predicting Rainfall Erosivity Index Using Regression Analysis

    ERIC Educational Resources Information Center

    Post, Christopher J.; Goddard, Megan A.; Mikhailova, Elena A.; Hall, Steven T.

    2006-01-01

    Graduate students from a variety of agricultural and natural resource fields are incorporating geographic information systems (GIS) analysis into their graduate research, creating a need for teaching methodologies that help students understand advanced GIS topics for use in their own research. Graduate-level GIS exercises help students understand…

  20. Advances in NMR-based biofluid analysis and metabolite profiling.

    PubMed

    Zhang, Shucha; Nagana Gowda, G A; Ye, Tao; Raftery, Daniel

    2010-07-01

    Significant improvements in NMR technology and methods have propelled NMR studies to play an important role in a rapidly expanding number of applications involving the profiling of metabolites in biofluids. This review discusses recent technical advances in NMR spectroscopy-based metabolite profiling methods, data processing and analysis over the last three years.

  1. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    EPA Science Inventory

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-Agency-sponsored study, authorized under the...

  2. NASTRAN documentation for flutter analysis of advanced turbopropellers

    NASA Technical Reports Server (NTRS)

    Elchuri, V.; Gallo, A. M.; Skalski, S. C.

    1982-01-01

    An existing capability developed to conduct modal flutter analysis of tuned bladed-shrouded discs was modified to facilitate investigation of the subsonic unstalled flutter characteristics of advanced turbopropellers. The modifications pertain to the inclusion of oscillatory modal aerodynamic loads of blades with large (backward and forward) varying sweep.

  3. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The treatment of dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and an analytical symmetric portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is established with small classical structures.
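
    The paper's own correlation theory is not reproduced in the abstract; as an illustration of the general idea, the Modal Assurance Criterion (MAC) below is one standard way of putting an analytical and an experimental mode shape in comparative form. The mode-shape vectors are invented:

```python
def mac(phi_a, phi_x):
    """Modal Assurance Criterion between an analytical and an
    experimental mode shape (real-valued vectors). MAC near 1 means
    the two shapes are consistent; near 0, uncorrelated."""
    num = sum(a * x for a, x in zip(phi_a, phi_x)) ** 2
    den = sum(a * a for a in phi_a) * sum(x * x for x in phi_x)
    return num / den

analytical = [1.0, 0.8, 0.3, -0.2]      # hypothetical NASTRAN mode
experimental = [0.98, 0.82, 0.25, -0.22]  # hypothetical test mode
unrelated = [0.1, -0.9, 0.7, 0.4]
print(mac(analytical, experimental), mac(analytical, unrelated))
```

    In practice the experimental mode is compared against every analytical mode, and the MAC matrix identifies the best pairing.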

  4. Statistical analysis of wines using a robust compositional biplot.

    PubMed

    Hron, K; Jelínková, M; Filzmoser, P; Kreuziger, R; Bednář, P; Barták, P

    2012-02-15

    Eight phenolic acids (vanillic, gentisic, protocatechuic, syringic, gallic, coumaric, ferulic and caffeic) were quantitatively determined in 30 commercially available wines from South Moravia by gas chromatography-mass spectrometry. Raw (untransformed) and centered log-ratio transformed data were evaluated by classical and robust versions of principal component analysis (PCA). A robust compositional biplot of the centered log-ratio transformed data gives the best resolution of particular categories of wines. Vanillic, syringic and gallic acids were identified as presumed markers occurring in relatively higher concentrations in red wines. Gentisic and caffeic acids were tentatively suggested as prospective technological markers, reflecting presumably some kinds of technological aspects of wine making.
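
    The centered log-ratio (clr) transform applied before the robust PCA can be sketched as follows; the concentrations below are hypothetical, not the measured wine data:

```python
import math

def clr(composition):
    """Centered log-ratio transform of one compositional sample
    (strictly positive parts): log of each part minus the log of the
    geometric mean of all parts."""
    logs = [math.log(x) for x in composition]
    g = sum(logs) / len(logs)          # log of the geometric mean
    return [l - g for l in logs]

# Hypothetical phenolic-acid concentrations (mg/L) for one wine sample.
sample = [1.2, 0.4, 2.5, 0.9, 3.1, 0.7, 0.5, 1.8]
z = clr(sample)
print([round(v, 3) for v in z])
```

    The clr coordinates of every sample sum to zero, which removes the constant-sum constraint of compositional data before PCA and the biplot are computed.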

  5. Statistical Analysis of the Different Factors Affecting the Diarrhea

    PubMed Central

    Zaman, Qamruz; Khan, Imtiaz

    2011-01-01

    Diarrhea is a worldwide problem facing both developing and developed countries, especially in the pediatric population. Because of shortages of health facilities and adequate food, developing countries bear a disproportionate share of the resulting mortality. The main purpose of this study was to examine the various factors which affect the recovery time of diarrhea. A multiple linear regression was applied to analyze the data and to select a model. The response variable for the study was the recovery time of diarrhea. The results of the analysis show that zinc is the main factor affecting the recovery time in Peshawar. PMID:23408274

  6. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
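
    As a sketch of the likelihood-ratio idea (not the authors' profilometry-based statistic), the following compares two samples of comparison scores under a normal model with a common unknown variance; all scores are invented:

```python
import math
from statistics import mean

def neg2_log_lr(scores_field, scores_lab):
    """-2 log likelihood ratio for H0: both samples of comparison
    scores share one normal mean, vs H1: separate means (common
    unknown variance). Large values favor H1, i.e. the field mark
    behaves differently from lab-vs-lab comparisons."""
    pooled = scores_field + scores_lab
    n = len(pooled)
    sse0 = sum((x - mean(pooled)) ** 2 for x in pooled)
    sse1 = (sum((x - mean(scores_field)) ** 2 for x in scores_field)
            + sum((x - mean(scores_lab)) ** 2 for x in scores_lab))
    return n * math.log(sse0 / sse1)

# Hypothetical similarity scores: lab-vs-field and lab-vs-lab comparisons.
field = [0.61, 0.58, 0.64, 0.57]
lab = [0.81, 0.84, 0.79, 0.83]
print(round(neg2_log_lr(field, lab), 2))
```

    The authors' method additionally models the tool's angle of incidence, which this toy statistic ignores.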

  7. Statistical Analysis of Shear Wave Speed in the Uterine Cervix

    PubMed Central

    Carlson, Lindsey C.; Feltovich, Helen; Palmeri, Mark L.; del Rio, Alejandro Muñoz; Hall, Timothy J.

    2014-01-01

    Although cervical softening is critical in pregnancy, there currently is no objective method for assessing the softness of the cervix. Shear wave speed (SWS) estimation is a noninvasive tool used to measure tissue mechanical properties such as stiffness. The goal of this study was to determine the spatial variability and assess the ability of SWS to classify ripened vs. unripened tissue samples. Ex vivo human hysterectomy samples (n = 22) were collected, and a subset (n = 13) was ripened. SWS estimates were made at 4–5 locations along the length of the canal on both the anterior and posterior halves. A linear mixed model was used for a robust multivariate analysis. Receiver operating characteristic (ROC) analysis and the area under the ROC curve (AUC) were calculated to describe the utility of SWS to classify ripened vs. unripened tissue samples. Results showed that all variables used in the linear mixed model were significant (p < 0.05). Estimates at the mid location for the unripened group were 3.45 ± 0.95 m/s (anterior) and 3.56 ± 0.92 m/s (posterior), versus 2.11 ± 0.45 m/s (anterior) and 2.68 ± 0.57 m/s (posterior) for the ripened group (p < 0.001). The AUCs were 0.91 and 0.84 for anterior and posterior, respectively, suggesting SWS estimates may be useful for quantifying cervical softening. PMID:25392863
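
    The AUC used here to summarize classification of ripened vs. unripened samples can be illustrated with the rank-based (Mann-Whitney) estimator; the SWS values below are hypothetical, loosely echoing the reported group means:

```python
def auc(ripened, unripened):
    """Area under the ROC curve for separating two groups of shear
    wave speed estimates, computed as the Mann-Whitney probability
    P(unripened > ripened), counting ties as 1/2."""
    wins = sum((u > r) + 0.5 * (u == r) for r in ripened for u in unripened)
    return wins / (len(ripened) * len(unripened))

# Hypothetical SWS estimates (m/s): ripened tissue is softer and slower.
ripened = [2.1, 2.4, 1.9, 2.6, 2.3]
unripened = [3.4, 3.1, 3.7, 2.5, 3.5]
print(auc(ripened, unripened))  # → 0.96
```

    An AUC of 0.5 corresponds to no separation between groups; 1.0 to perfect separation.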

  8. Analysis of compressive fracture in rock using statistical techniques

    SciTech Connect

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.
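
    A toy version of the percolation-theory view of crack clustering used above: sites crack independently and connected clusters of cracked sites are measured. This illustrates only the general idea, not the paper's field-theory model with stress redistribution:

```python
import random

def largest_cluster(n, p, seed=3):
    """Size of the largest 4-connected cluster of 'cracked' sites on an
    n x n lattice where each site cracks independently with probability p."""
    rng = random.Random(seed)
    cracked = {(i, j) for i in range(n) for j in range(n) if rng.random() < p}
    seen, best = set(), 0
    for site in cracked:
        if site in seen:
            continue
        stack, size = [site], 0
        seen.add(site)
        while stack:                      # depth-first flood fill
            i, j = stack.pop()
            size += 1
            for nb in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                if nb in cracked and nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        best = max(best, size)
    return best

# Below the site-percolation threshold (~0.593 on the square lattice)
# clusters stay small; above it one cluster spans most of the lattice,
# analogous to coalescing cracks forming a macroscopic fracture.
print(largest_cluster(40, 0.3), largest_cluster(40, 0.7))
```
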

  9. Recent Advances in Multidisciplinary Analysis and Optimization, part 1

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: helicopter design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  10. Recent Advances in Multidisciplinary Analysis and Optimization, part 2

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: helicopter design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  11. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. Constructions of special elements containing traction-free circular boundaries are investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed-stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.

  12. Recent Advances in Multidisciplinary Analysis and Optimization, part 3

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: aircraft design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  13. Statistical analysis of the ambiguities in the asteroid period determinations

    NASA Astrophysics Data System (ADS)

    Butkiewicz, M.; Kwiatkowski, T.; Bartczak, P.; Dudziński, G.

    2014-07-01

    A synodic period of an asteroid can be derived from its lightcurve by standard methods like Fourier-series fitting. A problem appears when the observations cover less than a full lightcurve and/or carry a high level of noise. Long gaps between individual lightcurves also create an ambiguity in the cycle count, which leads to aliases. Excluding binary systems and objects with non-principal-axis rotation, the rotation period is usually identical to the period of the second Fourier harmonic of the lightcurve. There are cases, however, where it may be connected with the 1st, 3rd, or 4th harmonic, and it is difficult to choose among them when searching for the period. To help remove such uncertainties we analysed asteroid lightcurves for a range of shapes and observing/illuminating geometries. We simulated them using a modified internal code from the ISAM service (Marciniak et al. 2012, A&A 545, A131). In our computations, shapes of asteroids were modeled as Gaussian random spheres (Muinonen 1998, A&A 332, 1087). A combination of Lommel-Seeliger and Lambert scattering laws was assumed. For each of the 100 shapes, we randomly selected 1000 positions of the spin axis, systematically changing the solar phase angle with a step of 5°. For each lightcurve, we determined its peak-to-peak amplitude, fitted the 6th-order Fourier series and derived the amplitudes of its harmonics. Instead of the number of the lightcurve extrema, which in many cases is subjective, we characterized each lightcurve by the order of the highest-amplitude Fourier harmonic. The goal of our simulations was to derive statistically significant conclusions (based on the underlying assumptions) about the dominance of different harmonics in the lightcurves of the specified amplitude and phase angle. The results, presented in the Figure, can be used in individual cases to estimate the probability that the obtained lightcurve is dominated by a specified Fourier harmonic. Some of the
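
    Identifying the highest-amplitude Fourier harmonic of an evenly sampled, one-period lightcurve, as in the characterization above, can be sketched as follows (the synthetic lightcurve is illustrative):

```python
import math

def harmonic_amplitudes(mags, n_harmonics=6):
    """Amplitudes of the first few Fourier harmonics of an evenly
    sampled lightcurve covering exactly one period."""
    n = len(mags)
    amps = []
    for k in range(1, n_harmonics + 1):
        a = sum(m * math.cos(2 * math.pi * k * i / n) for i, m in enumerate(mags))
        b = sum(m * math.sin(2 * math.pi * k * i / n) for i, m in enumerate(mags))
        amps.append(2 * math.sqrt(a * a + b * b) / n)
    return amps

# Synthetic double-peaked lightcurve: the 2nd harmonic dominates, as is
# typical for an elongated asteroid; a weak 1st harmonic is added.
n = 48
mags = [0.3 * math.cos(2 * math.pi * 2 * i / n)
        + 0.05 * math.cos(2 * math.pi * i / n) for i in range(n)]
amps = harmonic_amplitudes(mags)
dominant = amps.index(max(amps)) + 1
print(dominant, [round(a, 3) for a in amps])
```

    When the 2nd harmonic dominates, the rotation period equals twice the fitted fundamental of that harmonic; the study quantifies how often the 1st, 3rd, or 4th harmonic dominates instead.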

  14. Safety analysis of the advanced thermionic initiative reactor

    NASA Astrophysics Data System (ADS)

    Lee, Hsing H.; Klein, Andrew C.

    1995-01-01

    Previously, a detailed analysis was conducted to assess the technology developed for the Advanced Thermionic Initiative (ATI) reactor. This analysis included the development of an overall system design code capability and the improvement of analytical models necessary for the assessment of the use of single-cell thermionic fuel elements in a low power space nuclear reactor. The present analysis extends this effort to assess the nuclear criticality safety of the ATI reactor for various scenarios. The analysis discusses the efficacy of different methods of reactor control, such as control rods and control drums.

  15. GIS application on spatial landslide analysis using statistical based models

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis on Penang Island, Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.
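
    Of the three models, the frequency ratio is the simplest to illustrate; the slope-angle classes and cell counts below are invented:

```python
def frequency_ratio(class_pixels, landslide_pixels):
    """Frequency ratio for each class of a landslide-related factor:
    the class's share of landslide cells divided by its share of all
    cells. FR > 1 marks a landslide-prone class."""
    total = sum(class_pixels.values())
    total_ls = sum(landslide_pixels.values())
    return {c: (landslide_pixels[c] / total_ls) / (class_pixels[c] / total)
            for c in class_pixels}

# Hypothetical slope-angle classes (cell counts are illustrative).
cells = {"0-15 deg": 60000, "15-30 deg": 30000, ">30 deg": 10000}
slides = {"0-15 deg": 20, "15-30 deg": 60, ">30 deg": 120}
print(frequency_ratio(cells, slides))
```

    Summing each cell's frequency ratios over all ten factors would yield the susceptibility index that is then mapped.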

  16. Ordinary chondrites - Multivariate statistical analysis of trace element contents

    NASA Technical Reports Server (NTRS)

    Lipschutz, Michael E.; Samuels, Stephen M.

    1991-01-01

    The contents of mobile trace elements (Co, Au, Sb, Ga, Se, Rb, Cs, Te, Bi, Ag, In, Tl, Zn, and Cd) in Antarctic and non-Antarctic populations of H4-6 and L4-6 chondrites were compared using standard multivariate discriminant functions borrowed from linear discriminant analysis and logistic regression. A nonstandard randomization-simulation method was developed, making it possible to carry out probability assignments on a distribution-free basis. Compositional differences were found both between the Antarctic and non-Antarctic H4-6 chondrite populations and between two L4-6 chondrite populations. It is shown that, for various types of meteorites (in particular, for the H4-6 chondrites), the Antarctic/non-Antarctic compositional difference is due to preterrestrial differences in the genesis of their parent materials.
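
    A minimal sketch of the randomization-simulation idea: a distribution-free permutation test on the difference of group means. The trace-element values below are invented, not the meteorite data:

```python
import random
from statistics import mean

def permutation_p_value(a, b, n_perm=5000, seed=1):
    """Distribution-free two-sample test: p-value for the observed
    absolute difference in mean trace-element content, obtained by
    randomly relabeling the pooled observations."""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical ppb-level contents of one mobile element in two populations.
antarctic = [12.1, 10.8, 13.0, 11.5, 12.6, 11.9]
non_antarctic = [9.4, 8.7, 10.1, 9.9, 9.0, 8.5]
print(permutation_p_value(antarctic, non_antarctic))
```

    Because the null distribution is generated by relabeling the data themselves, no normality assumption is needed, which is the point of the distribution-free probability assignments.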

  17. Spectral reflectance of surface soils - A statistical analysis

    NASA Technical Reports Server (NTRS)

    Crouse, K. R.; Henninger, D. L.; Thompson, D. R.

    1983-01-01

    The relationship of the physical and chemical properties of soils to their spectral reflectance as measured at six wavebands of the Thematic Mapper (TM) aboard NASA's Landsat-4 satellite was examined. The results of performing regressions of over 20 soil properties on the six TM bands indicated that organic matter, water, clay, cation exchange capacity, and calcium were the properties most readily predicted from TM data. The middle infrared bands, bands 5 and 7, were the best bands for predicting soil properties, and the near infrared band, band 4, was nearly as good. Clustering 234 soil samples on the TM bands and characterizing the clusters on the basis of soil properties revealed several clear relationships between properties and reflectance. Discriminant analysis found organic matter, fine sand, base saturation, sand, extractable acidity, and water to be significant in discriminating among clusters.

  18. Machine processing for remotely acquired data. [using multivariate statistical analysis

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A.

    1974-01-01

    This paper is a general discussion of earth resources information systems which utilize airborne and spaceborne sensors. It points out that information may be derived by sensing and analyzing the spectral, spatial and temporal variations of electromagnetic fields emanating from the earth surface. After giving an overview of system organization, the two broad categories of system types are discussed. These are systems in which high-quality imagery is essential and those that are more numerically oriented. Sensors are also discussed with this categorization of systems in mind. The multispectral approach and pattern recognition are described as an example data analysis procedure for numerically oriented systems. The steps necessary in using a pattern recognition scheme are described and illustrated with data obtained from aircraft and the Earth Resources Technology Satellite (ERTS-1).

  19. Statistical analysis of large-scale neuronal recording data

    PubMed Central

    Reed, Jamie L.; Kaas, Jon H.

    2010-01-01

    Relating stimulus properties to the response properties of individual neurons and neuronal networks is a major goal of sensory research. Many investigators implant electrode arrays in multiple brain areas and record from chronically implanted electrodes over time to answer a variety of questions. Technical challenges related to analyzing large-scale neuronal recording data are not trivial. Several analysis methods traditionally used by neurophysiologists do not account for dependencies in the data that are inherent in multi-electrode recordings. In addition, when neurophysiological data are not best modeled by the normal distribution and when the variables of interest may not be linearly related, extensions of the linear modeling techniques are recommended. A variety of methods exist to analyze correlated data, even when data are not normally distributed and the relationships are nonlinear. Here we review expansions of the Generalized Linear Model designed to address these data properties. Such methods are used in other research fields, and the application to large-scale neuronal recording data will enable investigators to determine the variable properties that convincingly contribute to the variances in the observed neuronal measures. Standard measures of neuron properties such as response magnitudes can be analyzed using these methods, and measures of neuronal network activity such as spike timing correlations can be analyzed as well. We have done just that in recordings from 100-electrode arrays implanted in the primary somatosensory cortex of owl monkeys. Here we illustrate how one example method, Generalized Estimating Equations analysis, is a useful method to apply to large-scale neuronal recordings. PMID:20472395

  20. Statistical Analysis of Acoustic Wave Parameters Near Solar Active Regions

    NASA Astrophysics Data System (ADS)

    Rabello-Soares, M. Cristina; Bogart, Richard S.; Scherrer, Philip H.

    2016-08-01

    In order to quantify the influence of magnetic fields on acoustic mode parameters and flows in and around active regions, we analyze the differences in the parameters in magnetically quiet regions near an active region (which we call “nearby regions”), compared with those of quiet regions at the same disk locations for which there are no neighboring active regions. We also compare the mode parameters in active regions with those in comparably located quiet regions. Our analysis is based on ring-diagram analysis of all active regions observed by the Helioseismic and Magnetic Imager (HMI) over almost five years. We find that the frequency at which the mode amplitude changes from attenuation to amplification in the quiet nearby regions is around 4.2 mHz, in contrast to the active regions, for which it is about 5.1 mHz. This amplitude enhancement (the “acoustic halo effect”) is as large as that observed in the active regions, and has a very weak dependence on the wave propagation direction. The mode energy difference in nearby regions also changes from a deficit to an excess at around 4.2 mHz, but averages to zero over all modes. The frequency difference in nearby regions increases with increasing frequency until a point at which the frequency shifts turn over sharply, as in active regions. However, this turnover occurs around 4.9 mHz, which is significantly below the acoustic cutoff frequency. Inverting the horizontal flow parameters in the direction of the neighboring active regions, we find flows that are consistent with a model of the thermal energy flow being blocked directly below the active region.

  1. Methods of learning in statistical education: Design and analysis of a randomized trial

    NASA Astrophysics Data System (ADS)

    Boyd, Felicity Turner

    Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small-group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. The study outcome was performance, quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. After accounting for reported participation, expected examination scores were 2.6 points higher (of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus

  2. Introducing Statistics to Geography Students: The Case for Exploratory Data Analysis.

    ERIC Educational Resources Information Center

    Burn, Christopher R.; Fox, Michael F.

    1986-01-01

    Exploratory data analysis (EDA) gives students a feel for the data being considered. Four applications of EDA are discussed: the use of displays, resistant statistics, transformations, and smoothing. (RM)

  3. STATISTICAL METHODOLOGY FOR THE SIMULTANEOUS ANALYSIS OF MULTIPLE TYPES OF OUTCOMES IN NONLINEAR THRESHOLD MODELS.

    EPA Science Inventory

    Multiple outcomes are often measured on each experimental unit in toxicology experiments. These multiple observations typically imply the existence of correlation between endpoints, and a statistical analysis that incorporates this correlation may result in improved inference. When both disc...

  4. Advanced Mesh-Enabled Monte carlo capability for Multi-Physics Reactor Analysis

    SciTech Connect

    Wilson, Paul; Evans, Thomas; Tautges, Tim

    2012-12-24

    This project will accumulate high-precision fluxes throughout reactor geometry on a non-orthogonal grid of cells to support multi-physics coupling, in order to more accurately calculate parameters such as reactivity coefficients and to generate multi-group cross sections. This work will be based upon recent developments to incorporate advanced geometry and mesh capability in a modular Monte Carlo toolkit with computational science technology that is in use in related reactor simulation software development. Coupling this capability with production-scale Monte Carlo radiation transport codes can provide advanced and extensible test-beds for these developments. Continuous energy Monte Carlo methods are generally considered to be the most accurate computational tool for simulating radiation transport in complex geometries, particularly neutron transport in reactors. Nevertheless, there are several limitations for their use in reactor analysis. Most significantly, there is a trade-off between the fidelity of results in phase space, statistical accuracy, and the amount of computer time required for simulation. Consequently, to achieve an acceptable level of statistical convergence in high-fidelity results required for modern coupled multi-physics analysis, the required computer time makes Monte Carlo methods prohibitive for design iterations and detailed whole-core analysis. More subtly, the statistical uncertainty is typically not uniform throughout the domain, and the simulation quality is limited by the regions with the largest statistical uncertainty. In addition, the formulation of neutron scattering laws in continuous energy Monte Carlo methods makes it difficult to calculate adjoint neutron fluxes required to properly determine important reactivity parameters. Finally, most Monte Carlo codes available for reactor analysis have relied on orthogonal hexahedral grids for tallies that do not conform to the geometric boundaries and are thus generally not well

  5. Isolation and analysis of ginseng: advances and challenges

    PubMed Central

    Wang, Chong-Zhi

    2011-01-01

    Ginseng occupies a prominent position in the list of best-selling natural products in the world. Because of its complex constituents, multidisciplinary techniques are needed to validate the analytical methods that support ginseng’s use worldwide. In the past decade, rapid development of technology has advanced many aspects of ginseng research. The aim of this review is to illustrate the recent advances in the isolation and analysis of ginseng, and to highlight their new applications and challenges. Emphasis is placed on recent trends and emerging techniques. The current article reviews the literature between January 2000 and September 2010. PMID:21258738

  6. Orthogonal separations: Comparison of orthogonality metrics by statistical analysis.

    PubMed

    Schure, Mark R; Davis, Joe M

    2015-10-02

    Twenty orthogonality metrics (OMs) derived from convex hull, information theory, fractal dimension, correlation coefficients, nearest neighbor distances and bin-density techniques were calculated from a diverse group of 47 experimental two-dimensional (2D) chromatograms. These chromatograms comprise two datasets; one dataset is a collection of 2D chromatograms from Peter Carr's laboratory at the University of Minnesota, and the other dataset is based on pairs of one-dimensional chromatograms previously published by Martin Gilar and coworkers (Waters Corp.). The chromatograms were pooled to make a third or combined dataset. Cross-correlation results suggest that specific OMs are correlated within families of nearest neighbor methods, correlation coefficients and the information theory methods. Principal component analysis of the OMs shows that none of the OMs stands out as clearly better at explaining the data variance than any other OM. Principal component analysis of individual chromatograms shows that different OMs favor certain chromatograms. The chromatograms exhibit a range of quality, as subjectively graded by nine experts experienced in 2D chromatography. The subjective (grading) evaluations were taken at two intervals per expert and demonstrated excellent consistency for each expert. Excellent agreement for both very good and very bad chromatograms was seen across the range of experts. However, evaluation uncertainty increased for chromatograms that were judged as average to mediocre. The grades were converted to numbers (percentages) for numerical computations. The percentages were correlated with OMs to establish good OMs for evaluating the quality of 2D chromatograms. Certain metrics correlate better than others. However, these results are not consistent across all chromatograms examined. Most of the nearest neighbor methods were observed to correlate poorly with the percentages. However, one method, devised by Clark and Evans, appeared to work
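
    The Clark and Evans method the abstract singles out is a nearest-neighbor index. As a minimal sketch (the point data below are invented; the paper applies the idea to 2D chromatogram peak positions), it compares the observed mean nearest-neighbor distance to the value expected for a random pattern of the same density:

```python
import math

# Clark-Evans nearest-neighbor index for points in a region of known area.
# R ~ 1 for a random pattern, R < 1 for clustered, R > 1 for regular/dispersed.

def clark_evans(points, area=1.0):
    n = len(points)
    nn_sum = 0.0
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nn_sum += d
    observed = nn_sum / n
    expected = 0.5 / math.sqrt(n / area)  # mean NN distance under complete spatial randomness
    return observed / expected
```

    A regular grid of peaks gives R well above 1, while peaks piled into one corner give R near 0, which is why the index can grade how well a 2D separation spreads peaks out.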

  7. Exposure to Alcoholism in the Family: United States, 1988. Advance Data from Vital and Health Statistics of the National Center for Health Statistics. Number 205.

    ERIC Educational Resources Information Center

    Schoenborn, Charlotte A.

    This report is based on data from the 1988 National Health Interview Survey on Alcohol (NHIS-Alcohol), part of the ongoing National Health Interview Survey conducted by the National Center for Health Statistics. Interviews for the NHIS are conducted in person by staff of the United States Bureau of the Census. Information is collected on each…

  8. Compressed ultrasound video image-quality evaluation using a Likert scale and Kappa statistical analysis

    NASA Astrophysics Data System (ADS)

    Stewart, Brent K.; Carter, Stephen J.; Langer, Steven G.; Andrew, Rex K.

    1998-06-01

    Experiments using NASA's Advanced Communications Technology Satellite were conducted to provide an estimate of the compressed video quality required for preservation of clinically relevant features for the detection of trauma. Bandwidth rates of 128, 256 and 384 kbps were used. A five-point Likert scale (1 = no useful information, 5 = good diagnostic quality) was used for a subjective preference questionnaire to evaluate the quality of the compressed ultrasound imagery at the three compression rates for several anatomical regions of interest. At 384 kbps the Likert scores (mean ± SD) were abdomen (4.45 ± 0.71), carotid artery (4.70 ± 0.36), kidney (5.0 ± 0.0), liver (4.67 ± 0.58) and thyroid (4.03 ± 0.74). Due to the volatile nature of the H.320 compressed digital video stream, no statistically significant results can be derived through this methodology. As the MPEG standard has at its roots many of the same intraframe and motion vector compression algorithms as the H.261 (such as that used in the previous ACTS/AMT experiments), we are using the MPEG compressed video sequences to best gauge what minimum bandwidths are necessary for preservation of clinically relevant features for the detection of trauma. We have been using an MPEG codec board to collect losslessly compressed video clips from high quality S-VHS tapes and through direct digitization of S-video. Due to the large number of videoclips and questions to be presented to the radiologists and for ease of application, we have developed a web browser interface for this video visual perception study. Due to the large numbers of observations required to reach statistical significance in most ROC studies, Kappa statistical analysis is used to analyze the degree of agreement between observers and between viewing assessment. If the degree of agreement amongst readers is high, then there is a possibility that the ratings (i
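
    The Kappa agreement analysis mentioned above is, in its two-rater form, Cohen's kappa: chance-corrected agreement between two observers. A minimal sketch (the ratings below are hypothetical, not the study's data):

```python
# Cohen's kappa for two raters: observed agreement corrected for the
# agreement expected by chance from each rater's marginal frequencies.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    cats = sorted(set(rater_a) | set(rater_b))
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in cats)
    return (observed - expected) / (1 - expected)
```

    Kappa is 1 for perfect agreement, 0 when agreement is no better than chance, and negative when raters agree less often than chance would predict.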

  9. Meta-analysis for Discovering Rare-Variant Associations: Statistical Methods and Software Programs.

    PubMed

    Tang, Zheng-Zheng; Lin, Dan-Yu

    2015-07-02

    There is heightened interest in using next-generation sequencing technologies to identify rare variants that influence complex human diseases and traits. Meta-analysis is essential to this endeavor because large sample sizes are required for detecting associations with rare variants. In this article, we provide a comprehensive overview of statistical methods for meta-analysis of sequencing studies for discovering rare-variant associations. Specifically, we discuss the calculation of relevant summary statistics from participating studies, the construction of gene-level association tests, the choice of transformation for quantitative traits, the use of fixed-effects versus random-effects models, and the removal of shadow association signals through conditional analysis. We also show that meta-analysis based on properly calculated summary statistics is as powerful as joint analysis of individual-participant data. In addition, we demonstrate the performance of different meta-analysis methods by using both simulated and empirical data. We then compare four major software packages for meta-analysis of rare-variant associations-MASS, RAREMETAL, MetaSKAT, and seqMeta-in terms of the underlying statistical methodology, analysis pipeline, and software interface. Finally, we present PreMeta, a software interface that integrates the four meta-analysis packages and allows a consortium to combine otherwise incompatible summary statistics.
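
    The fixed-effects side of the methods surveyed above reduces, in its simplest form, to inverse-variance weighting of per-study effect estimates. A hedged sketch (the effect sizes and standard errors below are made up for illustration):

```python
import math

# Fixed-effects inverse-variance meta-analysis: pool per-study effect
# estimates (betas) and their standard errors into one combined estimate.

def fixed_effects_meta(betas, ses):
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se
```

    With equal standard errors the pooled estimate is the simple mean, and the pooled standard error is smaller than any single study's, which is the sense in which meta-analysis of proper summary statistics matches joint analysis of individual-participant data.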

  10. Wheat signature modeling and analysis for improved training statistics

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Malila, W. A.; Cicone, R. C.; Gleason, J. M.

    1976-01-01

    The author has identified the following significant results. The spectral, spatial, and temporal characteristics of wheat and other signatures in LANDSAT multispectral scanner data were examined through empirical analysis and simulation. Irrigation patterns varied widely within Kansas; 88 percent of wheat acreage in Finney was irrigated and 24 percent in Morton, as opposed to less than 3 percent for the western two-thirds of the state. The irrigation practice was definitely correlated with the observed spectral response; wheat variety differences produced observable spectral differences due to leaf coloration and different dates of maturation. Between-field differences were generally greater than within-field differences, and boundary pixels produced spectral features distinct from those within field centers. Multiclass boundary pixels contributed much of the observed bias in proportion estimates. The variability between signatures obtained by different draws of training data decreased as the sample size became larger; also, the resulting signatures became more robust and the particular decision threshold value became less important.

  11. A Statistical Analysis of the Determinants of Naval Flight Officer Training Attrition

    DTIC Science & Technology

    1998-03-01

    variables utilized in the model include commissioning source, race, and undergraduate major. The statistical analysis sought to determine the effect of each...of these demographic factors on the probability of attrition by reason. The results show that commissioning source has a significant effect on...

  12. Analysis/forecast experiments with a multivariate statistical analysis scheme using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Nestler, M. S.

    1985-01-01

    A three-dimensional, multivariate statistical analysis method, optimal interpolation (OI), is described for modeling meteorological data from widely dispersed sites. The model was developed to analyze FGGE data at the NASA-Goddard Laboratory of Atmospherics. The model features a multivariate surface analysis over the oceans, including maintenance of the Ekman balance and a geographically dependent correlation function. Preliminary comparisons are made between the OI model and similar schemes employed at the European Centre for Medium-Range Weather Forecasts and the National Meteorological Center. The OI scheme is used to provide input to a GCM, and model error correlations are calculated for forecasts of 500 mb vertical water mixing ratios and the wind profiles. Comparisons are made between the predictions and measured data. The model is shown to be as accurate as a successive corrections model out to 4.5 days.
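
    The core of optimal interpolation can be sketched in the scalar case (an assumption for illustration; the paper's scheme is three-dimensional and multivariate): the analysis blends a background forecast with an observation, weighted by their respective error variances.

```python
# One-point optimal-interpolation update. The gain plays the role of the
# Kalman weight: it pulls the background toward the observation in
# proportion to how uncertain the background is relative to the observation.

def oi_update(background, obs, var_b, var_o):
    gain = var_b / (var_b + var_o)
    analysis = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b  # analysis error variance shrinks
    return analysis, var_a
```

    With equal error variances the analysis is the average of forecast and observation, and its error variance is halved.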

  13. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    SciTech Connect

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Sorensen, Daniel N.; Remmers, Daniel L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; Hsu, Peter C.; Reynolds, John G.

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.

  14. Processing and statistical analysis of soil-root images

    NASA Astrophysics Data System (ADS)

    Razavi, Bahar S.; Hoang, Duyen; Kuzyakov, Yakov

    2016-04-01

    The importance of hotspots such as the rhizosphere, the small soil volume that surrounds and is influenced by plant roots, calls for spatially explicit methods to visualize the distribution of microbial activities in this active site (Kuzyakov and Blagodatskaya, 2015). The zymography technique has previously been adapted to visualize the spatial dynamics of enzyme activities in the rhizosphere (Spohn and Kuzyakov, 2014). Following further development of soil zymography, to obtain a higher resolution of enzyme activities, we aimed to 1) quantify the images, and 2) determine whether the pattern (e.g. distribution of hotspots in space) is clumped (aggregated) or regular (dispersed). To this end, we incubated soil-filled rhizoboxes with maize (Zea mays L.) and without maize (control box) for two weeks. In situ soil zymography was applied to visualize the enzymatic activity of β-glucosidase and phosphatase at the soil-root interface. The spatial resolution of the fluorescent images was improved by direct application of a substrate-saturated membrane to the soil-root system. Furthermore, we applied spatial point pattern analysis to determine whether the distribution of hotspots in space is clumped (aggregated) or regular (dispersed). Our results demonstrated that the distribution of hotspots in the rhizosphere is clumped (aggregated), compared to the control box without a plant, which showed a regular (dispersed) pattern. These patterns were similar in all three replicates and for both enzymes. We conclude that improved zymography is a promising in situ technique to identify, analyze, visualize and quantify the spatial distribution of enzyme activities in the rhizosphere. Moreover, such different patterns should be considered in assessments and modeling of rhizosphere extension and the corresponding effects on soil properties and functions. Key words: rhizosphere, spatial point pattern, enzyme activity, zymography, maize.
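
    One standard way to classify a point pattern as clumped versus regular (a generic technique offered here as illustration, not necessarily the test the authors used) is the variance-to-mean ratio of quadrat counts:

```python
# Index of dispersion for quadrat counts of hotspots.
# Ratio > 1 suggests a clumped (aggregated) pattern, < 1 a regular
# (dispersed) one, and ~1 a random (Poisson-like) pattern.

def dispersion_index(quadrat_counts):
    n = len(quadrat_counts)
    mean = sum(quadrat_counts) / n
    var = sum((c - mean) ** 2 for c in quadrat_counts) / (n - 1)
    return var / mean
```

    Hotspots piled into a few quadrats inflate the variance far above the mean, while evenly spread counts pull the ratio below 1.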

  15. Advanced Post-Irradiation Examination Capabilities Alternatives Analysis Report

    SciTech Connect

    Jeff Bryan; Bill Landman; Porter Hill

    2012-12-01

    An alternatives analysis was performed for the Advanced Post-Irradiation Capabilities (APIEC) project in accordance with the U.S. Department of Energy (DOE) Order DOE O 413.3B, “Program and Project Management for the Acquisition of Capital Assets”. The Alternatives Analysis considered six major alternatives: (1) No Action; (2) Modify Existing DOE Facilities, with capabilities distributed among multiple locations; (3) Modify Existing DOE Facilities, with capabilities consolidated at a few locations; (4) Construct New Facility; (5) Commercial Partnership; and (6) International Partnerships. Based on the alternatives analysis documented herein, it is recommended to DOE that the advanced post-irradiation examination capabilities be provided by a new facility constructed at the Materials and Fuels Complex at the Idaho National Laboratory.

  16. "ATLAS" Advanced Technology Life-cycle Analysis System

    NASA Technical Reports Server (NTRS)

    Lollar, Louis F.; Mankins, John C.; ONeil, Daniel A.

    2004-01-01

    Making good decisions concerning research and development portfolios, and concerning the best systems concepts to pursue, as early as possible in the life cycle of advanced technologies is a key goal of R&D management. This goal depends upon the effective integration of information from a wide variety of sources, as well as focused, high-level analyses intended to inform such decisions. The presentation provides a summary of the Advanced Technology Life-cycle Analysis System (ATLAS) methodology and tool kit. ATLAS encompasses a wide range of methods and tools. A key foundation for ATLAS is the NASA-created Technology Readiness Level (TRL) system. The toolkit is largely spreadsheet based (as of August 2003). This product is being funded by the Human and Robotics Technology Program Office, Office of Exploration Systems, NASA Headquarters, Washington D.C., and is being integrated by Dan O'Neil of the Advanced Projects Office, NASA/MSFC, Huntsville, AL.

  17. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis has been developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is being performed manually, which requires immense man-hours with extensive human interface. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to the fully automated coherence analysis capability for anomaly detection and identification which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.

  18. Signal analysis applications of nonlinear dynamics and higher-order statistics

    NASA Astrophysics Data System (ADS)

    Solinsky, James C.; Feeney, John J.

    1994-03-01

    The use of higher-order statistics (HOS) in acoustic, and financial signal analysis applications is outlined in theory and followed with specific data examples. HOS analysis is used to identify data regions of interest, and nonlinear dynamics (ND) analysis is used in a 4D embedded space to show structural density changes resulting from the HOS regions. A second-order statistical comparison is made with the same data processed to have random Fourier phase, since the HOS information is contained in this nonrandom phase. These empirical results indicate that HOS data regions are structural distortions to a second-order planar disk in the 4D ND analysis space.
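
    The full HOS machinery (bispectra and their phase structure) is beyond a short sketch, but the simplest members of the higher-order-statistics family are the third and fourth standardized moments; as a hedged illustration (data invented), departures from zero flag the non-Gaussian structure that second-order statistics miss:

```python
# Skewness (third standardized moment) and excess kurtosis (fourth,
# minus 3): both are ~0 for Gaussian data, so nonzero values indicate
# the kind of higher-order structure HOS analysis targets.

def higher_order_moments(x):
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    sd = var ** 0.5
    skew = sum(((v - mean) / sd) ** 3 for v in x) / n
    kurt = sum(((v - mean) / sd) ** 4 for v in x) / n - 3.0
    return skew, kurt
```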

  19. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package based on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) predictions and experimental results. The SEA predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  20. Sequential Structural and Fluid Dynamics Analysis of Balloon-Expandable Coronary Stents: A Multivariable Statistical Analysis.

    PubMed

    Martin, David; Boyle, Fergal

    2015-09-01

    Several clinical studies have identified a strong correlation between neointimal hyperplasia following coronary stent deployment and both stent-induced arterial injury and altered vessel hemodynamics. As such, the sequential structural and fluid dynamics analysis of balloon-expandable stent deployment should provide a comprehensive indication of stent performance. Despite this observation, very few numerical studies of balloon-expandable coronary stents have considered both the mechanical and hemodynamic impact of stent deployment. Furthermore, in the few studies that have considered both phenomena, only a small number of stents have been considered. In this study, a sequential structural and fluid dynamics analysis methodology was employed to compare both the mechanical and hemodynamic impact of six balloon-expandable coronary stents. To investigate the relationship between stent design and performance, several common stent design properties were identified and the dependence between these properties and both the mechanical and hemodynamic variables of interest was evaluated using statistical measures of correlation. Following the completion of the numerical analyses, stent strut thickness was identified as the only common design property that demonstrated a strong dependence with either the mean equivalent stress predicted in the artery wall or the mean relative residence time predicted on the luminal surface of the artery. These results corroborate the findings of the large-scale ISAR-STEREO clinical studies and highlight the crucial role of strut thickness in coronary stent design. The sequential structural and fluid dynamics analysis methodology and the multivariable statistical treatment of the results described in this study should prove useful in the design of future balloon-expandable coronary stents.
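
    A "statistical measure of correlation" between a design property and an outcome can be sketched with Spearman's rank correlation (an assumption for illustration; the paper does not specify which measure it used, and the data below are invented):

```python
# Spearman rank correlation (no-ties formula): +1 for a perfectly
# monotone increasing relationship, -1 for monotone decreasing.

def rank(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman(x, y):
    n = len(x)
    rx, ry = rank(x), rank(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))
```

    A rank-based measure is a reasonable choice here because it captures monotone dependence (e.g. thicker struts, higher wall stress) without assuming linearity.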

  1. Numerical analysis of the V-Y shaped advancement flap.

    PubMed

    Remache, D; Chambert, J; Pauchot, J; Jacquet, E

    2015-10-01

    The V-Y advancement flap is a common technique for the closure of skin defects. A triangular flap is incised adjacent to a skin defect of rectangular shape. As the flap is advanced to close the initial defect, two smaller defects in the shape of a parallelogram are formed with respect to a reflection symmetry. The height of the defects depends on the apex angle of the flap, and the closure effort is related to this height. Andrades et al. (2005) performed a geometrical analysis of the V-Y flap technique in order to reach a compromise between the flap size and the defect width. However, the geometrical approach does not consider the mechanical properties of the skin. The present analysis, based on the finite element method, is proposed as a complement to the geometrical one. This analysis aims to highlight the major role of skin elasticity in a full analysis of the V-Y advancement flap. Furthermore, the study of this technique shows that closing at the flap apex seems mechanically the most interesting step. Thus, different strategies of defect closure at the flap apex, stemming from surgeons' know-how, have been tested by numerical simulations.

  2. Feasibility of voxel-based statistical analysis method for myocardial PET

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Paik, Chang H.; Kim, Kyeong Min; Moo Lim, Sang

    2014-09-01

    Although statistical parametric mapping (SPM) analysis is widely used in neuroimaging studies, to the best of our knowledge it had not been applied to myocardial PET data analysis. In this study, we developed a voxel-based statistical analysis method for myocardial PET which provides statistical comparison results between groups in image space. PET emission data of normal and myocardial infarction rats were acquired. For the SPM analysis, a rat heart template was created. In addition, individual PET data were spatially normalized and smoothed. Two-sample t-tests were performed to identify the myocardial infarct region. The developed SPM method was compared with conventional ROI methods. Myocardial glucose metabolism was decreased in the lateral wall of the left ventricle. In the ROI analysis, the mean value of the lateral wall was decreased by 29%. The newly developed SPM method for myocardial PET could provide quantitative information in myocardial PET studies.
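
    The voxel-wise two-sample t-test at the heart of the method can be sketched as follows (a simplified illustration with invented two-voxel "images"; a real SPM pipeline adds spatial normalization, smoothing, and multiple-comparison control):

```python
# Pooled-variance two-sample t statistic, applied independently at
# every voxel: each voxel gets its own t value comparing the two groups.

def two_sample_t(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / (pooled * (1 / na + 1 / nb)) ** 0.5

def voxelwise_t(group_a, group_b):
    """group_a/group_b: lists of images, each image a list of voxel values."""
    n_vox = len(group_a[0])
    return [two_sample_t([img[v] for img in group_a],
                         [img[v] for img in group_b])
            for v in range(n_vox)]
```

    Voxels where the groups differ (e.g. an infarct region with reduced uptake) get large |t| values, while unaffected voxels stay near zero.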

  3. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    PubMed

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.
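
    The nonparametric permutation testing mentioned above can be sketched in miniature (a toy version, assuming a mean-difference statistic and label shuffling; FieldTrip's implementation adds cluster-based correction across channels and time):

```python
import random

# Permutation test: shuffle group labels many times and ask how often a
# shuffled mean difference is at least as large as the observed one.

def permutation_p(a, b, n_perm=2000, seed=0):
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # small-sample-safe p estimate
```

    Because the null distribution is built from the data itself, no Gaussian assumption is needed, which is what makes the approach attractive for channel- and source-level statistics.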

  4. FieldTrip: Open Source Software for Advanced Analysis of MEG, EEG, and Invasive Electrophysiological Data

    PubMed Central

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages. PMID:21253357

  5. Advances in Mid-Infrared Spectroscopy for Chemical Analysis

    NASA Astrophysics Data System (ADS)

    Haas, Julian; Mizaikoff, Boris

    2016-06-01

    Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.

  6. Performance analysis of morphological component analysis (MCA) method for mammograms using some statistical features

    NASA Astrophysics Data System (ADS)

    Gardezi, Syed Jamal Safdar; Faye, Ibrahima; Kamel, Nidal; Eltoukhy, Mohamed Meselhy; Hussain, Muhammad

    2014-10-01

    Early detection of breast cancer helps reduce mortality rates. Mammography is a very useful tool in breast cancer detection, but it is very difficult to separate different morphological features in mammographic images. In this study, the Morphological Component Analysis (MCA) method is used to extract different morphological aspects of mammographic images by effectively preserving the morphological characteristics of regions. MCA decomposes the mammogram into a piecewise-smooth part and a texture part using the Local Discrete Cosine Transform (LDCT) and the Curvelet Transform via wrapping (CURVwrap). In this study, a simple performance comparison was made using some statistical features for the original image versus the piecewise-smooth part obtained from the MCA decomposition. The results show that MCA suppresses the structural noise and blood vessels in the mammogram and enhances the performance of mass detection.

  7. [Near infrared spectroscopy and multivariate statistical process analysis for real-time monitoring of production process].

    PubMed

    Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Zou, Quan; Wang, Jun; Tu, Jia-Run; Cai, Wen-Sheng; Shao, Xue-Guang

    2013-05-01

    Near infrared diffuse reflectance spectroscopy has been applied in on-site and on-line analysis owing to its speed, non-destructiveness, and feasibility for real, complex samples. The present work reports a real-time monitoring method for industrial production that combines near infrared spectroscopy with multivariate statistical process analysis. In the method, near infrared spectra of the materials are collected in real time on the production line, and the production process is then evaluated through the Hotelling T2 statistic calculated from an established model. Principal component analysis (PCA) is adopted for building the model, and the statistic is computed by projecting the real-time spectra onto the PCA model. An application of the method to a practical production process demonstrated that variations in production can be evaluated in real time by tracking changes in the statistic, and that products from different batches can be compared through further statistical summaries of the T2 values. The proposed method may therefore provide a practical means of quality assurance for production processes.
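    A minimal sketch of this monitoring scheme, assuming nothing about the actual spectra: fit PCA on in-control training data, then compute Hotelling's T2 for a new spectrum from its scores. Function names, dimensions, and the simulated data are illustrative assumptions:

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit a PCA model (mean, loadings, score variances) on in-control spectra."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    P = Vt[:n_components].T                       # loadings, shape (p, k)
    var = (s[:n_components] ** 2) / (len(X) - 1)  # variance of each score
    return mu, P, var

def hotelling_t2(x, mu, P, var):
    """Project a new spectrum onto the PCA model and compute Hotelling's T2."""
    t = (x - mu) @ P                              # scores of the new spectrum
    return float(np.sum(t ** 2 / var))

rng = np.random.default_rng(1)
train = rng.standard_normal((200, 50))            # simulated in-control "spectra"
mu, P, var = fit_pca(train, n_components=3)
t2_normal = hotelling_t2(train[0], mu, P, var)
# A spectrum shifted strongly along the first loading should alarm:
t2_fault = hotelling_t2(train[0] + 50 * P[:, 0], mu, P, var)
```

    In practice the alarm threshold for T2 would come from the F distribution or from the empirical distribution of the training scores.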

  8. Advanced three-dimensional dynamic analysis by boundary element methods

    NASA Technical Reports Server (NTRS)

    Banerjee, P. K.; Ahmad, S.

    1985-01-01

    Advanced formulations of boundary element method for periodic, transient transform domain and transient time domain solution of three-dimensional solids have been implemented using a family of isoparametric boundary elements. The necessary numerical integration techniques as well as the various solution algorithms are described. The developed analysis has been incorporated in a fully general purpose computer program BEST3D which can handle up to 10 subregions. A number of numerical examples are presented to demonstrate the accuracy of the dynamic analyses.

  9. Advanced superposition methods for high speed turbopump vibration analysis

    NASA Technical Reports Server (NTRS)

    Nielson, C. E.; Campany, A. D.

    1981-01-01

    The small, high-pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high-speed vibration at an operating speed of 92,400 rpm, which approaches the design-point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage, and subsequent further analysis of the rotor-only dynamics, failed to predict the vibration characteristics found during testing. An advanced procedure for dynamic analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques into a complete turbopump model from which dynamic characteristics are determined. The results of the dynamic testing and analysis are presented, and methods of moving the high-speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.

  10. [Application of multivariate statistical analysis and thinking in quality control of Chinese medicine].

    PubMed

    Liu, Na; Li, Jun; Li, Bao-Guo

    2014-11-01

    The quality control of Chinese medicine has always been a hot and difficult topic in the development of traditional Chinese medicine (TCM), and it is one of the key problems restricting the modernization and internationalization of Chinese medicine. Multivariate statistical analysis is an analytical approach well suited to the characteristics of TCM and has been widely used in studies of TCM quality control. It is applied to the multiple indicators and variables that arise in quality-control studies and that are correlated with one another, in order to uncover hidden laws or relationships in the data, which can then serve decision-making and enable effective quality evaluation of TCM. In this paper, the application of multivariate statistical analysis in the quality control of Chinese medicine is summarized, providing a basis for further study.

  11. Airwaves and Microblogs: A Statistical Analysis of Al-Shabaab’s Propaganda Effectiveness

    DTIC Science & Technology

    2014-12-01

    Somalia, Westgate, Kismayo, propaganda, jihad, ideology, data analysis, statistical analysis, counterterrorism, counter violent extremist messaging...Somalia that challenge U.S. interests.11 As with many violent organizations, al-Shabaab has made extensive use of the information environment as...Not all Radicals are the Same: Implications for Counter-Radicalization Strategy,” in Countering Violent Extremism, Scientific Methods & Strategies

  12. Use of the Jackknife Statistic To Establish the External Validity of Discriminant Analysis Results.

    ERIC Educational Resources Information Center

    Daniel, Larry G.

    It is argued that the jackknife technique is superior to traditional techniques for assessing the external validity of statistical results from discriminant analysis. Traditional approaches assessed include: (1) the empirical method, in which the discriminant function coefficients (DFCs) obtained in a given analysis are applied to predict group…
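    The jackknife idea can be sketched as follows, using a simple nearest-group-mean rule as an illustrative stand-in for a full discriminant analysis (this is not the article's procedure): each case is classified by a rule estimated from the remaining n-1 cases, which avoids the optimistic bias of reclassifying the same cases used to fit the rule.

```python
import numpy as np

def nearest_mean_classify(x, means):
    """Assign x to the group with the closest mean (a minimal discriminant rule)."""
    return int(np.argmin([np.linalg.norm(x - m) for m in means]))

def jackknife_hit_rate(X, y):
    """Leave-one-out (jackknife) classification rate: each case is held out,
    group means are re-estimated from the other n-1 cases, and the held-out
    case is then classified."""
    hits = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        means = [X[mask & (y == g)].mean(axis=0) for g in np.unique(y)]
        hits += nearest_mean_classify(X[i], means) == y[i]
    return hits / len(X)

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(3, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
rate = jackknife_hit_rate(X, y)    # external-validity estimate of the hit rate
```

    For well-separated groups like these, the jackknifed rate stays high; for overlapping groups it drops well below the resubstitution rate, which is exactly the bias the jackknife exposes.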

  13. Analysis of Variance with Summary Statistics in Microsoft® Excel®

    ERIC Educational Resources Information Center

    Larson, David A.; Hsu, Ko-Cheng

    2010-01-01

    Students regularly are asked to solve Single Factor Analysis of Variance problems given only the sample summary statistics (number of observations per category, category means, and corresponding category standard deviations). Most undergraduate students today use Excel for data analysis of this type. However, Excel, like all other statistical…
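    The computation the article has students perform in Excel can be sketched in Python from the same three summary statistics per group (function and variable names here are illustrative, not from the article):

```python
def anova_from_summary(ns, means, sds):
    """One-way ANOVA F statistic computed only from per-group
    sample size, mean, and standard deviation."""
    k = len(ns)
    N = sum(ns)
    grand = sum(n * m for n, m in zip(ns, means)) / N          # grand mean
    ss_between = sum(n * (m - grand) ** 2 for n, m in zip(ns, means))
    ss_within = sum((n - 1) * s ** 2 for n, s in zip(ns, sds))  # pooled SS
    df_b, df_w = k - 1, N - k
    F = (ss_between / df_b) / (ss_within / df_w)
    return F, df_b, df_w

F, df_b, df_w = anova_from_summary(
    ns=[10, 10, 10], means=[5.0, 6.0, 7.0], sds=[1.0, 1.2, 0.9]
)
```

    The within-group sum of squares is recovered from each category's standard deviation via (n - 1) s^2, which is why the raw observations are not needed.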

  14. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploiting structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents several advances in the computational stability analysis of composite aerospace structures that contribute to this field. For stringer-stiffened panels, the main results of the completed EU project COCOMAT are given; the project investigated the exploitation of reserves in primary fibre composite fuselage structures through accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  15. Analysis of advanced solid rocket motor ignition phenomena

    NASA Astrophysics Data System (ADS)

    Foster, Winfred A., Jr.; Jenkins, Rhonald M.

    1995-07-01

    This report presents the results obtained from an experimental analysis of the flow field in the slots of the star grain section in the head-end of the advanced solid rocket motor during the ignition transient. This work represents an extension of the previous tests and analysis to include the effects of using a center port in conjunction with multiple canted igniter ports. The flow field measurements include oil smear data on the star slot walls, pressure and heat transfer coefficient measurements on the star slot walls and velocity measurements in the star slot.

  16. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, Theodore H. H.

    1991-01-01

    The following tasks in the study of advanced stress analysis methods applicable to turbine engine structures are described: (1) construction of special elements which contain traction-free circular boundaries; (2) formulation of new versions of mixed variational principles and of hybrid stress elements; (3) establishment of methods for suppression of kinematic deformation modes; (4) construction of semi-Loof plate and shell elements by the assumed-stress hybrid method; and (5) elastic-plastic analysis by viscoplasticity theory using the mechanical subelement model.

  17. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  18. Structural Configuration Systems Analysis for Advanced Aircraft Fuselage Concepts

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Welstead, Jason R.; Quinlan, Jesse R.; Guynn, Mark D.

    2016-01-01

    Structural configuration analysis of an advanced aircraft fuselage concept is investigated. This concept is characterized by a double-bubble section fuselage with rear mounted engines. Based on lessons learned from structural systems analysis of unconventional aircraft, high-fidelity finite-element models (FEM) are developed for evaluating structural performance of three double-bubble section configurations. Structural sizing and stress analysis are applied for design improvement and weight reduction. Among the three double-bubble configurations, the double-D cross-section fuselage design was found to have a relatively lower structural weight. The structural FEM weights of these three double-bubble fuselage section concepts are also compared with several cylindrical fuselage models. Since these fuselage concepts are different in size, shape and material, the fuselage structural FEM weights are normalized by the corresponding passenger floor area for a relative comparison. This structural systems analysis indicates that an advanced composite double-D section fuselage may have a relative structural weight ratio advantage over a conventional aluminum fuselage. Ten commercial and conceptual aircraft fuselage structural weight estimates, which are empirically derived from the corresponding maximum takeoff gross weight, are also presented and compared with the FEM-based estimates for possible correlation. A conceptual full vehicle FEM model with a double-D fuselage is also developed for preliminary structural analysis and weight estimation.

  19. Statistical-fluctuation analysis for quantum key distribution with consideration of after-pulse contributions

    NASA Astrophysics Data System (ADS)

    Li, Hongxin; Jiang, Haodong; Gao, Ming; Ma, Zhi; Ma, Chuangui; Wang, Wei

    2015-12-01

    The statistical fluctuation problem is a critical factor in all quantum key distribution (QKD) protocols under finite-key conditions. Current statistical fluctuation analysis is mainly based on independent random samples; however, this precondition cannot always be satisfied, owing to different choices of samples and actual parameters. As a result, proper statistical fluctuation methods are required to solve this problem. Taking after-pulse contributions into consideration, this paper gives the expression for the secure key rate and the mathematical model for statistical fluctuations, focusing on a decoy-state QKD protocol [Z.-C. Wei et al., Sci. Rep. 3, 2453 (2013), 10.1038/srep02453] with a biased basis choice. On this basis, a classified analysis of statistical fluctuation is presented according to the mutual relationship between random samples. First, for independent identical relations, a deviation comparison is made between the law of large numbers and standard error analysis. Second, a sufficient condition is given under which the Chernoff bound achieves a better result than Hoeffding's inequality when only independence is assumed. Third, by constructing a proper martingale, a stringent way is proposed to handle dependent random samples by making use of Azuma's inequality. In the numerical optimization, the impact on the secure key rate, the comparison of secure key rates, and the respective deviations under the various kinds of statistical fluctuation analysis are depicted.
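    The Chernoff-versus-Hoeffding point can be illustrated numerically: for rare events (small observed rates, as with error counts in finite-key QKD), the multiplicative Chernoff bound yields a much tighter absolute deviation than the two-sided Hoeffding bound at the same failure probability. The specific bound forms and parameter values below are textbook illustrations, not taken from the paper:

```python
import math

def hoeffding_dev(n, eps):
    """Absolute deviation t with P(|X/n - p| >= t) <= eps,
    from the two-sided Hoeffding inequality 2*exp(-2*n*t^2) <= eps."""
    return math.sqrt(math.log(2.0 / eps) / (2.0 * n))

def chernoff_dev(n, p, eps):
    """Absolute deviation from the multiplicative Chernoff upper-tail bound
    P(X >= (1 + delta) * mu) <= exp(-delta^2 * mu / 3), with mu = n * p."""
    mu = n * p
    delta = math.sqrt(3.0 * math.log(1.0 / eps) / mu)
    return delta * p

n, eps = 10**8, 1e-10    # pulses sent, allowed failure probability
p_rare = 1e-5            # rare-event rate, e.g. an error count fraction
h = hoeffding_dev(n, eps)
c = chernoff_dev(n, p_rare, eps)
```

    Because Hoeffding's deviation is independent of p while Chernoff's scales with sqrt(p), the Chernoff bound wins by orders of magnitude once p is small, which is the regime relevant to decoy-state parameter estimation.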

  20. The linear statistical d.c. model of GaAs MESFET using factor analysis

    NASA Astrophysics Data System (ADS)

    Dobrzanski, Lech

    1995-02-01

    The linear statistical model of the GaAs MESFET's current generator is obtained by means of factor analysis. Three different MESFET deterministic models are taken into account in the analysis: the Statz model (ST), the Materka-type model (MT) and a new proprietary model of MESFET with implanted channel (PLD). It is shown that statistical models obtained using factor analysis provide excellent generation of the multidimensional random variable representing the drain current of the MESFET. The method of implementing the statistical model in the SPICE program is presented. It is proved that, for a strongly limited number of Monte Carlo analysis runs in that program, the statistical models considered in each case (ST, MT and PLD) enable good reconstruction of the empirical factor structure. The empirical correlation matrix of model parameters is not reconstructed exactly by statistical modelling, but the values of the correlation matrix elements obtained from simulated data lie within the confidence intervals for the small sample. This paper proves that a formal approach to statistical modelling using factor analysis is the right path to follow, in spite of the fact that CAD systems (PSpice[MicroSim Corp.], Microwave Harmonica[Compact Software]) are not properly designed for generation of the multidimensional random variable. It is obvious that further progress in the implementation of statistical methods in CAD software is required. Furthermore, a new approach to the MESFET's d.c. model is presented. The separate functions describing the linear and the saturated regions of the MESFET output characteristics are combined in a single equation. This way of modelling is particularly suitable for transistors with an implanted channel.
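    The core of such a statistical model, generating a multidimensional random variable with a prescribed correlation structure from factor loadings, can be sketched as follows. The loadings matrix is a made-up example, not the paper's MESFET model:

```python
import numpy as np

def factor_model_samples(loadings, psi, n, rng):
    """Generate correlated parameter vectors x = L f + e from a factor model:
    common factors f ~ N(0, I) and independent specific errors with
    variances psi (the diagonal 'uniqueness' terms)."""
    p, k = loadings.shape
    f = rng.standard_normal((n, k))
    e = rng.standard_normal((n, p)) * np.sqrt(psi)
    return f @ loadings.T + e

rng = np.random.default_rng(3)
L = np.array([[0.9, 0.0],          # hypothetical loadings: four parameters
              [0.8, 0.3],          # driven by two common factors
              [0.1, 0.9],
              [0.0, 0.8]])
psi = 1.0 - (L ** 2).sum(axis=1)   # unit total variance for each parameter
X = factor_model_samples(L, psi, 50000, rng)

# With unit variances, the implied covariance L L^T + diag(psi) is also the
# implied correlation matrix, so it can be checked against the sample one.
model_corr = L @ L.T + np.diag(psi)
sample_corr = np.corrcoef(X, rowvar=False)
```

    This is the mechanism that lets a Monte Carlo run with few samples still reproduce the factor structure: the correlations are built in through L rather than estimated from the generated data.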