Sample records for rigorous statistical approach

  1. Beyond Composite Scores and Cronbach's Alpha: Advancing Methodological Rigor in Recreation Research

    ERIC Educational Resources Information Center

    Gagnon, Ryan J.; Stone, Garrett A.; Garst, Barry A.

    2017-01-01

    Critically examining common statistical approaches and their strengths and weaknesses is an important step in advancing recreation and leisure sciences. To continue this critical examination and to inform methodological decision making, this study compared three approaches to determine how alternative approaches may result in contradictory…

  2. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
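
    The positive-semidefiniteness requirement described above is easy to verify numerically. A minimal sketch with a Gaussian (RBF) kernel on synthetic "subject" feature vectors (the data and the gamma parameter are illustrative, not from the paper):

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    """Gaussian (RBF) kernel: values in (0, 1], larger means more similar."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # pairwise squared distances
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))        # 6 hypothetical subjects, 4 genomic features
K = rbf_kernel(X)

# A valid kernel must produce a positive semidefinite matrix over all pairs.
print(np.allclose(K, K.T), np.linalg.eigvalsh(K).min() >= -1e-10)  # True True
```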

  3. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

    Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems; its present stage of development embodies a… Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical…

  4. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  5. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  6. Statistical hydrodynamics and related problems in spaces of probability measures

    NASA Astrophysics Data System (ADS)

    Dostoglou, Stamatios

    2017-11-01

    A rigorous theory of statistical solutions of the Navier-Stokes equations, suitable for exploring Kolmogorov's ideas, has been developed by M.I. Vishik and A.V. Fursikov, culminating in their monograph "Mathematical problems of Statistical Hydromechanics." We review some progress made in recent years following this approach, with emphasis on problems concerning the correlation of velocities and corresponding questions in the space of probability measures on Hilbert spaces.

  7. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model's static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
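
    As a toy illustration of the core idea, distinguishability between a model and its target can be scored by the statistical (Bhattacharyya) distance and minimized over the model parameter. This sketch uses a Bernoulli model and target, not the authors' force-field machinery:

```python
import numpy as np

def statistical_distance(p, q):
    """Statistical (Bhattacharyya) angle between two discrete distributions."""
    fidelity = np.sum(np.sqrt(p * q))
    return np.arccos(np.clip(fidelity, 0.0, 1.0))

# Target "system": a biased coin; model family: Bernoulli(theta).
target = np.array([0.3, 0.7])
thetas = np.linspace(0.01, 0.99, 99)            # candidate model parameters
dists = [statistical_distance(np.array([t, 1.0 - t]), target) for t in thetas]
best = thetas[int(np.argmin(dists))]

# Minimizing distinguishability recovers the target parameter.
print(round(float(best), 2))  # 0.3
```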

  8. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.

  9. Model-based assessment of estuary ecosystem health using the latent health factor index, with application to the richibucto estuary.

    PubMed

    Chiu, Grace S; Wu, Margaret A; Lu, Lin

    2013-01-01

    The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content, all regarded a priori as qualitatively important abiotic drivers, toward site health in the Richibucto ecosystem.
This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.
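
    The LHFI itself is a Bayesian hierarchical model; as a rough, non-Bayesian sketch of the underlying idea, a single latent factor summarizing several correlated health metrics, one can take the leading principal component of standardized site metrics (the data below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)
true_health = rng.normal(size=20)            # latent health of 20 sites
loadings = np.array([0.9, 0.8, 0.7])         # three biotic metrics load on it
metrics = true_health[:, None] * loadings + 0.2 * rng.normal(size=(20, 3))

# Standardize the metrics and take the first principal component as an index.
Z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
index = Z @ Vt[0]

# Up to sign, the recovered index tracks the simulated latent health closely.
r = np.corrcoef(index, true_health)[0, 1]
print(abs(r) > 0.9)  # True
```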

  10. Predicting and downscaling ENSO impacts on intraseasonal precipitation statistics in California: The 1997/98 event

    USGS Publications Warehouse

    Gershunov, A.; Barnett, T.P.; Cayan, D.R.; Tubbs, T.; Goddard, L.

    2000-01-01

    Three long-range forecasting methods have been evaluated for prediction and downscaling of seasonal and intraseasonal precipitation statistics in California. Full-statistical, hybrid dynamical-statistical, and full-dynamical approaches have been used to forecast El Niño-Southern Oscillation (ENSO)-related total precipitation, daily precipitation frequency, and average intensity anomalies during the January-March season. For El Niño winters, the hybrid approach emerges as the best performer, while La Niña forecasting skill is poor. The full-statistical forecasting method features reasonable forecasting skill for both La Niña and El Niño winters. The performance of the full-dynamical approach could not be evaluated as rigorously as that of the other two forecasting schemes. Although the full-dynamical forecasting approach is expected to outperform simpler forecasting schemes in the long run, evidence is presented to conclude that, at present, the full-dynamical forecasting approach is the least viable of the three, at least in California. The authors suggest that operational forecasting of any intraseasonal temperature, precipitation, or streamflow statistic derivable from the available records is possible now for ENSO-extreme years.

  11. Adult asthma disease management: an analysis of studies, approaches, outcomes, and methods.

    PubMed

    Maciejewski, Matthew L; Chen, Shih-Yin; Au, David H

    2009-07-01

    Disease management has been implemented for patients with asthma in various ways. We describe the approaches to and components of adult asthma disease-management interventions, examine the outcomes evaluated, and assess the quality of published studies. We searched the MEDLINE, EMBASE, CINAHL, PsychInfo, and Cochrane databases for studies published in 1986 through 2008, on adult asthma management. With the studies that met our inclusion criteria, we examined the clinical, process, medication, economic, and patient-reported outcomes reported, and the study designs, provider collaboration during the studies, and statistical methods. Twenty-nine articles describing 27 studies satisfied our inclusion criteria. There was great variation in the content, extent of collaboration between physician and non-physician providers responsible for intervention delivery, and outcomes examined across the 27 studies. Because of limitations in the design of 22 of the 27 studies, the differences in outcomes assessed, and the lack of rigorous statistical adjustment, we could not draw definitive conclusions about the effectiveness or cost-effectiveness of the asthma disease-management programs or which approach was most effective. Few well-designed studies with rigorous evaluations have been conducted to evaluate disease-management interventions for adults with asthma. Current evidence is insufficient to recommend any particular intervention.

  12. Local and global approaches to the problem of Poincaré recurrences. Applications in nonlinear dynamics

    NASA Astrophysics Data System (ADS)

    Anishchenko, V. S.; Boev, Ya. I.; Semenova, N. I.; Strelkova, G. I.

    2015-07-01

    We review rigorous and numerical results on the statistics of Poincaré recurrences which are related to the modern development of the Poincaré recurrence problem. We analyze and describe the rigorous results which are achieved both in the classical (local) approach and in the recently developed global approach. These results are illustrated by numerical simulation data for simple chaotic and ergodic systems. It is shown that the basic theoretical laws can be applied to noisy systems if the probability measure is ergodic and stationary. Poincaré recurrences are studied numerically in nonautonomous systems. Statistical characteristics of recurrences are analyzed in the framework of the global approach for the cases of positive and zero topological entropy. We show that for positive entropy, there is a relationship between the Afraimovich-Pesin dimension, Lyapunov exponents and the Kolmogorov-Sinai entropy both without and in the presence of external noise. The case of zero topological entropy is exemplified by numerical results for the Poincaré recurrence statistics in the circle map. We show and prove that the dependence of minimal recurrence times on the return region size demonstrates universal properties for the golden and the silver ratio. The behavior of Poincaré recurrences is analyzed at the critical point of Feigenbaum attractor birth. We explore Poincaré recurrences for an ergodic set which is generated in the stroboscopic section of a nonautonomous oscillator and is similar to a circle shift. Based on the obtained results we show how the Poincaré recurrence statistics can be applied for solving a number of nonlinear dynamics issues. We propose and illustrate alternative methods for diagnosing effects of external and mutual synchronization of chaotic systems in the context of the local and global approaches. The properties of the recurrence time probability density can be used to detect the stochastic resonance phenomenon.
We also discuss how the fractal dimension of chaotic attractors can be estimated using the Poincaré recurrence statistics.
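
    The dependence of minimal recurrence times on region size for the golden rotation is easy to reproduce numerically. In this sketch (region sizes and the step cap are chosen for illustration), the minimal return times land on Fibonacci numbers, as the universality described above predicts:

```python
import numpy as np

alpha = (np.sqrt(5) - 1) / 2                 # golden rotation number

def min_recurrence_time(eps, n_max=10_000):
    """Smallest n > 0 with n*alpha within eps of an integer (circle shift)."""
    for n in range(1, n_max):
        frac = (n * alpha) % 1.0
        if min(frac, 1.0 - frac) < eps:
            return n
    return None

# Minimal return times for shrinking return regions hit Fibonacci numbers.
times = [min_recurrence_time(eps) for eps in (0.3, 0.1, 0.05, 0.02)]
print(times)  # [2, 5, 13, 34]
```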

  13. A Study of Statistics through Tootsie Pops

    ERIC Educational Resources Information Center

    Aaberg, Shelby; Vitosh, Jason; Smith, Wendy

    2016-01-01

    A classic TV commercial once asked, "How many licks does it take to get to the center of a Tootsie Roll Tootsie Pop?" The narrator claims, "The world may never know" (Tootsie Roll 2012), but an Internet search returns a multitude of answers, some of which include rigorous systematic approaches by academics to address the…

  14. Exploring the Use of Participatory Information to Improve Monitoring, Mapping and Assessment of Aquatic Ecosystem Services at Landscape Scales

    EPA Science Inventory

    Traditionally, the EPA has monitored aquatic ecosystems using statistically rigorous sample designs and intensive field efforts which provide high quality datasets. But by their nature they leave many aquatic systems unsampled, follow a top down approach, have a long lag between ...

  15. Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data

    PubMed Central

    Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil

    2014-01-01

    Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202

  16. Time Series Expression Analyses Using RNA-seq: A Statistical Approach

    PubMed Central

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulation of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model time-course RNA-seq data, including the statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021

  17. Time series expression analyses using RNA-seq: a statistical approach.

    PubMed

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulation of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model time-course RNA-seq data, including the statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis.
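
    As a minimal sketch of the AR(1) component mentioned above, one can simulate a lag-1 autoregressive series and recover its coefficient by least squares. Real RNA-seq measurements are counts, so the Gaussian simulation here is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
phi_true, n = 0.8, 500
x = np.zeros(n)
for t in range(1, n):                  # simulate AR(1): x_t = phi*x_{t-1} + e_t
    x[t] = phi_true * x[t - 1] + rng.normal()

# Least-squares estimate of the lag-1 coefficient (regress x_t on x_{t-1}).
phi_hat = (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])
print(abs(phi_hat - phi_true) < 0.1)  # True
```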

  18. Model-Based Assessment of Estuary Ecosystem Health Using the Latent Health Factor Index, with Application to the Richibucto Estuary

    PubMed Central

    Chiu, Grace S.; Wu, Margaret A.; Lu, Lin

    2013-01-01

    The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt–clay content–all regarded a priori as qualitatively important abiotic drivers–towards site health in the Richibucto ecosystem. 
This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general. PMID:23785443

  19. Preschool Center Care Quality Effects on Academic Achievement: An Instrumental Variables Analysis

    ERIC Educational Resources Information Center

    Auger, Anamarie; Farkas, George; Burchinal, Margaret R.; Duncan, Greg J.; Vandell, Deborah Lowe

    2014-01-01

    Much of child care research has focused on the effects of the quality of care in early childhood settings on children's school readiness skills. Although researchers increased the statistical rigor of their approaches over the past 15 years, researchers' ability to draw causal inferences has been limited because the studies are based on…

  20. Measuring the Unmeasurable: Upholding Rigor in Quantitative Studies of Personal and Social Development in Outdoor Adventure Education

    ERIC Educational Resources Information Center

    Scrutton, Roger; Beames, Simon

    2015-01-01

    Outdoor adventure education (OAE) has a long history of being credited with the personal and social development (PSD) of its participants. PSD is notoriously difficult to measure quantitatively, yet stakeholders demand statistical evidence that given approaches to eliciting PSD are effective in their methods. Rightly or wrongly, many stakeholders…

  1. Statistical Analysis of the Processes Controlling Choline and Ethanolamine Glycerophospholipid Molecular Species Composition

    PubMed Central

    Kiebish, Michael A.; Yang, Kui; Han, Xianlin; Gross, Richard W.; Chuang, Jeffrey

    2012-01-01

    The regulation and maintenance of the cellular lipidome through biosynthetic, remodeling, and catabolic mechanisms are critical for biological homeostasis during development, health and disease. These complex mechanisms control the architectures of lipid molecular species, which have diverse yet highly regulated fatty acid chains at both the sn1 and sn2 positions. Phosphatidylcholine (PC) and phosphatidylethanolamine (PE) serve as the predominant biophysical scaffolds in membranes, acting as reservoirs for potent lipid signals and regulating numerous enzymatic processes. Here we report the first rigorous computational dissection of the mechanisms influencing PC and PE molecular architectures from high-throughput shotgun lipidomic data. Using novel statistical approaches, we have analyzed multidimensional mass spectrometry-based shotgun lipidomic data from developmental mouse heart and mature mouse heart, lung, brain, and liver tissues. We show that in PC and PE, sn1 and sn2 positions are largely independent, though for low abundance species regulatory processes may interact with both the sn1 and sn2 chain simultaneously, leading to cooperative effects. Chains with similar biochemical properties appear to be remodeled similarly. We also see that sn2 positions are more regulated than sn1, and that PC exhibits stronger cooperative effects than PE. A key aspect of our work is a novel statistically rigorous approach to determine cooperativity based on a modified Fisher's exact test using Markov Chain Monte Carlo sampling. This computational approach provides a novel tool for developing mechanistic insight into lipidomic regulation. PMID:22662143
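
    A plain Monte Carlo version of Fisher's exact test (not the authors' modified MCMC variant) conveys the idea: under independence with fixed margins, the top-left cell of a 2x2 table follows a hypergeometric distribution, so a simulated null distribution yields the p-value. The counts below are invented:

```python
import numpy as np

def mc_fisher_pvalue(table, n_sim=20_000, seed=0):
    """One-sided Monte Carlo Fisher-style test for a 2x2 table [[a, b], [c, d]].

    Under independence with fixed margins, cell a is hypergeometric; the
    p-value is the simulated tail probability P(X >= a)."""
    (a, b), (c, d) = table
    rng = np.random.default_rng(seed)
    draws = rng.hypergeometric(ngood=a + b, nbad=c + d, nsample=a + c, size=n_sim)
    return (np.sum(draws >= a) + 1) / (n_sim + 1)

p_assoc = mc_fisher_pvalue([[30, 5], [5, 30]])   # strongly associated counts
p_null = mc_fisher_pvalue([[18, 17], [17, 18]])  # near-independent counts
print(p_assoc < 0.01 < p_null)  # True
```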

  2. A Comparison of Alternate Approaches to Creating Indices of Academic Rigor. Research Report 2012-11

    ERIC Educational Resources Information Center

    Beatty, Adam S.; Sackett, Paul R.; Kuncel, Nathan R.; Kiger, Thomas B.; Rigdon, Jana L.; Shen, Winny; Walmsley, Philip T.

    2013-01-01

    In recent decades, there has been an increasing emphasis placed on college graduation rates and reducing attrition due to the social and economic benefits, at both the individual and national levels, proposed to accrue from a more highly educated population (Bureau of Labor Statistics, 2011). In the United States in particular, there is a concern…

  3. Modeling and prediction of peptide drift times in ion mobility spectrometry using sequence-based and structure-based approaches.

    PubMed

    Zhang, Yiming; Jin, Quan; Wang, Shuting; Ren, Ren

    2011-05-01

    The mobile behavior of 1481 peptides in ion mobility spectrometry (IMS), which are generated by protease digestion of the Drosophila melanogaster proteome, is modeled and predicted based on two different types of characterization methods, i.e., a sequence-based approach and a structure-based approach. In this procedure, the sequence-based approach considers both the amino acid composition of a peptide and the local environment profile of each amino acid in the peptide; the structure-based approach is performed with the CODESSA protocol, which regards a peptide as a common organic compound and generates more than 200 statistically significant variables to characterize the whole structure profile of a peptide molecule. Subsequently, nonlinear support vector machine (SVM) and Gaussian process (GP) regression, as well as linear partial least squares (PLS) regression, are employed to correlate the structural parameters of the characterizations with the IMS drift times of these peptides. The obtained quantitative structure-spectrum relationship (QSSR) models are evaluated rigorously and investigated systematically via both one-deep and two-deep cross-validation as well as rigorous Monte Carlo cross-validation (MCCV). We also give a comprehensive comparison of the statistics arising from the different combinations of variable types and modeling methods, and find that the sequence-based approach gives QSSR models with better fitting ability and predictive power but worse interpretability than the structure-based approach. In addition, because QSSR modeling with the sequence-based approach does not require preparing minimized peptide structures beforehand, it is considerably more efficient than modeling with the structure-based approach. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Improving estimates of the number of `fake' leptons and other mis-reconstructed objects in hadron collider events: BoB's your UNCLE

    NASA Astrophysics Data System (ADS)

    Gillam, Thomas P. S.; Lester, Christopher G.

    2014-11-01

    We consider current and alternative approaches to setting limits on new physics signals having backgrounds from misidentified objects; for example jets misidentified as leptons, b-jets or photons. Many ATLAS and CMS analyses have used a heuristic "matrix method" for estimating the background contribution from such sources. We demonstrate that the matrix method suffers from statistical shortcomings that can adversely affect its ability to set robust limits. A rigorous alternative method is discussed, and is seen to produce fake rate estimates and limits with better qualities, but is found to be too costly to use. Having investigated the nature of the approximations used to derive the matrix method, we propose a third strategy that is seen to marry the speed of the matrix method to the performance and physicality of the more rigorous approach.
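
    In its simplest two-bin form, the heuristic matrix method discussed above reduces to inverting a 2x2 linear system relating loose/tight counts to real/fake yields. A sketch with invented efficiencies (not from any experiment):

```python
import numpy as np

# Probability that a real (fake) lepton passing "loose" also passes "tight";
# the values are illustrative, not from any experiment.
eps_real, eps_fake = 0.90, 0.20

# Invented truth, used here only to generate the observed counts.
n_real_true, n_fake_true = 1000.0, 300.0
n_loose = n_real_true + n_fake_true
n_tight = eps_real * n_real_true + eps_fake * n_fake_true

# Matrix method: invert the 2x2 relation mapping (real, fake) to (loose, tight).
A = np.array([[1.0, 1.0], [eps_real, eps_fake]])
n_real_est, n_fake_est = np.linalg.solve(A, np.array([n_loose, n_tight]))
print(round(n_real_est), round(n_fake_est))  # 1000 300
```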

  5. Using GIS to generate spatially balanced random survey designs for natural resource applications.

    PubMed

    Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B

    2007-07-01

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demands survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
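
    The paper's Reversed Randomized Quadrant-Recursive Raster design is GIS-based and more involved; a crude stand-in for the idea of spatially balanced sampling, stratifying space and drawing one candidate point per occupied grid cell, can be sketched as follows (all values synthetic):

```python
import numpy as np

def stratified_spatial_sample(points, n_cells=4, seed=0):
    """Spread samples in space: split the unit square into an n_cells x n_cells
    grid and draw one random candidate point from every occupied cell."""
    rng = np.random.default_rng(seed)
    cell = np.minimum((points * n_cells).astype(int), n_cells - 1)
    chosen = []
    for i in range(n_cells):
        for j in range(n_cells):
            idx = np.where((cell[:, 0] == i) & (cell[:, 1] == j))[0]
            if idx.size:
                chosen.append(int(rng.choice(idx)))
    return chosen

rng = np.random.default_rng(3)
pts = rng.random((200, 2))          # candidate sampling locations
sample = stratified_spatial_sample(pts)
print(len(sample) <= 16, len(set(sample)) == len(sample))  # True True
```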

  6. Statistical Analysis of Protein Ensembles

    NASA Astrophysics Data System (ADS)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data is piling up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of the currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
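
    Full barcodes require persistent-homology software, but the 0-dimensional bars (connected components) can be computed directly: their death scales are the minimum-spanning-tree edge lengths, obtainable with a union-find pass over sorted pairwise distances. A small sketch on an invented point cloud:

```python
import numpy as np
from itertools import combinations

def h0_barcode(points):
    """Death scales of 0-dimensional bars (connected components): these are
    exactly the minimum-spanning-tree edge lengths (single-linkage merges)."""
    n = len(points)
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path compression
            a = parent[a]
        return a

    edges = sorted((float(np.linalg.norm(points[i] - points[j])), i, j)
                   for i, j in combinations(range(n), 2))
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                        # two components merge at scale d
            parent[ri] = rj
            deaths.append(d)
    return deaths

pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 0.0], [5.1, 0.0]])
bars = h0_barcode(pts)
print(len(bars), round(max(bars), 1))  # 3 4.9
```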

  7. Rigorous Science: a How-To Guide.

    PubMed

    Casadevall, Arturo; Fang, Ferric C

    2016-11-08

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.

  8. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    USGS Publications Warehouse

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisey, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and that incorporating observer uncertainty in cause-of-death assignments increased the variability of parameter estimates relative to the traditional approach. These differences can affect reported results; it is therefore critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
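
    The core augmentation step (simplified here: no survival model, no covariates; the function and data are illustrative, not the authors' code) is to treat each uncertain cause-of-death as a latent variable sampled from the observer's elicited probabilities, so that the elicitation uncertainty propagates into the estimates:

```python
import random

def cause_specific_fractions(elicited, n_draws=4000, seed=1):
    """Monte Carlo over latent cause assignments.  `elicited` holds one
    probability vector per mortality event, as stated by the observer.
    Returns mean and 95% interval of the fraction dying of cause 0."""
    rng = random.Random(seed)
    n_causes = len(elicited[0])
    draws = []
    for _ in range(n_draws):
        assigned = [rng.choices(range(n_causes), weights=p)[0]
                    for p in elicited]
        draws.append(sum(a == 0 for a in assigned) / len(assigned))
    draws.sort()
    return (sum(draws) / n_draws,
            (draws[int(0.025 * n_draws)], draws[int(0.975 * n_draws)]))

# 5 deaths; e.g. [0.9, 0.1] means "90% sure it was cause 0 (predation)"
beliefs = [[0.9, 0.1], [0.6, 0.4], [0.5, 0.5], [0.8, 0.2], [0.1, 0.9]]
mean, interval = cause_specific_fractions(beliefs)
```

    A certain observer (all probabilities 0 or 1) collapses this to the traditional fixed-assignment analysis; spread-out beliefs widen the interval, which is the behavior the paper reports.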

  9. Dissolution curve comparisons through the F(2) parameter, a Bayesian extension of the f(2) statistic.

    PubMed

    Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan

    2015-01-01

    Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is for justifying a biowaiver for post-approval changes which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to or in many cases superior to the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
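
    The f2 statistic that the proposed F2 parameter extends is a fixed published formula and is easy to compute directly; the profiles below are invented for illustration.

```python
from math import log10, sqrt

def f2(reference, test):
    """Similarity factor for two dissolution profiles (% dissolved at
    matched time points).  f2 = 100 for identical profiles; values of
    50 or above are conventionally read as 'similar'."""
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50 * log10(100 / sqrt(1 + msd))

ref  = [20, 45, 70, 85, 95]          # % dissolved at 5 time points
near = [22, 44, 72, 84, 96]          # close profile -> high f2
far  = [10, 25, 45, 60, 75]          # shifted profile -> low f2
```

    The Bayesian F2 parameter replaces this point statistic with a posterior over mean profiles, which is what allows a formal equivalence test.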

  10. Rigorous Approach in Investigation of Seismic Structure and Source Characteristicsin Northeast Asia: Hierarchical and Trans-dimensional Bayesian Inversion

    NASA Astrophysics Data System (ADS)

    Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.

    2015-12-01

    Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by the regularizations and additional constraints commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimates of models and their uncertainties constrained by the data. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for the error statistics and model parameterizations involved, and in turn allow more rigorous estimation of both. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia, including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. For the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partitioning. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point-source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from these novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.

  11. Robust source and mask optimization compensating for mask topography effects in computational lithography.

    PubMed

    Li, Jia; Lam, Edmund Y

    2014-04-21

    Mask topography effects need to be taken into consideration for a more accurate solution of source mask optimization (SMO) in advanced optical lithography. However, rigorous 3D mask models generally involve intensive computation, and conventional SMO fails to control the mask-induced undesired phase errors that degrade the usable depth of focus (uDOF) and process yield. In this work, an optimization approach incorporating pupil wavefront aberrations into the SMO procedure is developed to maximize the uDOF. We first design the pupil wavefront function by adding primary and secondary spherical aberrations through the coefficients of the Zernike polynomials, and then apply the conjugate gradient method to achieve an optimal source-mask pair under the condition of an aberrated pupil. We also use a statistical model to determine the Zernike coefficients for phase control and adjustment. Rigorous simulations of thick masks show that this approach compensates for mask topography effects, improving pattern fidelity and increasing uDOF.
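
    The rotationally symmetric pupil phase described here can be written with the standard Zernike radial polynomials for primary and secondary spherical aberration; the coefficients in a real SMO run would be optimization variables, and nothing below is taken from the paper.

```python
def pupil_wavefront(rho, c_sph1, c_sph2):
    """Rotationally symmetric pupil phase built from primary (R_4^0)
    and secondary (R_6^0) spherical aberration radial polynomials;
    rho is the normalized pupil radius in [0, 1]."""
    r2 = rho * rho
    primary   = 6 * r2 * r2 - 6 * r2 + 1                      # R_4^0
    secondary = 20 * r2 ** 3 - 30 * r2 ** 2 + 12 * r2 - 1     # R_6^0
    return c_sph1 * primary + c_sph2 * secondary
```

    In an SMO loop of the kind described, these two coefficients would be adjusted jointly with the source and mask variables.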

  12. Increasing rigor in NMR-based metabolomics through validated and open source tools

    PubMed Central

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2016-01-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism’s phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wide applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. PMID:27643760

  13. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    PubMed

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wide applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  14. Accuracy and performance of 3D mask models in optical projection lithography

    NASA Astrophysics Data System (ADS)

    Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar

    2011-04-01

    Different mask models are compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques, and the thin-mask (Kirchhoff) approach, used to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model is tested with two different formulations of partially coherent imaging: the Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate the results of the rigorous EMF method with the thin-mask model enhanced by pupil filtering techniques. The validity of this approach for different feature sizes, shapes, and illumination conditions is investigated.

  15. Alarms about structural alerts.

    PubMed

    Alves, Vinicius; Muratov, Eugene; Capuzzi, Stephen; Politi, Regina; Low, Yen; Braga, Rodolpho; Zakharov, Alexey V; Sedykh, Alexander; Mokshyna, Elena; Farag, Sherif; Andrade, Carolina; Kuz'min, Victor; Fourches, Denis; Tropsha, Alexander

    2016-08-21

    Structural alerts are widely accepted in chemical toxicology and regulatory decision support as a simple and transparent means to flag potential chemical hazards or to group compounds into categories for read-across. However, there has been growing concern that alerts disproportionately flag too many chemicals as toxic, calling into question their reliability as toxicity markers. Conversely, rigorously developed and properly validated statistical QSAR models can accurately and reliably predict the toxicity of a chemical; however, their use in regulatory toxicology has been hampered by the lack of transparency and interpretability. We demonstrate that, contrary to the common perception of QSAR models as "black boxes," they can be used to identify statistically significant chemical substructures (QSAR-based alerts) that influence toxicity. We show through several case studies, however, that the mere presence of structural alerts in a chemical, irrespective of the derivation method (expert-based or QSAR-based), should be perceived only as a hypothesis of a possible toxicological effect. We propose a new approach that synergistically integrates structural alerts and rigorously validated QSAR models for a more transparent and accurate safety assessment of new chemicals.

  16. Trans-dimensional and hierarchical Bayesian approaches toward rigorous estimation of seismic sources and structures in the Northeast Asia

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean

    2016-04-01

    A framework is presented within which we provide rigorous estimations of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimation of models and their uncertainties based on the information in the data. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in Bayesian inversions. Reliable estimation of model parameters and their uncertainties is thus possible without arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data from the North Korean nuclear explosion tests. The combination of new Bayesian techniques and the structural model, coupled with meaningful uncertainties for each of the processes, enables more quantitative monitoring and discrimination of seismic events.

  17. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems

    NASA Astrophysics Data System (ADS)

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  18. Electrical Conductivity of Charged Particle Systems and Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Röpke, G.

    2018-01-01

    One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.

  19. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems.

    PubMed

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  20. Statistical Characterization and Classification of Edge-Localized Plasma Instabilities

    NASA Astrophysics Data System (ADS)

    Webster, A. J.; Dendy, R. O.

    2013-04-01

    The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
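
    Fitting the Weibull waiting-time model can be sketched with a profile maximum-likelihood estimate (a standard derivation, not the paper's own code); the data below are synthetic draws with known parameters.

```python
import random
from math import log

def fit_weibull(times, lo=0.05, hi=20.0, iters=80):
    """Profile MLE for Weibull shape k and scale lam.  For fixed k the
    optimal scale is lam = mean(t^k)^(1/k); the shape solves
    g(k) = sum(t^k ln t)/sum(t^k) - 1/k - mean(ln t) = 0,
    found here by bisection (g increases from -inf to a positive value)."""
    logs = [log(t) for t in times]
    mean_log = sum(logs) / len(logs)

    def g(k):
        tk = [t ** k for t in times]
        return (sum(v * l for v, l in zip(tk, logs)) / sum(tk)
                - 1 / k - mean_log)

    for _ in range(iters):
        mid = (lo + hi) / 2
        lo, hi = (lo, mid) if g(mid) > 0 else (mid, hi)
    k = (lo + hi) / 2
    lam = (sum(t ** k for t in times) / len(times)) ** (1 / k)
    return k, lam

# synthetic waiting times from a Weibull(k=2, lam=1.5) via inverse CDF
rng = random.Random(7)
data = [1.5 * (-log(1 - rng.random())) ** 0.5 for _ in range(2000)]
k_hat, lam_hat = fit_weibull(data)
```

    A rigorous comparison of the kind described would then test the fitted distribution against the empirical waiting times with a goodness-of-fit statistic.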

  1. Rigorous evaluation of chemical measurement uncertainty: liquid chromatographic analysis methods using detector response factor calibration

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess the accuracy and repeatability of the measurement procedure. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated against the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model, which is intrinsically rigorous and therefore a straightforward method for complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 Monte Carlo (MC) methods augmented with random-effects meta-analysis yields results similar to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation, using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography with isotope-dilution mass spectrometric detection (LC-IDMS).
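
    The GUM Supplement 1 Monte Carlo side of such an evaluation can be sketched for a toy single-point calibration equation; the measurement model and all numbers below are invented, not taken from the article.

```python
import random
import statistics

def mc_uncertainty(n=100_000, seed=3):
    """Propagate input uncertainties through the measurement equation
    c_sample = (A_sample / A_cal) * c_cal by direct sampling.
    Inputs are modeled as independent normals (mean, std dev)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        a_sample = rng.gauss(1.20, 0.006)   # sample peak area
        a_cal    = rng.gauss(1.00, 0.005)   # calibrant peak area
        c_cal    = rng.gauss(25.0, 0.10)    # calibrant conc., ug/mL
        out.append(a_sample / a_cal * c_cal)
    return statistics.mean(out), statistics.stdev(out)

mean_c, u_c = mc_uncertainty()
```

    With independent inputs this agrees with first-order analytic propagation; the Bayesian hierarchical route becomes preferable once correlations among the inputs are non-negligible, as the article discusses.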

  2. Rigorous Science: a How-To Guide

    PubMed Central

    Casadevall, Arturo; Fang, Ferric C.

    2016-01-01

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. PMID:27834205

  3. Theory of the Decoherence Effect in Finite and Infinite Open Quantum Systems Using the Algebraic Approach

    NASA Astrophysics Data System (ADS)

    Blanchard, Philippe; Hellmich, Mario; Ługiewicz, Piotr; Olkiewicz, Robert

    Quantum mechanics is the greatest revision of our conception of the character of the physical world since Newton. Consequently, David Hilbert was very interested in quantum mechanics. He and John von Neumann discussed it frequently during von Neumann's residence in Göttingen. In 1932 von Neumann published his book Mathematical Foundations of Quantum Mechanics, in Hilbert's opinion the first exposition of quantum mechanics in a mathematically rigorous way. The pioneers of quantum mechanics, Heisenberg and Dirac, had neither use for rigorous mathematics nor much interest in it. Conceptually, quantum theory as developed by Bohr and Heisenberg is based on the positivism of Mach, as it describes only observable quantities. It first emerged as a result of experimental data in the form of statistical observations of quantum noise, the basic concept of quantum probability.

  4. Connectopic mapping with resting-state fMRI.

    PubMed

    Haak, Koen V; Marquand, Andre F; Beckmann, Christian F

    2018-04-15

    Brain regions are often topographically connected: nearby locations within one brain area connect with nearby locations in another area. Mapping these connection topographies, or 'connectopies' in short, is crucial for understanding how information is processed in the brain. Here, we propose principled, fully data-driven methods for mapping connectopies using functional magnetic resonance imaging (fMRI) data acquired at rest by combining spectral embedding of voxel-wise connectivity 'fingerprints' with a novel approach to spatial statistical inference. We apply the approach in human primary motor and visual cortex, and show that it can trace biologically plausible, overlapping connectopies in individual subjects that follow these regions' somatotopic and retinotopic maps. As a generic mechanism to perform inference over connectopies, the new spatial statistics approach enables rigorous statistical testing of hypotheses regarding the fine-grained spatial profile of functional connectivity and whether that profile is different between subjects or between experimental conditions. The combined framework offers a fundamental alternative to existing approaches to investigating functional connectivity in the brain, from voxel- or seed-pair wise characterizations of functional association, towards a full, multivariate characterization of spatial topography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Predictive QSAR modeling workflow, model applicability domains, and virtual screening.

    PubMed

    Tropsha, Alexander; Golbraikh, Alexander

    2007-01-01

    Quantitative Structure Activity Relationship (QSAR) modeling has traditionally been applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered, if at all, only in a hypothetical sense, in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation, as well as the need to define model applicability domains in chemistry space. We present examples of studies in which the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting establishes it as a predictive, as opposed to evaluative, modeling approach.
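
    A common concrete form of the applicability domain mentioned here is a distance-to-training-set rule (mean distance to the k nearest training neighbors against a threshold); the toy descriptor space and the k and z settings below are illustrative assumptions, not the review's prescription.

```python
from math import dist

def domain_threshold(train, k=3, z=0.5):
    """Applicability-domain cutoff: mean + z * stdev of each training
    point's average distance to its k nearest training neighbors."""
    def knn_mean(p, pool):
        ds = sorted(dist(p, q) for q in pool if q is not p)
        return sum(ds[:k]) / k
    scores = [knn_mean(p, train) for p in train]
    m = sum(scores) / len(scores)
    s = (sum((x - m) ** 2 for x in scores) / (len(scores) - 1)) ** 0.5
    return m + z * s

def in_domain(query, train, threshold, k=3):
    """A query compound is inside the domain if its mean distance to
    the k nearest training compounds is below the cutoff."""
    ds = sorted(dist(query, p) for p in train)
    return sum(ds[:k]) / k <= threshold

# toy 2-D 'descriptor space': training compounds cluster near origin
train = [(0.1 * i, 0.1 * j) for i in range(5) for j in range(5)]
cutoff = domain_threshold(train)
```

    Predictions for queries outside the domain are withheld rather than reported, which is the behavior the review argues is essential for reliable virtual screening.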

  6. Maximum entropy models as a tool for building precise neural controls.

    PubMed

    Savin, Cristina; Tkačik, Gašper

    2017-10-01

    Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.
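
    For a handful of neurons the pairwise maximum entropy (Ising-like) model can be fit exactly by enumerating all states and matching moments with gradient ascent; the three-cell "data" patterns below are made up for illustration.

```python
from itertools import product
from math import exp

def fit_maxent(patterns, n, steps=3000, lr=0.2):
    """Fit p(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) to binary
    patterns by matching means <s_i> and pairwise moments <s_i s_j>."""
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    d_mean = [sum(p[i] for p in patterns) / len(patterns) for i in range(n)]
    d_corr = {ij: sum(p[ij[0]] * p[ij[1]] for p in patterns) / len(patterns)
              for ij in pairs}
    h, J = [0.0] * n, {ij: 0.0 for ij in pairs}
    states = list(product((0, 1), repeat=n))
    for _ in range(steps):
        w = [exp(sum(h[i] * s[i] for i in range(n))
                 + sum(J[ij] * s[ij[0]] * s[ij[1]] for ij in pairs))
             for s in states]
        z = sum(w)
        m_mean = [sum(wk * s[i] for wk, s in zip(w, states)) / z
                  for i in range(n)]
        m_corr = {ij: sum(wk * s[ij[0]] * s[ij[1]]
                          for wk, s in zip(w, states)) / z for ij in pairs}
        for i in range(n):                       # ascend the likelihood:
            h[i] += lr * (d_mean[i] - m_mean[i])  # data moment - model moment
        for ij in pairs:
            J[ij] += lr * (d_corr[ij] - m_corr[ij])
    return h, J, m_mean

data = [(1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1), (0, 0, 0), (1, 0, 0)]
h, J, fitted_means = fit_maxent(data, 3)
```

    The fitted model then serves as the control described above: surrogates sampled from it preserve means and pairwise correlations but are otherwise maximally unstructured.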

  7. Methodological rigor and citation frequency in patient compliance literature.

    PubMed Central

    Bruer, J T

    1982-01-01

    An exhaustive bibliography assessing the methodological rigor of the patient compliance literature is combined with citation data from the Science Citation Index (SCI) to determine whether methodologically rigorous papers are used with greater frequency than substandard articles by compliance investigators. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant use of citation measures as indicators of rigor on a paper-by-paper basis. The data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies that employ other experimental designs. PMID:7114334

  8. Peer Review of EPA's Draft BMDS Document: Exponential ...

    EPA Pesticide Factsheets

    BMDS is one of the Agency's premier risk assessment tools; the validity and reliability of its statistical models are therefore of paramount importance. This page provides links to peer reviews of the BMDS applications and models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  9. Escape rates over potential barriers: variational principles and the Hamilton-Jacobi equation

    NASA Astrophysics Data System (ADS)

    Cortés, Emilio; Espinosa, Francisco

    We describe a rigorous formalism to study some extrema statistics problems, like maximum probability events or escape rate processes, by taking into account that the Hamilton-Jacobi equation completes, in a natural way, the required set of boundary conditions of the Euler-Lagrange equation, for this kind of variational problem. We apply this approach to a one-dimensional stochastic process, driven by colored noise, for a double-parabola potential, where we have one stable and one unstable steady states.

  10. Rapid Creation and Quantitative Monitoring of High Coverage shRNA Libraries

    PubMed Central

    Bassik, Michael C.; Lebbink, Robert Jan; Churchman, L. Stirling; Ingolia, Nicholas T.; Patena, Weronika; LeProust, Emily M.; Schuldiner, Maya; Weissman, Jonathan S.; McManus, Michael T.

    2009-01-01

    Short hairpin RNA (shRNA) libraries are limited by the low efficacy of many shRNAs, giving false negatives, and off-target effects, giving false positives. Here we present a strategy for rapidly creating expanded shRNA pools (∼30 shRNAs/gene) that are analyzed by deep-sequencing (EXPAND). This approach enables identification of multiple effective target-specific shRNAs from a complex pool, allowing a rigorous statistical evaluation of whether a gene is a true hit. PMID:19448642
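
    One standard way to turn several per-shRNA measurements into a gene-level call, in the spirit of the pooled evaluation described here (the paper's own scoring procedure is not given in this abstract), is Fisher's method for combining independent p-values:

```python
from math import exp, log

def fisher_combined(pvalues):
    """Fisher's method: X = -2 * sum(ln p_i) ~ chi-square with 2k df
    under the global null.  For even df the survival function has the
    closed form exp(-x/2) * sum_{i<k} (x/2)^i / i!."""
    x = -2 * sum(log(p) for p in pvalues)
    k = len(pvalues)          # df = 2k
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2) / i
        total += term
    return exp(-x / 2) * total

# three shRNAs against one gene, each with its own phenotype p-value
gene_p = fisher_combined([0.01, 0.02, 0.03])
```

    With ~30 shRNAs per gene, as in the expanded pools described, combining only a trimmed subset of the per-shRNA p-values guards against off-target outliers.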

  11. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples.

    PubMed

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-07-05

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell.
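
    The statistical leverage of the microfluidic design comes from contig co-occurrence across many microwell sub-samples; a minimal sketch of co-occurrence binning is given below with synthetic presence/absence data (the correlation threshold and helper names are illustrative, not the paper's method).

```python
def pearson(u, v):
    """Pearson correlation of two equal-length numeric vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

def bin_contigs(profiles, r_min=0.8):
    """Group contigs whose presence/absence profiles across wells are
    highly correlated: connected components of the correlation graph."""
    names = list(profiles)
    adj = {a: {b for b in names if b != a
               and pearson(profiles[a], profiles[b]) >= r_min}
           for a in names}
    bins, seen = [], set()
    for a in names:
        if a in seen:
            continue
        stack, comp = [a], set()
        while stack:
            c = stack.pop()
            if c not in comp:
                comp.add(c)
                stack.extend(adj[c] - comp)
        seen |= comp
        bins.append(sorted(comp))
    return bins

# 8 wells; contigs A/B co-occur (genome 1), C/D co-occur (genome 2)
wells = {"A": [1, 1, 0, 0, 1, 0, 1, 0], "B": [1, 1, 0, 0, 1, 0, 1, 0],
         "C": [0, 0, 1, 1, 0, 1, 0, 1], "D": [0, 0, 1, 1, 0, 1, 0, 1]}
bins = bin_contigs(wells)
```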

  12. Complexities and potential pitfalls of clinical study design and data analysis in assisted reproduction.

    PubMed

    Patounakis, George; Hill, Micah J

    2018-06-01

    The purpose of the current review is to describe common pitfalls in the design and statistical analysis of reproductive medicine studies. It serves to guide both authors and reviewers toward reducing the incidence of spurious statistical results and erroneous conclusions. The large amount of data gathered in IVF cycles leads to problems with multiplicity, multicollinearity, and overfitting of regression models. Furthermore, the use of the word 'trend' to describe nonsignificant results has increased in recent years. Finally, methods to accurately account for female age in infertility research models are becoming more common and necessary. The pitfalls of study design and analysis reviewed here provide a framework for authors and reviewers to approach clinical research in the field of reproductive medicine. By adopting a more rigorous approach to study design and analysis, the reproductive medicine literature will reach more reliable conclusions that can stand the test of time.

  13. Normalization, bias correction, and peak calling for ChIP-seq

    PubMed Central

    Diaz, Aaron; Park, Kiyoub; Lim, Daniel A.; Song, Jun S.

    2012-01-01

    Next-generation sequencing is rapidly transforming our ability to profile the transcriptional, genetic, and epigenetic states of a cell. In particular, sequencing DNA from the immunoprecipitation of protein-DNA complexes (ChIP-seq) and methylated DNA (MeDIP-seq) can reveal the locations of protein binding sites and epigenetic modifications. These approaches contain numerous biases which may significantly influence the interpretation of the resulting data. Rigorous computational methods for detecting and removing such biases are still lacking. Also, multi-sample normalization still remains an important open problem. This theoretical paper systematically characterizes the biases and properties of ChIP-seq data by comparing 62 separate publicly available datasets, using rigorous statistical models and signal processing techniques. Statistical methods for separating ChIP-seq signal from background noise, as well as correcting enrichment test statistics for sequence-dependent and sonication biases, are presented. Our method effectively separates reads into signal and background components prior to normalization, improving the signal-to-noise ratio. Moreover, most peak callers currently use a generic null model which suffers from low specificity at the sensitivity level requisite for detecting subtle, but true, ChIP enrichment. The proposed method of determining a cell type-specific null model, which accounts for cell type-specific biases, is shown to be capable of achieving a lower false discovery rate at a given significance threshold than current methods. PMID:22499706
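    The generic null model the authors criticize can be sketched as a simple Poisson enrichment test: windows are flagged as peaks when their read counts are improbable under a single background rate. This is a minimal, hypothetical illustration (the function name and toy counts are invented); it is not the cell type-specific null model the paper proposes.

```python
import numpy as np
from scipy.stats import poisson

def call_peaks(window_counts, background_rate, alpha=1e-3):
    """Flag windows whose read counts are improbably high under a
    Poisson background with the given expected count per window."""
    # P(X >= k) under the background model, for each window
    pvals = poisson.sf(np.asarray(window_counts) - 1, background_rate)
    return pvals < alpha

# Toy example: background of ~5 reads/window, one enriched window
counts = [4, 6, 5, 40, 3]
print(call_peaks(counts, background_rate=5.0))
```

    A cell type-specific null, as advocated in the paper, would replace the single `background_rate` with a locally estimated, bias-corrected rate.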

  14. Assessment of Teaching Approaches in an Introductory Astronomy College Classroom

    NASA Astrophysics Data System (ADS)

    Alexander, William R.

    In recent years, there have been calls from the astronomy education research community for the increased use of learner-centered approaches to teaching, and systematic assessments of various teaching approaches using such tools as the Astronomy Diagnostic Test 2.0 (ADT 2.0). The research presented is a response to both calls. The ADT 2.0 was used in a modified form to obtain baseline assessments of introductory college astronomy classes that were taught in a traditional, mostly didactic manner. The ADT 2.0 (modified) was administered both before and after the completion of the courses. The courses were then altered to make modest use of learner-centered lecture tutorials. The ADT 2.0 (modified) was again administered before and after completion of the modified courses. Overall, the modest learner-centered approach showed mixed statistical results, with an increase in effect size (from medium to large), but no change in normalized gain index (both were low). Additionally, a mathematically rigorous approach showed no statistically significant improvements in conceptual understanding compared with a mathematically nonrigorous approach. This study will interpret the results from a variety of perspectives. The overall implementation of the lecture tutorials and their implications for teaching will also be discussed.

  15. Why so many "rigorous" evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute.

    PubMed

    Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene

    2016-04-01

    Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which unintended consequences (UCs) can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Proposal for a biometrics of the cortical surface: a statistical method for relative surface distance metrics

    NASA Astrophysics Data System (ADS)

    Bookstein, Fred L.

    1995-08-01

    Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for the representation of stochastically invaginated surfaces are not conducive to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.

  17. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    NASA Astrophysics Data System (ADS)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  18. Treatments of Missing Values in Large National Data Affect Conclusions: The Impact of Multiple Imputation on Arthroplasty Research.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N

    2018-03-01

    Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced by complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset) and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of which 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study, in comparison with multiple imputation, which included all 6117 patients. The 2 methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.
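    The contrast between complete case analysis and multiple imputation can be sketched on simulated data. The data, missingness mechanism, and variable names below are hypothetical (not drawn from the NSQIP dataset); the pooling of point estimates across imputations follows Rubin's rules in the simplest case, averaging the per-imputation estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: an albumin-like lab value correlated with age; younger
# patients are more likely to be missing the lab (as in the study).
n = 2000
age = rng.normal(65, 10, n)
lab = 0.05 * age + rng.normal(0, 1, n)
missing = rng.random(n) < np.clip(1.2 - age / 80, 0, 1)  # MAR via age
lab_obs = np.where(missing, np.nan, lab)

# Complete case analysis: drop rows with a missing lab value
cc_mean = np.nanmean(lab_obs)

# Multiple imputation: m stochastic draws from a regression model,
# with the pooled point estimate averaged over imputations.
obs = ~missing
beta, intercept = np.polyfit(age[obs], lab_obs[obs], 1)
resid_sd = np.std(lab_obs[obs] - (beta * age[obs] + intercept))
m = 20
estimates = []
for _ in range(m):
    draw = lab_obs.copy()
    draw[missing] = (beta * age[missing] + intercept
                     + rng.normal(0, resid_sd, missing.sum()))
    estimates.append(draw.mean())
mi_mean = np.mean(estimates)

print(f"true {lab.mean():.3f}  complete-case {cc_mean:.3f}  MI {mi_mean:.3f}")
```

    Because the complete cases are systematically older, their mean lab value is biased upward, while the imputation model recovers an estimate much closer to the full-data mean.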

  19. Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).

    PubMed

    Suzutani, T; Ishibashi, H; Takatori, T

    1978-11-01

    The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.

  20. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing

    PubMed Central

    Wang, Guoli; Ebrahimi, Nader

    2014-01-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data. PMID:25821345
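    The multiplicative updates whose monotonicity the paper proves can be sketched in their classical KL/Poisson special case of the Renyi-divergence family. This is a generic textbook implementation for illustration, not the authors' unified algorithm; the matrix sizes and rank are arbitrary toy choices.

```python
import numpy as np

def nmf_kl(V, k, iters=200, seed=0):
    """Multiplicative updates for NMF under the generalized KL
    divergence (the Poisson-likelihood case). Returns W, H with
    V approximately equal to W @ H; updates never increase the divergence."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    eps = 1e-12
    for _ in range(iters):
        H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
        W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H

def kl_div(V, WH):
    eps = 1e-12
    return np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH)

# Toy count matrix with rank-2 Poisson structure
rng = np.random.default_rng(1)
V = rng.poisson(rng.random((30, 2)) @ rng.random((2, 40)) * 10) + 0.0
W, H = nmf_kl(V, k=2)
print("final KL divergence:", kl_div(V, W @ H))
```

    Running more iterations from the same initialization can only lower the divergence, which is exactly the monotonicity property the paper establishes for the general Renyi case.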

  1. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing.

    PubMed

    Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader

    2015-04-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data.

  2. Exploring KM Features of High-Performance Companies

    NASA Astrophysics Data System (ADS)

    Wu, Wei-Wen

    2007-12-01

    In response to an increasingly competitive business environment, many companies emphasize the importance of knowledge management (KM). Exploring the KM features of high-performance companies is a favorable way to learn from them. However, identifying the critical KM features of high-performance companies is a qualitative analysis problem. The rough set approach is well suited to this kind of problem because it is based on data-mining techniques that discover knowledge without rigorous statistical assumptions. Thus, this paper explored the KM features of high-performance companies using the rough set approach. The results show that high-performance companies stress the importance of both tacit and explicit knowledge, and consider incentives and evaluations essential to implementing KM.
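    The core rough-set construction, lower and upper approximations of a concept from indiscernibility classes, needs no statistical assumptions and can be sketched in a few lines. The company names, attributes, and labels below are invented toy data, not the study's dataset.

```python
from collections import defaultdict

def approximations(objects, attrs, target):
    """Rough-set lower/upper approximation of a target concept.
    `objects` maps name -> attribute dict; `target` is the set of
    object names in the concept (e.g. 'high performance')."""
    # Indiscernibility classes: objects with identical attribute values
    classes = defaultdict(set)
    for name, values in objects.items():
        classes[tuple(values[a] for a in attrs)].add(name)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:          # class lies entirely inside the concept
            lower |= cls
        if cls & target:           # class overlaps the concept
            upper |= cls
    return lower, upper

# Toy example: hypothetical KM features of four companies
companies = {
    "A": {"tacit": 1, "incentives": 1},
    "B": {"tacit": 1, "incentives": 1},
    "C": {"tacit": 0, "incentives": 1},
    "D": {"tacit": 0, "incentives": 1},
}
high_performers = {"A", "B", "C"}
lo, up = approximations(companies, ["tacit", "incentives"], high_performers)
print(sorted(lo), sorted(up))
```

    Here C and D are indiscernible on the chosen attributes, so C falls in the boundary region: the concept "high performance" can only be approximated, which is precisely what rough-set rule induction exploits.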

  3. Exclusion Bounds for Extended Anyons

    NASA Astrophysics Data System (ADS)

    Larson, Simon; Lundholm, Douglas

    2018-01-01

    We introduce a rigorous approach to the many-body spectral theory of extended anyons, that is, quantum particles confined to two dimensions that interact via attached magnetic fluxes of finite extent. Our main results are many-body magnetic Hardy inequalities and local exclusion principles for these particles, leading to estimates for the ground-state energy of the anyon gas over the full range of the parameters. This brings out further non-trivial aspects in the dependence on the anyonic statistics parameter, and also gives improvements in the ideal (non-extended) case.

  4. Photons Revisited

    NASA Astrophysics Data System (ADS)

    Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg

    2014-06-01

    A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.

  5. Tsallis thermostatistics for finite systems: a Hamiltonian approach

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.; Moreira, André A.; Andrade, José S., Jr.; Almeida, Murilo P.

    2003-05-01

    The derivation of the Tsallis generalized canonical distribution from the traditional approach of the Gibbs microcanonical ensemble is revisited (Phys. Lett. A 193 (1994) 140). We show that finite systems whose Hamiltonians obey a generalized homogeneity relation rigorously follow the nonextensive thermostatistics of Tsallis. In the thermodynamical limit, however, our results indicate that the Boltzmann-Gibbs statistics is always recovered, regardless of the type of potential among interacting particles. This approach provides, moreover, a one-to-one correspondence between the generalized entropy and the Hamiltonian structure of a wide class of systems, revealing a possible origin for the intrinsic nonlinear features present in the Tsallis formalism that lead naturally to power-law behavior. Finally, we confirm these exact results through extensive numerical simulations of the Fermi-Pasta-Ulam chain of anharmonic oscillators.

  6. The log-periodic-AR(1)-GARCH(1,1) model for financial crashes

    NASA Astrophysics Data System (ADS)

    Gazola, L.; Fernandes, C.; Pizzinga, A.; Riera, R.

    2008-02-01

    This paper intends to meet recent claims for the attainment of more rigorous statistical methodology within the econophysics literature. To this end, we consider an econometric approach to investigate the outcomes of the log-periodic model of price movements, which has been largely used to forecast financial crashes. In order to accomplish reliable statistical inference for unknown parameters, we incorporate an autoregressive dynamic and a conditional heteroskedasticity structure in the error term of the original model, yielding the log-periodic-AR(1)-GARCH(1,1) model. Both the original and the extended models are fitted to financial indices of the U.S. market, namely the S&P500 and NASDAQ. Our analysis reveals two main points: (i) the log-periodic-AR(1)-GARCH(1,1) model has residuals with better statistical properties, and (ii) the estimation of the parameter concerning the time of the financial crash has been improved.

  7. Robust Statistical Detection of Power-Law Cross-Correlation.

    PubMed

    Blythe, Duncan A J; Nikulin, Vadim V; Müller, Klaus-Robert

    2016-06-02

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.

  8. Robust Statistical Detection of Power-Law Cross-Correlation

    PubMed Central

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-01-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram. PMID:27250630

  9. Statistical Approaches to Assess Biosimilarity from Analytical Data.

    PubMed

    Burdick, Richard; Coffey, Todd; Gutka, Hiten; Gratzl, Gyöngyi; Conlon, Hugh D; Huang, Chi-Ting; Boyne, Michael; Kuehne, Henriette

    2017-01-01

    Protein therapeutics have unique critical quality attributes (CQAs) that define their purity, potency, and safety. The analytical methods used to assess CQAs must be able to distinguish clinically meaningful differences in comparator products, and the most important CQAs should be evaluated with the most statistical rigor. High-risk CQA measurements assess the most important attributes that directly impact the clinical mechanism of action or have known implications for safety, while the moderate- to low-risk characteristics may have a lower direct impact and thereby may have a broader range to establish similarity. Statistical equivalence testing is applied for high-risk CQA measurements to establish the degree of similarity (e.g., highly similar fingerprint, highly similar, or similar) of selected attributes. Notably, some high-risk CQAs (e.g., primary sequence or disulfide bonding) are qualitative (e.g., the same as the originator or not the same) and therefore not amenable to equivalence testing. For biosimilars, an important step is the acquisition of a sufficient number of unique originator drug product lots to measure the variability in the originator drug manufacturing process and provide sufficient statistical power for the analytical data comparisons. Together, these analytical evaluations, along with PK/PD and safety data (immunogenicity), provide the data necessary to determine if the totality of the evidence warrants a designation of biosimilarity and subsequent licensure for marketing in the USA. In this paper, a case study approach is used to provide examples of analytical similarity exercises and the appropriateness of statistical approaches for the example data.
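    Statistical equivalence testing for a quantitative CQA is commonly carried out with two one-sided tests (TOST): equivalence of the mean difference within a pre-specified margin is concluded when both one-sided nulls are rejected. The sketch below is a generic illustration; the margin, lot counts, and potency values are invented assumptions, not values from the paper.

```python
import numpy as np
from scipy import stats

def tost(x, y, margin, alpha=0.05):
    """Two one-sided tests (TOST): conclude equivalence of the mean
    difference within +/-margin if both one-sided p-values < alpha.
    Equivalent to the (1 - 2*alpha) CI lying inside the margin."""
    nx, ny = len(x), len(y)
    diff = np.mean(x) - np.mean(y)
    se = np.sqrt(np.var(x, ddof=1) / nx + np.var(y, ddof=1) / ny)
    df = nx + ny - 2  # simple pooled-df approximation
    p_lower = stats.t.sf((diff + margin) / se, df)   # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)  # H0: diff >= +margin
    return max(p_lower, p_upper) < alpha

# Toy potency measurements for originator and candidate lots
rng = np.random.default_rng(2)
originator = rng.normal(100.0, 2.0, 30)
biosimilar = rng.normal(100.3, 2.0, 30)
print(tost(originator, biosimilar, margin=3.0))
```

    Note how the outcome depends on both the margin and the number of lots, which is why acquiring enough unique originator lots, as the abstract emphasizes, is essential for adequate statistical power.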

  10. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples

    PubMed Central

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-01-01

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell. DOI: http://dx.doi.org/10.7554/eLife.26580.001 PMID:28678007

  11. Statistical issues in signal extraction from microarrays

    NASA Astrophysics Data System (ADS)

    Bergemann, Tracy; Quiaoit, Filemon; Delrow, Jeffrey J.; Zhao, Lue Ping

    2001-06-01

    Microarray technologies are increasingly used in biomedical research to study genome-wide expression profiles in the post-genomic era. Their popularity is largely due to their high throughput and economical affordability. For example, microarrays have been applied to studies of cell cycle, regulatory circuitry, cancer cell lines, tumor tissues, and drug discoveries. One obstacle facing the continued success of applying microarray technologies, however, is the random variation present on microarrays: within signal spots, between spots, and among chips. In addition, signals extracted by available software packages seem to vary significantly. Despite a variety of software packages, it appears that there are two major approaches to signal extraction. One approach is to focus on the identification of signal regions and hence estimation of signal levels above background levels. The other approach is to use the distribution of intensity values as a way of identifying relevant signals. Building upon both approaches, the objective of our work is to develop a method that is statistically rigorous and also efficient and robust. Statistical issues to be considered here include: (1) how to refine grid alignment so that the overall variation is minimized, (2) how to estimate the signal levels relative to the local background levels as well as the variance of this estimate, and (3) how to integrate red and green channel signals so that the ratio of interest is stable, simultaneously relaxing distributional assumptions.

  12. Applying Sociocultural Theory to Teaching Statistics for Doctoral Social Work Students

    ERIC Educational Resources Information Center

    Mogro-Wilson, Cristina; Reeves, Michael G.; Charter, Mollie Lazar

    2015-01-01

    This article describes the development of two doctoral-level multivariate statistics courses utilizing sociocultural theory, an integrative pedagogical framework. In the first course, the implementation of sociocultural theory helps to support the students through a rigorous introduction to statistics. The second course involves students…

  13. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    ERIC Educational Resources Information Center

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  14. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data.
This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
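    The two information-theoretic quantities the approach relies on, Shannon entropy for methylation stochasticity and the Jensen-Shannon distance for test-vs-reference comparison, can be illustrated on toy distributions. The methylation-level histograms below are invented for illustration, not derived from the paper's Ising model or its data.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

# Hypothetical per-region methylation-level distributions: fraction
# of reads with 0..4 methylated CpG sites out of 4 in the region.
normal = np.array([0.70, 0.15, 0.08, 0.05, 0.02])
cancer = np.array([0.20, 0.20, 0.20, 0.20, 0.20])  # more stochastic

def entropy(p):
    """Shannon entropy (bits) as a measure of methylation stochasticity."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Jensen-Shannon distance (base 2) between test and reference samples
print(f"entropy normal={entropy(normal):.2f}  cancer={entropy(cancer):.2f}")
print(f"JS distance={jensenshannon(normal, cancer, base=2):.3f}")
```

    The uniform "cancer" distribution has maximal entropy, and a large Jensen-Shannon distance from the reference flags the region as differentially methylated, mirroring the roles these quantities play in the paper's framework.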

  15. 75 FR 5291 - Office of Special Education and Rehabilitative Services; Overview Information; Technology and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-02

    ... two phases: (1) Development and (2) research on effectiveness. Abstracts of projects funded under... approaches. Phase 2 projects must subject technology-based approaches to rigorous field-based research to... disabilities; (2) Present a justification, based on scientifically rigorous research or theory, that supports...

  16. 76 FR 12343 - Technology and Media Services for Individuals With Disabilities-Steppingstones of Technology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-07

    ... two phases: (1) Development and (2) research on effectiveness. Abstracts of projects funded under... approaches. Phase 2 projects must subject technology-based approaches to rigorous field-based research to... scientifically rigorous research or theory, that demonstrates the potential effectiveness of the technology-based...

  17. Statistical Inference for Data Adaptive Target Parameters.

    PubMed

    Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J

    2016-05-01

    Suppose one observes n i.i.d. copies of a random variable with a probability distribution that is known to be an element of a particular statistical model. In order to define our statistical target, we partition the sample into V equal-size sub-samples and use this partitioning to define V splits, each consisting of an estimation sample (one of the V sub-samples) and the corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V sample-specific target parameters. We present an estimator (and corresponding central limit theorem) of this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed within this paper provides a new impetus for a greater involvement of statistical inference in problems that are increasingly addressed by clever, yet ad hoc, pattern-finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules, are presented and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
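    The sample-splitting scheme described above can be sketched in a few lines of stdlib Python. The "algorithm" used here, picking the higher-variance coordinate and targeting its mean, is an assumption chosen purely for illustration, not an example from the paper:

```python
import random
import statistics

def data_adaptive_estimate(data, V=5):
    """Sample-split estimator of a data-adaptive target parameter.

    For each of V splits: an algorithm applied to the parameter-generating
    sample (the other V - 1 folds) defines the target (here: the mean of the
    higher-variance coordinate), which is then estimated on the held-out
    estimation fold. The final estimate averages the V fold estimates.
    """
    folds = [data[v::V] for v in range(V)]
    estimates = []
    for v in range(V):
        estimation = folds[v]
        generating = [row for u in range(V) if u != v for row in folds[u]]
        # Data-adaptive step: choose the coordinate with larger sample variance.
        j = max(range(2), key=lambda k: statistics.variance(r[k] for r in generating))
        # Estimation step on the complementary held-out fold.
        estimates.append(statistics.mean(r[j] for r in estimation))
    return statistics.mean(estimates)

random.seed(0)
sample = [(random.gauss(1.0, 2.0), random.gauss(0.0, 0.5)) for _ in range(200)]
print(data_adaptive_estimate(sample))  # should be near 1.0 (coord 0 is chosen)
```

    Because every observation is used exactly once for estimation and V - 1 times for parameter generation, the averaged estimate remains interpretable even though the target itself was chosen from the data.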

  18. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Udey, Ruth Norma

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  19. How Osmolytes Counteract Pressure Denaturation on a Molecular Scale.

    PubMed

    Shimizu, Seishi; Smith, Paul E

    2017-08-18

    Life in the deep sea exposes enzymes to high hydrostatic pressure, which decreases their stability. For survival, deep sea organisms tend to accumulate various osmolytes, most notably trimethylamine N-oxide used by fish, to counteract pressure denaturation. However, exactly how these osmolytes work remains unclear. Here, a rigorous statistical thermodynamics approach is used to clarify the mechanism of osmoprotection. It is shown that the weak, nonspecific, and dynamic interactions of water and osmolytes with proteins can be characterized only statistically, and that the competition between protein-osmolyte and protein-water interactions is crucial in determining conformational stability. Osmoprotection is driven by a stronger exclusion of osmolytes from the denatured protein than from the native conformation, and water distribution has no significant effect on these changes at low osmolyte concentrations. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    NASA Astrophysics Data System (ADS)

    Cenek, Martin; Dahl, Spencer K.

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.

  2. A Rigorous Statistical Approach to Determine Solar Wind Composition from ACE/SWICS Data, and New Ne/O Ratios

    NASA Astrophysics Data System (ADS)

    Shearer, P.; Jawed, M. K.; Raines, J. M.; Lepri, S. T.; Gilbert, J. A.; von Steiger, R.; Zurbuchen, T.

    2013-12-01

    The SWICS instruments aboard ACE and Ulysses have performed in situ measurements of individual solar wind ions for a period spanning over two decades. Solar wind composition is determined by accumulating the measurements into an ion count histogram in which each species appears as a distinct peak. Assigning counts to the appropriate species is a challenging statistical problem because of the limited counts for some species and overlap between some peaks. We show that the most commonly used count assignment methods can suffer from significant bias when a highly abundant species overlaps with a much less abundant one. For ACE/SWICS data, this bias results in an overestimated Ne/O ratio. Bias is greatly reduced by switching to a rigorous maximum likelihood count assignment method, resulting in a 30-50% reduction in the estimated Ne abundance. We will discuss the new Ne/O values and put them in context with the solar system abundances for Ne derived from other techniques, such as in situ collection from Genesis and its heritage instrument, the Solar Foil experiment during the Apollo era. The new count assignment method is currently being applied to reanalyze the archived ACE and Ulysses data and obtain revised abundances of C, N, O, Ne, Mg, Si, S, and Fe, leading to revised datasets that will be made publicly available.
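    The bias-versus-maximum-likelihood point generalizes beyond SWICS: when an abundant peak overlaps a much rarer one, hard windowing misassigns tail counts, whereas ML estimation of the peak weights does not. A stdlib Python sketch using an EM fit of two fixed Gaussian peaks (the peak positions, widths, and 95:5 toy abundances are assumptions for illustration, not SWICS calibration values):

```python
import math
import random

def em_two_gaussians(x, mu, sigma, iters=200):
    """ML estimate of mixture weights for two fixed, overlapping peaks.

    Peak positions and widths (mu, sigma) are assumed known from calibration;
    EM estimates only the species abundances (mixture weights).
    """
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each peak for each count
        # (the constant 1/sqrt(2*pi) cancels in the normalization).
        post = []
        for xi in x:
            like = [w[k] * math.exp(-0.5 * ((xi - mu[k]) / sigma[k]) ** 2) / sigma[k]
                    for k in range(2)]
            s = sum(like)
            post.append([l / s for l in like])
        # M-step: new weights are the average responsibilities.
        w = [sum(p[k] for p in post) / len(x) for k in range(2)]
    return w

random.seed(1)
# Abundant species at 0, rare species at 1.5 (overlapping peaks), true ratio 95:5
x = [random.gauss(0.0, 1.0) for _ in range(1900)] + \
    [random.gauss(1.5, 1.0) for _ in range(100)]
w = em_two_gaussians(x, mu=[0.0, 1.5], sigma=[1.0, 1.0])
print(w)  # should recover weights close to the true 95:5 ratio
```

    Assigning each count to its nearest peak instead would systematically inflate the rare species, which mirrors the Ne/O overestimate described in the abstract.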

  3. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    NASA Astrophysics Data System (ADS)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  4. Mourning dove hunting regulation strategy based on annual harvest statistics and banding data

    USGS Publications Warehouse

    Otis, D.L.

    2006-01-01

    Although managers should strive to base game bird harvest management strategies on mechanistic population models, the monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data can serve as the basis for hunting regulation decisions driven by the population growth rates they imply. I present a statistically rigorous approach to regulation decision-making using a hypothesis-testing framework and an assumed set of 3 hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
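    The decision rule outlined in the abstract can be made concrete with a stdlib Python sketch: derive the population from harvest data, then test the growth rate against thresholds to pick one of three regulation alternatives. The thresholds, significance level, and standard error below are hypothetical placeholders, not values from the paper:

```python
def population_from_harvest(total_harvest, harvest_rate):
    """Derived population estimate: N = H / h (Lincoln-type estimator)."""
    return total_harvest / harvest_rate

def regulation_decision(n_prev, n_curr, se_lambda, lower=0.95, upper=1.05, z=1.645):
    """One-sided tests of the growth rate against hypothetical thresholds.

    'restrictive' if we can reject lambda >= lower, 'liberal' if we can
    reject lambda <= upper, otherwise keep the 'status quo'.
    """
    lam = n_curr / n_prev
    if lam + z * se_lambda < lower:
        return "restrictive"
    if lam - z * se_lambda > upper:
        return "liberal"
    return "status quo"

n1 = population_from_harvest(20_000_000, 0.10)   # year 1: 200M birds
n2 = population_from_harvest(18_000_000, 0.10)   # year 2: 180M, lambda = 0.9
print(regulation_decision(n1, n2, se_lambda=0.02))  # -> restrictive
```

    In practice the standard error of lambda would itself be derived from the sampling variances of the harvest and harvest-rate estimates.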

  5. Statistical classification approach to discrimination between weak earthquakes and quarry blasts recorded by the Israel Seismic Network

    NASA Astrophysics Data System (ADS)

    Kushnir, A. F.; Troitsky, E. V.; Haikin, L. M.; Dainty, A.

    1999-06-01

    A semi-automatic procedure has been developed to achieve statistically optimum discrimination between earthquakes and explosions at local or regional distances based on a learning set specific to a given region. The method is used for step-by-step testing of candidate discrimination features to find the optimum (combination) subset of features, with the decision taken on a rigorous statistical basis. Linear (LDF) and Quadratic (QDF) Discriminant Functions based on Gaussian distributions of the discrimination features are implemented and statistically grounded; the features may be transformed by the Box-Cox transformation z = (1/α)(y^α - 1) to make them more Gaussian. Tests of the method were successfully conducted on seismograms from the Israel Seismic Network using features consisting of spectral ratios between and within phases. Results showed that the QDF was more effective than the LDF and required five features out of 18 candidates for the optimum set. It was found that discrimination improved with increasing distance within the local range, and that eliminating transformation of the features and failing to correct for noise led to degradation of discrimination.
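    For a single feature, the Box-Cox step and the discriminant comparison look like the following stdlib Python sketch (the class means, variances, priors, and alpha are hypothetical learning-set summaries, not values from the Israel Seismic Network study):

```python
import math

def box_cox(y, alpha):
    """Box-Cox transform z = (y**alpha - 1) / alpha (log(y) as alpha -> 0)."""
    return math.log(y) if alpha == 0 else (y ** alpha - 1) / alpha

def qdf_score(z, mean, var, prior):
    """Quadratic discriminant score: log prior + Gaussian log-likelihood."""
    return math.log(prior) - 0.5 * math.log(2 * math.pi * var) \
           - (z - mean) ** 2 / (2 * var)

def classify(y, alpha, classes):
    """Assign a feature value to the class with the highest QDF score."""
    z = box_cox(y, alpha)
    return max(classes, key=lambda c: qdf_score(z, *classes[c]))

# Hypothetical 1-D learning-set summaries for one spectral-ratio feature:
classes = {
    "earthquake": (0.0, 1.00, 0.5),   # (mean, variance, prior) after Box-Cox
    "explosion":  (2.0, 0.25, 0.5),
}
print(classify(5.0, alpha=0.5, classes=classes))
```

    Because each class keeps its own variance, this is the QDF; forcing a single pooled variance for all classes would reduce it to the LDF, which the study found less effective.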

  6. Statistical moments of the Strehl ratio

    NASA Astrophysics Data System (ADS)

    Yaitskova, Natalia; Esselborn, Michael; Gladysz, Szymon

    2012-07-01

    Knowledge of the statistical characteristics of the Strehl ratio is essential for the performance assessment of existing and future adaptive optics systems. For a full assessment, not only the mean value of the Strehl ratio but also its higher statistical moments are important. Variance is related to the stability of an image, and skewness reflects the chance of having, in a set of short-exposure images, more or fewer images with quality exceeding the mean. Skewness is a central parameter in the domain of lucky imaging. We present a rigorous theory for the calculation of the mean value, the variance and the skewness of the Strehl ratio. In our approach we represent the residual wavefront as being formed by independent cells. The level of the adaptive optics correction defines the number of cells and the variance of the cells, which are the two main parameters of our theory. The deliverables are the values of the three moments as functions of the correction level. We make no further assumptions except for the statistical independence of the cells.

  7. Nanophotonic light-trapping theory for solar cells

    NASA Astrophysics Data System (ADS)

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2011-11-01

    Conventional light-trapping theory, based on a ray-optics approach, was developed for standard thick photovoltaic cells. The classical theory established an upper limit for possible absorption enhancement in this context and provided a design strategy for reaching this limit. This theory has become the foundation for light management in bulk silicon PV cells, and has had enormous influence on the optical design of solar cells in general. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the standard limit can be substantially surpassed when optical modes in the active layer are confined to deep-subwavelength scale, opening new avenues for highly efficient next-generation solar cells.

  8. Determining X-ray source intensity and confidence bounds in crowded fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Primini, F. A.; Kashyap, V. L., E-mail: fap@head.cfa.harvard.edu

    We present a rigorous description of the general problem of aperture photometry in high-energy astrophysics photon-count images, in which the statistical noise model is Poisson, not Gaussian. We compute the full posterior probability density function for the expected source intensity for various cases of interest, including the important cases in which both source and background apertures contain contributions from the source, and when multiple source apertures partially overlap. A Bayesian approach offers the advantages of allowing one to (1) include explicit prior information on source intensities, (2) propagate posterior distributions as priors for future observations, and (3) use Poisson likelihoods, making the treatment valid in the low-counts regime. Elements of this approach have been implemented in the Chandra Source Catalog.
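    The heart of the single-aperture case is Gamma-Poisson conjugacy: a Gamma prior on the Poisson intensity yields a closed-form Gamma posterior, which can in turn be propagated as the prior for a future observation. A stdlib Python sketch (the counts, exposure, and flat prior are hypothetical; the catalog's full treatment of overlapping apertures is not reproduced here):

```python
import random

def gamma_poisson_posterior(counts, exposure, a0=1.0, b0=0.0):
    """Conjugate update: Gamma(a0, b0) prior + Poisson data -> Gamma posterior.

    Returns the posterior shape and rate (a0 + counts, b0 + exposure). Valid
    in the low-counts regime, where a Gaussian noise model would not be.
    """
    return a0 + counts, b0 + exposure

random.seed(2)
a, b = gamma_poisson_posterior(counts=12, exposure=3.0)  # e.g. 12 photons in 3 ks
mean = a / b
# Equal-tailed ~68% credible interval via Monte Carlo draws from the posterior.
draws = sorted(random.gammavariate(a, 1 / b) for _ in range(20000))
lo, hi = draws[int(0.16 * len(draws))], draws[int(0.84 * len(draws))]
print(mean, (lo, hi))
```

    Repeating the update with the previous posterior as `a0, b0` implements advantage (2) from the abstract: posteriors propagated as priors for future observations.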

  9. The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools

    ERIC Educational Resources Information Center

    Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia

    2016-01-01

    Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…

  10. Causal inference as an emerging statistical approach in neurology: an example for epilepsy in the elderly.

    PubMed

    Moura, Lidia Mvr; Westover, M Brandon; Kwasnik, David; Cole, Andrew J; Hsu, John

    2017-01-01

    The elderly population faces an increasing number of cases of chronic neurological conditions, such as epilepsy and Alzheimer's disease. Because the elderly with epilepsy are commonly excluded from randomized controlled clinical trials, there are few rigorous studies to guide clinical practice. When the elderly are eligible for trials, they either rarely participate or frequently have poor adherence to therapy, thus limiting both generalizability and validity. In contrast, large observational data sets are increasingly available, but are susceptible to bias when using common analytic approaches. Recent developments in causal inference-analytic approaches also introduce the possibility of emulating randomized controlled trials to yield valid estimates. We provide a practical example of the application of the principles of causal inference to a large observational data set of patients with epilepsy. This review also provides a framework for comparative-effectiveness research in chronic neurological conditions.

  11. Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.

    PubMed

    Montalvo-Acosta, Joel José; Cecchini, Marco

    2016-12-01

    The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Statistical approach for selection of biologically informative genes.

    PubMed

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Most gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has typically been judged through post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that, under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection among the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR.
    This study will provide a practical guide for selecting informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
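    The relevance-minus-redundancy idea behind MRMR-style selection can be sketched with plain Pearson correlations in stdlib Python. This is a simplification: the published Boot-MRMR combines mutual-information-based MRMR scores with bootstrap resampling, and the gene values below are toy data:

```python
import statistics

def pearson(u, v):
    """Pearson correlation between two equal-length sequences."""
    mu, mv = statistics.mean(u), statistics.mean(v)
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den

def mrmr_select(genes, target, k):
    """Greedy selection maximizing relevance minus redundancy (correlation-based)."""
    selected = []
    candidates = list(genes)
    while len(selected) < k and candidates:
        def score(g):
            relevance = abs(pearson(genes[g], target))
            redundancy = (statistics.mean(abs(pearson(genes[g], genes[s]))
                                          for s in selected) if selected else 0.0)
            return relevance - redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

target = [1, 0, 3, 2, 5, 4, 7, 6]            # hypothetical trait values
genes = {
    "g1": [0, 1, 2, 3, 4, 5, 6, 7],          # strongly relevant
    "g2": [0, 1, 2, 3, 4, 5, 6, 7],          # duplicate of g1: fully redundant
    "g3": [1, -1, 1, -1, 1, -1, 1, -1],      # weakly relevant, non-redundant
}
print(mrmr_select(genes, target, 2))          # picks g3 over the redundant g2
```

    The redundancy penalty is what makes the duplicate g2 lose to the weakly relevant but complementary g3, which is the behaviour a pure relevance ranking cannot produce.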

  13. Rigorous Characterisation of a Novel, Statistically-Based Ocean Colour Algorithm for the PACE Mission

    NASA Astrophysics Data System (ADS)

    Craig, S. E.; Lee, Z.; Du, K.; Lin, J.

    2016-02-01

    An approach based on empirical orthogonal function (EOF) analysis of ocean colour spectra has been shown to accurately derive inherent optical properties (IOPs) and chlorophyll concentration in scenarios, such as optically complex waters, where standard algorithms often perform poorly. The algorithm has been successfully used in a number of regional applications, and has also shown promise in a global implementation based on the NASA NOMAD data set. Additionally, it has demonstrated the unique ability to derive ocean colour products from top of atmosphere (TOA) signals with either no or minimal atmospheric correction applied. Due to its high potential for use over coastal and inland waters, the EOF approach is currently being rigorously characterised as part of a suite of approaches that will be used to support the new NASA ocean colour mission, PACE (Pre-Aerosol, Clouds and ocean Ecosystem). A major component in this model characterisation is the generation of a synthetic TOA data set using a coupled ocean-atmosphere radiative transfer model, which has been run to mimic PACE spectral resolution, and under a wide range of geographical locations, water constituent concentrations, and sea surface and atmospheric conditions. The resulting multidimensional data set will be analysed, and results presented on the sensitivity of the model to various combinations of parameters, and preliminary conclusions made regarding the optimal implementation strategy of this promising approach (e.g. on a global, optical water type or regional basis). This will provide vital guidance for operational implementation of the model for both existing satellite ocean colour sensors and the upcoming PACE mission.

  14. Improvements in Modelling Bystander and Resident Exposure to Pesticide Spray Drift: Investigations into New Approaches for Characterizing the 'Collection Efficiency' of the Human Body.

    PubMed

    Butler Ellis, M Clare; Kennedy, Marc C; Kuster, Christian J; Alanis, Rafael; Tuck, Clive R

    2018-05-28

    The BREAM (Bystander and Resident Exposure Assessment Model) (Kennedy et al. in BREAM: A probabilistic bystander and resident exposure assessment model of spray drift from an agricultural boom sprayer. Comput Electron Agric 2012;88:63-71) for bystander and resident exposure to spray drift from boom sprayers has recently been incorporated into the European Food Safety Authority (EFSA) guidance for determining non-dietary exposures of humans to plant protection products. The component of BREAM that relates airborne spray concentrations to bystander and resident dermal exposure has been reviewed to identify whether the component itself, and its description of the variability captured in the model, can be improved. Two approaches have been explored: a more rigorous statistical analysis of the empirical data, and a semi-mechanistic model based on established studies combined with new data obtained in a wind tunnel. A statistical comparison between field data and model outputs was used to determine which approach gave the better prediction of exposures. The semi-mechanistic approach gave the better prediction of experimental data and resulted in a reduction in the proposed regulatory values for the 75th and 95th percentiles of the exposure distribution.

  15. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    PubMed Central

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
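    The statistical point of the abstract is that a latency bin should be declared synchronized only if its count is improbable under an explicit null of independent firings, with correction for the many latencies tested. A stdlib Python sketch in that spirit, using an exact binomial test with a Bonferroni correction (an illustrative stand-in, not the published SigMax algorithm; the histogram is toy data):

```python
import math

def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p), in log space."""
    total = 0.0
    for i in range(k, n + 1):
        logpmf = (math.lgamma(n + 1) - math.lgamma(i + 1) - math.lgamma(n - i + 1)
                  + i * math.log(p) + (n - i) * math.log1p(-p))
        total += math.exp(logpmf)
    return min(total, 1.0)

def significant_latencies(hist, alpha=0.05):
    """Bonferroni-corrected exact tests of each latency bin vs a uniform null.

    A bin is flagged only if its count is improbably large assuming firings
    fall independently and uniformly across the L latency bins.
    """
    n, L = sum(hist), len(hist)
    cutoff = alpha / L                   # correct for testing L latencies at once
    return [i for i, k in enumerate(hist) if binom_sf(k, n, 1 / L) < cutoff]

# 20 latency bins of ~50 counts each, one genuinely synchronized bin at index 10
hist = [50] * 20
hist[10] = 120
print(significant_latencies(hist))
```

    Without the multiple-testing correction, random fluctuations across many latencies produce exactly the spurious detections the abstract attributes to the z-score method.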

  16. Graphical Descriptives: A Way to Improve Data Transparency and Methodological Rigor in Psychology.

    PubMed

    Tay, Louis; Parrigon, Scott; Huang, Qiming; LeBreton, James M

    2016-09-01

    Several calls have recently been issued to the social sciences for enhanced transparency of research processes and enhanced rigor in the methodological treatment of data and data analytics. We propose the use of graphical descriptives (GDs) as one mechanism for responding to both of these calls. GDs provide a way to visually examine data. They serve as quick and efficient tools for checking data distributions, variable relations, and the potential appropriateness of different statistical analyses (e.g., do data meet the minimum assumptions for a particular analytic method). Consequently, we believe that GDs can promote increased transparency in the journal review process, encourage best practices for data analysis, and promote a more inductive approach to understanding psychological data. We illustrate the value of potentially including GDs as a step in the peer-review process and provide a user-friendly online resource (www.graphicaldescriptives.org) for researchers interested in including data visualizations in their research. We conclude with suggestions on how GDs can be expanded and developed to enhance transparency. © The Author(s) 2016.

  17. Statistical learning theory for high dimensional prediction: Application to criterion-keyed scale development.

    PubMed

    Chapman, Benjamin P; Weiss, Alexander; Duberstein, Paul R

    2016-12-01

    Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in "big data" problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how 3 common SLT algorithms-supervised principal components, regularization, and boosting-can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach-or perhaps because of them-SLT methods may hold value as a statistically rigorous approach to exploratory regression. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
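    The core SLT move, estimating expected prediction error (EPE) by cross-validation and preferring extra model complexity only when it pays off out of sample, can be sketched on a toy regression in stdlib Python (the two candidate models and the simulated data are assumptions for illustration, not the criterion-keyed scale application):

```python
import random
import statistics

def fit_linear(pts):
    """Closed-form simple linear regression; returns (slope, intercept)."""
    xs = [p[0] for p in pts]
    mx, my = statistics.mean(xs), statistics.mean(p[1] for p in pts)
    slope = sum((x - mx) * (y - my) for x, y in pts) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def cv_epe(pts, model, K=5):
    """K-fold cross-validation estimate of expected prediction error (MSE)."""
    folds = [pts[k::K] for k in range(K)]
    errs = []
    for k in range(K):
        train = [p for j in range(K) if j != k for p in folds[j]]
        predict = model(train)                      # fit on K-1 folds...
        errs.extend((predict(x) - y) ** 2 for x, y in folds[k])  # ...test held out
    return statistics.mean(errs)

def constant_model(train):
    m = statistics.mean(y for _, y in train)
    return lambda x: m

def linear_model(train):
    a, b = fit_linear(train)
    return lambda x: a * x + b

random.seed(3)
data = [(x, 2.0 * x + 1.0 + random.gauss(0, 0.5))
        for x in [random.uniform(0, 10) for _ in range(100)]]
print(cv_epe(data, constant_model), cv_epe(data, linear_model))
```

    Here the linear model attains a much lower estimated EPE, so the added complexity is justified; maximizing within-sample likelihood alone could never penalize an overly complex candidate this way.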

  18. Topological Isomorphisms of Human Brain and Financial Market Networks

    PubMed Central

    Vértes, Petra E.; Nicol, Ruth M.; Chapman, Sandra C.; Watkins, Nicholas W.; Robertson, Duncan A.; Bullmore, Edward T.

    2011-01-01

    Although metaphorical and conceptual connections between the human brain and the financial markets have often been drawn, rigorous physical or mathematical underpinnings of this analogy remain largely unexplored. Here, we apply a statistical and graph theoretic approach to the study of two datasets – the time series of 90 stocks from the New York stock exchange over a 3-year period, and the fMRI-derived time series acquired from 90 brain regions over the course of a 10-min-long functional MRI scan of resting brain function in healthy volunteers. Despite the many obvious substantive differences between these two datasets, graphical analysis demonstrated striking commonalities in terms of global network topological properties. Both the human brain and the market networks were non-random, small-world, modular, hierarchical systems with fat-tailed degree distributions indicating the presence of highly connected hubs. These properties could not be trivially explained by the univariate time series statistics of stock price returns. This degree of topological isomorphism suggests that brains and markets can be regarded broadly as members of the same family of networks. The two systems, however, were not topologically identical. The financial market was more efficient and more modular – more highly optimized for information processing – than the brain networks; but also less robust to systemic disintegration as a result of hub deletion. We conclude that the conceptual connections between brains and markets are not merely metaphorical; rather these two information processing systems can be rigorously compared in the same mathematical language and turn out often to share important topological properties in common to some degree. There will be interesting scientific arbitrage opportunities in further work at the graph-theoretically mediated interface between systems neuroscience and the statistical physics of financial markets. PMID:22007161

  19. The importance of early investigation and publishing in an emergent health and environment crisis.

    PubMed

    Murase, Kaori

    2016-10-01

    To minimize the damage resulting from a long-term environmental disaster such as the 2011 Fukushima nuclear accident in Japan, early disclosure of research data by scientists and prompt decision making by government authorities are required in place of careful, time-consuming research and deliberation about the consequences and cause of the accident. A Bayesian approach with flexible statistical modeling helps scientists and encourages government authorities to make decisions based on environmental data available in the early stages of a disaster. It is evident from Fukushima and similar accidents that classical research methods involving statistical methodologies that require rigorous experimental design and complex data sets are too cumbersome and delay important actions that may be critical in the early stages of an environmental disaster. Integr Environ Assess Manag 2016;12:680-682. © 2016 SETAC.
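    The flexible Bayesian modeling advocated above can be illustrated with the simplest conjugate case: a Beta-Binomial update of an exceedance rate from a handful of early samples. This is a hedged sketch with invented prior and data, not the models used in the paper.

```python
def beta_binomial_update(a, b, successes, trials):
    """Conjugate Bayesian update: Beta(a, b) prior + binomial data.

    Returns the posterior shape parameters and the posterior mean,
    which is available the moment the first few observations arrive."""
    a_post = a + successes
    b_post = b + (trials - successes)
    return a_post, b_post, a_post / (a_post + b_post)
```

    With a flat Beta(1, 1) prior and 3 positives among 10 early samples, the posterior is Beta(4, 8) with mean 1/3; the estimate sharpens automatically as more data arrive, without waiting for a designed experiment.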

  20. On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method

    PubMed Central

    Roux, Benoît; Weare, Jonathan

    2013-01-01

    An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method. PMID:23464140
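    A minimal sketch of the maximum entropy idea referenced above: bias a reference distribution p0 by exp(-λf) and solve for the Lagrange multiplier λ so that a target average of the observable f is matched. The discrete toy distribution and bisection solver are illustrative assumptions, not the molecular dynamics machinery discussed in the paper.

```python
import math

def maxent_reweight(p0, f, target):
    """Minimally bias reference probabilities p0 so the mean of f hits target.

    Solves for lam in p ~ p0 * exp(-lam * f) by bisection; the
    reweighted mean of f is monotone decreasing in lam."""
    def mean_f(lam):
        w = [p * math.exp(-lam * fi) for p, fi in zip(p0, f)]
        z = sum(w)
        return sum(wi * fi for wi, fi in zip(w, f)) / z

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_f(mid) > target:
            lo = mid  # mean still too high: increase the multiplier
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [p * math.exp(-lam * fi) for p, fi in zip(p0, f)]
    z = sum(w)
    return lam, [wi / z for wi in w]
```

    The iterative determination of unknown multipliers that the abstract calls "cumbersome in practical applications" is exactly this solve, repeated for every experimental restraint.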

  1. Asteroid orbital error analysis: Theory and application

    NASA Technical Reports Server (NTRS)

    Muinonen, K.; Bowell, Edward

    1992-01-01

    We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation does give the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
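    In the linearized Gaussian case, the law of error propagation mentioned above reduces to cov' = A cov Aᵀ for a state-transition (or design) matrix A. A small pure-Python sketch with an invented 2×2 example:

```python
def mat_mul(A, B):
    """Dense matrix product (matrices as lists of rows)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(row) for row in zip(*A)]

def propagate_covariance(A, cov):
    """Linearized error propagation: cov' = A cov A^T."""
    return mat_mul(mat_mul(A, cov), transpose(A))
```

    For example, a shear-like transition matrix [[1, 1], [0, 1]] applied to a diagonal covariance [[1, 0], [0, 4]] yields [[5, 4], [4, 4]]: the uncertainty in the second element leaks into the first, and correlations appear, exactly as positional uncertainty ellipsoids grow when an orbit is propagated in time.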

  2. Ice Mass Change in Greenland and Antarctica Between 1993 and 2013 from Satellite Gravity Measurements

    NASA Technical Reports Server (NTRS)

    Talpe, Matthieu J.; Nerem, R. Steven; Forootan, Ehsan; Schmidt, Michael; Lemoine, Frank G.; Enderlin, Ellyn M.; Landerer, Felix W.

    2017-01-01

    We construct long-term time series of Greenland and Antarctic ice sheet mass change from satellite gravity measurements. A statistical reconstruction approach is developed based on a principal component analysis (PCA) to combine high-resolution spatial modes from the Gravity Recovery and Climate Experiment (GRACE) mission with the gravity information from conventional satellite tracking data. Uncertainties of this reconstruction are rigorously assessed; they include temporal limitations for short GRACE measurements, spatial limitations for the low-resolution conventional tracking data measurements, and limitations of the estimated statistical relationships between low- and high-degree potential coefficients reflected in the PCA modes. Trends of mass variations in Greenland and Antarctica are assessed against a number of previous studies. The resulting time series for Greenland show a higher rate of mass loss than other methods before 2000, while the Antarctic ice sheet appears heavily influenced by interannual variations.

  3. Methodological quality of behavioural weight loss studies: a systematic review

    PubMed Central

    Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.

    2018-01-01

    Summary This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January, 2009 and December, 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean number of indicators met = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775

  4. Mathematical Rigor vs. Conceptual Change: Some Early Results

    NASA Astrophysics Data System (ADS)

    Alexander, W. R.

    2003-05-01

    Results from two different pedagogical approaches to teaching introductory astronomy at the college level will be presented. The first of these approaches is a descriptive, conceptually based approach that emphasizes conceptual change. This descriptive class is typically an elective for non-science majors. The other approach is a mathematically rigorous treatment that emphasizes problem solving and is designed to prepare students for further study in astronomy. The mathematically rigorous class is typically taken by science majors. It also fulfills an elective science requirement for these science majors. The Astronomy Diagnostic Test version 2 (ADT 2.0) was used as an assessment instrument since its validity and reliability have been investigated by previous researchers. The ADT 2.0 was administered as both a pre-test and post-test to both groups. Initial results show no significant difference between the two groups in the post-test. However, there is a slightly greater improvement for the descriptive class between pre- and post-testing compared to the mathematically rigorous course. Great care was taken to control for variables, including selection of text, class format, and instructor differences. Results indicate that the mathematically rigorous model does not improve conceptual understanding any more than the conceptual change model. Additional results indicate that there is a similar gender bias in favor of males that has been measured by previous investigators. This research has been funded by the College of Science and Mathematics at James Madison University.

  5. Origin of the spike-timing-dependent plasticity rule

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won; Choi, M. Y.

    2016-08-01

    A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.
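    The spike-timing-dependent plasticity rule referred to above is commonly written as an exponential window in the pre/post spike-time difference. The following sketch implements that standard phenomenological form (parameter values invented), not the paper's path-integral derivation:

```python
import math

def stdp_weight_change(dt, a_plus=1.0, a_minus=1.0,
                       tau_plus=20.0, tau_minus=20.0):
    """Exponential STDP window, dt = t_post - t_pre (ms).

    Potentiation when the presynaptic spike precedes the postsynaptic
    one (dt >= 0), exponentially decaying depression otherwise."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)
```

    The paper's claim is that this asymmetric exponential shape is not an arbitrary fit but emerges as the form minimizing a free energy in their field-theoretic formulation.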

  6. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings.

    PubMed

    De Luca, Carlo J; Kline, Joshua C

    2014-12-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles--a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to apply statistical tests rigorous enough to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. Copyright © 2014 the American Physiological Society.
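    The z-score pitfall the authors describe is essentially a multiple-comparisons problem: thresholding many latency bins at an uncorrected level guarantees spurious detections. A toy simulation (invented bin counts, no real motor-unit data; SigMax itself is not reproduced here):

```python
import math
import random

random.seed(1)  # fixed seed so the sketch is reproducible
n_bins, mean_count = 100, 50.0

# Simulated cross-interval histogram with NO synchronization present:
# each latency bin is independent noise around a flat baseline.
sd = math.sqrt(mean_count)
counts = [random.gauss(mean_count, sd) for _ in range(n_bins)]
z_scores = [(c - mean_count) / sd for c in counts]

z_single = 1.96      # two-sided 5% threshold for ONE bin
z_bonferroni = 3.48  # approx. two-sided threshold for 5% over 100 bins

false_uncorrected = sum(1 for z in z_scores if abs(z) > z_single)
false_corrected = sum(1 for z in z_scores if abs(z) > z_bonferroni)
# Even with no synchronization, several bins typically exceed the
# uncorrected threshold; the corrected threshold rarely fires.
```

    With 100 bins tested at an uncorrected 5% level, about five false detections are expected per pair even under the null, which is qualitatively the inflation the authors report for the z-score method.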

  7. Identifying and characterizing hepatitis C virus hotspots in Massachusetts: a spatial epidemiological approach.

    PubMed

    Stopka, Thomas J; Goulart, Michael A; Meyers, David J; Hutcheson, Marga; Barton, Kerri; Onofrey, Shauna; Church, Daniel; Donahue, Ashley; Chui, Kenneth K H

    2017-04-20

    Hepatitis C virus (HCV) infections have increased during the past decade but little is known about geographic clustering patterns. We used a unique analytical approach, combining geographic information systems (GIS), spatial epidemiology, and statistical modeling to identify and characterize HCV hotspots, statistically significant clusters of census tracts with elevated HCV counts and rates. We compiled sociodemographic and HCV surveillance data (n = 99,780 cases) for Massachusetts census tracts (n = 1464) from 2002 to 2013. We used a five-step spatial epidemiological approach, calculating incremental spatial autocorrelations and Getis-Ord Gi* statistics to identify clusters. We conducted logistic regression analyses to determine factors associated with the HCV hotspots. We identified nine HCV clusters, with the largest in Boston, New Bedford/Fall River, Worcester, and Springfield (p < 0.05). In multivariable analyses, we found that HCV hotspots were independently and positively associated with the percent of the population that was Hispanic (adjusted odds ratio [AOR]: 1.07; 95% confidence interval [CI]: 1.04, 1.09) and the percent of households receiving food stamps (AOR: 1.83; 95% CI: 1.22, 2.74). HCV hotspots were independently and negatively associated with the percent of the population that were high school graduates or higher (AOR: 0.91; 95% CI: 0.89, 0.93) and the percent of the population in the "other" race/ethnicity category (AOR: 0.88; 95% CI: 0.85, 0.91). We identified locations where HCV clusters were a concern, and where enhanced HCV prevention, treatment, and care can help combat the HCV epidemic in Massachusetts. GIS, spatial epidemiological and statistical analyses provided a rigorous approach to identify hotspot clusters of disease, which can inform public health policy and intervention targeting. 
Further studies that incorporate spatiotemporal cluster analyses, Bayesian spatial and geostatistical models, spatially weighted regression analyses, and assessment of associations between HCV clustering and the built environment are needed to expand upon our combined spatial epidemiological and statistical methods.
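    The Getis-Ord Gi* statistic used in the authors' spatial epidemiological approach can be sketched compactly. The region values and binary weights below are invented toy data, not the Massachusetts census tracts:

```python
import math

def getis_ord_gi_star(x, weights, i):
    """Getis-Ord Gi* statistic for region i.

    x: attribute value per region (e.g., case counts);
    weights[i][j]: spatial weight of region j for region i (j == i included).
    Returns a z-like score: large positive values indicate a hotspot."""
    n = len(x)
    w = weights[i]
    xbar = sum(x) / n
    s = math.sqrt(sum(v * v for v in x) / n - xbar * xbar)
    sw = sum(w)
    sw2 = sum(wi * wi for wi in w)
    num = sum(wi * xi for wi, xi in zip(w, x)) - xbar * sw
    den = s * math.sqrt((n * sw2 - sw * sw) / (n - 1))
    return num / den

# Toy example: five regions in a chain, the last two with elevated counts.
x = [1.0, 1.0, 1.0, 9.0, 9.0]
w = [
    [1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
]
```

    Regions embedded in a high-value neighborhood score positive (hotspot candidates), regions in low-value neighborhoods score negative; in practice the scores are then tested for significance across many regions, which is why the authors pair Gi* with incremental spatial autocorrelation.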

  8. Student peer assessment in evidence-based medicine (EBM) searching skills training: an experiment

    PubMed Central

    Eldredge, Jonathan D.; Bear, David G.; Wayne, Sharon J.; Perea, Paul P.

    2013-01-01

    Background: Student peer assessment (SPA) has been used intermittently in medical education for more than four decades, particularly in connection with skills training. SPA generally has not been rigorously tested, so medical educators have limited evidence about SPA effectiveness. Methods: Experimental design: Seventy-one first-year medical students were stratified by previous test scores into problem-based learning tutorial groups, and then these assigned groups were randomized further into intervention and control groups. All students received evidence-based medicine (EBM) training. Only the intervention group members received SPA training, practice with assessment rubrics, and then application of anonymous SPA to assignments submitted by other members of the intervention group. Results: Students in the intervention group had higher mean scores on the formative test with a potential maximum score of 49 points than did students in the control group, 45.7 and 43.5, respectively (P = 0.06). Conclusions: SPA training and the application of these skills by the intervention group resulted in higher scores on formative tests compared to those in the control group, a difference approaching statistical significance. The extra effort expended by librarians, other personnel, and medical students must be factored into the decision to use SPA in any specific educational context. Implications: SPA has not been rigorously tested, particularly in medical education. Future, similarly rigorous studies could further validate use of SPA so that librarians can optimally make use of limited contact time for information skills training in medical school curricula. PMID:24163593

  9. Stress and anxiety among nursing students: A review of intervention strategies in literature between 2009 and 2015.

    PubMed

    Turner, Katrina; McCarthy, Valerie Lander

    2017-01-01

    Undergraduate nursing students experience significant stress and anxiety, inhibiting learning and increasing attrition. Twenty-six intervention studies were identified and evaluated, updating a previous systematic review which categorized interventions targeting: (1) stressors, (2) coping, or (3) appraisal. The majority of interventions in this review aimed to reduce numbers or intensity of stressors through curriculum development (12) or to improve students' coping skills (8). Two studies reported interventions using only cognitive reappraisal while three interventions combined reappraisal with other approaches. Strength of evidence was limited by choice of study design, sample size, and lack of methodological rigor. Some statistically significant support was found for interventions focused on reducing stressors through curriculum development or improving students' coping skills. No statistically significant studies using reappraisal, either alone or in combination with other approaches, were identified, although qualitative findings suggested the potential benefits of this approach do merit further study. Progress was noted since 2008 in the increased number of studies and greater use of validated outcome measures but the review concluded further methodologically sound, adequately powered studies, especially randomized controlled trials, are needed to determine which interventions are effective to address the issue of excessive stress and anxiety among undergraduate nursing students. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Statistical Analyses Comparing Prismatic Magnetite Crystals in ALH84001 Carbonate Globules with those from the Terrestrial Magnetotactic Bacteria Strain MV-1

    NASA Technical Reports Server (NTRS)

    Thomas-Keprta, Kathie L.; Clemett, Simon J.; Bazylinski, Dennis A.; Kirschvink, Joseph L.; McKay, David S.; Wentworth, Susan J.; Vali, H.; Gibson, Everett K.

    2000-01-01

    Here we use rigorous mathematical modeling to compare ALH84001 prismatic magnetites with those produced by terrestrial magnetotactic bacteria, MV-1. We find that this subset of the Martian magnetites appears to be statistically indistinguishable from those of MV-1.

  11. On representing the prognostic value of continuous gene expression biomarkers with the restricted mean survival curve.

    PubMed

    Eng, Kevin H; Schiller, Emily; Morrell, Kayla

    2015-11-03

    Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
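    The restricted mean survival the authors promote is the area under the Kaplan-Meier curve up to a horizon tau. A self-contained sketch with invented survival times (in practice one would use the R survival package the authors mention):

```python
def kaplan_meier(times, events):
    """Return (time, survival) steps from right-censored data.

    events[i] is 1 for an observed event (e.g., death), 0 for censoring."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    steps = []
    idx = 0
    while idx < len(data):
        t = data[idx][0]
        deaths = n_at_t = 0
        while idx < len(data) and data[idx][0] == t:
            deaths += data[idx][1]
            n_at_t += 1
            idx += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            steps.append((t, surv))
        at_risk -= n_at_t
    return steps

def restricted_mean_survival(steps, tau):
    """Restricted mean survival: area under the KM step curve on [0, tau]."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in steps:
        if t > tau:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    area += prev_s * (tau - prev_t)
    return area
```

    Unlike a dichotomized risk score, the RMS stays on the clinical scale ("mean months survived up to tau"), which is the interpretability advantage the abstract argues for.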

  12. Estimating maize production in Kenya using NDVI: Some statistical considerations

    USGS Publications Warehouse

    Lewis, J.E.; Rowland, James; Nadeau, A.

    1998-01-01

    A regression model approach using a normalized difference vegetation index (NDVI) has the potential for estimating crop production in East Africa. However, before production estimation can become a reality, the underlying model assumptions and statistical nature of the sample data (NDVI and crop production) must be examined rigorously. Annual maize production statistics from 1982-90 for 36 agricultural districts within Kenya were used as the dependent variable; median area NDVI (independent variable) values from each agricultural district and year were extracted from the annual maximum NDVI data set. The input data and the statistical association of NDVI with maize production for Kenya were tested systematically for the following items: (1) homogeneity of the data when pooling the sample, (2) gross data errors and influence points, (3) serial (time) correlation, (4) spatial autocorrelation and (5) stability of the regression coefficients. The results of using a simple regression model with NDVI as the only independent variable are encouraging (r = 0.75, p < 0.05) and illustrate that NDVI can be a responsive indicator of maize production, especially in areas of high NDVI spatial variability, which coincide with areas of production variability in Kenya.
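    The simple regression model underlying the NDVI-production association can be sketched as ordinary least squares with a Pearson r. The NDVI and production values below are invented toy data, not the Kenyan district statistics:

```python
import math

def simple_regression(x, y):
    """Ordinary least squares fit y = a + b*x, plus Pearson correlation r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    b = sxy / sxx           # slope
    a = my - b * mx         # intercept
    r = sxy / math.sqrt(sxx * syy)
    return a, b, r
```

    The checks the abstract lists (pooling homogeneity, influence points, serial and spatial correlation, coefficient stability) are exactly the assumptions this simple model quietly makes, which is why the authors insist they be tested before the fit is trusted.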

  13. Statistical Learning Theory for High Dimensional Prediction: Application to Criterion-Keyed Scale Development

    PubMed Central

    Chapman, Benjamin P.; Weiss, Alexander; Duberstein, Paul

    2016-01-01

    Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in “big data” problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms (Supervised Principal Components, Regularization, and Boosting) can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach (or perhaps because of them), SLT methods may hold value as a statistically rigorous approach to exploratory regression. PMID:27454257
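    Minimizing expected prediction error via cross-validation, as described above, can be sketched with a one-predictor ridge regression whose penalty is chosen by k-fold CV. The closed-form slope assumes a centered predictor with no intercept; all data and parameter choices below are illustrative assumptions:

```python
def ridge_slope(x, y, lam):
    """Closed-form ridge slope for a single (assumed centered) predictor."""
    return sum(xi * yi for xi, yi in zip(x, y)) / (sum(xi * xi for xi in x) + lam)

def cv_error(x, y, lam, k=3):
    """k-fold cross-validated mean squared prediction error."""
    n = len(x)
    folds = [list(range(i, n, k)) for i in range(k)]
    total = 0.0
    for hold in folds:
        keep = [i for i in range(n) if i not in hold]
        b = ridge_slope([x[i] for i in keep], [y[i] for i in keep], lam)
        total += sum((y[i] - b * x[i]) ** 2 for i in hold)
    return total / n

def pick_lambda(x, y, grid, k=3):
    """Grid search: the penalty minimizing held-out (not in-sample) error."""
    return min(grid, key=lambda lam: cv_error(x, y, lam, k))
```

    The key contrast with within-sample likelihood maximization is that the error here is always scored on observations the fit never saw, which is what penalizes undue complexity.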

  14. Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.

    PubMed

    Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas

    2016-06-17

    Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.

  15. Introduction to the special issue on recentering science: Replication, robustness, and reproducibility in psychophysiology.

    PubMed

    Kappenman, Emily S; Keil, Andreas

    2017-01-01

    In recent years, the psychological and behavioral sciences have increased efforts to strengthen methodological practices and publication standards, with the ultimate goal of enhancing the value and reproducibility of published reports. These issues are especially important in the multidisciplinary field of psychophysiology, which yields rich and complex data sets with a large number of observations. In addition, the technological tools and analysis methods available in the field of psychophysiology are continually evolving, widening the array of techniques and approaches available to researchers. This special issue presents articles detailing rigorous and systematic evaluations of tasks, measures, materials, analysis approaches, and statistical practices in a variety of subdisciplines of psychophysiology. These articles highlight challenges in conducting and interpreting psychophysiological research and provide data-driven, evidence-based recommendations for overcoming those challenges to produce robust, reproducible results in the field of psychophysiology. © 2016 Society for Psychophysiological Research.

  16. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  17. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex retrieval algorithms that infer geophysical state from observed radiances. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
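    The Monte Carlo approach described above can be sketched with a toy linear retrieval: perturb synthetic observations with noise, re-run the retrieval, and summarize the sampling distribution of the estimates. The forward model and noise level are invented, not the OCO-2 algorithm:

```python
import math
import random

def retrieve(y, k):
    """Toy linear retrieval: least-squares state estimate for y ≈ k * x."""
    return sum(ki * yi for ki, yi in zip(k, y)) / sum(ki * ki for ki in k)

def monte_carlo_uncertainty(x_true, k, noise_sd, n_sim=2000, seed=0):
    """Approximate the sampling distribution of the retrieved estimate.

    Simulates noisy observations from a known true state, retrieves the
    state each time, and returns the mean and spread of the estimates."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_sim):
        y = [ki * x_true + rng.gauss(0.0, noise_sd) for ki in k]
        estimates.append(retrieve(y, k))
    mean = sum(estimates) / n_sim
    var = sum((e - mean) ** 2 for e in estimates) / (n_sim - 1)
    return mean, math.sqrt(var)
```

    For this linear case the spread has a known closed form (noise_sd / sqrt(sum of k squared)), so the simulation can be checked against theory; for a real nonlinear retrieval, the Monte Carlo spread is the quantity of interest precisely because no closed form exists.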

  18. Ab Initio and Monte Carlo Approaches For the Magnetocaloric Effect in Co- and In-Doped Ni-Mn-Ga Heusler Alloys

    NASA Astrophysics Data System (ADS)

    Sokolovskiy, Vladimir; Grünebohm, Anna; Buchelnikov, Vasiliy; Entel, Peter

    2014-09-01

    This special issue collects contributions from the participants of the "Information in Dynamical Systems and Complex Systems" workshop, which cover a wide range of important problems and new approaches that lie in the intersection of information theory and dynamical systems. The contributions include theoretical characterization and understanding of the different types of information flow and causality in general stochastic processes, inference and identification of coupling structure and parameters of system dynamics, rigorous coarse-grain modeling of network dynamical systems, and exact statistical testing of fundamental information-theoretic quantities such as the mutual information. The collective efforts reported herein reflect a modern perspective of the intimate connection between dynamical systems and information flow, leading to the promise of better understanding and modeling of natural complex systems and better/optimal design of engineering systems.

  19. Thermal machines beyond the weak coupling regime

    NASA Astrophysics Data System (ADS)

    Gallego, R.; Riera, A.; Eisert, J.

    2014-12-01

    How much work can be extracted from a heat bath using a thermal machine? The study of this question has a very long history in statistical physics in the weak-coupling limit, when applied to macroscopic systems. However, the assumption that thermal heat baths remain uncorrelated with associated physical systems is less reasonable on the nano-scale and in the quantum setting. In this work, we establish a framework of work extraction in the presence of quantum correlations. We show in a mathematically rigorous and quantitative fashion that quantum correlations and entanglement emerge as limitations to work extraction compared to what would be allowed by the second law of thermodynamics. At the heart of the approach are operations that capture the naturally non-equilibrium dynamics encountered when putting physical systems into contact with each other. We discuss various limits that relate to known results and put our work into the context of approaches to finite-time quantum thermodynamics.

  20. Differentiating Wheat Genotypes by Bayesian Hierarchical Nonlinear Mixed Modeling of Wheat Root Density.

    PubMed

    Wasson, Anton P; Chiu, Grace S; Zwart, Alexander B; Binns, Timothy R

    2017-01-01

    Ensuring future food security for a growing population while climate change and urban sprawl put pressure on agricultural land will require sustainable intensification of current farming practices. For the crop breeder this means producing higher crop yields with less resources due to greater environmental stresses. While easy gains in crop yield have been made mostly "above ground," little progress has been made "below ground"; and yet it is these root system traits that can improve productivity and resistance to drought stress. Wheat pre-breeders use soil coring and core-break counts to phenotype root architecture traits, with data collected on rooting density for hundreds of genotypes in small increments of depth. The measured densities are both large datasets and highly variable even within the same genotype, hence, any rigorous, comprehensive statistical analysis of such complex field data would be technically challenging. Traditionally, most attributes of the field data are therefore discarded in favor of simple numerical summary descriptors which retain much of the high variability exhibited by the raw data. This poses practical challenges: although plant scientists have established that root traits do drive resource capture in crops, traits that are more randomly (rather than genetically) determined are difficult to breed for. In this paper we develop a hierarchical nonlinear mixed modeling approach that utilizes the complete field data for wheat genotypes to fit, under the Bayesian paradigm, an "idealized" relative intensity function for the root distribution over depth. Our approach was used to determine heritability: how much of the variation between field samples was purely random vs. being mechanistically driven by the plant genetics? Based on the genotypic intensity functions, the overall heritability estimate was 0.62 (95% Bayesian confidence interval was 0.52 to 0.71).
Despite root count profiles that were statistically very noisy, our approach led to denoised profiles which exhibited rigorously discernible phenotypic traits. Profile-specific traits could be representative of a genotype and thus used as a quantitative tool to associate phenotypic traits with specific genotypes. This would allow breeders to select for whole root system distributions appropriate for sustainable intensification, and inform policy for mitigating crop yield risk and food insecurity.
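The heritability question posed above (purely random versus genetically driven variation) can be illustrated with a toy variance decomposition; the numbers below are simulated under our own assumptions and are not the wheat field data:

```python
# Toy one-way random-effects decomposition: heritability as the share of
# variance attributable to genotype.  Invented parameters, not the study data.
import numpy as np

rng = np.random.default_rng(1)
n_geno, n_rep = 40, 6
geno_effect = rng.normal(0.0, np.sqrt(0.62), size=n_geno)  # genetic signal
data = geno_effect[:, None] + rng.normal(0.0, np.sqrt(0.38), size=(n_geno, n_rep))

var_within = data.var(axis=1, ddof=1).mean()                      # residual variance
var_between = data.mean(axis=1).var(ddof=1) - var_within / n_rep  # genotypic variance
h2 = var_between / (var_between + var_within)
print(f"h^2 ~ {h2:.2f}")  # should land near the simulated value of 0.62
```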

  1. Dynamic programming algorithms for biological sequence comparison.

    PubMed

    Pearson, W R; Miller, W

    1992-01-01

Efficient dynamic programming algorithms are available for a broad class of protein and DNA sequence comparison problems. These algorithms require computer time proportional to the product of the lengths of the two sequences being compared [O(N²)] but require memory space proportional only to the sum of these lengths [O(N)]. Although the requirement for O(N²) time limits use of the algorithms to the largest computers when searching protein and DNA sequence databases, many other applications of these algorithms, such as calculation of distances for evolutionary trees and comparison of a new sequence to a library of sequence profiles, are well within the capabilities of desktop computers. In particular, the results of library searches with rapid searching programs, such as FASTA or BLAST, should be confirmed by performing a rigorous optimal alignment. Although rapid methods do not overlook significant sequence similarities, FASTA limits the number of gaps that can be inserted into an alignment, so that a rigorous alignment may extend the alignment substantially in some cases. BLAST does not allow gaps in the local regions that it reports; a calculation that allows gaps is very likely to extend the alignment substantially. Although a Monte Carlo evaluation of the statistical significance of a similarity score with a rigorous algorithm is much slower than the heuristic approach used by the RDF2 program, the dynamic programming approach should take less than 1 hr on a 386-based PC or desktop Unix workstation. For descriptive purposes, we have limited our discussion to methods for calculating similarity scores and distances that use gap penalties of the form g = rk. Nevertheless, programs for the more general case (g = q+rk) are readily available. Versions of these programs that run on Unix workstations, IBM-PC class computers, or the Macintosh can be obtained from either of the authors.
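As a hedged illustration of the space-complexity claim (a sketch, not the authors' distributed programs), an optimal global alignment score with gap penalty g = rk can be computed keeping only two rows of the dynamic programming matrix:

```python
# Score-only global alignment in O(N) memory with a linear gap penalty g = r*k.
# Toy scoring values; real applications use substitution matrices.

def alignment_score(a, b, match=1, mismatch=-1, r=2):
    """Needleman-Wunsch optimal score using two rows of the DP matrix."""
    prev = [-r * j for j in range(len(b) + 1)]  # row 0: cost of j opening gaps
    for i in range(1, len(a) + 1):
        curr = [-r * i]
        for j in range(1, len(b) + 1):
            diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            curr.append(max(diag, prev[j] - r, curr[j - 1] - r))
        prev = curr
    return prev[-1]

print(alignment_score("GATTACA", "GCATGCU"))
```

Recovering the alignment itself in linear space requires the divide-and-conquer refinement (Hirschberg's method) alluded to by memory-efficient implementations.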

  2. Quantifying falsifiability of scientific theories

    NASA Astrophysics Data System (ADS)

    Nemenman, Ilya

    I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
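A minimal numerical illustration of the statistical Occam's razor invoked here, using the BIC approximation to the Bayesian model evidence (our own toy example, not from the paper):

```python
# A model with more free parameters must fit substantially better to be
# preferred: the simpler (more falsifiable) model wins when both fit equally.
import numpy as np

def bic(log_likelihood, k_params, n_obs):
    """Bayesian Information Criterion: lower is better."""
    return k_params * np.log(n_obs) - 2.0 * log_likelihood

x = np.array([0.3, -0.5, 0.1, -0.2, 0.4, -0.1])  # toy data, mean ~ 0

# Model A: N(0, 1), no fitted parameters -- sharper, "more falsifiable"
ll_a = np.sum(-0.5 * x**2 - 0.5 * np.log(2 * np.pi))
# Model B: N(mu, 1), one fitted parameter -- can accommodate more data sets
mu = x.mean()
ll_b = np.sum(-0.5 * (x - mu) ** 2 - 0.5 * np.log(2 * np.pi))

print(bic(ll_a, 0, x.size) < bic(ll_b, 1, x.size))  # True: razor favors model A
```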

  3. Effects of Pre and Post-Rigor Marinade Injection on Some Quality Parameters of Longissimus Dorsi Muscles

    PubMed Central

    Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem

    2018-01-01

This study was conducted to evaluate the effects of pre and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid, and 2% NaCl+0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on marinade uptake levels of samples. Regardless of marinade formulation, marinade uptake of pre-rigor samples was higher than that of post-rigor samples. Injection of sodium lactate increased pH values of samples, whereas lactic acid injection decreased pH. Marinade treatment and storage period had a significant effect on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had a significant effect on drip loss values. Drip loss in all samples increased during storage. During all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fade in pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in CIE b* values of all samples. PMID:29805282

  4. Effects of Pre and Post-Rigor Marinade Injection on Some Quality Parameters of Longissimus Dorsi Muscles.

    PubMed

    Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem

    2018-04-01

This study was conducted to evaluate the effects of pre and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid, and 2% NaCl+0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on marinade uptake levels of samples. Regardless of marinade formulation, marinade uptake of pre-rigor samples was higher than that of post-rigor samples. Injection of sodium lactate increased pH values of samples, whereas lactic acid injection decreased pH. Marinade treatment and storage period had a significant effect on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had a significant effect on drip loss values. Drip loss in all samples increased during storage. During all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fade in pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in CIE b* values of all samples.

  5. A Model Based Deconvolution Approach for Creating Surface Composition Maps of Irregularly Shaped Bodies from Limited Orbiting Nuclear Spectrometer Measurements

    NASA Astrophysics Data System (ADS)

    Dallmann, N. A.; Carlsten, B. E.; Stonehill, L. C.

    2017-12-01

Orbiting nuclear spectrometers have contributed significantly to our understanding of the composition of solar system bodies. Gamma rays and neutrons are produced within the surfaces of bodies by impacting galactic cosmic rays (GCR) and by intrinsic radionuclide decay. Measuring the flux and energy spectrum of these products at one point in an orbit elucidates the elemental content of the area in view. Deconvolution of measurements from many spatially registered orbit points can produce detailed maps of elemental abundances. In applying these well-established techniques to small and irregularly shaped bodies like Phobos, one encounters unique challenges beyond those of a large spheroid. Polar mapping orbits are not possible for Phobos, and quasistatic orbits will realize only modest inclinations, unavoidably limiting surface coverage and creating North-South ambiguities in deconvolution. The irregular shape causes self-shadowing of the body both from the spectrometer and from the incoming GCR. The view angle to the surface normal, as well as the distance between the surface and the spectrometer, varies widely. These characteristics can be synthesized into a complicated and continuously changing measurement system point spread function. We have begun to explore different model-based, statistically rigorous, iterative deconvolution methods to produce elemental abundance maps for a proposed future investigation of Phobos. By incorporating the satellite orbit, the existing high accuracy shape models of Phobos, and the spectrometer response function, a detailed and accurate system model can be constructed. Many aspects of this model formation are particularly well suited to modern graphics processing techniques and parallel processing. We will present the current status and preliminary visualizations of the Phobos measurement system model.
We will also discuss different deconvolution strategies and their relative merit in statistical rigor, stability, achievable resolution, and exploitation of the irregular shape to partially resolve ambiguities. The general applicability of these new approaches to existing data sets from Mars, Mercury, and Lunar investigations will be noted.

  6. Identifying significant gene‐environment interactions using a combination of screening testing and hierarchical false discovery rate control

    PubMed Central

    Shen, Li; Saykin, Andrew J.; Williams, Scott M.; Moore, Jason H.

    2016-01-01

Although gene‐environment (G×E) interactions play an important role in many biological systems, detecting these interactions within genome‐wide data can be challenging due to the loss in statistical power incurred by multiple hypothesis correction. To address the challenge of poor power and the limitations of existing multistage methods, we recently developed a screening‐testing approach for G×E interaction detection that combines elastic net penalized regression with joint estimation to support a single omnibus test for the presence of G×E interactions. In our original work on this technique, however, we did not assess type I error control or power and evaluated the method using just a single, small bladder cancer data set. In this paper, we extend the original method in two important directions and provide a more rigorous performance evaluation. First, we introduce a hierarchical false discovery rate approach to formally assess the significance of individual G×E interactions. Second, to support the analysis of truly genome‐wide data sets, we incorporate a score statistic‐based prescreening step to reduce the number of single nucleotide polymorphisms prior to fitting the first stage penalized regression model. To assess the statistical properties of our method, we compare the type I error rate and statistical power of our approach with competing techniques using both simple simulation designs as well as designs based on real disease architectures. Finally, we demonstrate the ability of our approach to identify biologically plausible SNP‐education interactions relative to Alzheimer's disease status using genome‐wide association study data from the Alzheimer's Disease Neuroimaging Initiative (ADNI). PMID:27578615

  7. An ex post facto evaluation framework for place-based police interventions.

    PubMed

    Braga, Anthony A; Hureau, David M; Papachristos, Andrew V

    2011-12-01

A small but growing body of research evidence suggests that place-based police interventions generate significant crime control gains. While place-based policing strategies have been adopted by a majority of U.S. police departments, very few agencies make a priori commitments to rigorous evaluations. Recent methodological developments were applied to conduct a rigorous ex post facto evaluation of the Boston Police Department's Safe Street Team (SST) hot spots policing program. A nonrandomized quasi-experimental design was used to evaluate the violent crime control benefits of the SST program at treated street segments and intersections relative to untreated street segments and intersections. Propensity score matching techniques were used to identify comparison places in Boston. Growth curve regression models were used to analyze violent crime trends at treatment places relative to control places. Using computerized mapping and database software, a micro-level place database of violent index crimes at all street segments and intersections in Boston was created. Yearly counts of violent index crimes between 2000 and 2009 at the treatment and comparison street segments and intersections served as the key outcome measure. The SST program was associated with a statistically significant reduction in violent index crimes at the treatment places relative to the comparison places without displacing crime into proximate areas. To overcome the challenges of evaluation in real-world settings, evaluators need to continuously develop innovative approaches that take advantage of new theoretical and methodological approaches.

  8. Systemic Planning: An Annotated Bibliography and Literature Guide. Exchange Bibliography No. 91.

    ERIC Educational Resources Information Center

    Catanese, Anthony James

Systemic planning is an operational approach to using scientific rigor and qualitative judgment in a complementary manner. It integrates rigorous techniques and methods from systems analysis, cybernetics, decision theory, and work programming. The annotated reference sources in this bibliography include those works that have been most influential…

  9. The 1,5-H-shift in 1-butoxy: A case study in the rigorous implementation of transition state theory for a multirotamer system

    NASA Astrophysics Data System (ADS)

    Vereecken, Luc; Peeters, Jozef

    2003-09-01

The rigorous implementation of transition state theory (TST) for a reaction system with multiple reactant rotamers and multiple transition state conformers is discussed by way of a statistical rate analysis of the 1,5-H-shift in 1-butoxy radicals, a prototype reaction for the important class of H-shift reactions in atmospheric chemistry. Several approaches for deriving a multirotamer TST expression are treated: oscillator versus (hindered) internal rotor models; distinguishable versus indistinguishable atoms; and direct count methods versus degeneracy factors calculated by (simplified) direct count methods or from symmetry numbers and numbers of enantiomers, where applicable. It is shown that the various treatments are fully consistent, even if the TST expressions themselves appear different. The 1-butoxy H-shift reaction is characterized quantum chemically using B3LYP-DFT; the performance of this level of theory is compared to other methods. Rigorous application of the multirotamer TST methodology in a harmonic oscillator approximation based on these data yields a rate coefficient of k(298 K, 1 atm) = 1.4×10⁵ s⁻¹, and an Arrhenius expression k(T, 1 atm) = 1.43×10¹¹ exp(−8.17 kcal mol⁻¹/RT) s⁻¹, both of which closely match the experimental recommendations in the literature. The T-dependence is substantially influenced by the multirotamer treatment, as well as by the tunneling and fall-off corrections. The present results are compared to those of simplified TST calculations based solely on the properties of the lowest-energy 1-butoxy rotamer.
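As a quick sanity check, the quoted Arrhenius expression can be evaluated at room temperature; a minimal sketch (the gas constant value in kcal units is our assumption, not from the abstract):

```python
import math

# Arrhenius parameters quoted in the abstract (1 atm)
A = 1.43e11        # pre-exponential factor, s^-1
Ea = 8.17          # activation energy, kcal mol^-1
R = 1.987204e-3    # gas constant, kcal mol^-1 K^-1

def k(T):
    """Rate coefficient k(T) = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

print(f"k(298 K) = {k(298.0):.2e} s^-1")  # ~1.5e5, consistent with the quoted 1.4e5
```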

  10. Safety Assessment of Food and Feed from GM Crops in Europe: Evaluating EFSA's Alternative Framework for the Rat 90-day Feeding Study.

    PubMed

    Hong, Bonnie; Du, Yingzhou; Mukerji, Pushkor; Roper, Jason M; Appenzeller, Laura M

    2017-07-12

    Regulatory-compliant rodent subchronic feeding studies are compulsory regardless of a hypothesis to test, according to recent EU legislation for the safety assessment of whole food/feed produced from genetically modified (GM) crops containing a single genetic transformation event (European Union Commission Implementing Regulation No. 503/2013). The Implementing Regulation refers to guidelines set forth by the European Food Safety Authority (EFSA) for the design, conduct, and analysis of rodent subchronic feeding studies. The set of EFSA recommendations was rigorously applied to a 90-day feeding study in Sprague-Dawley rats. After study completion, the appropriateness and applicability of these recommendations were assessed using a battery of statistical analysis approaches including both retrospective and prospective statistical power analyses as well as variance-covariance decomposition. In the interest of animal welfare considerations, alternative experimental designs were investigated and evaluated in the context of informing the health risk assessment of food/feed from GM crops.

  11. Handwriting Examination: Moving from Art to Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, K.H.; Hanlen, R.C.; Manzolillo, P.A.

In this document, we present a method for validating the premises and methodology of forensic handwriting examination. This method is intuitively appealing because it relies on quantitative measurements currently used qualitatively by forensic document examiners (FDEs) in making comparisons, and it is scientifically rigorous because it exploits the power of multivariate statistical analysis. This approach uses measures of both central tendency and variation to construct a profile for a given individual. (Central tendency and variation are important for characterizing an individual's writing, and both are currently used by FDEs in comparative analyses.) Once constructed, different profiles are then compared for individuality using cluster analysis; they are grouped so that profiles within a group cannot be differentiated from one another based on the measured characteristics, whereas profiles between groups can. The cluster analysis procedure used here exploits the power of multivariate hypothesis testing. The result is not only a profile grouping but also an indication of the statistical significance of the groups generated.

  12. Solar Energy Enhancement Using Down-converting Particles: A Rigorous Approach

    DTIC Science & Technology

    2011-06-06

Solar energy enhancement using down-converting particles: A rigorous approach. Ze'ev R. Abrams, Avi Niv, and Xiang Zhang, J. Appl. Phys. 109, 114905 (2011).

  13. Time Scale Optimization and the Hunt for Astronomical Cycles in Deep Time Strata

    NASA Astrophysics Data System (ADS)

    Meyers, Stephen R.

    2016-04-01

    A valuable attribute of astrochronology is the direct link between chronometer and climate change, providing a remarkable opportunity to constrain the evolution of the surficial Earth System. Consequently, the hunt for astronomical cycles in strata has spurred the development of a rich conceptual framework for climatic/oceanographic change, and has allowed exploration of the geologic record with unprecedented temporal resolution. Accompanying these successes, however, has been a persistent skepticism about appropriate astrochronologic testing and circular reasoning: how does one reliably test for astronomical cycles in stratigraphic data, especially when time is poorly constrained? From this perspective, it would seem that the merits and promise of astrochronology (e.g., a geologic time scale measured in ≤400 kyr increments) also serves as its Achilles heel, if the confirmation of such short rhythms defies rigorous statistical testing. To address these statistical challenges in astrochronologic testing, a new approach has been developed that (1) explicitly evaluates time scale uncertainty, (2) is resilient to common problems associated with spectrum confidence level assessment and 'multiple testing', and (3) achieves high statistical power under a wide range of conditions (it can identify astronomical cycles when present in data). Designated TimeOpt (for "time scale optimization"; Meyers 2015), the method employs a probabilistic linear regression model framework to investigate amplitude modulation and frequency ratios (bundling) in stratigraphic data, while simultaneously determining the optimal time scale. This presentation will review the TimeOpt method, and demonstrate how the flexible statistical framework can be further extended to evaluate (and optimize upon) complex sedimentation rate models, enhancing the statistical power of the approach, and addressing the challenge of unsteady sedimentation. Meyers, S. R. 
(2015), The evaluation of eccentricity-related amplitude modulation and bundling in paleoclimate data: An inverse approach for astrochronologic testing and time scale optimization, Paleoceanography, 30, doi:10.1002/2015PA002850.

  14. Quantification and statistical significance analysis of group separation in NMR-based metabonomics studies

    PubMed Central

    Goodpaster, Aaron M.; Kennedy, Michael A.

    2015-01-01

Currently, no standard metrics are used to quantify cluster separation in PCA or PLS-DA scores plots for metabonomics studies or to determine if cluster separation is statistically significant. Lack of such measures makes it virtually impossible to compare independent or inter-laboratory studies and can lead to confusion in the metabonomics literature when authors putatively identify metabolites distinguishing classes of samples based on visual and qualitative inspection of scores plots that exhibit marginal separation. While previous papers have addressed quantification of cluster separation in PCA scores plots, none have advocated routine use of a quantitative measure of separation that is supported by a standard and rigorous assessment of whether or not the cluster separation is statistically significant. Here quantification and statistical significance of separation of group centroids in PCA and PLS-DA scores plots are considered. The Mahalanobis distance is used to quantify the distance between group centroids, and the two-sample Hotelling's T² test is computed for the data and related to an F-statistic, and an F-test is then applied to determine if the cluster separation is statistically significant. We demonstrate the value of this approach using four datasets containing various degrees of separation, ranging from groups that had no apparent visual cluster separation to groups that had no visual cluster overlap. Widespread adoption of such concrete metrics to quantify and evaluate the statistical significance of PCA and PLS-DA cluster separation would help standardize reporting of metabonomics data. PMID:26246647
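The workflow described (Mahalanobis distance between centroids, two-sample Hotelling's T², conversion to an F-statistic) can be sketched in a few lines; this is an illustration on simulated two-dimensional scores, not the authors' code:

```python
# Two-sample Hotelling's T^2 test on toy 2-D "scores" (e.g. PC1/PC2 per group).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
g1 = rng.normal([0.0, 0.0], 1.0, size=(30, 2))   # group 1 scores
g2 = rng.normal([2.0, 1.0], 1.0, size=(30, 2))   # group 2 scores, shifted centroid

n1, n2 = len(g1), len(g2)
p = g1.shape[1]
d = g1.mean(axis=0) - g2.mean(axis=0)
S = ((n1 - 1) * np.cov(g1.T) + (n2 - 1) * np.cov(g2.T)) / (n1 + n2 - 2)  # pooled cov

D2 = d @ np.linalg.solve(S, d)                   # squared Mahalanobis distance
T2 = (n1 * n2) / (n1 + n2) * D2                  # two-sample Hotelling's T^2
F = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * T2 # related F-statistic
p_value = stats.f.sf(F, p, n1 + n2 - p - 1)      # F-test for significance

print(f"D^2 = {D2:.2f}, T^2 = {T2:.2f}, p = {p_value:.3g}")
```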

  15. On testing for spatial correspondence between maps of human brain structure and function.

    PubMed

    Alexander-Bloch, Aaron F; Shou, Haochang; Liu, Siyuan; Satterthwaite, Theodore D; Glahn, David C; Shinohara, Russell T; Vandekar, Simon N; Raznahan, Armin

    2018-06-01

A critical issue in many neuroimaging studies is the comparison between brain maps. Nonetheless, it remains unclear how one should test hypotheses focused on the overlap or spatial correspondence between two or more brain maps. This "correspondence problem" affects, for example, the interpretation of comparisons between task-based patterns of functional activation, resting-state networks or modules, and neuroanatomical landmarks. To date, this problem has been addressed with remarkable variability in terms of methodological approaches and statistical rigor. In this paper, we address the correspondence problem using a spatial permutation framework to generate null models of overlap by applying random rotations to spherical representations of the cortical surface, an approach for which we also provide a theoretical statistical foundation. We use this method to derive clusters of cognitive functions that are correlated in terms of their functional neuroanatomical substrates. In addition, using publicly available data, we formally demonstrate the correspondence between maps of task-based functional activity, resting-state fMRI networks and gyral-based anatomical landmarks. We provide open-access code to implement the methods presented for two commonly-used tools for surface-based cortical analysis (https://www.github.com/spin-test). This spatial permutation approach constitutes a useful advance over widely-used methods for the comparison of cortical maps, thereby opening new possibilities for the integration of diverse neuroimaging data. Copyright © 2018 Elsevier Inc. All rights reserved.
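A minimal sketch of the null model's core step, using a QR-based uniform random rotation on toy unit-sphere points (an illustration under our own assumptions, not the released spin-test implementation linked above):

```python
# One "spin" permutation: rotate spherical coordinates by a uniform random
# rotation and re-assign map values by nearest rotated vertex.
import numpy as np

def random_rotation(rng):
    """Uniform random 3-D rotation via QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))          # make the factorization unique
    if np.linalg.det(q) < 0:          # force a proper rotation (det = +1)
        q[:, 0] *= -1
    return q

def spin_permutation(coords, values, rng):
    """One permuted map: rotate vertices, take value of nearest original vertex."""
    rotated = coords @ random_rotation(rng).T
    idx = np.argmax(rotated @ coords.T, axis=1)  # max dot product on unit sphere
    return values[idx]

rng = np.random.default_rng(0)
coords = rng.normal(size=(500, 3))
coords /= np.linalg.norm(coords, axis=1, keepdims=True)  # points on unit sphere
values = coords[:, 2]                                    # toy map: "height" on sphere
null_map = spin_permutation(coords, values, rng)
print(values.std(), null_map.std())  # the permutation preserves the map's values
```

Repeating this many times and recomputing the map-to-map correlation each time yields the permutation null distribution against which the observed correspondence is tested.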

  16. Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research

    ERIC Educational Resources Information Center

    Andriessen, Daniel

    2004-01-01

    This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…

  17. Spatial scaling and multi-model inference in landscape genetics: Martes americana in northern Idaho

    Treesearch

    Tzeidle N. Wasserman; Samuel A. Cushman; Michael K. Schwartz; David O. Wallin

    2010-01-01

    Individual-based analyses relating landscape structure to genetic distances across complex landscapes enable rigorous evaluation of multiple alternative hypotheses linking landscape structure to gene flow. We utilize two extensions to increase the rigor of the individual-based causal modeling approach to inferring relationships between landscape patterns and gene flow...

  18. Collisional damping rates for plasma waves

    NASA Astrophysics Data System (ADS)

    Tigik, S. F.; Ziebell, L. F.; Yoon, P. H.

    2016-06-01

The distinction between plasma dynamics dominated by collisional transport versus collective processes had never been rigorously addressed until recently. A recent paper [P. H. Yoon et al., Phys. Rev. E 93, 033203 (2016)] formulates, for the first time, a unified kinetic theory in which collective processes and collisional dynamics are systematically incorporated from first principles. One of the outcomes of such a formalism is the rigorous derivation of collisional damping rates for Langmuir and ion-acoustic waves, which can be contrasted with the heuristic customary approach. However, the results are given only as formal mathematical expressions. The present brief communication numerically evaluates the rigorous collisional damping rates for plasma particles with a Maxwellian velocity distribution function, so as to assess the consequences of the rigorous formalism in a quantitative manner. Comparison with the heuristic ("Spitzer") formula shows that the accurate damping rates are much lower in magnitude than the conventional expression, which implies that the traditional approach overestimates the importance of attenuation of plasma waves by collisional relaxation processes. Such a finding may have wide applicability, ranging from laboratory to space and astrophysical plasmas.

Computer experimental analysis of the CHP performance of a 100 kWe SOFC Field Unit by a factorial design

    NASA Astrophysics Data System (ADS)

    Calì, M.; Santarelli, M. G. L.; Leone, P.

Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit, built by Siemens-Westinghouse Power Corporation (SWPC), which is at present (May 2005) starting its operation and which will supply electric and thermal power to the GTT factory. In order to take full advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP100 demonstration at EDB/ELSAM in Westerwoort. Then, the simulated tests were performed in the form of computer experimental sessions, and the measurement uncertainties were simulated by perturbations imposed on the model's independent variables. The statistical methodology used for the computer experimental analysis is factorial design (Yates' technique): using the ANOVA technique, the effect of the main independent variables (air utilization factor Uox, fuel utilization factor UF, internal fuel and air preheating, and anodic recycling flow rate) has been investigated in a rigorous manner. The analysis accounts for the effects of these parameters on stack electric power, thermal recovered power, single-cell voltage, cell operative temperature, consumed fuel flow, and steam-to-carbon ratio. Each main effect and interaction effect of the parameters is shown, with particular attention to generated electric power and stack heat recovered.
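The contrasts underlying a 2^k factorial analysis are simple; for k = 2 factors in Yates' standard order (invented response values for illustration, not the CHP 100 data):

```python
# Responses in Yates' standard order for runs (1), a, b, ab, where A and B
# stand in for two factors (e.g. U_F and U_ox at low/high levels).
y1, ya, yb, yab = 96.0, 100.0, 92.0, 99.0   # e.g. stack electric power, kW (invented)

effect_A = ((ya + yab) - (y1 + yb)) / 2      # main effect of A
effect_B = ((yb + yab) - (y1 + ya)) / 2      # main effect of B
effect_AB = ((y1 + yab) - (ya + yb)) / 2     # A x B interaction

print(effect_A, effect_B, effect_AB)  # 5.5 -2.5 1.5
```

With replicated runs, ANOVA then compares each effect's mean square against the error mean square to judge significance.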

  20. Spatial Statistical Data Fusion (SSDF)

    NASA Technical Reports Server (NTRS)

    Braverman, Amy J.; Nguyen, Hai M.; Cressie, Noel

    2013-01-01

As remote sensing for scientific purposes has transitioned from an experimental technology to an operational one, the selection of instruments has become more coordinated, so that the scientific community can exploit complementary measurements. However, technological and scientific heterogeneity across devices means that the statistical characteristics of the data they collect are different. The challenge addressed here is how to combine heterogeneous remote sensing data sets in a way that yields optimal statistical estimates of the underlying geophysical field, and provides rigorous uncertainty measures for those estimates. Different remote sensing data sets may have different spatial resolutions, different measurement error biases and variances, and other disparate characteristics. A state-of-the-art spatial statistical model was used to relate the true, but not directly observed, geophysical field to noisy, spatial aggregates observed by remote sensing instruments. The spatial covariances of the true field and the covariances of the true field with the observations were modeled. The observations are spatial averages of the true field values, over pixels, with different measurement noise superimposed. A kriging framework is used to infer optimal (minimum mean squared error and unbiased) estimates of the true field at point locations from pixel-level, noisy observations. A key feature of the spatial statistical model is the spatial mixed effects model that underlies it. The approach models the spatial covariance function of the underlying field using linear combinations of basis functions of fixed size. Approaches based on kriging require the inversion of very large spatial covariance matrices, and this is usually done by making simplifying assumptions about spatial covariance structure that simply do not hold for geophysical variables. In contrast, this method does not require these assumptions, and is also computationally much faster.
This method is fundamentally different than other approaches to data fusion for remote sensing data because it is inferential rather than merely descriptive. All approaches combine data in a way that minimizes some specified loss function. Most of these are more or less ad hoc criteria based on what looks good to the eye, or some criteria that relate only to the data at hand.
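The kriging step described above reduces to solving a linear system built from the modeled covariances; here is a schematic one-dimensional illustration (toy data and an exponential covariance model of our choosing, not the authors' spatial mixed effects model):

```python
# Simple kriging: best linear unbiased prediction at a new point from noisy
# observations, plus the prediction's mean squared error.
import numpy as np

def cov(x1, x2, sill=1.0, scale=1.0):
    """Exponential spatial covariance model."""
    return sill * np.exp(-np.abs(np.subtract.outer(x1, x2)) / scale)

obs_x = np.array([0.0, 1.0, 2.5, 4.0])
obs_y = np.array([0.2, 0.8, 0.3, -0.5])     # noisy pixel-level observations
noise = 0.1                                  # measurement-error variance

C = cov(obs_x, obs_x) + noise * np.eye(len(obs_x))   # data covariance matrix
c0 = cov(np.array([1.5]), obs_x)[0]                  # cov(field at x0, data)
weights = np.linalg.solve(C, c0)
pred = weights @ obs_y                                # kriging estimate at x0 = 1.5
mse = cov(np.array([1.5]), np.array([1.5]))[0, 0] - c0 @ weights  # its uncertainty
print(f"{pred:.3f} +/- {np.sqrt(mse):.3f}")
```

The fixed-rank basis-function approach replaces the full covariance matrix inversion above with operations on a much smaller matrix, which is what makes it tractable for massive remote sensing data sets.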

  1. Progress Toward an Integration of Process-Structure-Property-Performance Models for "Three-Dimensional (3-D) Printing" of Titanium Alloys

    NASA Astrophysics Data System (ADS)

    Collins, P. C.; Haden, C. V.; Ghamarian, I.; Hayes, B. J.; Ales, T.; Penso, G.; Dixit, V.; Harlow, G.

    2014-07-01

    Electron beam direct manufacturing, synonymously known as electron beam additive manufacturing, along with other additive "3-D printing" manufacturing processes, is receiving widespread attention as a means of producing net-shape (or near-net-shape) components, owing to potential manufacturing benefits. Yet, materials scientists know that differences in manufacturing processes often significantly influence the microstructure of even widely accepted materials and, thus, impact the properties and performance of a material in service. It is important to accelerate the understanding of the processing-structure-property relationship of materials being produced via these novel approaches in a framework that considers the performance in a statistically rigorous way. This article describes the development of a process model, the assessment of key microstructural features to be incorporated into a microstructure simulation model, a novel approach to extract a constitutive equation to predict tensile properties in Ti-6Al-4V (Ti-64), and a probabilistic approach to measure the fidelity of the property model against real data. This integrated approach will provide designers with a tool to vary process parameters and understand the influence on performance, enabling design and optimization for these highly visible manufacturing approaches.

  2. Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa

    EPA Science Inventory

    Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...

  3. Small sample mediation testing: misplaced confidence in bootstrapped confidence intervals.

    PubMed

    Koopman, Joel; Howe, Michael; Hollenbeck, John R; Sin, Hock-Peng

    2015-01-01

    Bootstrapping is an analytical tool commonly used in psychology to test the statistical significance of the indirect effect in mediation models. Bootstrapping proponents have particularly advocated for its use for samples of 20-80 cases. This advocacy has been heeded, especially in the Journal of Applied Psychology, as researchers are increasingly utilizing bootstrapping to test mediation with samples in this range. We discuss reasons to be concerned with this escalation, and in a simulation study focused specifically on this range of sample sizes, we demonstrate not only that bootstrapping has insufficient statistical power to provide a rigorous hypothesis test in most conditions but also that bootstrapping has a tendency to exhibit an inflated Type I error rate. We then extend our simulations to investigate an alternative empirical resampling method as well as a Bayesian approach and demonstrate that they exhibit comparable statistical power to bootstrapping in small samples without the associated inflated Type I error. Implications for researchers testing mediation hypotheses in small samples are presented. For researchers wishing to use these methods in their own research, we have provided R syntax in the online supplemental materials. (c) 2015 APA, all rights reserved.

  4. Forced oral opening for cadavers with rigor mortis: two approaches for the myotomy on the temporal muscles.

    PubMed

    Nakayama, Y; Aoki, Y; Niitsu, H; Saigusa, K

    2001-04-15

    Forensic dentistry plays an essential role in personal identification procedures. An adequate interincisal space of cadavers with rigor mortis is required to obtain detailed dental findings. We have developed two approaches for myotomy of the temporal muscles: an intraoral approach and a two-directional approach. The intraoral approach, in which the temporalis was dissected with scissors inserted via an intraoral incision, was adopted for elderly cadavers, females, and emaciated or exhausted bodies, and had the merit of requiring no incision on the face. The two-directional approach, in which myotomy was performed with a thread-wire saw from behind and with scissors via the intraoral incision, was designed for muscular young males. Both approaches were effective in obtaining a desired degree of interincisal opening without facial damage.

  5. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
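The range- and quartile-based approximations discussed above can be sketched in a few lines. The specific formulas below (SD ≈ range/4, SD ≈ IQR/1.35, mean ≈ (q1 + median + q3)/3) are common textbook approximations in the spirit of the methods the review surveys, not necessarily the exact variants the authors evaluated.

```python
def sd_from_range(minimum, maximum):
    """Practical approximation: for roughly normal data the sample
    range spans about 4 standard deviations."""
    return (maximum - minimum) / 4.0

def sd_from_quartiles(q1, q3):
    """The IQR covers the middle 50% of a normal distribution,
    i.e. about 1.35 standard deviations."""
    return (q3 - q1) / 1.35

def mean_from_quartiles(q1, median, q3):
    """Simple estimator of the mean from the three quartiles."""
    return (q1 + median + q3) / 3.0

print(sd_from_range(2.0, 10.0))  # -> 2.0
print(sd_from_quartiles(4.0, 6.7))
print(mean_from_quartiles(4.0, 5.0, 6.9))
```

As the review notes, such algebraic imputations generally preserve meta-analytic precision better than omitting the affected trials, though their accuracy degrades for strongly skewed outcomes.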

  6. Digital morphogenesis via Schelling segregation

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2018-04-01

    Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, with prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
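A minimal version of the Schelling dynamics described above can be simulated directly. The grid size, tolerance threshold τ, and "jump to a random empty cell" move rule below are standard illustrative choices, not the specific variant analysed in the paper.

```python
import random

def neighbors(grid, n, i, j):
    """Types of the occupied Moore neighbors of cell (i, j) on a torus."""
    out = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if (di, dj) != (0, 0):
                t = grid[(i + di) % n][(j + dj) % n]
                if t is not None:
                    out.append(t)
    return out

def similarity(grid, n):
    """Mean fraction of same-type neighbors over all occupied cells."""
    fracs = []
    for i in range(n):
        for j in range(n):
            t = grid[i][j]
            if t is None:
                continue
            nb = neighbors(grid, n, i, j)
            if nb:
                fracs.append(sum(1 for x in nb if x == t) / len(nb))
    return sum(fracs) / len(fracs)

def run_schelling(n=25, empty_frac=0.1, tau=0.5, sweeps=40, seed=1):
    """Agents with < tau same-type neighbors jump to a random empty
    cell; returns (initial, final) mean same-type neighbor fraction."""
    rng = random.Random(seed)
    cells = [0, 1] * int(n * n * (1 - empty_frac) / 2)
    cells += [None] * (n * n - len(cells))
    rng.shuffle(cells)
    grid = [cells[i * n:(i + 1) * n] for i in range(n)]
    before = similarity(grid, n)
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                t = grid[i][j]
                if t is None:
                    continue
                nb = neighbors(grid, n, i, j)
                if nb and sum(1 for x in nb if x == t) / len(nb) < tau:
                    empties = [(a, b) for a in range(n) for b in range(n)
                               if grid[a][b] is None]
                    a, b = rng.choice(empties)
                    grid[a][b], grid[i][j] = t, None
    return before, similarity(grid, n)
```

Starting from a random mixture (similarity near 0.5), this dynamic typically drives the mean same-type neighbor fraction sharply upward, i.e. large homogeneous clusters form, which is exactly the self-organising behaviour the paper analyses rigorously.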

  7. Can power-law scaling and neuronal avalanches arise from stochastic dynamics?

    PubMed

    Touboul, Jonathan; Destexhe, Alain

    2010-02-11

    The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
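The paper's core point, that a straight-ish line on log-log axes is not evidence of a power law while a Kolmogorov-Smirnov test against the fitted power law can expose the mismatch, can be illustrated with synthetic data. The lognormal "event sizes" and the MLE (Hill-type) exponent estimator below are standard illustrative choices, not the LFP data or the exact procedure of the paper; strictly, a KS p-value against a fitted distribution needs bootstrap adjustment, but the rejection here is decisive enough for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Surrogate "event sizes": lognormal, i.e. no power law at all.
x = rng.lognormal(mean=0.0, sigma=1.0, size=20000)
xmin = 1.0
x = x[x >= xmin]

# MLE estimate of a power-law exponent above xmin.
alpha = 1.0 + x.size / np.sum(np.log(x / xmin))

# A regression on the log-log empirical survival function can look
# deceptively straight ...
xs = np.sort(x)
surv = 1.0 - np.arange(1, xs.size + 1) / (xs.size + 1)
slope, intercept, r, *_ = stats.linregress(np.log(xs), np.log(surv))

# ... but a KS test against the fitted power law rejects decisively.
def pareto_cdf(v):
    return 1.0 - (v / xmin) ** (1.0 - alpha)

d, p = stats.kstest(x, pareto_cdf)
print(f"alpha={alpha:.2f}  r^2={r**2:.3f}  KS p={p:.2g}")
```

This mirrors the paper's conclusion: thresholded stochastic processes readily produce apparent power-law scaling in logarithmic representations, and only the more stringent test distinguishes the two cases.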

  8. All Rigor and No Play Is No Way to Improve Learning

    ERIC Educational Resources Information Center

    Wohlwend, Karen; Peppler, Kylie

    2015-01-01

    The authors propose and discuss their Playshop curricular model, which they developed with teachers. Their studies suggest a playful approach supports even more rigor than the Common Core State Standards require for preschool and early grade children. Children keep their attention longer when learning comes in the form of something they can play…

  9. A statistical kinematic source inversion approach based on the QUESO library for uncertainty quantification and prediction

    NASA Astrophysics Data System (ADS)

    Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo

    2014-05-01

    Seismic data, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the question of which of the provided kinematic source inversion solutions is most reliable and most robust, and — more generally — how accurate fault parameterizations and solution predictions are. These issues are not addressed in "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based (a) on a forward-modeling scheme that computes synthetic (body-)waves for a given kinematic rupture model, and (b) on the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library, which uses MCMC algorithms and Bayes' theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e. for which the exact rupture history is known) in an attempt to identify the cross-over at which further model discretization (spatial and temporal resolution of the parameter space) no longer yields a decreasing misfit. Identification of this cross-over is of importance as it reveals the resolution power of the studied data set (i.e. teleseismic body waves), enabling one to constrain kinematic earthquake rupture histories of real earthquakes at a resolution that is supported by data. In addition, the Bayesian approach allows for mapping complete posterior probability density functions of the desired kinematic source parameters, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
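The Bayesian machinery described above (a forward model plus MCMC sampling of the posterior) can be illustrated at toy scale. The one-parameter "rupture" model, Gaussian noise level, and random-walk Metropolis sampler below are stand-ins for the body-wave forward model and the QUESO algorithms used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy forward model: the predicted "waveform" is linear in a single
# rupture parameter theta (a stand-in for slip amplitude).
t = np.linspace(0.0, 1.0, 200)
g = np.sin(2 * np.pi * 3 * t) * np.exp(-3 * t)
theta_true, sigma = 2.0, 0.1
data = theta_true * g + rng.normal(0.0, sigma, t.size)

def log_post(theta):
    """Gaussian likelihood, flat prior on [0, 10]."""
    if not 0.0 <= theta <= 10.0:
        return -np.inf
    resid = data - theta * g
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis sampling of the posterior.
samples, theta = [], 5.0
lp = log_post(theta)
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[5000:])  # discard burn-in
print(f"posterior mean={post.mean():.3f}  sd={post.std():.3f}")
```

The retained samples map out the full posterior density of the parameter, which is the feature the abstract highlights: uncertainty in the inverted source parameters is quantified directly, not just a single best-fit value.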

  10. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.

  11. Devising, Implementing, and Evaluating Interventions to Eliminate Health Care Disparities in Minority Children

    PubMed Central

    Flores, Glenn

    2010-01-01

    Despite an accumulating body of literature addressing racial/ethnic disparities in children’s health and health care, there have been few published studies of interventions that have been successful in eliminating these disparities. The objectives of this article, therefore, are to (1) describe 3 interventions that have been successful in eliminating racial/ethnic disparities in children’s health and health care, (2) highlight tips and pitfalls regarding devising, implementing, and evaluating pediatric disparities interventions, and (3) propose a research agenda for pediatric disparities interventions. Key characteristics of the 3 successful interventions include rigorous study designs; large sample sizes; appropriate comparison groups; community-based interventions that are culturally and linguistically sensitive and involve collaboration with participants; research staff from the same community as the participants; appropriate blinding of outcomes assessors; and statistical adjustment of outcomes for relevant covariates. On the basis of these characteristics, I propose tips, pitfalls, an approach, and a research agenda for devising, implementing, and evaluating successful pediatric disparities interventions. Examination of 3 successful interventions indicates that pediatric health care disparities can be eliminated. Achievement of this goal requires an intervention that is rigorous, evidence-based, and culturally and linguistically appropriate. The intervention must also include community collaboration, minimize attrition, adjust for potential confounders, and incorporate mechanisms for sustainability. PMID:19861473

  12. Devising, implementing, and evaluating interventions to eliminate health care disparities in minority children.

    PubMed

    Flores, Glenn

    2009-11-01

    Despite an accumulating body of literature addressing racial/ethnic disparities in children's health and health care, there have been few published studies of interventions that have been successful in eliminating these disparities. The objectives of this article, therefore, are to (1) describe 3 interventions that have been successful in eliminating racial/ethnic disparities in children's health and health care, (2) highlight tips and pitfalls regarding devising, implementing, and evaluating pediatric disparities interventions, and (3) propose a research agenda for pediatric disparities interventions. Key characteristics of the 3 successful interventions include rigorous study designs; large sample sizes; appropriate comparison groups; community-based interventions that are culturally and linguistically sensitive and involve collaboration with participants; research staff from the same community as the participants; appropriate blinding of outcomes assessors; and statistical adjustment of outcomes for relevant covariates. On the basis of these characteristics, I propose tips, pitfalls, an approach, and a research agenda for devising, implementing, and evaluating successful pediatric disparities interventions. Examination of 3 successful interventions indicates that pediatric health care disparities can be eliminated. Achievement of this goal requires an intervention that is rigorous, evidence-based, and culturally and linguistically appropriate. The intervention must also include community collaboration, minimize attrition, adjust for potential confounders, and incorporate mechanisms for sustainability.

  13. Rigorous Results for the Distribution of Money on Connected Graphs

    NASA Astrophysics Data System (ADS)

    Lanchier, Nicolas; Reed, Stephanie

    2018-05-01

    This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
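A minimal simulation of the first (uniform reshuffling) model is easy to write down; the agent count, money supply, and number of transactions below are arbitrary illustrative choices, and this is the global (mean-field) version rather than the graph-based variant the paper analyses.

```python
import random

def uniform_reshuffling(n_agents=1000, avg_money=10.0,
                        n_transactions=200000, seed=7):
    """Randomly chosen pairs of agents pool their money and split it
    uniformly at random; total money is conserved throughout."""
    rng = random.Random(seed)
    money = [avg_money] * n_agents
    for _ in range(n_transactions):
        i, j = rng.sample(range(n_agents), 2)
        pot = money[i] + money[j]
        share = rng.uniform(0.0, pot)
        money[i], money[j] = share, pot - share
    return money

money = uniform_reshuffling()
mean = sum(money) / len(money)
var = sum((m - mean) ** 2 for m in money) / len(money)
# For an exponential distribution the standard deviation equals the
# mean, so std/mean should be close to 1 at equilibrium.
print(f"mean={mean:.2f}  std/mean={var**0.5 / mean:.2f}")
```

This is the numerical behaviour the statistical physics literature observed and the paper proves rigorously: as time goes to infinity, the wealth distribution approaches the exponential.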

  14. Use of Spatial Epidemiology and Hot Spot Analysis to Target Women Eligible for Prenatal Women, Infants, and Children Services

    PubMed Central

    Krawczyk, Christopher; Gradziel, Pat; Geraghty, Estella M.

    2014-01-01

    Objectives. We used a geographic information system and cluster analyses to determine locations in need of enhanced Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) Program services. Methods. We linked documented births in the 2010 California Birth Statistical Master File with the 2010 data from the WIC Integrated Statewide Information System. Analyses focused on the density of pregnant women who were eligible for but not receiving WIC services in California’s 7049 census tracts. We used incremental spatial autocorrelation and hot spot analyses to identify clusters of WIC-eligible nonparticipants. Results. We detected clusters of census tracts with higher-than-expected densities, compared with the state mean density of WIC-eligible nonparticipants, in 21 of 58 (36.2%) California counties (P < .05). In subsequent county-level analyses, we located neighborhood-level clusters of higher-than-expected densities of eligible nonparticipants in Sacramento, San Francisco, Fresno, and Los Angeles Counties (P < .05). Conclusions. Hot spot analyses provided a rigorous and objective approach to determine the locations of statistically significant clusters of WIC-eligible nonparticipants. Results helped inform WIC program and funding decisions, including the opening of new WIC centers, and offered a novel approach for targeting public health services. PMID:24354821

  15. Protein Multiplexed Immunoassay Analysis with R.

    PubMed

    Breen, Edmond J

    2017-01-01

    Plasma samples from 177 control and type 2 diabetes patients collected at three Australian hospitals are screened for 14 analytes using six custom-made multiplex kits across 60 96-well plates. In total, 354 samples were collected from the patients, representing one baseline and one end point sample from each patient. R methods and source code for analyzing the analyte fluorescence response obtained from these samples by Luminex Bio-Plex® xMap multiplexed immunoassay technology are disclosed. Techniques and R procedures for reading Bio-Plex® result files for statistical analysis and data visualization are also presented. The need for technical replicates and the number of technical replicates are addressed, as are plate layout design strategies. Multinomial regression is used to determine plate-to-sample covariate balance. Methods for matching clinical covariate information to Bio-Plex® results and vice versa are given, as are methods for measuring and inspecting the quality of the fluorescence responses. Both fixed- and mixed-effect approaches for immunoassay statistical differential analysis are presented and discussed. A random effect approach to outlier analysis and detection is also shown. The bioinformatics R methodology presented here provides a foundation for rigorous and reproducible analysis of the fluorescence response obtained from multiplexed immunoassays.

  16. On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.

    PubMed

    Yang, Harry; Novick, Steven; Burdick, Richard K

    Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, the U.S. Food and Drug Administration (FDA) currently recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk, and approaches of varying statistical rigor are subsequently used for the three tiers of quality attributes. Key to the analyses of Tier 1 and Tier 2 quality attributes is the establishment of an equivalence acceptance criterion and a quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σ_R, where σ_R is the reference product variability estimated by the sample standard deviation S_R from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄_R ± K × σ_R, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of the test product must fall in the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation S_R underestimates the true reference product variability σ_R. As a result, substituting S_R for σ_R in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed.
A biosimilar is a generic version of an original biological drug product. A key component of biosimilar development is the demonstration of analytical similarity between the biosimilar and the reference product. Such demonstration relies on the application of statistical methods to establish a similarity margin and an appropriate test for equivalence between the two products. This paper discusses statistical issues with the demonstration of analytical similarity and provides alternate approaches to potentially mitigate these problems. © PDA, Inc. 2016.
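The central claim, that correlation among reference lots makes the sample standard deviation S_R biased low for σ_R, is easy to verify by simulation. The equicorrelated-lots model and the specific ρ and lot-count values below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_sample_sd(n_lots=10, rho=0.5, sigma=1.0, reps=5000):
    """Average sample SD of n_lots equicorrelated N(0, sigma^2) lots.
    A shared batch effect induces pairwise correlation rho while
    keeping each lot's marginal variance equal to sigma^2."""
    common = rng.normal(0.0, 1.0, size=(reps, 1))
    indiv = rng.normal(0.0, 1.0, size=(reps, n_lots))
    lots = sigma * (np.sqrt(rho) * common + np.sqrt(1 - rho) * indiv)
    return lots.std(axis=1, ddof=1).mean()

sd_corr = mean_sample_sd(rho=0.5)
sd_indep = mean_sample_sd(rho=0.0)
print(f"mean S_R with rho=0.5: {sd_corr:.3f}")
print(f"mean S_R with rho=0.0: {sd_indep:.3f}  (true sigma = 1.0)")
```

Under equicorrelation, E[S_R²] = σ²(1 − ρ), so with ρ = 0.5 the sample SD recovers only about 70% of the true lot-to-lot variability, narrowing the 1.5σ_R margin and the X̄_R ± K·σ_R quality range exactly as the paper argues.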

  17. Cerebrospinal Fluid Biomarkers for Huntington's Disease.

    PubMed

    Byrne, Lauren M; Wild, Edward J

    2016-01-01

    Cerebrospinal fluid (CSF) is enriched in brain-derived components and represents an accessible and appealing means of interrogating the CNS milieu to study neurodegenerative diseases and identify biomarkers to facilitate the development of novel therapeutics. Many such CSF biomarkers have been proposed for Huntington's disease (HD) but none has been validated for clinical trial use. Across many studies proposing dozens of biomarker candidates, there is a notable lack of statistical power, consistency, rigor and validation. Here we review proposed CSF biomarkers including neurotransmitters, transglutaminase activity, kynurenine pathway metabolites, oxidative stress markers, inflammatory markers, neuroendocrine markers, protein markers of neuronal death, proteomic approaches and mutant huntingtin protein itself. We reflect on the need for large-scale, standardized CSF collections with detailed phenotypic data to validate and qualify much-needed CSF biomarkers for clinical trial use in HD.

  18. Fundamental limit of nanophotonic light trapping in solar cells.

    PubMed

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2010-10-12

    Establishing the fundamental limit of nanophotonic light-trapping schemes is of paramount importance and is becoming increasingly urgent for current solar cell research. The standard theory of light trapping demonstrated that absorption enhancement in a medium cannot exceed a factor of 4n²/sin²θ, where n is the refractive index of the active layer, and θ is the angle of the emission cone in the medium surrounding the cell. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the conventional limit can be substantially surpassed when optical modes exhibit deep-subwavelength-scale field confinement, opening new avenues for highly efficient next-generation solar cells.

  19. An advanced kinetic theory for morphing continuum with inner structures

    NASA Astrophysics Data System (ADS)

    Chen, James

    2017-12-01

    Advanced kinetic theory with the Boltzmann-Curtiss equation provides a promising tool for polyatomic gas flows, especially for fluid flows containing inner structures, such as turbulence, polyatomic gas flows and others. Although a Hamiltonian-based distribution function was proposed for diatomic gas flow, a general distribution function for the generalized Boltzmann-Curtiss equations and polyatomic gas flow is still out of reach. With assistance from Boltzmann's entropy principle, a generalized Boltzmann-Curtiss distribution for polyatomic gas flow is introduced. The corresponding governing equations at equilibrium state are derived and compared with Eringen's morphing (micropolar) continuum theory derived under the framework of rational continuum thermomechanics. Although rational continuum thermomechanics has the advantages of mathematical rigor and simplicity, the presented statistical kinetic theory approach provides a clear physical picture for what the governing equations represent.

  20. Identification of dynamic systems, theory and formulation

    NASA Technical Reports Server (NTRS)

    Maine, R. E.; Iliff, K. W.

    1985-01-01

    The problem of estimating parameters of dynamic systems is addressed in order to present the theoretical basis of system identification and parameter estimation in a manner that is complete and rigorous, yet understandable with minimal prerequisites. Maximum likelihood and related estimators are highlighted. The approach used requires familiarity with calculus, linear algebra, and probability, but does not require knowledge of stochastic processes or functional analysis. The treatment emphasizes unification of the various areas of estimation theory; estimation in dynamic systems is treated as a direct outgrowth of static system theory. Topics covered include basic concepts and definitions; numerical optimization methods; probability; statistical estimators; estimation in static systems; stochastic processes; state estimation in dynamic systems; output error, filter error, and equation error methods of parameter estimation in dynamic systems; and the accuracy of the estimates.
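As a concrete instance of the equation error method listed above, the parameter of a scalar discrete-time system x[k+1] = a·x[k] + w[k] can be estimated by least squares, which coincides with the maximum likelihood estimate under Gaussian noise. The system, noise level, and data length below are illustrative choices, not an example from the report.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a scalar linear dynamic system x[k+1] = a*x[k] + w[k].
a_true, n = 0.8, 1000
x = np.zeros(n)
for k in range(n - 1):
    x[k + 1] = a_true * x[k] + rng.normal(0.0, 0.5)

# Equation error (least squares) estimate: regress x[k+1] on x[k].
a_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
print(f"estimated a = {a_hat:.3f} (true {a_true})")
```

Output error and filter error methods differ in where the noise is assumed to enter (measurement versus process), but this equation error case shows the basic pattern: a model structure, observed data, and an estimator that minimizes the prediction residuals.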

  1. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, Part-II: Models And Procedures (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

    This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.

  2. Testing for Mutagens Using Fruit Flies.

    ERIC Educational Resources Information Center

    Liebl, Eric C.

    1998-01-01

    Describes a laboratory employed in undergraduate teaching that uses fruit flies to test student-selected compounds for their ability to cause mutations. Requires no prior experience with fruit flies, incorporates a student design component, and employs both rigorous controls and statistical analyses. (DDR)

  3. A High Performance Computing Study of a Scalable FISST-Based Approach to Multi-Target, Multi-Sensor Tracking

    NASA Astrophysics Data System (ADS)

    Hussein, I.; Wilkins, M.; Roscoe, C.; Faber, W.; Chakravorty, S.; Schumacher, P.

    2016-09-01

    Finite Set Statistics (FISST) is a rigorous Bayesian multi-hypothesis management tool for the joint detection, classification and tracking of multi-sensor, multi-object systems. Implicit within the approach are solutions to the data association and target label-tracking problems. The full FISST filtering equations, however, are intractable. While FISST-based methods such as the PHD and CPHD filters are tractable, they require heavy moment approximations to the full FISST equations that result in a significant loss of information contained in the collected data. In this paper, we review Smart Sampling Markov Chain Monte Carlo (SSMCMC) that enables FISST to be tractable while avoiding moment approximations. We study the effect of tuning key SSMCMC parameters on tracking quality and computation time. The study is performed on a representative space object catalog with varying numbers of RSOs. The solution is implemented in the Scala programming language at the Maui High Performance Computing Center (MHPCC) facility.

  4. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495
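As a toy illustration of the distinction drawn in the abstract: a decision maker who merely wavers between two options behaves like a fair coin, while one with a fixed preference plus occasional error does not. An exact binomial tail check is a minimal sketch of that idea (QTest itself implements far more general order-constrained tests; the counts below are hypothetical):

```python
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical data: 20 repetitions of the same gamble pair, option A chosen 17 times.
n, k = 20, 17
p_value = binom_tail(n, k)  # small value argues against coin-flip wavering
```

A small tail probability here would favor a fixed preference with errors over pure wavering; QTest generalizes this to full systems of order constraints across many gamble pairs.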

  5. Rigor Mortis: Statistical thoroughness in reporting and the making of truth.

    PubMed

    Tal, Aner

    2016-02-01

    Should a uniform checklist be adopted for methodological and statistical reporting? The current article discusses this notion, with particular attention to the use of old versus new statistics, and a consideration of the arguments brought up by Von Roten. The article argues that an overly exhaustive checklist that is uniformly applied to all submitted papers may be unsuitable for multidisciplinary work, and would further result in undue clutter and potentially distract reviewers from pertinent considerations in their evaluation of research articles. © The Author(s) 2015.

  6. Zonation in the deep benthic megafauna : Application of a general test.

    PubMed

    Gardiner, Frederick P; Haedrich, Richard L

    1978-01-01

    A test based on Maxwell-Boltzmann statistics, instead of the formerly suggested but inappropriate Bose-Einstein statistics (Pielou and Routledge, 1976), examines the distribution along a gradient of the boundaries of species' ranges, and indicates whether they are random or clustered (zoned). The test is most useful as a preliminary to the application of more instructive but less statistically rigorous methods such as cluster analysis. The test indicates zonation is marked in the deep benthic megafauna living between 200 and 3000 m, but below 3000 m little zonation may be found.
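The logic of such a test, asking whether range boundaries along a depth gradient are clustered rather than randomly placed, can be sketched as a Monte Carlo version (an illustrative stand-in, not the Maxwell-Boltzmann-based statistic of the paper; the bin count and data are hypothetical):

```python
import random

def clumping_stat(boundaries, n_bins, lo=0.0, hi=1.0):
    """Number of pairs of boundaries sharing a bin; larger = more zoned."""
    counts = [0] * n_bins
    for b in boundaries:
        i = min(int((b - lo) / (hi - lo) * n_bins), n_bins - 1)
        counts[i] += 1
    return sum(c * (c - 1) // 2 for c in counts)

def zonation_p_value(boundaries, n_bins=10, n_sim=2000, seed=1):
    """Compare the observed clumping to boundaries placed uniformly at random."""
    rng = random.Random(seed)
    obs = clumping_stat(boundaries, n_bins)
    hits = sum(
        clumping_stat([rng.random() for _ in boundaries], n_bins) >= obs
        for _ in range(n_sim)
    )
    return (hits + 1) / (n_sim + 1)
```

A small p-value indicates the boundaries are more clustered (zoned) than random placement would produce.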

  7. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods as scaling laws, universality, fractal dimension, renormalization group, to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow to discriminate among different physical mechanisms responsible for earthquake triggering. 
In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.
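One of the phenomenological laws referred to above is the modified Omori-Utsu law for aftershock rate decay, lambda(t) = K / (c + t)^p; a minimal sketch with hypothetical parameter values:

```python
def omori_rate(t, K=100.0, c=0.1, p=1.2):
    """Modified Omori-Utsu aftershock rate (events/day) at time t after a mainshock."""
    return K / (c + t) ** p

def expected_count(t1, t2, K=100.0, c=0.1, p=1.2):
    """Expected aftershocks in [t1, t2]: the rate integrated in closed form (p != 1)."""
    F = lambda t: (c + t) ** (1 - p) / (1 - p)
    return K * (F(t2) - F(t1))

# e.g. expected aftershocks in the first 30 days (hypothetical parameters)
n_first_month = expected_count(0.0, 30.0)
```

Branching (ETAS-type) forecast models superpose such decaying rates from every past event, which is how the space-time-energy correlations discussed above enter the occurrence probability.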

  8. Differentiating Wheat Genotypes by Bayesian Hierarchical Nonlinear Mixed Modeling of Wheat Root Density

    PubMed Central

    Wasson, Anton P.; Chiu, Grace S.; Zwart, Alexander B.; Binns, Timothy R.

    2017-01-01

    Ensuring future food security for a growing population while climate change and urban sprawl put pressure on agricultural land will require sustainable intensification of current farming practices. For the crop breeder this means producing higher crop yields with less resources due to greater environmental stresses. While easy gains in crop yield have been made mostly “above ground,” little progress has been made “below ground”; and yet it is these root system traits that can improve productivity and resistance to drought stress. Wheat pre-breeders use soil coring and core-break counts to phenotype root architecture traits, with data collected on rooting density for hundreds of genotypes in small increments of depth. The measured densities are both large datasets and highly variable even within the same genotype, hence, any rigorous, comprehensive statistical analysis of such complex field data would be technically challenging. Traditionally, most attributes of the field data are therefore discarded in favor of simple numerical summary descriptors which retain much of the high variability exhibited by the raw data. This poses practical challenges: although plant scientists have established that root traits do drive resource capture in crops, traits that are more randomly (rather than genetically) determined are difficult to breed for. In this paper we develop a hierarchical nonlinear mixed modeling approach that utilizes the complete field data for wheat genotypes to fit, under the Bayesian paradigm, an “idealized” relative intensity function for the root distribution over depth. Our approach was used to determine heritability: how much of the variation between field samples was purely random vs. being mechanistically driven by the plant genetics? Based on the genotypic intensity functions, the overall heritability estimate was 0.62 (95% Bayesian confidence interval was 0.52 to 0.71). 
Despite root count profiles that were statistically very noisy, our approach led to denoised profiles which exhibited rigorously discernible phenotypic traits. Profile-specific traits could be representative of a genotype, and thus, used as a quantitative tool to associate phenotypic traits with specific genotypes. This would allow breeders to select for whole root system distributions appropriate for sustainable intensification, and inform policy for mitigating crop yield risk and food insecurity. PMID:28303148
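A simplified analogue of the heritability calculation, broad-sense heritability from balanced replicate field samples via one-way variance components, can be sketched as follows (the paper's Bayesian hierarchical model is far richer; this is only the classical ANOVA analogue, with hypothetical data):

```python
from statistics import fmean

def broad_sense_h2(samples_by_genotype):
    """H^2 = sigma^2_G / (sigma^2_G + sigma^2_E) from balanced replicates.
    samples_by_genotype: list of equal-length lists, one list per genotype."""
    g = len(samples_by_genotype)
    n = len(samples_by_genotype[0])            # replicates per genotype
    grand = fmean(x for grp in samples_by_genotype for x in grp)
    means = [fmean(grp) for grp in samples_by_genotype]
    ss_between = n * sum((m - grand) ** 2 for m in means)
    ss_within = sum((x - m) ** 2
                    for grp, m in zip(samples_by_genotype, means) for x in grp)
    ms_between = ss_between / (g - 1)
    ms_within = ss_within / (g * (n - 1))
    sigma2_g = max((ms_between - ms_within) / n, 0.0)  # genotypic variance component
    return sigma2_g / (sigma2_g + ms_within)

# hypothetical root-count summaries: three genotypes, two field cores each
h2 = broad_sense_h2([[1, 2], [3, 5], [7, 6]])
```

The paper's estimate of 0.62 answers the same question, how much of the field variation is genetic rather than random, but does so from whole denoised intensity profiles rather than scalar summaries.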

  9. A Statistical Comparative Planetology Approach to the Hunt for Habitable Exoplanets and Life Beyond the Solar System

    NASA Astrophysics Data System (ADS)

    Bean, Jacob L.; Abbot, Dorian S.; Kempton, Eliza M.-R.

    2017-06-01

    The search for habitable exoplanets and life beyond the solar system is one of the most compelling scientific opportunities of our time. Nevertheless, the high cost of building facilities that can address this topic and the keen public interest in the results of such research requires rigorous development of experiments that can deliver a definitive advancement in our understanding. Most work to date in this area has focused on a “systems science” approach of obtaining and interpreting comprehensive data for individual planets to make statements about their habitability and the possibility that they harbor life. This strategy is challenging because of the diversity of exoplanets, both observed and expected, and the limited information that can be obtained with astronomical instruments. Here, we propose a complementary approach that is based on performing surveys of key planetary characteristics and using statistical marginalization to answer broader questions than can be addressed with a small sample of objects. The fundamental principle of this comparative planetology approach is maximizing what can be learned from each type of measurement by applying it widely rather than requiring that multiple kinds of observations be brought to bear on a single object. As a proof of concept, we outline a survey of terrestrial exoplanet atmospheric water and carbon dioxide abundances that would test the habitable zone hypothesis and lead to a deeper understanding of the frequency of habitable planets. We also discuss ideas for additional surveys that could be developed to test other foundational hypotheses in this area.

  10. A Statistical Comparative Planetology Approach to the Hunt for Habitable Exoplanets and Life Beyond the Solar System

    NASA Astrophysics Data System (ADS)

    Abbot, D. S.; Bean, J. L.; Kempton, E.

    2017-12-01

    The search for habitable exoplanets and life beyond the solar system is one of the most compelling scientific opportunities of our time. Nevertheless, the high cost of building facilities that can address this topic and the keen public interest in the results of such research requires rigorous development of experiments that can deliver a definitive advancement in our understanding. Most work to date in this area has focused on a "systems science" approach of obtaining and interpreting comprehensive data for individual planets to make statements about their habitability and the possibility that they harbor life. This strategy is challenging because of the diversity of exoplanets, both observed and expected, and the limited information that can be obtained with astronomical instruments. Here, we propose a complementary approach that is based on performing surveys of key planetary characteristics and using statistical marginalization to answer broader questions than can be addressed with a small sample of objects. The fundamental principle of this comparative planetology approach is maximizing what can be learned from each type of measurement by applying it widely rather than requiring that multiple kinds of observations be brought to bear on a single object. As a proof of concept, we outline a survey of terrestrial exoplanet atmospheric water and carbon dioxide abundances that would test the habitable zone hypothesis and lead to a deeper understanding of the frequency of habitable planets. We also discuss ideas for additional surveys that could be developed to test other foundational hypotheses in this area.

  11. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    PubMed

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. 
We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.
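The probabilistic sampling idea behind such a design can be sketched as stratified random selection of survey cells (a generic illustration, not the IMBCR design itself; the stratum names and sampling fraction are hypothetical):

```python
import random

def stratified_sample(cells_by_stratum, fraction=0.1, seed=42):
    """Draw a probabilistic sample of survey cells from each stratum.
    cells_by_stratum: dict mapping stratum name -> list of cell ids."""
    rng = random.Random(seed)
    sample = {}
    for stratum, cells in cells_by_stratum.items():
        k = max(1, round(fraction * len(cells)))  # at least one cell per stratum
        sample[stratum] = rng.sample(cells, k)
    return sample

# hypothetical strata within a Bird Conservation Region
strata = {"grassland": list(range(100)), "badlands": list(range(100, 130))}
selected = stratified_sample(strata, fraction=0.1)
```

Because every cell has a known, nonzero inclusion probability, estimates from the sampled cells generalize defensibly to the whole region, which is the property ad hoc site selection lacks.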

  12. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions

    PubMed Central

    Hahn, Beth A.; Dreitz, Victoria J.; George, T. Luke

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer’s sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer’s sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. 
We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales. PMID:29065128

  13. DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES

    EPA Science Inventory

    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...
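Such an assessment typically culminates in an error (confusion) matrix, from which overall, user's, and producer's accuracies are derived; a minimal sketch with a hypothetical two-class matrix:

```python
def accuracy_from_error_matrix(m):
    """m[i][j]: count of sample pixels mapped as class i whose reference class is j."""
    n = len(m)
    total = sum(sum(row) for row in m)
    correct = sum(m[i][i] for i in range(n))
    overall = correct / total
    users = [m[i][i] / sum(m[i]) for i in range(n)]               # by map class (rows)
    producers = [m[j][j] / sum(row[j] for row in m) for j in range(n)]  # by reference class
    return overall, users, producers

# hypothetical matrix: classes (forest, non-forest), 100 reference pixels
overall, users, producers = accuracy_from_error_matrix([[50, 5], [10, 35]])
```

User's accuracy describes how reliable the map is for someone using it; producer's accuracy describes how well each reference class was captured.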

  14. Perspectives on statistics education: observations from statistical consulting in an academic nursing environment.

    PubMed

    Hayat, Matthew J; Schmiege, Sarah J; Cook, Paul F

    2014-04-01

    Statistics knowledge is essential for understanding the nursing and health care literature, as well as for applying rigorous science in nursing research. Statistical consultants providing services to faculty and students in an academic nursing program have the opportunity to identify gaps and challenges in statistics education for nursing students. This information may be useful to curriculum committees and statistics educators. This article aims to provide perspective on statistics education stemming from the experiences of three experienced statistics educators who regularly collaborate and consult with nurse investigators. The authors share their knowledge and express their views about data management, data screening and manipulation, statistical software, types of scientific investigation, and advanced statistical topics not covered in the usual coursework. The suggestions provided promote a call for data to study these topics. Relevant data about statistics education can assist educators in developing comprehensive statistics coursework for nursing students. Copyright 2014, SLACK Incorporated.

  15. Implementation and evaluation of an efficient secure computation system using ‘R’ for healthcare statistics

    PubMed Central

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-01-01

    Background and objective While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Materials and methods Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software ‘R’ by effectively combining secret-sharing-based secure computation with original computation. Results Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50 000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. Discussion If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using ‘R’ that works interactively while secure computation protocols generally require a significant amount of processing time. Conclusions We propose a secure statistical analysis system using ‘R’ for medical data that effectively integrates secret-sharing-based secure computation and original computation. PMID:24763677
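The secret-sharing idea underlying such a system can be sketched in a few lines: each record is split into random additive shares, parties aggregate their shares locally, and only the final statistic is ever reconstructed (a toy illustration of additive sharing over a prime field, not the authors' protocol):

```python
import random

P = 2**61 - 1  # public prime modulus; all arithmetic is mod P
_rng = random.Random(7)

def share(value, n_parties=3):
    """Split an integer into n additive shares that sum to value mod P."""
    shares = [_rng.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def secure_sum(records, n_parties=3):
    """Each record is shared; parties sum their shares locally, then combine."""
    all_shares = [share(v, n_parties) for v in records]
    party_totals = [sum(s[i] for s in all_shares) % P for i in range(n_parties)]
    return sum(party_totals) % P  # only the aggregate is reconstructed

total = secure_sum([10, 20, 30])  # mean follows as total / len(records)
```

No single party ever holds a record in the clear, which is the property that lets statistics be computed while the data remain encrypted.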

  16. Implementation and evaluation of an efficient secure computation system using 'R' for healthcare statistics.

    PubMed

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-10-01

    While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software 'R' by effectively combining secret-sharing-based secure computation with original computation. Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50,000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using 'R' that works interactively while secure computation protocols generally require a significant amount of processing time. We propose a secure statistical analysis system using 'R' for medical data that effectively integrates secret-sharing-based secure computation and original computation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  17. Multi-species Identification of Polymorphic Peptide Variants via Propagation in Spectral Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Na, Seungjin; Payne, Samuel H.; Bandeira, Nuno

    The spectral networks approach enables the detection of pairs of spectra from related peptides and thus allows for the propagation of annotations from identified peptides to unidentified spectra. Beyond allowing for unbiased discovery of unexpected post-translational modifications, spectral networks are also applicable to multi-species comparative proteomics or metaproteomics to identify numerous orthologous versions of a protein. We present algorithmic and statistical advances in spectral networks that have made it possible to rigorously assess the statistical significance of spectral pairs and accurately estimate the error rate of identifications via propagation. In the analysis of three related species of Cyanothece, a model organism for biohydrogen production, spectral networks identified peptides with highly divergent sequences with up to dozens of variants per peptide, including many novel peptides in species that lack a sequenced genome. Furthermore, spectral networks strongly suggested the presence of novel peptides even in genomically characterized species (i.e. missing from databases) in that a significant portion of unidentified multi-species networks included at least two polymorphic peptide variants.
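The propagation step can be sketched as a breadth-first walk over the network, carrying an annotation across edges labeled with mass differences (a toy illustration; the paper's method additionally scores the statistical significance of each spectral pair and controls the error rate of propagated identifications):

```python
from collections import deque

def propagate(edges, seeds):
    """Propagate peptide annotations through a spectral network.
    edges: dict node -> list of (neighbor, mass_delta) spectral pairs
    seeds: dict node -> annotation for spectra identified by database search."""
    annotated = dict(seeds)
    queue = deque(seeds)
    while queue:
        node = queue.popleft()
        for nbr, delta in edges.get(node, []):
            if nbr not in annotated:
                # hypothetical labeling: carry the source annotation plus mass offset
                annotated[nbr] = f"{annotated[node]}{delta:+.2f}"
                queue.append(nbr)
    return annotated

# hypothetical network: s1 identified, s2/s3 annotated via propagation
edges = {"s1": [("s2", 15.99)], "s2": [("s3", -0.98)]}
result = propagate(edges, {"s1": "PEPTIDE"})
```

Each accumulated mass offset is then interpreted as a modification or a sequence polymorphism relative to the seed peptide.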

  18. Practice-based evidence study design for comparative effectiveness research.

    PubMed

    Horn, Susan D; Gassaway, Julie

    2007-10-01

    To describe a new, rigorous, comprehensive practice-based evidence for clinical practice improvement (PBE-CPI) study methodology, and compare its features, advantages, and disadvantages to those of randomized controlled trials and sophisticated statistical methods for comparative effectiveness research. PBE-CPI incorporates natural variation within data from routine clinical practice to determine what works, for whom, when, and at what cost. It uses the knowledge of front-line caregivers, who develop study questions and define variables as part of a transdisciplinary team. Its comprehensive measurement framework provides a basis for analyses of significant bivariate and multivariate associations between treatments and outcomes, controlling for patient differences, such as severity of illness. PBE-CPI studies can uncover better practices more quickly than randomized controlled trials or sophisticated statistical methods, while achieving many of the same advantages. We present examples of actionable findings from PBE-CPI studies in postacute care settings related to comparative effectiveness of medications, nutritional support approaches, incontinence products, physical therapy activities, and other services. Outcomes improved when practices associated with better outcomes in PBE-CPI analyses were adopted in practice.

  19. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
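The core learning mechanism, Bayesian updating of beliefs over competing causal hypotheses from statistical evidence, can be sketched in a few lines (the "blicket detector" setup and numbers below are hypothetical):

```python
def bayes_update(priors, likelihoods, observation):
    """Posterior over hypotheses after one observation.
    likelihoods[h][observation] = P(observation | h)."""
    unnorm = {h: priors[h] * likelihoods[h][observation] for h in priors}
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

# Two causal hypotheses about a toy machine:
# A: the block activates it 90% of the time; B: only 10%.
priors = {"A": 0.5, "B": 0.5}
lik = {"A": {"on": 0.9, "off": 0.1}, "B": {"on": 0.1, "off": 0.9}}
post = bayes_update(priors, lik, "on")
```

Repeated application of `bayes_update` over a sequence of observations models how evidence progressively shifts a child's credence between causal structures.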

  20. When Assessment Data Are Words: Validity Evidence for Qualitative Educational Assessments.

    PubMed

    Cook, David A; Kuper, Ayelet; Hatala, Rose; Ginsburg, Shiphra

    2016-10-01

    Quantitative scores fail to capture all important features of learner performance. This awareness has led to increased use of qualitative data when assessing health professionals. Yet the use of qualitative assessments is hampered by incomplete understanding of their role in forming judgments, and lack of consensus in how to appraise the rigor of judgments therein derived. The authors articulate the role of qualitative assessment as part of a comprehensive program of assessment, and translate the concept of validity to apply to judgments arising from qualitative assessments. They first identify standards for rigor in qualitative research, and then use two contemporary assessment validity frameworks to reorganize these standards for application to qualitative assessment.Standards for rigor in qualitative research include responsiveness, reflexivity, purposive sampling, thick description, triangulation, transparency, and transferability. These standards can be reframed using Messick's five sources of validity evidence (content, response process, internal structure, relationships with other variables, and consequences) and Kane's four inferences in validation (scoring, generalization, extrapolation, and implications). Evidence can be collected and evaluated for each evidence source or inference. The authors illustrate this approach using published research on learning portfolios.The authors advocate a "methods-neutral" approach to assessment, in which a clearly stated purpose determines the nature of and approach to data collection and analysis. Increased use of qualitative assessments will necessitate more rigorous judgments of the defensibility (validity) of inferences and decisions. Evidence should be strategically sought to inform a coherent validity argument.

  1. A Semantic Transformation Methodology for the Secondary Use of Observational Healthcare Data in Postmarketing Safety Studies.

    PubMed

    Pacaci, Anil; Gonul, Suat; Sinaci, A Anil; Yuksel, Mustafa; Laleci Erturkmen, Gokce B

    2018-01-01

    Background: Utilization of the available observational healthcare datasets is key to complement and strengthen the postmarketing safety studies. Use of common data models (CDM) is the predominant approach in order to enable large scale systematic analyses on disparate data models and vocabularies. Current CDM transformation practices depend on proprietarily developed Extract-Transform-Load (ETL) procedures, which require knowledge both on the semantics and technical characteristics of the source datasets and target CDM. Purpose: In this study, our aim is to develop a modular but coordinated transformation approach in order to separate semantic and technical steps of transformation processes, which do not have a strict separation in traditional ETL approaches. Such an approach would discretize the operations to extract data from source electronic health record systems, alignment of the source, and target models on the semantic level and the operations to populate target common data repositories. Approach: In order to separate the activities that are required to transform heterogeneous data sources to a target CDM, we introduce a semantic transformation approach composed of three steps: (1) transformation of source datasets to Resource Description Framework (RDF) format, (2) application of semantic conversion rules to get the data as instances of ontological model of the target CDM, and (3) population of repositories, which comply with the specifications of the CDM, by processing the RDF instances from step 2. The proposed approach has been implemented on real healthcare settings where Observational Medical Outcomes Partnership (OMOP) CDM has been chosen as the common data model and a comprehensive comparative analysis between the native and transformed data has been conducted. Results: Health records of ~1 million patients have been successfully transformed to an OMOP CDM based database from the source database. 
Descriptive statistics obtained from the source and target databases present analogous and consistent results. Discussion and Conclusion: Our method goes beyond the traditional ETL approaches by being more declarative and rigorous. Declarative because the use of RDF based mapping rules makes each mapping more transparent and understandable to humans while retaining logic-based computability. Rigorous because the mappings would be based on computer readable semantics which are amenable to validation through logic-based inference methods.
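The first step of the proposed pipeline, expressing source rows as RDF-style triples under a declarative column-to-predicate mapping, can be sketched as follows (the `ex:` predicates and the record are hypothetical):

```python
def to_triples(record, subject_uri, mapping):
    """Express one source row as (subject, predicate, object) triples.
    mapping: source column -> predicate URI, kept separate from extraction code."""
    return [(subject_uri, mapping[col], value)
            for col, value in record.items() if col in mapping]

# hypothetical source row and column-to-predicate mapping
row = {"pid": "12345", "gender": "F", "birth_year": 1980}
mapping = {"gender": "ex:gender", "birth_year": "ex:yearOfBirth"}
triples = to_triples(row, "ex:patient/12345", mapping)
```

Keeping the mapping declarative is what makes each transformation rule transparent and separately validatable, as the conclusion above argues.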

  2. In search of a statistical probability model for petroleum-resource assessment : a critique of the probabilistic significance of certain concepts and methods used in petroleum-resource assessment : to that end, a probabilistic model is sketched

    USGS Publications Warehouse

    Grossling, Bernardo F.

    1975-01-01

    Exploratory drilling is still in incipient or youthful stages in those areas of the world where the bulk of the potential petroleum resources is yet to be discovered. Methods of assessing resources from projections based on historical production and reserve data are limited to mature areas. For most of the world's petroleum-prospective areas, a more speculative situation calls for a critical review of resource-assessment methodology. The language of mathematical statistics is required to define more rigorously the appraisal of petroleum resources. Basically, two approaches have been used to appraise the amounts of undiscovered mineral resources in a geologic province: (1) projection models, which use statistical data on the past outcome of exploration and development in the province; and (2) estimation models of the overall resources of the province, which use certain known parameters of the province together with the outcome of exploration and development in analogous provinces. These two approaches often lead to widely different estimates. Some of the controversy that arises results from a confusion of the probabilistic significance of the quantities yielded by each of the two approaches. Also, inherent limitations of analytic projection models, such as those using the logistic and Gompertz functions, have often been ignored. The resource-assessment problem should be recast in terms that provide for consideration of the probability of existence of the resource and of the probability of discovery of a deposit. Then the two above-mentioned models occupy the two ends of the probability range. The new approach accounts for (1) what can be expected with reasonably high certainty by mere projections of what has been accomplished in the past; (2) the inherent biases of decision-makers and resource estimators; (3) upper bounds that can be set up as goals for exploration; and (4) the uncertainties in geologic conditions in a search for minerals.
Actual outcomes can then be viewed as phenomena subject to statistical uncertainty and responsive to changes in economic and technologic factors.
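
    The projection models criticized above fit growth curves to cumulative discoveries; a sketch with invented parameters shows that logistic and Gompertz curves both rise toward an assumed ultimate-recovery asymptote, which is why early-stage data constrain that asymptote so poorly:

```python
import math

# Hypothetical ultimate recovery and curve parameters, for illustration only.
Q_MAX = 100.0  # assumed ultimate recovery (arbitrary units)

def logistic(t, k=0.3, t0=20.0):
    """Cumulative discoveries under a logistic projection model."""
    return Q_MAX / (1.0 + math.exp(-k * (t - t0)))

def gompertz(t, b=5.0, c=0.1):
    """Cumulative discoveries under a Gompertz projection model."""
    return Q_MAX * math.exp(-b * math.exp(-c * t))

# Both curves rise monotonically toward the same assumed asymptote; in an
# immature province (small t) the data sit far below Q_MAX, so the fit tells
# us little about the asymptote itself.
early = [logistic(t) for t in range(0, 10)]
late = [logistic(t) for t in range(60, 70)]
```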

  3. Comparative effectiveness research methodology using secondary data: A starting user's guide.

    PubMed

    Sun, Maxine; Lipsitz, Stuart R

    2018-04-01

    The use of secondary data, such as claims or administrative data, in comparative effectiveness research has grown tremendously in recent years. We believe that the current review can help investigators relying on secondary data to (1) gain insight into both the methodologies and statistical methods, (2) better understand the necessity of rigorous planning before initiating a comparative effectiveness investigation, and (3) optimize the quality of their investigations. Specifically, we review concepts of adjusted analyses and confounders, methods of propensity score analyses and instrumental variable analyses, risk prediction models (logistic and time-to-event), decision-curve analysis, and the interpretation of the P value and hypothesis testing. Overall, we hope that this review helps research investigators relying on secondary data to better understand the necessity of rigorous planning before study start and to gain better insight into the choice of statistical methods, so as to optimize the quality of their research. Copyright © 2017 Elsevier Inc. All rights reserved.
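
    A toy sketch of one of the reviewed methods, propensity-score matching; for simplicity the propensity score below comes from the known assignment model rather than a fitted logistic regression, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Toy data: one confounder x drives both treatment assignment and outcome.
x = rng.normal(size=n)
p_treat = 1.0 / (1.0 + np.exp(-1.5 * x))          # true assignment model
treated = rng.random(n) < p_treat
y = 2.0 * x + 1.0 * treated + rng.normal(size=n)  # true treatment effect = 1.0

# Propensity score from the (here, known) assignment model; in practice this
# would come from a logistic regression fitted on observed confounders.
ps = 1.0 / (1.0 + np.exp(-1.5 * x))

# A naive group comparison is confounded by x; 1:1 nearest-neighbor matching
# on the propensity score removes most of the bias.
naive = y[treated].mean() - y[~treated].mean()

ctrl_ps, ctrl_y = ps[~treated], y[~treated]
matched_diffs = []
for yi, pi in zip(y[treated], ps[treated]):
    j = np.argmin(np.abs(ctrl_ps - pi))  # nearest control by score
    matched_diffs.append(yi - ctrl_y[j])
matched = float(np.mean(matched_diffs))
```

    With these parameters the naive contrast is badly inflated, while the matched estimate lands near the true effect of 1.0.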

  4. On generic obstructions to recovering correct statistics from climate simulations: Homogenization for deterministic maps and multiplicative noise

    NASA Astrophysics Data System (ADS)

    Gottwald, Georg; Melbourne, Ian

    2013-04-01

    Whereas diffusion limits of stochastic multi-scale systems have a long and successful history, the case of constructing stochastic parametrizations of chaotic deterministic systems has been much less studied. We present rigorous results on the convergence of a chaotic slow-fast system to a stochastic differential equation with multiplicative noise. Furthermore, we present rigorous results for chaotic slow-fast maps, occurring as numerical discretizations of continuous time systems. This raises the issue of how to interpret certain stochastic integrals; surprisingly, in the case of maps the resulting integrals of the stochastic limit system are generically neither of Stratonovich nor of Ito type. It is shown that the limit system of a numerical discretisation differs from that of the associated continuous time system. This has important consequences for interpreting the statistics of long time simulations of multi-scale systems: they may be very different from those of the original continuous time system which we set out to study.
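
    The distinction between stochastic integral types can be made concrete with a toy linear SDE, dX = aX dt + bX dW: driven by the same noise path, an Euler-Maruyama scheme (which converges to the Ito solution) and a Heun midpoint scheme (which converges to the Stratonovich solution) differ in log X by approximately the drift correction b²T/2. This sketch is unrelated to the authors' systems:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, T, n = 0.1, 0.5, 1.0, 20000
dt = T / n
dW = rng.normal(scale=np.sqrt(dt), size=n)

x_ito, x_str = 1.0, 1.0
for w in dW:
    # Euler-Maruyama: converges to the Ito interpretation.
    x_ito += a * x_ito * dt + b * x_ito * w
    # Heun (midpoint) scheme: converges to the Stratonovich interpretation.
    pred = x_str + a * x_str * dt + b * x_str * w
    x_str += a * x_str * dt + 0.5 * b * (x_str + pred) * w

# Same noise path, different calculus: the logs differ by ~ b**2 * T / 2.
gap = np.log(x_str) - np.log(x_ito)
```

    The limit systems identified in the paper sit generically outside both of these two familiar interpretations.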

  5. Clinical decision making-a functional medicine perspective.

    PubMed

    Pizzorno, Joseph E

    2012-09-01

    As 21st century health care moves from a disease-based approach to a more patient-centric system that can address biochemical individuality to improve health and function, clinical decision making becomes more complex. Accentuating the problem is the lack of a clear standard for this more complex functional medicine approach. While there is relatively broad agreement in Western medicine for what constitutes competent assessment of disease and identification of related treatment approaches, the complex functional medicine model posits multiple and individualized diagnostic and therapeutic approaches, most or many of which have reasonable underlying science and principles, but which have not been rigorously tested in a research or clinical setting. This has led to non-rigorous thinking and sometimes to uncritical acceptance of both poorly documented diagnostic procedures and ineffective therapies, resulting in less than optimal clinical care.

  6. Clinical Decision Making—A Functional Medicine Perspective

    PubMed Central

    2012-01-01

    As 21st century health care moves from a disease-based approach to a more patient-centric system that can address biochemical individuality to improve health and function, clinical decision making becomes more complex. Accentuating the problem is the lack of a clear standard for this more complex functional medicine approach. While there is relatively broad agreement in Western medicine for what constitutes competent assessment of disease and identification of related treatment approaches, the complex functional medicine model posits multiple and individualized diagnostic and therapeutic approaches, most or many of which have reasonable underlying science and principles, but which have not been rigorously tested in a research or clinical setting. This has led to non-rigorous thinking and sometimes to uncritical acceptance of both poorly documented diagnostic procedures and ineffective therapies, resulting in less than optimal clinical care. PMID:24278827

  7. Ayahuasca-assisted therapy for addiction: results from a preliminary observational study in Canada.

    PubMed

    Thomas, Gerald; Lucas, Philippe; Capler, N Rielle; Tupper, Kenneth W; Martin, Gina

    2013-03-01

    This paper reports results from a preliminary observational study of ayahuasca-assisted treatment for problematic substance use and stress delivered in a rural First Nations community in British Columbia, Canada. The "Working with Addiction and Stress" retreats combined four days of group counselling with two expert-led ayahuasca ceremonies. This study collected pre-treatment and six months follow-up data from 12 participants on several psychological and behavioral factors related to problematic substance use, and qualitative data assessing the personal experiences of the participants six months after the retreat. Statistically significant (p < 0.05) improvements were demonstrated for scales assessing hopefulness, empowerment, mindfulness, and quality of life meaning and outlook subscales. Self-reported alcohol, tobacco and cocaine use declined, although cannabis and opiate use did not; reported reductions in problematic cocaine use were statistically significant. All study participants reported positive and lasting changes from participating in the retreats. This form of ayahuasca-assisted therapy appears to be associated with statistically significant improvements in several factors related to problematic substance use among a rural aboriginal population. These findings suggest participants may have experienced positive psychological and behavioral changes in response to this therapeutic approach, and that more rigorous research of ayahuasca-assisted therapy for problematic substance use is warranted.

  8. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
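
    A minimal numeric sketch of the efficiency-weighted Cq idea (the Cq values are made-up illustrations, not data from the paper); with all efficiencies equal to 2, the ratio reduces to the familiar 2^(-ΔΔCq) form:

```python
import math

def weighted_cq(efficiency, cq):
    """Efficiency-weighted Cq value: log10(E) * Cq."""
    return math.log10(efficiency) * cq

def expression_ratio(target_ctrl, target_trt, ref_ctrl, ref_trt):
    """Relative expression (treated vs control), computed in log10 scale.

    Each argument is an (efficiency, Cq) pair; the numbers used below are
    illustrative, not data from the paper.
    """
    d_target = weighted_cq(*target_ctrl) - weighted_cq(*target_trt)
    d_ref = weighted_cq(*ref_ctrl) - weighted_cq(*ref_trt)
    return 10 ** (d_target - d_ref)

# With perfect doubling (E = 2) this reduces to the classic 2**(-ddCq):
ratio = expression_ratio(
    target_ctrl=(2.0, 24.0), target_trt=(2.0, 22.0),  # target drops 2 cycles
    ref_ctrl=(2.0, 20.0), ref_trt=(2.0, 20.0),        # reference unchanged
)
```

    Because each reaction carries its own efficiency into log10(E) · Cq, well-specific efficiencies drop in naturally, and unpaired designs work because the weighted values, not ratios of paired samples, are the unit of analysis.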

  9. A Recommended Procedure for Estimating the Cosmic-Ray Spectral Parameter of a Simple Power Law With Applications to Detector Design

    NASA Technical Reports Server (NTRS)

    Howell, L. W.

    2001-01-01

    A simple power law model consisting of a single spectral index α1 is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10^13 eV. Two procedures for estimating α1, the method of moments and maximum likelihood (ML), are developed and their statistical performance compared. It is concluded that the ML procedure attains the most desirable statistical properties and is hence the recommended statistical estimation procedure for estimating α1. The ML procedure is then generalized for application to a set of real cosmic-ray data and thereby makes this approach applicable to existing cosmic-ray data sets. Several other important results, such as the relationship between collecting power and detector energy resolution, as well as the inclusion of a non-Gaussian detector response function, are presented. These results have many practical benefits in the design phase of a cosmic-ray detector, as they permit instrument developers to make important trade studies in design parameters as a function of one of the science objectives. This is particularly important for space-based detectors, where physical parameters such as dimension and weight impose rigorous practical limits on the design envelope.
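
    For an ideal power law with no detector response, the ML estimator has the closed form alpha_hat = 1 + n / Σ ln(E_i / E_min); a sketch on synthetic draws (this is the textbook estimator, not the paper's full procedure with detector effects):

```python
import numpy as np

rng = np.random.default_rng(2)

# Draw synthetic "proton energies" from a pure power law f(E) ~ E**(-alpha)
# above E_min, via inverse-transform sampling (alpha and E_min are made up).
alpha_true, e_min, n = 2.7, 1.0, 200000
u = rng.random(n)
energies = e_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

# Closed-form maximum-likelihood estimate of the spectral index.
alpha_ml = 1.0 + n / np.sum(np.log(energies / e_min))
```

    The estimator's standard error scales as (alpha - 1)/sqrt(n), which is one way the trade between collecting power and statistical precision enters detector design.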

  10. Use of FEV1 in Cystic Fibrosis Epidemiologic Studies and Clinical Trials: A Statistical Perspective for the Clinical Researcher

    PubMed Central

    Szczesniak, Rhonda; Heltshe, Sonya L.; Stanojevic, Sanja; Mayer-Hamblett, Nicole

    2017-01-01

    Background Forced expiratory volume in 1 second (FEV1) is an established marker of cystic fibrosis (CF) disease progression that is used to capture clinical course and evaluate therapeutic efficacy. The research community has established FEV1 surveillance data through a variety of observational data sources such as patient registries, and there is a growing pipeline of new CF therapies demonstrated to be efficacious in clinical trials by establishing improvements in FEV1. Results In this review, we summarize from a statistical perspective the clinical relevance of FEV1 based on its association with morbidity and mortality in CF, its role in epidemiologic studies of disease progression and comparative effectiveness, and its utility in clinical trials. In addition, we identify opportunities to advance epidemiologic research and the clinical development pipeline through further statistical considerations. Conclusions Our understanding of CF disease course, therapeutics, and clinical care has evolved immensely in the past decades, in large part due to the thoughtful application of rigorous research methods and meaningful clinical endpoints such as FEV1. A continued commitment to conduct research that minimizes the potential for bias, maximizes the limited patient population, and harmonizes approaches to FEV1 analysis while maintaining clinical relevance, will facilitate further opportunities to advance CF care. PMID:28117136

  11. Analyzing Single-Molecule Time Series via Nonparametric Bayesian Inference

    PubMed Central

    Hines, Keegan E.; Bankston, John R.; Aldrich, Richard W.

    2015-01-01

    The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. PMID:25650922
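
    The Dirichlet-process prior underlying such nonparametric methods can be illustrated through its Chinese restaurant process representation, in which the number of occupied states is not fixed in advance but grows roughly logarithmically with the number of observations. A minimal sketch of the prior alone, not the authors' inference machinery:

```python
import numpy as np

def crp_state_count(n, alpha, rng):
    """Sample the number of occupied states under a CRP(alpha) prior."""
    counts = []  # observations assigned to each state so far
    for _ in range(n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)   # open a new state
        else:
            counts[k] += 1     # join an existing state
    return len(counts)

rng = np.random.default_rng(3)
# Expected number of states grows ~ alpha * log(n), not linearly with n,
# so model complexity adapts to the data rather than being set beforehand.
mean_states = np.mean([crp_state_count(100, 1.0, rng) for _ in range(200)])
```

    This adaptivity is what lets the number of biophysical states be inferred from a single-molecule record instead of chosen by model selection.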

  12. Fish-Eye Observing with Phased Array Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Wijnholds, S. J.

    The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field of view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data, and source confusion.
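
    A miniature version of the least-squares parameter estimation idea: for a single dominant point source the visibility model is rank one, V ≈ g gᴴ, so the per-element complex gains g can be estimated from the dominant eigenvector of the measured matrix. Everything below is a synthetic illustration, not the thesis pipeline:

```python
import numpy as np

rng = np.random.default_rng(4)
n_ant = 32

# True complex element gains and an ideal point-source visibility model
# V = g g^H, plus measurement noise (all values synthetic).
g = (1.0 + 0.1 * rng.normal(size=n_ant)) * np.exp(1j * rng.uniform(0, 2 * np.pi, n_ant))
noise = 0.01 * (rng.normal(size=(n_ant, n_ant)) + 1j * rng.normal(size=(n_ant, n_ant)))
V = np.outer(g, g.conj()) + noise
V = 0.5 * (V + V.conj().T)  # enforce Hermitian symmetry

# Least-squares rank-1 factorization: the dominant eigenvector of V
# estimates g up to an overall phase and scale.
w, U = np.linalg.eigh(V)
g_hat = U[:, -1] * np.sqrt(w[-1])

# Alignment with the true gains, insensitive to the unknown global phase.
corr = abs(np.vdot(g_hat, g)) / (np.linalg.norm(g_hat) * np.linalg.norm(g))
```

    The global phase ambiguity seen here is a simple instance of the non-identifiable parameters that a full calibration error analysis must account for.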

  13. Empirical validation of an agent-based model of wood markets in Switzerland

    PubMed Central

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Own surveys closed gaps where data was not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300

  14. Trends in Mediation Analysis in Nursing Research: Improving Current Practice.

    PubMed

    Hertzog, Melody

    2018-06-01

    The purpose of this study was to describe common approaches used by nursing researchers to test mediation models and evaluate them within the context of current methodological advances. MEDLINE was used to locate studies testing a mediation model and published from 2004 to 2015 in nursing journals. Design (experimental/correlation, cross-sectional/longitudinal, model complexity) and analysis (method, inclusion of test of mediated effect, violations/discussion of assumptions, sample size/power) characteristics were coded for 456 studies. General trends were identified using descriptive statistics. Consistent with findings of reviews in other disciplines, evidence was found that nursing researchers may not be aware of the strong assumptions and serious limitations of their analyses. Suggestions for strengthening the rigor of such studies and an overview of current methods for testing more complex models, including longitudinal mediation processes, are presented.
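
    One of the current methods alluded to above, a percentile-bootstrap test of the indirect effect, can be sketched as follows; the data and path coefficients are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300

# Synthetic mediation data: X -> M -> Y with true paths a = 0.5, b = 0.4,
# so the true indirect effect a*b = 0.2.
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.2 * x + rng.normal(size=n)

def indirect_effect(x, m, y):
    """Product-of-coefficients estimate a*b from two OLS fits."""
    a = np.polyfit(x, m, 1)[0]                    # slope of M ~ X
    X = np.column_stack([np.ones_like(x), x, m])  # Y ~ 1 + X + M
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b

# Percentile bootstrap CI for the indirect effect, preferred over the
# causal-steps approach because a*b is generally not normally distributed.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
```

    A CI excluding zero supports mediation without relying on the normality assumption that the review identifies as a common weakness.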

  15. Rigorous diffraction analysis using geometrical theory of diffraction for future mask technology

    NASA Astrophysics Data System (ADS)

    Chua, Gek S.; Tay, Cho J.; Quan, Chenggen; Lin, Qunying

    2004-05-01

    Advanced lithographic techniques such as phase shift masks (PSM) and optical proximity correction (OPC) result in more complex mask designs and technology. In contrast to binary masks, which have only transparent and nontransparent regions, phase shift masks also incorporate transparent features with a different optical thickness and a modified phase of the transmitted light. PSMs are well known to show prominent diffraction effects, which cannot be described by the assumption of an infinitely thin mask (the Kirchhoff approach) used in many commercial photolithography simulators. Correct prediction of sidelobe printability, process windows, and the linearity of OPC masks requires the application of rigorous diffraction theory. The problem of aerial image intensity imbalance through focus with alternating phase shift masks (altPSMs) is analyzed and compared between a finite-difference time-domain (FDTD) algorithm (TEMPEST) and the geometrical theory of diffraction (GTD). Using GTD, with the solutions to the canonical problems, we obtain a relationship between an edge on the mask and the disturbance in image space. The main interest is to develop useful formulations that can be readily applied to rigorous diffraction analysis for future mask technology. Analysis of rigorous diffraction effects for altPSMs using the GTD approach is discussed.

  16. A Multiphase Validation of Atlas-Based Automatic and Semiautomatic Segmentation Strategies for Prostate MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Spencer; Rodrigues, George, E-mail: george.rodrigues@lhsc.on.ca; Department of Epidemiology/Biostatistics, University of Western Ontario, London

    2013-01-01

    Purpose: To perform a rigorous technological assessment and statistical validation of a software technology for anatomic delineations of the prostate on MRI datasets. Methods and Materials: A 3-phase validation strategy was used. Phase I consisted of anatomic atlas building using 100 prostate cancer MRI data sets to provide training data sets for the segmentation algorithms. In phase II, 2 experts contoured 15 new MRI prostate cancer cases using 3 approaches (manual, N points, and region of interest). In phase III, 5 new physicians with variable MRI prostate contouring experience segmented the same 15 phase II datasets using 3 approaches: manual, N points with no editing, and full autosegmentation with user editing allowed. Statistical analyses for time and accuracy (using the Dice similarity coefficient) endpoints used traditional descriptive statistics, analysis of variance, analysis of covariance, and the pooled Student t test. Results: In phase I, average (SD) total and per slice contouring times for the 2 physicians were 228 (75), 17 (3.5), 209 (65), and 15 (3.9) seconds, respectively. In phase II, statistically significant differences in physician contouring time were observed based on physician, type of contouring, and case sequence. The N points strategy resulted in superior segmentation accuracy when initial autosegmented contours were compared with final contours. In phase III, statistically significant differences in contouring time were again observed based on physician, type of contouring, and case sequence. The average relative time savings for N points and autosegmentation were 49% and 27%, respectively, compared with manual contouring. The N points and autosegmentation strategies resulted in average Dice values of 0.89 and 0.88, respectively. Pre- and postedited autosegmented contours demonstrated a higher average Dice similarity coefficient of 0.94. Conclusion: The software provided robust contours with minimal editing required.
Observed time savings were seen for all physicians irrespective of experience level and baseline manual contouring speed.
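
    The accuracy endpoint used above, the Dice similarity coefficient, is straightforward to compute; a minimal sketch on toy binary masks (illustrative arrays, not study data):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks: 2|A∩B|/(|A|+|B|)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two overlapping toy segmentation masks on a 1D grid (illustrative only).
auto = np.array([0, 1, 1, 1, 1, 0, 0])
manual = np.array([0, 0, 1, 1, 1, 1, 0])
score = dice(auto, manual)
```

    Values near 1 indicate near-perfect overlap, which is why the 0.94 figure for edited autosegmented contours supports the conclusion of minimal editing.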

  17. A statistical physics perspective on criticality in financial markets

    NASA Astrophysics Data System (ADS)

    Bury, Thomas

    2013-11-01

    Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood.

  18. Integration of Technology into the Classroom: Case Studies.

    ERIC Educational Resources Information Center

    Johnson, D. LaMont, Ed.; Maddux, Cleborne D., Ed.; Liu, Leping, Ed.

    This book contains the following case studies on the integration of technology in education: (1) "First Steps toward a Statistically Generated Information Technology Integration Model" (D. LaMont Johnson and Leping Liu); (2) "Case Studies: Are We Rejecting Rigor or Rediscovering Richness?" (Cleborne D. Maddux); (3)…

  19. A Psychometric Evaluation of the Digital Logic Concept Inventory

    ERIC Educational Resources Information Center

    Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.

    2014-01-01

    Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric…

  20. Community, State, and Federal Approaches to Cumulative Risk Assessment: Challenges and Opportunities for Integration

    EPA Science Inventory

    Community, state, and federal approaches to conventional and cumulative risk assessment (CRA) were described and compared to assess similarities and differences, and develop recommendations for a consistent CRA approach, acceptable across each level as a rigorous scientific metho...

  1. Methodological Developments in Geophysical Assimilation Modeling

    NASA Astrophysics Data System (ADS)

    Christakos, George

    2005-06-01

    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. 
In the epistemic cognition context the assimilation concept may be used to investigate critical issues related to knowledge reliability, such as uncertainty due to model structure error (conceptual uncertainty).

  2. A Hilly path through the thermodynamics and statistical mechanics of protein solutions.

    PubMed

    Wills, Peter R

    2016-12-01

    The opus of Don Winzor in the fields of physical and analytical biochemistry is a major component of that certain antipodean approach to this broad area of research that blossomed in the second half of the twentieth century. The need to formulate problems in terms of thermodynamic nonideality posed the challenge of describing a clear route from molecular interactions to the parameters that biochemists routinely measure. Mapping out this route required delving into the statistical mechanics of solutions of macromolecules, and at every turn, mathematically complex, rigorous, and general results that had been derived previously, often by Terrell Hill, came to the fore. Central to this work were the definition of the "thermodynamic activity", the pivotal position of the polynomial expansion of the osmotic pressure in terms of molar concentration, and the relationship of virial coefficients to details of the forces between limited-size groups of interacting molecules. All of this was richly exploited in the task of taking account of excluded volume and electrostatic interactions, especially in the use of sedimentation equilibrium to determine values of constants for molecular association reactions. Such an approach has proved relevant to the study of molecular interactions generally, even those between the main macromolecular solute and components of the solvent, by using techniques such as exclusion and affinity chromatography as well as light scattering.
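
    The osmotic pressure expansion referred to above is commonly written, in one standard textbook form (the symbols here are the conventional ones, not taken from the paper):

```latex
\frac{\Pi}{RT} = \frac{c}{M} + A_2 c^2 + A_3 c^3 + \cdots
```

    where Π is the osmotic pressure, c the mass concentration, M the molar mass, and A2, A3 the second and third virial coefficients encoding pair and triplet interactions such as excluded volume and electrostatic repulsion. It is through these coefficients that molecular-scale forces surface in the quantities biochemists actually measure.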

  3. Quantifying Trace Amounts of Aggregates in Biopharmaceuticals Using Analytical Ultracentrifugation Sedimentation Velocity: Bayesian Analyses and F Statistics.

    PubMed

    Wafer, Lucas; Kloczewiak, Marek; Luo, Yin

    2016-07-01

    Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique.
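
    The F-statistic idea for deciding whether a trace peak is real can be sketched as a nested linear-model comparison; the peak positions, widths, and amplitudes below are invented for illustration, and this is not the SEDFIT workflow itself:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic c(s)-like profile: a large monomer peak plus a trace "HMMS" peak.
s = np.linspace(0, 15, 300)

def peak(mu, sig):
    return np.exp(-0.5 * ((s - mu) / sig) ** 2)

data = 1.0 * peak(4.0, 0.5) + 0.05 * peak(6.5, 0.5) + 0.005 * rng.normal(size=s.size)

def rss(basis):
    """Residual sum of squares of a linear least-squares fit."""
    X = np.column_stack(basis)
    coef, *_ = np.linalg.lstsq(X, data, rcond=None)
    r = data - X @ coef
    return float(r @ r), X.shape[1]

rss1, p1 = rss([peak(4.0, 0.5)])                   # monomer only
rss2, p2 = rss([peak(4.0, 0.5), peak(6.5, 0.5)])   # monomer + trace peak
n = s.size

# F statistic for the nested comparison: is the extra peak justified?
F = ((rss1 - rss2) / (p2 - p1)) / (rss2 / (n - p2))
```

    A large F (relative to the F distribution with p2-p1 and n-p2 degrees of freedom) indicates the extra peak explains real structure rather than noise, which is how artifactual ripple peaks can be rigorously excluded.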

  4. Investigating the feasibility of using partial least squares as a method of extracting salient information for the evaluation of digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Zhang, George Z.; Myers, Kyle J.; Park, Subok

    2013-03-01

    Digital breast tomosynthesis (DBT) has shown promise for improving the detection of breast cancer, but it has not yet been fully optimized due to a large space of system parameters to explore. A task-based statistical approach [1] is a rigorous method for evaluating and optimizing this promising imaging technique with the use of optimal observers such as the Hotelling observer (HO). However, the high data dimensionality found in DBT has been the bottleneck for the use of a task-based approach in DBT evaluation. To reduce data dimensionality while extracting salient information for performing a given task, efficient channels have to be used for the HO. In the past few years, 2D Laguerre-Gauss (LG) channels, which are a complete basis for stationary backgrounds and rotationally symmetric signals, have been utilized for DBT evaluation [2, 3]. But since background and signal statistics from DBT data are neither stationary nor rotationally symmetric, LG channels may not be efficient in providing reliable performance trends as a function of system parameters. Recently, partial least squares (PLS) has been shown to generate efficient channels for the Hotelling observer in detection tasks involving random backgrounds and signals [4]. In this study, we investigate the use of PLS as a method for extracting salient information from DBT in order to better evaluate such systems.
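
    For a scalar class label, the first PLS weight vector in the NIPALS construction is proportional to the centered cross-covariance Xᵀy, which suggests a minimal sketch of a data-driven channel; the detection task, background model, and all parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy detection task: signal-absent vs signal-present "images" (flattened),
# with a crudely correlated background; all parameters are invented.
n_pix, n_train = 64, 2000

def gaussian_profile(center=32, width=3.0):
    return np.exp(-0.5 * ((np.arange(n_pix) - center) / width) ** 2)

signal = gaussian_profile()                  # known signal profile
bg = rng.normal(size=(n_train, n_pix))
bg = bg + 0.5 * np.roll(bg, 1, axis=1)       # introduce pixel correlation
labels = rng.random(n_train) < 0.5
images = bg + 0.5 * np.outer(labels, signal)

# First PLS channel: for a scalar label, the first NIPALS weight vector is
# proportional to X^T y after centering, i.e. a data-driven channel.
Xc = images - images.mean(axis=0)
yc = labels - labels.mean()
w = Xc.T @ yc
w /= np.linalg.norm(w)

# Channelized scalar test statistic and its class separation (an SNR).
t = images @ w
snr = abs(t[labels].mean() - t[~labels].mean()) / np.sqrt(
    0.5 * (t[labels].var() + t[~labels].var()))
```

    Unlike a fixed LG basis, the channel here is learned from the data, so it adapts to non-stationary, non-symmetric signal and background statistics.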

  5. Circular codes revisited: a statistical approach.

    PubMed

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

    In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has provoked great interest and underwent a rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach in order to try to answer different questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability but, still, there exists a great variability among sequences. Second, we focus on such code and explore the role played by the proportion of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored as to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes with reading frame synchronization. Copyright © 2011 Elsevier Ltd. All rights reserved.
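
    A permutation test of the kind described can be sketched in miniature: compare the frame-0 coverage of a candidate codon set against coverage after shuffling the bases, which preserves base composition. The codon set and sequence below are invented, not the actual C(3) codes:

```python
import random

random.seed(8)

# Hypothetical "code" trinucleotides (NOT the actual code of Arques and
# Michel) and a toy coding sequence enriched in them in frame 0.
CODE = {"AAC", "GAT", "CTG", "TTC"}
seq = ("AACGATCTGTTC" * 25) + "".join(random.choice("ACGT") for _ in range(120))

def coverage(s):
    """Fraction of frame-0 codons that belong to the candidate code."""
    codons = [s[i:i + 3] for i in range(0, len(s) - 2, 3)]
    return sum(c in CODE for c in codons) / len(codons)

observed = coverage(seq)

# Permutation test: shuffle bases (preserving composition) to build the
# null distribution of coverage, then compute a one-sided p-value.
n_perm = 500
null = []
bases = list(seq)
for _ in range(n_perm):
    random.shuffle(bases)
    null.append(coverage("".join(bases)))
p_value = sum(v >= observed for v in null) / n_perm
```

    A small p-value indicates coverage beyond what base composition alone would produce, which is the kind of frame-specific optimization the authors probe with their hierarchy of permutation tests.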

  6. Phases, phase equilibria, and phase rules in low-dimensional systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frolov, T., E-mail: timfrol@berkeley.edu; Mishin, Y., E-mail: ymishin@gmu.edu

    2015-07-28

    We present a unified approach to the thermodynamic description of one-, two-, and three-dimensional phases and phase transformations among them. The approach is based on a rigorous definition of a phase applicable to thermodynamic systems of any dimensionality. Within this approach, the same thermodynamic formalism can be applied for the description of phase transformations in bulk systems, interfaces, and line defects separating interface phases. For both lines and interfaces, we rigorously derive an adsorption equation, the phase coexistence equations, and other thermodynamic relations expressed in terms of generalized line and interface excess quantities. As a generalization of the Gibbs phase rule for bulk phases, we derive phase rules for lines and interfaces and predict the maximum number of phases that may coexist in systems of the respective dimensionality.
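    As a point of reference, the classical bulk Gibbs phase rule that the paper generalizes can be written down directly; the generalized line and interface rules derived in the paper are not reproduced here.

```python
# Classical bulk Gibbs phase rule: F = C - P + 2, where C is the number of
# chemical components, P the number of coexisting phases, and F the number
# of degrees of freedom. The paper derives analogous rules for interfaces
# and line defects; only the familiar 3D bulk rule is sketched here.
def gibbs_phase_rule(components, phases):
    return components - phases + 2

f = gibbs_phase_rule(components=2, phases=3)  # binary system, 3 phases: F = 1
```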

  7. Improved key-rate bounds for practical decoy-state quantum-key-distribution systems

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng

    2017-01-01

    The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
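    The gap between the two fluctuation analyses can be illustrated with generic textbook forms of the bounds (not the exact bounds derived in the paper): for the same failure probability, a simple multiplicative-Chernoff-style deviation term exceeds the Gaussian one.

```python
import math

def gaussian_delta(n, eps):
    """Gaussian-approximation fluctuation term z_eps * sqrt(n), where z_eps
    is the standard-normal upper quantile, found here by bisection on erfc."""
    lo, hi = 0.0, 10.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if 0.5 * math.erfc(mid / math.sqrt(2)) > eps:
            lo = mid
        else:
            hi = mid
    return lo * math.sqrt(n)

def chernoff_delta(n, eps):
    """Chernoff-style fluctuation term: from P(X <= E[X] - d) <= exp(-d^2 / (2 E[X])),
    solving exp(-d^2 / (2n)) = eps gives d = sqrt(2 n ln(1/eps))."""
    return math.sqrt(2 * n * math.log(1 / eps))

# Example: 10^4 detection events, failure probability 10^-10 for each bound.
n, eps = 10_000, 1e-10
g, c = gaussian_delta(n, eps), chernoff_delta(n, eps)  # c > g: Chernoff is looser
```

    A looser fluctuation term shrinks the estimated single-photon yield and hence the key rate, which is why tightening the Chernoff analysis, as the paper does, recovers most of the Gaussian performance.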

  8. Assessing the Ecological Condition of Streams in a Southeastern Brazilian Basin using a Probabilistic Monitoring Design

    EPA Science Inventory

    Prompt assessment and management actions are required if we are to reduce the current rapid loss of habitat and biodiversity worldwide. Statistically valid quantification of the biota and habitat condition in water bodies are prerequisites for rigorous assessment of aquatic biodi...

  9. Acquiring data for large aquatic resource surveys: the art of compromise among science, logistics, and reality

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is revising its strategy to obtain the information needed to answer questions pertinent to water-quality management efficiently and rigorously at national scales. One tool of this revised strategy is use of statistically based surveys ...

  10. Examining Multidimensional Middle Grade Outcomes after Early Elementary School Grade Retention

    ERIC Educational Resources Information Center

    Hwang, Sophia; Cappella, Elise; Schwartz, Kate

    2016-01-01

    Recently, researchers have begun to employ rigorous statistical methods and developmentally-informed theories to evaluate outcomes for students retained in non-kindergarten early elementary school. However, the majority of this research focuses on academic outcomes. Gaps remain regarding retention's effects on psychosocial outcomes important to…

  11. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.

  12. Ergodicity of Truncated Stochastic Navier Stokes with Deterministic Forcing and Dispersion

    NASA Astrophysics Data System (ADS)

    Majda, Andrew J.; Tong, Xin T.

    2016-10-01

    Turbulence in idealized geophysical flows is a very rich and important topic. The anisotropic effects of explicit deterministic forcing, dispersive effects from rotation due to the β-plane and F-plane, and topography together with random forcing all combine to produce a remarkable number of realistic phenomena. These effects have been studied through careful numerical experiments in the truncated geophysical models. These important results include transitions between coherent jets and vortices, and direct and inverse turbulence cascades as parameters are varied, and it is a contemporary challenge to explain these diverse statistical predictions. Here we contribute to these issues by proving with full mathematical rigor that for any values of the deterministic forcing, the β- and F-plane effects and topography, with minimal stochastic forcing, there is geometric ergodicity for any finite Galerkin truncation. This means that there is a unique smooth invariant measure which attracts all statistical initial data at an exponential rate. In particular, this rigorous statistical theory guarantees that there are no bifurcations to multiple stable and unstable statistical steady states as geophysical parameters are varied in contrast to claims in the applied literature. The proof utilizes a new statistical Lyapunov function to account for enstrophy exchanges between the statistical mean and the variance fluctuations due to the deterministic forcing. It also requires careful proofs of hypoellipticity with geophysical effects and uses geometric control theory to establish reachability. To illustrate the necessity of these conditions, a two-dimensional example is developed which has the square of the Euclidean norm as the Lyapunov function and is hypoelliptic with nonzero noise forcing, yet fails to be reachable or ergodic.

  13. EVALUATION OF THE COLD PIPE PRECHARGER

    EPA Science Inventory

    The article gives results of an evaluation of the performance of the cold pipe precharger, taking a more rigorous approach than had been previously taken. The approach required detailed descriptions of electrical characteristics, electro-hydrodynamics, and charging theory. The co...

  14. Forecasting volatility with neural regression: a contribution to model adequacy.

    PubMed

    Refenes, A N; Holt, W T

    2001-01-01

    Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
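    The classical Durbin-Watson statistic that the paper generalizes to neural regression can be sketched directly; the synthetic residual series below are illustrative only.

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic d = sum_t (e_t - e_{t-1})^2 / sum_t e_t^2.
    d is near 2 for uncorrelated residuals and approaches 0 under strong
    positive first-order autocorrelation."""
    e = np.asarray(residuals, dtype=float)
    return float(np.sum(np.diff(e) ** 2) / np.sum(e ** 2))

rng = np.random.default_rng(0)
white = rng.normal(size=2000)           # well-specified model: white residuals
ar1 = np.empty(2000)                    # misspecified model: AR(1) residuals
ar1[0] = white[0]
for t in range(1, 2000):
    ar1[t] = 0.9 * ar1[t - 1] + white[t]

d_white = durbin_watson(white)          # expect d close to 2
d_ar1 = durbin_watson(ar1)              # expect d well below 2
```

    The paper's contribution is evaluating the distribution of such a statistic for neural estimators via a generalized influence matrix, which the classical formula above does not address.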

  15. Increased scientific rigor will improve reliability of research and effectiveness of management

    USGS Publications Warehouse

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. 
Such research also draws inferences that are robust to idiosyncratic observations and unavoidable human biases. Offering only post hoc interpretations of statistical patterns (i.e., a posteriori hypotheses) adds to uncertainty because it increases the number of plausible biological explanations without determining which have the greatest support. Further, post hoc interpretations are strongly subject to human biases. Testing hypotheses maximizes the credibility of research findings, makes the strongest contributions to theory and management, and improves reproducibility of research. Management decisions based on rigorous research are most likely to result in effective conservation of wildlife resources. 

  16. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.

  17. Tailoring Systems Engineering Projects for Small Satellite Missions

    NASA Technical Reports Server (NTRS)

    Horan, Stephen; Belvin, Keith

    2013-01-01

    NASA maintains excellence in its spaceflight systems by utilizing rigorous engineering processes based on over 50 years of experience. The NASA systems engineering process for flight projects described in NPR 7120.5E was initially developed for major flight projects. The design and development of low-cost small satellite systems does not entail the financial and risk consequences traditionally associated with spaceflight projects. Consequently, an approach is offered for tailoring these processes so that small satellite missions benefit from engineering rigor without overly burdensome overhead. In this paper we outline approaches to tailoring the standard processes for these small missions and describe how the tailoring will be applied in a proposed small satellite mission.

  18. A Rigorous Framework for Optimization of Expensive Functions by Surrogates

    NASA Technical Reports Server (NTRS)

    Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which design application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
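    A stripped-down version of the search step of such a surrogate framework can be sketched as follows, with an invented cheap stand-in for the expensive objective; the pattern-search poll step, which supplies the convergence guarantee in the actual framework, is only noted in comments.

```python
import numpy as np

def expensive_f(x):
    # Hypothetical stand-in for an expensive objective (true minimum at (1.2, -0.7)).
    return float((x[0] - 1.2) ** 2 + (x[1] + 0.7) ** 2)

def features(P):
    # Quadratic basis [1, x1, x2, x1^2, x2^2, x1*x2] for the surrogate.
    x1, x2 = P[:, 0], P[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 2))               # initial design sites
y = np.array([expensive_f(x) for x in X])          # expensive evaluations
coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)  # fit the surrogate

# Search step: minimize the cheap surrogate (here, on a coarse grid) and
# propose its minimizer as the next expensive evaluation. A full framework
# would follow with a pattern-search poll step when the search fails to
# improve; that step is omitted in this sketch.
g = np.linspace(-3, 3, 121)
G = np.array([[a, b] for a in g for b in g])
cand = G[np.argmin(features(G) @ coef)]
```

    The point of the framework is that the surrogate only has to propose good candidates; derivatives of the expensive objective are never required.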

  19. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, the multiple correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on the Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Scalable privacy-preserving data sharing methodology for genome-wide association studies.

    PubMed

    Yu, Fei; Fienberg, Stephen E; Slavković, Aleksandra B; Uhler, Caroline

    2014-08-01

    The protection of privacy of individual-level information in genome-wide association study (GWAS) databases has been a major concern of researchers following the publication of "an attack" on GWAS data by Homer et al. (2008). Traditional statistical methods for confidentiality and privacy protection of statistical databases do not scale well to deal with GWAS data, especially in terms of guarantees regarding protection from linkage to external information. The more recent concept of differential privacy, introduced by the cryptographic community, is an approach that provides a rigorous definition of privacy with meaningful privacy guarantees in the presence of arbitrary external information, although the guarantees may come at a serious price in terms of data utility. Building on such notions, Uhler et al. (2013) proposed new methods to release aggregate GWAS data without compromising an individual's privacy. We extend the methods developed in Uhler et al. (2013) for releasing differentially-private χ²-statistics by allowing for an arbitrary number of cases and controls, and for releasing differentially-private allelic test statistics. We also provide a new interpretation by assuming the controls' data are known, which is a realistic assumption because some GWAS use publicly available data as controls. We assess the performance of the proposed methods through a risk-utility analysis on a real data set consisting of DNA samples collected by the Wellcome Trust Case Control Consortium and compare the methods with the differentially-private release mechanism proposed by Johnson and Shmatikov (2013). Copyright © 2014 Elsevier Inc. All rights reserved.
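    The Laplace mechanism underlying such differentially private releases can be sketched generically. The sensitivity bound and the χ² value below are invented for illustration; the paper derives the actual sensitivities of the test statistics it releases.

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sample from Laplace(0, scale) using only the stdlib.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def release_statistic(true_value, sensitivity, epsilon, rng):
    """epsilon-differentially private release via the Laplace mechanism:
    add Laplace noise with scale = sensitivity / epsilon."""
    return true_value + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(42)
chi2 = 7.31        # hypothetical chi-square value for one SNP (illustrative)
sens = 4.0         # assumed sensitivity bound, not the paper's derived value
noisy = release_statistic(chi2, sens, epsilon=1.0, rng=rng)
```

    The utility cost mentioned in the abstract is visible here: smaller epsilon (stronger privacy) means larger noise scale, so the released statistic drifts further from its true value.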

  1. Parametric Cost Modeling of Space Missions Using the Develop New Projects (DNP) Implementation Process

    NASA Technical Reports Server (NTRS)

    Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith

    2000-01-01

    This paper presents an overview of a parametric cost model that has been built at JPL to estimate costs of future, deep space, robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as develop new products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion and also for combining actual data with expert opinion to provide a cost database for future missions. In addition, the DNP cost model has a maximum of objective cost drivers which reduces the likelihood of model input error. Version 2 is now under development which expands the model capabilities, links it more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model will be discussed, as well as its background, development approach, status, validation, and future plans.

  2. Herbal medicine development: a plea for a rigorous scientific foundation.

    PubMed

    Lietman, Paul S

    2012-09-01

    Science, including rigorous basic scientific research and rigorous clinical research, must underlie both the development and the clinical use of herbal medicines. Yet almost none of the hundreds or thousands of articles published each year on some aspect of herbal medicines adheres to the 3 simple but profound scientific principles that must underlie all herbal drug development and clinical use. Three fundamental principles should underlie everyone's thinking about the development and/or clinical use of any herbal medicine. (1) There must be standardization and regulation (rigorously enforced) of the product being studied or being used clinically. (2) There must be scientific proof of a beneficial clinical effect of value to the patient, established by rigorous clinical research. (3) There must be scientific proof of safety (acceptable toxicity) for the patient, established by rigorous clinical research. These fundamental principles of science have ramifications for both the scientist and the clinician. It is critically important that both the investigator and the prescriber know exactly what is in the studied or recommended product and how effective and toxic it is. We will find new and useful drugs from natural sources. However, we will have to learn how to study herbal medicines rigorously, and we will have to try to convince the believers in herbal medicines of the wisdom and even the necessity of a rigorous scientific approach to herbal medicine development. Both biomedical science and practicing physicians must enthusiastically accept the responsibility for searching for truth in the discovery and development of new herbal medicines, in truthful teaching about herbal medicines from a scientific perspective, and in the scientifically proven clinical use of herbal medicines.

  3. Inference and Analysis of Population Structure Using Genetic Data and Network Theory

    PubMed Central

    Greenbaum, Gili; Templeton, Alan R.; Bar-David, Shirli

    2016-01-01

    Clustering individuals to subpopulations based on genetic data has become commonplace in many genetic studies. Inference about population structure is most often done by applying model-based approaches, aided by visualization using distance-based approaches such as multidimensional scaling. While existing distance-based approaches suffer from a lack of statistical rigor, model-based approaches entail assumptions of prior conditions such as that the subpopulations are at Hardy-Weinberg equilibria. Here we present a distance-based approach for inference about population structure using genetic data by defining population structure using network theory terminology and methods. A network is constructed from a pairwise genetic-similarity matrix of all sampled individuals. The community partition, a partition of a network to dense subgraphs, is equated with population structure, a partition of the population to genetically related groups. Community-detection algorithms are used to partition the network into communities, interpreted as a partition of the population to subpopulations. The statistical significance of the structure can be estimated by using permutation tests to evaluate the significance of the partition’s modularity, a network theory measure indicating the quality of community partitions. To further characterize population structure, a new measure of the strength of association (SA) for an individual to its assigned community is presented. The strength of association distribution (SAD) of the communities is analyzed to provide additional population structure characteristics, such as the relative amount of gene flow experienced by the different subpopulations and identification of hybrid individuals. Human genetic data and simulations are used to demonstrate the applicability of the analyses. 
The approach presented here provides a novel, computationally efficient model-free method for inference about population structure that does not entail assumption of prior conditions. The method is implemented in the software NetStruct (available at https://giligreenbaum.wordpress.com/software/). PMID:26888080
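    The modularity-significance idea can be sketched with a pure-Python permutation test on a toy graph; the graph, partition sizes, and permutation count are illustrative, and the paper's NetStruct software implements the full analysis.

```python
import random

def modularity(adj, communities):
    """Newman modularity Q of a node partition of an undirected graph.
    adj: symmetric adjacency matrix (list of lists); communities: list of sets."""
    m2 = sum(sum(row) for row in adj)          # 2m: every edge counted twice
    deg = [sum(row) for row in adj]
    q = 0.0
    for comm in communities:
        for i in comm:
            for j in comm:
                q += adj[i][j] - deg[i] * deg[j] / m2
    return q / m2

def permuted_q(adj, sizes, rng):
    # Null model: randomly reassign nodes to communities of the same sizes.
    nodes = list(range(len(adj)))
    rng.shuffle(nodes)
    parts, k = [], 0
    for s in sizes:
        parts.append(set(nodes[k:k + s]))
        k += s
    return modularity(adj, parts)

# Toy graph: two 4-node cliques joined by a single bridge edge.
n = 8
adj = [[0] * n for _ in range(n)]
for a in range(4):
    for b in range(4):
        if a != b:
            adj[a][b] = adj[a + 4][b + 4] = 1
adj[3][4] = adj[4][3] = 1

parts = [set(range(4)), set(range(4, 8))]      # the "detected" communities
q_obs = modularity(adj, parts)
rng = random.Random(0)
null = [permuted_q(adj, [4, 4], rng) for _ in range(499)]
p = (1 + sum(q >= q_obs for q in null)) / 500  # permutation p-value
```

    A small p indicates the detected partition is far denser within communities than random label assignments, which is the criterion the approach equates with genuine population structure.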

  4. Inference and Analysis of Population Structure Using Genetic Data and Network Theory.

    PubMed

    Greenbaum, Gili; Templeton, Alan R; Bar-David, Shirli

    2016-04-01

    Clustering individuals to subpopulations based on genetic data has become commonplace in many genetic studies. Inference about population structure is most often done by applying model-based approaches, aided by visualization using distance-based approaches such as multidimensional scaling. While existing distance-based approaches suffer from a lack of statistical rigor, model-based approaches entail assumptions of prior conditions such as that the subpopulations are at Hardy-Weinberg equilibria. Here we present a distance-based approach for inference about population structure using genetic data by defining population structure using network theory terminology and methods. A network is constructed from a pairwise genetic-similarity matrix of all sampled individuals. The community partition, a partition of a network to dense subgraphs, is equated with population structure, a partition of the population to genetically related groups. Community-detection algorithms are used to partition the network into communities, interpreted as a partition of the population to subpopulations. The statistical significance of the structure can be estimated by using permutation tests to evaluate the significance of the partition's modularity, a network theory measure indicating the quality of community partitions. To further characterize population structure, a new measure of the strength of association (SA) for an individual to its assigned community is presented. The strength of association distribution (SAD) of the communities is analyzed to provide additional population structure characteristics, such as the relative amount of gene flow experienced by the different subpopulations and identification of hybrid individuals. Human genetic data and simulations are used to demonstrate the applicability of the analyses. 
The approach presented here provides a novel, computationally efficient model-free method for inference about population structure that does not entail assumption of prior conditions. The method is implemented in the software NetStruct (available at https://giligreenbaum.wordpress.com/software/). Copyright © 2016 by the Genetics Society of America.

  5. Detection of crossover time scales in multifractal detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Ge, Erjia; Leung, Yee

    2013-04-01

    Fractal analysis is employed in this paper as a scale-based method for the identification of the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common estimation method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe multi-scaling behaviors of fractals. Through regression analysis and statistical inference, we can (1) identify crossover time scales that cannot be detected by eyeballing, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish a statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall of Hong Kong. Through the proposed model, we can gain a deeper understanding of fractals in general and a statistical approach to identifying multi-scaling behavior under MF-DFA in particular.
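    The crossover-identification idea can be sketched as a continuous two-segment ("broken-stick") regression on a synthetic log-log fluctuation curve; the slopes, crossover location, and grid below are invented for the example, and the paper's model adds the inferential machinery (confidence intervals, significance tests) around such an estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic log-log fluctuation curve: slope 0.9 below the crossover at
# log10(s) = 1.5 and slope 0.5 above it (values chosen for illustration),
# plus a little observational noise.
x = np.linspace(0.5, 3.0, 60)                     # log10 of time scale s
y = np.where(x < 1.5, 0.9 * x, 0.5 * x + 0.6) + rng.normal(0, 0.01, x.size)

def rss_with_crossover(x, y, cx):
    """Residual sum of squares of a continuous two-segment linear fit
    with a slope change at cx."""
    X = np.column_stack([np.ones_like(x), x, np.clip(x - cx, 0, None)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ coef) ** 2))

# Grid-search the crossover location by minimizing the RSS, replacing
# eyeball detection with an explicit fitting criterion.
grid = np.linspace(0.8, 2.7, 96)
best = grid[np.argmin([rss_with_crossover(x, y, c) for c in grid])]
```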

  6. Sex Differences in the Response of Children with ADHD to Once-Daily Formulations of Methylphenidate

    ERIC Educational Resources Information Center

    Sonuga-Barke, J. S.; Coghill, David; Markowitz, John S.; Swanson, James M.; Vandenberghe, Mieke; Hatch, Simon J.

    2007-01-01

    Objectives: Studies of sex differences in methylphenidate response by children with attention-deficit/hyperactivity disorder have lacked methodological rigor and statistical power. This paper reports an examination of sex differences based on further analysis of data from a comparison of two once-daily methylphenidate formulations (the COMACS…

  7. The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2018-01-01

    The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However the role of this software in facilitating supererogated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…

  8. Meeting the needs of an ever-demanding market.

    PubMed

    Rigby, Richard

    2002-04-01

    Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.

  9. Critical Examination of Candidates' Diversity Competence: Rigorous and Systematic Assessment of Candidates' Efficacy to Teach Diverse Student Populations

    ERIC Educational Resources Information Center

    Benton-Borghi, Beatrice Hope; Chang, Young Mi

    2011-01-01

    The National Center for Educational Statistics (NCES, 2010) continues to report substantial underachievement of diverse student populations in the nation's schools. After decades of focus on diversity and multicultural education, integrated with field and clinical practice, candidates continue to graduate without adequate knowledge, skills and…

  10. State College- and Career-Ready High School Graduation Requirements. Updated

    ERIC Educational Resources Information Center

    Achieve, Inc., 2013

    2013-01-01

    Research by Achieve, ACT, and others suggests that for high school graduates to be prepared for success in a wide range of postsecondary settings, they need to take four years of challenging mathematics--covering Advanced Algebra; Geometry; and data, probability, and statistics content--and four years of rigorous English aligned with college- and…

  11. High School Redesign. Diplomas Count, 2016. Education Week. Volume 35, Number 33

    ERIC Educational Resources Information Center

    Edwards, Virginia B., Ed.

    2016-01-01

    This year's report focuses on efforts to redesign high schools. Those include incorporating student voice, implementing a rigorous and relevant curriculum, embracing career exploration, and more. The report also includes the latest statistics on the nation's overall, on-time high school graduation rate. Articles include: (1) To Build a Better High…

  12. Interactive visual analysis promotes exploration of long-term ecological data

    Treesearch

    T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst

    2013-01-01

    Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...

  13. SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach.

    PubMed

    Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang

    2017-01-01

    As genomic data are usually large in scale and highly sensitive, it is essential to enable analysis that is both efficient and secure, whereby the data owner can securely delegate both computation and storage to an untrusted public cloud. Counting queries over genotypes are a basic function for many downstream applications in biomedical research (e.g., computing allele frequencies, calculating chi-squared statistics, etc.). Previous solutions show promise for secure counting of outsourced data, but efficiency remains a major limitation for real-world applications. In this paper, we propose a novel hybrid solution that combines a rigorous theoretical model (homomorphic encryption) with the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up computation while preserving the privacy of both data owners and data users. Our results demonstrate efficiency using real data from the Personal Genome Project.
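    The additively homomorphic half of a hybrid scheme like the one described above can be sketched with a toy Paillier cryptosystem, one standard additively homomorphic scheme: multiplying ciphertexts adds the underlying plaintexts, so an untrusted party can accumulate encrypted genotype counts without decrypting them. This is an illustration of the general technique, not SCOTCH's implementation; the tiny primes and function names are assumptions for readability, and real deployments require large keys, a hardened library, and the SGX component the paper combines with encryption.

```python
import math
import random

# Toy Paillier cryptosystem (Python 3.9+ for math.lcm and pow(x, -1, m)).
# Additively homomorphic: Dec(c1 * c2 mod n^2) = m1 + m2 mod n.

def keygen(p=1009, q=1013):
    """Tiny demonstration primes; real keys are thousands of bits."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid because the generator is g = n + 1
    return (n, n * n), (lam, mu)  # (public key, private key)

def encrypt(pub, m, rng=random.Random(0)):  # shared seeded RNG for the demo
    n, n2 = pub
    r = rng.randrange(1, n)
    while math.gcd(r, n) != 1:    # r must be a unit modulo n
        r = rng.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, n2 = pub
    lam, mu = priv
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def homomorphic_sum(pub, ciphertexts):
    """Multiply ciphertexts modulo n^2, which adds the hidden counts."""
    n, n2 = pub
    total = 1
    for c in ciphertexts:
        total = (total * c) % n2
    return total
```

    A server holding only the public key can run `homomorphic_sum` over encrypted 0/1 genotype indicators; the data owner decrypts a single ciphertext to learn the count.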

  14. A genetically informed study of the association between harsh punishment and offspring behavioral problems.

    PubMed

    Lynch, Stacy K; Turkheimer, Eric; D'Onofrio, Brian M; Mendle, Jane; Emery, Robert E; Slutske, Wendy S; Martin, Nicholas G

    2006-06-01

    Conclusions about the effects of harsh parenting on children have been limited by research designs that cannot control for genetic or shared environmental confounds. The present study used a sample of children of twins and a hierarchical linear modeling statistical approach to analyze the consequences of varying levels of punishment while controlling for many confounding influences. The sample of 887 twin pairs and 2,554 children came from the Australian Twin Registry. Although corporal punishment per se did not have significant associations with negative childhood outcomes, harsher forms of physical punishment did appear to have specific and significant effects. The observed association between harsh physical punishment and negative outcomes in children survived a relatively rigorous test of its causal status, thereby increasing the authors' conviction that harsh physical punishment is a serious risk factor for children. ((c) 2006 APA, all rights reserved).

  15. Controls of multi-modal wave conditions in a complex coastal setting

    USGS Publications Warehouse

    Hegermiller, Christie; Rueda, Ana C.; Erikson, Li H.; Barnard, Patrick L.; Antolinez, J.A.A.; Mendez, Fernando J.

    2017-01-01

    Coastal hazards emerge from the combined effect of wave conditions and sea level anomalies associated with storms or low-frequency atmosphere-ocean oscillations. Rigorous characterization of wave climate is limited by the availability of spectral wave observations, the computational cost of dynamical simulations, and the ability to link wave-generating atmospheric patterns with coastal conditions. We present a hybrid statistical-dynamical approach to simulating nearshore wave climate in complex coastal settings, demonstrated in the Southern California Bight, where waves arriving from distant, disparate locations are refracted over complex bathymetry and shadowed by offshore islands. Contributions of wave families and large-scale atmospheric drivers to nearshore wave energy flux are analyzed. Results highlight the variability of influences controlling wave conditions along neighboring coastlines. The universal method demonstrated here can be applied to complex coastal settings worldwide, facilitating analysis of the effects of climate change on nearshore wave climate.

  16. Controls of Multimodal Wave Conditions in a Complex Coastal Setting

    NASA Astrophysics Data System (ADS)

    Hegermiller, C. A.; Rueda, A.; Erikson, L. H.; Barnard, P. L.; Antolinez, J. A. A.; Mendez, F. J.

    2017-12-01

    Coastal hazards emerge from the combined effect of wave conditions and sea level anomalies associated with storms or low-frequency atmosphere-ocean oscillations. Rigorous characterization of wave climate is limited by the availability of spectral wave observations, the computational cost of dynamical simulations, and the ability to link wave-generating atmospheric patterns with coastal conditions. We present a hybrid statistical-dynamical approach to simulating nearshore wave climate in complex coastal settings, demonstrated in the Southern California Bight, where waves arriving from distant, disparate locations are refracted over complex bathymetry and shadowed by offshore islands. Contributions of wave families and large-scale atmospheric drivers to nearshore wave energy flux are analyzed. Results highlight the variability of influences controlling wave conditions along neighboring coastlines. The universal method demonstrated here can be applied to complex coastal settings worldwide, facilitating analysis of the effects of climate change on nearshore wave climate.

  17. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

    One of the main modelling paradigms for complex physical systems are networks. When estimating the network structure from measured signals, typically several assumptions such as stationarity are made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics that are immediately sensibly applicable to measured signals. We demonstrate its performance in a simulation study. In experiments of transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understand and diagnose numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches followed so far lack.

  18. Mindfulness Meditation for Chronic Pain: Systematic Review and Meta-analysis.

    PubMed

    Hilton, Lara; Hempel, Susanne; Ewing, Brett A; Apaydin, Eric; Xenakis, Lea; Newberry, Sydne; Colaiaco, Ben; Maher, Alicia Ruelaz; Shanman, Roberta M; Sorbero, Melony E; Maglione, Margaret A

    2017-04-01

    Chronic pain patients increasingly seek treatment through mindfulness meditation. This study aims to synthesize evidence on efficacy and safety of mindfulness meditation interventions for the treatment of chronic pain in adults. We conducted a systematic review on randomized controlled trials (RCTs) with meta-analyses using the Hartung-Knapp-Sidik-Jonkman method for random-effects models. Quality of evidence was assessed using the GRADE approach. Outcomes included pain, depression, quality of life, and analgesic use. Thirty-eight RCTs met inclusion criteria; seven reported on safety. We found low-quality evidence that mindfulness meditation is associated with a small decrease in pain compared with all types of controls in 30 RCTs. Statistically significant effects were also found for depression symptoms and quality of life. While mindfulness meditation improves pain and depression symptoms and quality of life, additional well-designed, rigorous, and large-scale RCTs are needed to decisively provide estimates of the efficacy of mindfulness meditation for chronic pain.
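    The random-effects machinery named in this abstract can be sketched as follows: a DerSimonian-Laird estimate of between-study variance, followed by the Hartung-Knapp-Sidik-Jonkman variance adjustment for the pooled effect (whose confidence interval then uses a t distribution with k - 1 degrees of freedom). The function name and any numbers in the test are illustrative, not the review's data.

```python
import math

def hksj_random_effects(effects, variances):
    """Pool study effects with DL tau^2 and the HKSJ variance adjustment.

    Returns (pooled effect, HKSJ standard error, tau^2). The HKSJ step
    rescales the pooled variance by the weighted residual spread, which
    generally widens intervals when studies are heterogeneous.
    """
    k = len(effects)
    w = [1.0 / v for v in variances]
    mu_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    Q = sum(wi * (yi - mu_fixed) ** 2 for wi, yi in zip(w, effects))
    C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / C)          # DerSimonian-Laird
    ws = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(ws, effects)) / sum(ws)
    q = sum(wi * (yi - mu) ** 2 for wi, yi in zip(ws, effects)) / (k - 1)
    se = math.sqrt(q / sum(ws))                 # HKSJ standard error
    return mu, se, tau2
```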

  19. Using Temporal Correlations and Full Distributions to Separate Intrinsic and Extrinsic Fluctuations in Biological Systems

    NASA Astrophysics Data System (ADS)

    Hilfinger, Andreas; Chen, Mark; Paulsson, Johan

    2012-12-01

    Studies of stochastic biological dynamics typically compare observed fluctuations to theoretically predicted variances, sometimes after separating the intrinsic randomness of the system from the enslaving influence of changing environments. But variances have been shown to discriminate surprisingly poorly between alternative mechanisms, while for other system properties no approaches exist that rigorously disentangle environmental influences from intrinsic effects. Here, we apply the theory of generalized random walks in random environments to derive exact rules for decomposing time series and higher statistics, rather than just variances. We show for which properties and for which classes of systems intrinsic fluctuations can be analyzed without accounting for extrinsic stochasticity and vice versa. We derive two independent experimental methods to measure the separate noise contributions and show how to use the additional information in temporal correlations to detect multiplicative effects in dynamical systems.
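    As background for the decomposition this abstract generalizes, the classic variance-based dual-reporter split can be sketched: intrinsic noise from the mismatch between two identically regulated reporters, extrinsic noise from their covariance. The paper itself moves beyond variances to full time series and higher statistics, so this is the baseline it improves upon, not the authors' method; the function name is illustrative.

```python
import statistics

def dual_reporter_decomposition(x, y):
    """Split fluctuations of two identical reporters x, y (paired samples).

    Intrinsic component: half the mean squared reporter difference.
    Extrinsic component: the reporter covariance (population form).
    """
    n = len(x)
    mx = statistics.fmean(x)
    my = statistics.fmean(y)
    intrinsic = sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * n)
    extrinsic = sum(xi * yi for xi, yi in zip(x, y)) / n - mx * my
    return intrinsic, extrinsic
```

    Two perfectly correlated reporters give zero intrinsic noise, with all variability assigned to the shared (extrinsic) environment.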

  20. Anthropic selection and the habitability of planets orbiting M and K dwarfs

    NASA Astrophysics Data System (ADS)

    Waltham, Dave

    2011-10-01

    The Earth may have untypical characteristics which were necessary preconditions for the emergence of life and, ultimately, intelligent observers. This paper presents a rigorous procedure for quantifying such "anthropic selection" effects by comparing Earth's properties to those of exoplanets. The hypothesis that there is anthropic selection for stellar mass (i.e. planets orbiting stars with masses within a particular range are more favourable for the emergence of observers) is then tested. The results rule out the expected strong selection for low mass stars which would result, all else being equal, if the typical timescale for the emergence of intelligent observers is very long. This indicates that the habitable zone of small stars may be less hospitable for intelligent life than the habitable zone of solar-mass stars. Additional planetary properties can also be analyzed, using the approach introduced here, once relatively complete and unbiased statistics are made available by current and planned exoplanet characterization projects.

  1. MUSiC - Model-independent search for deviations from Standard Model predictions in CMS

    NASA Astrophysics Data System (ADS)

    Pieta, Holger

    2010-02-01

    We present an approach for a model independent search in CMS. Systematically scanning the data for deviations from the standard model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias the analysis is furthermore sensitive to a wide range of models for new physics, including the uncounted number of models not-yet-thought-of. After sorting the events into classes defined by their particle content (leptons, photons, jets and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.

  2. A Genetically Informed Study of the Association Between Harsh Punishment and Offspring Behavioral Problems

    PubMed Central

    Lynch, Stacy K.; Turkheimer, Eric; D’Onofrio, Brian M.; Mendle, Jane; Emery, Robert E.; Slutske, Wendy S.; Martin, Nicholas G.

    2010-01-01

    Conclusions about the effects of harsh parenting on children have been limited by research designs that cannot control for genetic or shared environmental confounds. The present study used a sample of children of twins and a hierarchical linear modeling statistical approach to analyze the consequences of varying levels of punishment while controlling for many confounding influences. The sample of 887 twin pairs and 2,554 children came from the Australian Twin Registry. Although corporal punishment per se did not have significant associations with negative childhood outcomes, harsher forms of physical punishment did appear to have specific and significant effects. The observed association between harsh physical punishment and negative outcomes in children survived a relatively rigorous test of its causal status, thereby increasing the authors’ conviction that harsh physical punishment is a serious risk factor for children. PMID:16756394

  3. SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach

    PubMed Central

    Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang

    2017-01-01

    As genomic data are usually large in scale and highly sensitive, it is essential to enable analysis that is both efficient and secure, whereby the data owner can securely delegate both computation and storage to an untrusted public cloud. Counting queries over genotypes are a basic function for many downstream applications in biomedical research (e.g., computing allele frequencies, calculating chi-squared statistics, etc.). Previous solutions show promise for secure counting of outsourced data, but efficiency remains a major limitation for real-world applications. In this paper, we propose a novel hybrid solution that combines a rigorous theoretical model (homomorphic encryption) with the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up computation while preserving the privacy of both data owners and data users. Our results demonstrate efficiency using real data from the Personal Genome Project. PMID:29854245

  4. Universality for 1d Random Band Matrices: Sigma-Model Approximation

    NASA Astrophysics Data System (ADS)

    Shcherbina, Mariya; Shcherbina, Tatyana

    2018-02-01

    The paper continues the development of the rigorous supersymmetric transfer matrix approach to random band matrices started in (J Stat Phys 164:1233-1260, 2016; Commun Math Phys 351:1009-1044, 2017). We consider random Hermitian block band matrices consisting of W×W random Gaussian blocks (parametrized by j,k ∈ Λ = [1,n]^d ∩ Z^d) with fixed entry variance J_{jk} = δ_{j,k} W^{-1} + β Δ_{j,k} W^{-2}, β > 0, in each block. Taking the limit W → ∞ with fixed n and β, we derive the sigma-model approximation of the second correlation function similar to Efetov's. Then, considering the limit β, n → ∞, we prove that in dimension d = 1 the behaviour of the sigma-model approximation in the bulk of the spectrum, as β ≫ n, is determined by the classical Wigner-Dyson statistics.

  5. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
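    The propagation step described above can be sketched numerically as cov_f = J Σ Jᵀ, with the Jacobian J of the defining functional expression taken by finite differences. A minimal first-order sketch under a forward-difference assumption; it omits the paper's treatment of bias terms, correlated errors, and calibration-standard uncertainty, and the function name is illustrative.

```python
import numpy as np

def propagate_uncertainty(f, x, cov_x, eps=1e-6):
    """First-order propagation of a measurement covariance through f.

    Builds a forward-difference Jacobian J of f at x and returns
    J cov_x J^T, the covariance of the computed parameters.
    """
    x = np.asarray(x, dtype=float)
    fx = np.atleast_1d(f(x))
    J = np.empty((fx.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (np.atleast_1d(f(xp)) - fx) / eps
    return J @ np.asarray(cov_x, dtype=float) @ J.T
```

    For a linear expression the result is exact up to the finite-difference error: with unit input variances, f(v) = v0 + 2 v1 yields an output variance of 1 + 4 = 5.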

  6. Optical simulations of organic light-emitting diodes through a combination of rigorous electromagnetic solvers and Monte Carlo ray-tracing methods

    NASA Astrophysics Data System (ADS)

    Bahl, Mayank; Zhou, Gui-Rong; Heller, Evan; Cassarly, William; Jiang, Mingming; Scarmozzino, Rob; Gregory, G. Groot

    2014-09-01

    Over the last two decades there has been extensive research to improve the design of Organic Light Emitting Diodes (OLEDs) so as to enhance light extraction efficiency, improve beam shaping, and allow color tuning through techniques such as the use of patterned substrates, photonic crystal (PC) gratings, back reflectors, surface texture, and phosphor down-conversion. Computational simulation has been an important tool for examining these increasingly complex designs. It has provided insights for improving OLED performance as a result of its ability to explore limitations, predict solutions, and demonstrate theoretical results. Depending upon the focus of the design and the scale of the problem, simulations are carried out using rigorous electromagnetic (EM) wave optics based techniques, such as finite-difference time-domain (FDTD) and rigorous coupled wave analysis (RCWA), or through ray optics based techniques such as Monte Carlo ray-tracing. The former are typically used for modeling nanostructures on the OLED die, and the latter for modeling encapsulating structures, die placement, back-reflection, and phosphor down-conversion. This paper presents the use of a mixed-level simulation approach which unifies the use of EM wave-level and ray-level tools. This approach uses rigorous EM wave based tools to characterize the nanostructured die and generate both a Bidirectional Scattering Distribution Function (BSDF) and a far-field angular intensity distribution. These characteristics are then incorporated into the ray-tracing simulator to obtain the overall performance. Such a mixed-level approach allows for comprehensive modeling of the optical characteristics of OLEDs and can potentially lead to more accurate performance predictions than those from individual modeling tools alone.

  7. Theory! The Missing Link in Understanding the Performance of Neonate/Infant Home-Visiting Programs to Prevent Child Maltreatment: A Systematic Review

    PubMed Central

    Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim

    2012-01-01

    Context Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. Methods We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Findings Having a stated objective of reducing child maltreatment—a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change—considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Conclusions Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. 
For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. PMID:22428693

  8. Theory! The missing link in understanding the performance of neonate/infant home-visiting programs to prevent child maltreatment: a systematic review.

    PubMed

    Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim

    2012-03-01

    Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Having a stated objective of reducing child maltreatment-a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change-considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. 
For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. © 2012 Milbank Memorial Fund.

  9. A Bayesian nonparametric method for prediction in EST analysis

    PubMed Central

    Lijoi, Antonio; Mena, Ramsés H; Prünster, Igor

    2007-01-01

    Background Expressed sequence tags (ESTs) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; b) the number of new unique genes to be observed in a future sample; c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample. PMID:17868445
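    A minimal version of the prediction problem can be sketched under a one-parameter Dirichlet-process assumption (the paper works with a richer class of nonparametric priors): fit the concentration parameter to the observed number of distinct genes by bisection, then extrapolate the expected number of new genes in a future sample. All function names are illustrative.

```python
def expected_distinct(alpha, n):
    """E[number of distinct genes] in n reads under a Dirichlet process:
    sum_{i=0}^{n-1} alpha / (alpha + i)."""
    return sum(alpha / (alpha + i) for i in range(n))

def fit_alpha(n_reads, k_observed, lo=1e-6, hi=1e9, iters=200):
    """Bisection on the monotone map alpha -> E[K_n] so the model matches
    the observed count of distinct genes in the preliminary sample."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if expected_distinct(mid, n_reads) < k_observed:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def expected_new_genes(alpha, n_reads, m_future):
    """Expected number of previously unseen genes in the next m reads."""
    return sum(alpha / (alpha + i) for i in range(n_reads, n_reads + m_future))
```

    Dividing `expected_new_genes` by the future sample size gives a rough discovery rate, which is the quantity used to decide whether further sequencing of the library is worthwhile.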

  10. A new statistical framework to assess structural alignment quality using information compression

    PubMed Central

    Collier, James H.; Allison, Lloyd; Lesk, Arthur M.; Garcia de la Banda, Maria; Konagurthu, Arun S.

    2014-01-01

    Motivation: Progress in protein biology depends on the reliability of results from a handful of computational techniques, structural alignments being one. Recent reviews have highlighted substantial inconsistencies and differences between alignment results generated by the ever-growing stock of structural alignment programs. The lack of consensus on how the quality of structural alignments must be assessed has been identified as the main cause for the observed differences. Current methods assess structural alignment quality by constructing a scoring function that attempts to balance conflicting criteria, mainly alignment coverage and fidelity of structures under superposition. This traditional approach to measuring alignment quality, the subject of considerable literature, has failed to solve the problem. Further development along the same lines is unlikely to rectify the current deficiencies in the field. Results: This paper proposes a new statistical framework to assess structural alignment quality and significance based on lossless information compression. This is a radical departure from the traditional approach of formulating scoring functions. It links the structural alignment problem to the general class of statistical inductive inference problems, solved using the information-theoretic criterion of minimum message length. Based on this, we developed an efficient and reliable measure of structural alignment quality, I-value. The performance of I-value is demonstrated in comparison with a number of popular scoring functions, on a large collection of competing alignments. Our analysis shows that I-value provides a rigorous and reliable quantification of structural alignment quality, addressing a major gap in the field. Availability: http://lcb.infotech.monash.edu.au/I-value Contact: arun.konagurthu@monash.edu Supplementary information: Online supplementary data are available at http://lcb.infotech.monash.edu.au/I-value/suppl.html PMID:25161241

  11. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    PubMed Central

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  12. Photoconductivity enhancement and charge transport properties in ruthenium-containing block copolymer/carbon nanotube hybrids.

    PubMed

    Lo, Kin Cheung; Hau, King In; Chan, Wai Kin

    2018-04-05

    Functional polymer/carbon nanotube (CNT) hybrid materials can serve as a good model for light harvesting systems based on CNTs. This paper presents the synthesis of block copolymer/CNT hybrids and the characterization of their photocurrent responses by both experimental and computational approaches. A series of functional diblock copolymers was synthesized by reversible addition-fragmentation chain transfer polymerizations for the dispersion and functionalization of CNTs. The block copolymers contain photosensitizing ruthenium complexes and modified pyrene-based anchoring units. The photocurrent responses of the polymer/CNT hybrids were measured by photoconductive atomic force microscopy (PCAFM), and the experimental data were analyzed with rigorous statistical models. The difference in photocurrent response among different hybrids was correlated to the conformations of the hybrids, which were elucidated by molecular dynamics simulations, and to the electronic properties of the polymers. The photoresponse of the block copolymer/CNT hybrids can be enhanced by introducing an electron-accepting block between the photosensitizing block and the CNT. We have demonstrated that the application of a rigorous statistical methodology can unravel the charge transport properties of these hybrid materials and provide general guidelines for the design of molecular light harvesting systems.

  13. A New Approach to an Old Order.

    ERIC Educational Resources Information Center

    Rambhia, Sanjay

    2002-01-01

    Explains the difficulties middle school students face in algebra regarding the order of operations. Describes a more visual approach to teaching the order of operations so that students can better solve complex problems and be better prepared for the rigors of algebra. (YDS)

  14. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    PubMed

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data came from all empirical articles (n = 1,651) published between 2003 and 2007 in four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable.
Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. © Health Research and Educational Trust.
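The chi-square comparison of reporting rates used in this study can be sketched on a 2x2 table. The counts below are hypothetical stand-ins (chosen only to roughly match the reported 21.94 versus 47.07 percent rates), not the study's raw data:

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df) and p-value for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = erfc(sqrt(chi2 / 2.0))  # survival function of chi-square with 1 df
    return chi2, p

# Hypothetical counts of methodological components present vs. absent,
# for mixed methods articles (~22% present) and quantitative articles (~47%).
chi2, p = chi2_2x2(21, 79, 47, 53)
print(f"chi2(1) = {chi2:.2f}, p = {p:.2e}")
```

With counts in these proportions the test rejects equality of reporting rates, mirroring the study's significant quantitative-component result.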

  15. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable.
Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040

  16. Field significance of performance measures in the context of regional climate model evaluation. Part 2: precipitation

    NASA Astrophysics Data System (ADS)

    Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker

    2018-04-01

    A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as 'field' or 'global' significance. The block length for the local resampling tests is precisely determined to adequately account for the time series structure. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. As an example, the methodology is applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Daily precipitation climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. While the downscaled precipitation distributions are statistically indistinguishable from the observed ones in most regions in summer, the biases of some distribution characteristics are significant over large areas in winter. WRF-NOAH generates appropriate stationary fine-scale climate features in the daily precipitation field over regions of complex topography in both seasons and appropriate transient fine-scale features almost everywhere in summer. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations.
The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
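The two ingredients of this kind of evaluation, a block-resampling local test at each grid cell and a multiple-comparison step that controls the proportion of false local rejections, can be sketched as follows. This is a minimal illustration, not the paper's exact procedure (which determines the block length from the series structure and uses its own performance measures); the toy field, block length, and false discovery rate level are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def block_bootstrap_pvalue(x, y, block=10, n_boot=500):
    """Two-sample difference-in-means test with a moving-block bootstrap
    under the pooled null, preserving short-range autocorrelation."""
    obs = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    def resample(n):
        starts = rng.integers(0, len(pooled) - block,
                              size=int(np.ceil(n / block)))
        idx = (starts[:, None] + np.arange(block)).ravel()[:n]
        return pooled[idx]
    null = np.array([resample(len(x)).mean() - resample(len(y)).mean()
                     for _ in range(n_boot)])
    return (np.abs(null) >= abs(obs)).mean()

def bh_field_significance(pvals, alpha=0.1):
    """Benjamini-Hochberg step-up over all grid cells: a boolean mask of
    locally significant cells with the false discovery rate held at alpha."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    passed = p[order] <= alpha * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

# Toy field: 50 grid cells, the first 10 with a genuine model bias.
pvals = [block_bootstrap_pvalue(rng.normal(0.8 if c < 10 else 0.0, 1, 200),
                                rng.normal(0, 1, 200)) for c in range(50)]
print("locally significant cells:", bh_field_significance(pvals).sum())
```

Controlling the false discovery rate, rather than testing each cell at a fixed level, is what makes the resulting spatial pattern of rejections interpretable.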

  17. Advanced EUV mask and imaging modeling

    NASA Astrophysics Data System (ADS)

    Evanschitzky, Peter; Erdmann, Andreas

    2017-10-01

    The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.

  18. ON MODEL SELECTION STRATEGIES TO IDENTIFY GENES UNDERLYING BINARY TRAITS USING GENOME-WIDE ASSOCIATION DATA.

    PubMed

    Wu, Zheyang; Zhao, Hongyu

    2012-01-01

    For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies.
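A crude Monte Carlo illustration of the marginal-versus-exhaustive comparison follows. It is a simplified stand-in for the paper's analytical power calculations: the genetic model, sample size, and the use of 2x2 chi-square tests in place of logistic score statistics are all illustrative assumptions.

```python
import random
from math import erfc, sqrt

random.seed(1)

def chi2_2x2(a, b, c, d):
    """p-value of the Pearson chi-square test (1 df) on [[a, b], [c, d]]."""
    n = a + b + c + d
    if min(a + b, c + d, a + c, b + d) == 0:
        return 1.0
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return erfc(sqrt(chi2 / 2.0))

def simulate(n=150, maf=0.5):
    """One study: disease risk is raised only when BOTH variants are
    carried (an interaction model; marginal effects exist but are diluted)."""
    g1 = [int(random.random() < maf) for _ in range(n)]
    g2 = [int(random.random() < maf) for _ in range(n)]
    y = [int(random.random() < (0.5 if a and b else 0.2))
         for a, b in zip(g1, g2)]
    return g1, g2, y

def table(g, y):
    a = sum(1 for gi, yi in zip(g, y) if gi and yi)
    b = sum(1 for gi, yi in zip(g, y) if gi and not yi)
    c = sum(1 for gi, yi in zip(g, y) if not gi and yi)
    d = sum(1 for gi, yi in zip(g, y) if not gi and not yi)
    return a, b, c, d

hits_marginal = hits_pair = 0
reps, alpha = 300, 0.05
for _ in range(reps):
    g1, g2, y = simulate()
    p_single = min(chi2_2x2(*table(g1, y)), chi2_2x2(*table(g2, y)))
    both = [a and b for a, b in zip(g1, g2)]   # proxy for the pair search
    p_pair = chi2_2x2(*table(both, y))
    hits_marginal += p_single < alpha / 2      # Bonferroni over 2 markers
    hits_pair += p_pair < alpha
print(f"power: marginal = {hits_marginal/reps:.2f}, "
      f"pair = {hits_pair/reps:.2f}")
```

Under this interaction model the pair-based search detects the signal more often than single-marker scans, the kind of regime the paper's analytical framework characterizes exactly.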

  19. ON MODEL SELECTION STRATEGIES TO IDENTIFY GENES UNDERLYING BINARY TRAITS USING GENOME-WIDE ASSOCIATION DATA

    PubMed Central

    Wu, Zheyang; Zhao, Hongyu

    2013-01-01

    For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies. PMID:23956610

  20. Increasing URM Undergraduate Student Success through Assessment-Driven Interventions: A Multiyear Study Using Freshman-Level General Biology as a Model System

    PubMed Central

    Carmichael, Mary C.; St. Clair, Candace; Edwards, Andrea M.; Barrett, Peter; McFerrin, Harris; Davenport, Ian; Awad, Mohamed; Kundu, Anup; Ireland, Shubha Kale

    2016-01-01

    Xavier University of Louisiana leads the nation in awarding BS degrees in the biological sciences to African-American students. In this multiyear study with ∼5500 participants, data-driven interventions were adopted to improve student academic performance in a freshman-level general biology course. The three hour-long exams were common and administered concurrently to all students. New exam questions were developed using Bloom’s taxonomy, and exam results were analyzed statistically with validated assessment tools. All but the comprehensive final exam were returned to students for self-evaluation and remediation. Among other approaches, course rigor was monitored by using an identical set of 60 questions on the final exam across 10 semesters. Analysis of the identical sets of 60 final exam questions revealed that overall averages increased from 72.9% (2010) to 83.5% (2015). Regression analysis demonstrated a statistically significant correlation between high-risk students and their averages on the 60 questions. Additional analysis demonstrated statistically significant improvements for at least one letter grade from midterm to final and a 20% increase in the course pass rates over time, also for the high-risk population. These results support the hypothesis that our data-driven interventions and assessment techniques are successful in improving student retention, particularly for our academically at-risk students. PMID:27543637

  1. Worrying trends in econophysics

    NASA Astrophysics Data System (ADS)

    Gallegati, Mauro; Keen, Steve; Lux, Thomas; Ormerod, Paul

    2006-10-01

    Econophysics has already made a number of important empirical contributions to our understanding of the social and economic world. These fall mainly into the areas of finance and industrial economics, where in each case there is a large amount of reasonably well-defined data. More recently, Econophysics has also begun to tackle other areas of economics where data is much more sparse and much less reliable. In addition, econophysicists have attempted to apply the theoretical approach of statistical physics to try to understand empirical findings. Our concerns are fourfold. First, a lack of awareness of work that has been done within economics itself. Second, resistance to more rigorous and robust statistical methodology. Third, the belief that universal empirical regularities can be found in many areas of economic activity. Fourth, the theoretical models which are being used to explain empirical phenomena. The latter point is of particular concern. Essentially, the models are based upon models of statistical physics in which energy is conserved in exchange processes. There are examples in economics where the principle of conservation may be a reasonable approximation to reality, such as primitive hunter-gatherer societies. But in the industrialised capitalist economies, income is most definitely not conserved. The process of production and not exchange is responsible for this. Models which focus purely on exchange and not on production cannot by definition offer a realistic description of the generation of income in the capitalist, industrialised economies.

  2. diffHic: a Bioconductor package to detect differential genomic interactions in Hi-C data.

    PubMed

    Lun, Aaron T L; Smyth, Gordon K

    2015-08-19

    Chromatin conformation capture with high-throughput sequencing (Hi-C) is a technique that measures the in vivo intensity of interactions between all pairs of loci in the genome. Most conventional analyses of Hi-C data focus on the detection of statistically significant interactions. However, an alternative strategy involves identifying significant changes in the interaction intensity (i.e., differential interactions) between two or more biological conditions. This is more statistically rigorous and may provide more biologically relevant results. Here, we present the diffHic software package for the detection of differential interactions from Hi-C data. diffHic provides methods for read pair alignment and processing, counting into bin pairs, filtering out low-abundance events and normalization of trended or CNV-driven biases. It uses the statistical framework of the edgeR package to model biological variability and to test for significant differences between conditions. Several options for the visualization of results are also included. The use of diffHic is demonstrated with real Hi-C data sets. Performance against existing methods is also evaluated with simulated data. On real data, diffHic is able to successfully detect interactions with significant differences in intensity between biological conditions. It also compares favourably to existing software tools on simulated data sets. These results suggest that diffHic is a viable approach for differential analyses of Hi-C data.

  3. Detailed Spectral Analysis of the 260 ks XMM-Newton Data of 1E 1207.4-5209 and Significance of a 2.1 keV Absorption Feature

    NASA Astrophysics Data System (ADS)

    Mori, Kaya; Chonko, James C.; Hailey, Charles J.

    2005-10-01

    We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
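The core of the Monte Carlo calibration idea can be sketched briefly. This is deliberately much simpler than the paper's analysis: a free-constant continuum, a fixed line position, and known Gaussian noise are assumptions made to keep the example short, whereas the finite-statistics problems with the F-test arise precisely in the harder settings the paper treats.

```python
import numpy as np

rng = np.random.default_rng(11)

n_chan, sigma = 64, 1.0
x = np.linspace(0.0, 1.0, n_chan)
line = np.exp(-0.5 * ((x - 0.5) / 0.05) ** 2)  # candidate line profile

def delta_chi2(y):
    """chi^2 improvement from adding a free-amplitude line to a
    free-constant continuum (both fits are linear least squares)."""
    chi2_null = np.sum((y - y.mean()) ** 2) / sigma**2
    D = np.column_stack([np.ones(n_chan), line])
    coef, *_ = np.linalg.lstsq(D, y, rcond=None)
    chi2_alt = np.sum((y - D @ coef) ** 2) / sigma**2
    return chi2_null - chi2_alt

# Instead of quoting a textbook reference distribution, simulate the test
# statistic under the null model and read the threshold off the empirical
# distribution, as the Monte Carlo calibration approach prescribes.
null_stats = np.array([delta_chi2(rng.normal(0, sigma, n_chan))
                       for _ in range(2000)])
crit_99 = np.quantile(null_stats, 0.99)
print(f"simulated 99% critical value: {crit_99:.2f}")
```

In this benign linear case the simulated threshold agrees with the asymptotic chi-square result; the value of the simulation approach is that it remains valid when the asymptotic distribution does not.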

  4. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
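The propagate-posterior-uncertainty workflow described above can be sketched with a deliberately simple conjugate model. A linear surrogate with known noise stands in for the emulated density functional; the design matrix, prior width, and all numbers are illustrative assumptions, not the study's setup.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical calibration data: y = X @ theta + noise, where the linear
# map plays the role of the (expensive) physics model.
n, p = 30, 3
X = rng.normal(size=(n, p))
theta_true = np.array([1.0, -0.5, 0.25])
sigma = 0.1
y = X @ theta_true + rng.normal(scale=sigma, size=n)

# Conjugate Gaussian posterior (known noise, Gaussian prior N(0, tau^2 I)).
tau = 10.0
prec = X.T @ X / sigma**2 + np.eye(p) / tau**2
cov = np.linalg.inv(prec)
mean = cov @ (X.T @ y) / sigma**2

# Propagate parameter uncertainty to a predicted "observable" x_new @ theta.
x_new = np.array([0.5, 1.0, -1.0])
draws = rng.multivariate_normal(mean, cov, size=5000) @ x_new
print(f"prediction: {draws.mean():.3f} +/- {draws.std():.3f}")
```

The same pattern, posterior over parameters pushed through a prediction, is what yields theoretical error bars on masses, driplines, and barriers in the full analysis.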

  5. Peer Review Documents Related to the Evaluation of ...

    EPA Pesticide Factsheets

    BMDS is one of the Agency's premier tools for risk assessment; the validity and reliability of its statistical models are therefore of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS application and its models as they were developed and eventually released, documenting the rigorous review process undertaken to provide the best available science tools for statistical modeling.

  6. A criterion for establishing life limits. [for Space Shuttle Main Engine service]

    NASA Technical Reports Server (NTRS)

    Skopp, G. H.; Porter, A. A.

    1990-01-01

    The development of a rigorous statistical method that would utilize hardware-demonstrated reliability to evaluate hardware capability and provide ground rules for safe flight margin is discussed. A statistically based method using the Weibull/Weibayes cumulative distribution function is described, and its advantages and inadequacies are pointed out. Another, more advanced procedure, Single Flight Reliability (SFR), determines a life limit which ensures that the reliability of any single flight is never less than a stipulated value at a stipulated confidence level. Application of the SFR method is illustrated.
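A sketch of how a Weibayes-style single-flight reliability life limit can be computed is given below. The shape parameter, test experience, flight duration, and reliability target are all hypothetical, and the SFR procedure in the paper is more elaborate than this illustration:

```python
from math import exp, log

def weibayes_eta(times, beta, conf=0.90):
    """Weibayes characteristic life consistent, at confidence `conf`,
    with zero failures over the accumulated test times (assumed shape beta)."""
    total = sum(t ** beta for t in times)
    return (total / (-log(1.0 - conf))) ** (1.0 / beta)

def single_flight_reliability(t, dt, eta, beta):
    """Conditional reliability of one more flight of duration dt,
    given survival to accumulated life t: R(t + dt) / R(t)."""
    R = lambda u: exp(-((u / eta) ** beta))
    return R(t + dt) / R(t)

def life_limit(dt, eta, beta, r_min, step=1.0):
    """Largest accumulated life at which the NEXT flight still meets
    the stipulated single-flight reliability (simple forward search)."""
    t = 0.0
    while single_flight_reliability(t, dt, eta, beta) >= r_min:
        t += step
    return t - step

# Hypothetical fleet experience: ten units, 500 s hot-fire each, no failures.
eta = weibayes_eta([500.0] * 10, beta=3.0, conf=0.90)
limit = life_limit(dt=50.0, eta=eta, beta=3.0, r_min=0.99)
print(f"eta = {eta:.0f} s, life limit = {limit:.0f} s")
```

Because the conditional reliability R(t + dt)/R(t) falls as accumulated life grows (for shape beta > 1), the life limit is the point where the next flight would first dip below the stipulated value.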

  7. Nonparametric Residue Analysis of Dynamic PET Data With Application to Cerebral FDG Studies in Normals.

    PubMed

    O'Sullivan, Finbarr; Muzi, Mark; Spence, Alexander M; Mankoff, David M; O'Sullivan, Janet N; Fitzgerald, Niall; Newman, George C; Krohn, Kenneth A

    2009-06-01

    Kinetic analysis is used to extract metabolic information from dynamic positron emission tomography (PET) uptake data. The theory of indicator dilutions, developed in the seminal work of Meier and Zierler (1954), provides a probabilistic framework for representation of PET tracer uptake data in terms of a convolution between an arterial input function and a tissue residue. The residue is a scaled survival function associated with tracer residence in the tissue. Nonparametric inference for the residue, a deconvolution problem, provides a novel approach to kinetic analysis; critically, it is not reliant on specific compartmental modeling assumptions. A practical computational technique based on regularized cubic B-spline approximation of the residence time distribution is proposed. Nonparametric residue analysis allows formal statistical evaluation of specific parametric models to be considered. This analysis needs to properly account for the increased flexibility of the nonparametric estimator. The methodology is illustrated using data from a series of cerebral studies with PET and fluorodeoxyglucose (FDG) in normal subjects. Comparisons are made between key functionals of the residue (tracer flux, flow, etc.) resulting from a parametric analysis (the standard two-compartment model of Phelps et al. 1979) and a nonparametric analysis. Strong statistical evidence against the compartment model is found. Primarily these differences relate to the representation of the early temporal structure of the tracer residence, largely a function of the vascular supply network. There are convincing physiological arguments against the representations implied by the compartmental approach, but this is the first time that a rigorous statistical confirmation using PET data has been reported. The compartmental analysis produces suspect values for flow but, notably, the impact on the metabolic flux, though statistically significant, is limited to deviations on the order of 3%-4%.
The general advantage of the nonparametric residue analysis is the ability to provide a valid kinetic quantitation in the context of studies where there may be heterogeneity or other uncertainty about the accuracy of a compartmental model approximation of the tissue residue.
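The deconvolution at the heart of residue analysis can be sketched in discrete form. A plain ridge penalty replaces the regularized cubic B-spline machinery of the paper, and the input function, residue, grid, and noise level are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Tissue curve = (arterial input) convolved with (residue), uniform grid.
dt = 0.5
t = np.arange(0.0, 60.0, dt)
input_fn = t * np.exp(-t / 4.0)           # bolus-like input function
residue_true = 0.05 * np.exp(-t / 20.0)   # scaled survival function
tissue = dt * np.convolve(input_fn, residue_true)[: len(t)]
tissue += rng.normal(scale=0.002, size=len(t))  # measurement noise

# Convolution as a lower-triangular matrix: tissue = A @ residue.
A = dt * np.array([[input_fn[i - j] if i >= j else 0.0
                    for j in range(len(t))] for i in range(len(t))])

# Ridge-regularized least-squares deconvolution of the residue.
lam = 1e-2
residue_hat = np.linalg.solve(A.T @ A + lam * np.eye(len(t)), A.T @ tissue)

area_true = dt * residue_true.sum()
area_hat = dt * residue_hat.sum()
print(f"recovered residue area {area_hat:.3f} (true {area_true:.3f})")
```

Without the regularization term the triangular system amplifies measurement noise badly, which is why the nonparametric approach needs an explicit smoothness penalty rather than a direct matrix inversion.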

  8. 76 FR 17654 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-30

    ... OMB Review; Comment Request Title: Evaluation of Adolescent Pregnancy Prevention Approaches-- First... as part of the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA). PPA is a random assignment evaluation designed to result in rigorous evidence on effective ways to reduce teen pregnancy. The...

  9. Quasi-experimental study designs series-paper 6: risk of bias assessment.

    PubMed

    Waddington, Hugh; Aloe, Ariel M; Becker, Betsy Jane; Djimeu, Eric W; Hombrados, Jorge Garcia; Tugwell, Peter; Wells, George; Reeves, Barney

    2017-09-01

    Rigorous and transparent bias assessment is a core component of high-quality systematic reviews. We assess modifications to existing risk of bias approaches to incorporate rigorous quasi-experimental approaches with selection on unobservables. These are nonrandomized studies using design-based approaches to control for unobservable sources of confounding such as difference studies, instrumental variables, interrupted time series, natural experiments, and regression-discontinuity designs. We review existing risk of bias tools. Drawing on these tools, we present domains of bias and suggest directions for evaluation questions. The review suggests that existing risk of bias tools provide, to different degrees, incomplete transparent criteria to assess the validity of these designs. The paper then presents an approach to evaluating the internal validity of quasi-experiments with selection on unobservables. We conclude that tools for nonrandomized studies of interventions need to be further developed to incorporate evaluation questions for quasi-experiments with selection on unobservables. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Rigorous Mathematical Thinking Approach to Enhance Students’ Mathematical Creative and Critical Thinking Abilities

    NASA Astrophysics Data System (ADS)

    Hidayat, D.; Nurlaelah, E.; Dahlan, J. A.

    2017-09-01

    Mathematical creative thinking and critical thinking are two abilities that need to be developed in the learning of mathematics. Therefore, efforts need to be made in the design of learning that is capable of developing both abilities. The purpose of this research is to examine the mathematical creative and critical thinking abilities of students who received the rigorous mathematical thinking (RMT) approach and of students who received an expository approach. This research was a quasi-experiment with a pretest-posttest control group design. The population comprised all 11th-grade students in one senior high school in Bandung. The results showed that the achievement of mathematical creative and critical thinking abilities was better for students who received the RMT approach than for students who received the expository approach. The use of psychological tools and mediation, with the criteria of intentionality, reciprocity, and mediation of meaning, in RMT helps students develop the conditions for critical and creative processes. This achievement contributes to the development of integrated learning designs for students' critical and creative thinking processes.

  11. Integration of the Response Surface Methodology with the Compromise Decision Support Problem in Developing a General Robust Design Procedure

    NASA Technical Reports Server (NTRS)

    Chen, Wei; Tsui, Kwok-Leung; Allen, Janet K.; Mistree, Farrokh

    1994-01-01

    In this paper we introduce a comprehensive and rigorous robust design procedure to overcome some limitations of the current approaches. A comprehensive approach is general enough to model the two major types of robust design applications, namely, robust design associated with the minimization of the deviation of performance caused by the deviation of noise factors (uncontrollable parameters), and robust design due to the minimization of the deviation of performance caused by the deviation of control factors (design variables). We achieve mathematical rigor by using, as a foundation, principles from the design of experiments and optimization. Specifically, we integrate the Response Surface Method (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example. Our focus in this paper is on illustrating our approach rather than on the results per se.
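The response-surface side of this procedure can be sketched on a toy problem. The response function, design, noise level, and variance weight below are hypothetical, and the paper couples the fitted surface to a compromise Decision Support Problem rather than the simple weighted objective used here:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical expensive response: control factor x, noise factor z.
def response(x, z):
    return (x - 2.0) ** 2 + x * z + rng.normal(scale=0.01)

# Factorial-style design over (x, z); z sampled at +/- sigma_z levels.
sigma_z = 0.2
xs = np.linspace(0.0, 3.0, 7)
zs = np.array([-sigma_z, 0.0, sigma_z])
X, Z, Y = [], [], []
for xv in xs:
    for zv in zs:
        X.append(xv); Z.append(zv); Y.append(response(xv, zv))
X, Z, Y = map(np.array, (X, Z, Y))

# Quadratic response surface y ~ b0 + b1 x + b2 z + b3 x^2 + b4 z^2 + b5 xz.
D = np.column_stack([np.ones_like(X), X, Z, X**2, Z**2, X * Z])
b = np.linalg.lstsq(D, Y, rcond=None)[0]

# Robust objective: mean over z plus weighted variance transmitted from z.
grid = np.linspace(0.0, 3.0, 301)
mean = b[0] + b[1] * grid + b[3] * grid**2 + b[4] * sigma_z**2
var = (b[2] + b[5] * grid) ** 2 * sigma_z**2
w = 100.0
x_robust = grid[np.argmin(mean + w * var)]
x_nominal = grid[np.argmin(mean)]
print(f"nominal optimum x = {x_nominal:.2f}, robust optimum x = {x_robust:.2f}")
```

The fitted surface makes the variance transmitted from the noise factor explicit, so the robust optimum moves away from the nominal one, the central trade-off that the compromise DSP formalizes for expensive black-box models.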

  12. Systematic review of the quality of prognosis studies in systemic lupus erythematosus.

    PubMed

    Lim, Lily S H; Lee, Senq J; Feldman, Brian M; Gladman, Dafna D; Pullenayegum, Eleanor; Uleryk, Elizabeth; Silverman, Earl D

    2014-10-01

    Prognosis studies examine outcomes and/or seek to identify predictors or factors associated with outcomes. Many prognostic factors have been identified in systemic lupus erythematosus (SLE), but few have been consistently found across studies. We hypothesized that this is due to a lack of rigor of study designs. This study aimed to systematically assess the methodologic quality of prognosis studies in SLE. A search of prognosis studies in SLE was performed using MEDLINE and Embase, from January 1990 to June 2011. A representative sample of 150 articles was selected using a random number generator and assessed by 2 reviewers. Each study was assessed by a risk of bias tool according to 6 domains: study participation, study attrition, measurement of prognostic factors, measurement of outcomes, measurement/adjustment for confounders, and appropriateness of statistical analysis. Information about missing data was also collected. A cohort design was used in 71% of studies. High risk of bias was found in 65% of studies for confounders, 57% for study participation, 56% for attrition, 36% for statistical analyses, 20% for prognostic factors, and 18% for outcome. Missing covariate or outcome information was present in half of the studies. Only 6 studies discussed reasons for missing data and 2 imputed missing data. Lack of rigorous study design, especially in addressing confounding, study participation and attrition, and inadequately handled missing data, has limited the quality of prognosis studies in SLE. Future prognosis studies should be designed with consideration of these factors to improve methodologic rigor. Copyright © 2014 by the American College of Rheumatology.

  13. Realizing Scientific Methods for Cyber Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, Thomas E.; Manz, David O.; Edgar, Thomas W.

    There is little doubt among cyber security researchers about the lack of scientific rigor that underlies much of the literature. The issues are manifold and are well documented. Further complicating the problem is the lack of sufficient scientific methods to address these issues. Cyber security melds man and machine: we inherit the challenges of computer science, sociology, psychology, and many other fields, and create new ones where these fields interface. In this paper we detail a partial list of challenges imposed by rigorous science and survey how other sciences have tackled them, in the hope of applying a similar approach to cyber security science. This paper is by no means comprehensive: its purpose is to foster discussion in the community on how we can improve rigor in cyber security science.

  14. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef

    Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale *sequential* data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.

  15. Establishing Interventions via a Theory-Driven Single Case Design Research Cycle

    ERIC Educational Resources Information Center

    Kilgus, Stephen P.; Riley-Tillman, T. Chris; Kratochwill, Thomas R.

    2016-01-01

    Recent studies have suggested single case design (SCD) intervention research is subject to publication bias, wherein studies are more likely to be published if they possess large or statistically significant effects and use rigorous experimental methods. The nature of SCD and the purposes for which it might be used could suggest that large effects…

  16. Power of Statistical Tests Used to Address Nonresponse Error in the "Journal of Agricultural Education"

    ERIC Educational Resources Information Center

    Johnson, Donald M.; Shoulders, Catherine W.

    2017-01-01

    As members of a profession committed to the dissemination of rigorous research pertaining to agricultural education, authors publishing in the Journal of Agricultural Education (JAE) must seek methods to evaluate and, when necessary, improve their research methods. The purpose of this study was to describe how authors of manuscripts published in…

  17. Applications of satellite-derived disturbance information in support of sustainable forest management

    Treesearch

    Sean Healey; Warren Cohen; Gretchen Moisen

    2007-01-01

    The need for current information about the effects of fires, harvest, and storms is evident in many areas of sustainable forest management. While there are several potential sources of this information, each source has its limitations. Generally speaking, the statistical rigor associated with traditional forest sampling is an important asset in any monitoring effort....

  18. Statistical linearization for multi-input/multi-output nonlinearities

    NASA Technical Reports Server (NTRS)

    Lin, Ching-An; Cheng, Victor H. L.

    1991-01-01

    Formulas are derived for the computation of the random input-describing functions for MIMO nonlinearities; these straightforward and rigorous derivations are based on the optimal mean square linear approximation. The computations involve evaluations of multiple integrals. It is shown that, for certain classes of nonlinearities, multiple-integral evaluations are obviated and the computations are significantly simplified.
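    The optimal mean-square linear approximation underlying random-input describing functions can be sketched for the scalar (single-input) case; the saturation nonlinearity and all parameters below are illustrative, and this Monte Carlo estimate stands in for the (multiple-)integral evaluations the paper treats analytically.

```python
import math
import random

def sat(x, a=1.0):
    """Unit-slope saturation nonlinearity (an illustrative example, not from the paper)."""
    return max(-a, min(a, x))

def describing_gain(f, sigma=1.0, n=400_000, seed=0):
    """Monte Carlo estimate of the random-input describing function
    N = E[x f(x)] / E[x^2] for zero-mean Gaussian input x ~ N(0, sigma^2),
    i.e. the gain of the optimal mean-square linear approximation of f."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        num += x * f(x)
        den += x * x
    return num / den

# For saturation, Bussgang's theorem gives E[x f(x)] = sigma^2 E[f'(x)],
# so N = P(|x| < a) = erf(a / (sigma * sqrt(2))).
N_mc = describing_gain(sat, sigma=1.0)
N_exact = math.erf(1.0 / math.sqrt(2.0))
print(round(N_mc, 3), round(N_exact, 3))
```

    The closed-form check via Bussgang's theorem mirrors the paper's point that, for suitable nonlinearities, the integral evaluations simplify dramatically.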

  19. Slow off the Mark: Elementary School Teachers and the Crisis in STEM Education

    ERIC Educational Resources Information Center

    Epstein, Diana; Miller, Raegen T.

    2011-01-01

    Prospective teachers can typically obtain a license to teach elementary school without taking a rigorous college-level STEM class such as calculus, statistics, or chemistry, and without demonstrating a solid grasp of mathematics knowledge, scientific knowledge, or the nature of scientific inquiry. This is not a recipe for ensuring students have…

  20. Unperturbed Schelling Segregation in Two or Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2016-09-01

    Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation, but his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young in Individual strategy and social structure: an evolutionary theory of institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012; Barmpalias et al. in: 55th annual IEEE symposium on foundations of computer science, Philadelphia, 2014; J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (STOC 2012).
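    A minimal 2-D Schelling-style simulation conveys the flavour of the model (torus grid, relocation of unhappy agents to empty cells; the grid size, tolerance, and move rule are illustrative choices, not the exact dynamics analysed in the paper):

```python
import random

def neighbors(grid, i, j, n):
    """Yield the 8 Moore neighbours of cell (i, j) on a torus."""
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            yield grid[(i + di) % n][(j + dj) % n]

def unhappy(grid, i, j, n, tau):
    """An agent is unhappy if its same-type share of occupied neighbours is below tau."""
    occ = [v for v in neighbors(grid, i, j, n) if v != 0]
    if not occ:
        return False
    same = sum(1 for v in occ if v == grid[i][j])
    return same / len(occ) < tau

def mean_same_fraction(grid, n):
    """Average same-type neighbour fraction over agents: a crude segregation index."""
    tot, cnt = 0.0, 0
    for i in range(n):
        for j in range(n):
            if grid[i][j] == 0:
                continue
            occ = [v for v in neighbors(grid, i, j, n) if v != 0]
            if occ:
                tot += sum(1 for v in occ if v == grid[i][j]) / len(occ)
                cnt += 1
    return tot / cnt

def schelling(n=20, empty_frac=0.1, tau=0.5, steps=5000, seed=0):
    n_empty = int(round(n * n * empty_frac))
    n_agents = n * n - n_empty
    cells = [0] * n_empty + [1] * (n_agents // 2) + [2] * (n_agents - n_agents // 2)
    rng = random.Random(seed)
    rng.shuffle(cells)
    grid = [cells[i * n:(i + 1) * n] for i in range(n)]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if grid[i][j] != 0 and unhappy(grid, i, j, n, tau):
            while True:  # relocate the unhappy agent to a random empty cell
                p, q = rng.randrange(n), rng.randrange(n)
                if grid[p][q] == 0:
                    break
            grid[p][q], grid[i][j] = grid[i][j], 0
    return grid

grid = schelling()
print(round(mean_same_fraction(grid, 20), 2))  # segregation index after relaxation
```

    Even this unperturbed toy dynamics typically drives the same-type neighbour fraction well above the tolerance threshold, which is the segregation phenomenon the rigorous analyses address.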

  1. Approaching Cauchy's Theorem

    ERIC Educational Resources Information Center

    Garcia, Stephan Ramon; Ross, William T.

    2017-01-01

    We hope to initiate a discussion about various methods for introducing Cauchy's Theorem. Although Cauchy's Theorem is the fundamental theorem upon which complex analysis is based, there is no "standard approach." The appropriate choice depends upon the prerequisites for the course and the level of rigor intended. Common methods include…

  2. Verification of Compartmental Epidemiological Models using Metamorphic Testing, Model Checking and Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L

    Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.

  3. An Interaction-Based Approach to Enhancing Secondary School Instruction and Student Achievement

    ERIC Educational Resources Information Center

    Allen, Joseph; Pianta, Robert; Gregory, Anne; Mikami, Amori; Lun, Janetta

    2011-01-01

    Improving teaching quality is widely recognized as critical to addressing deficiencies in secondary school education, yet the field has struggled to identify rigorously evaluated teacher-development approaches that can produce reliable gains in student achievement. A randomized controlled trial of My Teaching Partner-Secondary--a Web-mediated…

  4. A Practical Guide to Regression Discontinuity

    ERIC Educational Resources Information Center

    Jacob, Robin; Zhu, Pei; Somers, Marie-Andrée; Bloom, Howard

    2012-01-01

    Regression discontinuity (RD) analysis is a rigorous nonexperimental approach that can be used to estimate program impacts in situations in which candidates are selected for treatment based on whether their value for a numeric rating exceeds a designated threshold or cut-point. Over the last two decades, the regression discontinuity approach has…
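    The sharp-RD estimator described above can be sketched as two local linear fits, one on each side of the cutoff, differenced at the cutoff. The simulated data, bandwidth, and true jump of 3.0 below are all hypothetical.

```python
import random

def linfit(xs, ys):
    """Ordinary least squares for y = a + b*x (simple linear regression)."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b = sxy / sxx
    return ybar - b * xbar, b   # intercept, slope

def rd_estimate(x, y, cutoff=0.0, h=0.5):
    """Sharp RD: fit lines separately within bandwidth h on each side of the
    cutoff and take the difference of their predictions at the cutoff."""
    left = [(xi, yi) for xi, yi in zip(x, y) if cutoff - h <= xi < cutoff]
    right = [(xi, yi) for xi, yi in zip(x, y) if cutoff <= xi <= cutoff + h]
    aL, bL = linfit([p[0] for p in left], [p[1] for p in left])
    aR, bR = linfit([p[0] for p in right], [p[1] for p in right])
    return (aR + bR * cutoff) - (aL + bL * cutoff)

# Simulated rating x; candidates with x >= 0 receive treatment with effect 3.0.
rng = random.Random(1)
x = [rng.uniform(-1, 1) for _ in range(4000)]
y = [2.0 + 1.5 * xi + (3.0 if xi >= 0 else 0.0) + rng.gauss(0, 0.3) for xi in x]
print(round(rd_estimate(x, y), 2))  # close to the true jump of 3.0
```

    In practice the bandwidth would be chosen by a data-driven rule and a kernel weighting applied; uniform weights within a fixed window keep the sketch minimal.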

  5. A Qualitative Approach to Enzyme Inhibition

    ERIC Educational Resources Information Center

    Waldrop, Grover L.

    2009-01-01

    Most general biochemistry textbooks present enzyme inhibition by showing how the basic Michaelis-Menten parameters K[subscript m] and V[subscript max] are affected mathematically by a particular type of inhibitor. This approach, while mathematically rigorous, does not lend itself to understanding how inhibition patterns are used to determine the…

  6. Treatment Controversies in Autism

    ERIC Educational Resources Information Center

    Schreibman, Laura

    2008-01-01

    With the increasing numbers of children who are being diagnosed with an autism spectrum disorder there are a wide variety of treatment approaches being marketed to vulnerable parents who are desperate to help their child. However, many of these approaches have not been rigorously evaluated and can lead to false hopes, unreasonable fears, or…

  7. Three Views of Systems Theories and Their Implications for Sustainability Education

    ERIC Educational Resources Information Center

    Porter, Terry; Cordoba, Jose

    2009-01-01

    Worldwide, there is an emerging interest in sustainability and sustainability education. A popular and promising approach is the use of systems thinking. However, the systems approach to sustainability has neither been clearly defined nor has its practical application followed any systematic rigor, resulting in confounded and underspecified…

  8. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    PubMed

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces.

  9. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems

    PubMed Central

    2011-01-01

    Background Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces. PMID:22369688

  10. Narrative Practice and the Signs of Safety Approach: Engaging Adolescents in Building Rigorous Safety Plans

    ERIC Educational Resources Information Center

    Gibson, Matthew

    2014-01-01

    The Signs of Safety approach to child protection has been gaining prominence around the world and this approach has developed through learning from good practice. Generally, examples of good practice are derived from adults who pose a risk to children, while this paper outlines an example of good practice that engages an adolescent in building a…

  11. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    PubMed

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research, requiring researchers and consumers to address issues unique to MM, such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  12. Staying Clear of the Dragons.

    PubMed

    Elf, Johan

    2016-04-27

    A new, game-changing approach makes it possible to rigorously disprove models without making assumptions about the unknown parts of the biological system. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Using qualitative mixed methods to study small health care organizations while maximising trustworthiness and authenticity.

    PubMed

    Phillips, Christine B; Dwan, Kathryn; Hepworth, Julie; Pearce, Christopher; Hall, Sally

    2014-11-19

    The primary health care sector delivers the majority of health care in western countries through small, community-based organizations. However, research into these healthcare organizations is limited by the time constraints and pressure facing them, and the concern by staff that research is peripheral to their work. We developed Q-RARA (Qualitative Rapid Appraisal, Rigorous Analysis) to study small, primary health care organizations in a way that is efficient, acceptable to participants and methodologically rigorous. Q-RARA comprises a site visit, semi-structured interviews, structured and unstructured observations, photographs, floor plans, and social scanning data. Data were collected over the course of one day per site and the qualitative analysis was integrated and iterative. We found Q-RARA to be acceptable to participants and effective in collecting data on organizational function in multiple sites without disrupting the practice, while maintaining a balance between speed and trustworthiness. The Q-RARA approach is capable of providing a richly textured, rigorous understanding of the processes of the primary care practice while also allowing researchers to develop an organizational perspective. For these reasons the approach is recommended for use in small-scale organizations both within and outside the primary health care sector.

  14. On the total bandwidth for the rational Harper's equation

    NASA Astrophysics Data System (ADS)

    Helffer, Bernard; Kerdelhué, Phillippe

    1995-10-01

    In recent years several contributions have been made concerning the total bandwidth of the spectrum of the Harper operator. In particular an interesting conjecture has been proposed by Thouless, who also gives strong convincing arguments for the proof in special cases. On the other hand, in the study of the Cantor structure of the spectrum, B. Helffer and J. Sjöstrand have justified a heuristic semiclassical approach proposed by M. Wilkinson. The aim of this article is to explain how one can use the first step of this approach to give a rigorous semiclassical proof of the Thouless formula in some of the simplest cases. We shall also indicate how one can hope, with more effort, to prove rigorously recent results of Last and Wilkinson on the same conjecture.

  15. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  16. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  17. Statistical ecology comes of age

    PubMed Central

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  18. Comparing the Rigor of Compressed Format Courses to Their Regular Semester Counterparts

    ERIC Educational Resources Information Center

    Lutes, Lyndell; Davies, Randall

    2013-01-01

    This study compared workloads of undergraduate courses taught in 16-week and 8-week sessions. A statistically significant difference in workload was found between the two. Based on survey data from approximately 29,000 students, on average students spent about 17 minutes more per credit per week on 16-week courses than on similar 8-week courses.…

  19. Statistical tests and measures for the presence and influence of digit preference

    Treesearch

    Jay Beaman; Grenier Michel

    1998-01-01

    Digit preference, a tendency to favor certain numbers, has often been described as the heaping or rounding of responses to numbers ending in zero or five. Number preference, NP, has been a topic in the social science literature for some years. However, until recently concepts were not specified rigorously enough to allow, for example, the estimation of...

  20. A new assessment of the alleged link between element 115 and element 117 decay chains

    NASA Astrophysics Data System (ADS)

    Forsberg, U.; Rudolph, D.; Fahlander, C.; Golubev, P.; Sarmiento, L. G.; Åberg, S.; Block, M.; Düllmann, Ch. E.; Heßberger, F. P.; Kratz, J. V.; Yakushev, A.

    2016-09-01

    A novel rigorous statistical treatment is applied to available data (May 9, 2016) from search and spectroscopy experiments on the elements with atomic numbers Z = 115 and Z = 117. The present analysis implies that the hitherto proposed cross-reaction link between α-decay chains associated with the isotopes ²⁹³117 and ²⁸⁹115 is highly improbable.

  1. Uses of Multivariate Analytical Techniques in Online and Blended Business Education: An Assessment of Current Practice and Recommendations for Future Research

    ERIC Educational Resources Information Center

    Arbaugh, J. B.; Hwang, Alvin

    2013-01-01

    Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…

  2. Statistical rigor in LiDAR-assisted estimation of aboveground forest biomass

    Treesearch

    Timothy G. Gregoire; Erik Næsset; Ronald E. McRoberts; Göran Ståhl; Hans Andersen; Terje Gobakken; Liviu Ene; Ross Nelson

    2016-01-01

    For many decades remotely sensed data have been used as a source of auxiliary information when conducting regional or national surveys of forest resources. In the past decade, airborne scanning LiDAR (Light Detection and Ranging) has emerged as a promising tool for sample surveys aimed at improving estimation of aboveground forest biomass. This technology is now...

  3. The Relationship between the Rigor of a State's Proficiency Standard and Student Achievement in the State

    ERIC Educational Resources Information Center

    Stoneberg, Bert D.

    2015-01-01

    The National Center of Education Statistics conducted a mapping study that equated the percentage proficient or above on each state's NCLB reading and mathematics tests in grades 4 and 8 to the NAEP scale. Each "NAEP equivalent score" was labeled according to NAEP's achievement levels and used to compare state proficiency standards and…

  4. 40 CFR Appendix 6 to Subpart A of... - Reverse Phase Extraction (RPE) Method for Detection of Oil Contamination in Non-Aqueous Drilling...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude....1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...

  5. 40 CFR Appendix 6 to Subpart A of... - Reverse Phase Extraction (RPE) Method for Detection of Oil Contamination in Non-Aqueous Drilling...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude....1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...

  6. 40 CFR Appendix 6 to Subpart A of... - Reverse Phase Extraction (RPE) Method for Detection of Oil Contamination in Non-Aqueous Drilling...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude....1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...

  7. 40 CFR Appendix 6 to Subpart A of... - Reverse Phase Extraction (RPE) Method for Detection of Oil Contamination in Non-Aqueous Drilling...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... oil contamination in drilling fluids. 1.4 This method has been designed to show positive contamination....1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...

  8. Bayesian Inference: with ecological applications

    USGS Publications Warehouse

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management, and environmental studies, as well as students in advanced undergraduate statistics. It opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
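    The simplest worked instance of the Bayesian inference such a text introduces is the conjugate Beta-Binomial update (the prior and data below are illustrative numbers, not drawn from the book):

```python
def posterior(k, n, a=1.0, b=1.0):
    """Beta(a, b) prior combined with k successes in n Bernoulli trials
    yields a Beta(a + k, b + n - k) posterior (conjugate update)."""
    return a + k, b + n - k

def posterior_mean(k, n, a=1.0, b=1.0):
    """Posterior mean of the success probability."""
    pa, pb = posterior(k, n, a, b)
    return pa / (pa + pb)

# e.g. 7 detections in 10 survey visits under a uniform Beta(1, 1) prior:
print(posterior(7, 10), round(posterior_mean(7, 10), 3))  # prints (8.0, 4.0) 0.667
```

    Hierarchical models generalize this by placing further priors on a and b; the conjugate case is the one step that stays fully analytic.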

  9. Academic Rigor and Economic Value: GED[R] and High School Students' Perceptions and Misperceptions of the GED[R] vs. the High School Diploma

    ERIC Educational Resources Information Center

    Horne, Lela M.; Rachal, John R.; Shelley, Kyna

    2012-01-01

    A mixed methods framework utilized quantitative and qualitative data to determine whether statistically significant differences existed between high school and GED[R] student perceptions of credential value. An exploratory factor analysis (n=326) extracted four factors and then a MANOVA procedure was performed with a stratified quota sample…

  10. On the probability density function and characteristic function moments of image steganalysis in the log prediction error wavelet subband

    NASA Astrophysics Data System (ADS)

    Bao, Zhenkun; Li, Xiaolong; Luo, Xiangyang

    2017-01-01

    Extracting informative statistical features is the most essential technical issue in steganalysis. Among the various steganalysis methods, probability density function (PDF) and characteristic function (CF) moments are two important types of features, owing to their excellent ability to distinguish cover images from stego ones. The two types of features are quite similar in definition; the only difference is that PDF moments are computed in the spatial domain, while CF moments are computed in the Fourier-transformed domain. The comparison between PDF and CF moments is therefore an interesting question in steganalysis. Several theoretical results have been derived, and CF moments have been proved better than PDF moments in some cases. However, in the log prediction error wavelet subband of the wavelet decomposition, experiments show the opposite result, which until now has lacked a rigorous explanation. To resolve this, a comparison based on rigorous proof is presented: the first-order PDF moment is shown to be better than the CF moment, while the second-order CF moment is better than the PDF moment. This result opens a theoretical discussion of steganalysis and of the search for suitable statistical features.

  11. Translation efficiency of heterologous proteins is significantly affected by the genetic context of RBS sequences in engineered cyanobacterium Synechocystis sp. PCC 6803.

    PubMed

    Thiel, Kati; Mulaku, Edita; Dandapani, Hariharan; Nagy, Csaba; Aro, Eva-Mari; Kallio, Pauli

    2018-03-02

    Photosynthetic cyanobacteria have been studied as potential host organisms for direct solar-driven production of different carbon-based chemicals from CO2 and water, as part of the development of sustainable future biotechnological applications. The engineering approaches, however, are still limited by the lack of comprehensive information on the most optimal expression strategies and validated species-specific genetic elements, which are essential for increasing the intricacy, predictability and efficiency of the systems. This study focused on the systematic evaluation of the key translational control elements, ribosome binding sites (RBS), in the cyanobacterial host Synechocystis sp. PCC 6803, with the objective of expanding the palette of tools for more rigorous engineering approaches. An expression system was established for the comparison of 13 selected RBS sequences in Synechocystis, using several alternative reporter proteins (sYFP2, codon-optimized GFPmut3 and ethylene forming enzyme) as quantitative indicators of the relative translation efficiencies. The set-up was shown to yield highly reproducible expression patterns in independent analytical series with low variation between biological replicates, thus allowing statistical comparison of the activities of the different RBSs in vivo. While the RBSs covered a relatively broad overall expression level range, the downstream gene sequence was demonstrated in a rigorous manner to have a clear impact on the resulting translational profiles. This was expected to reflect interfering sequence-specific mRNA-level interaction between the RBS and the coding region, yet correlation between potential secondary structure formation and observed translation levels could not be resolved with existing in silico prediction tools. The study expands our current understanding of the potential and limitations associated with the regulation of protein expression at the translational level in engineered cyanobacteria. The acquired information can be used for selecting appropriate RBSs for optimizing over-expression constructs or multicistronic pathways in Synechocystis, while underlining the complications in predicting the activity due to gene-specific interactions which may reduce the translational efficiency for a given RBS-gene combination. Ultimately, the findings emphasize the need for additional characterized insulator sequence elements to decouple the interaction between the RBS and the coding region for future engineering approaches.

  12. Educational Technology: A Theoretical Discussion

    ERIC Educational Resources Information Center

    Andrews, Barbara; Hakken, David

    1977-01-01

    Views educational technology in relation to the pattern of technological change, argues that the new technology must be rigorously evaluated, and suggests it is best understood as a business approach to education. (DD)

  13. PARAGON: A Systematic, Integrated Approach to Aerosol Observation and Modeling

    NASA Technical Reports Server (NTRS)

    Diner, David J.; Kahn, Ralph A.; Braverman, Amy J.; Davies, Roger; Martonchik, John V.; Menzies, Robert T.; Ackerman, Thomas P.; Seinfeld, John H.; Anderson, Theodore L.; Charlson, Robert J.; hide

    2004-01-01

    Aerosols are generated and transformed by myriad processes operating across many spatial and temporal scales. Evaluation of climate models and their sensitivity to changes, such as in greenhouse gas abundances, requires quantifying natural and anthropogenic aerosol forcings and accounting for other critical factors, such as cloud feedbacks. High accuracy is required to provide sufficient sensitivity to perturbations, separate anthropogenic from natural influences, and develop confidence in inputs used to support policy decisions. Although many relevant data sources exist, the aerosol research community does not currently have the means to combine these diverse inputs into an integrated data set for maximum scientific benefit. Bridging observational gaps, adapting to evolving measurements, and establishing rigorous protocols for evaluating models are necessary, while simultaneously maintaining consistent, well-understood accuracies. The Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) concept represents a systematic, integrated approach to global aerosol characterization, bringing together modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies to provide the machinery necessary for achieving a comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the Earth system. We outline a framework for integrating and interpreting observations and models and establishing an accurate, consistent and cohesive long-term data record.

  14. Intrinsic random functions for mitigation of atmospheric effects in terrestrial radar interferometry

    NASA Astrophysics Data System (ADS)

    Butt, Jemil; Wieser, Andreas; Conzett, Stefan

    2017-06-01

    The benefits of terrestrial radar interferometry (TRI) for deformation monitoring are restricted by the influence of changing meteorological conditions contaminating the potentially highly precise measurements with spurious deformations. This is especially the case when the measurement setup includes long distances between instrument and objects of interest and the topography affecting atmospheric refraction is complex. These situations are typically encountered with geo-monitoring in mountainous regions, e.g. with glaciers, landslides or volcanoes. We propose and explain an approach for the mitigation of atmospheric influences based on the theory of intrinsic random functions of order k (IRF-k), generalizing existing approaches based on ordinary least squares estimation of trend functions. This class of random functions retains convenient computational properties allowing for rigorous statistical inference while still permitting the modeling of stochastic spatial phenomena which are non-stationary in mean and variance. We explore the correspondence between the properties of the IRF-k and the properties of the measurement process. In an exemplary case study, we find that our method reduces the time needed to obtain reliable estimates of glacial movements from 12 h down to 0.5 h compared to simple temporal averaging procedures.
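    The IRF-k machinery itself is beyond a short snippet, but the simpler baseline the authors generalize, ordinary least squares estimation of a trend function, is easy to sketch. The quadratic trend, noise level and variable names below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0.0, 10.0, 200)                 # e.g. range to target, km
# Simulated interferometric phase: smooth atmospheric trend plus noise
atmos_trend = 0.3 + 0.12 * x - 0.008 * x**2
phase = atmos_trend + rng.normal(0, 0.05, x.size)

coeffs = np.polyfit(x, phase, deg=2)            # OLS fit of a quadratic trend
corrected = phase - np.polyval(coeffs, x)       # detrended residual phase
```

The residual retains only the noise and any genuine deformation signal; the IRF-k approach replaces this fixed trend family with a stochastic model that also handles non-stationary variance.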

  15. Filtering Meteoroid Flights Using Multiple Unscented Kalman Filters

    NASA Astrophysics Data System (ADS)

    Sansom, E. K.; Bland, P. A.; Rutten, M. G.; Paxman, J.; Towner, M. C.

    2016-11-01

    Estimator algorithms are immensely versatile and powerful tools that can be applied to any problem where a dynamic system can be modeled by a set of equations and where observations are available. A well designed estimator enables system states to be optimally predicted and errors to be rigorously quantified. Unscented Kalman filters (UKFs) and interactive multiple models can be found in methods from satellite tracking to self-driving cars. The luminous trajectory of the Bunburra Rockhole fireball was observed by the Desert Fireball Network in mid-2007. The recorded data set is used in this paper to examine the application of these two techniques as a viable approach to characterizing fireball dynamics. The nonlinear, single-body system of equations, used to model meteoroid entry through the atmosphere, is challenged by gross fragmentation events that may occur. The incorporation of the UKF within an interactive multiple model smoother provides a likely solution for when fragmentation events may occur as well as providing a statistical analysis of the state uncertainties. In addition to these benefits, another advantage of this approach is its automatability for use within an image processing pipeline to facilitate large fireball data analyses and meteorite recoveries.
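    A full unscented Kalman filter is too long to show here, but the predict/update recursion that the UKF generalizes to nonlinear dynamics can be sketched with a minimal linear Kalman filter for a 1-D constant-velocity target. All parameters and the simulated data are illustrative assumptions, not the fireball entry model used in the paper:

```python
import numpy as np

def kalman_filter(zs, dt=1.0, q=1e-3, r=0.25):
    """Linear Kalman filter for a 1-D constant-velocity model.

    State x = [position, velocity]; zs are noisy position measurements.
    Returns the filtered position estimates."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
    H = np.array([[1.0, 0.0]])                   # we observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],    # process noise (white accel.)
                      [dt**2 / 2, dt]])
    R = np.array([[r]])                          # measurement noise variance
    x = np.array([[zs[0]], [0.0]])
    P = np.eye(2)
    out = []
    for z in zs:
        # Predict step: propagate state and covariance through the dynamics
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: blend the prediction with the new measurement
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0, 0])
    return np.array(out)

rng = np.random.default_rng(0)
true_pos = 0.5 * np.arange(50)                   # target moving at 0.5 units/step
zs = true_pos + rng.normal(0, 0.5, 50)           # noisy observations
est = kalman_filter(zs)
```

The UKF replaces the linear F and H with sigma-point propagation through nonlinear dynamics, which is what the single-body entry equations require.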

  16. Virtual reality and cognitive rehabilitation: a review of current outcome research.

    PubMed

    Larson, Eric B; Feigon, Maia; Gagliardo, Pablo; Dvorkin, Assaf Y

    2014-01-01

    Recent advancement in the technology of virtual reality (VR) has allowed improved applications for cognitive rehabilitation. The aim of this review is to facilitate comparisons of therapeutic efficacy of different VR interventions. A systematic approach for the review of VR cognitive rehabilitation outcome research addressed the nature of each sample, treatment apparatus, experimental treatment protocol, control treatment protocol, statistical analysis and results. Using this approach, studies that provide valid evidence of efficacy of VR applications are summarized. Applications that have not yet undergone controlled outcome study but which have promise are introduced. Seventeen studies conducted over the past eight years are reviewed. The few randomized controlled trials that have been completed show that some applications are effective in treating cognitive deficits in people with neurological diagnoses although further study is needed. Innovations requiring further study include the use of enriched virtual environments that provide haptic sensory input in addition to visual and auditory inputs and the use of commercially available gaming systems to provide tele-rehabilitation services. Recommendations are offered to improve efficacy of rehabilitation, to improve scientific rigor of rehabilitation research and to broaden access to the evidence-based treatments that this research has identified.

  17. Orders of Magnitude Extension of the Effective Dynamic Range of TDC-Based TOFMS Data Through Maximum Likelihood Estimation

    NASA Astrophysics Data System (ADS)

    Ipsen, Andreas; Ebbels, Timothy M. D.

    2014-10-01

    In a recent article, we derived a probability distribution that was shown to closely approximate that of the data produced by liquid chromatography time-of-flight mass spectrometry (LC/TOFMS) instruments employing time-to-digital converters (TDCs) as part of their detection system. The approach of formulating detailed and highly accurate mathematical models of LC/MS data via probability distributions that are parameterized by quantities of analytical interest does not appear to have been fully explored before. However, we believe it could lead to a statistically rigorous framework for addressing many of the data analytical problems that arise in LC/MS studies. In this article, we present new procedures for correcting for TDC saturation using such an approach and demonstrate that there is potential for significant improvements in the effective dynamic range of TDC-based mass spectrometers, which could make them much more competitive with the alternative analog-to-digital converters (ADCs). The degree of improvement depends on our ability to generate mass and chromatographic peaks that conform to known mathematical functions and our ability to accurately describe the state of the detector dead time—tasks that may be best addressed through engineering efforts.
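    The correction procedures in the article are more elaborate, but the core idea, maximum likelihood inversion of a saturation model, can be illustrated with the textbook single-hit TDC model, in which each extraction records at most one ion. This is a simplified sketch, not the authors' method:

```python
import math

def tdc_rate_mle(hits, extractions):
    """ML estimate of the mean ion rate per extraction for a single-hit TDC.

    Each extraction registers at most one count, so under Poisson arrivals
    the hit probability is p = 1 - exp(-lam); inverting the observed hit
    fraction k/N gives the MLE lam_hat = -ln(1 - k/N)."""
    p = hits / extractions
    if p >= 1.0:
        raise ValueError("detector fully saturated; rate not identifiable")
    return -math.log(1.0 - p)

# At half occupancy the naive count rate (0.5) underestimates the true rate:
lam = tdc_rate_mle(5000, 10000)   # ln 2, about 39% higher than the raw 0.5
```

At low occupancy the correction is negligible, which is why TDC saturation only matters for intense peaks.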

  18. Wetting of heterogeneous substrates. A classical density-functional-theory approach

    NASA Astrophysics Data System (ADS)

    Yatsyshin, Peter; Parry, Andrew O.; Rascón, Carlos; Duran-Olivencia, Miguel A.; Kalliadasis, Serafim

    2017-11-01

    Wetting is the nucleation of a third phase (liquid) at the interface between two other phases (solid and gas). In many experimentally accessible cases of wetting, the interplay between the substrate structure, and the fluid-fluid and fluid-substrate intermolecular interactions leads to the appearance of a whole ``zoo'' of exciting interface phase transitions, associated with the formation of nano-droplets/bubbles, and thin films. Practical applications of wetting at small scales are numerous and include the design of lab-on-a-chip devices and superhydrophobic surfaces. In this talk, we will use a fully microscopic approach to explore the phase space of a planar wall, decorated with patches of different hydrophobicity, and demonstrate the highly non-trivial behaviour of the liquid-gas interface near the substrate. We will present fluid density profiles, adsorption isotherms and wetting phase diagrams. Our analysis is based on a formulation of statistical mechanics, commonly known as classical density-functional theory. It provides a computationally-friendly and rigorous framework, suitable for probing small-scale physics of classical fluids and other soft-matter systems. EPSRC Grants No. EP/L027186, EP/K503733; ERC Advanced Grant No. 247031.

  19. Misclassification of acute respiratory distress syndrome after traumatic injury: The cost of less rigorous approaches.

    PubMed

    Hendrickson, Carolyn M; Dobbins, Sarah; Redick, Brittney J; Greenberg, Molly D; Calfee, Carolyn S; Cohen, Mitchell Jay

    2015-09-01

    Adherence to rigorous research protocols for identifying adult respiratory distress syndrome (ARDS) after trauma is variable. To examine how misclassification of ARDS may bias observational studies in trauma populations, we evaluated the agreement of two methods for adjudicating ARDS after trauma: the current gold standard, direct review of chest radiographs, and review of dictated radiology reports, a commonly used alternative. This nested cohort study included 123 mechanically ventilated patients between 2005 and 2008, with at least one PaO2/FIO2 less than 300 within the first 8 days of admission. Two blinded physician investigators adjudicated ARDS by two methods. The investigators directly reviewed all chest radiographs to evaluate for bilateral infiltrates. Several months later, blinded to their previous assessments, they adjudicated ARDS using a standardized rubric to classify radiology reports. A κ statistic was calculated. Regression analyses quantified the associations of established risk factors and important clinical outcomes with ARDS as determined by each method, as well as with hypoxemia as a surrogate marker. The κ was 0.47 for the observed agreement between ARDS adjudicated by direct review of chest radiographs and ARDS adjudicated by review of radiology reports. Both the magnitude and direction of bias in the estimates of association between ARDS and established risk factors, as well as clinical outcomes, varied by method of adjudication. Classification of ARDS by review of dictated radiology reports had only moderate agreement with the current gold standard, ARDS adjudicated by direct review of chest radiographs. While the misclassification of ARDS had varied effects on the estimates of associations with established risk factors, it tended to weaken the association of ARDS with important clinical outcomes. A standardized approach to ARDS adjudication after trauma by direct review of chest radiographs will minimize misclassification bias in future observational studies. Diagnostic study, level II.
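    The κ statistic reported above measures agreement between the two adjudication methods beyond what chance alone would produce. A minimal computation, with hypothetical ratings rather than the study's data, looks like this:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels of the same cases:
    kappa = (p_observed - p_chance) / (1 - p_chance)."""
    assert len(a) == len(b)
    n = len(a)
    cats = sorted(set(a) | set(b))
    po = sum(x == y for x, y in zip(a, b)) / n                    # observed agreement
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)   # chance agreement
    return (po - pe) / (1 - pe)

# Two adjudicators labelling 10 hypothetical cases as ARDS (1) or not (0):
rater1 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
rater2 = [1, 0, 0, 0, 1, 0, 1, 1, 0, 1]
kappa = cohens_kappa(rater1, rater2)   # 0.6: "moderate to substantial" agreement
```

A κ of 0.47, as found in the study, falls in the range conventionally read as only moderate agreement.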

  20. Statistical Models for Averaging of the Pump–Probe Traces: Example of Denoising in Terahertz Time-Domain Spectroscopy

    NASA Astrophysics Data System (ADS)

    Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem

    2018-05-01

    In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.
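    The paper's statistical models go beyond the simple average, but the baseline they improve on, and the roughly √N noise reduction it already provides, is easy to demonstrate on a simulated pulse. The pulse shape, noise level and trace count below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(-5, 5, 400)
pulse = np.exp(-t**2) * np.cos(6 * t)            # idealized THz pulse shape

N = 100
traces = pulse + rng.normal(0, 0.2, (N, 400))    # N noisy acquisitions
avg = traces.mean(axis=0)                        # simple statistical average

rms_single = np.sqrt(np.mean((traces[0] - pulse) ** 2))
rms_avg = np.sqrt(np.mean((avg - pulse) ** 2))
# Averaging N i.i.d. traces shrinks the noise RMS by roughly sqrt(N)
```

The advanced models in the paper additionally account for effects such as amplitude and delay fluctuations between traces, which a plain mean cannot remove.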

  1. Back to the Future: The Expanding Communities Curriculum in Geography Education

    ERIC Educational Resources Information Center

    Halvorsen, Anne-Lise

    2009-01-01

    This article traces the history of the expanding communities approach, the leading organizational structure for elementary social studies education since the 1930s. Since its introduction into the curriculum, educators have argued about the approach's effectiveness and suitability. Critics claim it lacks intellectual rigor and is redundant in that…

  2. Special Teaching for Special Children? Pedagogies for Inclusion. Inclusive Education

    ERIC Educational Resources Information Center

    Lewis, Ann, Ed.; Norwich, Brahm, Ed.

    2004-01-01

    Some special needs groups (for example dyslexia) have argued strongly for the need for particular specialist approaches. In contrast, many proponents of inclusion have argued that "good teaching is good teaching for all" and that all children benefit from similar approaches. Both positions fail to scrutinise this issue rigorously and…

  3. Building a Higher Education Pipeline: Sociocultural and Critical Approaches to 'Internationalisation' in Teaching and Research

    ERIC Educational Resources Information Center

    Chang, Benjamin

    2017-01-01

    This article discusses the use of critical and sociocultural approaches to more dynamically 'internationalise' higher education in the Hong Kong Special Administrative Region of China. The article explores the integration of critical pedagogy and sociocultural learning theory in developing more engaging and rigorous education practices for…

  4. Rigorous investigation of the reduced density matrix for the ideal Bose gas in harmonic traps by a loop-gas-like approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beau, Mathieu, E-mail: mbeau@stp.dias.ie; Savoie, Baptiste, E-mail: baptiste.savoie@gmail.com

    2014-05-15

    In this paper, we rigorously investigate the reduced density matrix (RDM) associated with the ideal Bose gas in harmonic traps. We present a method based on a sum-decomposition of the RDM allowing us to treat not only the isotropic trap, but also general anisotropic traps. When focusing on the isotropic trap, the method is analogous to the loop-gas approach developed by Mullin [“The loop-gas approach to Bose-Einstein condensation for trapped particles,” Am. J. Phys. 68(2), 120 (2000)]. Turning to the case of anisotropic traps, we examine the RDM for some anisotropic trap models corresponding to quasi-1D and quasi-2D regimes. For such models, we bring out an additional contribution in the local density of particles which arises from the mesoscopic loops. The close connection with the occurrence of generalized Bose-Einstein condensation is discussed. Our loop-gas-like approach provides relevant information which can help guide numerical investigations of highly anisotropic systems based on the Path Integral Monte Carlo method.

  5. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE PAGES

    McDonnell, J. D.; Schunck, N.; Higdon, D.; ...

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  6. Taking a statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
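    The kriging estimator at the heart of this methodology can be sketched in a few lines. Below is a minimal 1-D ordinary kriging prediction with an exponential covariance; the sample locations, values and range parameter are made-up illustrations, not site data:

```python
import numpy as np

def ordinary_kriging(xs, vals, x0, corr_range=2.0):
    """Ordinary kriging prediction at x0 from 1-D samples (xs, vals),
    using an exponential covariance C(h) = exp(-|h| / range)."""
    cov = lambda h: np.exp(-np.abs(h) / corr_range)
    n = len(xs)
    # Kriging system: sample-to-sample covariances, plus the unbiasedness
    # constraint sum(w) = 1 enforced via a Lagrange multiplier.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(xs[:, None] - xs[None, :])
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(xs - x0)                 # sample-to-target covariances
    w = np.linalg.solve(A, b)[:n]        # kriging weights
    return float(w @ vals)

xs = np.array([0.0, 1.0, 3.0, 4.5])      # sample locations
vals = np.array([1.2, 0.9, 2.1, 1.7])    # e.g. contaminant concentrations
est = ordinary_kriging(xs, vals, 2.0)    # estimate at an unsampled location
```

Kriging is an exact interpolator: predicting at a sampled location returns the sample value, and nearby samples receive larger weights, which is how the spatial structure enters the estimate.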

  7. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonnell, J. D.; Schunck, N.; Higdon, D.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  8. TREATMENT SWITCHING: STATISTICAL AND DECISION-MAKING CHALLENGES AND APPROACHES.

    PubMed

    Latimer, Nicholas R; Henshall, Chris; Siebert, Uwe; Bell, Helen

    2016-01-01

    Treatment switching refers to the situation in a randomized controlled trial where patients switch from their randomly assigned treatment onto an alternative. Often, switching is from the control group onto the experimental treatment. In this instance, a standard intention-to-treat analysis does not identify the true comparative effectiveness of the treatments under investigation. We aim to describe statistical methods for adjusting for treatment switching in a comprehensible way for nonstatisticians, and to summarize views on these methods expressed by stakeholders at the 2014 Adelaide International Workshop on Treatment Switching in Clinical Trials. We describe three statistical methods used to adjust for treatment switching: marginal structural models, two-stage adjustment, and rank preserving structural failure time models. We draw upon discussion heard at the Adelaide International Workshop to explore the views of stakeholders on the acceptability of these methods. Stakeholders noted that adjustment methods are based on assumptions, the validity of which may often be questionable. There was disagreement on the acceptability of adjustment methods, but consensus that when these are used, they should be justified rigorously. The utility of adjustment methods depends upon the decision being made and the processes used by the decision-maker. Treatment switching makes estimating the true comparative effect of a new treatment challenging. However, many decision-makers have reservations with adjustment methods. These, and how they affect the utility of adjustment methods, require further exploration. Further technical work is required to develop adjustment methods to meet real world needs, to enhance their acceptability to decision-makers.
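    Of the three methods described, the rank-preserving structural failure time model is the easiest to sketch: it maps each patient's observed time onto a counterfactual untreated time via an acceleration factor exp(ψ). The numbers below are our hypothetical illustration; real use requires g-estimation of ψ and re-censoring, which are omitted here:

```python
import math

def counterfactual_time(t_off, t_on, psi):
    """RPSFTM counterfactual untreated time: U = T_off + exp(psi) * T_on.

    exp(-psi) is the treatment's acceleration factor; psi < 0 means the
    treatment extends survival, so a switcher's counterfactual untreated
    time is shorter than the time actually observed."""
    return t_off + math.exp(psi) * t_on

# Hypothetical control-arm patient who switched onto the experimental
# treatment after 12 months and then lived a further 18 months:
u = counterfactual_time(12.0, 18.0, psi=-0.5)   # counterfactual < 30 months
```

The key assumption, as the stakeholders noted, is a common treatment effect: the same ψ is taken to apply whenever treatment is received, which may often be questionable in practice.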

  9. Determinants of the Rigor of State Protection Policies for Persons With Dementia in Assisted Living.

    PubMed

    Nattinger, Matthew C; Kaskie, Brian

    2017-01-01

    Continued growth in the number of individuals with dementia residing in assisted living (AL) facilities raises concerns about their safety and protection. However, unlike federally regulated nursing facilities, AL facilities are state-regulated and there is a high degree of variation among policies designed to protect persons with dementia. Despite the important role these protection policies have in shaping the quality of life of persons with dementia residing in AL facilities, little is known about their formation. In this research, we examined the adoption of AL protection policies pertaining to staffing, the physical environment, and the use of chemical restraints. For each protection policy type, we modeled policy rigor using an innovative point-in-time approach, incorporating variables associated with state contextual, institutional, political, and external factors. We found that the rate of state AL protection policy adoptions remained steady over the study period, with staffing policies becoming less rigorous over time. Variables reflecting institutional policy making, including legislative professionalism and bureaucratic oversight, were associated with the rigor of state AL dementia protection policies. As we continue to evaluate the mechanisms contributing to the rigor of AL protection policies, it seems that organized advocacy efforts might expand their role in educating state policy makers about the importance of protecting persons with dementia residing in AL facilities and moving to advance appropriate policies.

  10. Dinosaurs in decline tens of millions of years before their final extinction

    PubMed Central

    Sakamoto, Manabu; Benton, Michael J.; Venditti, Chris

    2016-01-01

    Whether dinosaurs were in a long-term decline or whether they were reigning strong right up to their final disappearance at the Cretaceous–Paleogene (K-Pg) mass extinction event 66 Mya has been debated for decades with no clear resolution. The dispute has continued unresolved because of a lack of statistical rigor and appropriate evolutionary framework. Here, for the first time to our knowledge, we apply a Bayesian phylogenetic approach to model the evolutionary dynamics of speciation and extinction through time in Mesozoic dinosaurs, properly taking account of previously ignored statistical violations. We find overwhelming support for a long-term decline across all dinosaurs and within all three dinosaurian subclades (Ornithischia, Sauropodomorpha, and Theropoda), where speciation rate slowed down through time and was ultimately exceeded by extinction rate tens of millions of years before the K-Pg boundary. The only exceptions to this general pattern are the morphologically specialized herbivores, the Hadrosauriformes and Ceratopsidae, which show rapid species proliferations throughout the Late Cretaceous instead. Our results highlight that, despite some heterogeneity in speciation dynamics, dinosaurs showed a marked reduction in their ability to replace extinct species with new ones, making them vulnerable to extinction and unable to respond quickly to and recover from the final catastrophic event. PMID:27092007

  11. Dinosaurs in decline tens of millions of years before their final extinction.

    PubMed

    Sakamoto, Manabu; Benton, Michael J; Venditti, Chris

    2016-05-03

    Whether dinosaurs were in a long-term decline or whether they were reigning strong right up to their final disappearance at the Cretaceous-Paleogene (K-Pg) mass extinction event 66 Mya has been debated for decades with no clear resolution. The dispute has continued unresolved because of a lack of statistical rigor and appropriate evolutionary framework. Here, for the first time to our knowledge, we apply a Bayesian phylogenetic approach to model the evolutionary dynamics of speciation and extinction through time in Mesozoic dinosaurs, properly taking account of previously ignored statistical violations. We find overwhelming support for a long-term decline across all dinosaurs and within all three dinosaurian subclades (Ornithischia, Sauropodomorpha, and Theropoda), where speciation rate slowed down through time and was ultimately exceeded by extinction rate tens of millions of years before the K-Pg boundary. The only exceptions to this general pattern are the morphologically specialized herbivores, the Hadrosauriformes and Ceratopsidae, which show rapid species proliferations throughout the Late Cretaceous instead. Our results highlight that, despite some heterogeneity in speciation dynamics, dinosaurs showed a marked reduction in their ability to replace extinct species with new ones, making them vulnerable to extinction and unable to respond quickly to and recover from the final catastrophic event.

  12. Obesity interventions in African American faith-based organizations: a systematic review.

    PubMed

    Lancaster, K J; Carter-Edwards, L; Grilo, S; Shen, C; Schoenthaler, A M

    2014-10-01

    African Americans, especially women, have higher obesity rates than the general US population. Because of the importance of faith to many African Americans, faith-based organizations (FBOs) may be effective venues for delivering health messages and promoting adoption of healthy behaviours. This article systematically reviews interventions targeting weight and related behaviours in faith settings. We searched literature published through July 2012 for interventions in FBOs targeting weight loss, diet and/or physical activity (PA) in African Americans. Of 27 relevant articles identified, 12 were randomized controlled trials; seven of these reported a statistically significant change in an outcome. Four of the five quasi-experimental and single-group design studies reported a statistically significant outcome. All 10 pilot studies reported improvement in at least one outcome, but most did not have a comparison group. Overall, 70% of interventions reported success in reducing weight, 60% reported increased fruit and vegetable intake and 38% reported increased PA. These results suggest that interventions in African American FBOs can successfully improve weight and related behaviours. However, not all of the findings about the success of certain approaches were as expected. This review identifies gaps in knowledge and recommends more rigorous studies be conducted to strengthen the comparative methodology and evidence. © 2014 World Obesity.

  13. GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis.

    PubMed

    Zheng, Qi; Wang, Xiu-Jie

    2008-07-01

    Gene Ontology (GO) analysis has become a commonly used approach for functional studies of large-scale genomic or transcriptomic data. Although many software tools offer GO-related analysis functions, new tools are still needed to meet the requirements of data generated by newly developed technologies or for advanced analysis purposes. Here, we present a Gene Ontology Enrichment Analysis Software Toolkit (GOEAST), an easy-to-use web-based toolkit that identifies statistically overrepresented GO terms within given gene sets. Compared with available GO analysis tools, GOEAST has the following improved features: (i) GOEAST displays enriched GO terms in graphical format according to their relationships in the hierarchical tree of each GO category (biological process, molecular function and cellular component), and therefore provides a better understanding of the correlations among enriched GO terms; (ii) GOEAST supports analysis for data from various sources (probe or probe set IDs of Affymetrix, Illumina, Agilent or customized microarrays, as well as different gene identifiers) and multiple species (about 60 prokaryote and eukaryote species); (iii) one unique feature of GOEAST is that it allows cross-comparison of the GO enrichment status of multiple experiments to identify functional correlations among them. GOEAST also provides rigorous statistical tests to enhance the reliability of analysis results. GOEAST is freely accessible at http://omicslab.genetics.ac.cn/GOEAST/
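
    The abstract does not state which statistical tests GOEAST applies; a common basis for GO-term overrepresentation analysis is the hypergeometric test, sketched below in plain Python. The gene counts are made up for illustration.

```python
from math import comb

def hypergeom_enrichment_p(k, n, K, N):
    """P-value that at least k of n selected genes are annotated to a GO term,
    given K genes annotated to that term in a background of N genes
    (hypergeometric upper tail)."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(n, K) + 1)
    ) / comb(N, n)

# Toy example: 8 of 50 selected genes carry a term annotated to 100 of 10,000 genes.
p = hypergeom_enrichment_p(8, 50, 100, 10_000)
```

    In practice one would also correct the resulting p-values for multiple testing across all GO terms (e.g., Benjamini-Hochberg), since thousands of terms are tested at once.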

  14. Dinosaurs in decline tens of millions of years before their final extinction

    NASA Astrophysics Data System (ADS)

    Sakamoto, Manabu; Benton, Michael J.

    2016-05-01

    Whether dinosaurs were in a long-term decline or whether they were reigning strong right up to their final disappearance at the Cretaceous-Paleogene (K-Pg) mass extinction event 66 Mya has been debated for decades with no clear resolution. The dispute has continued unresolved because of a lack of statistical rigor and appropriate evolutionary framework. Here, for the first time to our knowledge, we apply a Bayesian phylogenetic approach to model the evolutionary dynamics of speciation and extinction through time in Mesozoic dinosaurs, properly taking account of previously ignored statistical violations. We find overwhelming support for a long-term decline across all dinosaurs and within all three dinosaurian subclades (Ornithischia, Sauropodomorpha, and Theropoda), where speciation rate slowed down through time and was ultimately exceeded by extinction rate tens of millions of years before the K-Pg boundary. The only exceptions to this general pattern are the morphologically specialized herbivores, the Hadrosauriformes and Ceratopsidae, which show rapid species proliferations throughout the Late Cretaceous instead. Our results highlight that, despite some heterogeneity in speciation dynamics, dinosaurs showed a marked reduction in their ability to replace extinct species with new ones, making them vulnerable to extinction and unable to respond quickly to and recover from the final catastrophic event.

  15. Robustness of movement models: can models bridge the gap between temporal scales of data sets and behavioural processes?

    PubMed

    Schlägel, Ulrike E; Lewis, Mark A

    2016-12-01

    Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when resolution of data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
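
    A toy version of the robustness question posed above: for a simple Gaussian random walk (an assumption, far simpler than the paper's resource selection models), coarsening the temporal resolution by a factor k scales the step variance by exactly k, so the model is robust to subsampling in the sense described.

```python
import random

random.seed(7)
fine = [0.0]
for _ in range(10_000):                      # simple Gaussian random walk, step sd = 1
    fine.append(fine[-1] + random.gauss(0, 1))

def step_var(path, k):
    """Variance of displacements when the path is observed only every k-th point."""
    steps = [path[i + k] - path[i] for i in range(0, len(path) - k, k)]
    m = sum(steps) / len(steps)
    return sum((s - m) ** 2 for s in steps) / (len(steps) - 1)

# Step variance at the native resolution vs a 5x coarser one.
v1, v5 = step_var(fine, 1), step_var(fine, 5)
```

    For correlated or biased walks this simple rescaling no longer holds exactly, which is the situation the paper's notion of approximate robustness addresses.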

  16. Comparative audit of clinical research in pediatric neurology.

    PubMed

    Al-Futaisi, Amna; Shevell, Michael

    2004-11-01

    Clinical research involves direct observation or data collection on human subjects. This study was conducted to evaluate the profile of pediatric neurology clinical research over a decade. Trends in pediatric neurology clinical research were documented through a systematic comparative review of articles published in selected journals. Eleven journals (five pediatric neurology, three general neurology, three general pediatrics) were systematically reviewed for articles involving a majority of human subjects less than 18 years of age for the years 1990 and 2000. Three hundred thirty-five clinical research articles in pediatric neurology were identified in the 11 journals for 1990 and 398 for 2000, a 19% increase. A statistically significant increase in analytic design (21.8% vs 39.5%; P = .01), statistical support (6% vs 16.6%; P < .0001), and multidisciplinary team (69.9% vs 87%; P = .003) was observed. In terms of specific study design, a significant decline in case reports (34.3% vs 10.3%; P < .0001) and an increase in case-control studies (11.3% vs 22.9%; P = .02) were evident over the 10-year interval. This comparative audit revealed that there has been a discernible change in the methodology profile of clinical research in child neurology over a decade. These trends suggest a more rigorous approach to study design and investigation in this field.
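
    The P values above compare proportions between the 1990 and 2000 samples; a standard test for such 2x2 comparisons is Pearson's chi-square. The sketch below reconstructs approximate counts of case reports from the reported percentages (34.3% of 335 vs 10.3% of 398), so the statistic is illustrative rather than the authors' exact figure.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Approximate counts: case reports vs other designs in 1990 (n=335) and 2000 (n=398),
# reconstructed from the reported 34.3% and 10.3%.
chi2 = chi_square_2x2(115, 220, 41, 357)
```

    A statistic above 3.84 (the 5% critical value at 1 degree of freedom) corresponds to P < .05, consistent with the P < .0001 reported for the decline in case reports.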

  17. Host and parasite morphology influence congruence between host and parasite phylogenies.

    PubMed

    Sweet, Andrew D; Bush, Sarah E; Gustafsson, Daniel R; Allen, Julie M; DiBlasi, Emily; Skeen, Heather R; Weckstein, Jason D; Johnson, Kevin P

    2018-03-23

    Comparisons of host and parasite phylogenies often show varying degrees of phylogenetic congruence. However, few studies have rigorously explored the factors driving this variation. Multiple factors such as host or parasite morphology may govern the degree of phylogenetic congruence. An ideal analysis for understanding the factors correlated with congruence would focus on a diverse host-parasite system for increased variation and statistical power. In this study, we focused on the Brueelia-complex, a diverse and widespread group of feather lice that primarily parasitise songbirds. We generated a molecular phylogeny of the lice and compared this tree with a phylogeny of their avian hosts. We also tested for the contribution of each host-parasite association to the overall congruence. The two trees overall were significantly congruent, but the contribution of individual associations to this congruence varied. To understand this variation, we developed a novel approach to test whether host, parasite or biogeographic factors were statistically associated with patterns of congruence. Both host plumage dimorphism and parasite ecomorphology were associated with patterns of congruence, whereas host body size, other plumage traits and biogeography were not. Our results lay the framework for future studies to further elucidate how these factors influence the process of host-parasite coevolution. Copyright © 2018 Australian Society for Parasitology. Published by Elsevier Ltd. All rights reserved.

  18. All That Glitters: A Glimpse into the Future of Cancer Screening

    Cancer.gov

    Developing new screening approaches and rigorously establishing their validity is challenging. Researchers are actively searching for new screening tests that improve the benefits of screening while limiting the harms.

  19. Testability of evolutionary game dynamics based on experimental economics data

    NASA Astrophysics Data System (ADS)

    Wang, Yijia; Chen, Xiaojie; Wang, Zhijian

    2017-11-01

    Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
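
    A minimal illustration of one of the measurement variables described above: the mean angular momentum of a trajectory about its centroid, whose persistent sign signals cyclic motion in the strategy space. The circular trajectory below is synthetic, standing in for an observed RPS social-state path.

```python
from math import cos, sin, pi

def mean_angular_momentum(xs, ys):
    """Mean cross product of position (about the trajectory centroid) and
    finite-difference velocity; a persistent sign indicates cyclic motion."""
    cx = sum(xs) / len(xs)
    cy = sum(ys) / len(ys)
    L = [
        (xs[t] - cx) * (ys[t + 1] - ys[t]) - (ys[t] - cy) * (xs[t + 1] - xs[t])
        for t in range(len(xs) - 1)
    ]
    return sum(L) / len(L)

# Synthetic counter-clockwise cycle: positive mean angular momentum.
ts = [2 * pi * t / 100 for t in range(101)]
L_ccw = mean_angular_momentum([cos(t) for t in ts], [sin(t) for t in ts])
```

    Comparing such empirical patterns with the patterns a candidate dynamics model predicts is the goodness-of-fit idea the paper develops.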

  20. Statistical issues in the design, conduct and analysis of two large safety studies.

    PubMed

    Gaffney, Michael

    2016-10-01

    The emergence, post approval, of serious medical events, which may be associated with the use of a particular drug or class of drugs, is an important public health and regulatory issue. The best method to address this issue is through a large, rigorously designed safety study. Therefore, it is important to elucidate the statistical issues involved in these large safety studies. Two such studies are PRECISION and EAGLES. PRECISION is the primary focus of this article. PRECISION is a non-inferiority design with a clinically relevant non-inferiority margin. Statistical issues in the design, conduct and analysis of PRECISION are discussed. Quantitative and clinical aspects of the selection of the composite primary endpoint, the determination and role of the non-inferiority margin in a large safety study and the intent-to-treat and modified intent-to-treat analyses in a non-inferiority safety study are shown. Protocol changes that were necessary during the conduct of PRECISION are discussed from a statistical perspective. Issues regarding the complex analysis and interpretation of the results of PRECISION are outlined. EAGLES is presented as a large, rigorously designed safety study when a non-inferiority margin was not able to be determined by a strong clinical/scientific method. In general, when a non-inferiority margin is not able to be determined, the width of the 95% confidence interval is a way to size the study and to assess the cost-benefit of relative trial size. A non-inferiority margin, when able to be determined by a strong scientific method, should be included in a large safety study. Although these studies could not be called "pragmatic," they are examples of best real-world designs to address safety and regulatory concerns. © The Author(s) 2016.
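
    The non-inferiority logic described above can be sketched with entirely hypothetical event counts and a hypothetical margin of 1.5 on the risk ratio; PRECISION's actual endpoint, margin, and time-to-event analysis are more involved.

```python
from math import log, exp, sqrt

def risk_ratio_ci(e1, n1, e0, n0, z=1.96):
    """Risk ratio of events e1/n1 (test drug) vs e0/n0 (comparator) with a 95% CI
    from the usual log-normal approximation."""
    rr = (e1 / n1) / (e0 / n0)
    se = sqrt(1 / e1 - 1 / n1 + 1 / e0 - 1 / n0)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Hypothetical safety-event counts; hypothetical non-inferiority margin of 1.5.
MARGIN = 1.5
rr, lo, hi = risk_ratio_ci(100, 12_000, 90, 12_000)
non_inferior = hi < MARGIN   # non-inferiority: CI upper bound below the margin
```

    This also shows the sizing point made in the abstract: with no margin available, the expected width of this interval (driven by event counts) is what determines how large the study must be.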

  1. Exploring Gender-Specific Trends in Underage Drinking across Adolescent Age Groups and Measures of Drinking: Is Girls' Drinking Catching up with Boys'?

    ERIC Educational Resources Information Center

    Zhong, Hua; Schwartz, Jennifer

    2010-01-01

    Underage drinking is among the most serious of public health problems facing adolescents in the United States. Recent concerns have centered on young women, reflected in media reports and arrest statistics on their increasing problematic alcohol use. This study rigorously examined whether girls' alcohol use rose by applying time series methods to…

  2. Decomposition of the Inequality of Income Distribution by Income Types—Application for Romania

    NASA Astrophysics Data System (ADS)

    Andrei, Tudorel; Oancea, Bogdan; Richmond, Peter; Dhesi, Gurjeet; Herteliu, Claudiu

    2017-09-01

    This paper identifies the salient factors that characterize the inequality income distribution for Romania. Data analysis is rigorously carried out using sophisticated techniques borrowed from classical statistics (Theil). Decomposition of the inequalities measured by the Theil index is also performed. This study relies on an exhaustive (11.1 million records for 2014) data-set for total personal gross income of Romanian citizens.
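
    The Theil index decomposition used above can be sketched directly: with incomes partitioned into groups (by income type in the paper; the groups below are hypothetical), total inequality splits exactly into within-group and between-group terms.

```python
from math import log

def theil_t(x):
    """Theil T inequality index of a list of positive incomes."""
    mu = sum(x) / len(x)
    return sum((xi / mu) * log(xi / mu) for xi in x) / len(x)

def theil_decomposition(groups):
    """Decompose Theil T over groups: total = within + between.
    `groups` is a list of income lists."""
    all_x = [xi for g in groups for xi in g]
    n, mu = len(all_x), sum(all_x) / len(all_x)
    within = between = 0.0
    for g in groups:
        mg = sum(g) / len(g)
        share = len(g) * mg / (n * mu)      # group's share of total income
        within += share * theil_t(g)
        between += share * log(mg / mu)
    return within, between

groups = [[10, 12, 14], [30, 35, 40, 45]]   # hypothetical income groups
w, b = theil_decomposition(groups)
total = theil_t([xi for g in groups for xi in g])
```

    The exact additivity of the decomposition (total = within + between) is what makes the Theil index attractive for attributing inequality to income types.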

  3. Not so Fast My Friend: The Rush to R and the Need for Rigorous Evaluation of Data Analysis and Software in Education

    ERIC Educational Resources Information Center

    Harwell, Michael

    2014-01-01

    Commercial data analysis software has been a fixture of quantitative analyses in education for more than three decades. Despite its apparent widespread use there is no formal evidence cataloging what software is used in educational research and educational statistics classes, by whom and for what purpose, and whether some programs should be…

  4. The Uphill Battle of Performing Education Scholarship: Barriers Educators and Education Researchers Face.

    PubMed

    Jordan, Jaime; Coates, Wendy C; Clarke, Samuel; Runde, Daniel; Fowlkes, Emilie; Kurth, Jaqueline; Yarris, Lalena

    2018-05-01

    Educators and education researchers report that their scholarship is limited by lack of time, funding, mentorship, expertise, and reward. This study aims to evaluate these groups' perceptions regarding barriers to scholarship and potential strategies for success. Core emergency medicine (EM) educators and education researchers completed an online survey consisting of multiple-choice, 10-point Likert scale, and free-response items in 2015. Descriptive statistics were reported. We used qualitative analysis applying a thematic approach to free-response items. A total of 204 educators and 42 education researchers participated. Education researchers were highly productive: 19/42 reported more than 20 peer-reviewed education scholarship publications on their curricula vitae. In contrast, 68/197 educators reported no education publications within five years. Only a minority, 61/197 had formal research training compared to 25/42 education researchers. Barriers to performing research for both groups were lack of time, competing demands, lack of support, lack of funding, and challenges achieving scientifically rigorous methods and publication. The most common motivators identified were dissemination of knowledge, support of evidence-based practices, and promotion. Respondents advised those who seek greater education research involvement to pursue mentorship, formal research training, collaboration, and rigorous methodological standards. The most commonly cited barriers were lack of time and competing demands. Stakeholders were motivated by the desire to disseminate knowledge, support evidence-based practices, and achieve promotion. Suggested strategies for success included formal training, mentorship, and collaboration. This information may inform interventions to support educators in their scholarly pursuits and improve the overall quality of education research in EM.

  5. Toward rigorous idiographic research in prevention science: comparison between three analytic strategies for testing preventive intervention in very small samples.

    PubMed

    Ridenour, Ty A; Pineo, Thomas Z; Maldonado Molina, Mildred M; Hassmiller Lich, Kristen

    2013-06-01

    Psychosocial prevention research lacks evidence from intensive within-person lines of research to understand idiographic processes related to development and response to intervention. Such data could be used to fill gaps in the literature and expand the study design options for prevention researchers, including lower-cost yet rigorous studies (e.g., for program evaluations), pilot studies, designs to test programs for low prevalence outcomes, selective/indicated/adaptive intervention research, and understanding of differential response to programs. This study compared three competing analytic strategies designed for this type of research: autoregressive moving average, mixed model trajectory analysis, and P-technique. Illustrative time series data were from a pilot study of an intervention for nursing home residents with diabetes (N = 4) designed to improve control of blood glucose. A within-person, intermittent baseline design was used. Intervention effects were detected using each strategy for the aggregated sample and for individual patients. The P-technique model most closely replicated observed glucose levels. ARIMA and P-technique models were most similar in terms of estimated intervention effects and modeled glucose levels. However, ARIMA and P-technique also were more sensitive to missing data, outliers and number of observations. Statistical testing suggested that results generalize both to other persons as well as to idiographic, longitudinal processes. This study demonstrated the potential contributions of idiographic research in prevention science as well as the need for simulation studies to delineate the research circumstances when each analytic approach is optimal for deriving the correct parameter estimates.
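
    This is not the authors' exact models, but the idea shared by the three strategies compared above — relating a short series to its own past while estimating an intervention effect — can be sketched as an autoregressive regression with an intervention indicator, fit by ordinary least squares on simulated data.

```python
import random

def ols(X, y):
    """Ordinary least squares via normal equations and Gaussian elimination."""
    p = len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(len(X))) for k in range(p)]
         for j in range(p)]
    b = [sum(X[i][j] * y[i] for i in range(len(X))) for j in range(p)]
    for col in range(p):                      # forward elimination with pivoting
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            A[r] = [arj - f * acj for arj, acj in zip(A[r], A[col])]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][k] * beta[k] for k in range(r + 1, p))) / A[r][r]
    return beta

# Simulate an AR(1) series with a +2 level effect once the intervention starts.
random.seed(1)
y = [0.0]
for t in range(1, 300):
    on = 1.0 if t >= 150 else 0.0
    y.append(0.5 * y[-1] + 2.0 * on + random.gauss(0, 0.3))

X = [[1.0, y[t - 1], 1.0 if t >= 150 else 0.0] for t in range(1, 300)]
intercept, ar1, effect = ols(X, y[1:])
```

    With only four patients, as in the pilot study above, such models would be fit per person (or on the aggregated sample), which is exactly where the sensitivity to missing data and series length noted by the authors becomes critical.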

  6. Toward Rigorous Idiographic Research in Prevention Science: Comparison Between Three Analytic Strategies for Testing Preventive Intervention in Very Small Samples

    PubMed Central

    Pineo, Thomas Z.; Maldonado Molina, Mildred M.; Lich, Kristen Hassmiller

    2013-01-01

    Psychosocial prevention research lacks evidence from intensive within-person lines of research to understand idiographic processes related to development and response to intervention. Such data could be used to fill gaps in the literature and expand the study design options for prevention researchers, including lower-cost yet rigorous studies (e.g., for program evaluations), pilot studies, designs to test programs for low prevalence outcomes, selective/indicated/adaptive intervention research, and understanding of differential response to programs. This study compared three competing analytic strategies designed for this type of research: autoregressive moving average, mixed model trajectory analysis, and P-technique. Illustrative time series data were from a pilot study of an intervention for nursing home residents with diabetes (N = 4) designed to improve control of blood glucose. A within-person, intermittent baseline design was used. Intervention effects were detected using each strategy for the aggregated sample and for individual patients. The P-technique model most closely replicated observed glucose levels. ARIMA and P-technique models were most similar in terms of estimated intervention effects and modeled glucose levels. However, ARIMA and P-technique also were more sensitive to missing data, outliers and number of observations. Statistical testing suggested that results generalize both to other persons as well as to idiographic, longitudinal processes. This study demonstrated the potential contributions of idiographic research in prevention science as well as the need for simulation studies to delineate the research circumstances when each analytic approach is optimal for deriving the correct parameter estimates. PMID:23299558

  7. Combining machine learning and remotely sensed bandratios to investigate chlorophyll content and photosynthetic processes

    NASA Astrophysics Data System (ADS)

    Gholizadeh, Hamed

    Photosynthesis in aquatic and terrestrial ecosystems is the key component of the food chain and the most important driver of the global carbon cycle. Therefore, estimation of photosynthesis at large spatial scales is of great scientific importance and can only practically be achieved by remote sensing data and techniques. In this dissertation, remotely sensed information and techniques, as well as field measurements, are used to improve current approaches of assessing photosynthetic processes. More specifically, three topics are the focus here: (1) investigating the application of spectral vegetation indices as proxies for terrestrial chlorophyll in a mangrove ecosystem, (2) evaluating and improving one of the most common empirical ocean-color algorithms (OC4), and (3) developing an improved approach based on sunlit-to-shaded scaled photochemical reflectance index (sPRI) ratios for detecting drought signals in a deciduous forest in the eastern United States. The results indicated that although the green normalized difference vegetation index (GNDVI) is an efficient proxy for terrestrial chlorophyll content, there are opportunities to improve the performance of vegetation indices by optimizing the band weights. With regard to the second topic, we concluded that the parameters of the OC4 algorithm and similar empirical models should be tuned regionally and the addition of sea-surface temperature makes the global ocean-color approaches more valid. Results obtained from the third topic showed that considering shaded and sunlit portions of the canopy (i.e., two-leaf models instead of single big leaf models) and taking into account the divergent stomatal behavior of the species (i.e. isohydric and anisohydric) can improve the capability of sPRI in detecting drought. In addition to investigating the photosynthetic processes, the other common theme of the three research topics is the evaluation of "off-the-shelf" solutions to remote-sensing problems. Although widely used approaches such as normalized difference vegetation index (NDVI) are easy to apply and are often efficient choices in remote sensing applications, the use of these approaches should be justified and their shortcomings need to be considered in the context of the research application. When developing new remote sensing approaches, special attention should be paid to (1) initial data analysis such as statistical data transformations (e.g. Tukey ladder-of-powers transformation) and (2) rigorous validation design by creating separate training and validation data sets preferably using both field measurements and satellite-based data. Developing a sound approach and applying a rigorous validation methodology go hand in hand. In sum, all approaches have advantages and disadvantages or as George Box puts it, "all models are wrong but some are useful".
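
    The band ratios discussed above are simple functions of band reflectances; the reflectance values below are illustrative.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green NDVI, the chlorophyll proxy evaluated in the dissertation above."""
    return (nir - green) / (nir + green)

# Healthy vegetation reflects strongly in the NIR and weakly in the visible bands,
# so dense canopy yields an index well above zero.
dense = ndvi(nir=0.5, red=0.1)
```

    The dissertation's point about optimizing band weights amounts to replacing the fixed unit coefficients in these ratios with tuned ones.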

  8. The Work-Family Conflict Scale (WAFCS): development and initial validation of a self-report measure of work-family conflict for use with parents.

    PubMed

    Haslam, Divna; Filus, Ania; Morawska, Alina; Sanders, Matthew R; Fletcher, Renee

    2015-06-01

    This paper outlines the development and validation of the Work-Family Conflict Scale (WAFCS) designed to measure work-to-family conflict (WFC) and family-to-work conflict (FWC) for use with parents of young children. An expert informant and consumer feedback approach was utilised to develop and refine 20 items, which were subjected to a rigorous validation process using two separate samples of parents of 2-12 year old children (n = 305 and n = 264). As a result of statistical analyses several items were dropped resulting in a brief 10-item scale comprising two subscales assessing theoretically distinct but related constructs: FWC (five items) and WFC (five items). Analyses revealed both subscales have good internal consistency, construct validity as well as concurrent and predictive validity. The results indicate the WAFCS is a promising brief measure for the assessment of work-family conflict in parents. Benefits of the measure as well as potential uses are discussed.

  9. Unified quantitative characterization of epithelial tissue development

    PubMed Central

    Guirao, Boris; Rigaud, Stéphane U; Bosveld, Floris; Bailles, Anaïs; López-Gay, Jesús; Ishihara, Shuji; Sugimura, Kaoru

    2015-01-01

    Understanding the mechanisms regulating development requires a quantitative characterization of cell divisions, rearrangements, cell size and shape changes, and apoptoses. We developed a multiscale formalism that relates the characterizations of each cell process to tissue growth and morphogenesis. Having validated the formalism on computer simulations, we quantified separately all morphogenetic events in the Drosophila dorsal thorax and wing pupal epithelia to obtain comprehensive statistical maps linking cell and tissue scale dynamics. While globally cell shape changes, rearrangements and divisions all significantly participate in tissue morphogenesis, locally, their relative participations display major variations in space and time. By blocking division we analyzed the impact of division on rearrangements, cell shape changes and tissue morphogenesis. Finally, by combining the formalism with mechanical stress measurement, we evidenced unexpected interplays between patterns of tissue elongation, cell division and stress. Our formalism provides a novel and rigorous approach to uncover mechanisms governing tissue development. DOI: http://dx.doi.org/10.7554/eLife.08519.001 PMID:26653285

  10. What can comparative effectiveness research, propensity score and registry study bring to Chinese medicine?

    PubMed

    Liao, Xing; Xie, Yan-ming

    2014-10-01

    The impact of evidence-based medicine and clinical epidemiology on clinical research has contributed to the development of Chinese medicine in modern times over the past two decades. Many concepts and methods of modern science and technology are emerging in Chinese medicine research, resulting in constant progress. Systematic reviews, randomized controlled trials and other advanced mathematical approaches and statistical analysis methods have brought reform to Chinese medicine. In this new era, Chinese medicine researchers have many opportunities and challenges. On the one hand, Chinese medicine researchers need to dedicate themselves to providing enough evidence to the world through rigorous studies, whilst on the other hand, they also need to keep up with the speed of modern medicine research. For example, real-world studies, comparative effectiveness research, propensity score techniques and registry studies have recently emerged. This article aims to inspire Chinese medicine researchers to explore new areas by introducing these new ideas and new techniques.

  11. Methodological challenges of validating a clinical decision-making tool in the practice environment.

    PubMed

    Brennan, Caitlin W; Daly, Barbara J

    2015-04-01

    Validating a measurement tool intended for use in the practice environment poses challenges that may not be present when validating a tool intended solely for research purposes. The aim of this article is to describe the methodological challenges of validating a clinical decision-making tool, the Oncology Acuity Tool, which nurses use to make nurse assignment and staffing decisions prospectively each shift. Data were derived from a larger validation study, during which several methodological challenges arose. Revisions to the tool, including conducting iterative feedback cycles with end users, were necessary before the validation study was initiated. The "true" value of patient acuity is unknown, and thus, two approaches to inter-rater reliability assessment were used. Discordant perspectives existed between experts and end users. Balancing psychometric rigor with clinical relevance may be achieved through establishing research-practice partnerships, seeking active and continuous feedback with end users, and weighing traditional statistical rules of thumb with practical considerations. © The Author(s) 2014.
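
    The abstract does not name the two inter-rater reliability approaches used; Cohen's kappa is one widely used option for categorical ratings, sketched below with hypothetical rating counts.

```python
def cohens_kappa(table):
    """Cohen's kappa for a square confusion matrix of two raters' categorical
    ratings: chance-corrected agreement (po - pe) / (1 - pe)."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / n        # observed agreement
    pe = sum(                                                    # chance agreement
        sum(table[i]) * sum(row[i] for row in table) for i in range(len(table))
    ) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical high/low acuity ratings from two nurses over 50 shifts.
kappa = cohens_kappa([[20, 5], [10, 15]])
```

    Kappa corrects raw percent agreement for chance, which matters here because, as the abstract notes, the "true" acuity value is unknown and only rater agreement can be assessed.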

  12. The MR-Base platform supports systematic causal inference across the human phenome

    PubMed Central

    Wade, Kaitlin H; Haberland, Valeriia; Baird, Denis; Laurin, Charles; Burgess, Stephen; Bowden, Jack; Langdon, Ryan; Tan, Vanessa Y; Yarmolinsky, James; Shihab, Hashem A; Timpson, Nicholas J; Evans, David M; Relton, Caroline; Martin, Richard M; Davey Smith, George

    2018-01-01

    Results from genome-wide association studies (GWAS) can be used to infer causal relationships between phenotypes, using a strategy known as 2-sample Mendelian randomization (2SMR) and bypassing the need for individual-level data. However, 2SMR methods are evolving rapidly and GWAS results are often insufficiently curated, undermining efficient implementation of the approach. We therefore developed MR-Base (http://www.mrbase.org): a platform that integrates a curated database of complete GWAS results (no restrictions according to statistical significance) with an application programming interface, web app and R packages that automate 2SMR. The software includes several sensitivity analyses for assessing the impact of horizontal pleiotropy and other violations of assumptions. The database currently comprises 11 billion single nucleotide polymorphism-trait associations from 1673 GWAS and is updated on a regular basis. Integrating data with software ensures more rigorous application of hypothesis-driven analyses and allows millions of potential causal relationships to be efficiently evaluated in phenome-wide association studies. PMID:29846171
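
    Among the 2SMR methods MR-Base automates, the inverse-variance weighted (IVW) estimator is the most basic; a minimal sketch with hypothetical GWAS summary statistics, where the outcome effects are constructed to equal exactly half the exposure effects.

```python
def ivw_estimate(beta_exp, beta_out, se_out):
    """Inverse-variance weighted 2-sample MR estimate of the causal effect:
    a weighted regression of outcome effects on exposure effects through the origin."""
    w = [1 / s ** 2 for s in se_out]
    num = sum(wi * bx * by for wi, bx, by in zip(w, beta_exp, beta_out))
    den = sum(wi * bx ** 2 for wi, bx in zip(w, beta_exp))
    return num / den

# Hypothetical summary statistics for 4 instruments; true causal effect = 0.5.
bx = [0.10, 0.20, 0.15, 0.30]
by = [0.05, 0.10, 0.075, 0.15]
est = ivw_estimate(bx, by, se_out=[0.01, 0.02, 0.015, 0.02])
```

    The sensitivity analyses mentioned in the abstract (e.g., for horizontal pleiotropy) exist precisely because this estimator assumes every instrument affects the outcome only through the exposure.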

  13. Potential energy landscapes identify the information-theoretic nature of the epigenome

    PubMed Central

    Jenkinson, Garrett; Pujadas, Elisabet; Goutsias, John; Feinberg, Andrew P.

    2017-01-01

    Epigenetics studies genomic modifications carrying information independent of DNA sequence heritable through cell division. In 1940, Waddington coined the term “epigenetic landscape” as a metaphor for pluripotency and differentiation, but methylation landscapes have not yet been rigorously computed. By using principles of statistical physics and information theory, we derive epigenetic energy landscapes from whole-genome bisulfite sequencing data that allow us to quantify methylation stochasticity genome-wide using Shannon’s entropy and associate entropy with chromatin structure. Moreover, we consider the Jensen-Shannon distance between sample-specific energy landscapes as a measure of epigenetic dissimilarity and demonstrate its effectiveness for discerning epigenetic differences. By viewing methylation maintenance as a communications system, we introduce methylation channels and show that higher-order chromatin organization can be predicted from their informational properties. Our results provide a fundamental understanding of the information-theoretic nature of the epigenome that leads to a powerful approach for studying its role in disease and aging. PMID:28346445
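The two information-theoretic quantities named here, Shannon entropy and the Jensen-Shannon distance, are standard and easy to state in code. A minimal sketch in generic NumPy (not the authors' software; the example distributions are invented):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))

def jensen_shannon_distance(p, q):
    """Square root of the Jensen-Shannon divergence (a true metric)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    jsd = shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
    return float(np.sqrt(max(jsd, 0.0)))   # guard tiny negative rounding

# A fully stochastic binary methylation state is maximally entropic,
# a nearly deterministic one is close to 0 bits
h_random = shannon_entropy([0.5, 0.5])
h_ordered = shannon_entropy([0.99, 0.01])
d = jensen_shannon_distance([0.9, 0.1], [0.1, 0.9])
```

The distance is 0 for identical distributions and 1 bit-scale for disjoint ones, which is what makes it usable as a bounded dissimilarity measure between sample-specific landscapes.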

  14. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
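Of the two update techniques mentioned, the ensemble Kalman filter is compact enough to sketch. Below is a minimal stochastic-EnKF analysis step in textbook form (the toy numbers and function are invented for illustration, not taken from the presentation):

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, H, y, obs_cov):
    """Stochastic ensemble Kalman filter analysis step.

    ensemble : (n_members, n_state) forecast ensemble
    H        : (n_obs, n_state) linear observation operator
    y        : (n_obs,) observations
    obs_cov  : (n_obs, n_obs) observation-error covariance
    """
    n = ensemble.shape[0]
    anomalies = ensemble - ensemble.mean(axis=0)
    P = anomalies.T @ anomalies / (n - 1)               # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_cov)  # Kalman gain
    # Perturb the observations so the analysis ensemble keeps the
    # correct posterior spread
    perturbed = y + rng.multivariate_normal(np.zeros(len(y)), obs_cov, size=n)
    innovations = perturbed - ensemble @ H.T
    return ensemble + innovations @ K.T

# Toy example: a 1-D forecast ensemble near 0; a precise observation
# near 1 pulls the analysis mean toward the observation
forecast = rng.normal(0.0, 1.0, size=(300, 1))
analysis = enkf_update(forecast, np.array([[1.0]]),
                       np.array([1.0]), np.array([[0.01]]))
```

The analysis ensemble both shifts toward the data and contracts in spread, which is the probabilistic constraining of uncertain inputs described above.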

  15. Probabilistic risk analysis of building contamination.

    PubMed

    Bolster, D T; Tartakovsky, D M

    2008-10-01

    We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
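For independent events, the fault-tree combination step reduces to simple probability algebra over AND/OR gates. A minimal sketch (the component failure probabilities are invented for illustration):

```python
def or_gate(probs):
    """P(at least one of several independent events occurs)."""
    p_none = 1.0
    for p in probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

def and_gate(probs):
    """P(all of several independent events occur)."""
    p_all = 1.0
    for p in probs:
        p_all *= p
    return p_all

# Hypothetical top event: the building is contaminated if a source
# releases AND (the filter fails OR the ventilation fan fails)
p_release, p_filter_fail, p_fan_fail = 0.01, 0.05, 0.02
p_contamination = and_gate([p_release,
                            or_gate([p_filter_fail, p_fan_fail])])
```

Real PRAs relax the independence assumption and propagate uncertainty in the component probabilities themselves, but the gate logic above is the core of the fault-tree calculation.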

  16. A Curricular-Sampling Approach to Progress Monitoring: Mathematics Concepts and Applications

    ERIC Educational Resources Information Center

    Fuchs, Lynn S.; Fuchs, Douglas; Zumeta, Rebecca O.

    2008-01-01

    Progress monitoring is an important component of effective instructional practice. Curriculum-based measurement (CBM) is a form of progress monitoring that has been the focus of rigorous research. Two approaches for formulating CBM systems exist. The first is to assess performance regularly on a task that serves as a global indicator of competence…

  17. 77 FR 55698 - National Emission Standards for Hazardous Air Pollutants From the Pulp and Paper Industry

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... tests will help to ensure that control systems are maintained properly over time and a more rigorous... approach, industry is expected to save time in the performance test submittal process. Additionally this... pulping vent gas control at mills where the CCA approach would be adversely affected. Our revised cost...

  18. A Review of the Application of Lifecycle Analysis to Renewable Energy Systems

    ERIC Educational Resources Information Center

    Lund, Chris; Biswas, Wahidul

    2008-01-01

    The lifecycle concept is a "cradle to grave" approach to thinking about products, processes, and services, recognizing that all stages have environmental and economic impacts. Any rigorous and meaningful comparison of energy supply options must be done using a lifecycle analysis approach. It has been applied to an increasing number of conventional…

  19. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident-occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
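The core point about serial dependency can be demonstrated in a few lines: for a positively autocorrelated series, the naive i.i.d. standard error of the mean is far too small. A sketch using a simulated AR(1) series and the standard effective-sample-size correction (illustrative only, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1) series with strong positive serial dependence,
# mimicking, e.g., detrended monthly accident counts
n, rho = 5000, 0.8
noise = rng.standard_normal(n)
x = np.empty(n)
x[0] = noise[0]
for t in range(1, n):
    x[t] = rho * x[t - 1] + noise[t]

# Naive i.i.d. standard error of the mean vs. an adjustment for
# serial dependence via the AR(1) effective sample size
naive_se = x.std(ddof=1) / np.sqrt(n)
r1 = float(np.corrcoef(x[:-1], x[1:])[0, 1])   # lag-1 autocorrelation
n_eff = n * (1 - r1) / (1 + r1)                # effective sample size
adjusted_se = x.std(ddof=1) / np.sqrt(n_eff)
```

With rho = 0.8 the naive standard error is roughly a third of the adjusted one, which is exactly the kind of underestimation, and consequent erroneous inference, the paper warns about.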

  20. Weak value amplification considered harmful

    NASA Astrophysics Data System (ADS)

    Ferrie, Christopher; Combes, Joshua

    2014-03-01

We show using statistically rigorous arguments that the technique of weak value amplification does not perform better than standard statistical techniques for the tasks of parameter estimation and signal detection. We show that using all data and considering the joint distribution of all measurement outcomes yields the optimal estimator. Moreover, we show that estimation using the maximum likelihood technique with weak values as small as possible produces better performance for quantum metrology. In doing so, we identify the optimal experimental arrangement to be the one which reveals the maximal eigenvalue of the square of system observables. We also show that these conclusions do not change in the presence of technical noise.

  1. A model comparison approach shows stronger support for economic models of fertility decline

    PubMed Central

    Shenk, Mary K.; Towner, Mary C.; Kress, Howard C.; Alam, Nurul

    2013-01-01

    The demographic transition is an ongoing global phenomenon in which high fertility and mortality rates are replaced by low fertility and mortality. Despite intense interest in the causes of the transition, especially with respect to decreasing fertility rates, the underlying mechanisms motivating it are still subject to much debate. The literature is crowded with competing theories, including causal models that emphasize (i) mortality and extrinsic risk, (ii) the economic costs and benefits of investing in self and children, and (iii) the cultural transmission of low-fertility social norms. Distinguishing between models, however, requires more comprehensive, better-controlled studies than have been published to date. We use detailed demographic data from recent fieldwork to determine which models produce the most robust explanation of the rapid, recent demographic transition in rural Bangladesh. To rigorously compare models, we use an evidence-based statistical approach using model selection techniques derived from likelihood theory. This approach allows us to quantify the relative evidence the data give to alternative models, even when model predictions are not mutually exclusive. Results indicate that fertility, measured as either total fertility or surviving children, is best explained by models emphasizing economic factors and related motivations for parental investment. Our results also suggest important synergies between models, implicating multiple causal pathways in the rapidity and degree of recent demographic transitions. PMID:23630293
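Likelihood-based model selection of this kind typically proceeds by converting information-criterion scores into relative evidence weights. A minimal sketch of Akaike weights (the model names and AIC values below are invented; they are not the study's results):

```python
import math

def akaike_weights(aic_values):
    """Convert AIC scores into relative evidence weights
    (smaller AIC = better fit; the weights sum to 1)."""
    best = min(aic_values)
    rel_likelihoods = [math.exp(-0.5 * (a - best)) for a in aic_values]
    total = sum(rel_likelihoods)
    return [r / total for r in rel_likelihoods]

# Hypothetical AIC scores for three competing causal models
aic = {"mortality_risk": 1015.2, "economic": 1002.4, "cultural": 1010.7}
weights = dict(zip(aic, akaike_weights(list(aic.values()))))
```

A weight near 1 for one model, as in this toy example, is how "relative evidence" is quantified even when the models' predictions are not mutually exclusive.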

  2. Hierarchical animal movement models for population-level inference

    USGS Publications Warehouse

    Hooten, Mevin B.; Buderman, Frances E.; Brost, Brian M.; Hanks, Ephraim M.; Ivans, Jacob S.

    2016-01-01

    New methods for modeling animal movement based on telemetry data are developed regularly. With advances in telemetry capabilities, animal movement models are becoming increasingly sophisticated. Despite a need for population-level inference, animal movement models are still predominantly developed for individual-level inference. Most efforts to upscale the inference to the population level are either post hoc or complicated enough that only the developer can implement the model. Hierarchical Bayesian models provide an ideal platform for the development of population-level animal movement models but can be challenging to fit due to computational limitations or extensive tuning required. We propose a two-stage procedure for fitting hierarchical animal movement models to telemetry data. The two-stage approach is statistically rigorous and allows one to fit individual-level movement models separately, then resample them using a secondary MCMC algorithm. The primary advantages of the two-stage approach are that the first stage is easily parallelizable and the second stage is completely unsupervised, allowing for an automated fitting procedure in many cases. We demonstrate the two-stage procedure with two applications of animal movement models. The first application involves a spatial point process approach to modeling telemetry data, and the second involves a more complicated continuous-time discrete-space animal movement model. We fit these models to simulated data and real telemetry data arising from a population of monitored Canada lynx in Colorado, USA.
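The two-stage logic is: fit each individual independently (stage 1), then treat the saved posterior draws as proposals in a secondary algorithm that learns population-level parameters (stage 2). The following is a deliberately simplified caricature in Python, with fake Gaussian "posteriors" and naive pooling standing in for the authors' secondary MCMC; all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stage 1 (embarrassingly parallel): per-individual posterior draws of
# a movement parameter. Here each "posterior" is faked as Gaussian
# samples centered on that individual's true value.
true_pop_mean, true_pop_sd = 2.0, 0.5
individuals = rng.normal(true_pop_mean, true_pop_sd, size=30)
stage1_draws = [rng.normal(mu, 0.2, size=2000) for mu in individuals]

# Stage 2 (unsupervised): repeatedly resample one saved draw per
# individual and update the population-level summary. A real second
# stage would run an MCMC over the population mean and variance.
pop_mean_samples = []
for _ in range(1000):
    draw = np.array([s[rng.integers(len(s))] for s in stage1_draws])
    pop_mean_samples.append(draw.mean())
pop_mean_est = float(np.mean(pop_mean_samples))
```

The point of the structure, rather than of this toy estimator, is that stage 1 parallelizes across animals and stage 2 needs no tuning, which is what enables an automated fitting procedure.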

  3. Child and youth participatory interventions for addressing lifestyle-related childhood obesity: a systematic review.

    PubMed

    Frerichs, L; Ataga, O; Corbie-Smith, G; Tessler Lindau, S

    2016-12-01

A growing number of childhood obesity interventions involve children and youth in participatory roles, but these types of interventions have not been systematically reviewed. We aimed to identify child and youth participatory interventions in the peer-reviewed literature in order to characterize the approaches and examine their impact on obesity and obesity-related lifestyle behaviours. We searched PubMed/Medline, PsycINFO and ERIC for quasi-experimental and randomized trials conducted from date of database initiation through May 2015 that engaged children or youth in implementing healthy eating, physical activity or weight management strategies. Eighteen studies met our eligibility criteria. Most (n = 14) trained youth to implement pre-defined strategies targeting their peers. A few (n = 4) assisted youth to plan and implement interventions that addressed environmental changes. Thirteen studies reported at least one statistically significant weight, physical activity or dietary change outcome. Participatory approaches have potential, but variation in strategies and outcomes leave questions unanswered about the mechanisms through which child and youth engagement impact childhood obesity. Future research should compare child-delivered or youth-delivered to adult-delivered health promotion interventions and more rigorously evaluate natural experiments that engage youth to implement environmental changes. With careful attention to theoretical frameworks, process and outcome measures, these studies could strengthen the effectiveness of child and youth participatory approaches. © 2016 World Obesity Federation.

  4. Biomedical text mining for research rigor and integrity: tasks, challenges, directions.

    PubMed

    Kilicoglu, Halil

    2017-06-13

    An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.

  5. Rigor and reproducibility in research with transcranial electrical stimulation: An NIMH-sponsored workshop.

    PubMed

    Bikson, Marom; Brunoni, Andre R; Charvet, Leigh E; Clark, Vincent P; Cohen, Leonardo G; Deng, Zhi-De; Dmochowski, Jacek; Edwards, Dylan J; Frohlich, Flavio; Kappenman, Emily S; Lim, Kelvin O; Loo, Colleen; Mantovani, Antonio; McMullen, David P; Parra, Lucas C; Pearson, Michele; Richardson, Jessica D; Rumsey, Judith M; Sehatpour, Pejman; Sommers, David; Unal, Gozde; Wassermann, Eric M; Woods, Adam J; Lisanby, Sarah H

Neuropsychiatric disorders are a leading source of disability and require novel treatments that target mechanisms of disease. As such disorders are thought to result from aberrant neuronal circuit activity, neuromodulation approaches are of increasing interest given their potential for manipulating circuits directly. Low intensity transcranial electrical stimulation (tES) with direct currents (transcranial direct current stimulation, tDCS) or alternating currents (transcranial alternating current stimulation, tACS) represent novel, safe, well-tolerated, and relatively inexpensive putative treatment modalities. This report seeks to promote the science, technology and effective clinical applications of these modalities, identify research challenges, and suggest approaches for addressing these needs in order to achieve rigorous, reproducible findings that can advance clinical treatment. The National Institute of Mental Health (NIMH) convened a workshop in September 2016 that brought together experts in basic and human neuroscience, electrical stimulation biophysics and devices, and clinical trial methods to examine the physiological mechanisms underlying tDCS/tACS, technologies and technical strategies for optimizing stimulation protocols, and the state of the science with respect to therapeutic applications and trial designs. Advances in understanding mechanisms, methodological and technological improvements (e.g., electronics, computational models to facilitate proper dosing), and improved clinical trial designs are poised to advance rigorous, reproducible therapeutic applications of these techniques. A number of challenges were identified and meeting participants made recommendations to address them.
These recommendations align with requirements in NIMH funding opportunity announcements to, among other needs, define dosimetry, demonstrate dose/response relationships, implement rigorous blinded trial designs, employ computational modeling, and demonstrate target engagement when testing stimulation-based interventions for the treatment of mental disorders. Published by Elsevier Inc.

  6. Rigor and reproducibility in research with transcranial electrical stimulation: An NIMH-sponsored workshop

    PubMed Central

    Bikson, Marom; Brunoni, Andre R.; Charvet, Leigh E.; Clark, Vincent P.; Cohen, Leonardo G.; Deng, Zhi-De; Dmochowski, Jacek; Edwards, Dylan J.; Frohlich, Flavio; Kappenman, Emily S.; Lim, Kelvin O.; Loo, Colleen; Mantovani, Antonio; McMullen, David P.; Parra, Lucas C.; Pearson, Michele; Richardson, Jessica D.; Rumsey, Judith M.; Sehatpour, Pejman; Sommers, David; Unal, Gozde; Wassermann, Eric M.; Woods, Adam J.; Lisanby, Sarah H.

    2018-01-01

Background Neuropsychiatric disorders are a leading source of disability and require novel treatments that target mechanisms of disease. As such disorders are thought to result from aberrant neuronal circuit activity, neuromodulation approaches are of increasing interest given their potential for manipulating circuits directly. Low intensity transcranial electrical stimulation (tES) with direct currents (transcranial direct current stimulation, tDCS) or alternating currents (transcranial alternating current stimulation, tACS) represent novel, safe, well-tolerated, and relatively inexpensive putative treatment modalities. Objective This report seeks to promote the science, technology and effective clinical applications of these modalities, identify research challenges, and suggest approaches for addressing these needs in order to achieve rigorous, reproducible findings that can advance clinical treatment. Methods The National Institute of Mental Health (NIMH) convened a workshop in September 2016 that brought together experts in basic and human neuroscience, electrical stimulation biophysics and devices, and clinical trial methods to examine the physiological mechanisms underlying tDCS/tACS, technologies and technical strategies for optimizing stimulation protocols, and the state of the science with respect to therapeutic applications and trial designs. Results Advances in understanding mechanisms, methodological and technological improvements (e.g., electronics, computational models to facilitate proper dosing), and improved clinical trial designs are poised to advance rigorous, reproducible therapeutic applications of these techniques. A number of challenges were identified and meeting participants made recommendations to address them.
Conclusions These recommendations align with requirements in NIMH funding opportunity announcements to, among other needs, define dosimetry, demonstrate dose/response relationships, implement rigorous blinded trial designs, employ computational modeling, and demonstrate target engagement when testing stimulation-based interventions for the treatment of mental disorders. PMID:29398575

  7. Field significance of performance measures in the context of regional climate model evaluation. Part 1: temperature

    NASA Astrophysics Data System (ADS)

    Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker

    2018-04-01

A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as "field" or "global" significance. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. The methodology is exemplarily applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Monthly temperature climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. In winter and in most regions in summer, the downscaled distributions are statistically indistinguishable from the observed ones. A systematic cold summer bias occurs in deep river valleys due to overestimated elevations, in coastal areas due probably to enhanced sea breeze circulation, and over large lakes due to the interpolation of water temperatures. Urban areas in concave topography forms have a warm summer bias due to the strong heat islands, not reflected in the observations. WRF-NOAH generates appropriate fine-scale features in the monthly temperature field over regions of complex topography, but over spatially homogeneous areas even small biases can lead to significant deteriorations relative to the driving reanalysis.
As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
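Field (global) significance of this kind is commonly implemented by applying a false-discovery-rate procedure to the grid of local p-values, which controls the expected proportion of falsely rejected local tests. A minimal Benjamini-Hochberg sketch in Python (a standard procedure offered as an illustration; the paper's exact test may differ, and the p-values below are invented):

```python
import numpy as np

def bh_reject(pvalues, alpha=0.1):
    """Benjamini-Hochberg: flag which local tests to reject while
    controlling the expected proportion of false local rejections."""
    p = np.asarray(pvalues, dtype=float)
    n = len(p)
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, n + 1) / n
    below = p[order] <= thresholds
    reject = np.zeros(n, dtype=bool)
    if below.any():
        k = int(np.max(np.nonzero(below)[0]))   # largest passing rank
        reject[order[: k + 1]] = True
    return reject

# Local p-values at six hypothetical grid cells
local_p = [0.001, 0.008, 0.039, 0.041, 0.20, 0.60]
rejected = bh_reject(local_p, alpha=0.1)
```

Because the decision at each grid cell depends on the whole sorted set of p-values, the surviving cells form spatial patterns that remain interpretable, which is the property the evaluation relies on.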

  8. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. 
Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
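The pooling step that makes multiple imputation statistically rigorous is Rubin's rules: estimates from the m imputed datasets are averaged, and the between-imputation spread is added to the average within-imputation variance. A minimal sketch with a toy dataset and simple hot-deck-style draws (illustrative only; analyses of registry data such as NSQIP model the missing values using covariates):

```python
import numpy as np

rng = np.random.default_rng(3)

def rubin_pool(estimates, variances):
    """Rubin's rules: pool point estimates and their variances
    across m imputed datasets."""
    est = np.asarray(estimates, dtype=float)
    var = np.asarray(variances, dtype=float)
    m = len(est)
    q_bar = est.mean()            # pooled point estimate
    within = var.mean()           # average within-imputation variance
    between = est.var(ddof=1)     # between-imputation variance
    total_var = within + (1.0 + 1.0 / m) * between
    return float(q_bar), float(total_var)

# Toy data: estimate a mean when ~30% of values are missing
x = rng.normal(10.0, 2.0, size=500)
missing = rng.random(500) < 0.3
observed = x[~missing]

m = 20
estimates, variances = [], []
for _ in range(m):
    completed = x.copy()
    # Draw each missing value from the observed empirical distribution
    # (simple hot-deck imputation; real procedures condition on covariates)
    completed[missing] = rng.choice(observed, size=int(missing.sum()),
                                    replace=True)
    estimates.append(completed.mean())
    variances.append(completed.var(ddof=1) / len(completed))

pooled_mean, pooled_var = rubin_pool(estimates, variances)
```

Unlike complete case analysis, every record contributes to the pooled estimate, and the extra variance term keeps the reported uncertainty honest about the imputation.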

  9. The time-local view of nonequilibrium statistical mechanics. I. Linear theory of transport and relaxation

    NASA Astrophysics Data System (ADS)

    der, R.

    1987-01-01

The various approaches to nonequilibrium statistical mechanics may be subdivided into convolution and convolutionless (time-local) ones. While the former, put forward by Zwanzig, Mori, and others, are used most commonly, the latter are less well developed, but have proven very useful in recent applications. The aim of the present series of papers is to develop the time-local picture (TLP) of nonequilibrium statistical mechanics on a new footing and to consider its physical implications for topics such as the formulation of irreversible thermodynamics. The most natural approach to TLP is seen to derive from the Fourier-Laplace transform C̃(z) of pertinent time correlation functions, which on the physical sheet typically displays an essential singularity at z=∞ and a number of macroscopic and microscopic poles in the lower half-plane corresponding to long- and short-lived modes, respectively, the former giving rise to the autonomous macrodynamics, whereas the latter are interpreted as doorway modes mediating the transfer of information from relevant to irrelevant channels. Possible implications of this doorway mode concept for so-called extended irreversible thermodynamics are briefly discussed. The pole structure is used for deriving new kinds of generalized Green-Kubo relations expressing macroscopic quantities, transport coefficients, e.g., by contour integrals over current-current correlation functions obeying Hamiltonian dynamics, the contour integration replacing projection. The conventional Green-Kubo relations valid for conserved quantities only are rederived for illustration. 
Moreover, C̃(z) may be expressed by a Laurent series expansion in positive and negative powers of z, from which a rigorous, general, and straightforward method is developed for extracting all macroscopic quantities from so-called secularly divergent expansions of C̃(z) as obtained from the application of conventional many-body techniques to the calculation of C̃(z). The expressions are formulated as time scale expansions, which should rapidly converge if macroscopic and microscopic time scales are sufficiently well separated, i.e., if lifetime ("memory") effects are not too large.
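For orientation, the conventional Green-Kubo relation that the paper rederives as a special case expresses a transport coefficient as a time integral of an equilibrium current-current correlation function. A standard schematic textbook form (with generic prefactor conventions; the paper's generalization replaces the time integral by a contour integral over C̃(z)) is:

```latex
\lambda \;=\; \frac{1}{k_{\mathrm{B}} T}
\int_{0}^{\infty} \langle J(0)\, J(t) \rangle \,\mathrm{d}t
```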

  10. A review of mammalian carcinogenicity study design and potential effects of alternate test procedures on the safety evaluation of food ingredients.

    PubMed

    Hayes, A W; Dayan, A D; Hall, W C; Kodell, R L; Williams, G M; Waddell, W D; Slesinski, R S; Kruger, C L

    2011-06-01

    Extensive experience in conducting long term cancer bioassays has been gained over the past 50 years of animal testing on drugs, pesticides, industrial chemicals, food additives and consumer products. Testing protocols for the conduct of carcinogenicity studies in rodents have been developed in Guidelines promulgated by regulatory agencies, including the US EPA (Environmental Protection Agency), the US FDA (Food and Drug Administration), the OECD (Organization for Economic Co-operation and Development) for the EU member states and the MAFF (Ministries of Agriculture, Forestries and Fisheries) and MHW (Ministry of Health and Welfare) in Japan. The basis of critical elements of the study design that lead to an accepted identification of the carcinogenic hazard of substances in food and beverages is the focus of this review. The approaches used by entities well-known for carcinogenicity testing and/or guideline development are discussed. Particular focus is placed on comparison of testing programs used by the US National Toxicology Program (NTP) and advocated in OECD guidelines to the testing programs of the European Ramazzini Foundation (ERF), an organization with numerous published carcinogenicity studies. This focus allows for a good comparison of differences in approaches to carcinogenicity testing and allows for a critical consideration of elements important to appropriate carcinogenicity study designs and practices. OECD protocols serve as good standard models for carcinogenicity testing protocol design. Additionally, the detailed design of any protocol should include attention to the rationale for inclusion of particular elements, including the impact of those elements on study interpretations. Appropriate interpretation of study results is dependent on rigorous evaluation of the study design and conduct, including differences from standard practices. 
Important considerations are differences in the strain of animal used, diet and housing practices, rigorousness of test procedures, dose selection, histopathology procedures, application of historical control data, statistical evaluations and whether statistical extrapolations are supported by, or are beyond the limits of, the data generated. Without due consideration, the result can be conflicting data interpretations and uncertainty about the relevance of a study's results to human risk. This paper discusses the critical elements of rodent (rat) carcinogenicity studies, particularly with respect to the study of food ingredients. It also highlights study practices and procedures that can detract from the appropriate evaluation of human relevance of results, indicating the importance of adherence to international consensus protocols, such as those detailed by OECD. Copyright © 2010. Published by Elsevier Inc.

  11. Effectiveness of Culturally Appropriate Adaptations to Juvenile Justice Services

    PubMed Central

    Vergara, Andrew T.; Kathuria, Parul; Woodmass, Kyler; Janke, Robert; Wells, Susan J.

    2017-01-01

    Despite efforts to increase the cultural competence of services within juvenile justice systems, disproportionate minority contact (DMC) persists throughout Canada and the United States. Commonly cited approaches to decreasing DMC include large-scale systemic changes as well as enhancement of the cultural relevance and responsiveness of services delivered. Cultural adaptations to service delivery focus on prevention, decision-making, and treatment services to reduce initial contact, minimize unnecessary restraint, and reduce recidivism. Although rigorous evaluations of these approaches against standard interventions are difficult to locate, this paper identifies and reports on such research. The Cochrane guidelines for systematic literature reviews and meta-analyses served as a foundation for the study methodology. Databases such as Legal Periodicals and Books were searched through June 2015. Three studies were sufficiently rigorous to identify the effect of the cultural adaptations, and three studies making potentially important contributions to the field were also reviewed. PMID:29468092

  12. The Economic Costs of Poverty in the United States: Subsequent Effects of Children Growing Up Poor. Discussion Paper No. 1327-07

    ERIC Educational Resources Information Center

    Holzer, Harry J.; Schanzenbach, Diane Whitmore; Duncan, Greg J.; Ludwig, Jens

    2007-01-01

    In this paper, we review a range of rigorous research studies that estimate the average statistical relationships between children growing up in poverty and their earnings, propensity to commit crime, and quality of health later in life. We also review estimates of the costs that crime and poor health per person impose on the economy. Then we…

  13. Enumerating Sparse Organisms in Ships’ Ballast Water: Why Counting to 10 Is Not So Easy

    PubMed Central

    2011-01-01

    To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships’ ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed. PMID:21434685
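    The sampling-volume argument can be sketched with a simple Poisson count model (an illustrative assumption; the paper's full statistical model also treats recovery errors and alternative regulatory burden-of-proof scenarios):

```python
from math import exp

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), via the stable recurrence p_i = p_{i-1} * lam / i."""
    p = exp(-lam)
    total = p
    for i in range(1, k + 1):
        p *= lam / i
        total += p
    return min(total, 1.0)

def detection_power(volume_m3, true_conc, standard=10.0, alpha=0.05):
    """Power to flag a noncompliant ballast-water discharge.

    Live-organism counts in the sampled volume are modeled as Poisson
    (an assumption for this sketch). `standard` is the discharge limit in
    organisms per m^3; `alpha` bounds the type I error for a discharge
    sitting exactly at the limit.
    """
    lam0 = standard * volume_m3
    k = 0  # smallest count threshold keeping the false-positive rate <= alpha
    while 1.0 - poisson_cdf(k, lam0) > alpha:
        k += 1
    lam1 = true_conc * volume_m3
    return 1.0 - poisson_cdf(k, lam1)  # P(count > k) at the true concentration
```

    Because properly treated water contains extremely sparse live organisms, only a sufficiently large sampled volume gives the test real power: in this toy model, sampling 7 m³ rather than 1 m³ sharply raises the power to detect a discharge at twice the limit.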

  14. Understanding photon sideband statistics and correlation for determining phonon coherence

    NASA Astrophysics Data System (ADS)

    Ding, Ding; Yin, Xiaobo; Li, Baowen

    2018-01-01

    Generating and detecting coherent high-frequency heat-carrying phonons have been topics of great interest in recent years. Although there have been successful attempts at generating and observing coherent phonons, rigorous techniques to characterize and detect phonon coherence in a crystalline material have been lagging compared to what has been achieved for photons. One main challenge is a lack of detailed understanding of how detection signals for phonons can be related to coherence. The quantum theory of photoelectric detection has greatly advanced the ability to characterize photon coherence in the past century, and a similar theory for phonon detection is necessary. Here, we reexamine the optical sideband fluorescence technique that has been used to detect high-frequency phonons in materials with optically active defects. We propose a quantum theory of phonon detection using the sideband technique and find that there are distinct differences in sideband counting statistics between thermal and coherent phonons. We further propose a second-order correlation function unique to sideband signals that allows for a rigorous distinction between thermal and coherent phonons. Our theory is relevant to correlation measurements with nontrivial response functions at the quantum level and can potentially close the gap between experimental determination of phonon coherence and what has been achieved for photons.

  15. Enumerating sparse organisms in ships' ballast water: why counting to 10 is not so easy.

    PubMed

    Miller, A Whitman; Frazier, Melanie; Smith, George E; Perry, Elgin S; Ruiz, Gregory M; Tamburri, Mario N

    2011-04-15

    To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships' ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed.

  16. Improvement of IFNγ ELISPOT Performance Following Overnight Resting of Frozen PBMC Samples Confirmed Through Rigorous Statistical Analysis

    PubMed Central

    Santos, Radleigh; Buying, Alcinette; Sabri, Nazila; Yu, John; Gringeri, Anthony; Bender, James; Janetzki, Sylvia; Pinilla, Clemencia; Judkowski, Valeria A.

    2014-01-01

    Immune monitoring of functional responses is a fundamental parameter to establish correlates of protection in clinical trials evaluating vaccines and therapies to boost antigen-specific responses. The IFNγ ELISPOT assay is a well-standardized and validated method for the determination of functional IFNγ-producing T-cells in peripheral blood mononuclear cells (PBMC); however, its performance greatly depends on the quality and integrity of the cryopreserved PBMC. Here, we investigate the effect of overnight (ON) resting of the PBMC on the detection of CD8-restricted peptide-specific responses by IFNγ ELISPOT. The study used PBMC from healthy donors to evaluate the CD8 T-cell response to five pooled or individual HLA-A2 viral peptides. The results were analyzed using a modification of the existing distribution free resampling (DFR) recommended for the analysis of ELISPOT data to ensure the most rigorous possible standard of significance. The results of the study demonstrate that ON resting of PBMC samples prior to IFNγ ELISPOT increases both the magnitude and the statistical significance of the responses. In addition, a comparison of the results with a 13-day preculture of PBMC with the peptides before testing demonstrates that ON resting is sufficient for the efficient evaluation of immune functioning. PMID:25546016
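    The resampling logic behind such significance testing can be illustrated with a generic one-sided permutation test on replicate spot counts (a simplified sketch; the study uses a modified distribution-free resampling (DFR) method, not this exact procedure):

```python
import random

def permutation_test(antigen, control, n_perm=10000, seed=0):
    """One-sided permutation p-value for the hypothesis that antigen-well
    spot counts exceed control-well counts only by chance."""
    rng = random.Random(seed)
    observed = sum(antigen) / len(antigen) - sum(control) / len(control)
    pooled = list(antigen) + list(control)
    n_a = len(antigen)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # relabel wells at random, destroying any real effect
        diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a)
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction gives a valid p-value
```

    With three replicates per condition there are only 20 distinct splits, so the smallest attainable p-value is 0.05; this granularity is exactly why rigorous resampling-based standards matter for ELISPOT data.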

  17. Learning from Science and Sport - How we, Safety, "Engage with Rigor"

    NASA Astrophysics Data System (ADS)

    Herd, A.

    2012-01-01

    As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside of the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects from two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so, it will offer some useful insights into the challenges we face and may provide solutions relevant to our everyday work of safety engineering. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two (opposing) teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams. The controlled engagement provides for interaction between the two teams in a safe manner; the constraints arise from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The need for rigor may be illustrated by the example of achieving a statistically adequate sample size, n, in order to assure the validity of analysis of the data pool. A failure to apply rigor could place the entire study at risk of failing to have the respective paper published.
This paper will consider the merits of these two different aspects, scientific rigor and sports engagement, and offer a reflective look at how this may provide a "modus operandi" for safety engineers at any level whether at their desks (creating or reviewing safety assessments) or in a safety review meeting (providing a verbal critique of the presented safety case).

  18. Controlled-Root Approach To Digital Phase-Locked Loops

    NASA Technical Reports Server (NTRS)

    Stephens, Scott A.; Thomas, J. Brooks

    1995-01-01

    Performance tailored more flexibly and directly to satisfy design requirements. Controlled-root approach improved method for analysis and design of digital phase-locked loops (DPLLs). Developed rigorously from first principles for fully digital loops, making DPLL theory and design simpler and more straightforward (particularly for third- or fourth-order DPLL) and controlling performance more accurately in case of high gain.

  19. Case Method Teaching as Science and Art: A Metaphoric Approach and Curricular Application

    ERIC Educational Resources Information Center

    Greenhalgh, Anne M.

    2007-01-01

    The following article takes a metaphoric approach to case method teaching to shed light on one of our most important practices. The article hinges on the dual comparison of case method as science and as art. The dominant, scientific view of cases is that they are neutral descriptions of real-life business problems, subject to rigorous analysis.…

  20. Learning Extrema Problems Using a Non-Differential Approach in a Digital Dynamic Environment: The Case of High-Track yet Low-Achievers

    ERIC Educational Resources Information Center

    Dvir, Assaf; Tabach, Michal

    2017-01-01

    High schools commonly use a differential approach to teach minima and maxima geometric problems. Although calculus serves as a systematic and powerful technique, this rigorous instrument might hinder students' ability to understand the behavior and constraints of the objective function. The proliferation of digital environments allowed us to adopt…

  1. Meta-Analysis of the Effectiveness of Individual Intervention in the Controlled Multisensory Environment (Snoezelen[R]) for Individuals with Intellectual Disability

    ERIC Educational Resources Information Center

    Lotan, Meir; Gold, Christian

    2009-01-01

    Background: The Snoezelen[R] is a multisensory intervention approach that has been implemented with various populations. Due to an almost complete absence of rigorous research in this field, the confirmation of this approach as an effective therapeutic intervention is warranted. Method: To evaluate the therapeutic influence of the…

  2. Precisely and Accurately Inferring Single-Molecule Rate Constants

    PubMed Central

    Kinz-Thompson, Colin D.; Bailey, Nevette A.; Gonzalez, Ruben L.

    2017-01-01

    The kinetics of biomolecular systems can be quantified by calculating the stochastic rate constants that govern the biomolecular state versus time trajectories (i.e., state trajectories) of individual biomolecules. To do so, the experimental signal versus time trajectories (i.e., signal trajectories) obtained from observing individual biomolecules are often idealized to generate state trajectories by methods such as thresholding or hidden Markov modeling. Here, we discuss approaches for idealizing signal trajectories and calculating stochastic rate constants from the resulting state trajectories. Importantly, we provide an analysis of how the finite length of signal trajectories restrict the precision of these approaches, and demonstrate how Bayesian inference-based versions of these approaches allow rigorous determination of this precision. Similarly, we provide an analysis of how the finite lengths and limited time resolutions of signal trajectories restrict the accuracy of these approaches, and describe methods that, by accounting for the effects of the finite length and limited time resolution of signal trajectories, substantially improve this accuracy. Collectively, therefore, the methods we consider here enable a rigorous assessment of the precision, and a significant enhancement of the accuracy, with which stochastic rate constants can be calculated from single-molecule signal trajectories. PMID:27793280
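    The thresholding idealization and dwell-time rate estimation discussed above can be sketched minimally as follows (an illustration that deliberately ignores the finite-length and time-resolution corrections that are the review's central concern):

```python
def idealize(signal, threshold):
    """Threshold idealization: map a signal trajectory to a binary
    state trajectory (0 = low state, 1 = high state)."""
    return [1 if s >= threshold else 0 for s in signal]

def dwell_times(states, dt):
    """Collect dwell durations spent in each state, given the sampling interval dt."""
    dwells = {0: [], 1: []}
    run_state, run_len = states[0], 1
    for s in states[1:]:
        if s == run_state:
            run_len += 1
        else:
            dwells[run_state].append(run_len * dt)
            run_state, run_len = s, 1
    # the final dwell is censored by the trajectory's end -- a finite-length
    # effect the reviewed methods correct for; this sketch simply drops it
    return dwells

def rate_constants(states, dt):
    """Naive maximum-likelihood exit rate out of each state: k = 1 / mean dwell."""
    dwells = dwell_times(states, dt)
    return {s: 1.0 / (sum(d) / len(d)) for s, d in dwells.items() if d}
```

    Real analyses must also handle the censored final dwell and transitions missed between samples, which is precisely where the Bayesian-inference-based approaches in the review improve both precision and accuracy.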

  3. Stakeholder-Driven Quality Improvement: A Compelling Force for Clinical Practice Guidelines.

    PubMed

    Rosenfeld, Richard M; Wyer, Peter C

    2018-01-01

    Clinical practice guideline development should be driven by rigorous methodology, but what is less clear is where quality improvement enters the process: should it be a priority-guiding force, or should it enter only after recommendations are formulated? We argue for a stakeholder-driven approach to guideline development, with an overriding goal of quality improvement based on stakeholder perceptions of needs, uncertainties, and knowledge gaps. In contrast, the widely used topic-driven approach, which often makes recommendations based only on randomized controlled trials, is driven by epidemiologic purity and evidence rigor, with quality improvement a downstream consideration. The advantages of a stakeholder-driven versus a topic-driven approach are highlighted by comparisons of guidelines for otitis media with effusion, thyroid nodules, sepsis, and acute bacterial rhinosinusitis. These comparisons show that stakeholder-driven guidelines are more likely to address the quality improvement needs and pressing concerns of clinicians and patients, including understudied populations and patients with multiple chronic conditions. Conversely, a topic-driven approach often addresses "typical" patients, based on research that may not reflect the needs of high-risk groups excluded from studies because of ethical issues or a desire for purity of research design.

  4. An intercomparison of approaches for improving operational seasonal streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo A.; Wood, Andrew W.; Clark, Elizabeth; Rothwell, Eric; Clark, Martyn P.; Nijssen, Bart; Brekke, Levi D.; Arnold, Jeffrey R.

    2017-07-01

    For much of the last century, forecasting centers around the world have offered seasonal streamflow predictions to support water management. Recent work suggests that the two major avenues to advance seasonal predictability are improvements in the estimation of initial hydrologic conditions (IHCs) and the incorporation of climate information. This study investigates the marginal benefits of a variety of methods using IHCs and/or climate information, focusing on seasonal water supply forecasts (WSFs) in five case study watersheds located in the US Pacific Northwest region. We specify two benchmark methods that mimic standard operational approaches - statistical regression against IHCs and model-based ensemble streamflow prediction (ESP) - and then systematically intercompare WSFs across a range of lead times. Additional methods include (i) statistical techniques using climate information either from standard indices or from climate reanalysis variables and (ii) several hybrid/hierarchical approaches harnessing both land surface and climate predictability. In basins where atmospheric teleconnection signals are strong, and when watershed predictability is low, climate information alone provides considerable improvements. For those basins showing weak teleconnections, custom predictors derived from reanalysis fields yielded better forecast skill than standard climate indices. ESP predictions tended to have high correlation skill but greater bias compared to other methods, and climate predictors failed to substantially improve these deficiencies within a trace weighting framework. Lower-complexity techniques were competitive with more complex methods, and the hierarchical expert regression approach introduced here (hierarchical ensemble streamflow prediction - HESP) provided a robust alternative for skillful and reliable water supply forecasts at all initialization times.
Three key findings from this effort are (1) objective approaches supporting methodologically consistent hindcasts open the door to a broad range of beneficial forecasting strategies; (2) the use of climate predictors can add to the seasonal forecast skill available from IHCs; and (3) sample size limitations must be handled rigorously to avoid over-trained forecast solutions. Overall, the results suggest that despite a rich, long heritage of operational use, there remain a number of compelling opportunities to improve the skill and value of seasonal streamflow predictions.
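    Finding (3), on avoiding over-trained forecast solutions, can be illustrated with leave-one-out cross-validation of a simple linear regression against a single initial-condition predictor (a generic sketch, not the authors' hindcast system):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def loo_cv_error(xs, ys):
    """Leave-one-out cross-validated mean squared error: each year's
    forecast comes from a model trained without that year, mimicking an
    objective hindcast and guarding against over-training on a small sample."""
    errs = []
    for i in range(len(xs)):
        tx = xs[:i] + xs[i + 1:]
        ty = ys[:i] + ys[i + 1:]
        a, b = fit_line(tx, ty)
        errs.append((ys[i] - (a + b * xs[i])) ** 2)
    return sum(errs) / len(errs)
```

    Comparing this cross-validated error against the in-sample fit error is a simple diagnostic for over-training when the hindcast record is short.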

  5. Interface-Resolving Simulation of Collision Efficiency of Cloud Droplets

    NASA Astrophysics Data System (ADS)

    Wang, Lian-Ping; Peng, Cheng; Rosa, Bodgan; Onishi, Ryo

    2017-11-01

    Small-scale air turbulence could enhance the geometric collision rate of cloud droplets while large-scale air turbulence could augment the diffusional growth of cloud droplets. Air turbulence could also enhance the collision efficiency of cloud droplets. Accurate simulation of collision efficiency, however, requires capture of the multi-scale droplet-turbulence and droplet-droplet interactions, which has only been partially achieved in the recent past using the hybrid direct numerical simulation (HDNS) approach, in which a Stokes disturbance flow is assumed. The HDNS approach has two major drawbacks: (1) the short-range droplet-droplet interaction is not treated rigorously; (2) the finite-Reynolds-number correction to the collision efficiency is not included. In this talk, using two independent numerical methods, we will develop an interface-resolved simulation approach in which the disturbance flows are directly resolved numerically, combined with a rigorous lubrication correction model for near-field droplet-droplet interaction. This multi-scale approach is first used to study the effect of finite flow Reynolds numbers on the droplet collision efficiency in still air. Our simulation results show a significant finite-Re effect on collision efficiency when the droplets are of similar sizes. Preliminary results on integrating this approach in a turbulent flow laden with droplets will also be presented. This work is partially supported by the National Science Foundation.

  6. Scientific approaches to science policy.

    PubMed

    Berg, Jeremy M

    2013-11-01

    The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.

  7. Griffith Edwards' rigorous sympathy with Alcoholics Anonymous.

    PubMed

    Humphreys, Keith

    2015-07-01

    Griffith Edwards made empirical contributions early in his career to the literature on Alcoholics Anonymous (AA), but the attitude he adopted towards AA and other peer-led mutual help initiatives constitutes an even more important legacy. Unlike many treatment professionals who dismissed the value of AA or were threatened by its non-professional approach, Edwards was consistently respectful of the organization. However, he never became an uncritical booster of AA or overgeneralized what could be learnt from it. Future scholarly and clinical endeavors concerning addiction-related mutual help initiatives will benefit by continuing Edwards' tradition of 'rigorous sympathy'. © 2015 Society for the Study of Addiction.

  8. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The proper characterization of uncertainty in the orbital state of a space object is a common requirement for many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state probability density function (PDF) is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants, which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit.
Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the same computational cost as the traditional unscented Kalman filter, with the former able to maintain a proper characterization of the uncertainty for up to ten times as long as the latter. The filter correction step also furnishes a statistically rigorous prediction error which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.
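The classical association statistic that the new metric extends is the squared Mahalanobis distance; the standard chi-square gating it enables can be sketched as follows (plain Gaussian gating, not the paper's generalized metric):

```python
import numpy as np

def mahalanobis_sq(x, mean, cov):
    """Squared Mahalanobis distance of observation x from a Gaussian
    state estimate with the given mean and covariance."""
    d = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    return float(d @ np.linalg.solve(np.asarray(cov, dtype=float), d))

# Under the Gaussian assumption the statistic is chi^2-distributed with
# dim(x) degrees of freedom, so a 95% gate in 2-D uses the 5.99 quantile.
GATE_2D_95 = 5.99

def associate(x, mean, cov):
    """True if the observation falls inside the 95% association gate."""
    return mahalanobis_sq(x, mean, cov) <= GATE_2D_95
```

When the state PDF develops the non-Gaussian "banana" shape described above, this chi-square gate becomes miscalibrated, which is the motivation for the paper's extended metric.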

  9. BEAT: Bioinformatics Exon Array Tool to store, analyze and visualize Affymetrix GeneChip Human Exon Array data from disease experiments

    PubMed Central

    2012-01-01

    Background It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism in which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase Chain Reactions, microarrays and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. Last generation Affymetrix GeneChip Human Exon 1.0 ST Arrays offer a more detailed view of the gene expression profile providing information on the AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons, and it allows performing analyses at both gene and exon level. In this paper we describe BEAT, an integrated user-friendly bioinformatics framework to store, analyze and visualize exon arrays datasets. It combines a data warehouse approach with some rigorous statistical methods for assessing the AS of genes involved in diseases. Meta statistics are proposed as a novel approach to explore the analysis results. BEAT is available at http://beat.ba.itb.cnr.it. Results BEAT is a web tool which allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession Ids, Gene Ontology terms and biochemical pathways annotations are integrated with exon and gene level expression plots. 
The user can customize the results choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis. Conclusions Despite exon array chips being widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable and interactive tables and graphics. PMID:22536968

  10. Mixed memory, (non) Hurst effect, and maximum entropy of rainfall in the tropical Andes

    NASA Astrophysics Data System (ADS)

    Poveda, Germán

    2011-02-01

    Diverse linear and nonlinear statistical parameters of rainfall under aggregation in time and the kind of temporal memory are investigated. Data sets from the Andes of Colombia at different resolutions (15 min and 1 h) and record lengths (21 months and 8-40 years) are used. A mixture of two timescales is found in the autocorrelation and autoinformation functions, with short-term memory holding for time lags less than 15-30 min, and long-term memory onwards. Consistently, rainfall variance exhibits different temporal scaling regimes separated at 15-30 min and 24 h. Tests for the Hurst effect evidence the frailty of the R/S approach in discerning the kind of memory in high-resolution rainfall, whereas rigorous statistical tests for short-memory processes do reject the existence of the Hurst effect. Rainfall information entropy grows as a power law of aggregation time, S(T) ~ T^β with <β> = 0.51, up to a timescale, T_MaxEnt (70-202 h), at which entropy saturates, with β = 0 onwards. Maximum entropy is reached through a dynamic Generalized Pareto distribution, consistently with the maximum information-entropy principle for heavy-tailed random variables, and with its asymptotically infinitely divisible property. The dynamics towards the limit distribution is quantified. Tsallis q-entropies also exhibit power laws with T, such that S_q(T) ~ T^β(q), with β(q) ≤ 0 for q ≤ 0, and β(q) ≈ 0.5 for q ≥ 1. No clear patterns are found in the geographic distribution within and among the statistical parameters studied, confirming the strong variability of tropical Andean rainfall.
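    The entropy-scaling analysis S(T) ~ T^β can be sketched by aggregating a series over increasing block lengths, estimating a histogram entropy at each scale, and fitting the log-log slope (an illustrative sketch on synthetic data; bin choice and entropy-estimator details matter in practice):

```python
import math
import random

def shannon_entropy(values, n_bins=20):
    """Histogram estimate of Shannon entropy (in nats)."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0  # guard against a constant series
    counts = [0] * n_bins
    for v in values:
        counts[min(int((v - lo) / width), n_bins - 1)] += 1
    n = len(values)
    return -sum(c / n * math.log(c / n) for c in counts if c)

def aggregate(series, t):
    """Sum non-overlapping blocks of length t (aggregation in time)."""
    return [sum(series[i:i + t]) for i in range(0, len(series) - t + 1, t)]

def entropy_scaling_exponent(series, scales):
    """Least-squares slope of log S(T) vs log T: the beta in S(T) ~ T^beta.
    Assumes the entropy estimate is positive at every scale."""
    xs = [math.log(t) for t in scales]
    ys = [math.log(shannon_entropy(aggregate(series, t))) for t in scales]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
```

    The paper's analysis is far more careful than this sketch, distinguishing the power-law regime from the saturation regime beyond T_MaxEnt; the slope fit here would only be meaningful below that timescale.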

  11. Development of a new pan-European testate amoeba transfer function for reconstructing peatland palaeohydrology

    NASA Astrophysics Data System (ADS)

    Amesbury, Matthew J.; Swindles, Graeme T.; Bobrov, Anatoly; Charman, Dan J.; Holden, Joseph; Lamentowicz, Mariusz; Mallon, Gunnar; Mazei, Yuri; Mitchell, Edward A. D.; Payne, Richard J.; Roland, Thomas P.; Turner, T. Edward; Warner, Barry G.

    2016-11-01

    In the decade since the first pan-European testate amoeba-based transfer function for peatland palaeohydrological reconstruction was published, a vast amount of additional data collection has been undertaken by the research community. Here, we expand the pan-European dataset from 128 to 1799 samples, spanning 35° of latitude and 55° of longitude. After the development of a new taxonomic scheme to permit compilation of data from a wide range of contributors and the removal of samples with high pH values, we developed ecological transfer functions using a range of model types and a dataset of ∼1300 samples. We rigorously tested the efficacy of these models using both statistical validation and independent test sets with associated instrumental data. Model performance measured by statistical indicators was comparable to other published models. Comparison to test sets showed that taxonomic resolution did not impair model performance and that the new pan-European model can therefore be used as an effective tool for palaeohydrological reconstruction. Our results question the efficacy of relying on statistical validation of transfer functions alone and support a multi-faceted approach to the assessment of new models. We substantiated recent advice that model outputs should be standardised and presented as residual values in order to focus interpretation on secure directional shifts, avoiding potentially inaccurate conclusions relating to specific water-table depths. The extent and diversity of the dataset highlighted that, at the taxonomic resolution applied, a majority of taxa had broad geographic distributions, though some morphotypes appeared to have restricted ranges.

  12. CPR methodology with new steady-state criterion and more accurate statistical treatment of channel bow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgartner, S.; Bieli, R.; Bergmann, U. C.

    2012-07-01

    An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse in collaboration with KKL and Axpo - operator and owner of the Leibstadt NPP - has developed an optimized CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set up for the analyses of Anticipated Operational Occurrences (AOOs) and accidents. In the Monte Carlo approach a statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to that of all other uncertainties affecting CPR. Emphasis is put on quantifying the statistical distribution of channel bow throughout the core using measurement data. The optimized CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from its introduction at KKL. (authors)
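    The shape of such a Monte Carlo treatment can be sketched in a few lines. Every number below is an illustrative placeholder (nominal CPR, bow distribution, an assumed linear CPR response to bow, one lumped term for all other uncertainties), not plant data; the point is only that channel bow enters as one sampled uncertainty among others, and the dryout probability is read off as a tail fraction:

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# Illustrative placeholders, not plant data:
nominal_cpr = 1.30
bow_mm = rng.normal(0.0, 0.5, n_trials)        # sampled channel bow [mm]
cpr_penalty = 0.02 * np.abs(bow_mm)            # assumed CPR response to bow
other = rng.normal(0.0, 0.10, n_trials)        # remaining lumped uncertainties

cpr = nominal_cpr - cpr_penalty + other
p_dryout = float(np.mean(cpr < 1.0))           # tail fraction = failure prob.
print(f"estimated dryout probability: {p_dryout:.5f}")
```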

  13. Resemblance profiles as clustering decision criteria: Estimating statistical power, error, and correspondence for a hypothesis test for multivariate structure.

    PubMed

    Kilborn, Joshua P; Jones, David L; Peebles, Ernst B; Naar, David F

    2017-04-01

    Clustering data continues to be a highly active area of data analysis, and resemblance profiles are being incorporated into ecological methodologies as a hypothesis testing-based approach to clustering multivariate data. However, these new clustering techniques have not been rigorously tested to determine the performance variability based on the algorithm's assumptions or any underlying data structures. Here, we use simulation studies to estimate the statistical error rates for the hypothesis test for multivariate structure based on dissimilarity profiles (DISPROF). We concurrently tested a widely used algorithm that employs the unweighted pair group method with arithmetic mean (UPGMA) to estimate the proficiency of clustering with DISPROF as a decision criterion. We simulated unstructured multivariate data from different probability distributions with increasing numbers of objects and descriptors, and grouped data with increasing overlap, overdispersion for ecological data, and correlation among descriptors within groups. Using simulated data, we measured the resolution and correspondence of clustering solutions achieved by DISPROF with UPGMA against the reference grouping partitions used to simulate the structured test datasets. Our results highlight the dynamic interactions between dataset dimensionality, group overlap, and the properties of the descriptors within a group (i.e., overdispersion or correlation structure) that are relevant to resemblance profiles as a clustering criterion for multivariate data. These methods are particularly useful for multivariate ecological datasets that benefit from distance-based statistical analyses. We propose guidelines for using DISPROF as a clustering decision tool that will help future users avoid potential pitfalls during the application of methods and the interpretation of results.
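    A minimal sketch of the UPGMA side of this workflow, using SciPy's average-linkage implementation on simulated data with two well-separated groups (DISPROF itself, a permutation test on dissimilarity profiles, is omitted here):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

# Two well-separated simulated groups: 20 objects x 5 descriptors each
g1 = rng.normal(0.0, 1.0, (20, 5))
g2 = rng.normal(8.0, 1.0, (20, 5))
X = np.vstack([g1, g2])

# UPGMA is average-linkage agglomerative clustering on a distance matrix
Z = linkage(pdist(X, metric="euclidean"), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")

print("group 1 labels:", set(labels[:20]), "group 2 labels:", set(labels[20:]))
```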

  14. The impact of rigorous mathematical thinking as learning method toward geometry understanding

    NASA Astrophysics Data System (ADS)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-05-01

    Reaching higher-order thinking skills requires mastery of conceptual understanding. Rigorous Mathematical Thinking (RMT) is a unique realization of the cognitive conceptual construction approach based on Feuerstein's Mediated Learning Experience (MLE) theory and Vygotsky's sociocultural theory. This quasi-experimental study compared an experimental class taught with RMT as the learning method against a control class taught with Direct Learning (DL), the conventional approach, to examine whether the two methods had different effects on the conceptual understanding of junior high school students. The data were analyzed with an independent t-test, which showed a significant difference in mean geometry conceptual understanding scores between the experimental and control classes. Semi-structured interviews further indicated that students taught with RMT developed deeper conceptual understanding than students taught conventionally. These results indicate that RMT as a learning method has a positive impact on geometry conceptual understanding.
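    The hypothesis test named here is a standard independent-samples t-test; a sketch with hypothetical score data (not the study's) using SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical post-test geometry scores (illustrative, not the study's data)
rmt_scores = rng.normal(80, 8, 30)   # experimental class (RMT)
dl_scores = rng.normal(70, 8, 30)    # control class (direct learning, DL)

t_stat, p_value = stats.ttest_ind(rmt_scores, dl_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```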

  15. Optical properties of electrohydrodynamic convection patterns: rigorous and approximate methods.

    PubMed

    Bohley, Christian; Heuer, Jana; Stannarius, Ralf

    2005-12-01

    We analyze the optical behavior of two-dimensionally periodic structures that occur in electrohydrodynamic convection (EHC) patterns in nematic sandwich cells. These structures are anisotropic, locally uniaxial, and periodic on the scale of micrometers. For the first time, the optics of these structures is investigated with a rigorous method. The method used for the description of the electromagnetic waves interacting with EHC director patterns is a numerical approach that discretizes directly the Maxwell equations. It works as a space-grid-time-domain method and computes electric and magnetic fields in time steps. This so-called finite-difference-time-domain (FDTD) method is able to generate the fields with arbitrary accuracy. We compare this rigorous method with earlier attempts based on ray-tracing and analytical approximations. Results of optical studies of EHC structures made earlier based on ray-tracing methods are confirmed for thin cells, when the spatial periods of the pattern are sufficiently large. For the treatment of small-scale convection structures, the FDTD method is without alternatives.
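    A minimal 1-D vacuum FDTD sketch illustrates the space-grid-time-domain idea: electric and magnetic fields live on staggered grids and are advanced in leapfrog time steps (normalized units, Courant number 1; the EHC application is fully 2-D and anisotropic, which this toy omits):

```python
import numpy as np

# 1-D vacuum FDTD (Yee scheme), normalized units, Courant number 1
nz, nt = 200, 150
ez = np.zeros(nz)
hy = np.zeros(nz - 1)

for n in range(nt):
    hy += np.diff(ez)                           # update H from the curl of E
    ez[1:-1] += np.diff(hy)                     # update E from the curl of H
    ez[50] += np.exp(-((n - 30.0) / 8.0) ** 2)  # soft Gaussian source

print(f"peak |Ez| after {nt} steps: {np.max(np.abs(ez)):.3f}")
```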

  16. Lower-Secondary Introductory Chemistry Course: A Novel Approach Based on Science-Education Theories, with Emphasis on the Macroscopic Approach, and the Delayed Meaningful Teaching of the Concepts of Molecule and Atom

    ERIC Educational Resources Information Center

    Tsaparlis, Georgios; Kolioulis, Dimitrios; Pappa, Eleni

    2010-01-01

    We present a programme for a novel introductory lower-secondary chemistry course (seventh or eighth grade) that aims at the application of theories of science education, and in particular of conceptual/meaningful learning and of teaching methodology that encourages active and inquiry forms of learning The approach is rigorous with careful use of…

  17. GREGOR: evaluating global enrichment of trait-associated variants in epigenomic features using a systematic, data-driven approach.

    PubMed

    Schmidt, Ellen M; Zhang, Ji; Zhou, Wei; Chen, Jin; Mohlke, Karen L; Chen, Y Eugene; Willer, Cristen J

    2015-08-15

    The majority of variation identified by genome-wide association studies falls in non-coding genomic regions and is hypothesized to impact regulatory elements that modulate gene expression. Here we present a statistically rigorous software tool GREGOR (Genomic Regulatory Elements and Gwas Overlap algoRithm) for evaluating enrichment of any set of genetic variants with any set of regulatory features. Using variants from five phenotypes, we describe a data-driven approach to determine the tissue and cell types most relevant to a trait of interest and to identify the subset of regulatory features likely impacted by these variants. Last, we experimentally evaluate six predicted functional variants at six lipid-associated loci and demonstrate significant evidence for allele-specific impact on expression levels. GREGOR systematically evaluates enrichment of genetic variation with the vast collection of regulatory data available to explore novel biological mechanisms of disease and guide us toward the functional variant at trait-associated loci. GREGOR, including source code, documentation, examples, and executables, is available at http://genome.sph.umich.edu/wiki/GREGOR. cristen@umich.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
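    The enrichment logic can be illustrated with a toy calculation (a heavy simplification of GREGOR, which matches control variants on allele frequency, gene proximity, and LD rather than using a raw genome-wide background): compare the observed overlap of a variant set with regulatory features against the background feature fraction. All data below are simulated:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
genome_size = 1_000_000

# Simulated regulatory annotation: 200 features of 500 bp each
in_feature = np.zeros(genome_size, dtype=bool)
for s in rng.integers(0, genome_size - 500, 200):
    in_feature[s:s + 500] = True

# Simulated trait variants: 30 placed inside features, 70 at random
feature_pos = np.flatnonzero(in_feature)
variants = np.concatenate([rng.choice(feature_pos, 30),
                           rng.integers(0, genome_size, 70)])
overlap = int(in_feature[variants].sum())

# Test observed overlap against the genome-wide feature fraction
p_feature = in_feature.mean()
p_value = stats.binomtest(overlap, len(variants), p_feature,
                          alternative="greater").pvalue
print(f"overlap {overlap}/{len(variants)}, background {p_feature:.3f}, "
      f"p = {p_value:.3g}")
```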

  18. Nanosystem self-assembly pathways discovered via all-atom multiscale analysis.

    PubMed

    Pankavich, Stephen D; Ortoleva, Peter J

    2012-07-26

    We consider the self-assembly of composite structures from a group of nanocomponents, each consisting of particles within an N-atom system. Self-assembly pathways and rates for nanocomposites are derived via a multiscale analysis of the classical Liouville equation. From a reduced statistical framework, rigorous stochastic equations for population levels of beginning, intermediate, and final aggregates are also derived. It is shown that the definition of an assembly type is a self-consistency criterion that must strike a balance between precision and the need for population levels to be slowly varying relative to the time scale of atomic motion. The deductive multiscale approach is complemented by a qualitative notion of multicomponent association and the ensemble of exact atomic-level configurations consistent with them. In processes such as viral self-assembly from proteins and RNA or DNA, there are many possible intermediates, so that it is usually difficult to predict the most efficient assembly pathway. However, in the current study, rates of assembly of each possible intermediate can be predicted. This avoids the need, as in a phenomenological approach, for recalibration with each new application. The method accounts for the feedback across scales in space and time that is fundamental to nanosystem self-assembly. The theory has applications to bionanostructures, geomaterials, engineered composites, and nanocapsule therapeutic delivery systems.

  19. Strand-seq: a unifying tool for studies of chromosome segregation

    PubMed Central

    Falconer, Ester; Lansdorp, Peter M.

    2013-01-01

    Non-random segregation of sister chromatids has been proposed to help specify daughter cell fate (the Silent Sister Hypothesis [1]) or to protect the genome of long-lived stem cells (the Immortal Strand Hypothesis [2]). The idea that sister chromatids are non-randomly segregated into specific daughter cells is only marginally supported by data in sporadic and often contradictory studies. As a result, the field has moved forward rather slowly. The advent of being able to directly label and differentiate sister chromatids in vivo using fluorescence in situ hybridization [3] was a significant advance for such studies. However, this approach is limited by the need for large tracts of unidirectional repeats on chromosomes and the reliance on quantitative imaging of fluorescent probes and rigorous statistical analysis to discern between the two competing hypotheses. A novel method called Strand-seq, which uses next-generation sequencing to assay sister chromatid inheritance patterns independently for each chromosome [4], offers a comprehensive approach to test for non-random segregation. In addition, Strand-seq enables studies on the deposition of chromatin marks in relation to DNA replication. This method is expected to help unify the field by testing previous claims of non-random segregation in an unbiased way in many model systems in vitro and in vivo. PMID:23665005
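    A sketch of the kind of statistical test such segregation data support: with hypothetical counts of cells inheriting one template strand for a given chromosome, a binomial test against the random-segregation null of p = 0.5 (the counts below are invented for illustration):

```python
from scipy import stats

# Hypothetical Strand-seq style counts: cells that inherited the Watson
# template strand for one chromosome, out of cells examined
watson, cells = 36, 50

# Random segregation predicts either template with probability 0.5
res = stats.binomtest(watson, cells, 0.5, alternative="two-sided")
print(f"{watson}/{cells} cells, two-sided p = {res.pvalue:.4f}")
```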

  20. The new production theory for health care through clinical reengineering: a study of clinical guidelines--Part I.

    PubMed

    Sharp, J R

    1994-12-01

    Drucker writes that the emerging theory of manufacturing includes four principles and practices: statistical quality control, manufacturing accounting, modular organization, and systems approach. SQC is a rigorous, scientific method of identifying variation in the quality and productivity of a given production process, with an emphasis on improvement. The new manufacturing economics intends to integrate the production strategy with the business strategy in order to account for the biggest portions of costs that the old methods did not assess: time and automation. Production operations that are both standardized and flexible will allow the organization to keep up with changes in design, technology, and the market. The return on innovation in this environment is predicated on a modular arrangement of flexible steps in the process. Finally, the systems approach sees the entire process as being integrated in converting goods or services into economic satisfaction. There is now a major restructuring of the U.S. health care industry, and the incorporation of these four theories into health care reform would appear to be essential. This two-part article will address two problems: Will Drucker's theories relate to health care (Part I)? Will the "new manufacturing" in health care (practice guidelines) demonstrate cost, quality, and access changes that reform demands (Part II)?

  1. All biology is computational biology.

    PubMed

    Markowetz, Florian

    2017-03-01

    Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.

  2. Application of systematic review methodology to the field of nutrition

    USDA-ARS?s Scientific Manuscript database

    Systematic reviews represent a rigorous and transparent approach of synthesizing scientific evidence that minimizes bias. They evolved within the medical community to support development of clinical and public health practice guidelines, set research agendas and formulate scientific consensus state...

  3. Peer Assessment with Online Tools to Improve Student Modeling

    NASA Astrophysics Data System (ADS)

    Atkins, Leslie J.

    2012-11-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.

  4. Thermodynamics of ideal quantum gas with fractional statistics in D dimensions.

    PubMed

    Potter, Geoffrey G; Müller, Gerhard; Karbach, Michael

    2007-06-01

    We present exact and explicit results for the thermodynamic properties (isochores, isotherms, isobars, response functions, velocity of sound) of a quantum gas in dimensions D ≥ 1 and with fractional exclusion statistics 0 ≤ g ≤ 1 connecting bosons (g=0) and fermions (g=1). In D=1 the results are equivalent to those of the Calogero-Sutherland model. Emphasis is given to the crossover between bosonlike and fermionlike features, caused by aspects of the statistical interaction that mimic long-range attraction and short-range repulsion. A phase transition along the isobar occurs at a nonzero temperature in all dimensions. The T dependence of the velocity of sound is in simple relation to isochores and isobars. The effects of soft container walls are accounted for rigorously for the case of a pure power-law potential.
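    The statistics interpolating between these limits can be computed numerically. A sketch based on the mean occupation number of fractional exclusion statistics (Wu's form, solved here by root finding; g = 0 recovers Bose-Einstein and g = 1 Fermi-Dirac occupation):

```python
import numpy as np
from scipy.optimize import brentq

def occupation(x, g):
    """Mean occupation at x = (eps - mu)/kT for exclusion statistics g.

    Solves w**g * (1 + w)**(1 - g) = exp(x) for w (Wu's equation), then
    n = 1 / (w + g); g = 0 gives Bose-Einstein, g = 1 Fermi-Dirac.
    """
    if g == 0.0:
        return 1.0 / np.expm1(x)            # Bose-Einstein
    if g == 1.0:
        return 1.0 / (np.exp(x) + 1.0)      # Fermi-Dirac
    f = lambda w: g * np.log(w) + (1.0 - g) * np.log1p(w) - x
    w = brentq(f, 1e-12, 1e12)              # left side is monotone in w
    return 1.0 / (w + g)

for g in (0.0, 0.25, 0.5, 1.0):
    print(f"g = {g:4.2f}: n(x=1) = {occupation(1.0, g):.4f}")
```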

  5. Towards fast, rigorous and efficient conformational sampling of biomolecules: Advances in accelerated molecular dynamics.

    PubMed

    Doshi, Urmi; Hamelberg, Donald

    2015-05-01

    Accelerated molecular dynamics (aMD) has been proven to be a powerful biasing method for enhanced sampling of biomolecular conformations on general-purpose computational platforms. Biologically important long timescale events that are beyond the reach of standard molecular dynamics can be accessed without losing the detailed atomistic description of the system in aMD. Over other biasing methods, aMD offers the advantages of tuning the level of acceleration to access the desired timescale without any advance knowledge of the reaction coordinate. Recent advances in the implementation of aMD and its applications to small peptides and biological macromolecules are reviewed here along with a brief account of all the aMD variants introduced in the last decade. In comparison to the original implementation of aMD, the recent variant in which all the rotatable dihedral angles are accelerated (RaMD) exhibits faster convergence rates and significant improvement in statistical accuracy of retrieved thermodynamic properties. RaMD in conjunction with accelerating diffusive degrees of freedom, i.e. dual boosting, has been rigorously tested for the most difficult conformational sampling problem, protein folding. It has been shown that RaMD with dual boosting is capable of efficiently sampling multiple folding and unfolding events in small fast folding proteins. RaMD with the dual boost approach opens exciting possibilities for sampling multiple timescales in biomolecules. While equilibrium properties can be recovered satisfactorily from aMD-based methods, directly obtaining dynamics and kinetic rates for larger systems presents a future challenge. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014 Elsevier B.V. All rights reserved.
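    The central aMD idea, raising basins that lie below a threshold energy E so barriers shrink while the landscape above E is untouched, can be sketched on a toy double-well potential (the dV form below is the widely used Hamelberg-style boost; units and parameters are illustrative):

```python
import numpy as np

def amd_boost(v, e, alpha):
    """aMD boost dV = (E - V)^2 / (alpha + E - V), applied only where V < E."""
    v = np.asarray(v, dtype=float)
    dv = np.zeros_like(v)
    m = v < e
    dv[m] = (e - v[m]) ** 2 / (alpha + e - v[m])
    return dv

# Toy 1-D double-well landscape: minima at x = +-1, barrier height 1 at x = 0
x = np.linspace(-2.0, 2.0, 401)
v = (x ** 2 - 1.0) ** 2
v_star = v + amd_boost(v, e=0.8, alpha=0.2)

barrier = v[200] - v.min()              # original barrier (x = 0 vs minima)
barrier_star = v_star[200] - v_star.min()
print(f"barrier: {barrier:.2f} -> {barrier_star:.2f} on the boosted surface")
```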

  6. Improving de novo sequence assembly using machine learning and comparative genomics for overlap correction.

    PubMed

    Palmer, Lance E; Dejori, Mathaeus; Bolanos, Randall; Fasulo, Daniel

    2010-01-15

    With the rapid expansion of DNA sequencing databases, it is now feasible to identify relevant information from prior sequencing projects and completed genomes and apply it to de novo sequencing of new organisms. As an example, this paper demonstrates how such extra information can be used to improve de novo assemblies by augmenting the overlapping step. Finding all pairs of overlapping reads is a key task in many genome assemblers, and to this end, highly efficient algorithms have been developed to find alignments in large collections of sequences. It is well known that due to repeated sequences, many aligned pairs of reads nevertheless do not overlap. But no overlapping algorithm to date takes a rigorous approach to separating aligned but non-overlapping read pairs from true overlaps. We present an approach that extends the Minimus assembler by a data driven step to classify overlaps as true or false prior to contig construction. We trained several different classification models within the Weka framework using various statistics derived from overlaps of reads available from prior sequencing projects. These statistics included percent mismatch and k-mer frequencies within the overlaps as well as a comparative genomics score derived from mapping reads to multiple reference genomes. We show that in real whole-genome sequencing data from the E. coli and S. aureus genomes, by providing a curated set of overlaps to the contigging phase of the assembler, we nearly doubled the median contig length (N50) without sacrificing coverage of the genome or increasing the number of mis-assemblies. Machine learning methods that use comparative and non-comparative features to classify overlaps as true or false can be used to improve the quality of a sequence assembly.
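    A sketch of the classification step on synthetic overlap features (percent mismatch and a repeat-likeness score, both fabricated here); a tiny NumPy logistic regression stands in for the Weka models used in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400

# Fabricated overlap features: true overlaps (y = 1) tend to have low
# percent mismatch and a low repeat-likeness score; false overlaps the
# opposite. Real features would come from the alignments themselves.
y = rng.integers(0, 2, n)
mismatch = rng.normal(np.where(y == 1, 1.0, 4.0), 1.0)
repeat = rng.normal(np.where(y == 1, 0.2, 0.8), 0.2)
X = np.column_stack([np.ones(n), mismatch, repeat])

# Tiny logistic regression fitted by gradient descent
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.05 * X.T @ (p - y) / n

acc = float(np.mean(((1.0 / (1.0 + np.exp(-X @ w))) > 0.5) == y))
print(f"training accuracy: {acc:.2f}")
```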

  7. A theoretical Gaussian framework for anomalous change detection in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Acito, Nicola; Diani, Marco; Corsini, Giovanni

    2017-10-01

    Exploitation of temporal series of hyperspectral images is a relatively new discipline that has a wide variety of possible applications in fields like remote sensing, area surveillance, defense and security, search and rescue and so on. In this work, we discuss how images taken at two different times can be processed to detect changes caused by insertion, deletion or displacement of small objects in the monitored scene. This problem is known in the literature as anomalous change detection (ACD) and it can be viewed as the extension, to the multitemporal case, of the well-known anomaly detection problem in a single image. In fact, in both cases, the hyperspectral images are processed blindly in an unsupervised manner and without a priori knowledge about the target spectrum. We introduce the ACD problem using an approach based on statistical decision theory and we derive a common framework including different ACD approaches. Particularly, we clearly define the observation space, the data statistical distribution conditioned to the two competing hypotheses and the procedure followed to arrive at the solution. The proposed overview places emphasis on techniques based on the multivariate Gaussian model that allows a formal presentation of the ACD problem and the rigorous derivation of the possible solutions in a way that is both mathematically more tractable and easier to interpret. We also discuss practical problems related to the application of the detectors in the real world and present affordable solutions. Namely, we describe the ACD processing chain including the strategies that are commonly adopted to compensate pervasive radiometric changes, caused by the different illumination/atmospheric conditions, and to mitigate the residual geometric image co-registration errors. Results obtained on real, freely available data are discussed in order to test and compare the methods within the proposed general framework.
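    The Gaussian core of ACD can be sketched as follows: stack the two acquisitions pixel-wise, fit a joint multivariate Gaussian, and score each pixel by its Mahalanobis distance, so changes inconsistent with the global cross-time correlation stand out (toy data below; real pipelines add the radiometric compensation and co-registration handling described in the abstract):

```python
import numpy as np

rng = np.random.default_rng(4)
npix, nbands = 2500, 5

# Toy two-time image pair: the second acquisition is the first plus small
# noise (pervasive, correlated background), with one anomalous change.
x = rng.normal(0.0, 1.0, (npix, nbands))
y = x + rng.normal(0.0, 0.1, (npix, nbands))
y[0] += 5.0                      # inserted anomalous change at pixel 0

# Joint Gaussian model of the stacked pixel vectors z = [x, y]
z = np.hstack([x, y])
zc = z - z.mean(axis=0)
icov = np.linalg.inv(np.cov(z, rowvar=False))
d2 = np.einsum("ij,jk,ik->i", zc, icov, zc)    # squared Mahalanobis distances

print("most anomalous pixel:", int(np.argmax(d2)))
```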

  8. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Öztürk, Hande; Noyan, I. Cevdet

    A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears as a special case, limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  10. Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.

    PubMed Central

    Mulvany, M J

    1975-01-01

    1. Methods have been developed for describing the length: tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 l₀ was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistances and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023

  11. DIMM-SC: a Dirichlet mixture model for clustering droplet-based single cell transcriptomic data.

    PubMed

    Sun, Zhe; Wang, Ting; Deng, Ke; Wang, Xiao-Feng; Lafyatis, Robert; Ding, Ying; Hu, Ming; Chen, Wei

    2018-01-01

    Single cell transcriptome sequencing (scRNA-Seq) has become a revolutionary tool to study cellular and molecular processes at single cell resolution. Among existing technologies, the recently developed droplet-based platform enables efficient parallel processing of thousands of single cells with direct counting of transcript copies using Unique Molecular Identifier (UMI). Despite the technology advances, statistical methods and computational tools are still lacking for analyzing droplet-based scRNA-Seq data. Particularly, model-based approaches for clustering large-scale single cell transcriptomic data are still under-explored. We developed DIMM-SC, a Dirichlet Mixture Model for clustering droplet-based Single Cell transcriptomic data. This approach explicitly models UMI count data from scRNA-Seq experiments and characterizes variations across different cell clusters via a Dirichlet mixture prior. We performed comprehensive simulations to evaluate DIMM-SC and compared it with existing clustering methods such as K-means, CellTree and Seurat. In addition, we analyzed public scRNA-Seq datasets with known cluster labels and in-house scRNA-Seq datasets from a study of systemic sclerosis with prior biological knowledge to benchmark and validate DIMM-SC. Both simulation studies and real data applications demonstrated that overall, DIMM-SC achieves substantially improved clustering accuracy and much lower clustering variability compared to other existing clustering methods. More importantly, as a model-based approach, DIMM-SC is able to quantify the clustering uncertainty for each single cell, facilitating rigorous statistical inference and biological interpretations, which are typically unavailable from existing clustering methods. DIMM-SC has been implemented in a user-friendly R package with a detailed tutorial available on www.pitt.edu/∼wec47/singlecell.html. wei.chen@chp.edu or hum@ccf.org. Supplementary data are available at Bioinformatics online. 
© The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
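    The model's scoring step can be illustrated with a Dirichlet-multinomial log-likelihood: each cluster carries a Dirichlet prior over genes, and a cell's UMI counts are assigned to the cluster under which they are most probable (the priors and counts below are invented for illustration; DIMM-SC itself estimates these quantities from data):

```python
import numpy as np
from scipy.special import gammaln

def dm_loglik(x, alpha):
    """Dirichlet-multinomial log-likelihood of count vector x given a
    Dirichlet prior alpha (multinomial coefficient dropped: it is the
    same for every cluster and cancels in comparisons)."""
    A, n = alpha.sum(), x.sum()
    return (gammaln(A) - gammaln(n + A)
            + np.sum(gammaln(x + alpha) - gammaln(alpha)))

# Invented cluster priors over 4 genes, and one cell's UMI counts
alpha1 = np.array([10.0, 1.0, 1.0, 1.0])   # cluster 1: gene 1 dominant
alpha2 = np.array([1.0, 1.0, 1.0, 10.0])   # cluster 2: gene 4 dominant
cell = np.array([50.0, 3.0, 2.0, 1.0])

scores = [dm_loglik(cell, a) for a in (alpha1, alpha2)]
print("assigned cluster:", int(np.argmax(scores)) + 1)
```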

  12. Linguistic Diversity and Traffic Accidents: Lessons from Statistical Studies of Cultural Traits

    PubMed Central

    Roberts, Seán; Winters, James

    2013-01-01

    The recent proliferation of digital databases of cultural and linguistic data, together with new statistical techniques becoming available, has led to a rise in so-called nomothetic studies [1]–[8]. These seek relationships between demographic variables and cultural traits from large, cross-cultural datasets. The insights from these studies are important for understanding how cultural traits evolve. While these studies are fascinating and are good at generating testable hypotheses, they may underestimate the probability of finding spurious correlations between cultural traits. Here we show that this kind of approach can find links between such unlikely cultural traits as traffic accidents, levels of extra-marital sex, political collectivism and linguistic diversity. This suggests that spurious correlations, due to historical descent, geographic diffusion or increased noise-to-signal ratios in large datasets, are much more likely than some studies admit. We suggest some criteria for the evaluation of nomothetic studies and some practical solutions to the problems. Since some of these studies are receiving media attention without a widespread understanding of the complexities of the issue, there is a risk that poorly controlled studies could affect policy. We hope to contribute towards a general skepticism for correlational studies by demonstrating the ease of finding apparently rigorous correlations between cultural traits. Despite this, we see well-controlled nomothetic studies as useful tools for the development of theories. PMID:23967132
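    The ease of finding spurious correlations is simple to demonstrate: two traits that each drift (independent random walks, a stand-in for historical descent or geographic diffusion) are declared "significantly" correlated by a naive Pearson test far more often than the nominal 5% rate. A sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
trials, length = 200, 50

# Two independent "cultural traits" that each drift over time (random walks)
spurious = 0
for _ in range(trials):
    a = np.cumsum(rng.normal(size=length))
    b = np.cumsum(rng.normal(size=length))
    if stats.pearsonr(a, b)[1] < 0.05:
        spurious += 1

print(f"'significant' correlations: {spurious}/{trials}")
```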

  14. Statistical tests of simple earthquake cycle models

    NASA Astrophysics Data System (ADS)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM < 4.0 × 1019 Pa s and ηM > 4.6 × 1020 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
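As a hedged sketch of the kind of test used here (synthetic stand-in data and a numpy-only KS statistic, not the authors' code or observations), a two-sample Kolmogorov-Smirnov comparison between observed and model-predicted quantities might look like:

```python
import numpy as np

def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap between
    the two empirical cumulative distribution functions."""
    data = np.sort(np.concatenate([x, y]))
    cdf_x = np.searchsorted(np.sort(x), data, side="right") / len(x)
    cdf_y = np.searchsorted(np.sort(y), data, side="right") / len(y)
    return np.max(np.abs(cdf_x - cdf_y))

rng = np.random.default_rng(0)
observed = rng.normal(1.0, 0.2, 200)   # hypothetical observed rate ratios
predicted = rng.normal(1.5, 0.2, 200)  # ratios from one candidate cycle model

d = ks_statistic(observed, predicted)
n = m = 200
d_crit = 1.358 * np.sqrt((n + m) / (n * m))  # asymptotic 5% critical value
reject = d > d_crit  # model inconsistent with the observations at alpha = 0.05
```

A model whose predicted distribution is rejected at the chosen significance level would, in this spirit, be excluded from the admissible set.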

  15. Towards tests of quark-hadron duality with functional analysis and spectral function data

    NASA Astrophysics Data System (ADS)

    Boito, Diogo; Caprini, Irinel

    2017-04-01

    The presence of terms that violate quark-hadron duality in the expansion of QCD Green's functions is a generally accepted fact. Recently, a new approach was proposed for the study of duality violations (DVs), which exploits the existence of a rigorous lower bound on the functional distance, measured in a certain norm, between a "true" correlator and its approximant calculated theoretically along a contour in the complex energy plane. In the present paper, we pursue the investigation of functional-analysis-based tests towards their application to real spectral function data. We derive a closed analytic expression for the minimal functional distance based on the general weighted L2 norm and discuss its relation with the distance measured in the L∞ norm. Using fake data sets obtained from a realistic toy model in which we allow for covariances inspired from the publicly available ALEPH spectral functions, we obtain, by Monte Carlo simulations, the statistical distribution of the strength parameter that measures the magnitude of the DV term added to the usual operator product expansion. The results show that, if the region with large errors near the end point of the spectrum in τ decays is excluded, the functional-analysis-based tests using either L2 or L∞ norms are able to detect, in a statistically significant way, the presence of DVs in realistic spectral function pseudodata.

  16. Landscape metrics, scales of resolution

    Treesearch

    Samuel A. Cushman; Kevin McGarigal

    2008-01-01

    Effective implementation of the "multiple path" approach to managing green landscapes depends fundamentally on rigorous quantification of the composition and structure of the landscapes of concern at present, modelling landscape structure trajectories under alternative management paths, and monitoring landscape structure into the future to confirm...

  17. Autism and Pervasive Developmental Disorders

    ERIC Educational Resources Information Center

    Volkmar, Fred R.; Lord, Catherine; Bailey, Anthony; Schultz, Robert T.; Klin, Ami

    2004-01-01

    The quantity and quality of research into autism and related conditions have increased dramatically in recent years. Consequently we selectively review key accomplishments and highlight directions for future research. More consistent approaches to diagnosis and more rigorous assessment methods have significantly advanced research, although the…

  18. Endobiogeny: a global approach to systems biology (part 1 of 2).

    PubMed

    Lapraz, Jean-Claude; Hedayat, Kamyar M

    2013-01-01

Endobiogeny is a global systems approach to human biology that may offer an advancement in clinical medicine, based on scientific principles of rigor and experimentation and on the humanistic principles of individualization of care and alleviation of suffering with minimization of harm. Endobiogeny is neither a movement away from modern science nor an uncritical embracing of pre-rational methods of inquiry, but a synthesis of quantitative and qualitative relationships reflected in a systems approach to life and based on new mathematical paradigms of pattern recognition.

  19. Augmented assessment as a means to augmented reality.

    PubMed

    Bergeron, Bryan

    2006-01-01

Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that have not undergone timely, rigorous evaluation highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object-oriented programming.

  20. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    NASA Astrophysics Data System (ADS)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.

  1. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
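The full SEP likelihood is involved; as an illustrative sketch only (a Gaussian simplification of a lag-1 autocorrelated, heteroscedastic error model, not the BAIPU likelihood itself, with hypothetical residuals), the log-likelihood might be computed as:

```python
import numpy as np

def ar1_hetero_loglik(resid, phi, sigma):
    """Gaussian log-likelihood of model residuals under a lag-1 autocorrelated
    (|phi| < 1), heteroscedastic error model: residuals are standardized by
    sigma (scalar or per-point vector), then treated as a stationary AR(1)
    series with unit marginal variance."""
    resid = np.asarray(resid, float)
    sigma = np.broadcast_to(np.asarray(sigma, float), resid.shape)
    z = resid / sigma                    # remove heteroscedasticity
    innov = z[1:] - phi * z[:-1]         # AR(1) innovations
    s2 = 1.0 - phi ** 2                  # innovation variance (unit marginal)
    ll = -0.5 * (z[0] ** 2 + np.log(2 * np.pi))       # stationary first point
    ll += -0.5 * np.sum(innov ** 2 / s2 + np.log(2 * np.pi * s2))
    ll -= np.sum(np.log(sigma))          # Jacobian of the standardization
    return ll

ll = ar1_hetero_loglik([0.1, -0.2, 0.3, 0.0], phi=0.4, sigma=[1.0, 1.1, 1.2, 1.0])
```

With phi = 0 and sigma = 1 this reduces to the ordinary i.i.d. Gaussian log-likelihood, which is a useful sanity check when embedding such a function in an MCMC sampler.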

  2. Tensile Properties of Dyneema SK76 Single Fibers at Multiple Loading Rates Using a Direct Gripping Method

    DTIC Science & Technology

    2014-06-01

lower density compared with aramid fibers such as Kevlar and Twaron. Numerical modeling is used to design more effective fiber-based composite armor...in measuring fibers and doing experiments. Aramid fibers such as Kevlar (DuPont) and Twaron...methyl methacrylate blocks. The efficacy of this method to grip Kevlar fibers has been rigorously studied using a variety of statistical methods

  3. Examining the Statistical Rigor of Test and Evaluation Results in the Live, Virtual and Constructive Environment

    DTIC Science & Technology

    2011-06-01

Committee Meeting. 23 June 2008. Bjorkman, Eileen A. and Frank B. Gray. "Testing in a Joint Environment 2004-2008: Findings, Conclusions and...the LVC joint test environment to evaluate system performance and joint mission effectiveness (Bjorkman and Gray 2009a). The LVC battlespace...attack (Bjorkman and Gray 2009b). Figure 3 - JTEM Methodology (Bjorkman 2008) A key INTEGRAL FIRE lesson learned was realizing the need for each

  4. In Search of Golden Rules: Comment on Hypothesis-Testing Approaches to Setting Cutoff Values for Fit Indexes and Dangers in Overgeneralizing Hu and Bentler's (1999) Findings

    ERIC Educational Resources Information Center

    Marsh, Herbert W.; Hau, Kit-Tai; Wen, Zhonglin

    2004-01-01

Goodness-of-fit (GOF) indexes provide "rules of thumb": recommended cutoff values for assessing fit in structural equation modeling. Hu and Bentler (1999) proposed a more rigorous approach to evaluating decision rules based on GOF indexes and, on this basis, proposed new and more stringent cutoff values for many indexes. This article discusses…

  5. Open Pit Mine 3d Mapping by Tls and Digital Photogrammetry: 3d Model Update Thanks to a Slam Based Approach

    NASA Astrophysics Data System (ADS)

    Vassena, G.; Clerici, A.

    2018-05-01

The state of the art of 3D surveying technologies, if correctly applied, allows one to obtain 3D coloured models of large open pit mines using different technologies such as terrestrial laser scanning (TLS), with images, combined with UAV-based digital photogrammetry. GNSS and/or total stations are also currently used to georeference the model. The University of Brescia has carried out a project to map in 3D an open pit mine located in Botticino, a famous site of marble extraction close to Brescia in northern Italy. Terrestrial laser scanner 3D point clouds, combined with RGB images and digital photogrammetry from UAV, have been used to map a large part of the quarry. By rigorous and well-known procedures, a 3D point cloud and mesh model have been obtained using an easy and rigorous approach. After describing the combined mapping process, the paper presents the innovative process proposed for the daily/weekly update of the model itself. To realise this task, a SLAM-based approach is described, using an instrument capable of running an automatic localization process and real-time, on-site change detection analysis.

  6. Qualitative Methods in Field Research: An Indonesian Experience in Community Based Practice.

    ERIC Educational Resources Information Center

    Lysack, Catherine L.; Krefting, Laura

    1994-01-01

    Cross-cultural evaluation of a community-based rehabilitation project in Indonesia used three methods: focus groups, questionnaires, and key informant interviews. A continuous cyclical approach to data collection and concern for cultural sensitivity increased the rigor of the research. (SK)

  7. Symmetry Properties of Potentiometric Titration Curves.

    ERIC Educational Resources Information Center

    Macca, Carlo; Bombi, G. Giorgio

    1983-01-01

    Demonstrates how the symmetry properties of titration curves can be efficiently and rigorously treated by means of a simple method, assisted by the use of logarithmic diagrams. Discusses the symmetry properties of several typical titration curves, comparing the graphical approach and an explicit mathematical treatment. (Author/JM)

  8. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.

  9. Blended Learning: An Innovative Approach

    ERIC Educational Resources Information Center

    Lalima; Dangwal, Kiran Lata

    2017-01-01

Blended learning is an innovative concept that embraces the advantages of both traditional classroom teaching and ICT-supported learning, including both offline and online learning. It has scope for collaborative learning, constructive learning, and computer-assisted learning (CAI). Blended learning needs rigorous efforts, right…

  10. Automatic Prediction of Protein 3D Structures by Probabilistic Multi-template Homology Modeling.

    PubMed

    Meier, Armin; Söding, Johannes

    2015-10-01

    Homology modeling predicts the 3D structure of a query protein based on the sequence alignment with one or more template proteins of known structure. Its great importance for biological research is owed to its speed, simplicity, reliability and wide applicability, covering more than half of the residues in protein sequence space. Although multiple templates have been shown to generally increase model quality over single templates, the information from multiple templates has so far been combined using empirically motivated, heuristic approaches. We present here a rigorous statistical framework for multi-template homology modeling. First, we find that the query proteins' atomic distance restraints can be accurately described by two-component Gaussian mixtures. This insight allowed us to apply the standard laws of probability theory to combine restraints from multiple templates. Second, we derive theoretically optimal weights to correct for the redundancy among related templates. Third, a heuristic template selection strategy is proposed. We improve the average GDT-ha model quality score by 11% over single template modeling and by 6.5% over a conventional multi-template approach on a set of 1000 query proteins. Robustness with respect to wrong constraints is likewise improved. We have integrated our multi-template modeling approach with the popular MODELLER homology modeling software in our free HHpred server http://toolkit.tuebingen.mpg.de/hhpred and also offer open source software for running MODELLER with the new restraints at https://bitbucket.org/soedinglab/hh-suite.
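As a minimal sketch of the probability-theory step described here (a single-Gaussian simplification of the paper's two-component mixtures, with hypothetical distances and weights), independent per-template distance restraints combine by multiplying densities into a precision-weighted Gaussian:

```python
import numpy as np

def combine_gaussian_restraints(means, sigmas, weights=None):
    """Combine independent Gaussian distance restraints from several templates
    by multiplying their densities; the product is again Gaussian, with
    precision-weighted mean and pooled precision. Optional weights in (0, 1]
    can down-weight redundant (closely related) templates."""
    means = np.asarray(means, float)
    prec = 1.0 / np.asarray(sigmas, float) ** 2   # precision = 1 / variance
    if weights is not None:
        prec = prec * np.asarray(weights, float)  # soften redundant templates
    mu = np.sum(prec * means) / np.sum(prec)      # precision-weighted mean
    sigma = 1.0 / np.sqrt(np.sum(prec))           # combined uncertainty
    return mu, sigma

# Three hypothetical templates restraining the same atomic distance (angstroms)
mu, sigma = combine_gaussian_restraints([5.0, 5.4, 6.0], [0.5, 0.5, 1.0])
```

Note how the combined sigma is smaller than any single template's: agreement among templates tightens the restraint, which is the intuition behind multi-template gains.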

  11. Robust inference for responder analysis: Innovative clinical trial design using a minimum p-value approach.

    PubMed

    Lin, Yunzhi

    2016-08-15

Responder analysis is in common use in clinical trials and has been described and endorsed in regulatory guidance documents, especially in trials where "soft" clinical endpoints such as rating scales are used. The procedure is useful because responder rates can be understood more intuitively than a difference in means of rating scales. However, two major issues arise: 1) such dichotomized outcomes are inefficient in terms of using the information available and can seriously reduce the power of the study; and 2) the results of clinical trials depend considerably on the response cutoff chosen, yet in many disease areas there is no consensus as to the most appropriate cutoff. This article addresses these two issues, offering a novel approach for responder analysis that could both improve the power of responder analysis and explore different responder cutoffs when an agreed-upon common cutoff does not exist. Specifically, we propose a statistically rigorous clinical trial design that pre-specifies multiple tests of responder rates between treatment groups based on a range of pre-specified responder cutoffs, and uses the minimum of the p-values for formal inference. The critical value for hypothesis testing comes from permutation distributions. Simulation studies are carried out to examine the finite sample performance of the proposed method. We demonstrate that the new method substantially improves the power of responder analysis and, in certain cases, yields power approaching that of the analysis using the original continuous (or ordinal) measure.
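A hedged sketch of the design described here (simplified two-proportion z-tests at each cutoff and hypothetical rating-scale data; not the authors' implementation):

```python
import math
import numpy as np

def min_p_responder_test(treat, control, cutoffs, n_perm=500, seed=0):
    """Pre-specified multi-cutoff responder analysis: compute a two-proportion
    z-test p-value at each cutoff, keep the minimum, and calibrate that minimum
    against its permutation distribution under the null of no group difference."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([treat, control])
    n_t = len(treat)

    def min_p(t, c):
        ps = []
        for cut in cutoffs:
            r_t, r_c = np.mean(t >= cut), np.mean(c >= cut)   # responder rates
            p_all = np.mean(np.concatenate([t, c]) >= cut)    # pooled rate
            se = math.sqrt(p_all * (1 - p_all) * (1 / len(t) + 1 / len(c)))
            if se == 0:
                ps.append(1.0)  # degenerate cutoff: everyone (or no one) responds
            else:
                z = (r_t - r_c) / se
                ps.append(math.erfc(abs(z) / math.sqrt(2)))   # two-sided normal p
        return min(ps)

    observed = min_p(treat, control)
    null = [min_p(*np.split(rng.permutation(pooled), [n_t])) for _ in range(n_perm)]
    return np.mean(np.array(null) <= observed)  # permutation-calibrated p-value

rng = np.random.default_rng(42)
treat = rng.normal(1.0, 1.0, 80)    # hypothetical rating-scale changes
control = rng.normal(0.0, 1.0, 80)
p = min_p_responder_test(treat, control, cutoffs=[0.0, 0.5, 1.0])
```

The permutation step is what keeps the minimum over cutoffs honest: taking the smallest of several correlated p-values without recalibration would inflate the type I error.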

  12. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    PubMed

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has governed, since that time, transfers of release monographs from R&D sites to manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, and, in particular, the risk management perspective introduced by ICH Q9. Although some alternate strategies have been introduced in our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays relying on very critical quality attributes. This is conducted with the aim of controlling the most important consumer's risks involved at two levels in analytical decisions within transfer studies: the risk, for the receiving laboratory, of making poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite its insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process established within our company for better integration of the transfer study into the method life cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Tank waste remediation system immobilized high-level waste storage project configuration management implementation plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgard, K.G.

This Configuration Management Implementation Plan was developed to assist in the management of systems, structures, and components; to facilitate the effective control and statusing of changes to systems, structures, and components; and to ensure technical consistency between design, performance, and operational requirements. Its purpose is to describe the approach Project W-464 will take in implementing configuration management control, to determine the rigor of control, and to identify the mechanisms for imposing that control.

  14. Balancing Methodological Rigor and the Needs of Research Participants: A Debate on Alternative Approaches to Sensitive Research.

    PubMed

    Chan, T M Simon; Teram, Eli; Shaw, Ian

    2017-01-01

    Despite growing consideration of the needs of research participants in studies related to sensitive issues, discussions of alternative ways to design sensitive research are scarce. Structured as an exchange between two researchers who used different approaches in their studies with childhood sexual abuse survivors, in this article, we seek to advance understanding of methodological and ethical issues in designing sensitive research. The first perspective, which is termed protective, promotes the gradual progression of participants from a treatment phase into a research phase, with the ongoing presence of a researcher and a social worker in both phases. In the second perspective, which is termed minimalist, we argue for clear boundaries between research and treatment processes, limiting the responsibility of researchers to ensuring that professional support is available to participants who experience emotional difficulties. Following rebuttals, lessons are drawn for ethical balancing between methodological rigor and the needs of participants. © The Author(s) 2015.

  15. Finite state projection based bounds to compare chemical master equation models using single-cell data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, Zachary; Neuert, Gregor; Department of Pharmacology, School of Medicine, Vanderbilt University, Nashville, Tennessee 37232

    2016-08-21

Emerging techniques now allow for precise quantification of distributions of biological molecules in single cells. These rapidly advancing experimental methods have created a need for more rigorous and efficient modeling tools. Here, we derive new bounds on the likelihood that observations of single-cell, single-molecule responses come from a discrete stochastic model, posed in the form of the chemical master equation. These strict upper and lower bounds are based on a finite state projection approach, and they converge monotonically to the exact likelihood value. These bounds allow one to discriminate rigorously between models and with a minimum level of computational effort. In practice, these bounds can be incorporated into stochastic model identification and parameter inference routines, which improve the accuracy and efficiency of endeavors to analyze and predict single-cell behavior. We demonstrate the applicability of our approach using simulated data for three example models as well as for experimental measurements of a time-varying stochastic transcriptional response in yeast.

  16. Phytoremediation of hazardous wastes. Technical report, 23--26 July 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCutcheon, S.C.; Wolfe, N.L.; Carreria, L.H.

    1995-07-26

A new and innovative approach to phytoremediation (the use of plants to degrade hazardous contaminants) was developed. The new approach to phytoremediation involves rigorous pathway analyses, mass balance determinations, and identification of specific enzymes that break down trinitrotoluene (TNT), other explosives (RDX and HMX), nitrobenzene, and chlorinated solvents (e.g., TCE and PCE) (EPA 1994). As a good example, TNT is completely and rapidly degraded by nitroreductase and laccase enzymes. The aromatic ring is broken and the carbon in the ring fragments is incorporated into new plant fiber, as part of the natural lignification process. Half-lives for TNT degradation approach 1 hr or less under ideal laboratory conditions. Continuous-flow pilot studies indicate that scale-up residence times in created wetlands may be two to three times longer than in laboratory batch studies. The use of created wetlands and land farming techniques guided by rigorous field biochemistry and ecology promises to be a vital part of a newly evolving field, ecological engineering.

  17. Phytoremediation of hazardous wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCutcheon, S.C.; Wolfe, N.L.; Carreria, L.H.

    1995-11-01

A new and innovative approach to phytoremediation (the use of plants to degrade hazardous contaminants) was developed. The new approach to phytoremediation involves rigorous pathway analyses, mass balance determinations, and identification of specific enzymes that break down trinitrotoluene (TNT), other explosives (RDX and HMX), nitrobenzene, and chlorinated solvents (e.g., TCE and PCE) (EPA 1994). As a good example, TNT is completely and rapidly degraded by nitroreductase and laccase enzymes. The aromatic ring is broken and the carbon in the ring fragments is incorporated into new plant fiber, as part of the natural lignification process. Half-lives for TNT degradation approach 1 hr or less under ideal laboratory conditions. Continuous-flow pilot studies indicate that scale-up residence times in created wetlands may be two to three times longer than in laboratory batch studies. The use of created wetlands and land farming techniques guided by rigorous field biochemistry and ecology promises to be a vital part of a newly evolving field, ecological engineering.

  18. Methods used and lessons learnt in conducting document reviews of medical and allied health curricula - a key step in curriculum evaluation.

    PubMed

    Rohwer, Anke; Schoonees, Anel; Young, Taryn

    2014-11-02

    This paper describes the process, our experience and the lessons learnt in doing document reviews of health science curricula. Since we could not find relevant literature to guide us on how to approach these reviews, we feel that sharing our experience would benefit researchers embarking on similar projects. We followed a rigorous, transparent, pre-specified approach that included the preparation of a protocol, a pre-piloted data extraction form and coding schedule. Data were extracted, analysed and synthesised. Quality checks were included at all stages of the process. The main lessons we learnt related to time and project management, continuous quality assurance, selecting the software that meets the needs of the project, involving experts as needed and disseminating the findings to relevant stakeholders. A complete curriculum evaluation comprises, apart from a document review, interviews with students and lecturers to assess the learnt and taught curricula respectively. Rigorous methods must be used to ensure an objective assessment.

  19. Shear-induced opening of the coronal magnetic field

    NASA Technical Reports Server (NTRS)

    Wolfson, Richard

    1995-01-01

This work describes the evolution of a model solar corona in response to motions of the footpoints of its magnetic field. The mathematics involved is semianalytic, with the only numerical solution being that of an ordinary differential equation. This approach, while lacking the flexibility and physical details of full MHD simulations, allows for very rapid computation along with complete and rigorous exploration of the model's implications. We find that the model coronal field bulges upward, at first slowly and then more dramatically, in response to footpoint displacements. The energy in the field rises monotonically from that of the initial potential state, and the field configuration and energy approach asymptotically those of a fully open field. Concurrently, electric currents develop and concentrate into a current sheet as the limiting case of the open field is approached. Examination of the equations shows rigorously that in the asymptotic limit of the fully open field, the current layer becomes a true ideal MHD singularity.

  20. Including Magnetostriction in Micromagnetic Models

    NASA Astrophysics Data System (ADS)

    Conbhuí, Pádraig Ó.; Williams, Wyn; Fabian, Karl; Nagy, Lesleis

    2016-04-01

    The magnetic anomalies that identify crustal spreading are predominantly recorded by basalts formed at the mid-ocean ridges, whose magnetic signals are dominated by iron-titanium-oxides (Fe3-xTixO4), so called "titanomagnetites", of which the Fe2.4Ti0.6O4 (TM60) phase is the most common. With sufficient quantities of titanium present, these minerals exhibit strong magnetostriction. To date, models of these grains in the pseudo-single domain (PSD) range have failed to accurately account for this effect. In particular, a popular analytic treatment provided by Kittel (1949) for describing the magnetostrictive energy as an effective increase of the anisotropy constant can produce unphysical strains for non-uniform magnetizations. I will present a rigorous approach based on work by Brown (1966) and by Kroner (1958) for including magnetostriction in micromagnetic codes which is suitable for modelling hysteresis loops and finding remanent states in the PSD regime. Preliminary results suggest the more rigorously defined micromagnetic models exhibit higher coercivities and extended single domain ranges when compared to more simplistic approaches.

  1. GIA Model Statistics for GRACE Hydrology, Cryosphere, and Ocean Science

    NASA Astrophysics Data System (ADS)

    Caron, L.; Ivins, E. R.; Larour, E.; Adhikari, S.; Nilsson, J.; Blewitt, G.

    2018-03-01

We provide a new analysis of glacial isostatic adjustment (GIA) with the goal of assembling the model uncertainty statistics required for rigorously extracting trends in surface mass from the Gravity Recovery and Climate Experiment (GRACE) mission. Such statistics are essential for deciphering sea level, ocean mass, and hydrological changes because the latter signals can be relatively small (≤2 mm/yr water height equivalent) over very large regions, such as major ocean basins and watersheds. With abundant new >7 year continuous measurements of vertical land motion (VLM) reported by Global Positioning System stations on bedrock and new relative sea level records, our new statistical evaluation of GIA uncertainties incorporates Bayesian methodologies. A unique aspect of the method is that both the ice history and 1-D Earth structure vary through a total of 128,000 forward models. We find that best-fit models poorly capture the statistical inferences needed to correctly invert for lower mantle viscosity and that GIA uncertainty exceeds the uncertainty ascribed to trends from 14 years of GRACE data in polar regions.

  2. Output statistics of laser anemometers in sparsely seeded flows

    NASA Technical Reports Server (NTRS)

    Edwards, R. V.; Jensen, A. S.

    1982-01-01

It is noted that until very recently, research on this topic concentrated on the particle arrival statistics and the influence of the optical parameters on them. Little attention has been paid to the influence of subsequent processing on the measurement statistics. There is also controversy over whether the effects of the particle statistics can be measured. It is shown here that some of the confusion derives from a lack of understanding of the experimental parameters that are to be controlled or known. A rigorous framework is presented for examining the measurement statistics of such systems. To provide examples, two problems are then addressed: the first concerns a sample-and-hold processor, the second what is called a saturable processor. The sample-and-hold processor converts the output to a continuous signal by holding the last reading until a new one is obtained. The saturable system is one where the maximum processable rate is set by the dead time of some unit in the system. At high particle rates, the processed rate is determined by the dead time.
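A minimal sketch of the sample-and-hold idea (hypothetical, randomly timed particle arrivals and velocities; not the paper's formalism): the processor turns sparse readings into a step function by holding the last value, and the statistics of that held signal are what the downstream analysis sees.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 10.0
arrivals = np.sort(rng.uniform(0, T, 50))          # particle arrival times (s)
velocities = 1.0 + 0.1 * rng.standard_normal(50)   # velocity read at each arrival

# Reconstruct a "continuous" signal on a regular grid by zero-order hold:
# at each grid time, take the most recent reading (NaN before the first one).
grid = np.linspace(0, T, 1000)
idx = np.searchsorted(arrivals, grid, side="right") - 1
held = np.where(idx >= 0, velocities[np.clip(idx, 0, None)], np.nan)
```

Because each reading is held for the random interval until the next arrival, time averages of `held` weight readings by inter-arrival time rather than equally, which is one route by which processing alters the measurement statistics.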

  3. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.

    PubMed

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-05-15

We develop statistical methodology for a popular brain imaging technique, HARDI, based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense, as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids the typical limitations of deterministic tractography methods while delivering the same information as probabilistic tractography methods. Our method is computationally cheap, and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions, and tensors can be studied in a systematic and rigorous way.
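
    Tracing an integral curve through a vector field (the deterministic core that the statistical machinery above builds on) can be sketched with a standard RK4 integrator. The circular orientation field below is synthetic, not HARDI data:

```python
import numpy as np

def direction(p):
    # Synthetic unit "fiber orientation" field: concentric circles.
    x, y = p
    n = np.hypot(x, y)
    return np.array([-y, x]) / n

def trace_fiber(p0, step=0.01, n_steps=int(np.pi / 2 / 0.01)):
    """RK4 integration of dx/ds = v(x): one streamline of the field."""
    p = np.array(p0, dtype=float)
    for _ in range(n_steps):
        k1 = direction(p)
        k2 = direction(p + 0.5 * step * k1)
        k3 = direction(p + 0.5 * step * k2)
        k4 = direction(p + step * k3)
        p = p + (step / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return p

# Starting at (1, 0), a quarter-turn of arc should land near (0, 1).
end = trace_fiber([1.0, 0.0])
```

    The statistical contribution of the paper is what this sketch omits: quantifying how noise in the estimated field translates into confidence ellipsoids around the traced curve.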

  4. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data

    PubMed Central

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-01-01

We develop statistical methodology for a popular brain imaging technique, HARDI, based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense, as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids the typical limitations of deterministic tractography methods while delivering the same information as probabilistic tractography methods. Our method is computationally cheap, and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions, and tensors can be studied in a systematic and rigorous way. PMID:25937674

  5. Associations of neighborhood disorganization and maternal spanking with children's aggression: A fixed-effects regression analysis.

    PubMed

    Ma, Julie; Grogan-Kaylor, Andrew; Lee, Shawna J

    2018-02-01

This study employed fixed effects regression that controls for selection bias, omitted variables bias, and all time-invariant aspects of parent and child characteristics to examine the simultaneous associations between neighborhood disorganization, maternal spanking, and aggressive behavior in early childhood using data from the Fragile Families and Child Wellbeing Study (FFCWS). Analysis was based on 2,472 children and their mothers who participated in Wave 3 (2001-2003; child age 3) and Wave 4 (2003-2006; child age 5) of the FFCWS. Results indicated that higher rates of neighborhood crime and violence predicted higher levels of child aggression. Maternal spanking in the past year, whether frequent or infrequent, was also associated with increases in aggressive behavior. This study contributes statistically rigorous evidence that exposure to violence in the neighborhood, as well as in the family context, predicts child aggression. We conclude with a discussion of the need for multilevel prevention and intervention approaches that target both community and parenting factors. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
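
    The within (demeaning) transformation behind fixed-effects regression can be sketched on simulated two-wave panel data. All variable names and effect sizes below are invented for illustration; they are not FFCWS values:

```python
import numpy as np

rng = np.random.default_rng(1)
n, waves = 2472, 2                      # mirrors a two-wave panel
alpha = rng.normal(0, 1, n)             # time-invariant child/family effects
spank = rng.binomial(1, 0.4, (n, waves)).astype(float)
crime = rng.normal(0, 1, (n, waves))
beta_spank, beta_crime = 0.30, 0.20     # assumed "true" effects (illustrative)
aggression = (alpha[:, None] + beta_spank * spank + beta_crime * crime
              + rng.normal(0, 1, (n, waves)))

# Within transformation: demean each child across waves, which removes
# alpha and any other time-invariant confounder from the regression.
def demean(a): return a - a.mean(axis=1, keepdims=True)

X = np.column_stack([demean(spank).ravel(), demean(crime).ravel()])
y = demean(aggression).ravel()
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
```

    Because alpha drops out exactly, beta_hat recovers the assumed effects even though the individual effects are correlated with nothing observed.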

  6. Bringing a transgenic crop to market: where compositional analysis fits.

    PubMed

    Privalle, Laura S; Gillikin, Nancy; Wandelt, Christine

    2013-09-04

In the process of developing a biotechnology product, thousands of genes and transformation events are evaluated to select the event that will be commercialized. The ideal event is identified on the basis of multiple characteristics, including trait efficacy, the molecular characteristics of the insert, and agronomic performance. Once selected, the commercial event is subjected to a rigorous, multipronged safety evaluation that examines the safety of the gene and gene product (the protein), plant performance, the impact of cultivating the crop on the environment, agronomic performance, and the equivalence of the crop/food to conventional crops/food as established by compositional analysis. The compositional analysis consists of a comparison of the nutrient and antinutrient composition of the crop containing the event, its parental line (variety), and other conventional lines (varieties). Different geographies have different requirements for the compositional analysis studies. Parameters that vary include the number of years (seasons) and locations (environments) to be evaluated, the appropriate comparator(s), the analytes to be evaluated, and the statistical analysis. Specific examples of compositional analysis results will be presented.

  7. Polynomial probability distribution estimation using the method of moments

    PubMed Central

    Mattsson, Lars; Rydén, Jesper

    2017-01-01

We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram–Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation. PMID:28394949
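
    A minimal version of the moment-matching step can be sketched as a linear solve: a degree-N polynomial on [a, b] has N+1 coefficients, fixed by normalisation plus the N moment constraints. This is an illustrative reduction of the idea, not the authors' full algorithm:

```python
import numpy as np

def poly_pdf_from_moments(moments, a=0.0, b=1.0):
    """Fit p(x) = c0 + c1*x + ... + cN*x^N on [a, b] so that its raw
    moments match the N given moments (plus normalisation to 1)."""
    N = len(moments)
    rhs = np.concatenate([[1.0], moments])      # mu_0 = 1, then mu_1..mu_N
    k = np.arange(N + 1)
    # A[m, k] = integral_a^b x^(m+k) dx = (b^(m+k+1) - a^(m+k+1)) / (m+k+1)
    powers = k[None, :] + k[:, None] + 1
    A = (b**powers - a**powers) / powers
    return np.linalg.solve(A, rhs)              # polynomial coefficients

# Uniform(0, 1) has raw moments mu_m = 1/(m+1); the fit should recover p(x)=1.
coeffs = poly_pdf_from_moments([1 / 2, 1 / 3])
```

    The convolution remark in the abstract follows directly: products and integrals of polynomials stay polynomial, so moment-fitted PDFs convolve in closed form.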

  8. Approach to atmospheric laser-propagation theory based on the extended Huygens-Fresnel principle and a self-consistency concept.

    PubMed

    Bochove, Erik J; Rao Gudimetla, V S

    2017-01-01

We propose a self-consistency condition based on the extended Huygens-Fresnel principle, which we apply to the propagation kernel of the mutual coherence function of a partially coherent laser beam propagating through a turbulent atmosphere. The assumption of statistical independence of turbulence in neighboring propagation segments leads to an integral equation in the propagation kernel. This integral equation is satisfied by a Gaussian function, with dependence on the transverse coordinates that is identical to the previous Gaussian formulation by Yura [Appl. Opt. 11, 1399 (1972)], but differs in the transverse coherence length's dependence on propagation distance, so that this established version violates our self-consistency principle. Our formulation has one free parameter, which in the context of Kolmogorov's theory is independent of turbulence strength and propagation distance. We determined its value by numerical fitting to the rigorous beam propagation theory of Yura and Hanson [J. Opt. Soc. Am. A 6, 564 (1989)], demonstrating in addition a significant improvement over other Gaussian models.

  9. Stochastic dynamics of intermittent pore-scale particle motion in three-dimensional porous media

    NASA Astrophysics Data System (ADS)

    Morales, V. L.; Dentz, M.; Willmann, M.; Holzner, M.

    2017-12-01

    A proper understanding of velocity dynamics is key for making transport predictions through porous media at any scale. We study the velocity evolution process from particle dynamics at the pore-scale with particular interest in preasymptotic (non-Fickian) behavior. Experimental measurements from 3-dimensional particle tracking velocimetry are used to obtain Lagrangian velocity statistics for three different types of media heterogeneity. Particle velocities are found to be intermittent in nature, log-normally distributed and non-stationary. We show that these velocity characteristics can be captured with a correlated Ornstein-Uhlenbeck process for a random walk in space that is parameterized from velocity distributions. Our simple model is rigorously tested for accurate reproduction of velocity variability in magnitude and frequency. We further show that it captures exceptionally well the preasymptotic mean and mean squared displacement in the ballistic and superdiffusive regimes, and can be extended to determine if and when Fickian behavior will be reached. Our approach reproduces both preasymptotic and asymptotic transport behavior with a single transport model, demonstrating correct description of the fundamental controls of anomalous transport.
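
    A correlated Ornstein-Uhlenbeck velocity model of the kind described can be sketched as follows. The correlation length and log-velocity parameters are assumed, illustrative values, not fitted to the experiments:

```python
import numpy as np

rng = np.random.default_rng(2)
n_steps, ds = 20000, 0.01          # equidistant steps along the trajectory
ell = 1.0                          # velocity correlation length (assumed)
mu, sigma = 0.0, 1.0               # log-velocity mean/std (assumed)

# Ornstein-Uhlenbeck process for w(s); v = exp(mu + sigma*w) is then
# log-normally distributed and correlated over the length scale ell.
w = np.empty(n_steps)
w[0] = rng.normal()
a = np.exp(-ds / ell)              # exact one-step OU autoregression
for i in range(1, n_steps):
    w[i] = a * w[i - 1] + np.sqrt(1 - a * a) * rng.normal()

v = np.exp(mu + sigma * w)         # Lagrangian speeds along the path
t = np.cumsum(ds / v)              # travel time: slow excursions dominate
```

    Because the walk is in space with time accumulated as ds/v, long excursions at low velocity produce the intermittent, anomalous (non-Fickian) travel-time statistics the record describes.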

  10. Mechanical performance of pyrolytic carbon in prosthetic heart valve applications.

    PubMed

    Cao, H

    1996-06-01

    An experimental procedure has been developed for rigorous characterization of the fracture resistance and fatigue crack extension in pyrolytic carbon for prosthetic heart valve application. Experiments were conducted under sustained and cyclic loading in a simulated biological environment using Carbomedics Pyrolite carbon. While the material was shown to have modest fracture toughness, it exhibited excellent resistance to subcritical crack growth. The crack growth kinetics in pyrolytic carbon were formulated using a phenomenological description. A fatigue threshold was observed below which the crack growth rate diminishes. A damage tolerance concept based on fracture mechanics was used to develop an engineering design approach for mechanical heart valve prostheses. In particular, a new quantity, referred to as the safe-life index, was introduced to assess the design adequacy against subcritical crack growth in brittle materials. In addition, a weakest-link statistical description of the fracture strength is provided and used in the design of component proof-tests. It is shown that the structural reliability of mechanical heart valves can be assured by combining effective flaw detection and manufacturing quality control with adequate damage tolerance design.

  11. Polynomial probability distribution estimation using the method of moments.

    PubMed

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is setup algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this procedure is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.

  12. Solution-Processed Carbon Nanotube True Random Number Generator.

    PubMed

    Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C

    2017-08-09

With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve a TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.
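
    One of the standardized randomness checks such a bitstream must pass, the NIST SP 800-22 frequency (monobit) test, is simple to sketch. The bits below are simulated stand-ins for digitised SRAM noise, not device data:

```python
import math
import numpy as np

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency (monobit) test: are 0s and 1s balanced?"""
    n = len(bits)
    s = abs(2 * int(np.sum(bits)) - n) / math.sqrt(n)
    return math.erfc(s / math.sqrt(2))

# Stand-in for SRAM cells digitising thermal noise: unbiased random bits.
rng = np.random.default_rng(3)
bits = rng.integers(0, 2, size=100000)
p_random = monobit_pvalue(bits)

# A stuck cell (all ones) fails immediately.
p_stuck = monobit_pvalue(np.ones(100000))
```

    A p-value above the chosen significance level (0.01 in SP 800-22) means the balance of the stream is consistent with true randomness; the full suite applies many such tests, of which this is only the first.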

  13. Steady-state distributions of probability fluxes on complex networks

    NASA Astrophysics Data System (ADS)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

We consider a simple model of Markovian stochastic dynamics on complex networks to examine the statistical properties of the probability fluxes. An additional transition, hereafter called a gate, powered by an external constant force, breaks detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emerging under such conditions converge to a Gaussian distribution. By virtue of the stationary fluctuation theorem, its standard deviation depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which provides the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to and far from the equilibrium state. The other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate, and a change in the network size, studied by means of computer simulations, are also discussed in terms of the rigorous theoretical predictions.

  14. Skill Assessment in Ocean Biological Data Assimilation

    NASA Technical Reports Server (NTRS)

    Gregg, Watson W.; Friedrichs, Marjorie A. M.; Robinson, Allan R.; Rose, Kenneth A.; Schlitzer, Reiner; Thompson, Keith R.; Doney, Scott C.

    2008-01-01

There is growing recognition that rigorous skill assessment is required to understand the ability of ocean biological models to represent ocean processes and distributions. Statistical analysis of model results against observations represents the most quantitative form of skill assessment, and this principle serves as well for data assimilation models. However, skill assessment for data assimilation requires special consideration because there are three sets of information: the free-run model, the data, and the assimilation model, which uses information from both the free-run model and the data. Intercomparison of results among the three sets of information is important and useful for assessment, but is not conclusive, since the three information sets are intertwined. An independent data set is necessary for an objective determination. Other useful measures of ocean biological data assimilation assessment include responses of unassimilated variables to the data assimilation, performance outside the prescribed region/time of interest, forecasting, and trend analysis. Examples of each approach from the literature are provided. A comprehensive list of ocean biological data assimilation efforts and their skill assessment applications, in both ecosystem/biogeochemical and fisheries work, is summarized.

  15. Bayesian adaptive bandit-based designs using the Gittins index for multi-armed trials with normally distributed endpoints.

    PubMed

    Smith, Adam L; Villar, Sofía S

    2018-01-01

Adaptive designs for multi-armed clinical trials have become increasingly popular recently because of their potential to shorten development times and to increase patient response. However, developing response-adaptive designs that offer patient-benefit while ensuring the resulting trial provides a statistically rigorous and unbiased comparison of the different treatments included is highly challenging. In this paper, the theory of Multi-Armed Bandit Problems is used to define near optimal adaptive designs in the context of a clinical trial with a normally distributed endpoint with known variance. We report the operating characteristics (type I error, power, bias) and patient-benefit of these approaches and alternative designs using simulation studies based on an ongoing trial. These results are then compared to those recently published in the context of Bernoulli endpoints. Many limitations and advantages are similar in both cases, but there are also important differences, especially with respect to type I error control. This paper proposes a simulation-based testing procedure to correct for the observed type I error inflation that bandit-based and adaptive rules can induce.

  16. Characterizing the information content of cloud thermodynamic phase retrievals from the notional PACE OCI shortwave reflectance measurements

    NASA Astrophysics Data System (ADS)

    Coddington, O. M.; Vukicevic, T.; Schmidt, K. S.; Platnick, S.

    2017-08-01

    We rigorously quantify the probability of liquid or ice thermodynamic phase using only shortwave spectral channels specific to the National Aeronautics and Space Administration's Moderate Resolution Imaging Spectroradiometer, Visible Infrared Imaging Radiometer Suite, and the notional future Plankton, Aerosol, Cloud, ocean Ecosystem imager. The results show that two shortwave-infrared channels (2135 and 2250 nm) provide more information on cloud thermodynamic phase than either channel alone; in one case, the probability of ice phase retrieval increases from 65 to 82% by combining 2135 and 2250 nm channels. The analysis is performed with a nonlinear statistical estimation approach, the GEneralized Nonlinear Retrieval Analysis (GENRA). The GENRA technique has previously been used to quantify the retrieval of cloud optical properties from passive shortwave observations, for an assumed thermodynamic phase. Here we present the methodology needed to extend the utility of GENRA to a binary thermodynamic phase space (i.e., liquid or ice). We apply formal information content metrics to quantify our results; two of these (mutual and conditional information) have not previously been used in the field of cloud studies.
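
    The gain from combining two channels can be illustrated with a toy Bayesian phase classifier. The class-conditional reflectance statistics below are invented for illustration and are not the GENRA retrieval:

```python
import numpy as np

# Illustrative (made-up) class-conditional mean reflectances for the
# 2135 nm and 2250 nm channels; ice and liquid absorb differently there.
means = {"ice": np.array([0.20, 0.12]), "liquid": np.array([0.30, 0.28])}
sigma = np.array([0.06, 0.06])          # assumed per-channel noise

def posterior_ice(obs, channels):
    """P(ice | obs) with equal priors and independent Gaussian likelihoods."""
    def loglik(phase):
        z = (obs[channels] - means[phase][channels]) / sigma[channels]
        return -0.5 * np.sum(z**2)
    li, lw = np.exp(loglik("ice")), np.exp(loglik("liquid"))
    return li / (li + lw)

obs = np.array([0.22, 0.14])            # a measurement that "looks icy"
p_one = posterior_ice(obs, [0])         # 2135 nm alone
p_two = posterior_ice(obs, [0, 1])      # both channels combined
```

    With these assumed numbers the second channel sharpens the posterior on ice, mirroring the record's finding that the 2135 and 2250 nm pair is more informative than either channel alone.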

  17. Ghost imaging based on Pearson correlation coefficients

    NASA Astrophysics Data System (ADS)

    Yu, Wen-Kai; Yao, Xu-Ri; Liu, Xue-Feng; Li, Long-Zhen; Zhai, Guang-Jie

    2015-05-01

    Correspondence imaging is a new modality of ghost imaging, which can retrieve a positive/negative image by simple conditional averaging of the reference frames that correspond to relatively large/small values of the total intensity measured at the bucket detector. Here we propose and experimentally demonstrate a more rigorous and general approach in which a ghost image is retrieved by calculating a Pearson correlation coefficient between the bucket detector intensity and the brightness at a given pixel of the reference frames, and at the next pixel, and so on. Furthermore, we theoretically provide a statistical interpretation of these two imaging phenomena, and explain how the error depends on the sample size and what kind of distribution the error obeys. According to our analysis, the image signal-to-noise ratio can be greatly improved and the sampling number reduced by means of our new method. Project supported by the National Key Scientific Instrument and Equipment Development Project of China (Grant No. 2013YQ030595) and the National High Technology Research and Development Program of China (Grant No. 2013AA122902).
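
    The per-pixel Pearson correlation described above can be sketched in a few lines on simulated 1-D speckle data (illustrative object and patterns, not the experimental setup):

```python
import numpy as np

rng = np.random.default_rng(4)
n_frames, n_pix = 4000, 64
obj = np.zeros(n_pix)
obj[20:28] = 1.0                          # hypothetical 1-D transmissive object

patterns = rng.random((n_frames, n_pix))  # reference speckle frames
bucket = patterns @ obj                   # total intensity at the bucket detector

# Pearson correlation coefficient between the bucket signal and the
# brightness at each reference pixel, computed for all pixels at once.
pc = patterns - patterns.mean(axis=0)
bc = bucket - bucket.mean()
r = (pc * bc[:, None]).sum(axis=0) / (
    np.sqrt((pc**2).sum(axis=0)) * np.sqrt((bc**2).sum()))
```

    The correlation map r is itself the ghost image: pixels inside the object correlate strongly with the bucket signal, pixels outside fluctuate around zero at the 1/sqrt(n_frames) level, which is the sample-size dependence of the error the record analyses.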

  18. On global solutions of the random Hamilton-Jacobi equations and the KPZ problem

    NASA Astrophysics Data System (ADS)

    Bakhtin, Yuri; Khanin, Konstantin

    2018-04-01

In this paper, we discuss possible qualitative approaches to the problem of KPZ universality. Throughout the paper, our point of view is based on the geometrical and dynamical properties of minimisers and shocks forming interlacing tree-like structures. We believe that the KPZ universality can be explained in terms of statistics of these structures evolving in time. The paper is focussed on the setting of the random Hamilton-Jacobi equations. We formulate several conjectures concerning global solutions and discuss how their properties are connected to the KPZ scalings in dimension 1+1. In the case of general viscous Hamilton-Jacobi equations with non-quadratic Hamiltonians, we define generalised directed polymers. We expect that their behaviour is similar to the behaviour of classical directed polymers, and present arguments in favour of this conjecture. We also define a new renormalisation transformation defined in purely geometrical terms and discuss conjectural properties of the corresponding fixed points. Most of our conjectures are widely open, and supported by only partial rigorous results for particular models.

  19. Statistical model of exotic rotational correlations in emergent space-time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Craig; Kwon, Ohkyung; Richardson, Jonathan

    2017-06-06

A statistical model is formulated to compute exotic rotational correlations that arise as inertial frames and causal structure emerge on large scales from entangled Planck scale quantum systems. Noncommutative quantum dynamics are represented by random transverse displacements that respect causal symmetry. Entanglement is represented by covariance of these displacements in Planck scale intervals defined by future null cones of events on an observer's world line. Light that propagates in a nonradial direction inherits a projected component of the exotic rotational correlation that accumulates as a random walk in phase. A calculation of the projection and accumulation leads to exact predictions for statistical properties of exotic Planck scale correlations in an interferometer of any configuration. The cross-covariance for two nearly co-located interferometers is shown to depart only slightly from the autocovariance. Specific examples are computed for configurations that approximate realistic experiments, and show that the model can be rigorously tested.

  20. Statistical methods for thermonuclear reaction rates and nucleosynthesis simulations

    NASA Astrophysics Data System (ADS)

    Iliadis, Christian; Longland, Richard; Coc, Alain; Timmes, F. X.; Champagne, Art E.

    2015-03-01

Rigorous statistical methods for estimating thermonuclear reaction rates and nucleosynthesis are becoming increasingly established in nuclear astrophysics. The main challenge is that experimental reaction rates are highly complex quantities derived from a multitude of different measured nuclear parameters (e.g., astrophysical S-factors, resonance energies and strengths, particle and γ-ray partial widths). We discuss the application of the Monte Carlo method to two distinct, but related, questions. First, given a set of measured nuclear parameters, how can one best estimate the resulting thermonuclear reaction rates and associated uncertainties? Second, given a set of appropriate reaction rates, how can one best estimate the abundances from nucleosynthesis (i.e., reaction network) calculations? The techniques described here provide probability density functions that can be used to derive statistically meaningful reaction rates and final abundances for any desired coverage probability. Examples are given for applications to s-process neutron sources, core-collapse supernovae, classical novae, and Big Bang nucleosynthesis.
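
    The first Monte Carlo question can be sketched with a toy narrow-resonance rate proportional to the resonance strength times exp(-Er/kT). The input distributions and temperature below are invented for illustration, not measured parameters:

```python
import numpy as np

rng = np.random.default_rng(5)
n_samples = 100000

kT = 30.0                                      # keV, assumed temperature
# Illustrative uncertain inputs: resonance strength lognormal (factor-type
# uncertainty), resonance energy Gaussian (additive uncertainty).
wg = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n_samples)
Er = rng.normal(150.0, 5.0, size=n_samples)    # keV

# Toy narrow-resonance rate (constants dropped): wg * exp(-Er / kT).
rate = wg * np.exp(-Er / kT)

# The sampled rates form the probability density function of the rate;
# report it for any desired coverage probability, e.g. 68%.
lo, med, hi = np.percentile(rate, [16, 50, 84])
```

    Because log(rate) is a sum of (near-)Gaussian terms here, the resulting rate distribution is close to lognormal, which is why rate uncertainties in this field are commonly quoted as multiplicative factors.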

  1. Using cancer to make cellular reproduction rigorous and relevant

    NASA Astrophysics Data System (ADS)

    Duncan, Cynthia F.

The 1983 report A Nation at Risk highlighted the fact that test scores of American students were far below those of competing nations and that educational standards were being lowered. This trend has continued, and studies have also shown that students are not entering college ready for success. This trend can be reversed. Students can better understand and retain biology content expectations if they are taught in a way that is both rigorous and relevant. In the past, students have learned the details of cellular reproduction with little knowledge of why it is important to their everyday lives; the material is learned only for the test. Knowing the details of cellular reproduction is crucial for understanding cancer, a topic that will likely affect all of my students at some point in their lives. Students used hands-on activities, including simulations, labs, and models, to learn about cellular reproduction with cancer as a theme throughout. Students were challenged to use the rigorous biology content expectations to think about cancer, including stem cell research. These students, who will one day be college students, voting citizens, and parents, will become better learners. Students were assessed before and after the completion of the unit to determine whether learning occurred. Students did learn the material and became more critical thinkers. Statistical analysis was completed to ensure confidence in the results.

  2. The Evolution of a More Rigorous Approach to Benefit Transfer: Benefit Function Transfer

    NASA Astrophysics Data System (ADS)

    Loomis, John B.

    1992-03-01

The desire for economic values of recreation for unstudied recreation resources dates back to the water resource development benefit-cost analyses of the early 1960s. Rather than simply applying existing estimates of benefits per trip to the study site, a fairly rigorous approach was developed by a number of economists. This approach involves application of travel cost demand equations and contingent valuation benefit functions from existing sites to the new site. In this way the spatial market of the new site (i.e., its differing own price, substitute prices, and population distribution) is accounted for in the new estimate of total recreation benefits. The assumptions of benefit transfer from recreation sites in one state to another state for the same recreation activity are empirically tested. The equality of demand coefficients for ocean sport salmon fishing in Oregon versus Washington and for freshwater steelhead fishing in Oregon versus Idaho is rejected. Thus, transfer of either demand equations or average benefits per trip is likely to be in error. Using the Oregon steelhead equation, benefit transfers to rivers within the state are shown to be accurate to within 5-15%.

  3. Aerial photography flight quality assessment with GPS/INS and DEM data

    NASA Astrophysics Data System (ADS)

    Zhao, Haitao; Zhang, Bing; Shang, Jiali; Liu, Jiangui; Li, Dong; Chen, Yanyan; Zuo, Zhengli; Chen, Zhengchao

    2018-01-01

The flight altitude, ground coverage, photo overlap, and other acquisition specifications of an aerial photography flight mission directly affect the quality and accuracy of the subsequent mapping tasks. To ensure smooth post-flight data processing and fulfill the pre-defined mapping accuracy, flight quality assessments should be carried out in time. This paper presents a novel and rigorous approach for flight quality evaluation of frame cameras with GPS/INS data and DEM, using geometric calculation rather than image analysis as in conventional methods. This new approach is based mainly on the collinearity equations, in which the accuracy of a set of flight quality indicators is derived through a rigorous error propagation model and validated with scenario data. Theoretical analysis and a practical flight test of an aerial photography mission using an UltraCamXp camera showed that the calculated photo overlap is accurate enough for flight quality assessment of 5 cm ground sample distance imagery, using the SRTMGL3 DEM and the POSAV510 GPS/INS data. An even better overlap accuracy could be achieved for coarser-resolution aerial photography. With this new approach, the flight quality evaluation can be conducted on site right after landing, providing accurate and timely information for decision making.
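
    For flat ground and nadir imagery, the forward-overlap part of such a geometric check reduces to simple footprint arithmetic. The sketch below ignores attitude and terrain, which the paper's collinearity-based method accounts for, and uses invented camera numbers:

```python
def forward_overlap(height_agl_m, focal_mm, sensor_along_mm, base_m):
    """Percent forward overlap between consecutive frames over flat ground.

    footprint = flying height * (along-track sensor size / focal length);
    overlap is the fraction of the footprint not consumed by the air base.
    """
    footprint = height_agl_m * sensor_along_mm / focal_mm
    return 100.0 * (1.0 - base_m / footprint)

# Illustrative values: 1000 m above ground, 100 mm lens, 68 mm along-track
# sensor, 272 m between exposure stations.
ovl = forward_overlap(1000.0, 100.0, 68.0, 272.0)
```

    Replacing the flat-ground footprint with DEM-intersected image footprints projected through the collinearity equations is what turns this back-of-envelope check into the rigorous on-site assessment the record describes.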

  4. Distinguishing cause from correlation in tokamak experiments to trigger edge-localised plasma instabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Anthony J.; CCFE, Culham Science Centre, Abingdon OX14 3DB

    2014-11-15

The generic question is considered: How can we determine the probability of an otherwise quasi-random event having been triggered by an external influence? A specific problem is the quantification of the success of techniques to trigger, and hence control, edge-localised plasma instabilities (ELMs) in magnetically confined fusion (MCF) experiments. The development of such techniques is essential to ensure tolerable heat loads on components in large MCF devices, and is necessary for their development into economically successful power plants. Bayesian probability theory is used to rigorously formulate the problem and to provide a formal solution. Accurate but pragmatic methods are developed to estimate triggering probabilities, and are illustrated with experimental data. These allow results from experiments to be quantitatively assessed, and rigorously quantified conclusions to be formed. Example applications include assessing whether triggering of ELMs is a statistical or deterministic process, and the establishment of thresholds to ensure that ELMs are reliably triggered.
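
    A drastically simplified version of the Bayesian estimate can be sketched with a Beta-Binomial model. A real analysis must also discount chance coincidences with spontaneously occurring ELMs, which this toy ignores:

```python
import math

# Toy version of the question: out of n trigger attempts, k were followed
# by an ELM within a short window. With a flat Beta(1, 1) prior, the
# posterior for the per-attempt triggering probability is Beta(k+1, n-k+1).
def beta_mean_ci(k, n, z=1.96):
    a, b = k + 1, n - k + 1
    mean = a / (a + b)
    sd = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, mean - z * sd, mean + z * sd   # normal approx to the CI

# Illustrative counts, not experimental data.
mean, lo, hi = beta_mean_ci(k=45, n=50)
```

    The width of the interval is what lets one establish thresholds: a triggering technique is only "reliable" once the lower credible bound, not just the point estimate, clears the required probability.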

  5. Probability bounds analysis for nonlinear population ecology models.

    PubMed

    Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A

    2015-09-01

Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals. Copyright © 2015. Published by Elsevier Inc.
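
    The flavour of rigorous bounds propagation can be sketched for a monotone model: when the output is monotone in each uncertain parameter, evaluating at the interval endpoints brackets the output with certainty. This is a simplification of probability bounds analysis, shown here for a logistic growth model with assumed parameter intervals:

```python
import math

def logistic(x0, r, K, t):
    """Closed-form solution of the logistic growth equation."""
    return K * x0 * math.exp(r * t) / (K + x0 * (math.exp(r * t) - 1.0))

def population_bounds(x0, r_lo, r_hi, K_lo, K_hi, t):
    # For 0 < x0 < K and t > 0, x(t) is increasing in both r and K,
    # so the extremes occur at corners of the parameter box; these are
    # rigorous bounds, not sampled estimates.
    return (logistic(x0, r_lo, K_lo, t), logistic(x0, r_hi, K_hi, t))

# Illustrative intervals for growth rate and carrying capacity.
lo, hi = population_bounds(x0=1.0, r_lo=0.4, r_hi=0.6,
                           K_lo=80.0, K_hi=120.0, t=10.0)
```

    The full method generalises this idea from plain intervals to probability boxes (bounding pairs of CDFs), but the guarantee is the same in spirit: the true outcome provably lies between the computed bounds.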

  6. The Power of Thinking Big

    ERIC Educational Resources Information Center

    Celeste, Eric

    2016-01-01

    Communities of practice have become important tools for districts striving to improve teacher quality in a way that improves student outcomes, but scaling the benefits of these communities requires a more rigorous, intentional approach. That's why Learning Forward, with support from the Bill & Melinda Gates Foundation, created the Redesign PD…

  7. Towards a Unified Theory of Engineering Education

    ERIC Educational Resources Information Center

    Salcedo Orozco, Oscar H.

    2017-01-01

    STEM education is an interdisciplinary approach to learning where rigorous academic concepts are coupled with real-world lessons and activities as students apply science, technology, engineering, and mathematics in contexts that make connections between school, community, work, and the global enterprise enabling STEM literacy (Tsupros, Kohler and…

  8. Evidence-based decision making : developing a knowledge base for successful program outcomes in transportation asset management.

    DOT National Transportation Integrated Search

    2015-12-01

    MAP-21 and AASHTO's framework for transportation asset management (TAM) offer opportunities to use more rigorous approaches to collect and apply evidence within a TAM context. This report documents the results of a study funded by the Georgia D...

  9. Diversity Management. Time for a New Approach.

    ERIC Educational Resources Information Center

    Ivancevich, John M.; Gilbert, Jacqueline A.

    2000-01-01

    A review of the history of diversity management resulted in a call for a new agenda that encourages more collaboration between scholars and administrators, increased researcher observation of workplace reactions to diversity management initiatives, more informative and rigorous case studies, and more third-party evaluations of diversity management…

  10. Caution--Praise Can Be Dangerous.

    ERIC Educational Resources Information Center

    Dweck, Carol S.

    1999-01-01

    Reviews research into the effects of praise on students. Suggests an approach that gets students to focus on their potential to learn, to value challenge, and to concentrate on effort and learning processes in the face of obstacles. This can all be done while holding students to rigorous standards. (SLD)

  11. Connected Learning Communities: A Toolkit for Reinventing High School.

    ERIC Educational Resources Information Center

    Almeida, Cheryl, Ed.; Steinberg, Adria, Ed.

    This document presents tools and guidelines to help practitioners transform their high schools into institutions facilitating community-connected learning. The approach underpinning the tools and guidelines is based on the following principles: academic rigor and relevance; personalized learning; self-passage to adulthood; and productive learning…

  12. Methods for assessing Phytophthora ramorum chlamydospore germination

    Treesearch

    Joyce Eberhart; Elizabeth Stamm; Jennifer Parke

    2013-01-01

    Germination of chlamydospores is difficult to accurately assess when chlamydospores are attached to remnants of supporting hyphae. We developed two approaches for closely observing and rigorously quantifying the frequency of chlamydospore germination in vitro. The plate marking and scanning method was useful for quantifying germination of large...

  13. A Point System Approach to Secondary Classroom Management

    ERIC Educational Resources Information Center

    Xenos, Anthony J.

    2012-01-01

    This article presents guiding principles governing the design, implementation, and management of a point system to promote discipline and academic rigor in a secondary classroom. Four considerations are discussed: (1) assigning appropriate point values to integral classroom behaviors and tasks; (2) determining the relationship among consequences,…

  14. A Criterion-Referenced Approach to Student Ratings of Instruction

    ERIC Educational Resources Information Center

    Meyer, J. Patrick; Doromal, Justin B.; Wei, Xiaoxin; Zhu, Shi

    2017-01-01

    We developed a criterion-referenced student rating of instruction (SRI) to facilitate formative assessment of teaching. It involves four dimensions of teaching quality that are grounded in current instructional design principles: Organization and structure, Assessment and feedback, Personal interactions, and Academic rigor. Using item response…

  15. Contributing Meaning to Research in Developmental Disabilities: Integrating Participatory Action and Methodological Rigor

    ERIC Educational Resources Information Center

    Powers, Laurie E.

    2017-01-01

    Action research approaches reflecting power sharing by academic and community researchers, full engagement of community partners across all study phases, and ongoing commitment to partnership and capacity building have been increasingly embraced, particularly in research affecting marginalized populations. Findings suggest action research…

  16. Characterizing Surface Transport Barriers in the South China Sea

    DTIC Science & Technology

    2015-09-30

    to a coral reef system flow, rigorously identifying hyperbolic and elliptic flow structures. The FTLE approach was found to be...included in real world applications (Allshouse et al. 2015). [Figure 3: The impact of windage on a hypothetical tracer release event of Ningaloo Reef]

  17. On the estimation of the worst-case implant-induced RF-heating in multi-channel MRI.

    PubMed

    Córcoles, Juan; Zastrow, Earl; Kuster, Niels

    2017-06-21

    The increasing use of multiple radiofrequency (RF) transmit channels in magnetic resonance imaging (MRI) systems makes it necessary to rigorously assess the risk of RF-induced heating. This risk is especially aggravated with inclusions of medical implants within the body. The worst-case RF-heating scenario is achieved when the local tissue deposition in the at-risk region (generally in the vicinity of the implant electrodes) reaches its maximum value while MRI exposure is compliant with predefined general specific absorption rate (SAR) limits or power requirements. This work first reviews the common approach to estimate the worst-case RF-induced heating in multi-channel MRI environment, based on the maximization of the ratio of two Hermitian forms by solving a generalized eigenvalue problem. It is then shown that the common approach is not rigorous and may lead to an underestimation of the worst-case RF-heating scenario when there is a large number of RF transmit channels and there exist multiple SAR or power constraints to be satisfied. Finally, this work derives a rigorous SAR-based formulation to estimate a preferable worst-case scenario, which is solved by casting a semidefinite programming relaxation of this original non-convex problem, whose solution closely approximates the true worst-case including all SAR constraints. Numerical results for 2, 4, 8, 16, and 32 RF channels in a 3T-MRI volume coil for a patient with a deep-brain stimulator under a head imaging exposure are provided as illustrative examples.
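    The "common approach" reviewed here, maximizing a ratio of two Hermitian forms over the channel excitation vector, reduces to a generalized eigenvalue problem. A minimal sketch with randomly generated stand-in matrices (A playing the role of local SAR in the at-risk region, B a single global SAR/power constraint; both hypothetical, and only the single-constraint case the paper shows to be insufficient in general):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 8  # number of RF transmit channels (hypothetical)

# Build Hermitian A (positive semidefinite) and B (positive definite)
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = M @ M.conj().T
P = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = P @ P.conj().T + n * np.eye(n)

# max_x (x^H A x) / (x^H B x) is the largest generalized eigenvalue
# of the pencil (A, B); eigh returns eigenvalues in ascending order.
w, V = eigh(A, B)
x = V[:, -1]  # worst-case excitation for this single constraint
ratio = np.real(x.conj() @ A @ x) / np.real(x.conj() @ B @ x)
```

    With multiple simultaneous SAR/power constraints this reduction no longer applies, which is why the paper moves to a semidefinite programming relaxation.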

  18. On the estimation of the worst-case implant-induced RF-heating in multi-channel MRI

    NASA Astrophysics Data System (ADS)

    Córcoles, Juan; Zastrow, Earl; Kuster, Niels

    2017-06-01

    The increasing use of multiple radiofrequency (RF) transmit channels in magnetic resonance imaging (MRI) systems makes it necessary to rigorously assess the risk of RF-induced heating. This risk is especially aggravated with inclusions of medical implants within the body. The worst-case RF-heating scenario is achieved when the local tissue deposition in the at-risk region (generally in the vicinity of the implant electrodes) reaches its maximum value while MRI exposure is compliant with predefined general specific absorption rate (SAR) limits or power requirements. This work first reviews the common approach to estimate the worst-case RF-induced heating in multi-channel MRI environment, based on the maximization of the ratio of two Hermitian forms by solving a generalized eigenvalue problem. It is then shown that the common approach is not rigorous and may lead to an underestimation of the worst-case RF-heating scenario when there is a large number of RF transmit channels and there exist multiple SAR or power constraints to be satisfied. Finally, this work derives a rigorous SAR-based formulation to estimate a preferable worst-case scenario, which is solved by casting a semidefinite programming relaxation of this original non-convex problem, whose solution closely approximates the true worst-case including all SAR constraints. Numerical results for 2, 4, 8, 16, and 32 RF channels in a 3T-MRI volume coil for a patient with a deep-brain stimulator under a head imaging exposure are provided as illustrative examples.

  19. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    PubMed

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
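    Two standard quantities in external validation studies of binary risk models can be computed directly from external-cohort data: the c-statistic (discrimination) via its Mann-Whitney formulation, and calibration-in-the-large (observed minus mean predicted risk). A sketch with an illustrative function name, not code from the article:

```python
import numpy as np

def external_validation_summary(p_pred, y_obs):
    """Discrimination (c-statistic) and calibration-in-the-large for a
    binary risk model evaluated on an external cohort.

    p_pred: predicted risks in [0, 1]; y_obs: observed 0/1 outcomes."""
    p_pred = np.asarray(p_pred, float)
    y_obs = np.asarray(y_obs, int)
    pos, neg = p_pred[y_obs == 1], p_pred[y_obs == 0]
    # c-statistic: probability a random event patient is ranked above
    # a random non-event patient (ties count one half)
    gt = (pos[:, None] > neg[None, :]).sum()
    eq = (pos[:, None] == neg[None, :]).sum()
    c_stat = (gt + 0.5 * eq) / (len(pos) * len(neg))
    calib_large = y_obs.mean() - p_pred.mean()
    return c_stat, calib_large
```

    A full validation study would add, among other things, a calibration slope and a transparent account of missing data, as the abstract emphasizes.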

  20. Can Using Rigorous Evidence to Guide Federal Education Funds Improve Student Achievement? Randomized Trials Show Encouraging Initial Results for DoED's Investing in Innovation Fund

    ERIC Educational Resources Information Center

    Coalition for Evidence-Based Policy, 2014

    2014-01-01

    An important recent development in evidence-based policy is the federal government's use of a "tiered evidence" approach to allocating funding in grant programs such as the U.S. Department of Education's Investing in Innovation Fund (i3). A central feature of this approach is that the largest grants are awarded to fund large-scale…

  1. The effect of a workplace violence training program for generalist nurses in the acute hospital setting: A quasi-experimental study.

    PubMed

    Lamont, Scott; Brunero, Scott

    2018-05-19

    Workplace violence prevalence has attracted significant attention within the international nursing literature. Little attention to non-mental health settings and a lack of evaluation rigor have been identified within review literature. To examine the effects of a workplace violence training program in relation to risk assessment and management practices, de-escalation skills, breakaway techniques, and confidence levels, within an acute hospital setting. A quasi-experimental study of nurses using pretest-posttest measurements of educational objectives and confidence levels, with two week follow-up. A 440 bed metropolitan tertiary referral hospital in Sydney, Australia. Nurses working in specialties identified as a 'high risk' for violence. A pre-post-test design was used with participants attending a one day workshop. The workshop evaluation comprised the use of two validated questionnaires: the Continuing Professional Development Reaction questionnaire, and the Confidence in Coping with Patient Aggression Instrument. Descriptive and inferential statistics were calculated. The paired t-test was used to assess the statistical significance of changes in the clinical behaviour intention and confidence scores from pre- to post-intervention. Cohen's d effect sizes were calculated to determine the extent of the significant results. Seventy-eight participants completed both pre- and post-workshop evaluation questionnaires. Statistically significant increases in behaviour intention scores were found in fourteen of the fifteen constructs relating to the three broad workshop objectives, and confidence ratings, with medium to large effect sizes observed in some constructs. A significant increase in overall confidence in coping with patient aggression was also found post-test with large effect size. Positive results were observed from the workplace violence training. 
Training needs to be complemented by a multi-faceted organisational approach which includes governance, quality and review processes. Copyright © 2018 Elsevier Ltd. All rights reserved.
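    The paired t-test with Cohen's d used in this kind of pre-post evaluation can be reproduced in a few lines of SciPy (the scores below are hypothetical, and d is computed here as the mean difference divided by the standard deviation of the differences, one common convention for dependent samples):

```python
import numpy as np
from scipy import stats

def paired_effect(pre, post):
    """Paired t-test plus Cohen's d for dependent samples."""
    pre = np.asarray(pre, float)
    post = np.asarray(post, float)
    t, p = stats.ttest_rel(post, pre)
    diff = post - pre
    d = diff.mean() / diff.std(ddof=1)
    return t, p, d

# hypothetical pre/post confidence scores for four participants
t, p, d = paired_effect([3, 4, 5, 4], [5, 5, 7, 6])
```

    By a common rule of thumb, |d| around 0.5 is a medium effect and 0.8 or more a large one, matching the "medium to large effect sizes" reported above.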

  2. Using structural damage statistics to derive macroseismic intensity within the Kathmandu valley for the 2015 M7.8 Gorkha, Nepal earthquake

    NASA Astrophysics Data System (ADS)

    McGowan, S. M.; Jaiswal, K. S.; Wald, D. J.

    2017-09-01

    We make and analyze structural damage observations from within the Kathmandu valley following the 2015 M7.8 Gorkha, Nepal earthquake to derive macroseismic intensities at several locations including some located near ground motion recording sites. The macroseismic intensity estimates supplement the limited strong ground motion data in order to characterize the damage statistics. This augmentation allows for direct comparisons between ground motion amplitudes and structural damage characteristics and ultimately produces a more constrained ground shaking hazard map for the Gorkha earthquake. For systematic assessments, we focused on damage to three specific building categories: (a) low/mid-rise reinforced concrete frames with infill brick walls, (b) unreinforced brick masonry bearing walls with reinforced concrete slabs, and (c) unreinforced brick masonry bearing walls with partial timber framing. Evaluating dozens of photos of each construction type, assigning each building in the study sample to a European Macroseismic Scale (EMS)-98 Vulnerability Class based upon its structural characteristics, and then individually assigning an EMS-98 Damage Grade to each building allows a statistically derived estimate of macroseismic intensity for each of nine study areas in and around the Kathmandu valley. This analysis concludes that EMS-98 macroseismic intensities for the study areas from the Gorkha mainshock typically were in the VII-IX range. The intensity assignment process described is more rigorous than the informal approach of assigning intensities based upon anecdotal media or first-person accounts of felt-reports, shaking, and their interpretation of damage. Detailed EMS-98 macroseismic assessments in urban areas are critical for quantifying relations between shaking and damage as well as for calibrating loss estimates. We show that the macroseismic assignments made herein result in fatality estimates consistent with the overall and district-wide reported values.

  3. Using structural damage statistics to derive macroseismic intensity within the Kathmandu valley for the 2015 M7.8 Gorkha, Nepal earthquake

    USGS Publications Warehouse

    McGowan, Sean; Jaiswal, Kishor; Wald, David J.

    2017-01-01

    We make and analyze structural damage observations from within the Kathmandu valley following the 2015 M7.8 Gorkha, Nepal earthquake to derive macroseismic intensities at several locations including some located near ground motion recording sites. The macroseismic intensity estimates supplement the limited strong ground motion data in order to characterize the damage statistics. This augmentation allows for direct comparisons between ground motion amplitudes and structural damage characteristics and ultimately produces a more constrained ground shaking hazard map for the Gorkha earthquake. For systematic assessments, we focused on damage to three specific building categories: (a) low/mid-rise reinforced concrete frames with infill brick walls, (b) unreinforced brick masonry bearing walls with reinforced concrete slabs, and (c) unreinforced brick masonry bearing walls with partial timber framing. Evaluating dozens of photos of each construction type, assigning each building in the study sample to a European Macroseismic Scale (EMS)-98 Vulnerability Class based upon its structural characteristics, and then individually assigning an EMS-98 Damage Grade to each building allows a statistically derived estimate of macroseismic intensity for each of nine study areas in and around the Kathmandu valley. This analysis concludes that EMS-98 macroseismic intensities for the study areas from the Gorkha mainshock typically were in the VII–IX range. The intensity assignment process described is more rigorous than the informal approach of assigning intensities based upon anecdotal media or first-person accounts of felt-reports, shaking, and their interpretation of damage. Detailed EMS-98 macroseismic assessments in urban areas are critical for quantifying relations between shaking and damage as well as for calibrating loss estimates. We show that the macroseismic assignments made herein result in fatality estimates consistent with the overall and district-wide reported values.

  4. Effect and safety of early weight-bearing on the outcome after open-wedge high tibial osteotomy: a systematic review and meta-analysis.

    PubMed

    Lee, O-Sung; Ahn, Soyeon; Lee, Yong Seuk

    2017-07-01

    The purpose of this systematic review and meta-analysis was to evaluate the effectiveness and safety of early weight-bearing by comparing clinical and radiological outcomes between early and traditional delayed weight-bearing after OWHTO. A rigorous and systematic approach was used, and the methodological quality of the included studies was assessed. Results that could be compared across two or more articles were presented as forest plots. A 95% confidence interval was calculated for each effect size, and the I² statistic was calculated to represent the percentage of total variation attributable to heterogeneity among studies. The random-effects model was used to calculate the effect size. Six articles were included in the final analysis. All case groups were composed of early full weight-bearing within 2 weeks. All control groups were composed of late full weight-bearing between 6 weeks and 2 months. Pooled analysis was possible for the improvement in Lysholm score, but no statistically significant difference was shown between groups. Other clinical results were also similar between groups. Four studies reported the mechanical femorotibial angle (mFTA), and this result showed no statistically significant difference between groups in the pooled analysis. Furthermore, early weight-bearing showed more favorable results in some radiologic results (osseointegration and patellar height) and complications (thrombophlebitis and recurrence). Our analysis supports that early full weight-bearing after OWHTO using a locking plate leads to improvements in outcomes comparable to delayed weight-bearing in terms of clinical and radiological outcomes. In addition, early weight-bearing was more favorable with respect to some radiologic parameters and complications compared with delayed weight-bearing.
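    The pooled random-effects estimate and the I² heterogeneity statistic referred to above follow the standard DerSimonian-Laird calculation, sketched here with illustrative inputs (not data from the review):

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect, with Cochran's Q,
    I^2 (percent of variation due to heterogeneity), and a 95% CI."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    y_fixed = (w * y).sum() / w.sum()           # fixed-effect pooled mean
    Q = (w * (y - y_fixed) ** 2).sum()          # Cochran's Q
    df = len(y) - 1
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (Q - df) / c)               # between-study variance
    i2 = max(0.0, (Q - df) / Q) * 100.0 if Q > 0 else 0.0
    w_re = 1.0 / (v + tau2)                     # random-effects weights
    y_pool = (w_re * y).sum() / w_re.sum()
    se = 1.0 / np.sqrt(w_re.sum())
    return y_pool, (y_pool - 1.96 * se, y_pool + 1.96 * se), i2

# hypothetical per-study mean differences and their variances
est, ci, i2 = random_effects_pool([1.2, 0.8, 1.5], [0.30, 0.25, 0.40])
```

    If the 95% CI of the pooled difference crosses zero, the comparison is reported as not statistically significant, as for the Lysholm score above.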

  5. Increasing URM Undergraduate Student Success through Assessment-Driven Interventions: A Multiyear Study Using Freshman-Level General Biology as a Model System.

    PubMed

    Carmichael, Mary C; St Clair, Candace; Edwards, Andrea M; Barrett, Peter; McFerrin, Harris; Davenport, Ian; Awad, Mohamed; Kundu, Anup; Ireland, Shubha Kale

    2016-01-01

    Xavier University of Louisiana leads the nation in awarding BS degrees in the biological sciences to African-American students. In this multiyear study with ∼5500 participants, data-driven interventions were adopted to improve student academic performance in a freshman-level general biology course. The three hour-long exams were common and administered concurrently to all students. New exam questions were developed using Bloom's taxonomy, and exam results were analyzed statistically with validated assessment tools. All but the comprehensive final exam were returned to students for self-evaluation and remediation. Among other approaches, course rigor was monitored by using an identical set of 60 questions on the final exam across 10 semesters. Analysis of the identical sets of 60 final exam questions revealed that overall averages increased from 72.9% (2010) to 83.5% (2015). Regression analysis demonstrated a statistically significant correlation between high-risk students and their averages on the 60 questions. Additional analysis demonstrated statistically significant improvements for at least one letter grade from midterm to final and a 20% increase in the course pass rates over time, also for the high-risk population. These results support the hypothesis that our data-driven interventions and assessment techniques are successful in improving student retention, particularly for our academically at-risk students. © 2016 M. C. Carmichael et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  6. Statistical framework for the utilization of simultaneous pupil plane and focal plane telemetry for exoplanet imaging. I. Accounting for aberrations in multiple planes.

    PubMed

    Frazin, Richard A

    2016-04-01

    A new generation of telescopes with mirror diameters of 20 m or more, called extremely large telescopes (ELTs), has the potential to provide unprecedented imaging and spectroscopy of exoplanetary systems, if the difficulties in achieving the extremely high dynamic range required to differentiate the planetary signal from the star can be overcome to a sufficient degree. Fully utilizing the potential of ELTs for exoplanet imaging will likely require simultaneous and self-consistent determination of both the planetary image and the unknown aberrations in multiple planes of the optical system, using statistical inference based on the wavefront sensor and science camera data streams. This approach promises to overcome the most important systematic errors inherent in the various schemes based on differential imaging, such as angular differential imaging and spectral differential imaging. This paper is the first in a series on this subject, in which a formalism is established for the exoplanet imaging problem, setting the stage for the statistical inference methods to follow in the future. Every effort has been made to be rigorous and complete, so that validity of approximations to be made later can be assessed. Here, the polarimetric image is expressed in terms of aberrations in the various planes of a polarizing telescope with an adaptive optics system. Further, it is shown that current methods that utilize focal plane sensing to correct the speckle field, e.g., electric field conjugation, rely on the tacit assumption that aberrations on multiple optical surfaces can be represented as aberration on a single optical surface, ultimately limiting their potential effectiveness for ground-based astronomy.

  7. Time-of-flight PET image reconstruction using origin ensembles.

    PubMed

    Wülker, Christian; Sitek, Arkadiusz; Prevrhal, Sven

    2015-03-07

    The origin ensemble (OE) algorithm is a novel statistical method for minimum-mean-square-error (MMSE) reconstruction of emission tomography data. This method allows one to perform reconstruction entirely in the image domain, i.e. without the use of forward and backprojection operations. We have investigated the OE algorithm in the context of list-mode (LM) time-of-flight (TOF) PET reconstruction. In this paper, we provide a general introduction to MMSE reconstruction, and a statistically rigorous derivation of the OE algorithm. We show how to efficiently incorporate TOF information into the reconstruction process, and how to correct for random coincidences and scattered events. To examine the feasibility of LM-TOF MMSE reconstruction with the OE algorithm, we applied MMSE-OE and standard maximum-likelihood expectation-maximization (ML-EM) reconstruction to LM-TOF phantom data with a count number typically registered in clinical PET examinations. We analyzed the convergence behavior of the OE algorithm, and compared reconstruction time and image quality to that of the EM algorithm. In summary, during the reconstruction process, MMSE-OE contrast recovery (CRV) remained approximately the same, while background variability (BV) gradually decreased with an increasing number of OE iterations. The final MMSE-OE images exhibited lower BV and a slightly lower CRV than the corresponding ML-EM images. The reconstruction time of the OE algorithm was approximately 1.3 times longer. At the same time, the OE algorithm can inherently provide a comprehensive statistical characterization of the acquired data. This characterization can be utilized for further data processing, e.g. in kinetic analysis and image registration, making the OE algorithm a promising approach in a variety of applications.

  8. Time-of-flight PET image reconstruction using origin ensembles

    NASA Astrophysics Data System (ADS)

    Wülker, Christian; Sitek, Arkadiusz; Prevrhal, Sven

    2015-03-01

    The origin ensemble (OE) algorithm is a novel statistical method for minimum-mean-square-error (MMSE) reconstruction of emission tomography data. This method allows one to perform reconstruction entirely in the image domain, i.e. without the use of forward and backprojection operations. We have investigated the OE algorithm in the context of list-mode (LM) time-of-flight (TOF) PET reconstruction. In this paper, we provide a general introduction to MMSE reconstruction, and a statistically rigorous derivation of the OE algorithm. We show how to efficiently incorporate TOF information into the reconstruction process, and how to correct for random coincidences and scattered events. To examine the feasibility of LM-TOF MMSE reconstruction with the OE algorithm, we applied MMSE-OE and standard maximum-likelihood expectation-maximization (ML-EM) reconstruction to LM-TOF phantom data with a count number typically registered in clinical PET examinations. We analyzed the convergence behavior of the OE algorithm, and compared reconstruction time and image quality to that of the EM algorithm. In summary, during the reconstruction process, MMSE-OE contrast recovery (CRV) remained approximately the same, while background variability (BV) gradually decreased with an increasing number of OE iterations. The final MMSE-OE images exhibited lower BV and a slightly lower CRV than the corresponding ML-EM images. The reconstruction time of the OE algorithm was approximately 1.3 times longer. At the same time, the OE algorithm can inherently provide a comprehensive statistical characterization of the acquired data. This characterization can be utilized for further data processing, e.g. in kinetic analysis and image registration, making the OE algorithm a promising approach in a variety of applications.

  9. Demodulation of messages received with low signal to noise ratio

    NASA Astrophysics Data System (ADS)

    Marguinaud, A.; Quignon, T.; Romann, B.

    The implementation of this all-digital demodulator is derived from maximum-likelihood considerations applied to an analytical representation of the received signal. Traditional matched filters and phase-locked loops are replaced by minimum-variance estimators and hypothesis tests. These statistical tests become very simple when working on the phase signal. These methods, combined with rigorous control of the data representation, allow significant computation savings compared to conventional realizations. Nominal operation has been verified down to a signal-to-noise ratio of -3 dB on a QPSK demodulator.
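    For QPSK under additive white Gaussian noise, the phase-domain hypothesis test alluded to reduces to a nearest-constellation-phase decision. A minimal sketch (not the authors' implementation):

```python
import numpy as np

def qpsk_decide(phase):
    """Maximum-likelihood symbol decision from a received phase sample:
    pick the QPSK constellation phase at minimum wrapped-phase distance."""
    refs = np.array([np.pi / 4, 3 * np.pi / 4, -3 * np.pi / 4, -np.pi / 4])
    # wrap phase differences into (-pi, pi] before comparing distances
    d = np.angle(np.exp(1j * (phase - refs)))
    return int(np.argmin(np.abs(d)))
```

    Working directly on wrapped phase differences like this avoids any explicit loop filter, which is the computational saving the abstract points to.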

  10. Ray-optical theory of broadband partially coherent emission

    NASA Astrophysics Data System (ADS)

    Epstein, Ariel; Tessler, Nir; Einziger, Pinchas D.

    2013-04-01

    We present a rigorous formulation of the effects of spectral broadening on emission of partially coherent source ensembles embedded in multilayered formations with arbitrarily shaped interfaces, provided geometrical optics is valid. The resulting ray-optical theory, applicable to a variety of optical systems from terahertz lenses to photovoltaic cells, quantifies the fundamental interplay between bandwidth and layer dimensions, and sheds light on common practices in optical analysis of statistical fields, e.g., disregarding multiple reflections or neglecting interference cross terms.

  11. A case-control study of malignant melanoma among Lawrence Livermore National Laboratory employees: A critical evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kupper, L.L.; Setzer, R.W.; Schwartzbaum, J.

    1987-07-01

    This document reports on a reevaluation of data obtained in a previous report on occupational factors associated with the development of malignant melanomas at Lawrence Livermore National Laboratory. The current report reduces the number of these factors from five to three based on a rigorous statistical analysis of the original data. Recommendations include restructuring the original questionnaire and trying to contact more individuals that worked with volatile photographic chemicals. 17 refs., 7 figs., 22 tabs. (TEM)

  12. Alternative approaches to research in physical therapy: positivism and phenomenology.

    PubMed

    Shepard, K F; Jensen, G M; Schmoll, B J; Hack, L M; Gwyer, J

    1993-02-01

    This article presents philosophical approaches to research in physical therapy. A comparison is made to demonstrate how the research purpose, research design, research methods, and research data differ when one approaches research from the philosophical perspective of positivism (predominantly quantitative) as compared with the philosophical perspective of phenomenology (predominantly qualitative). Differences between the two approaches are highlighted by examples from research articles published in Physical Therapy. The authors urge physical therapy researchers to become familiar with the tenets, rigor, and knowledge gained from the use of both approaches in order to increase their options in conducting research relevant to the practice of physical therapy.

  13. On the Modeling of Shells in Multibody Dynamics

    NASA Technical Reports Server (NTRS)

    Bauchau, Olivier A.; Choi, Jou-Young; Bottasso, Carlo L.

    2000-01-01

    Energy preserving/decaying schemes are presented for the simulation of nonlinear multibody systems involving shell components. The proposed schemes are designed to meet four specific requirements: unconditional nonlinear stability of the scheme, a rigorous treatment of both geometric and material nonlinearities, exact satisfaction of the constraints, and the presence of high-frequency numerical dissipation. The kinematic nonlinearities associated with arbitrarily large displacements and rotations of shells are treated in a rigorous manner, and the material nonlinearities can be handled when the constitutive laws stem from the existence of a strain energy density function. The efficiency and robustness of the proposed approach are illustrated with specific numerical examples that also demonstrate the need for integration schemes possessing high-frequency numerical dissipation.

  14. Efficient O(N) integration for all-electron electronic structure calculation using numeric basis functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Havu, V.; Fritz Haber Institute of the Max Planck Society, Berlin; Blum, V.

    2009-12-01

We consider the problem of developing O(N) scaling grid-based operations needed in many central operations when performing electronic structure calculations with numeric atom-centered orbitals as basis functions. We outline the overall formulation of localized algorithms, and specifically the creation of localized grid batches. The choice of the grid partitioning scheme plays an important role in the performance and memory consumption of the grid-based operations. Three different top-down partitioning methods are investigated, and compared with formally more rigorous yet much more expensive bottom-up algorithms. We show that a conceptually simple top-down grid partitioning scheme achieves essentially the same efficiency as the more rigorous bottom-up approaches.
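A hypothetical sketch of one conceptually simple top-down scheme, assuming recursive bisection along the widest coordinate axis (the abstract does not specify the authors' actual partitioning rules): grid points are split in half until every batch is small and spatially compact.

```python
import numpy as np

def partition(points, max_batch):
    """Recursively bisect a point cloud along its widest axis until
    every batch holds at most max_batch points; returns index arrays."""
    batches = []

    def split(ids):
        if len(ids) <= max_batch:
            batches.append(ids)
            return
        p = points[ids]
        axis = np.argmax(p.max(axis=0) - p.min(axis=0))  # widest extent
        order = ids[np.argsort(p[:, axis])]               # sort along it
        mid = len(order) // 2
        split(order[:mid])
        split(order[mid:])

    split(np.arange(len(points)))
    return batches

rng = np.random.default_rng(0)
pts = rng.uniform(size=(1000, 3))
batches = partition(pts, 64)
assert sum(len(b) for b in batches) == 1000
```

Because each batch is spatially localized, operations restricted to a batch touch only nearby basis functions, which is what makes O(N) scaling of the grid work possible.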

  15. A feminist response to Weitzer.

    PubMed

    Dines, Gail

    2012-04-01

In his review of my book Pornland: How Porn Has Hijacked Our Sexuality, Ronald Weitzer claims that anti-porn feminists are incapable of objective, rigorous research because they operate within the "oppression paradigm," which he defines as "a perspective that depicts all types of sex work as exploitive, violent, and perpetuating gender inequality" (VAW, 2011, 666). This article argues that while anti-porn feminists do indeed see pornography as exploitive, such a position is rooted in the rigorous theories and methods of cultural studies developed by critical media scholars such as Stuart Hall and Antonio Gramsci. Pornland applies a cultural studies approach by exploring how porn images are part of a wider system of sexist representations that legitimize and normalize the economic, political, and legal oppression of women.

  16. Development of rigor mortis is not affected by muscle volume.

    PubMed

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  17. Neoliberalism, Policy Reforms and Higher Education in Bangladesh

    ERIC Educational Resources Information Center

    Kabir, Ariful Haq

    2013-01-01

    Bangladesh has introduced neoliberal policies since the 1970s. Military regimes, since the dramatic political changes in 1975, accelerated the process. A succession of military rulers made rigorous changes in policy-making in various sectors. This article uses a critical approach to document analysis and examines the perceptions of key…

  18. Quantitative Approaches to Group Research: Suggestions for Best Practices

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…

  19. Making Sense of Energy

    ERIC Educational Resources Information Center

    Boohan, Richard

    2014-01-01

    This article describes an approach to teaching about the energy concept that aims to be accessible to students starting in early secondary school, while being scientifically rigorous and forming the foundation for later work. It discusses how exploring thermal processes is a good starting point for a more general consideration of the ways that…

  20. Academic Rigor

    ERIC Educational Resources Information Center

    Tucker, James E.

    2014-01-01

    The field of K-12 deaf education today continues to be fractured by ideological camps. A newcomer to the field quickly learns that the controversies related to language, communication, and instructional approaches continue to rage after almost 200 years of contentious debate. Much attention is given to auditory and speech development as well as…
