Rigorous Science: a How-To Guide.
Casadevall, Arturo; Fang, Ferric C
2016-11-08
Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.
Applying Sociocultural Theory to Teaching Statistics for Doctoral Social Work Students
ERIC Educational Resources Information Center
Mogro-Wilson, Cristina; Reeves, Michael G.; Charter, Mollie Lazar
2015-01-01
This article describes the development of two doctoral-level multivariate statistics courses utilizing sociocultural theory, an integrative pedagogical framework. In the first course, the implementation of sociocultural theory helps to support the students through a rigorous introduction to statistics. The second course involves students…
Schaid, Daniel J
2010-01-01
Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
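The positive semidefiniteness requirement described above is easy to check numerically. A minimal Python sketch, assuming a simple centered linear kernel on made-up genotype data (an illustration of the kernel idea, not any specific method from the review):

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical genotype matrix: rows = subjects, allele counts in {0, 1, 2}.
    G = rng.integers(0, 3, size=(20, 100)).astype(float)

    Gc = G - G.mean(axis=0)          # center each marker
    K = Gc @ Gc.T / Gc.shape[1]      # subject-by-subject similarity kernel

    # A valid kernel must be positive semidefinite: all eigenvalues
    # non-negative up to numerical tolerance.
    eigvals = np.linalg.eigvalsh(K)
    assert eigvals.min() > -1e-10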
NASA Astrophysics Data System (ADS)
Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew
2007-04-01
One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian networks, and possibility theory, in the form of fuzzy logic systems, has recently been introduced to provide a rigorous framework for high-level inference. Previous research developed the theoretical basis and benefits of the hybrid approach, but a concrete experimental comparison of the hybrid framework with traditional fusion methods, which would demonstrate and quantify this benefit, has been lacking. The goal of this research, therefore, is to statistically compare the accuracy and performance of the hybrid framework with pure Bayesian and fuzzy systems and with an inexact Bayesian system approximated using particle filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference over other fusion tools.
Origin of the spike-timing-dependent plasticity rule
NASA Astrophysics Data System (ADS)
Cho, Myoung Won; Choi, M. Y.
2016-08-01
A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.
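For orientation, the classic exponential spike-timing-dependent plasticity window, the kind of rule such derivations recover, can be written in a few lines of Python; the parameter values below are illustrative assumptions, not quantities taken from the paper:

    import numpy as np

    def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
        """Weight change as a function of dt = t_post - t_pre (ms):
        potentiation when the presynaptic spike precedes the postsynaptic
        one, depression otherwise."""
        dt = np.asarray(dt, dtype=float)
        return np.where(dt >= 0,
                        a_plus * np.exp(-dt / tau),
                        -a_minus * np.exp(dt / tau))

    print(stdp_dw([-30.0, -5.0, 5.0, 30.0]))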
Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows
NASA Astrophysics Data System (ADS)
Qi, Di; Majda, Andrew J.
2018-04-01
Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.
Mourning dove hunting regulation strategy based on annual harvest statistics and banding data
Otis, D.L.
2006-01-01
Although managers should strive to base game bird harvest management strategies on mechanistic population models, the monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rate are available, population estimates derived from these harvest data, and the population growth rates they imply, can serve as the basis for hunting regulation decisions. I present a statistically rigorous approach to regulation decision-making that uses a hypothesis-testing framework and an assumed set of three hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura) and use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
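As a sketch of the kind of harvest-based population estimate such a strategy rests on, assuming hypothetical numbers and a simple binomial band-recovery model (not the paper's actual data or estimator):

    import numpy as np

    # Hypothetical inputs, for illustration only.
    H, var_H = 1_200_000, 50_000.0**2      # total harvest estimate and variance
    recovered, banded = 180, 2_000         # band recoveries among banded birds

    h = recovered / banded                 # harvest-rate estimate
    var_h = h * (1 - h) / banded           # binomial variance

    N = H / h                              # population estimate
    # Delta-method variance for a ratio of independent estimates.
    var_N = (1 / h)**2 * var_H + (H / h**2)**2 * var_h
    print(f"N = {N:,.0f} (SE = {np.sqrt(var_N):,.0f})")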
ERIC Educational Resources Information Center
Horne, Lela M.; Rachal, John R.; Shelley, Kyna
2012-01-01
A mixed methods framework utilized quantitative and qualitative data to determine whether statistically significant differences existed between high school and GED[R] student perceptions of credential value. An exploratory factor analysis (n=326) extracted four factors and then a MANOVA procedure was performed with a stratified quota sample…
NASA Astrophysics Data System (ADS)
Cenek, Martin; Dahl, Spencer K.
2016-11-01
Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
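One ingredient of such a framework, recording the probability of agents changing from one behavior pattern to another, reduces to a row-normalized transition count matrix once per-step behaviors have been classified. A minimal sketch with made-up labels:

    import numpy as np

    # Hypothetical behavior labels per agent (rows) over time (columns),
    # e.g. produced by clustering agent state trajectories from an ABM run.
    labels = np.array([
        [0, 0, 1, 1, 2],
        [0, 1, 1, 2, 2],
        [1, 1, 1, 2, 0],
    ])

    n = labels.max() + 1
    T = np.zeros((n, n))
    for row in labels:                        # count pattern-to-pattern moves
        for a, b in zip(row[:-1], row[1:]):
            T[a, b] += 1
    T /= T.sum(axis=1, keepdims=True)         # rows become probabilities
    print(T)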
Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene
2016-04-01
Many widely used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based, or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs, as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which unintended consequences can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisey, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
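A toy version of the elicitation idea: impute each uncertain cause-of-death from the observer's stated probabilities across Monte Carlo draws, then summarize cause-specific fractions with uncertainty. The paper embeds this augmentation step inside a full Bayesian hierarchical survival model; the probabilities below are invented:

    import numpy as np

    rng = np.random.default_rng(1)
    # Elicited observer probabilities that each death was due to cause
    # 0 (harvest), 1 (predation), or 2 (other) -- hypothetical values.
    p_elicited = np.array([
        [0.8, 0.1, 0.1],
        [0.2, 0.7, 0.1],
        [0.4, 0.4, 0.2],
        [0.1, 0.8, 0.1],
    ])

    # Impute causes from the elicited distributions, once per draw.
    draws = np.stack([
        np.bincount([rng.choice(3, p=p) for p in p_elicited], minlength=3)
        for _ in range(2000)
    ])
    frac = draws / p_elicited.shape[0]
    print("mean cause fractions:", frac.mean(axis=0))
    print("95% intervals:", np.percentile(frac, [2.5, 97.5], axis=0).T)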
The MIXED framework: A novel approach to evaluating mixed-methods rigor.
Eckhardt, Ann L; DeVon, Holli A
2017-10-01
Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.
Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data
Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil
2014-01-01
Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202
A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.
Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas
2018-02-23
We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results on some canonical dynamical models (stochastic processes with and without correlations, chaotic processes), complemented by extensive numerical simulations for a range of processes which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state-of-the-art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines, including cases in neurobiology, finance, and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics, and that a graph-theoretic extraction procedure allows these features to be used for statistical learning purposes.
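Stripped of its combinatorial and graph-theoretic machinery, the underlying quantity is simple to compute: compare statistics of local maxima against local minima. A bare-bones sketch on a toy correlated process:

    import numpy as np

    def peak_pit_stats(x):
        # Local maxima/minima of a 1-D series and simple asymmetry summaries.
        x = np.asarray(x, dtype=float)
        mid = x[1:-1]
        peaks = mid[(mid > x[:-2]) & (mid > x[2:])]
        pits = mid[(mid < x[:-2]) & (mid < x[2:])]
        return {"n_peaks": peaks.size, "n_pits": pits.size,
                "mean_peak": peaks.mean(), "mean_pit": pits.mean()}

    rng = np.random.default_rng(0)
    x = np.cumsum(rng.standard_normal(10_000))    # toy random walk
    print(peak_pit_stats(x))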
Overarching framework for data-based modelling
NASA Astrophysics Data System (ADS)
Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco
2014-02-01
Networks are one of the main modelling paradigms for complex physical systems. When estimating the network structure from measured signals, typically several assumptions, such as stationarity, are made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. Here we propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics, which are directly and sensibly applicable to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches followed so far lack.
Statistical ecology comes of age.
Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-12-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.
NASA Astrophysics Data System (ADS)
Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean
2016-04-01
A framework is presented within which we provide rigorous estimations of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimation of models and their uncertainties based on data information. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Hence, reliable estimation of model parameters and their uncertainties is possible while avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data from the North Korean nuclear explosion tests. The combination of these new Bayesian techniques and the structural model, coupled with meaningful uncertainties for each of the processes, enables more quantitative monitoring and discrimination of seismic events.
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
Grading Rigor in Counselor Education: A Specifications Grading Framework
ERIC Educational Resources Information Center
Bonner, Matthew W.
2016-01-01
According to accreditation and professional bodies, evaluation and grading are a high priority in counselor education. Specifications grading, an evaluative tool, can be used to increase grading rigor. This article describes the components of specifications grading and applies the framework of specifications grading to a counseling theories course.
Maximum entropy models as a tool for building precise neural controls.
Savin, Cristina; Tkačik, Gašper
2017-10-01
Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.
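A minimal sketch of the surrogate-ensemble idea: a first-order surrogate that preserves each neuron's firing rate but destroys correlations, against which pairwise structure in the data can be tested. Data here are synthetic; real MaxEnt fitting (e.g., pairwise Ising models) is more involved:

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic binary population raster: neurons x time bins.
    raster = (rng.random((30, 5000)) < 0.05).astype(int)

    # Shuffle each row independently in time: rates preserved,
    # correlations maximally unstructured.
    surrogate = np.array([rng.permutation(row) for row in raster])

    def mean_abs_corr(r):
        c = np.corrcoef(r)
        return np.abs(c[np.triu_indices_from(c, k=1)]).mean()

    print("data:     ", mean_abs_corr(raster))
    print("surrogate:", mean_abs_corr(surrogate))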
Skelly, Daniel A.; Johansson, Marnie; Madeoy, Jennifer; Wakefield, Jon; Akey, Joshua M.
2011-01-01
Variation in gene expression is thought to make a significant contribution to phenotypic diversity among individuals within populations. Although high-throughput cDNA sequencing offers a unique opportunity to delineate the genome-wide architecture of regulatory variation, new statistical methods need to be developed to capitalize on the wealth of information contained in RNA-seq data sets. To this end, we developed a powerful and flexible hierarchical Bayesian model that combines information across loci to allow both global and locus-specific inferences about allele-specific expression (ASE). We applied our methodology to a large RNA-seq data set obtained in a diploid hybrid of two diverse Saccharomyces cerevisiae strains, as well as to RNA-seq data from an individual human genome. Our statistical framework accurately quantifies levels of ASE with specified false-discovery rates, achieving high reproducibility between independent sequencing platforms. We pinpoint loci that show unusual and biologically interesting patterns of ASE, including allele-specific alternative splicing and transcription termination sites. Our methodology provides a rigorous, quantitative, and high-resolution tool for profiling ASE across whole genomes. PMID:21873452
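For contrast with the hierarchical model, a sketch of the simple per-locus baseline it improves upon: a binomial test of allelic balance with Benjamini-Hochberg false-discovery control. The counts are hypothetical:

    import numpy as np
    from scipy.stats import binomtest

    # Hypothetical (reference, alternate) allele read counts per locus.
    counts = [(52, 48), (90, 30), (12, 10), (200, 120), (33, 35)]
    pvals = np.array([binomtest(a, a + b, 0.5).pvalue for a, b in counts])

    # Benjamini-Hochberg at false-discovery rate q = 0.05.
    order = np.argsort(pvals)
    m = len(pvals)
    passed = pvals[order] <= 0.05 * np.arange(1, m + 1) / m
    k = passed.nonzero()[0].max() + 1 if passed.any() else 0
    print("ASE loci (BH, q=0.05):", sorted(order[:k].tolist()))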
A probabilistic framework to infer brain functional connectivity from anatomical connections.
Deligianni, Fani; Varoquaux, Gael; Thirion, Bertrand; Robinson, Emma; Sharp, David J; Edwards, A David; Rueckert, Daniel
2011-01-01
We present a novel probabilistic framework to learn across several subjects a mapping from brain anatomical connectivity to functional connectivity, i.e. the covariance structure of brain activity. This prediction problem must be formulated as a structured-output learning task, as the predicted parameters are strongly correlated. We introduce a model selection framework based on cross-validation with a parametrization-independent loss function suitable to the manifold of covariance matrices. Our model is based on constraining the conditional independence structure of functional activity by the anatomical connectivity. Subsequently, we learn a linear predictor of a stationary multivariate autoregressive model. This natural parameterization of functional connectivity also enforces the positive-definiteness of the predicted covariance and thus matches the structure of the output space. Our results show that functional connectivity can be explained by anatomical connectivity on a rigorous statistical basis, and that a proper model of functional connectivity is essential to assess this link.
Schlägel, Ulrike E; Lewis, Mark A
2016-12-01
Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when resolution of data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
Statistical Analysis of Protein Ensembles
NASA Astrophysics Data System (ADS)
Máté, Gabriell; Heermann, Dieter
2014-04-01
As 3D protein-configuration data piles up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
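The zero-dimensional part of such a barcode has a compact characterization: every point is born at scale zero, and connected components die at single-linkage merge heights. A small sketch on toy coordinates (the paper's framework also uses higher-dimensional topological features, which this omits):

    import numpy as np
    from scipy.cluster.hierarchy import linkage

    rng = np.random.default_rng(0)
    # Toy stand-in for atomic coordinates of one molecular configuration.
    points = rng.standard_normal((50, 3))

    # H0 persistence barcode: bars (0, d) for single-linkage merge heights d.
    merge_heights = linkage(points, method="single")[:, 2]
    barcode_h0 = [(0.0, float(d)) for d in merge_heights]
    print(barcode_h0[:5])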
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual Strategy and Social Structure: An Evolutionary Theory of Institutions, Princeton University Press, 1998). A series of recent papers (Brandt et al., Proceedings of the 44th Annual ACM Symposium on Theory of Computing (STOC 2012), 2012; Barmpalias et al., 55th Annual IEEE Symposium on Foundations of Computer Science, 2014; Barmpalias et al., J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (STOC 2012).
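For readers unfamiliar with the model, a toy unperturbed Schelling simulation on a two-dimensional grid: unhappy agents (too few like neighbours) relocate to random empty sites, with no noise in the dynamics. Parameters are illustrative; the paper's asymptotic analysis goes far beyond this sketch:

    import numpy as np

    rng = np.random.default_rng(0)
    N, tol = 50, 0.5
    # 0 = empty site, 1 and 2 = the two agent types.
    grid = rng.choice([0, 1, 2], size=(N, N), p=[0.1, 0.45, 0.45])

    def unhappy(g):
        out = []
        for i in range(N):
            for j in range(N):
                if g[i, j] == 0:
                    continue
                nbrs = g[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2].ravel()
                same = (nbrs == g[i, j]).sum() - 1       # exclude self
                occupied = (nbrs != 0).sum() - 1
                if occupied > 0 and same / occupied < tol:
                    out.append((i, j))
        return out

    for _ in range(2000):               # move one unhappy agent at a time
        movers = unhappy(grid)
        if not movers:
            break
        i, j = movers[rng.integers(len(movers))]
        ei, ej = np.argwhere(grid == 0)[rng.integers((grid == 0).sum())]
        grid[ei, ej], grid[i, j] = grid[i, j], 0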
Density profiles in the Scrape-Off Layer interpreted through filament dynamics
NASA Astrophysics Data System (ADS)
Militello, Fulvio
2017-10-01
We developed a new theoretical framework to clarify the relation between radial Scrape-Off Layer density profiles and the fluctuations that generate them. The framework provides an interpretation of the experimental features of the profiles and of the turbulence statistics on the basis of simple properties of the filaments, such as their radial motion and their draining towards the divertor. L-mode and inter-ELM filaments are described as a Poisson process in which each event is independent and modelled with a wave function of amplitude and width statistically distributed according to experimental observations and evolving according to fluid equations. We will rigorously show that radially accelerating filaments, less efficient parallel exhaust and also a statistical distribution of their radial velocity can contribute to induce flatter profiles in the far SOL and therefore enhance plasma-wall interactions. A quite general result of our analysis is the resiliency of this non-exponential nature of the profiles and the increase of the relative fluctuation amplitude towards the wall, as experimentally observed. According to the framework, profile broadening at high fueling rates can be caused by interactions with neutrals (e.g. charge exchange) in the divertor or by a significant radial acceleration of the filaments. The framework assumptions were tested with 3D numerical simulations of seeded SOL filaments based on a two fluid model. In particular, filaments interact through the electrostatic field they generate only when they are in close proximity (separation comparable to their width in the drift plane), thus justifying our independence hypothesis. In addition, we will discuss how isolated filament motion responds to variations in the plasma conditions, and specifically divertor conditions. Finally, using the theoretical framework we will reproduce and interpret experimental results obtained on JET, MAST and HL-2A.
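The Poisson-process picture above lends itself to a compact shot-noise sketch: independent exponential pulses with random amplitudes superposed at Poisson arrival times, standing in for the signal at one far-SOL position. The framework in the paper additionally tracks radial filament motion and parallel draining; all parameters here are invented:

    import numpy as np

    rng = np.random.default_rng(0)
    T, dt, rate, tau = 1000.0, 0.01, 0.5, 2.0
    t = np.arange(0.0, T, dt)

    n_events = rng.poisson(rate * T)
    arrivals = rng.uniform(0.0, T, n_events)
    amps = rng.exponential(1.0, n_events)     # exponentially distributed sizes

    signal = np.zeros_like(t)
    for t0, a in zip(arrivals, amps):
        mask = t >= t0
        signal[mask] += a * np.exp(-(t[mask] - t0) / tau)

    # Relative fluctuation amplitude of the resulting filamentary signal.
    print(f"std/mean = {signal.std() / signal.mean():.2f}")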
The Optics of Refractive Substructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Michael D.; Narayan, Ramesh
2016-08-01
Newly recognized effects of refractive scattering in the ionized interstellar medium have broad implications for very long baseline interferometry (VLBI) at extreme angular resolutions. Building upon work by Blandford and Narayan, we present a simplified, geometrical optics framework, which enables rapid, semi-analytic estimates of refractive scattering effects. We show that these estimates exactly reproduce previous results based on a more rigorous statistical formulation. We then derive new expressions for the scattering-induced fluctuations of VLBI observables such as closure phase, and we demonstrate how to calculate the fluctuations for arbitrary quantities of interest using a Monte Carlo technique.
Statistical Inference for Data Adaptive Target Parameters.
Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J
2016-05-01
Suppose one observes n i.i.d. copies of a random variable with a probability distribution that is known to be an element of a particular statistical model. In order to define our statistical target, we partition the sample into V equal-size sub-samples and use this partitioning to define V splits into an estimation sample (one of the V sub-samples) and a corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V sample-specific target parameters. We present an estimator (and corresponding central limit theorem) of this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed within this paper provides a new impetus for a greater involvement of statistical inference in problems that are increasingly addressed by clever, yet ad hoc, pattern-finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules, are shown and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
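A minimal sketch of the V-split construction just described, with a deliberately simple data adaptive algorithm (pick the predictor most correlated with the response on the parameter-generating sample) and target parameter (that predictor's correlation with the response on the held-out estimation sample); the data are simulated:

    import numpy as np

    rng = np.random.default_rng(0)
    n, p, V = 500, 10, 5
    X = rng.standard_normal((n, p))
    y = 0.5 * X[:, 3] + rng.standard_normal(n)

    folds = np.array_split(rng.permutation(n), V)
    estimates = []
    for v in range(V):
        est = folds[v]                                   # estimation sample
        gen = np.concatenate([folds[u] for u in range(V) if u != v])
        # Data adaptive step, using the parameter-generating sample only.
        cors = [abs(np.corrcoef(X[gen, j], y[gen])[0, 1]) for j in range(p)]
        j_star = int(np.argmax(cors))
        # Target parameter evaluated on the held-out estimation sample.
        estimates.append(np.corrcoef(X[est, j_star], y[est])[0, 1])

    print("data adaptive target parameter:", np.mean(estimates))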
Statistical tests and identifiability conditions for pooling and analyzing multisite datasets
Zhou, Hao Henry; Singh, Vikas; Johnson, Sterling C.; Wahba, Grace
2018-01-01
When sample sizes are small, the ability to identify weak (but scientifically interesting) associations between a set of predictors and a response may be enhanced by pooling existing datasets. However, variations in acquisition methods and the distribution of participants or observations between datasets, especially due to the distributional shifts in some predictors, may obfuscate real effects when datasets are combined. We present a rigorous statistical treatment of this problem and identify conditions where we can correct the distributional shift. We also provide an algorithm for the situation where the correction is identifiable. We analyze various properties of the framework for testing model fit, constructing confidence intervals, and evaluating consistency characteristics. Our technical development is motivated by Alzheimer’s disease (AD) studies, and we present empirical results showing that our framework enables harmonizing of protein biomarkers, even when the assays across sites differ. Our contribution may, in part, mitigate a bottleneck that researchers face in clinical research when pooling smaller sized datasets and may offer benefits when the subjects of interest are difficult to recruit or when resources prohibit large single-site studies. PMID:29386387
Accurate Modeling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model
NASA Astrophysics Data System (ADS)
Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron; Scoccimarro, Roman
2015-01-01
The large-scale distribution of galaxies can be explained fairly simply by assuming (i) a cosmological model, which determines the dark matter halo distribution, and (ii) a simple connection between galaxies and the halos they inhabit. This conceptually simple framework, called the halo model, has been remarkably successful at reproducing the clustering of galaxies on all scales, as observed in various galaxy redshift surveys. However, none of these previous studies have carefully modeled the systematics and thus truly tested the halo model in a statistically rigorous sense. We present a new accurate and fully numerical halo model framework and test it against clustering measurements from two luminosity samples of galaxies drawn from the SDSS DR7. We show that the simple ΛCDM cosmology + halo model is not able to simultaneously reproduce the galaxy projected correlation function and the group multiplicity function. In particular, the more luminous sample shows significant tension with theory. We discuss the implications of our findings and how this work paves the way for constraining galaxy formation by accurate simultaneous modeling of multiple galaxy clustering statistics.
ERIC Educational Resources Information Center
Whitley, Meredith A.
2014-01-01
While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…
Output statistics of laser anemometers in sparsely seeded flows
NASA Technical Reports Server (NTRS)
Edwards, R. V.; Jensen, A. S.
1982-01-01
It is noted that until very recently, research on this topic concentrated on the particle arrival statistics and the influence of the optical parameters on them. Little attention has been paid to the influence of subsequent processing on the measurement statistics. There is also controversy over whether the effects of the particle statistics can be measured. It is shown here that some of the confusion derives from a lack of understanding of the experimental parameters that are to be controlled or known. A rigorous framework is presented for examining the measurement statistics of such systems. To provide examples, two problems are then addressed: the first concerns a sample-and-hold processor, the second what is called a saturable processor. The sample-and-hold processor converts the output to a continuous signal by holding the last reading until a new one is obtained. The saturable system is one in which the maximum processable rate is set by the dead time of some unit in the system. At high particle rates, the processed rate is determined by the dead time.
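A sketch of the sample-and-hold processor and the sampling bias at issue: particles arrive at a rate proportional to the flow speed, so naive averages over particle measurements are biased toward high velocities, whereas holding the last reading weights each measurement by its inter-arrival time. All parameters are made up:

    import numpy as np

    rng = np.random.default_rng(0)
    dt = 1e-3
    t = np.arange(0.0, 200.0, dt)
    u = 10.0 + 3.0 * np.sin(2 * np.pi * 0.5 * t)     # "true" velocity signal

    # Arrival probability per time bin proportional to |u|.
    arrive = rng.random(t.size) < 0.5 * np.abs(u) * dt

    held = np.empty_like(u)
    last = np.nan
    for k in range(t.size):                           # hold last reading
        if arrive[k]:
            last = u[k]
        held[k] = last
    held = held[~np.isnan(held)]

    print("true time average:      ", u.mean())
    print("naive particle average: ", u[arrive].mean())
    print("sample-and-hold average:", held.mean())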
Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.
Carmichael, Owen; Sakhanenko, Lyudmila
2015-05-15
We develop statistical methodology for a popular brain imaging technique HARDI based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves or fibers which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high order tensors with asymptotical statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3], to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of the deterministic tractography methods and it delivers the same information as probabilistic tractography methods. Our method is computationally cheap and it provides well-founded mathematical and statistical framework where diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way.
NASA Astrophysics Data System (ADS)
Walker, David M.; Allingham, David; Lee, Heung Wing Joseph; Small, Michael
2010-02-01
Small world network models have been effective in capturing the variable behaviour of reported case data of the SARS coronavirus outbreak in Hong Kong during 2003. Simulations of these models have previously been realized using informed “guesses” of the proposed model parameters and tested for consistency with the reported data by surrogate analysis. In this paper we attempt to provide statistically rigorous parameter distributions using Approximate Bayesian Computation sampling methods. We find that such sampling schemes are a useful framework for fitting parameters of stochastic small world network models where simulation of the system is straightforward but expressing a likelihood is cumbersome.
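The flavor of Approximate Bayesian Computation sampling is easy to convey with a rejection sampler around a toy stochastic epidemic (a plain SIR model here, standing in for the more elaborate small world network model; all numbers are invented):

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(beta, gamma, n=500, days=60):
        # Toy chain-binomial SIR simulator returning daily case counts.
        S, I = n - 1, 1
        cases = []
        for _ in range(days):
            new_inf = rng.binomial(S, 1.0 - np.exp(-beta * I / n))
            new_rec = rng.binomial(I, gamma)
            S, I = S - new_inf, I + new_inf - new_rec
            cases.append(new_inf)
        return np.array(cases)

    observed = simulate(0.4, 0.1)          # pretend these are the data

    # ABC rejection: draw from the priors, keep draws whose simulated
    # summary statistic lands close to the observed one.
    accepted = []
    for _ in range(5000):
        beta, gamma = rng.uniform(0.1, 1.0), rng.uniform(0.05, 0.5)
        if abs(simulate(beta, gamma).sum() - observed.sum()) < 20:
            accepted.append((beta, gamma))

    accepted = np.array(accepted)
    print(len(accepted), "accepted; posterior means:", accepted.mean(axis=0))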
Chiu, Grace S; Wu, Margaret A; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.
NASA Astrophysics Data System (ADS)
Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.
2015-12-01
Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
QTest: Quantitative Testing of Theories of Binary Choice.
Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
An Uncertainty Quantification Framework for Remote Sensing Retrievals
NASA Astrophysics Data System (ADS)
Braverman, A. J.; Hobbs, J.
2017-12-01
Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
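A toy illustration of the Monte Carlo idea, with an invented one-parameter forward model (the mission's actual retrieval is a far more complex inversion): perturb the observation noise many times, rerun the retrieval, and read bias and spread off the resulting ensemble.

    import numpy as np

    rng = np.random.default_rng(0)
    a, b, sigma = 2.0, 0.3, 0.05        # hypothetical forward-model constants
    true_state = 1.2

    def retrieve(y):
        # Inverse of the toy forward model y = a * state + b.
        return (y - b) / a

    estimates = np.array([
        retrieve(a * true_state + b + rng.normal(0.0, sigma))
        for _ in range(10_000)
    ])
    print("retrieval bias:", estimates.mean() - true_state)
    print("retrieval SE:  ", estimates.std())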
Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.
Montalvo-Acosta, Joel José; Cecchini, Marco
2016-12-01
The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
OCT Amplitude and Speckle Statistics of Discrete Random Media.
Almasian, Mitra; van Leeuwen, Ton G; Faber, Dirk J
2017-11-01
Speckle, amplitude fluctuations in optical coherence tomography (OCT) images, contains information on sub-resolution structural properties of the imaged sample. Speckle statistics could therefore be utilized in the characterization of biological tissues. However, a rigorous theoretical framework relating OCT speckle statistics to structural tissue properties has yet to be developed. As a first step, we present a theoretical description of OCT speckle, relating the OCT amplitude variance to size and organization for samples of discrete random media (DRM). Starting the calculations from the size and organization of the scattering particles, we analytically find expressions for the OCT amplitude mean, amplitude variance, the backscattering coefficient and the scattering coefficient. We assume fully developed speckle and verify the validity of this assumption by experiments on controlled samples of silica microspheres suspended in water. We show that the OCT amplitude variance is sensitive to sub-resolution changes in size and organization of the scattering particles. Experimentally determined and theoretically calculated optical properties are compared and in good agreement.
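The fully developed speckle assumption can be checked in a few lines: the amplitude of a sum of many unit phasors with uniform random phases is Rayleigh distributed, for which var(A)/mean(A)^2 = (4 - pi)/pi, about 0.273. A minimal simulation (illustrative only, not the paper's discrete-random-media model):

    import numpy as np

    rng = np.random.default_rng(2)
    n_phasors, n_realizations = 200, 100000

    # Sum of many unit-amplitude phasors with uniform random phases: the
    # textbook model of fully developed speckle.
    phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_realizations, n_phasors))
    amplitude = np.abs(np.exp(1j * phases).sum(axis=1))

    print("var(A)/mean(A)^2 =", amplitude.var() / amplitude.mean() ** 2)
    print("Rayleigh value    =", (4 - np.pi) / np.pi)   # ~0.2732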
Statistical tests and identifiability conditions for pooling and analyzing multisite datasets.
Zhou, Hao Henry; Singh, Vikas; Johnson, Sterling C; Wahba, Grace
2018-02-13
When sample sizes are small, the ability to identify weak (but scientifically interesting) associations between a set of predictors and a response may be enhanced by pooling existing datasets. However, variations in acquisition methods and the distribution of participants or observations between datasets, especially due to the distributional shifts in some predictors, may obfuscate real effects when datasets are combined. We present a rigorous statistical treatment of this problem and identify conditions where we can correct the distributional shift. We also provide an algorithm for the situation where the correction is identifiable. We analyze various properties of the framework for testing model fit, constructing confidence intervals, and evaluating consistency characteristics. Our technical development is motivated by Alzheimer's disease (AD) studies, and we present empirical results showing that our framework enables harmonizing of protein biomarkers, even when the assays across sites differ. Our contribution may, in part, mitigate a bottleneck that researchers face in clinical research when pooling smaller sized datasets and may offer benefits when the subjects of interest are difficult to recruit or when resources prohibit large single-site studies. Copyright © 2018 the Author(s). Published by PNAS.
Statistical testing and power analysis for brain-wide association study.
Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng
2018-04-05
The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and the false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depressive disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
Patounakis, George; Hill, Micah J
2018-06-01
The purpose of the current review is to describe the common pitfalls in design and statistical analysis of reproductive medicine studies. It serves to guide both authors and reviewers toward reducing the incidence of spurious statistical results and erroneous conclusions. The large amount of data gathered in IVF cycles leads to problems with multiplicity, multicollinearity, and overfitting of regression models. Furthermore, the use of the word 'trend' to describe nonsignificant results has increased in recent years. Finally, methods to accurately account for female age in infertility research models are becoming more common and necessary. The pitfalls of study design and analysis reviewed provide a framework for authors and reviewers to approach clinical research in the field of reproductive medicine. By providing a more rigorous approach to study design and analysis, the literature in reproductive medicine will have more reliable conclusions that can stand the test of time.
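The multiplicity problem is easy to demonstrate by simulation: testing many outcomes on pure-noise data at alpha = 0.05 yields at least one "significant" result far more often than 5% of the time (about 64% for 20 independent tests). A minimal illustration with invented sample sizes:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n_outcomes, n_trials, n_per_arm = 20, 2000, 50

    false_positive_any = 0
    for _ in range(n_trials):
        # Two arms drawn from the SAME distribution: every "finding" is spurious.
        a = rng.normal(size=(n_outcomes, n_per_arm))
        b = rng.normal(size=(n_outcomes, n_per_arm))
        p = stats.ttest_ind(a, b, axis=1).pvalue
        false_positive_any += (p < 0.05).any()

    print("P(at least one p < 0.05):", false_positive_any / n_trials)
    # ~0.64 for 20 uncorrected tests, versus the nominal 0.05.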
Time Scale Optimization and the Hunt for Astronomical Cycles in Deep Time Strata
NASA Astrophysics Data System (ADS)
Meyers, Stephen R.
2016-04-01
A valuable attribute of astrochronology is the direct link between chronometer and climate change, providing a remarkable opportunity to constrain the evolution of the surficial Earth System. Consequently, the hunt for astronomical cycles in strata has spurred the development of a rich conceptual framework for climatic/oceanographic change, and has allowed exploration of the geologic record with unprecedented temporal resolution. Accompanying these successes, however, has been a persistent skepticism about appropriate astrochronologic testing and circular reasoning: how does one reliably test for astronomical cycles in stratigraphic data, especially when time is poorly constrained? From this perspective, it would seem that the merits and promise of astrochronology (e.g., a geologic time scale measured in ≤400 kyr increments) also serve as its Achilles heel, if the confirmation of such short rhythms defies rigorous statistical testing. To address these statistical challenges in astrochronologic testing, a new approach has been developed that (1) explicitly evaluates time scale uncertainty, (2) is resilient to common problems associated with spectrum confidence level assessment and 'multiple testing', and (3) achieves high statistical power under a wide range of conditions (it can identify astronomical cycles when present in data). Designated TimeOpt (for "time scale optimization"; Meyers 2015), the method employs a probabilistic linear regression model framework to investigate amplitude modulation and frequency ratios (bundling) in stratigraphic data, while simultaneously determining the optimal time scale. This presentation will review the TimeOpt method, and demonstrate how the flexible statistical framework can be further extended to evaluate (and optimize upon) complex sedimentation rate models, enhancing the statistical power of the approach, and addressing the challenge of unsteady sedimentation. Meyers, S. R. (2015), The evaluation of eccentricity-related amplitude modulation and bundling in paleoclimate data: An inverse approach for astrochronologic testing and time scale optimization, Paleoceanography, 30, doi:10.1002/2015PA002850.
Wang, Guoli; Ebrahimi, Nader
2014-01-01
Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data. PMID:25821345
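As an illustration of the multiplicative updates whose monotonicity the paper proves, here is the classical KL/Poisson special case (the Lee-Seung form); the generalized Renyi-divergence updates of the paper are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(4)

    def nmf_kl(V, rank, n_iter=500, eps=1e-10):
        # Multiplicative updates for NMF under the KL/Poisson objective;
        # each update is guaranteed not to increase the divergence.
        n, m = V.shape
        W = rng.random((n, rank)) + eps
        H = rng.random((rank, m)) + eps
        for _ in range(n_iter):
            WH = W @ H + eps
            H *= (W.T @ (V / WH)) / W.sum(axis=0)[:, None]
            WH = W @ H + eps
            W *= ((V / WH) @ H.T) / H.sum(axis=1)[None, :]
        return W, H

    V = rng.poisson(5.0, size=(30, 40)).astype(float)   # count-like data
    W, H = nmf_kl(V, rank=4)
    WH = W @ H
    kl = np.sum(V * np.log((V + 1e-10) / (WH + 1e-10)) - V + WH)
    print("final KL divergence:", kl)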
NASA Astrophysics Data System (ADS)
Anishchenko, V. S.; Boev, Ya. I.; Semenova, N. I.; Strelkova, G. I.
2015-07-01
We review rigorous and numerical results on the statistics of Poincaré recurrences which are related to the modern development of the Poincaré recurrence problem. We analyze and describe the rigorous results which are achieved both in the classical (local) approach and in the recently developed global approach. These results are illustrated by numerical simulation data for simple chaotic and ergodic systems. It is shown that the basic theoretical laws can be applied to noisy systems if the probability measure is ergodic and stationary. Poincaré recurrences are studied numerically in nonautonomous systems. Statistical characteristics of recurrences are analyzed in the framework of the global approach for the cases of positive and zero topological entropy. We show that for positive entropy, there is a relationship between the Afraimovich-Pesin dimension, Lyapunov exponents and the Kolmogorov-Sinai entropy, both in the absence and in the presence of external noise. The case of zero topological entropy is exemplified by numerical results for the Poincaré recurrence statistics in the circle map. We show and prove that the dependence of minimal recurrence times on the return region size demonstrates universal properties for the golden and the silver ratio. The behavior of Poincaré recurrences is analyzed at the critical point of Feigenbaum attractor birth. We explore Poincaré recurrences for an ergodic set which is generated in the stroboscopic section of a nonautonomous oscillator and is similar to a circle shift. Based on the obtained results we show how the Poincaré recurrence statistics can be applied for solving a number of nonlinear dynamics issues. We propose and illustrate alternative methods for diagnosing effects of external and mutual synchronization of chaotic systems in the context of the local and global approaches. The properties of the recurrence time probability density can be used to detect the stochastic resonance phenomenon. We also discuss how the fractal dimension of chaotic attractors can be estimated using the Poincaré recurrence statistics.
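The universality of minimal return times for the golden rotation number is easy to verify numerically; a minimal sketch for the circle shift x -> x + alpha (mod 1):

    import numpy as np

    def min_return_time(alpha, eps, n_max=200000):
        # First return time of the rotation to an eps-neighborhood of the
        # starting point (distance measured on the circle).
        x0, x = 0.0, 0.0
        for n in range(1, n_max):
            x = (x + alpha) % 1.0
            if min(abs(x - x0), 1.0 - abs(x - x0)) < eps:
                return n
        return None

    golden = (np.sqrt(5.0) - 1.0) / 2.0
    for eps in [0.3, 0.1, 0.03, 0.01, 0.003, 0.001]:
        print(eps, min_return_time(golden, eps))
    # Minimal return times step through Fibonacci numbers (2, 5, 21, ...),
    # the universal behavior for the golden rotation number noted above.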
A framework for grouping nanoparticles based on their measurable characteristics.
Sayes, Christie M; Smith, P Alex; Ivanov, Ivan V
2013-01-01
There is a need to take a broader look at nanotoxicological studies. Eventually, the field will demand that some generalizations be made. To begin to address this issue, we posed a question: are metal colloids on the nanometer-size scale a homogeneous group? In general, most people can agree that the physicochemical properties of nanomaterials can be linked and related to their induced toxicological responses. The focus of this study was to determine how a set of selected physicochemical properties of five specific metal-based colloidal materials on the nanometer-size scale - silver, copper, nickel, iron, and zinc - could be used as nanodescriptors that facilitate the grouping of these metal-based colloids. The example of the framework pipeline processing provided in this paper shows the utility of specific statistical and pattern recognition techniques in grouping nanoparticles based on experimental data about their physicochemical properties. Interestingly, the results of the analyses suggest that a seemingly homogeneous group of nanoparticles could be separated into sub-groups depending on interdependencies observed in their nanodescriptors. These particles represent an important category of nanomaterials that are currently mass produced. Each has been reputed to induce toxicological and/or cytotoxicological effects. Here, we propose an experimental methodology coupled with mathematical and statistical modeling that can serve as a prototype for a rigorous framework that aids in the ability to group nanomaterials together and to facilitate the subsequent analysis of trends in data based on quantitative modeling of nanoparticle-specific structure-activity relationships. The computational part of the proposed framework is rather general and can be applied to other groups of nanomaterials as well.
Harris, Sarah Parker; Gould, Robert; Fujiura, Glenn
2015-01-01
There is increasing theoretical consideration about the use of systematic and scoping reviews of evidence in informing disability and rehabilitation research and practice. Indicative of this trend, this journal published a piece by Rumrill, Fitzgerald and Merchant in 2010 explaining the utility and process for conducting reviews of intervention-based research. There is still need to consider how to apply such rigor when conducting more exploratory reviews of heterogeneous research. This article explores the challenges, benefits, and procedures for conducting rigorous exploratory scoping reviews of diverse evidence. The article expands upon Rumrill, Fitzgerald and Merchant's framework and considers its application to more heterogeneous evidence on the impact of social policy. A worked example of a scoping review of the Americans with Disabilities Act is provided with a procedural framework for conducting scoping reviews on the effects of a social policy. The need for more nuanced techniques for enhancing rigor became apparent during the review process. There are multiple methodological steps that can enhance the utility of exploratory scoping reviews. The potential of systematic consideration during the exploratory review process is shown as a viable method to enhance the rigor in reviewing diverse bodies of evidence.
The Importance of the C3 Framework
ERIC Educational Resources Information Center
Social Education, 2013
2013-01-01
"The C3 Framework for Social Studies State Standards will soon be released under the title "The College, Career, and Civic Life (C3) Framework for Social Studies State Standards: State Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History." The C3 Project Director and Lead Writer was NCSS member Kathy…
Mass, Momentum and Kinetic Energy of a Relativistic Particle
ERIC Educational Resources Information Center
Zanchini, Enzo
2010-01-01
A rigorous definition of mass in special relativity, proposed in a recent paper, is recalled and employed to obtain simple and rigorous deductions of the expressions of momentum and kinetic energy for a relativistic particle. The whole logical framework appears as the natural extension of the classical one. Only the first, second and third laws of…
Methodological rigor and citation frequency in patient compliance literature.
Bruer, J T
1982-01-01
An exhaustive bibliography which assesses the methodological rigor of the patient compliance literature, and citation data from the Science Citation Index (SCI) are combined to determine if methodologically rigorous papers are used with greater frequency than substandard articles by compliance investigators. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant use of citation measures as indicators of rigor on a paper-by-paper basis. The data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies that employ other experimental designs. PMID:7114334
Peer Review of EPA's Draft BMDS Document: Exponential ...
BMDS is one of the Agency's premier tools for risk assessment; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews of the BMDS applications and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.
Chiu, Grace S.; Wu, Margaret A.; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt–clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general. PMID:23785443
Multiplicative Multitask Feature Learning
Wang, Xin; Bi, Jinbo; Yu, Shipeng; Sun, Jiangwen; Song, Minghu
2016-01-01
We investigate a general framework of multiplicative multitask feature learning which decomposes individual task’s model parameters into a multiplication of two components. One of the components is used across all tasks and the other component is task-specific. Several previous methods can be proved to be special cases of our framework. We study the theoretical properties of this framework when different regularization conditions are applied to the two decomposed components. We prove that this framework is mathematically equivalent to the widely used multitask feature learning methods that are based on a joint regularization of all model parameters, but with a more general form of regularizers. Further, an analytical formula is derived for the across-task component as related to the task-specific component for all these regularizers, leading to a better understanding of the shrinkage effects of different regularizers. Study of this framework motivates new multitask learning algorithms. We propose two new learning formulations by varying the parameters in the proposed framework. An efficient blockwise coordinate descent algorithm is developed suitable for solving the entire family of formulations with rigorous convergence analysis. Simulation studies have identified the statistical properties of data that would be in favor of the new formulations. Extensive empirical studies on various classification and regression benchmark data sets have revealed the relative advantages of the two new formulations by comparing with the state of the art, which provides instructive insights into the feature learning problem with multiple tasks. PMID:28428735
Hong, Bonnie; Du, Yingzhou; Mukerji, Pushkor; Roper, Jason M; Appenzeller, Laura M
2017-07-12
Regulatory-compliant rodent subchronic feeding studies are compulsory, regardless of whether there is a hypothesis to test, according to recent EU legislation for the safety assessment of whole food/feed produced from genetically modified (GM) crops containing a single genetic transformation event (European Union Commission Implementing Regulation No. 503/2013). The Implementing Regulation refers to guidelines set forth by the European Food Safety Authority (EFSA) for the design, conduct, and analysis of rodent subchronic feeding studies. The set of EFSA recommendations was rigorously applied to a 90-day feeding study in Sprague-Dawley rats. After study completion, the appropriateness and applicability of these recommendations were assessed using a battery of statistical analysis approaches including both retrospective and prospective statistical power analyses as well as variance-covariance decomposition. In the interest of animal welfare considerations, alternative experimental designs were investigated and evaluated in the context of informing the health risk assessment of food/feed from GM crops.
Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.
Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P
2018-03-03
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.
Q and A about the College, Career, and Civic Life (C3) Framework for Social Studies State Standards
ERIC Educational Resources Information Center
Herczog, Michelle
2013-01-01
The "College, Career, and Civic Life (C3) Framework for Social Studies State Standards: State Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History" will soon be released. The C3 Framework was developed to serve two audiences: for states to upgrade their state social studies standards, and for…
DESCQA: Synthetic Sky Catalog Validation Framework
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph
2018-04-01
The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.
The C3 Framework: One Year Later - an Interview with Kathy Swan
ERIC Educational Resources Information Center
Social Education, 2014
2014-01-01
On September 17, 2013 (Constitution Day), the C3 Framework was released under the title "The College, Career and Civic Life (C3) Framework for Social Studies State Standards: Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History." The C3 Project Director and lead writer was NCSS member Kathy Swan, who is…
A Rigorous Framework for Optimization of Expensive Functions by Surrogates
NASA Technical Reports Server (NTRS)
Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.
1998-01-01
The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
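A toy version of the surrogate-management loop (not the paper's pattern-search framework, and with an invented one-dimensional objective) conveys the idea: fit a cheap surrogate to the evaluations so far, minimize the surrogate, then spend one true evaluation at the candidate and refit.

    import numpy as np

    def expensive(x):                 # stand-in for a costly simulation
        return (x - 0.7) ** 2 + 0.1 * np.sin(20 * x)

    X = list(np.linspace(0.0, 1.0, 5))        # initial design
    F = [expensive(x) for x in X]
    for _ in range(10):
        coef = np.polyfit(X, F, 2)                       # quadratic surrogate
        grid = np.linspace(0.0, 1.0, 1001)
        cand = grid[np.argmin(np.polyval(coef, grid))]   # minimize surrogate
        X.append(cand)
        F.append(expensive(cand))                        # one true evaluation
    print("best x, f:", X[int(np.argmin(F))], min(F))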
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
ERIC Educational Resources Information Center
Cassata-Widera, Amy; Century, Jeanne; Kim, Dae Y.
2011-01-01
The practical need for multidimensional measures of fidelity of implementation (FOI) of reform-based science, technology, engineering, and mathematics (STEM) instructional materials, combined with a theoretical need in the field for a shared conceptual framework that could support accumulating knowledge on specific enacted program elements across…
Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).
Suzutani, T; Ishibashi, H; Takatori, T
1978-11-01
The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.
An advanced kinetic theory for morphing continuum with inner structures
NASA Astrophysics Data System (ADS)
Chen, James
2017-12-01
Advanced kinetic theory with the Boltzmann-Curtiss equation provides a promising tool for polyatomic gas flows, especially for fluid flows containing inner structures, such as turbulence, polyatomic gas flows and others. Although a Hamiltonian-based distribution function was proposed for diatomic gas flow, a general distribution function for the generalized Boltzmann-Curtiss equations and polyatomic gas flow is still out of reach. With assistance from Boltzmann's entropy principle, a generalized Boltzmann-Curtiss distribution for polyatomic gas flow is introduced. The corresponding governing equations at equilibrium state are derived and compared with Eringen's morphing (micropolar) continuum theory derived under the framework of rational continuum thermomechanics. Although rational continuum thermomechanics has the advantages of mathematical rigor and simplicity, the presented statistical kinetic theory approach provides a clear physical picture for what the governing equations represent.
Does McNemar's test compare the sensitivities and specificities of two diagnostic tests?
Kim, Soeun; Lee, Woojoo
2017-02-01
McNemar's test is often used in practice to compare the sensitivities and specificities for the evaluation of two diagnostic tests. For correct evaluation of accuracy, an intuitive recommendation is to test the diseased and the non-diseased groups separately, so that sensitivities are compared among the diseased and specificities are compared among the healthy group of people. This paper provides a rigorous theoretical framework for this argument and studies the validity of McNemar's test regardless of the conditional independence assumption. We derive McNemar's test statistic under the null hypothesis considering both assumptions of conditional independence and conditional dependence. We then perform power analyses to show how the result is affected by the amount of conditional dependence under the alternative hypothesis.
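For reference, the test statistic itself uses only the discordant pairs; a minimal sketch with hypothetical counts, applied within the diseased group so that it compares sensitivities as the paper recommends:

    import numpy as np
    from scipy import stats

    # Paired results of two diagnostic tests on the SAME diseased subjects
    # (hypothetical counts):
    # b = test1 positive / test2 negative, c = test1 negative / test2 positive.
    b, c = 25, 10

    chi2 = (b - c) ** 2 / (b + c)      # McNemar's chi-square statistic
    p = stats.chi2.sf(chi2, df=1)
    print("chi2 =", chi2, " p =", p)

    # Repeating the same computation within the non-diseased group compares
    # specificities, per the recommendation discussed above.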
Orbital State Uncertainty Realism
NASA Astrophysics Data System (ADS)
Horwood, J.; Poore, A. B.
2012-09-01
Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement to many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism"; the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs are formulated which more accurately characterize the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten times as long* as the latter. The filter correction step also furnishes a statistically rigorous *prediction error* which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.
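The qualitative point, that a Gaussian pushed through a nonlinear coordinate map acquires curved, non-Gaussian level sets, can be reproduced with a toy polar-to-Cartesian example (invented numbers, not the paper's orbital dynamics):

    import numpy as np

    rng = np.random.default_rng(5)

    # Gaussian uncertainty in "orbit-like" coordinates: small radial spread,
    # large along-track (angular) spread, typical after long propagation.
    r = rng.normal(7000.0, 5.0, 100000)       # km
    theta = rng.normal(0.0, 0.05, 100000)     # rad

    # The nonlinear map to Cartesian coordinates bends the Gaussian into the
    # "banana" shape described above.
    x, y = r * np.cos(theta), r * np.sin(theta)

    def skew(v):
        return np.mean(((v - v.mean()) / v.std()) ** 3)

    print("skewness of x:", skew(x))   # clearly non-zero: non-Gaussian
    print("skewness of y:", skew(y))   # ~0 by symmetry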
A user-centered model for designing consumer mobile health (mHealth) applications (apps).
Schnall, Rebecca; Rojas, Marlene; Bakken, Suzanne; Brown, William; Carballo-Dieguez, Alex; Carry, Monique; Gelaude, Deborah; Mosley, Jocelyn Patterson; Travers, Jasmine
2016-04-01
Mobile technologies are a useful platform for the delivery of health behavior interventions. Yet little work has been done to create a rigorous and standardized process for the design of mobile health (mHealth) apps. This project sought to explore the use of the Information Systems Research (ISR) framework as a guide for the design of mHealth apps. Our work was guided by the ISR framework, which comprises 3 cycles: Relevance, Rigor and Design. In the Relevance cycle, we conducted 5 focus groups with 33 targeted end-users. In the Rigor cycle, we performed a review to identify technology-based interventions for meeting the health prevention needs of our target population. In the Design Cycle, we employed usability evaluation methods to iteratively develop and refine mock-ups for a mHealth app. Through an iterative process, we identified barriers and facilitators to the use of mHealth technology for HIV prevention for high-risk MSM, developed 'use cases' and identified relevant functional content and features for inclusion in a design document to guide future app development. Findings from our work support the use of the ISR framework as a guide for designing future mHealth apps. Results from this work provide detailed descriptions of the user-centered design and system development and have heuristic value for those venturing into the area of technology-based intervention work. Findings from this study support the use of the ISR framework as a guide for future mHealth app development. Use of the ISR framework is a potentially useful approach for the design of a mobile app that incorporates end-users' design preferences. Copyright © 2016 Elsevier Inc. All rights reserved.
Connectopic mapping with resting-state fMRI.
Haak, Koen V; Marquand, Andre F; Beckmann, Christian F
2018-04-15
Brain regions are often topographically connected: nearby locations within one brain area connect with nearby locations in another area. Mapping these connection topographies, or 'connectopies' in short, is crucial for understanding how information is processed in the brain. Here, we propose principled, fully data-driven methods for mapping connectopies using functional magnetic resonance imaging (fMRI) data acquired at rest by combining spectral embedding of voxel-wise connectivity 'fingerprints' with a novel approach to spatial statistical inference. We apply the approach in human primary motor and visual cortex, and show that it can trace biologically plausible, overlapping connectopies in individual subjects that follow these regions' somatotopic and retinotopic maps. As a generic mechanism to perform inference over connectopies, the new spatial statistics approach enables rigorous statistical testing of hypotheses regarding the fine-grained spatial profile of functional connectivity and whether that profile is different between subjects or between experimental conditions. The combined framework offers a fundamental alternative to existing approaches to investigating functional connectivity in the brain, from voxel- or seed-pair wise characterizations of functional association, towards a full, multivariate characterization of spatial topography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
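A minimal sketch of the spectral-embedding step on synthetic fingerprints with a planted gradient (the paper's similarity measure and spatial inference procedure are not reproduced; plain correlation is used here as the similarity):

    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical connectivity "fingerprints": one row per voxel in the
    # region of interest, one column per target location elsewhere.
    n_voxels, n_targets = 300, 1000
    gradient = np.linspace(0.0, 1.0, n_voxels)[:, None]
    fingerprints = (gradient * rng.normal(size=(1, n_targets))
                    + (1 - gradient) * rng.normal(size=(1, n_targets))
                    + 0.5 * rng.normal(size=(n_voxels, n_targets)))

    # Voxel-by-voxel similarity, then a normalized graph Laplacian.
    S = np.corrcoef(fingerprints)
    W = S - S.min()                    # make weights non-negative
    d = W.sum(axis=1)
    L = np.eye(n_voxels) - W / np.sqrt(np.outer(d, d))

    # The eigenvector with the second-smallest eigenvalue is the dominant
    # connectopy (spatial mode of connectivity change); its sign is arbitrary.
    vals, vecs = np.linalg.eigh(L)
    connectopy = vecs[:, 1]
    print("corr with planted gradient:",
          abs(np.corrcoef(connectopy, gradient.ravel())[0, 1]))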
Using Framework Analysis in nursing research: a worked example.
Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica
2013-11-01
To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.
Chatterjee, Abhijit; Vlachos, Dionisios G
2007-07-21
While recently derived continuum mesoscopic equations successfully bridge the gap between microscopic and macroscopic physics, so far they have been derived only for simple lattice models. In this paper, general deterministic continuum mesoscopic equations are derived rigorously via nonequilibrium statistical mechanics to account for multiple interacting surface species and multiple processes on multiple site types and/or different crystallographic planes. Adsorption, desorption, reaction, and surface diffusion are modeled. It is demonstrated that contrary to conventional phenomenological continuum models, microscopic physics, such as the interaction potential, determines the final form of the mesoscopic equation. Models of single component diffusion and binary diffusion of interacting particles on single-type site lattice and of single component diffusion on complex microporous materials' lattices consisting of two types of sites are derived, as illustrations of the mesoscopic framework. Simplification of the diffusion mesoscopic model illustrates the relation to phenomenological models, such as the Fickian and Maxwell-Stefan transport models. It is demonstrated that the mesoscopic equations are in good agreement with lattice kinetic Monte Carlo simulations for several prototype examples studied.
Approximation Methods for Inverse Problems Governed by Nonlinear Parabolic Systems
1999-12-17
We present a rigorous theoretical framework for approximation of nonlinear parabolic systems with delays in the context of inverse least squares problems. Numerical results demonstrating the convergence are given for a model of dioxin uptake and elimination in a distributed liver model that is a special case of the general theoretical framework.
Higher Order Thinking Skills: Challenging All Students to Achieve
ERIC Educational Resources Information Center
Williams, R. Bruce
2007-01-01
Explicit instruction in thinking skills must be a priority goal of all teachers. In this book, the author presents a framework of the five Rs: Relevancy, Richness, Relatedness, Rigor, and Recursiveness. The framework serves to illuminate instruction in critical and creative thinking skills for K-12 teachers across content areas. Each chapter…
Latest Results From the QuakeFinder Statistical Analysis Framework
NASA Astrophysics Data System (ADS)
Kappler, K. N.; MacLean, L. S.; Schneider, D.; Bleier, T.
2017-12-01
Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. There have been many reports of anomalous variations in the earth's magnetic field preceding earthquakes; specifically, several authors have drawn attention to apparent anomalous pulsations seen preceding earthquakes. Studies involving long-term monitoring of seismic activity are often limited by the availability of event data, and it is particularly difficult to acquire a large dataset for rigorous statistical analyses of the magnetic field near earthquake epicenters because large events are relatively rare. Since QF has acquired hundreds of earthquakes in more than 70 TB of data, we developed an automated approach for assessing the statistical significance of precursory behavior within an algorithmic framework. Previously QF reported on the development of an Algorithmic Framework for data processing and hypothesis testing. The particular instance of the algorithm we discuss identifies and counts magnetic variations from time series data and ranks each station-day according to the aggregate number of pulses in a time window preceding the day in question. If the hypothesis is true that magnetic field activity increases over some time interval preceding earthquakes, this should reveal itself by the station-days on which earthquakes occur receiving higher ranks than they would if the ranking scheme were random. This can be analyzed using the Receiver Operating Characteristic test. In this presentation we give a status report of our latest results, largely focused on reproducibility of results, robust statistics in the presence of missing data, and exploring optimization landscapes in our parameter space.
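The ranking test described above reduces to a standard ROC/AUC computation; a minimal sketch on synthetic station-day scores, using the rank-sum identity for the area under the ROC curve (all numbers invented):

    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical pulse-count scores, one per station-day, plus a flag
    # marking station-days on which an earthquake occurred.
    n_days = 5000
    quake = rng.random(n_days) < 0.01
    scores = rng.normal(0.0, 1.0, n_days) + 0.8 * quake  # weak precursor effect

    def roc_auc(scores, labels):
        # Area under the ROC curve via the rank-sum (Mann-Whitney) identity.
        n = scores.size
        order = np.argsort(scores)
        ranks = np.empty(n)
        ranks[order] = np.arange(1, n + 1)
        n_pos = labels.sum()
        n_neg = n - n_pos
        return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    print("AUC =", roc_auc(scores, quake))   # 0.5 = chance; >0.5 = skill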
ERIC Educational Resources Information Center
Mantzicopoulos, Panayota; French, Brian F.; Patrick, Helen; Watson, J. Samuel; Ahn, Inok
2018-01-01
To meet recent accountability mandates, school districts are implementing assessment frameworks to document teachers' effectiveness. Observational assessments play a key role in this process, albeit without compelling evidence of their psychometric rigor. Using a sample of kindergarten teachers, we employed Generalizability theory to investigate…
ERIC Educational Resources Information Center
OECD Publishing, 2017
2017-01-01
What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…
Relevance and Rigor in International Business Teaching: Using the CSA-FSA Matrix
ERIC Educational Resources Information Center
Collinson, Simon C.; Rugman, Alan M.
2011-01-01
We advance three propositions in this paper. First, teaching international business (IB) at any level needs to be theoretically driven, using mainstream frameworks to organize thinking. Second, these frameworks need to be made relevant to the experiences of the students; for example, by using them in case studies. Third, these parameters of rigor…
Towards a rigorous framework for studying 2-player continuous games.
Shutters, Shade T
2013-03-21
The use of 2-player strategic games is one of the most common frameworks for studying the evolution of economic and social behavior. Games are typically played between two players, each given two choices that lie at the extremes of possible behavior (e.g. completely cooperate or completely defect). Recently there has been much interest in studying the outcome of games in which players may choose a strategy from the continuous interval between extremes, requiring the set of two possible choices be replaced by a single continuous equation. This has led to confusion and even errors in the classification of the game being played. The issue is described here specifically in relation to the continuous prisoner's dilemma and the continuous snowdrift game. A case study is then presented demonstrating the misclassification that can result from the extension of discrete games into continuous space. The paper ends with a call for a more rigorous and clear framework for working with continuous games. Copyright © 2013 Elsevier Ltd. All rights reserved.
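The distinction at issue can be made concrete with one toy parameterization (invented benefit and cost functions): when the benefit depends only on the partner's investment, zero investment strictly dominates (continuous prisoner's dilemma), whereas a benefit of the joint investment can support an interior best response (continuous snowdrift).

    import numpy as np

    def b(z):  # benefit (invented, concave)
        return 6.0 * z - 1.4 * z * z

    def c(x):  # cost of own investment x (invented, convex)
        return x + 2.0 * x * x

    # Continuous Prisoner's Dilemma: benefit depends only on the partner's
    # investment y, so any own investment x > 0 is strictly dominated.
    def payoff_pd(x, y):
        return b(y) - c(x)

    # Continuous Snowdrift: benefit depends on the JOINT investment x + y,
    # so a positive own investment can pay for itself.
    def payoff_sd(x, y):
        return b(x + y) - c(x)

    xs = np.linspace(0.0, 1.0, 101)
    y = 0.3                                   # partner's strategy
    print("best response, PD:", xs[np.argmax(payoff_pd(xs, y))])  # 0: defect
    print("best response, SD:", xs[np.argmax(payoff_sd(xs, y))])  # interior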
Ensemble forecast of human West Nile virus cases and mosquito infection rates
NASA Astrophysics Data System (ADS)
Defelice, Nicholas B.; Little, Eliza; Campbell, Scott R.; Shaman, Jeffrey
2017-02-01
West Nile virus (WNV) is now endemic in the continental United States; however, our ability to predict spillover transmission risk and human WNV cases remains limited. Here we develop a model depicting WNV transmission dynamics, which we optimize using a data assimilation method and two observed data streams, mosquito infection rates and reported human WNV cases. The coupled model-inference framework is then used to generate retrospective ensemble forecasts of historical WNV outbreaks in Long Island, New York for 2001-2014. Accurate forecasts of mosquito infection rates are generated before peak infection, and >65% of forecasts accurately predict seasonal total human WNV cases up to 9 weeks before the past reported case. This work provides the foundation for implementation of a statistically rigorous system for real-time forecast of seasonal outbreaks of WNV.
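A single ensemble update of the kind used in such model-inference systems can be sketched in a few lines (a perturbed-observation ensemble Kalman step on one invented parameter and one observation; the WNV transmission model itself is not reproduced):

    import numpy as np

    rng = np.random.default_rng(10)

    # Toy setting: adjust an uncertain transmission parameter beta using one
    # observed mosquito infection rate.
    n_ens = 300
    beta = rng.normal(0.4, 0.1, n_ens)                  # prior ensemble
    mir = 50.0 * beta + rng.normal(0.0, 2.0, n_ens)     # modeled observation
    obs, obs_sd = 24.0, 2.0

    cov = np.cov(beta, mir)
    K = cov[0, 1] / (cov[1, 1] + obs_sd ** 2)           # Kalman gain
    beta_post = beta + K * (obs + rng.normal(0, obs_sd, n_ens) - mir)
    print("prior     beta:", beta.mean(), "+/-", beta.std())
    print("posterior beta:", beta_post.mean(), "+/-", beta_post.std())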
Fish-Eye Observing with Phased Array Radio Telescopes
NASA Astrophysics Data System (ADS)
Wijnholds, S. J.
The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field-of-view, that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to the complex source structures and direction dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototype show that this model based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.
A Rigorous Statistical Framework for the Mathematics of Sensing, Exploitation and Execution
2015-05-01
(Only fragments of this report's text survive extraction: they describe a corpus of annotated sports activities, including bike riding, disc golf, baseball, and parkour, and note ambiguities in predicate annotations such as touching, catching, swinging, occluding, donning, doffing, and facing(automobile).)
A psychometric evaluation of the digital logic concept inventory
NASA Astrophysics Data System (ADS)
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2014-10-01
Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
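Two of the Classical Test Theory quantities used in such psychometric evaluations, Cronbach's alpha and item discrimination, are straightforward to compute; a sketch on simulated dichotomous responses (not DLCI data):

    import numpy as np

    rng = np.random.default_rng(8)

    # Simulated responses: 200 students x 12 items, with student ability and
    # item difficulty driving correctness (a toy stand-in for inventory data).
    n_students, n_items = 200, 12
    ability = rng.normal(0.0, 1.0, (n_students, 1))
    difficulty = np.linspace(-1.5, 1.5, n_items)[None, :]
    X = (rng.logistic(size=(n_students, n_items)) < ability - difficulty)
    X = X.astype(float)

    # Cronbach's alpha: internal-consistency reliability.
    total = X.sum(axis=1)
    item_var = X.var(axis=0, ddof=1)
    alpha = n_items / (n_items - 1) * (1 - item_var.sum() / total.var(ddof=1))
    print("Cronbach's alpha:", alpha)

    # Item discrimination: correlation of each item with the rest-score.
    for j in range(n_items):
        rest = total - X[:, j]
        r = np.corrcoef(X[:, j], rest)[0, 1]
        print(f"item {j}: difficulty={X[:, j].mean():.2f} discrimination={r:.2f}")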
Jiang, Xiaoye; Yao, Yuan; Liu, Han; Guibas, Leonidas
2014-01-01
Modern data acquisition routinely produces massive amounts of network data. Though many methods and models have been proposed to analyze such data, the research of network data is largely disconnected from the classical theory of statistical learning and signal processing. In this paper, we present a new framework for modeling network data, which connects two seemingly different areas: network data analysis and compressed sensing. From a nonparametric perspective, we model an observed network using a large dictionary. In particular, we consider the network clique detection problem and show connections between our formulation and a new algebraic tool, namely Radon basis pursuit in homogeneous spaces. Such a connection allows us to identify rigorous recovery conditions for clique detection problems. Though this paper is mainly conceptual, we also develop practical approximation algorithms for solving empirical problems and demonstrate their usefulness on real-world datasets. PMID:25620806
Practice-based evidence study design for comparative effectiveness research.
Horn, Susan D; Gassaway, Julie
2007-10-01
To describe a new, rigorous, comprehensive practice-based evidence for clinical practice improvement (PBE-CPI) study methodology, and compare its features, advantages, and disadvantages to those of randomized controlled trials and sophisticated statistical methods for comparative effectiveness research. PBE-CPI incorporates natural variation within data from routine clinical practice to determine what works, for whom, when, and at what cost. It uses the knowledge of front-line caregivers, who develop study questions and define variables as part of a transdisciplinary team. Its comprehensive measurement framework provides a basis for analyses of significant bivariate and multivariate associations between treatments and outcomes, controlling for patient differences, such as severity of illness. PBE-CPI studies can uncover better practices more quickly than randomized controlled trials or sophisticated statistical methods, while achieving many of the same advantages. We present examples of actionable findings from PBE-CPI studies in postacute care settings related to comparative effectiveness of medications, nutritional support approaches, incontinence products, physical therapy activities, and other services. Outcomes improved when practices associated with better outcomes in PBE-CPI analyses were adopted in practice.
A theoretical Gaussian framework for anomalous change detection in hyperspectral images
NASA Astrophysics Data System (ADS)
Acito, Nicola; Diani, Marco; Corsini, Giovanni
2017-10-01
Exploitation of temporal series of hyperspectral images is a relatively new discipline that has a wide variety of possible applications in fields like remote sensing, area surveillance, defense and security, search and rescue and so on. In this work, we discuss how images taken at two different times can be processed to detect changes caused by insertion, deletion or displacement of small objects in the monitored scene. This problem is known in the literature as anomalous change detection (ACD) and it can be viewed as the extension, to the multitemporal case, of the well-known anomaly detection problem in a single image. In fact, in both cases, the hyperspectral images are processed blindly in an unsupervised manner and without a-priori knowledge about the target spectrum. We introduce the ACD problem using an approach based on statistical decision theory and we derive a common framework including different ACD approaches. Particularly, we clearly define the observation space, the data statistical distribution conditioned on the two competing hypotheses, and the procedure followed to arrive at the solution. The proposed overview places emphasis on techniques based on the multivariate Gaussian model that allows a formal presentation of the ACD problem and the rigorous derivation of the possible solutions in a way that is both mathematically more tractable and easier to interpret. We also discuss practical problems related to the application of the detectors in the real world and present affordable solutions. Namely, we describe the ACD processing chain including the strategies that are commonly adopted to compensate for pervasive radiometric changes, caused by the different illumination/atmospheric conditions, and to mitigate the residual geometric image co-registration errors. Results obtained on real freely available data are discussed in order to test and compare the methods within the proposed general framework.
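One well-known member of this family of Gaussian detectors is the "hyperbolic" anomalous change detector, which scores each pixel pair by how much more anomalous the joint observation is than its single-time marginals. The sketch below is an illustrative numpy implementation under that Gaussian model, not the authors' code; image loading, radiometric compensation and co-registration are assumed already done:

```python
import numpy as np

def acd_scores(X, Y):
    """Hyperbolic ACD under a joint Gaussian model.
    X, Y: (n_pixels x n_bands) co-registered images at two times;
    assumes n_pixels >> n_bands so the covariances are invertible."""
    Z = np.hstack([X, Y])
    Xc, Yc, Zc = X - X.mean(0), Y - Y.mean(0), Z - Z.mean(0)
    iCx = np.linalg.inv(np.cov(Xc, rowvar=False))
    iCy = np.linalg.inv(np.cov(Yc, rowvar=False))
    iCz = np.linalg.inv(np.cov(Zc, rowvar=False))
    # row-wise Mahalanobis quadratic forms
    mah = lambda M, iC: np.einsum('ij,jk,ik->i', M, iC, M)
    # large scores flag pixel pairs that are jointly anomalous
    # relative to their marginal (single-time) behaviour
    return mah(Zc, iCz) - mah(Xc, iCx) - mah(Yc, iCy)
```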
Single toxin dose-response models revisited
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demidenko, Eugene, E-mail: eugened@dartmouth.edu
The goal of this paper is to offer a rigorous analysis of the sigmoid-shape single toxin dose-response relationship. The toxin efficacy function is introduced and four special points, including maximum toxin efficacy and inflection points, on the dose-response curve are defined. The special points define three phases of the toxin effect on mortality: (1) toxin concentrations smaller than the first inflection point or (2) larger than the second inflection point imply a low mortality rate, and (3) concentrations between the first and the second inflection points imply a high mortality rate. Probabilistic interpretation and mathematical analysis are provided for each of the four models: Hill, logit, probit, and Weibull. Two general model extensions are introduced: (1) the multi-target hit model that accounts for the existence of several vital receptors affected by the toxin, and (2) a model with nonzero mortality at zero concentration to account for natural mortality. Special attention is given to statistical estimation in the framework of the generalized linear model with the binomial dependent variable as the mortality count in each experiment, contrary to the widespread nonlinear regression treating the mortality rate as a continuous variable. The models are illustrated using standard EPA Daphnia acute (48 h) toxicity tests with mortality as a function of NiCl or CuSO4 toxin. - Highlights: • The paper offers a rigorous study of a sigmoid dose-response relationship. • The concentration with the highest mortality rate is rigorously defined. • A table with four special points for five mortality curves is presented. • Two new sigmoid dose-response models have been introduced. • The generalized linear model is advocated for estimation of the sigmoid dose-response relationship.
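For concreteness, the estimation approach advocated here, a generalized linear model on mortality counts rather than nonlinear regression on rates, can be sketched as follows (Python/statsmodels with a logit link and invented dose-mortality data; the paper also treats probit and other links):

```python
import numpy as np
import statsmodels.api as sm

# toy acute-toxicity data: dose (mg/L), deaths out of n exposed
dose   = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
deaths = np.array([1,   3,   9,  16,  19])
n      = np.full(5, 20)

# binomial GLM on mortality counts (endog = [successes, failures]),
# in contrast to a nonlinear regression on the observed rates
X = sm.add_constant(np.log10(dose))
fit = sm.GLM(np.column_stack([deaths, n - deaths]), X,
             family=sm.families.Binomial()).fit()
print(fit.params)  # intercept and slope on log10(dose)

# LC50 is the dose where the linear predictor crosses zero
lc50 = 10 ** (-fit.params[0] / fit.params[1])
print(f"LC50 = {lc50:.2f} mg/L")
```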
A conceptual framework for invasion in microbial communities.
Kinnunen, Marta; Dechesne, Arnaud; Proctor, Caitlin; Hammes, Frederik; Johnson, David; Quintela-Baluja, Marcos; Graham, David; Daffonchio, Daniele; Fodelianakis, Stilianos; Hahn, Nicole; Boon, Nico; Smets, Barth F
2016-12-01
There is a growing interest in controlling-promoting or avoiding-the invasion of microbial communities by new community members. Resource availability and community structure have been reported as determinants of invasion success. However, most invasion studies do not adhere to a coherent and consistent terminology nor always include rigorous interpretations of the processes behind invasion. Therefore, we suggest that a consistent set of definitions and a rigorous conceptual framework are needed. We define invasion in a microbial community as the establishment of an alien microbial type in a resident community and argue how simple criteria to define aliens, residents, and alien establishment can be applied for a wide variety of communities. In addition, we suggest an adoption of the community ecology framework advanced by Vellend (2010) to clarify potential determinants of invasion. This framework identifies four fundamental processes that control community dynamics: dispersal, selection, drift and diversification. While selection has received ample attention in microbial community invasion research, the three other processes are often overlooked. Here, we elaborate on the relevance of all four processes and conclude that invasion experiments should be designed to elucidate the role of dispersal, drift and diversification, in order to obtain a complete picture of invasion as a community process.
A new statistical framework to assess structural alignment quality using information compression
Collier, James H.; Allison, Lloyd; Lesk, Arthur M.; Garcia de la Banda, Maria; Konagurthu, Arun S.
2014-01-01
Motivation: Progress in protein biology depends on the reliability of results from a handful of computational techniques, structural alignments being one. Recent reviews have highlighted substantial inconsistencies and differences between alignment results generated by the ever-growing stock of structural alignment programs. The lack of consensus on how the quality of structural alignments must be assessed has been identified as the main cause for the observed differences. Current methods assess structural alignment quality by constructing a scoring function that attempts to balance conflicting criteria, mainly alignment coverage and fidelity of structures under superposition. This traditional approach to measuring alignment quality, the subject of considerable literature, has failed to solve the problem. Further development along the same lines is unlikely to rectify the current deficiencies in the field. Results: This paper proposes a new statistical framework to assess structural alignment quality and significance based on lossless information compression. This is a radical departure from the traditional approach of formulating scoring functions. It links the structural alignment problem to the general class of statistical inductive inference problems, solved using the information-theoretic criterion of minimum message length. Based on this, we developed an efficient and reliable measure of structural alignment quality, I-value. The performance of I-value is demonstrated in comparison with a number of popular scoring functions, on a large collection of competing alignments. Our analysis shows that I-value provides a rigorous and reliable quantification of structural alignment quality, addressing a major gap in the field. Availability: http://lcb.infotech.monash.edu.au/I-value Contact: arun.konagurthu@monash.edu Supplementary information: Online supplementary data are available at http://lcb.infotech.monash.edu.au/I-value/suppl.html PMID:25161241
Putz, Mihai V.
2009-01-01
The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density be extended for the many-electronic systems through the density functional closure relationship. Yet, the use of path integral formalism for electronic density prescription presents several advantages: assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles Schrödinger equation; allows quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix or/and the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for the Bohr’s quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as by the Markovian generalizations of Becke-Edgecombe electronic focalization functions – all advocate for the reliability of assuming PI formalism of quantum mechanics as a versatile one, suited for analytically and/or computationally modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driving) many-electronic systems. PMID:20087467
Wu, Baolin
2006-02-15
Differential gene expression detection and sample classification using microarray data have received much research interest recently. Owing to the large number of genes p and small number of samples n (p >> n), microarray data analysis poses big challenges for statistical analysis. An obvious problem owing to the 'large p, small n' setting is over-fitting. Just by chance, we are likely to find some non-differentially expressed genes that can classify the samples very well. The idea of shrinkage is to regularize the model parameters to reduce the effects of noise and produce reliable inferences. Shrinkage has been successfully applied in microarray data analysis. The SAM statistics proposed by Tusher et al. and the 'nearest shrunken centroid' proposed by Tibshirani et al. are ad hoc shrinkage methods. Both methods are simple, intuitive and prove to be useful in empirical studies. Recently Wu proposed the penalized t/F-statistics with shrinkage by formally using L1-penalized linear regression models for two-class microarray data, showing good performance. In this paper we systematically discuss the use of penalized regression models for analyzing microarray data. We generalize the two-class penalized t/F-statistics proposed by Wu to multi-class microarray data. We formally derive the ad hoc shrunken centroid used by Tibshirani et al. using the L1-penalized regression models. And we show that the penalized linear regression models provide a rigorous and unified statistical framework for sample classification and differential gene expression detection.
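A minimal sketch of the underlying idea, L1-penalized logistic regression that classifies samples while shrinking most gene coefficients exactly to zero, might look like this (scikit-learn, simulated data; this illustrates the penalty, not the authors' penalized t/F-statistics themselves):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 40, 500                       # 'large p, small n'
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0                       # only 5 truly differential genes
y = (X @ beta + rng.standard_normal(n) > 0).astype(int)

# the l1 penalty shrinks most coefficients exactly to zero,
# giving simultaneous classification and gene selection
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
selected = np.flatnonzero(clf.coef_[0])
print("genes kept:", selected)
```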
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
Children facing a family member's acute illness: a review of intervention studies.
Spath, Mary L
2007-07-01
A review of psycho-educational intervention studies to benefit children adapting to a close (parent, sibling, or grandparent) family member's serious illness was conducted. The aims were to review the literature on studies addressing this topic, critique research methods, describe clinical outcomes, and make recommendations for future research efforts. Research citations from 1990 to 2005 from Medline, CINAHL, Health Source: Nursing/Academic Edition, PsycARTICLES, and PsycINFO databases were identified. Citations were reviewed and evaluated for sample, design, theoretical framework, intervention, threats to validity, and outcomes. Reviewed studies were limited to those that included statistical analysis to evaluate interventions and outcomes. Six studies were reviewed. Positive outcomes were reported for all of the interventional strategies used in the studies. The reviewed studies generally lacked a theoretical framework and a control group, were generally composed of small convenience samples, and primarily used untested investigator-developed instruments. They were diverse in terms of intervention length and intensity, and measured short-term outcomes related to participant program satisfaction rather than participant cognitive and behavioral change. The paucity of interventional studies and the lack of systematic empirical precision to evaluate intervention effectiveness necessitate future studies that are methodologically rigorous.
Statistical physics approach to earthquake occurrence and forecasting
NASA Astrophysics Data System (ADS)
de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio
2016-04-01
There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.
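A canonical example of such a branching model is the ETAS (epidemic-type aftershock sequence) conditional intensity, in which every past event of magnitude m_i contributes an Omori-law aftershock rate (shown here in one common parameterization; the review covers this family of models rather than this single form):

```latex
% Background rate mu plus triggered events; K, alpha, c, p are model
% parameters and m_0 a reference magnitude.
\lambda(t \mid \mathcal{H}_t)
  = \mu + \sum_{t_i < t} K \, e^{\alpha (m_i - m_0)} \, (t - t_i + c)^{-p}
```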
Comments in reply: new directions in migration research.
Shaw, R P
1986-01-01
The author comments on a review of his recent book NEW DIRECTIONS IN MIGRATION RESEARCH and reflects on theory and model specification, problems of estimation and statistical inference, realities of temporal and spatial heterogeneity, choices of explanatory variables, and the importance of broader political issues in migration studies. A core hypothesis is that market forces have declined as influences on internal migration in Canada over the last 30 years. Theoretical underpinnings include declining relevance of wage considerations in the decision to migrate on the assumption that marginal utility of money diminishes and marginal utility of leisure increases as society becomes wealthier. The author perceives the human capital model to have limitations and is especially troubled by the "as if" clause--that all migrants behave "as if" they calculate benefits and risks with equal rigor. The author has "shadowed" and not quantified the costs involved. He implies that normative frameworks for future migration research and planning should be established.
Moura, Lidia Mvr; Westover, M Brandon; Kwasnik, David; Cole, Andrew J; Hsu, John
2017-01-01
The elderly population faces an increasing number of cases of chronic neurological conditions, such as epilepsy and Alzheimer's disease. Because the elderly with epilepsy are commonly excluded from randomized controlled clinical trials, there are few rigorous studies to guide clinical practice. When the elderly are eligible for trials, they either rarely participate or frequently have poor adherence to therapy, thus limiting both generalizability and validity. In contrast, large observational data sets are increasingly available, but are susceptible to bias when analyzed with common analytic approaches. Recent developments in causal-inference analytic approaches also introduce the possibility of emulating randomized controlled trials to yield valid estimates. We provide a practical example of the application of the principles of causal inference to a large observational data set of patients with epilepsy. This review also provides a framework for comparative-effectiveness research in chronic neurological conditions.
Forecasting seasonal outbreaks of influenza.
Shaman, Jeffrey; Karspeck, Alicia
2012-12-11
Influenza recurs seasonally in temperate regions of the world; however, our ability to predict the timing, duration, and magnitude of local seasonal outbreaks of influenza remains limited. Here we develop a framework for initializing real-time forecasts of seasonal influenza outbreaks, using a data assimilation technique commonly applied in numerical weather prediction. The availability of real-time, web-based estimates of local influenza infection rates makes this type of quantitative forecasting possible. Retrospective ensemble forecasts are generated on a weekly basis following assimilation of these web-based estimates for the 2003-2008 influenza seasons in New York City. The findings indicate that real-time skillful predictions of peak timing can be made more than 7 wk in advance of the actual peak. In addition, confidence in those predictions can be inferred from the spread of the forecast ensemble. This work represents an initial step in the development of a statistically rigorous system for real-time forecast of seasonal influenza.
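This entry and the WNV entry above both use an ensemble adjustment Kalman filter; as a simpler stand-in that illustrates the same assimilate-then-forecast loop, here is a toy particle-filter update of an SIRS-style ensemble (all parameter values and observations invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1e5  # population size

def sirs_day(S, I, beta, gamma=1/3, L=40*7):
    """One day of a discrete SIRS model; returns updated S, I and new infections."""
    new_inf = beta * S * I / N
    S_next = S - new_inf + (N - S - I) / L   # immunity wanes over L days
    I_next = I + new_inf - gamma * I         # ~3-day infectious period
    return S_next, I_next, new_inf

# ensemble over the uncertain transmission rate beta
M = 300
beta = rng.uniform(0.4, 1.2, M)
S, I = np.full(M, 9e4), np.full(M, 100.0)

for obs in [120.0, 340.0, 900.0, 2100.0]:    # toy weekly incidence observations
    weekly = np.zeros(M)
    for _ in range(7):
        S, I, new = sirs_day(S, I, beta)
        weekly += new
    # importance weights: Gaussian likelihood of the observation
    w = np.exp(-0.5 * ((weekly - obs) / (0.2 * obs + 10.0)) ** 2) + 1e-12
    idx = rng.choice(M, size=M, p=w / w.sum())  # resample: crude Bayesian update
    beta, S, I = beta[idx], S[idx], I[idx]

# the surviving ensemble is the calibrated model; integrating it forward with
# no further updates yields the (retrospective) forecast and its spread
print("posterior mean beta:", round(beta.mean(), 3))
```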
Thermal machines beyond the weak coupling regime
NASA Astrophysics Data System (ADS)
Gallego, R.; Riera, A.; Eisert, J.
2014-12-01
How much work can be extracted from a heat bath using a thermal machine? The study of this question has a very long history in statistical physics in the weak-coupling limit, when applied to macroscopic systems. However, the assumption that thermal heat baths remain uncorrelated with associated physical systems is less reasonable on the nano-scale and in the quantum setting. In this work, we establish a framework of work extraction in the presence of quantum correlations. We show in a mathematically rigorous and quantitative fashion that quantum correlations and entanglement emerge as limitations to work extraction compared to what would be allowed by the second law of thermodynamics. At the heart of the approach are operations that capture the naturally non-equilibrium dynamics encountered when putting physical systems into contact with each other. We discuss various limits that relate to known results and put our work into the context of approaches to finite-time quantum thermodynamics.
Miyata, Hiroaki; Kai, Ichiro
2006-05-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confused, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. It is therefore very important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared quantitative paradigms (validity, reliability, neutrality, generalizability) with qualitative paradigms (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/Credibility concerns the observational framework, while Reliability/Dependability refers to the range of stability in observations, Neutrality/Confirmability reflects influences between observers and subjects, and Generalizability/Transferability captures an epistemological difference in the way findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If we can assume stability to some extent, it is better to use the quantitative paradigm (reliability). Moreover, as a quantitative study cannot always guarantee a perfect observational framework with stability in all phases of observations, it is useful to use qualitative paradigms to enhance the rigor of the study.
Next Generation of Leaching Tests
A corresponding abstract has been cleared for this presentation. The four methods comprising the Leaching Environmental Assessment Framework are described along with the tools to support implementation of the more rigorous and accurate source terms that are developed using LEAF ...
Measuring coherence with entanglement concurrence
NASA Astrophysics Data System (ADS)
Qi, Xianfei; Gao, Ting; Yan, Fengli
2017-07-01
Quantum coherence is a fundamental manifestation of the quantum superposition principle. Recently, Baumgratz et al (2014 Phys. Rev. Lett. 113 140401) presented a rigorous framework to quantify coherence from the viewpoint of the theory of physical resources. Here we propose a new valid quantum coherence measure, which is a convex roof measure, for a quantum system of arbitrary dimension, essentially using the generalized Gell-Mann matrices. Rigorous proof shows that the proposed coherence measure, coherence concurrence, fulfills all the requirements dictated by the resource theory of quantum coherence measures. Moreover, strong links between the resource frameworks of coherence concurrence and entanglement concurrence are derived, which show that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. Our work provides a clear quantitative and operational connection between coherence and entanglement based on two kinds of concurrence. This new coherence measure, coherence concurrence, may also be beneficial to the study of quantum coherence.
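For reference, the resource-theoretic definitions this abstract builds on (standard material from the cited Baumgratz et al framework and the usual convex-roof construction, not results specific to this paper) are:

```latex
% Incoherent states and the l1-norm coherence measure (Baumgratz et al. 2014):
\delta = \sum_i \delta_i \, |i\rangle\langle i| , \qquad
C_{\ell_1}(\rho) = \sum_{i \neq j} |\rho_{ij}| .
% A convex-roof measure extends a pure-state quantifier C(|\psi\rangle)
% to mixed states by minimizing over pure-state decompositions of rho:
C(\rho) = \min_{\{p_k,\, |\psi_k\rangle\}} \sum_k p_k \, C(|\psi_k\rangle),
\qquad \rho = \sum_k p_k |\psi_k\rangle\langle\psi_k| .
```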
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.
Texas M-E flexible pavement design system: literature review and proposed framework.
DOT National Transportation Integrated Search
2012-04-01
Recent developments over last several decades have offered an opportunity for more rational and rigorous pavement design procedures. Substantial work has already been completed in Texas, nationally, and internationally, in all aspects of modeling, ma...
2017-01-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in the designs, analysis methods and metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), with technical, radiological and statistical experts, developed a set of technical performance analysis methods, metrics and study designs that provide terminology, metrics and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted or combined. PMID:24919831
Rigorous force field optimization principles based on statistical distance minimization
Vlcek, Lukas; Chialvo, Ariel A.
2015-10-12
We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
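For orientation, one standard definition of the statistical distance between two discrete probability distributions p and q is given below; that the paper uses exactly this form is an assumption here, but it conveys the quantity whose minimization drives the optimization:

```latex
% Statistical (Bhattacharyya-angle) distance between distributions p and q;
% s = 0 iff p = q, so minimizing s over force-field parameters drives the
% model's statistics toward those of the target.
s(p, q) = \arccos\!\Big( \sum_i \sqrt{p_i \, q_i} \Big)
```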
From screening to synthesis: using nvivo to enhance transparency in qualitative evidence synthesis.
Houghton, Catherine; Murphy, Kathy; Meehan, Ben; Thomas, James; Brooker, Dawn; Casey, Dympna
2017-03-01
To explore the experiences and perceptions of healthcare staff caring for people with dementia in the acute setting. This article focuses on the methodological process of conducting framework synthesis using nvivo for each stage of the review: screening, data extraction, synthesis and critical appraisal. Qualitative evidence synthesis brings together many research findings in a meaningful way that can be used to guide practice and policy development. For this purpose, synthesis must be conducted in a comprehensive and rigorous way. There has been previous discussion of how using nvivo can assist in enhancing and illustrating the rigorous processes involved. Qualitative framework synthesis. Twelve documents, or research reports, based on nine studies were included for synthesis. The benefits of using nvivo are outlined in terms of facilitating teams of researchers to systematically and rigorously synthesise findings. nvivo functions were used to conduct a sensitivity analysis. Some valuable lessons were learned, and these are presented to assist and guide researchers who wish to use similar methods in future. Ultimately, good qualitative evidence synthesis will provide practitioners and policymakers with significant information that will guide decision-making on many aspects of clinical practice. The example provided explored how people with dementia are cared for in acute settings. © 2016 The Authors. Journal of Clinical Nursing Published by John Wiley & Sons Ltd.
A Framework for Assessing High School Students' Statistical Reasoning.
Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang
2016-01-01
Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter included describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework constituted a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.
Blanquart, François; Bataillon, Thomas
2016-01-01
The fitness landscape defines the relationship between genotypes and fitness in a given environment and underlies fundamental quantities such as the distribution of selection coefficients and the magnitude and type of epistasis. A better understanding of variation in landscape structure across species and environments is thus necessary to understand and predict how populations will adapt. An increasing number of experiments investigate the properties of fitness landscapes by identifying mutations, constructing genotypes with combinations of these mutations, and measuring the fitness of these genotypes. Yet these empirical landscapes represent a very small sample of the vast space of all possible genotypes, and this sample is often biased by the protocol used to identify mutations. Here we develop a rigorous statistical framework based on Approximate Bayesian Computation to address these concerns and use this flexible framework to fit a broad class of phenotypic fitness models (including Fisher’s model) to 26 empirical landscapes representing nine diverse biological systems. Despite uncertainty owing to the small size of most published empirical landscapes, the inferred landscapes have similar structure in similar biological systems. Surprisingly, goodness-of-fit tests reveal that this class of phenotypic models, which has been successful so far in interpreting experimental data, is plausible in only three of nine biological systems. More precisely, although Fisher’s model was able to explain several statistical properties of the landscapes—including the mean and SD of selection and epistasis coefficients—it was often unable to explain the full structure of fitness landscapes. PMID:27052568
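A bare-bones ABC rejection sampler conveys the inferential engine used here: draw parameters from a prior, simulate a landscape, and keep draws whose summary statistics fall within a tolerance of the observed ones. Everything below (the one-parameter toy model, the summaries, the tolerance) is invented for illustration and is not Fisher's model:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_landscape(theta, n_mut=8):
    """Toy stand-in for simulating selection coefficients under a
    phenotypic fitness model with parameter theta."""
    return rng.normal(-theta, theta, size=n_mut)

def summarize(s):
    return np.array([s.mean(), s.std()])  # mean and SD of selection coefficients

observed = summarize(np.array([-0.11, -0.30, -0.02, -0.25,
                               -0.18, -0.07, -0.40, -0.15]))

# ABC rejection: keep prior draws whose simulated summaries land near the data
accepted = []
for _ in range(100_000):
    theta = rng.uniform(0.01, 1.0)  # prior
    d = np.linalg.norm(summarize(simulate_landscape(theta)) - observed)
    if d < 0.05:                    # tolerance epsilon
        accepted.append(theta)

print(f"posterior mean: {np.mean(accepted):.3f} (n accepted = {len(accepted)})")
```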
New Insights into the Estimation of Extreme Geomagnetic Storm Occurrences
NASA Astrophysics Data System (ADS)
Ruffenach, Alexis; Winter, Hugo; Lavraud, Benoit; Bernardara, Pietro
2017-04-01
Space weather events such as intense geomagnetic storms are major disturbances of the near-Earth environment that may lead to serious impacts on our modern society. As such, it is of great importance to estimate their probability, and in particular that of extreme events. One approach widely used in the statistical sciences for estimating the probability of extreme events is Extreme Value Analysis (EVA). Using this rigorous statistical framework, estimations of the occurrence of extreme geomagnetic storms are performed here based on the most relevant global parameters related to geomagnetic storms, such as ground parameters (e.g. the geomagnetic Dst and aa indexes) and space parameters related to the characteristics of Coronal Mass Ejections (CMEs) (velocity, southward magnetic field component, electric field). Using our fitted model, we estimate the annual probability of a Carrington-type event (Dst = -850 nT) to be on the order of 10^-3, with a lower limit of the uncertainties on the return period of ˜500 years. Our estimate is significantly higher than that of most past studies, which typically found a return period of a few hundred years at most. Thus precautions are required when extrapolating to intense values. Currently, the complexity of the processes and the length of available data inevitably lead to significant uncertainties in return period estimates for the occurrence of extreme geomagnetic storms. However, our application of extreme value models for extrapolating into the tail of the distribution provides a mathematically justified framework for the estimation of extreme return periods, thereby enabling the determination of more accurate estimates and reduced associated uncertainties.
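A minimal peaks-over-threshold analysis of this kind can be sketched with scipy (synthetic storm magnitudes; the threshold selection, index choice and uncertainty treatment in the actual study are considerably more involved):

```python
import numpy as np
from scipy.stats import genpareto

# toy annual-minimum Dst values (nT); real analyses use decades of hourly data
rng = np.random.default_rng(3)
dst_minima = -50 - 60 * rng.standard_exponential(60)  # 60 'years'

# peaks-over-threshold: model exceedances of storm magnitude |Dst| above u
mag = -dst_minima
u = 150.0
exc = mag[mag > u] - u
xi, _, sigma = genpareto.fit(exc, floc=0)  # fit a GPD to the exceedances

# annual probability that |Dst| exceeds a Carrington-like 850 nT
rate = len(exc) / 60  # threshold exceedances per year
p_annual = rate * genpareto.sf(850.0 - u, xi, loc=0, scale=sigma)
print(f"annual exceedance probability: {p_annual:.2e}")
```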
A Rigorous Treatment of Energy Extraction from a Rotating Black Hole
NASA Astrophysics Data System (ADS)
Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.
2009-05-01
The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou’s result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.
Wu, Zheyang; Zhao, Hongyu
2012-01-01
For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies.
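The flavor of such analytical power calculations can be conveyed by the simplest case, independent z statistics under Bonferroni control; the paper's framework additionally models linkage disequilibrium and correlated score tests, which this sketch omits:

```python
from scipy.stats import norm

def power_at_least_one(ncp, m, k, alpha=0.05):
    """Power of flagging at least one of k associated markers among m
    two-sided tests under Bonferroni control, assuming independent z
    statistics with noncentrality ncp (a deliberate simplification)."""
    z_crit = norm.isf(alpha / (2 * m))
    per_marker = norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)
    return 1 - (1 - per_marker) ** k

# e.g. 3 truly associated markers among 500,000 tests
print(power_at_least_one(ncp=4.5, m=5e5, k=3))
```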
diffHic: a Bioconductor package to detect differential genomic interactions in Hi-C data.
Lun, Aaron T L; Smyth, Gordon K
2015-08-19
Chromatin conformation capture with high-throughput sequencing (Hi-C) is a technique that measures the in vivo intensity of interactions between all pairs of loci in the genome. Most conventional analyses of Hi-C data focus on the detection of statistically significant interactions. However, an alternative strategy involves identifying significant changes in the interaction intensity (i.e., differential interactions) between two or more biological conditions. This is more statistically rigorous and may provide more biologically relevant results. Here, we present the diffHic software package for the detection of differential interactions from Hi-C data. diffHic provides methods for read pair alignment and processing, counting into bin pairs, filtering out low-abundance events and normalization of trended or CNV-driven biases. It uses the statistical framework of the edgeR package to model biological variability and to test for significant differences between conditions. Several options for the visualization of results are also included. The use of diffHic is demonstrated with real Hi-C data sets. Performance against existing methods is also evaluated with simulated data. On real data, diffHic is able to successfully detect interactions with significant differences in intensity between biological conditions. It also compares favourably to existing software tools on simulated data sets. These results suggest that diffHic is a viable approach for differential analyses of Hi-C data.
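Conceptually, each bin pair is tested with a negative binomial GLM on replicate counts. The sketch below shows that idea for a single bin pair in Python/statsmodels with a fixed dispersion and invented counts; diffHic itself works in R and relies on edgeR's empirical-Bayes dispersion estimation rather than this simplification:

```python
import numpy as np
import statsmodels.api as sm

# toy counts for one bin pair: 3 replicates per condition
counts  = np.array([25, 31, 22, 58, 64, 49])
cond    = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
libsize = np.array([1.0e6, 1.2e6, 0.9e6, 1.1e6, 1.0e6, 1.05e6])

# NB GLM with a log library-size offset; the Wald test on the condition
# coefficient asks whether interaction intensity differs between conditions
X = sm.add_constant(cond)
fit = sm.GLM(counts, X,
             family=sm.families.NegativeBinomial(alpha=0.1),
             offset=np.log(libsize)).fit()
print("log-fold change:", fit.params[1], " p-value:", fit.pvalues[1])
```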
Estimating pseudocounts and fold changes for digital expression measurements.
Erhard, Florian
2018-06-19
Fold changes from count-based high-throughput experiments such as RNA-seq suffer from a zero-frequency problem. To circumvent division by zero, so-called pseudocounts are added to make all observed counts strictly positive. The magnitude of pseudocounts for digital expression measurements, and the stage of the analysis at which they are introduced, has remained an arbitrary choice. Moreover, in the strict sense, fold changes are not quantities that can be computed. Instead, due to the stochasticity involved in the experiments, they must be estimated by statistical inference. Here, we build on a statistical framework for fold changes, where pseudocounts correspond to the parameters of the prior distribution used for Bayesian inference of the fold change. We show that arbitrary and widely used choices for applying pseudocounts can lead to biased results. As a statistically rigorous alternative, we propose and test an empirical Bayes procedure to choose appropriate pseudocounts. Moreover, we introduce the novel estimator Ψ LFC for fold changes, showing favorable properties with small counts and smaller deviations from the truth in simulations and real data compared to existing methods. Our results have direct implications for entities with few reads in sequencing experiments, and indirectly also affect results for entities with many reads. Ψ LFC is available as an R package under https://github.com/erhard-lab/lfc (Apache 2.0 license); R scripts to generate all figures are available at zenodo (doi:10.5281/zenodo.1163029).
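The core idea, pseudocounts as prior parameters, can be made concrete with a small Monte Carlo sketch: Beta priors on two proportions act exactly like pseudocounts, and the fold change is inferred from posterior draws. This is illustrative only, not the paper's empirical-Bayes calibration or its Ψ LFC estimator:

```python
import numpy as np

def posterior_log2_fc(k1, n1, k2, n2, a=0.5, b=0.5, draws=100_000, seed=4):
    """Posterior samples of the log2 fold change of two proportions
    k1/n1 vs k2/n2 under Beta(a, b) priors; the prior parameters a and b
    play the role of pseudocounts (hypothetical values chosen here)."""
    rng = np.random.default_rng(seed)
    p1 = rng.beta(k1 + a, n1 - k1 + b, draws)
    p2 = rng.beta(k2 + a, n2 - k2 + b, draws)
    return np.log2(p1 / p2)

# note: k2 = 0 causes no division by zero, unlike a naive fold change
lfc = posterior_log2_fc(k1=3, n1=10_000, k2=0, n2=10_000)
print(np.median(lfc), np.quantile(lfc, [0.025, 0.975]))
```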
McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W
2015-03-27
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
De Luca, Carlo J; Kline, Joshua C
2014-12-01
Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles--a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. Copyright © 2014 the American Physiological Society.
Separation Kernel Protection Profile Revisited: Choices and Rationale
2010-12-01
[Fragmentary excerpt: high-robustness systems "provide the most stringent protection and rigorous security countermeasures" [IATF]; in other words, robustness is not the same as assurance. The remainder of the extract is figure and reference-list residue (IATF Information Assurance Technical Framework, Chapter 4, Release 3.1, National Security Agency, September 2002; Karjoth).]
Removing Preconceptions with a "Learning Cycle."
ERIC Educational Resources Information Center
Gang, Su
1995-01-01
Describes a teaching experiment that uses the Learning Cycle to achieve the reorientation of physics students' conceptual frameworks away from commonsense perspectives toward scientifically rigorous outlooks. Uses Archimedes' principle as the content topic while using the Learning Cycle to remove students' nonscientific preconceptions. (JRH)
A computational framework for automation of point defect calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei
We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
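For context, the central quantity such workflows automate is the charged-defect formation energy; in standard notation (this general form is textbook material, though the package's exact conventions are an assumption here):

```latex
% Formation energy of defect X in charge state q: total energies of the
% defective and pristine supercells, chemical potentials mu_i of the n_i
% atoms added (n_i > 0) or removed (n_i < 0), the Fermi level referenced
% to the valence-band maximum, and E_corr collecting the finite-size
% corrections (1)-(3) listed above.
E_f[X^q] = E_{tot}[X^q] - E_{tot}[\mathrm{bulk}]
         - \sum_i n_i \mu_i + q \, (E_F + E_{VBM}) + E_{corr}
```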
The Impact of Climate Projection Method on the Analysis of Climate Change in Semi-arid Basins
NASA Astrophysics Data System (ADS)
Halper, E.; Shamir, E.
2016-12-01
In small basins with arid climates, rainfall characteristics are highly variable and streamflow is tightly coupled with the nuances of rainfall events (e.g., hourly precipitation patterns). Climate change assessments in these basins typically employ CMIP5 projections downscaled with the Bias-Correction Spatial Disaggregation and Bias-Correction Constructed Analogs (BCSD-BCCA) methods, but these products have drawbacks. Specifically, these BCSD-BCCA projections do not explicitly account for localized physical precipitation mechanisms (e.g., monsoon and snowfall) that are essential to many hydrological systems in the U.S. Southwest. An investigation of the impact of different types of precipitation projections on two kinds of hydrologic studies is being conducted under the U.S. Bureau of Reclamation's Science and Technology Grant Program. An innovative modeling framework, consisting of a weather generator of likely hourly precipitation scenarios coupled with rainfall-runoff, river routing, and groundwater models, has been developed for the Nogales, Arizona area. This framework can simulate the impact of future climate on municipal water operations. It also allows a rigorous comparison of the BCSD-BCCA methods with alternative approaches, including rainfall output from dynamically downscaled Regional Climate Models (RCM), a stochastic rainfall generator forced by either Global Climate Models (GCM) or RCMs, and projections using historical records conditioned on either GCMs or RCMs. The results will provide guidance for incorporating climate change projections into hydrologic studies of semi-arid areas. The project extends this comparison to analyses of flood control. Large flows on the Bill Williams River are a concern for the operation of dams along the Lower Colorado River. After adapting the weather generator for this region, we will evaluate model performance for rainfall and streamflow, with emphasis on statistical features important to the specific needs of flood management. The end product of the research is a test to guide selection of a precipitation projection method (including downscaling procedure) for a given region and objective.
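To make the weather-generator component concrete, here is a minimal sketch of a common core design: a two-state Markov chain for wet/dry hours with gamma-distributed depths. All parameters are illustrative; a real generator would be fitted to gauge records and conditioned on GCM/RCM output, and is not the study's actual model.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters; a fitted generator would estimate these from data.
p_wet_given_dry = 0.05   # hourly transition probabilities
p_wet_given_wet = 0.60
gamma_shape, gamma_scale = 0.7, 3.0  # hourly depth (mm) when wet

def simulate_hourly_rain(n_hours):
    wet = False
    depths = np.zeros(n_hours)
    for t in range(n_hours):
        # Persistence: the chance of rain depends on the previous hour's state.
        p = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p
        if wet:
            depths[t] = rng.gamma(gamma_shape, gamma_scale)
    return depths

scenario = simulate_hourly_rain(24 * 365)
print(f"annual total: {scenario.sum():.0f} mm, wet hours: {(scenario > 0).sum()}")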
Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme
2008-01-01
Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
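A minimal numerical sketch of the neutral expectation follows; it estimates the proportionality coefficient between D and G by a simple Frobenius projection and compares it to 2Fst/(1 − Fst). This is a simplified stand-in for the CPC/Bartlett machinery the authors actually use, with toy matrices.

import numpy as np

rng = np.random.default_rng(1)

def neutral_coefficient(fst):
    """Neutral expectation: D = [2*Fst / (1 - Fst)] * G."""
    return 2.0 * fst / (1.0 - fst)

def fitted_coefficient(D, G):
    """Least-squares c minimizing ||D - c*G||_F (Frobenius projection)."""
    return np.sum(D * G) / np.sum(G * G)

fst = 0.2
G = np.array([[1.0, 0.3],
              [0.3, 0.8]])                       # within-population covariance
D = neutral_coefficient(fst) * G + 0.02 * rng.standard_normal((2, 2))
D = (D + D.T) / 2                                # keep the estimate symmetric

c_hat, c_neutral = fitted_coefficient(D, G), neutral_coefficient(fst)
print(f"fitted c = {c_hat:.3f}, neutral expectation = {c_neutral:.3f}")
# A real test would add the CPC proportionality test with a Bartlett
# adjustment and a confidence interval for c_hat, as in the article.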
Diehl, Glen; Major, Solomon
2015-01-01
Measuring the effectiveness of military Global Health Engagements (GHEs) has become an area of increasing interest to the military medical field. As a result, there have been efforts to more logically and rigorously evaluate GHE projects and programs; many of these have been based on the Logic and Results Frameworks. However, while these Frameworks are apt and appropriate planning tools, they are not ideally suited to measuring programs' effectiveness. This article introduces military medicine professionals to the Measures of Effectiveness for Defense Engagement and Learning (MODEL) program, which implements a new method of assessment, one that seeks to rigorously use Measures of Effectiveness (vs. Measures of Performance) to gauge programs' and projects' success and fidelity to Theater Campaign goals. While the MODEL method draws on the Logic and Results Frameworks where appropriate, it goes beyond their planning focus by using the latest social scientific and econometric evaluation methodologies to link on-the-ground GHE "lines of effort" to the realization of national and strategic goals and end-states. It is hoped these methods will find use beyond the MODEL project itself, and will catalyze a new body of rigorous, empirically based work, which measures the effectiveness of a broad spectrum of GHE and security cooperation activities. We based our strategies on the principle that it is much more cost-effective to prevent conflicts than it is to stop one once it's started. I cannot overstate the importance of our theater security cooperation programs as the centerpiece to securing our Homeland from the irregular and catastrophic threats of the 21st Century.-GEN James L. Jones, USMC (Ret.). Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
NASA Technical Reports Server (NTRS)
Thomas-Keprta, Kathie L.; Clemett, Simon J.; Bazylinski, Dennis A.; Kirschvink, Joseph L.; McKay, David S.; Wentworth, Susan J.; Vali, H.; Gibson, Everett K.
2000-01-01
Here we use rigorous mathematical modeling to compare ALH84001 prismatic magnetites with those produced by terrestrial magnetotactic bacteria, MV-1. We find that this subset of the Martian magnetites appears to be statistically indistinguishable from those of MV-1.
An ex post facto evaluation framework for place-based police interventions.
Braga, Anthony A; Hureau, David M; Papachristos, Andrew V
2011-12-01
A small but growing body of research evidence suggests that place-based police interventions generate significant crime control gains. While place-based policing strategies have been adopted by a majority of U.S. police departments, very few agencies make a priori commitments to rigorous evaluations. Recent methodological developments were applied to conduct a rigorous ex post facto evaluation of the Boston Police Department's Safe Street Team (SST) hot spots policing program. A nonrandomized quasi-experimental design was used to evaluate the violent crime control benefits of the SST program at treated street segments and intersections relative to untreated street segments and intersections. Propensity score matching techniques were used to identify comparison places in Boston. Growth curve regression models were used to analyze violent crime trends at treatment places relative to control places. Using computerized mapping and database software, a micro-level place database of violent index crimes at all street segments and intersections in Boston was created. Yearly counts of violent index crimes between 2000 and 2009 at the treatment and comparison street segments and intersections served as the key outcome measure. The SST program was associated with a statistically significant reduction in violent index crimes at the treatment places relative to the comparison places without displacing crime into proximate areas. To overcome the challenges of evaluation in real-world settings, evaluators need to continuously develop innovative approaches that take advantage of new theoretical and methodological approaches.
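The following sketch illustrates the propensity-score matching step on synthetic street-segment data; the covariates and coefficients are hypothetical, not the study's variables.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)

# Hypothetical covariates for street segments: prior violent crime counts
# and a simple disadvantage index.
n = 500
X = np.column_stack([rng.poisson(5, n), rng.normal(0, 1, n)])
treated = rng.random(n) < 1 / (1 + np.exp(-(0.3 * X[:, 0] - 2)))  # selection on risk

# 1. Propensity score: estimated probability of treatment given covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. For each treated segment, find the nearest untreated segment by score.
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
controls = np.flatnonzero(~treated)[idx.ravel()]
print(f"{treated.sum()} treated segments matched to {len(set(controls))} controls")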
Designing a mixed methods study in primary care.
Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V
2004-01-01
Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.
First-Order Frameworks for Managing Models in Engineering Optimization
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia M.; Lewis, Robert Michael
2000-01-01
Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and the scope of applicability of the first-order frameworks.
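The first-order consistency idea behind such frameworks can be sketched in a few lines: an additive correction forces the cheap model to match the high-fidelity value and gradient at the current iterate. The models below are illustrative stand-ins, not the frameworks' actual test problems.

import numpy as np

def f_hi(x):    # expensive high-fidelity model (illustrative stand-in)
    return np.sin(x) + 0.05 * x**2

def df_hi(x):
    return np.cos(x) + 0.1 * x

def f_lo(x):    # cheap low-fidelity model
    return x - x**3 / 6

def df_lo(x):
    return 1 - x**2 / 2

def corrected_lo(x, x0):
    """Additive correction: matches f_hi and its gradient at x0, the
    first-order consistency required for provable AMMO convergence."""
    a = f_hi(x0) - f_lo(x0)
    b = df_hi(x0) - df_lo(x0)
    return f_lo(x) + a + b * (x - x0)

x0 = 1.0
for x in (1.0, 1.1):
    print(x, f_hi(x), corrected_lo(x, x0))
# At x0 the corrected model reproduces f_hi exactly; nearby it tracks it to
# first order, so the optimizer can trust steps taken on the cheap model.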
Statistical hydrodynamics and related problems in spaces of probability measures
NASA Astrophysics Data System (ADS)
Dostoglou, Stamatios
2017-11-01
A rigorous theory of statistical solutions of the Navier-Stokes equations, suitable for exploring Kolmogorov's ideas, has been developed by M.I. Vishik and A.V. Fursikov, culminating in their monograph "Mathematical problems of Statistical Hydromechanics." We review some progress made in recent years following this approach, with emphasis on problems concerning the correlation of velocities and corresponding questions in the space of probability measures on Hilbert spaces.
Gerber, Brian D.; Kendall, William L.; Hooten, Mevin B.; Dubovsky, James A.; Drewien, Roderick C.
2015-01-01
Prediction is fundamental to scientific enquiry and application; however, ecologists tend to favour explanatory modelling. We discuss a predictive modelling framework to evaluate ecological hypotheses and to explore novel/unobserved environmental scenarios to assist conservation and management decision-makers. We apply this framework to develop an optimal predictive model for juvenile (<1 year old) sandhill crane Grus canadensis recruitment of the Rocky Mountain Population (RMP). We consider spatial climate predictors motivated by hypotheses of how drought across multiple time-scales and spring/summer weather affects recruitment. Our predictive modelling framework focuses on developing a single model that includes all relevant predictor variables, regardless of collinearity. This model is then optimized for prediction by controlling model complexity using a data-driven approach that marginalizes or removes irrelevant predictors from the model. Specifically, we highlight two approaches of statistical regularization, Bayesian least absolute shrinkage and selection operator (LASSO) and ridge regression. Our optimal predictive Bayesian LASSO and ridge regression models were similar and on average 37% superior in predictive accuracy to an explanatory modelling approach. Our predictive models confirmed a priori hypotheses that drought and cold summers negatively affect juvenile recruitment in the RMP. The effects of long-term drought can be alleviated by short-term wet spring–summer months; however, the alleviation of long-term drought has a much greater positive effect on juvenile recruitment. The number of freezing days and snowpack during the summer months can also negatively affect recruitment, while spring snowpack has a positive effect. Breeding habitat, mediated through climate, is a limiting factor on population growth of sandhill cranes in the RMP, which could become more limiting with a changing climate (i.e. increased drought). These effects are likely not unique to cranes. The alteration of hydrological patterns and water levels by drought may impact many migratory, wetland nesting birds in the Rocky Mountains and beyond. Generalizable predictive models (trained by out-of-sample fit and based on ecological hypotheses) are needed by conservation and management decision-makers. Statistical regularization improves predictions and provides a general framework for fitting models with a large number of predictors, even those with collinearity, to simultaneously identify an optimal predictive model while conducting rigorous Bayesian model selection. Our framework is important for understanding population dynamics under a changing climate and has direct applications for making harvest and habitat management decisions.
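A minimal sketch of the regularization step, using scikit-learn's BayesianRidge and LassoCV as off-the-shelf analogues of the article's Bayesian ridge and LASSO models; the data are synthetic and deliberately collinear.

import numpy as np
from sklearn.linear_model import BayesianRidge, LassoCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Synthetic stand-in: many collinear climate predictors, few informative.
n, p = 120, 15
X = rng.standard_normal((n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.standard_normal(n)       # deliberate collinearity
y = 0.8 * X[:, 0] - 0.6 * X[:, 2] + 0.3 * rng.standard_normal(n)

for name, model in [("ridge", BayesianRidge()), ("lasso", LassoCV(cv=5))]:
    r2 = cross_val_score(model, X, y, cv=5).mean()     # out-of-sample fit
    print(f"{name}: mean cross-validated R^2 = {r2:.2f}")
# Regularization shrinks or zeroes the irrelevant, collinear coefficients,
# which is what lets a single all-predictor model remain predictive.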
Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim
2012-01-01
Context: Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. Methods: We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Findings: Having a stated objective of reducing child maltreatment, a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs, and program components that can deliver against the nominated theory of change considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Conclusions: Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. PMID:22428693
Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim
2012-03-01
Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Having a stated objective of reducing child maltreatment, a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs, and program components that can deliver against the nominated theory of change considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. © 2012 Milbank Memorial Fund.
Use of software engineering techniques in the design of the ALEPH data acquisition system
NASA Astrophysics Data System (ADS)
Charity, T.; McClatchey, R.; Harvey, J.
1987-08-01
The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.
Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.
Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas
2016-06-17
Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.
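As a toy illustration of DOE-style library design (not Double Dutch's actual grammar or algorithms), the sketch below enumerates a small full factorial of parts and picks the most level-balanced fraction by random search; all part names are hypothetical.

import itertools
import random

random.seed(4)

# Toy combinatorial space: 3 promoters x 3 RBSs x 2 terminators per gene.
promoters   = ["pLow", "pMed", "pHigh"]
rbss        = ["rbsA", "rbsB", "rbsC"]
terminators = ["t1", "t2"]
full_factorial = list(itertools.product(promoters, rbss, terminators))  # 18 variants

def balance_score(library):
    """Penalize uneven usage of each part level (a crude optimality proxy)."""
    score = 0
    for position in range(3):
        counts = {}
        for variant in library:
            counts[variant[position]] = counts.get(variant[position], 0) + 1
        score -= max(counts.values()) - min(counts.values())
    return score

# Pick the most balanced of many random 6-variant fractions.
best = max((random.sample(full_factorial, 6) for _ in range(2000)),
           key=balance_score)
for v in best:
    print(v)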
Methodological Issues in Trials of Complementary and Alternative Medicine Interventions
Sikorskii, Alla; Wyatt, Gwen; Victorson, David; Faulkner, Gwen; Rahbar, Mohammad Hossein
2010-01-01
Background: Complementary and alternative medicine (CAM) use is widespread among cancer patients. Information on the safety and efficacy of CAM therapies is needed by both patients and health care providers. Well-designed randomized clinical trials (RCTs) of CAM therapy interventions can inform both clinical research and practice. Objectives: To review important issues that affect the design of RCTs for CAM interventions. Methods: Using the methods component of the Consolidated Standards for Reporting Trials (CONSORT) as a guiding framework, and a National Cancer Institute-funded reflexology study as an exemplar, methodological issues related to participants, intervention, objectives, outcomes, sample size, randomization, blinding, and statistical methods were reviewed. Discussion: Trials of CAM interventions designed and implemented according to appropriate methodological standards will facilitate the needed scientific rigor in CAM research. Interventions in CAM can be tested using the proposed methodology, and the results of testing will inform nursing practice in providing safe and effective supportive care and improving the well-being of patients. PMID:19918155
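For instance, the sample-size item reduces to a standard power calculation. A minimal sketch with statsmodels, using illustrative planning numbers rather than the reflexology study's actual design parameters:

from statsmodels.stats.power import TTestIndPower

# Illustrative planning numbers: detect a standardized effect of 0.4
# (e.g. intervention vs. control on a symptom score) with 80% power.
n_per_arm = TTestIndPower().solve_power(effect_size=0.4, power=0.80, alpha=0.05)
print(f"about {n_per_arm:.0f} participants per arm before attrition")
# Trials then inflate for expected dropout, e.g. n / (1 - 0.15) for 15% attrition.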
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
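A minimal sketch of the ensemble Kalman filter analysis step mentioned above, in its stochastic (perturbed-observations) form for a single scalar observation; all numbers are toy values, not from the presentation.

import numpy as np

rng = np.random.default_rng(5)

def enkf_update(ensemble, h, y, obs_var):
    """Stochastic (perturbed-observations) EnKF analysis step for one
    scalar observation. ensemble: (n_members, n_state); h: (n_state,)."""
    hx = ensemble @ h                                  # forecast observations
    p_hh = np.var(hx, ddof=1) + obs_var                # innovation variance
    anoms = ensemble - ensemble.mean(axis=0)
    p_xh = anoms.T @ (hx - hx.mean()) / (len(ensemble) - 1)
    gain = p_xh / p_hh                                 # Kalman gain, (n_state,)
    y_pert = y + rng.normal(0.0, np.sqrt(obs_var), len(ensemble))
    return ensemble + np.outer(y_pert - hx, gain)

# Toy two-variable state observed through its first component.
ens = rng.normal([1.0, -0.5], 0.5, size=(100, 2))
post = enkf_update(ens, h=np.array([1.0, 0.0]), y=1.8, obs_var=0.1**2)
print("prior mean:", ens.mean(axis=0), "posterior mean:", post.mean(axis=0))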
Probabilistic risk analysis of building contamination.
Bolster, D T; Tartakovsky, D M
2008-10-01
We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
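A minimal sketch of the fault-tree combination step, assuming independent components; the gates and probabilities are hypothetical, not taken from the paper.

def p_or(*ps):
    """OR gate: the subsystem fails if any input fails (independent components)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    """AND gate: the subsystem fails only if all inputs fail (independent components)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Illustrative top event: contaminant reaches occupants if a source is
# present AND (filtration fails OR the ventilation fan fails).
p_source, p_filter, p_fan = 0.05, 0.02, 0.01
p_top = p_and(p_source, p_or(p_filter, p_fan))
print(f"P(building contamination) = {p_top:.5f}")
# Sensitivity check in the spirit of the paper: which component matters most?
for name, bumped in [("filter", p_and(p_source, p_or(0.04, p_fan))),
                     ("fan",    p_and(p_source, p_or(p_filter, 0.02)))]:
    print(name, f"doubled -> {bumped:.5f}")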
Lee, Duncan; Mukhopadhyay, Sabyasachi; Rushworth, Alastair; Sahu, Sujit K
2017-04-01
In the United Kingdom, air pollution is linked to around 40000 premature deaths each year, but estimating its health effects is challenging in a spatio-temporal study. The challenges include spatial misalignment between the pollution and disease data; uncertainty in the estimated pollution surface; and complex residual spatio-temporal autocorrelation in the disease data. This article develops a two-stage model that addresses these issues. The first stage is a spatio-temporal fusion model linking modeled and measured pollution data, while the second stage links these predictions to the disease data. The methodology is motivated by a new five-year study investigating the effects of multiple pollutants on respiratory hospitalizations in England between 2007 and 2011, using pollution and disease data relating to local and unitary authorities on a monthly time scale. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
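A schematic two-stage sketch follows: stage 1 calibrates modeled against measured pollution, and stage 2 links the calibrated predictions to counts through a Poisson model with a population offset. It is deliberately non-spatial and ignores the uncertainty propagation and autocorrelation structure that the article's model handles; the data are synthetic.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)

# Stage 1: fusion/calibration of modeled pollution against monitor data.
n = 200
modeled = rng.uniform(5, 40, n)                        # dispersion-model PM2.5
measured = 2.0 + 0.8 * modeled + rng.normal(0, 2, n)   # monitor readings
stage1 = sm.OLS(measured, sm.add_constant(modeled)).fit()
predicted = stage1.predict(sm.add_constant(modeled))   # calibrated surface

# Stage 2: link predicted pollution to hospitalization counts.
expected = rng.uniform(50, 150, n)                     # population-based offset
counts = rng.poisson(expected * np.exp(0.01 * (predicted - predicted.mean())))
stage2 = sm.GLM(counts, sm.add_constant(predicted),
                offset=np.log(expected),
                family=sm.families.Poisson()).fit()
print(f"relative risk per unit PM2.5: {np.exp(stage2.params[1]):.4f}")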
DOT National Transportation Integrated Search
2015-12-01
MAP-21 and AASHTO's framework for transportation asset management (TAM) offer opportunities to use more rigorous approaches to collect and apply evidence within a TAM context. This report documents the results of a study funded by the Georgia D...
"No Excuses" in New Orleans: The Silent Passivity of Neoliberal Schooling
ERIC Educational Resources Information Center
Sondel, Beth
2016-01-01
Drawing on ethnographic data, this article critically analyzes pedagogy in "no excuses" charter schools in New Orleans. Employing Ladson-Billings's framework for culturally relevant pedagogy, the author describes the level of academic rigor, cultural competence, and critical consciousness development across classrooms. This study…
Accessing Social Capital through the Academic Mentoring Process
ERIC Educational Resources Information Center
Smith, Buffy
2007-01-01
This article explores how mentors and mentees create and maintain social capital during the mentoring process. I employ a sociological conceptual framework and rigorous qualitative analytical techniques to examine how students of color and first-generation college students access social capital through mentoring relationships. The findings…
Quantifying falsifiability of scientific theories
NASA Astrophysics Data System (ADS)
Nemenman, Ilya
I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
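A toy version of the argument: compare the marginal likelihood of a sharp, falsifiable model against a maximally flexible one on coin-flip data. The beta-binomial marginal likelihood is closed-form, so the Occam penalty appears automatically; the data are invented for illustration.

from math import comb
import numpy as np
from scipy.special import betaln

def ml_point(k, n, theta):
    """Marginal likelihood of a sharp model: fixed success probability."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

def ml_flexible(k, n, a=1.0, b=1.0):
    """Marginal likelihood of a flexible model: theta ~ Beta(a, b),
    integrated out in closed form (beta-binomial)."""
    return comb(n, k) * np.exp(betaln(k + a, n - k + b) - betaln(a, b))

k, n = 52, 100                       # observed heads out of flips
bf = ml_point(k, n, 0.5) / ml_flexible(k, n)
print(f"Bayes factor (fair coin vs. anything goes): {bf:.2f}")
# Near-fair data reward the sharp, falsifiable model; the flexible model
# pays an automatic Occam penalty for spreading its predictions thinly.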
Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem
2018-01-01
This study was conducted to evaluate the effects of pre- and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid, and 2% NaCl+0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on the marinade uptake of samples. Regardless of marinade formulation, the marinade uptake of pre-rigor samples was higher than that of post-rigor samples. Injection of sodium lactate increased the pH of samples, whereas lactic acid injection decreased it. Marinade treatment and storage period had significant effects on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had significant effects on drip loss values, and drip loss in all samples increased during storage. On all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fading in both pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in the CIE b* values of any samples. PMID:29805282
Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem
2018-04-01
This study was conducted to evaluate the effects of pre- and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid, and 2% NaCl+0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on the marinade uptake of samples. Regardless of marinade formulation, the marinade uptake of pre-rigor samples was higher than that of post-rigor samples. Injection of sodium lactate increased the pH of samples, whereas lactic acid injection decreased it. Marinade treatment and storage period had significant effects on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had significant effects on drip loss values, and drip loss in all samples increased during storage. On all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fading in both pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in the CIE b* values of any samples.
What to Do When K-Means Clustering Fails: A Simple yet Principled Alternative Algorithm.
Raykov, Yordan P; Boukouvalas, Alexis; Baig, Fahd; Little, Max A
The K-means algorithm is one of the most popular clustering algorithms in current use as it is relatively fast yet simple to understand and deploy in practice. Nevertheless, its use entails certain restrictive assumptions about the data, the negative consequences of which are not always immediately apparent, as we demonstrate. While more flexible algorithms have been developed, their widespread use has been hindered by their computational and technical complexity. Motivated by these considerations, we present a flexible alternative to K-means that relaxes most of the assumptions, whilst remaining almost as fast and simple. This novel algorithm which we call MAP-DP (maximum a-posteriori Dirichlet process mixtures), is statistically rigorous as it is based on nonparametric Bayesian Dirichlet process mixture modeling. This approach allows us to overcome most of the limitations imposed by K-means. The number of clusters K is estimated from the data instead of being fixed a-priori as in K-means. In addition, while K-means is restricted to continuous data, the MAP-DP framework can be applied to many kinds of data, for example, binary, count or ordinal data. Also, it can efficiently separate outliers from the data. This additional flexibility does not incur a significant computational overhead compared to K-means with MAP-DP convergence typically achieved in the order of seconds for many practical problems. Finally, in contrast to K-means, since the algorithm is based on an underlying statistical model, the MAP-DP framework can deal with missing data and enables model testing such as cross validation in a principled way. We demonstrate the simplicity and effectiveness of this algorithm on the health informatics problem of clinical sub-typing in a cluster of diseases known as parkinsonism.
What to Do When K-Means Clustering Fails: A Simple yet Principled Alternative Algorithm
Baig, Fahd; Little, Max A.
2016-01-01
The K-means algorithm is one of the most popular clustering algorithms in current use as it is relatively fast yet simple to understand and deploy in practice. Nevertheless, its use entails certain restrictive assumptions about the data, the negative consequences of which are not always immediately apparent, as we demonstrate. While more flexible algorithms have been developed, their widespread use has been hindered by their computational and technical complexity. Motivated by these considerations, we present a flexible alternative to K-means that relaxes most of the assumptions, whilst remaining almost as fast and simple. This novel algorithm which we call MAP-DP (maximum a-posteriori Dirichlet process mixtures), is statistically rigorous as it is based on nonparametric Bayesian Dirichlet process mixture modeling. This approach allows us to overcome most of the limitations imposed by K-means. The number of clusters K is estimated from the data instead of being fixed a-priori as in K-means. In addition, while K-means is restricted to continuous data, the MAP-DP framework can be applied to many kinds of data, for example, binary, count or ordinal data. Also, it can efficiently separate outliers from the data. This additional flexibility does not incur a significant computational overhead compared to K-means with MAP-DP convergence typically achieved in the order of seconds for many practical problems. Finally, in contrast to K-means, since the algorithm is based on an underlying statistical model, the MAP-DP framework can deal with missing data and enables model testing such as cross validation in a principled way. We demonstrate the simplicity and effectiveness of this algorithm on the health informatics problem of clinical sub-typing in a cluster of diseases known as parkinsonism. PMID:27669525
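For readers who want to try the idea without implementing MAP-DP itself, scikit-learn's BayesianGaussianMixture with a Dirichlet-process prior is an off-the-shelf stand-in (variational rather than MAP inference, but it likewise infers the number of clusters from the data). The example below shows a classic K-means failure mode on synthetic data.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(7)

# Two elongated, unequal clusters: a standard K-means failure mode.
a = rng.multivariate_normal([0, 0], [[4.0, 0], [0, 0.1]], 300)
b = rng.multivariate_normal([0, 3], [[0.2, 0], [0, 0.2]], 50)
X = np.vstack([a, b])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

dp = BayesianGaussianMixture(
    n_components=10,                                   # an upper bound, not a fixed K
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full", random_state=0).fit(X)
used = np.unique(dp.predict(X))
print("K-means was forced to K=2; the DP mixture used",
      len(used), "of 10 available components")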
Hierarchical spatial models for predicting pygmy rabbit distribution and relative abundance
Wilson, T.L.; Odei, J.B.; Hooten, M.B.; Edwards, T.C.
2010-01-01
Conservationists routinely use species distribution models to plan conservation, restoration and development actions, while ecologists use them to infer process from pattern. These models tend to work well for common or easily observable species, but are of limited utility for rare and cryptic species. This may be because honest accounting of known observation bias and spatial autocorrelation are rarely included, thereby limiting statistical inference of resulting distribution maps. We specified and implemented a spatially explicit Bayesian hierarchical model for a cryptic mammal species (pygmy rabbit Brachylagus idahoensis). Our approach used two levels of indirect sign that are naturally hierarchical (burrows and faecal pellets) to build a model that allows for inference on regression coefficients as well as spatially explicit model parameters. We also produced maps of rabbit distribution (occupied burrows) and relative abundance (number of burrows expected to be occupied by pygmy rabbits). The model demonstrated statistically rigorous spatial prediction by including spatial autocorrelation and measurement uncertainty. We demonstrated flexibility of our modelling framework by depicting probabilistic distribution predictions using different assumptions of pygmy rabbit habitat requirements. Spatial representations of the variance of posterior predictive distributions were obtained to evaluate heterogeneity in model fit across the spatial domain. Leave-one-out cross-validation was conducted to evaluate the overall model fit. Synthesis and applications. Our method draws on the strengths of previous work, thereby bridging and extending two active areas of ecological research: species distribution models and multi-state occupancy modelling. Our framework can be extended to encompass both larger extents and other species for which direct estimation of abundance is difficult. © 2010 The Authors. Journal compilation © 2010 British Ecological Society.
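The core trick in such models, marginalizing a latent occupancy state out of the likelihood, can be shown in miniature. The sketch below fits a single-season, non-spatial occupancy model (occupancy psi, detection p), far simpler than the article's hierarchical burrow/pellet model; the data are simulated.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

rng = np.random.default_rng(8)

# Simulate: each site occupied with prob psi; if occupied, the species is
# detected on each of J visits with prob p.
psi_true, p_true, n_sites, J = 0.4, 0.3, 400, 5
z = rng.random(n_sites) < psi_true
y = rng.binomial(J, p_true * z)                        # detections per site

def negloglik(params):
    psi, p = 1.0 / (1.0 + np.exp(-np.asarray(params)))  # logit -> probability
    # Marginalize the latent state: P(y) = psi*Binom(y|J,p) + (1-psi)*1{y=0}
    lik = psi * binom.pmf(y, J, p) + (1.0 - psi) * (y == 0)
    return -np.log(lik).sum()

fit = minimize(negloglik, x0=[0.0, 0.0])
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-fit.x))
print(f"psi_hat={psi_hat:.2f} (true 0.4), p_hat={p_hat:.2f} (true 0.3)")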
Dinosaurs in decline tens of millions of years before their final extinction
Sakamoto, Manabu; Benton, Michael J.; Venditti, Chris
2016-01-01
Whether dinosaurs were in a long-term decline or whether they were reigning strong right up to their final disappearance at the Cretaceous–Paleogene (K-Pg) mass extinction event 66 Mya has been debated for decades with no clear resolution. The dispute has continued unresolved because of a lack of statistical rigor and appropriate evolutionary framework. Here, for the first time to our knowledge, we apply a Bayesian phylogenetic approach to model the evolutionary dynamics of speciation and extinction through time in Mesozoic dinosaurs, properly taking account of previously ignored statistical violations. We find overwhelming support for a long-term decline across all dinosaurs and within all three dinosaurian subclades (Ornithischia, Sauropodomorpha, and Theropoda), where speciation rate slowed down through time and was ultimately exceeded by extinction rate tens of millions of years before the K-Pg boundary. The only exceptions to this general pattern are the morphologically specialized herbivores, the Hadrosauriformes and Ceratopsidae, which show rapid species proliferations throughout the Late Cretaceous instead. Our results highlight that, despite some heterogeneity in speciation dynamics, dinosaurs showed a marked reduction in their ability to replace extinct species with new ones, making them vulnerable to extinction and unable to respond quickly to and recover from the final catastrophic event. PMID:27092007
Dinosaurs in decline tens of millions of years before their final extinction.
Sakamoto, Manabu; Benton, Michael J; Venditti, Chris
2016-05-03
Whether dinosaurs were in a long-term decline or whether they were reigning strong right up to their final disappearance at the Cretaceous-Paleogene (K-Pg) mass extinction event 66 Mya has been debated for decades with no clear resolution. The dispute has continued unresolved because of a lack of statistical rigor and appropriate evolutionary framework. Here, for the first time to our knowledge, we apply a Bayesian phylogenetic approach to model the evolutionary dynamics of speciation and extinction through time in Mesozoic dinosaurs, properly taking account of previously ignored statistical violations. We find overwhelming support for a long-term decline across all dinosaurs and within all three dinosaurian subclades (Ornithischia, Sauropodomorpha, and Theropoda), where speciation rate slowed down through time and was ultimately exceeded by extinction rate tens of millions of years before the K-Pg boundary. The only exceptions to this general pattern are the morphologically specialized herbivores, the Hadrosauriformes and Ceratopsidae, which show rapid species proliferations throughout the Late Cretaceous instead. Our results highlight that, despite some heterogeneity in speciation dynamics, dinosaurs showed a marked reduction in their ability to replace extinct species with new ones, making them vulnerable to extinction and unable to respond quickly to and recover from the final catastrophic event.
High-Dimensional Multivariate Repeated Measures Analysis with Unequal Covariance Matrices.
Harrar, Solomon W; Kong, Xiaoli
2015-03-01
In this paper, test statistics for repeated measures designs are introduced for the case in which the dimension is large, meaning that the number of repeated measures and the total sample size grow together, with either one possibly larger than the other. Asymptotic distributions of the statistics are derived for the equal as well as unequal covariance cases, in both balanced and unbalanced settings. The asymptotic framework requires proportional growth of the sample sizes and the dimension of the repeated measures in the unequal covariance case; in the equal covariance case, one can grow at a much faster rate than the other. The derivations of the asymptotic distributions mimic that of the Central Limit Theorem, with some important peculiarities addressed with sufficient rigor. Consistent and unbiased estimators of the asymptotic variances, which make efficient use of all the observations, are also derived. A simulation study provides favorable evidence for the accuracy of the asymptotic approximation under the null hypothesis. Power simulations show that the new methods have power comparable to a popular method known to work well in low-dimensional situations, and an enormous advantage when the dimension is large. Data from an electroencephalograph (EEG) experiment are analyzed to illustrate the application of the results.
High-Dimensional Multivariate Repeated Measures Analysis with Unequal Covariance Matrices
Harrar, Solomon W.; Kong, Xiaoli
2015-01-01
In this paper, test statistics for repeated measures designs are introduced for the case in which the dimension is large, meaning that the number of repeated measures and the total sample size grow together, with either one possibly larger than the other. Asymptotic distributions of the statistics are derived for the equal as well as unequal covariance cases, in both balanced and unbalanced settings. The asymptotic framework requires proportional growth of the sample sizes and the dimension of the repeated measures in the unequal covariance case; in the equal covariance case, one can grow at a much faster rate than the other. The derivations of the asymptotic distributions mimic that of the Central Limit Theorem, with some important peculiarities addressed with sufficient rigor. Consistent and unbiased estimators of the asymptotic variances, which make efficient use of all the observations, are also derived. A simulation study provides favorable evidence for the accuracy of the asymptotic approximation under the null hypothesis. Power simulations show that the new methods have power comparable to a popular method known to work well in low-dimensional situations, and an enormous advantage when the dimension is large. Data from an electroencephalograph (EEG) experiment are analyzed to illustrate the application of the results. PMID:26778861
Dinosaurs in decline tens of millions of years before their final extinction
NASA Astrophysics Data System (ADS)
Sakamoto, Manabu; Benton, Michael J.
2016-05-01
Whether dinosaurs were in a long-term decline or whether they were reigning strong right up to their final disappearance at the Cretaceous-Paleogene (K-Pg) mass extinction event 66 Mya has been debated for decades with no clear resolution. The dispute has continued unresolved because of a lack of statistical rigor and appropriate evolutionary framework. Here, for the first time to our knowledge, we apply a Bayesian phylogenetic approach to model the evolutionary dynamics of speciation and extinction through time in Mesozoic dinosaurs, properly taking account of previously ignored statistical violations. We find overwhelming support for a long-term decline across all dinosaurs and within all three dinosaurian subclades (Ornithischia, Sauropodomorpha, and Theropoda), where speciation rate slowed down through time and was ultimately exceeded by extinction rate tens of millions of years before the K-Pg boundary. The only exceptions to this general pattern are the morphologically specialized herbivores, the Hadrosauriformes and Ceratopsidae, which show rapid species proliferations throughout the Late Cretaceous instead. Our results highlight that, despite some heterogeneity in speciation dynamics, dinosaurs showed a marked reduction in their ability to replace extinct species with new ones, making them vulnerable to extinction and unable to respond quickly to and recover from the final catastrophic event.
Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C
2015-02-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
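As an example of the repeatability metrics involved, the sketch below computes the within-subject SD and the repeatability coefficient RC = 2.77 x wSD from synthetic test-retest measurements; all numbers are illustrative.

import numpy as np

rng = np.random.default_rng(9)

# Synthetic test-retest tumor-volume measurements (same scanner, same day).
true_vol = rng.uniform(10, 50, 40)                   # 40 subjects
scan1 = true_vol + rng.normal(0, 1.5, 40)            # 1.5 = within-subject SD
scan2 = true_vol + rng.normal(0, 1.5, 40)

diffs = scan1 - scan2
wsd = np.sqrt(np.mean(diffs**2) / 2.0)               # within-subject SD estimate
rc = 2.77 * wsd                                      # repeatability coefficient
print(f"wSD = {wsd:.2f}, RC = {rc:.2f}")
# Interpretation: 95% of repeat scans of an unchanged subject differ by less
# than RC, so only changes larger than RC are attributable to biology.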
Host and parasite morphology influence congruence between host and parasite phylogenies.
Sweet, Andrew D; Bush, Sarah E; Gustafsson, Daniel R; Allen, Julie M; DiBlasi, Emily; Skeen, Heather R; Weckstein, Jason D; Johnson, Kevin P
2018-03-23
Comparisons of host and parasite phylogenies often show varying degrees of phylogenetic congruence. However, few studies have rigorously explored the factors driving this variation. Multiple factors such as host or parasite morphology may govern the degree of phylogenetic congruence. An ideal analysis for understanding the factors correlated with congruence would focus on a diverse host-parasite system for increased variation and statistical power. In this study, we focused on the Brueelia-complex, a diverse and widespread group of feather lice that primarily parasitise songbirds. We generated a molecular phylogeny of the lice and compared this tree with a phylogeny of their avian hosts. We also tested for the contribution of each host-parasite association to the overall congruence. The two trees overall were significantly congruent, but the contribution of individual associations to this congruence varied. To understand this variation, we developed a novel approach to test whether host, parasite or biogeographic factors were statistically associated with patterns of congruence. Both host plumage dimorphism and parasite ecomorphology were associated with patterns of congruence, whereas host body size, other plumage traits and biogeography were not. Our results lay the groundwork for future studies to further elucidate how these factors influence the process of host-parasite coevolution. Copyright © 2018 Australian Society for Parasitology. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Figueroa, Julie López; Rodriguez, Gloria M.
2015-01-01
This chapter outlines critical practices that emerged from utilizing social justice frameworks to mentor first-generation, underrepresented minority students at the undergraduate to doctoral levels. The mentoring strategies include helping students to reframe instances when faculty and peers unconsciously conflate academic rigor with color-blind…
ERIC Educational Resources Information Center
DeVillar, Robert A.; Jiang, Binbin
2011-01-01
Creatively and rigorously blending historical research and contemporary data from various disciplines, this book cogently and comprehensively illustrates the problems and opportunities the American nation faces in education, economics, and the global arena. The authors propose a framework of transformation that would render American culture no…
USDA-ARS?s Scientific Manuscript database
Prokaryotic taxonomy is the underpinning of microbiology, providing a framework for the proper identification and naming of organisms. The 'gold standard' of bacterial species delineation is the overall genome similarity as determined by DNA-DNA hybridization (DDH), a technically rigorous yet someti...
USDA-ARS?s Scientific Manuscript database
To ensure current land use strategies and management practices are economically, environmentally, and socially sustainable, tools and techniques for assessing and quantifying changes in soil quality/health (SQ) need to be developed through rigorous research and potential use by consultants, and othe...
Cash on Demand: A Framework for Managing a Cash Liquidity Position.
ERIC Educational Resources Information Center
Augustine, John H.
1995-01-01
A well-run college or university will seek to accumulate and maintain an appropriate cash reserve or liquidity position. A rigorous analytic process for estimating the size and cost of a liquidity position, based on judgments about the institution's operating risks and opportunities, is outlined. (MSE)
ERIC Educational Resources Information Center
Joan Herman; Robert Linn
2014-01-01
Researching. Synthesizing. Reasoning with evidence. The PARCC and Smarter Balanced assessments are clearly setting their sights on complex thinking skills. Researchers Joan Herman and Robert Linn look at the new assessments to see how they stack up against Norman Webb's depth of knowledge framework as well as against current state tests. The…
Evaluating Computer-Related Incidents on Campus
ERIC Educational Resources Information Center
Rothschild, Daniel; Rezmierski, Virginia
2004-01-01
The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…
Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach
ERIC Educational Resources Information Center
Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…
McDonnell, J. D.; Schunck, N.; Higdon, D.; ...
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
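The emulator ingredient can be illustrated with scikit-learn's Gaussian process regressor standing in for the study's emulator; the "expensive model" here is a trivial one-parameter function, purely for demonstration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_model(theta):
    """Stand-in for a costly calculation over one model parameter."""
    return np.sin(3 * theta) + 0.5 * theta**2

# A handful of training runs at design points; the emulator interpolates.
theta_train = np.linspace(-2, 2, 9).reshape(-1, 1)
y_train = expensive_model(theta_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), alpha=1e-8)
gp.fit(theta_train, y_train)

theta_new = np.array([[0.37]])
mean, std = gp.predict(theta_new, return_std=True)
print(f"emulated: {mean[0]:.3f} +/- {std[0]:.3f}, "
      f"true: {expensive_model(0.37):.3f}")
# A posterior exploration (e.g. MCMC) can now call gp.predict millions of
# times at negligible cost, with std quantifying the emulation error.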
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wild, M.; Rouhani, S.
1995-02-01
A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistic techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs, long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
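The workhorse behind such spatial analysis is the empirical semivariogram; a minimal sketch on synthetic transect data follows (the "field" is a crude correlated signal, purely for illustration).

import numpy as np

rng = np.random.default_rng(11)

# Synthetic contaminant concentrations along a transect with spatial
# correlation (nearby samples more alike than distant ones).
x = np.sort(rng.uniform(0, 100, 80))
field = np.cumsum(rng.normal(0, 1, 80))              # crude correlated signal
conc = field + rng.normal(0, 0.3, 80)

def semivariogram(x, v, bin_edges):
    """gamma(h): average of (v_i - v_j)^2 / 2 over pairs with |x_i - x_j| in each bin."""
    i, j = np.triu_indices(len(x), k=1)              # each pair counted once
    h = np.abs(x[i] - x[j])
    g = (v[i] - v[j]) ** 2 / 2.0
    which = np.digitize(h, bin_edges)
    return [g[which == b].mean() for b in range(1, len(bin_edges))]

edges = np.arange(0, 60, 10)
for lag, gam in zip(edges[:-1] + 5, semivariogram(x, conc, edges)):
    print(f"lag ~{lag:2.0f}: gamma = {gam:.2f}")
# Rising gamma(h) that levels off reveals the correlation range; samples
# spaced within that range are partly redundant, which is what lets
# geostatistical designs cut sampling without losing information.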
Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander
2012-01-01
We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics in six time orders, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even more chaotic—mixing. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion with no influence of other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily implemented to other data sets to enable quick and accurate analysis of their statistical characteristics. PMID:23199912
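A minimal sketch of one of the reported diagnostics, the ensemble-averaged mean square displacement with a log-log slope fit for the anomalous exponent; the trajectories here are ordinary Brownian stand-ins (a true fBm check would use a dedicated fBm generator and the p-variation statistic, as in the study).

import numpy as np

rng = np.random.default_rng(12)

# Synthetic stand-in trajectories: ordinary Brownian motion (alpha = 1).
# The measured telomeres were subdiffusive, alpha < 1, consistent with fBm.
n_traj, n_steps = 200, 500
steps = rng.normal(0, 1, (n_traj, n_steps, 2))
paths = np.cumsum(steps, axis=1)

lags = np.arange(1, 50)
msd = [np.mean(np.sum((paths[:, lag:] - paths[:, :-lag]) ** 2, axis=2))
       for lag in lags]

# MSD ~ K * t^alpha, so the log-log slope estimates the anomalous exponent.
alpha, log_k = np.polyfit(np.log(lags), np.log(msd), 1)
print(f"estimated anomalous exponent alpha = {alpha:.2f} (expected 1.0 here)")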
A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.
Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao
2015-06-15
ChIP-seq is a powerful technology to measure protein binding or histone modification strength on a whole-genome scale. Although a number of methods are available for single ChIP-seq data analysis (e.g. 'peak detection'), a rigorous statistical method for quantitative comparison of multiple ChIP-seq datasets that accounts for data from control experiments, signal-to-noise ratios, biological variation and multiple-factor experimental designs has been underdeveloped. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then take their union to form a single set of candidate regions. The read counts from the IP experiment at the candidate regions are assumed to follow a Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through a hypothesis-testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results compared with existing ones. An R software package, ChIPComp, is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
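The counting model lends itself to Poisson regression; below is a minimal sketch for a single candidate region, using library size as a stand-in for the experiment-specific artifact term (the actual ChIPComp model is richer). Counts and sizes are invented.

```python
import numpy as np
import statsmodels.api as sm

# Toy IP read counts at one candidate region: two replicates per condition
counts = np.array([85, 92, 140, 155])
condition = np.array([0.0, 0.0, 1.0, 1.0])         # condition indicator
lib_size = np.array([1.0e7, 1.1e7, 0.9e7, 1.0e7])  # artifact proxy (offset)

X = sm.add_constant(condition)
fit = sm.GLM(counts, X, family=sm.families.Poisson(),
             offset=np.log(lib_size)).fit()
# Test on the condition coefficient ~ differential binding at this region
print(fit.params[1], fit.pvalues[1])
```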
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonnell, J. D.; Schunck, N.; Higdon, D.
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
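A stripped-down sketch of the emulate-then-propagate step: fit a Gaussian process to a few runs of an "expensive" model, then push posterior parameter samples through the emulator. The one-parameter model and the posterior below are invented stand-ins for the Skyrme functional and its Bayesian posterior.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(theta):
    """Stand-in for a costly DFT run mapping a parameter to an observable."""
    return np.sin(3.0 * theta) + theta ** 2

rng = np.random.default_rng(2)
theta_train = np.linspace(-1.0, 1.0, 12)[:, None]   # a few training runs
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3))
gp.fit(theta_train, expensive_model(theta_train.ravel()))

# Propagate hypothetical posterior samples of theta through the emulator,
# folding the emulator's own predictive uncertainty into the prediction.
theta_post = rng.normal(0.2, 0.1, size=5000)[:, None]
mean, sd = gp.predict(theta_post, return_std=True)
samples = rng.normal(mean, sd)
print(samples.mean(), samples.std())
```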
Designing A Mixed Methods Study In Primary Care
Creswell, John W.; Fetters, Michael D.; Ivankova, Nataliya V.
2004-01-01
BACKGROUND Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. METHODS We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. RESULTS Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. DISCUSSION We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research. PMID:15053277
Beyond the Quantitative and Qualitative Divide: Research in Art Education as Border Skirmish.
ERIC Educational Resources Information Center
Sullivan, Graeme
1996-01-01
Analyzes a research project that utilizes a coherent conceptual model of art education research incorporating the demand for empirical rigor and providing for diverse interpretive frameworks. Briefly profiles the NUD*IST (Non-numerical Unstructured Data Indexing Searching and Theorizing) software system that can organize and retrieve complex…
An Ex Post Facto Evaluation Framework for Place-Based Police Interventions
ERIC Educational Resources Information Center
Braga, Anthony A.; Hureau, David M.; Papachristos, Andrew V.
2011-01-01
Background: A small but growing body of research evidence suggests that place-based police interventions generate significant crime control gains. While place-based policing strategies have been adopted by a majority of U.S. police departments, very few agencies make a priori commitments to rigorous evaluations. Objective: Recent methodological…
Memory Hazard Functions: A Vehicle for Theory Development and Test
ERIC Educational Resources Information Center
Chechile, Richard A.
2006-01-01
A framework is developed to rigorously test an entire class of memory retention functions by examining hazard properties. Evidence is provided that the memory hazard function is not monotonically decreasing. Yet most of the proposals for retention functions, which have emerged from the psychological literature, imply that memory hazard is…
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
ERIC Educational Resources Information Center
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
The Mauritian Education System: Was There a Will to Anglicize it?
ERIC Educational Resources Information Center
Tirvsssen, Rada
2007-01-01
Clive Whitehead (2005: 315-329) makes an indisputable claim that British colonial education is a controversial topic in the history of education. The macro study of educational systems undertaken within a framework that guarantees a systematic and rigorous approach can offer answers to many disputed issues, but researchers should not underestimate…
ERIC Educational Resources Information Center
Kelcey, Ben; Phelps, Geoffrey
2013-01-01
Despite recent shifts in research emphasizing the value of carefully designed experiments, the number of studies of teacher professional development with rigorous designs has lagged behind its student outcome counterparts. We outline a framework for the design of group randomized trials (GRTs) with teachers' knowledge as the outcome and…
Further Iterations on Using the Problem-Analysis Framework
ERIC Educational Resources Information Center
Annan, Michael; Chua, Jocelyn; Cole, Rachel; Kennedy, Emma; James, Robert; Markusdottir, Ingibjorg; Monsen, Jeremy; Robertson, Lucy; Shah, Sonia
2013-01-01
A core component of applied educational and child psychology practice is the skilfulness with which practitioners are able to rigorously structure and conceptualise complex real world human problems. This is done in such a way that when they (with others) jointly work on them, there is an increased likelihood of positive outcomes being achieved…
USDA-ARS?s Scientific Manuscript database
Accurate species delimitation underpins good taxonomy. Formalisation of integrative taxonomy in the last decade has provided a framework for using multidisciplinary data to increase rigor in species delimitation hypotheses. We address the state of integrative taxonomy by using an international proje...
ERIC Educational Resources Information Center
Bringle, Robert G., Ed.; Hatcher, Julie A., Ed.; Jones, Steven G., Ed.
2010-01-01
This book focuses on conducting research on International Service Learning (ISL), which includes developing and evaluating hypotheses about ISL outcomes and measuring its impact on students, faculty, and communities. The book argues that rigorous research is essential to improving the quality of ISL's implementation and delivery, and providing the…
School Governor Regulation in England's Changing Education Landscape
ERIC Educational Resources Information Center
Baxter, Jacqueline
2017-01-01
The changing education landscape in England, combined with a more rigorous form of governor regulation in the form of the Ofsted 2012 Inspection Framework, are together placing more demands than ever before on the 300,000 volunteer school governors in England. These school governors are, in many cases, directly accountable to the Secretary of…
ERIC Educational Resources Information Center
Broerse, Jacqueline E. W.; de Cock Buning, Tjard; Roelofsen, Anneloes; Bunders, Joske F. G.
2009-01-01
Public engagement is increasingly advocated and applied in the development and implementation of technological innovations. However, initiatives so far are rarely considered effective. There is a need for more methodological rigor and insight into conducive conditions. The authors developed an evaluative framework and assessed accordingly the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
... program is to use 10 key components based on the "Program of Study Design Framework" (e.g., the States' Career Clusters), and offer students the opportunities to earn postsecondary credits... extent to which students are attaining necessary knowledge and skills, we agree that administrators...
A customisable framework for the assessment of therapies in the solution of therapy decision tasks.
Manjarrés Riesco, A; Martínez Tomás, R; Mira Mira, J
2000-01-01
In current medical research, a growing interest can be observed in the definition of a global therapy-evaluation framework which integrates considerations such as patients' preferences and quality-of-life results. In this article, we propose the use of the research results in this domain as a source of knowledge in the design of support systems for therapy decision analysis, in particular with a view to application in oncology. We discuss the incorporation of these considerations in the definition of the therapy-assessment methods involved in the solution of a generic therapy decision task, described in the context of AI software development methodologies such as CommonKADS. The goal of the therapy decision task is to identify the ideal therapy, for a given patient, in accordance with a set of objectives of a diverse nature. The assessment methods applied are based either on data obtained from statistics or on the specific idiosyncrasies of each patient, as identified from their responses to a suite of psychological tests. In the analysis of the therapy decision task we emphasise the importance, from a methodological perspective, of using a rigorous approach to the modelling of domain ontologies and domain-specific data. To this aim we make extensive use of the semi-formal object-oriented analysis notation UML to describe the domain level.
Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa
Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...
The extraction and integration framework: a two-process account of statistical learning.
Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G
2013-07-01
The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved
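The conditional-statistics half of the framework is easy to make concrete. The sketch below computes transitional probabilities over a continuous syllable (here, character) stream built from two made-up words; within-word transitions come out higher than the word-boundary transition.

```python
from collections import Counter

def transitional_probabilities(stream):
    """P(next | current) for adjacent elements of a continuous stream."""
    pairs = Counter(zip(stream, stream[1:]))
    firsts = Counter(stream[:-1])
    return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

stream = list("golabutupiro" * 20)   # two invented words, no pauses
tps = transitional_probabilities(stream)
print(tps[('g', 'o')])   # within-word transition: 1.0
print(tps[('u', 't')])   # word-boundary transition: 0.5
```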
Digital morphogenesis via Schelling segregation
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2018-04-01
Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, with prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
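For readers who want to see the dynamics, here is a deliberately simplified simulation sketch (random serial moves of unhappy agents on a grid with empty cells); note the rigorous results above concern the unperturbed model, which differs in important details from this toy.

```python
import numpy as np

def step(grid, tolerance, rng, tries=2000):
    """Move one randomly found unhappy agent to a random empty cell."""
    n = grid.shape[0]
    for _ in range(tries):
        i, j = rng.integers(n, size=2)
        if grid[i, j] == 0:                       # 0 marks an empty cell
            continue
        patch = grid[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        same = np.sum(patch == grid[i, j]) - 1    # exclude the agent itself
        occupied = np.sum(patch != 0) - 1
        if occupied > 0 and same / occupied < tolerance:
            empties = np.argwhere(grid == 0)
            k, l = empties[rng.integers(len(empties))]
            grid[k, l], grid[i, j] = grid[i, j], 0
            return True
    return False                                   # no unhappy agent found

rng = np.random.default_rng(3)
grid = rng.choice([0, 1, 2], size=(30, 30), p=[0.1, 0.45, 0.45])
while step(grid, tolerance=0.5, rng=rng):
    pass                                           # run until (near) stable
```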
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide for selecting statistical techniques to identify informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
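The core relevance-minus-redundancy trade-off behind MRMR-style selection can be sketched in a few lines (Boot-MRMR additionally wraps this in bootstrap resampling, which is omitted here; the data are simulated).

```python
import numpy as np

def mrmr_scores(X, y, selected):
    """Relevance (|corr with class|) minus mean redundancy with chosen genes."""
    rel = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    if not selected:
        return rel
    red = np.array([np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                             for s in selected]) for j in range(X.shape[1])])
    return rel - red

rng = np.random.default_rng(4)
y = rng.integers(0, 2, size=60).astype(float)   # subject class labels
X = rng.standard_normal((60, 100))              # expression: 60 x 100 genes
X[:, 0] += 2.0 * y                              # plant one informative gene
selected = []
for _ in range(5):                              # greedy forward selection
    scores = mrmr_scores(X, y, selected)
    scores[selected] = -np.inf                  # never re-pick a gene
    selected.append(int(np.argmax(scores)))
print(selected)                                 # gene 0 should appear first
```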
Can power-law scaling and neuronal avalanches arise from stochastic dynamics?
Touboul, Jonathan; Destexhe, Alain
2010-02-11
The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
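The contrast between the two procedures is easy to reproduce: data that are not power-law distributed can still yield a convincing straight line on log-log axes, while a Kolmogorov-Smirnov test against the fitted power law typically rejects. A sketch with simulated lognormal data (sample sizes and bins invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.lognormal(mean=1.0, sigma=1.2, size=5000)   # NOT a power law

# Naive check: a straight-ish line on a log-log histogram "looks" power-law
hist, edges = np.histogram(data, bins=np.logspace(0, 3, 30), density=True)
m = hist > 0
slope = np.polyfit(np.log(edges[:-1][m]), np.log(hist[m]), 1)[0]

# More rigorous: KS test against the maximum-likelihood power-law (Pareto) fit
xmin = data.min()
alpha = 1.0 + len(data) / np.sum(np.log(data / xmin))  # MLE for the exponent
ks = stats.kstest(data, 'pareto', args=(alpha - 1.0, 0.0, xmin))
print(slope, ks.pvalue)   # slope looks plausible; KS p-value is typically tiny
```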
Lyon, Aaron R; Connors, Elizabeth; Jensen-Doss, Amanda; Landes, Sara J; Lewis, Cara C; McLeod, Bryce D; Rutt, Christopher; Stanick, Cameo; Weiner, Bryan J
2017-09-01
The advancement of implementation science is dependent on identifying assessment strategies that can address implementation and clinical outcome variables in ways that are valid, relevant to stakeholders, and scalable. This paper presents a measurement agenda for implementation science that integrates the previously disparate assessment traditions of idiographic and nomothetic approaches. Although idiographic and nomothetic approaches are both used in implementation science, a review of the literature on this topic suggests that their selection can be indiscriminate, driven by convenience, and not explicitly tied to research study design. As a result, they are not typically combined deliberately or effectively. Thoughtful integration may simultaneously enhance both the rigor and relevance of assessments across multiple levels within health service systems. Background on nomothetic and idiographic assessment is provided as well as their potential to support research in implementation science. Drawing from an existing framework, seven structures (of various sequencing and weighting options) and five functions (Convergence, Complementarity, Expansion, Development, Sampling) for integrating conceptually distinct research methods are articulated as they apply to the deliberate, design-driven integration of nomothetic and idiographic assessment approaches. Specific examples and practical guidance are provided to inform research consistent with this framework. Selection and integration of idiographic and nomothetic assessments for implementation science research designs can be improved. The current paper argues for the deliberate application of a clear framework to improve the rigor and relevance of contemporary assessment strategies.
Becan, Jennifer E; Bartkowski, John P; Knight, Danica K; Wiley, Tisha R A; DiClemente, Ralph; Ducharme, Lori; Welsh, Wayne N; Bowser, Diana; McCollister, Kathryn; Hiller, Matthew; Spaulding, Anne C; Flynn, Patrick M; Swartzendruber, Andrea; Dickson, Megan F; Fisher, Jacqueline Horan; Aarons, Gregory A
2018-04-13
This paper describes the means by which a United States National Institute on Drug Abuse (NIDA)-funded cooperative, Juvenile Justice-Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS), utilized an established implementation science framework in conducting a multi-site, multi-research center implementation intervention initiative. The initiative aimed to bolster the ability of juvenile justice agencies to address unmet client needs related to substance use while enhancing inter-organizational relationships between juvenile justice and local behavioral health partners. The EPIS (Exploration, Preparation, Implementation, Sustainment) framework was selected and utilized as the guiding model from inception through project completion; including the mapping of implementation strategies to EPIS stages, articulation of research questions, and selection, content, and timing of measurement protocols. Among other key developments, the project led to a reconceptualization of its governing implementation science framework into cyclical form as the EPIS Wheel. The EPIS Wheel is more consistent with rapid-cycle testing principles and permits researchers to track both progressive and recursive movement through EPIS. Moreover, because this randomized controlled trial was predicated on a bundled strategy method, JJ-TRIALS was designed to rigorously test progress through the EPIS stages as promoted by facilitation of data-driven decision making principles. The project extended EPIS by (1) elucidating the role and nature of recursive activity in promoting change (yielding the circular EPIS Wheel), (2) by expanding the applicability of the EPIS framework beyond a single evidence-based practice (EBP) to address varying process improvement efforts (representing varying EBPs), and (3) by disentangling outcome measures of progression through EPIS stages from the a priori established study timeline. The utilization of EPIS in JJ-TRIALS provides a model for practical and applied use of implementation frameworks in real-world settings that span outer service system and inner organizational contexts in improving care for vulnerable populations. NCT02672150 . Retrospectively registered on 22 January 2016.
Gerber, Brian D; Kendall, William L; Hooten, Mevin B; Dubovsky, James A; Drewien, Roderick C
2015-09-01
1. Prediction is fundamental to scientific enquiry and application; however, ecologists tend to favour explanatory modelling. We discuss a predictive modelling framework to evaluate ecological hypotheses and to explore novel/unobserved environmental scenarios to assist conservation and management decision-makers. We apply this framework to develop an optimal predictive model for juvenile (<1 year old) sandhill crane Grus canadensis recruitment of the Rocky Mountain Population (RMP). We consider spatial climate predictors motivated by hypotheses of how drought across multiple time-scales and spring/summer weather affects recruitment. 2. Our predictive modelling framework focuses on developing a single model that includes all relevant predictor variables, regardless of collinearity. This model is then optimized for prediction by controlling model complexity using a data-driven approach that marginalizes or removes irrelevant predictors from the model. Specifically, we highlight two approaches of statistical regularization, Bayesian least absolute shrinkage and selection operator (LASSO) and ridge regression. 3. Our optimal predictive Bayesian LASSO and ridge regression models were similar and on average 37% superior in predictive accuracy to an explanatory modelling approach. Our predictive models confirmed a priori hypotheses that drought and cold summers negatively affect juvenile recruitment in the RMP. The effects of long-term drought can be alleviated by short-term wet spring-summer months; however, the alleviation of long-term drought has a much greater positive effect on juvenile recruitment. The number of freezing days and snowpack during the summer months can also negatively affect recruitment, while spring snowpack has a positive effect. 4. Breeding habitat, mediated through climate, is a limiting factor on population growth of sandhill cranes in the RMP, which could become more limiting with a changing climate (i.e. increased drought). These effects are likely not unique to cranes. The alteration of hydrological patterns and water levels by drought may impact many migratory, wetland nesting birds in the Rocky Mountains and beyond. 5. Generalizable predictive models (trained by out-of-sample fit and based on ecological hypotheses) are needed by conservation and management decision-makers. Statistical regularization improves predictions and provides a general framework for fitting models with a large number of predictors, even those with collinearity, to simultaneously identify an optimal predictive model while conducting rigorous Bayesian model selection. Our framework is important for understanding population dynamics under a changing climate and has direct applications for making harvest and habitat management decisions. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
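Point 2 above is essentially penalized regression; a compact frequentist analogue of the Bayesian ridge used in the paper can be sketched with cross-validated ridge regression (predictor names and data are invented, and the paper's Bayesian LASSO/ridge machinery is not reproduced here).

```python
import numpy as np
from sklearn.linear_model import RidgeCV

rng = np.random.default_rng(6)
n = 40
drought = rng.standard_normal(n)
spring_wet = -0.8 * drought + 0.2 * rng.standard_normal(n)  # collinear pair
cold_days = rng.standard_normal(n)
X = np.column_stack([drought, spring_wet, cold_days])
recruitment = 0.5 - 0.6 * drought - 0.3 * cold_days + 0.2 * rng.standard_normal(n)

# The penalty shrinks unstable coefficients instead of dropping predictors;
# its weight is picked by (efficient leave-one-out) cross-validation.
model = RidgeCV(alphas=np.logspace(-3, 3, 25)).fit(X, recruitment)
print(model.alpha_, model.coef_)
```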
An Algorithm Framework for Isolating Anomalous Signals in Electromagnetic Data
NASA Astrophysics Data System (ADS)
Kappler, K. N.; Schneider, D.; Bleier, T.; MacLean, L. S.
2016-12-01
QuakeFinder and its international collaborators have installed and currently maintain an array of 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. Based on research by Bleier et al. (2009), Fraser-Smith et al. (1990), and Freund (2007), the electromagnetic data from these instruments are being analyzed for pre-earthquake signatures. This analysis consists of both private research by QuakeFinder, and institutional collaborators (PUCP in Peru, NCU in Taiwan, NOA in Greece, LASP at University of Colorado, Stanford, UCLA, NASA-ESI, NASA-AMES and USC-CSEP). QuakeFinder has developed an algorithm framework aimed at isolating anomalous signals (pulses) in the time series. Results are presented from an application of this framework to induction-coil magnetometer data. Our data driven approach starts with sliding windows applied to uniformly resampled array data with a variety of lengths and overlap. Data variance (a proxy for energy) is calculated on each window and a short-term average/ long-term average (STA/LTA) filter is applied to the variance time series. Pulse identification is done by flagging time intervals in the STA/LTA filtered time series which exceed a threshold. Flagged time intervals are subsequently fed into a feature extraction program which computes statistical properties of the resampled data. These features are then filtered using a Principal Component Analysis (PCA) based method to cluster similar pulses. We explore the extent to which this approach categorizes pulses with known sources (e.g. cars, lightning, etc.) and the remaining pulses of unknown origin can be analyzed with respect to their relationship with seismicity. We seek a correlation between these daily pulse-counts (with known sources removed) and subsequent (days to weeks) seismic events greater than M5 within 15km radius. Thus we explore functions which map daily pulse-counts to a time series representing the likelihood of a seismic event occurring at some future time. These "pseudo-probabilities" can in turn be represented as Molchan diagrams. The Molchan curve provides an effective cost function for optimization and allows for a rigorous statistical assessment of the validity of pre-earthquake signals in the electromagnetic data.
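A bare-bones version of the variance/STA-LTA stage is shown below on a synthetic trace with one injected pulse; window lengths and the threshold are illustrative choices, not QuakeFinder's operational settings.

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Ratio of short-term to long-term moving averages of a series."""
    sta = np.convolve(x, np.ones(n_sta) / n_sta, mode='same')
    lta = np.convolve(x, np.ones(n_lta) / n_lta, mode='same')
    return sta / np.maximum(lta, 1e-12)

rng = np.random.default_rng(7)
trace = rng.standard_normal(5000)
trace[2400:2420] += 8.0                       # injected anomalous pulse

win = 50                                      # variance window (samples)
var = np.array([trace[i:i + win].var() for i in range(len(trace) - win)])
ratio = sta_lta(var, n_sta=5, n_lta=200)
flagged = np.where(ratio > 3.0)[0]            # candidate pulse intervals
print(flagged.min(), flagged.max())           # should bracket sample ~2400
```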
Specifying the behavior of concurrent systems
NASA Technical Reports Server (NTRS)
Furtek, F. C.
1984-01-01
A framework for rigorously specifying the behavior of concurrent systems is proposed. It is based on the view of a concurrent system as a collection of interacting processes but no assumptions are made about the mechanisms for process synchronization and communication. A formal language is described that permits the expression of a broad range of logical and timing dependencies.
Changes in Residents' Self-Efficacy Beliefs in a Clinically Rich Graduate Teacher Education Program
ERIC Educational Resources Information Center
Reynolds, Heather M.; Wagle, A. Tina; Mahar, Donna; Yannuzzi, Leigh; Tramonte, Barbara; King, Joseph
2016-01-01
Increasing the clinical preparation of teachers in the United States to meet greater rigor in K-12 education has become a goal of institutions of higher education, especially since the publication of the National Council for the Accreditation of Teacher Education Blue Ribbon Panel Report on Clinical Practice. Using a theoretical framework grounded…
Climate Change: Creating an Integrated Framework for Improving School Climate
ERIC Educational Resources Information Center
Alliance for Excellent Education, 2013
2013-01-01
This report from the Alliance finds that schools that struggle most with providing a positive school climate more often disproportionately serve students of color and low-income students. It also confirms that students of color and students from low-income families are less likely to have access to rigorous course work and experienced teachers,…
Education in Emergencies: A Review of Theory and Research
ERIC Educational Resources Information Center
Burde, Dana; Kapit, Amy; Wahl, Rachel L.; Guven, Ozen; Skarpeteig, Margot Igland
2017-01-01
In this article, we conduct an integrative and rigorous review of theory and research on education in emergencies programs and interventions as international agencies implement them in areas of armed conflict. We ask several questions. How did this subfield emerge and what are the key conceptual frameworks that shape it today? How do education in…
ERIC Educational Resources Information Center
Gallagher, Carole; Rabinowitz, Stanley; Yeagley, Pamela
2011-01-01
Researchers recommend that policymakers use data from multiple sources when making decisions that have high-stakes consequences (Herman, Baker, & Linn, 2004; Linn, 2007; Stone & Lane, 2003). For this reason, a fair but rigorous teacher-effectiveness rating process relies on evidence collected from different sources (Goe, Bell, & Little, 2008;…
Integrated model development for liquid fueled rocket propulsion systems
NASA Technical Reports Server (NTRS)
Santi, L. Michael
1993-01-01
As detailed in the original statement of work, the objective of phase two of this research effort was to develop a general framework for rocket engine performance prediction that integrates physical principles, a rigorous mathematical formalism, component level test data, system level test data, and theory-observation reconciliation. Specific phase two development tasks are defined.
ERIC Educational Resources Information Center
Chatterji, Madhabi; Kwon, Young Ae; Sng, Clarice
2006-01-01
The No Child Left Behind (NCLB) Act of 2001 requires that public schools adopt research-supported programs and practices, with a strong recommendation for randomized controlled trials (RCTs) as the "gold standard" for scientific rigor in empirical research. Within that policy framework, this paper compares the relative utility of…
ERIC Educational Resources Information Center
Camara, Wayne, Ed.; O'Connor, Ryan, Ed.; Mattern, Krista, Ed.; Hanson, Mary Ann, Ed.
2015-01-01
Colleges have long recognized the importance of multiple domains. Admissions officers look to high school grades as indicators of persistence and achievement; student statements and letters of recommendation as indicators of character, behavior, and adaptability; the rigor of courses completed in high school as evidence of effort, motivation, and…
Decision support frameworks and tools for conservation
Schwartz, Mark W.; Cook, Carly N.; Pressey, Robert L.; Pullin, Andrew S.; Runge, Michael C.; Salafsky, Nick; Sutherland, William J.; Williamson, Matthew A.
2018-01-01
The practice of conservation occurs within complex socioecological systems fraught with challenges that require transparent, defensible, and often socially engaged project planning and management. Planning and decision support frameworks are designed to help conservation practitioners increase planning rigor, project accountability, stakeholder participation, transparency in decisions, and learning. We describe and contrast five common frameworks within the context of six fundamental questions (why, who, what, where, when, how) at each of three planning stages of adaptive management (project scoping, operational planning, learning). We demonstrate that decision support frameworks provide varied and extensive tools for conservation planning and management. However, using any framework in isolation risks diminishing potential benefits since no one framework covers the full spectrum of potential conservation planning and decision challenges. We describe two case studies that have effectively deployed tools from across conservation frameworks to improve conservation actions and outcomes. Attention to the critical questions for conservation project planning should allow practitioners to operate within any framework and adapt tools to suit their specific management context. We call on conservation researchers and practitioners to regularly use decision support tools as standard practice for framing both practice and research.
An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Goebel, Kai
2014-01-01
This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics, that can rigorously account for the various sources of uncertainty, is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking, and then, the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.
Bringing scientific rigor to community-developed programs in Hong Kong.
Fabrizio, Cecilia S; Hirschmann, Malia R; Lam, Tai Hing; Cheung, Teresa; Pang, Irene; Chan, Sophia; Stewart, Sunita M
2012-12-31
This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.
DOT National Transportation Integrated Search
1996-04-01
This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.
Testing for Mutagens Using Fruit Flies.
ERIC Educational Resources Information Center
Liebl, Eric C.
1998-01-01
Describes a laboratory employed in undergraduate teaching that uses fruit flies to test student-selected compounds for their ability to cause mutations. Requires no prior experience with fruit flies, incorporates a student design component, and employs both rigorous controls and statistical analyses. (DDR)
On testing for spatial correspondence between maps of human brain structure and function.
Alexander-Bloch, Aaron F; Shou, Haochang; Liu, Siyuan; Satterthwaite, Theodore D; Glahn, David C; Shinohara, Russell T; Vandekar, Simon N; Raznahan, Armin
2018-06-01
A critical issue in many neuroimaging studies is the comparison between brain maps. Nonetheless, it remains unclear how one should test hypotheses focused on the overlap or spatial correspondence between two or more brain maps. This "correspondence problem" affects, for example, the interpretation of comparisons between task-based patterns of functional activation, resting-state networks or modules, and neuroanatomical landmarks. To date, this problem has been addressed with remarkable variability in terms of methodological approaches and statistical rigor. In this paper, we address the correspondence problem using a spatial permutation framework to generate null models of overlap by applying random rotations to spherical representations of the cortical surface, an approach for which we also provide a theoretical statistical foundation. We use this method to derive clusters of cognitive functions that are correlated in terms of their functional neuroanatomical substrates. In addition, using publicly available data, we formally demonstrate the correspondence between maps of task-based functional activity, resting-state fMRI networks and gyral-based anatomical landmarks. We provide open-access code to implement the methods presented for two commonly-used tools for surface-based cortical analysis (https://www.github.com/spin-test). This spatial permutation approach constitutes a useful advance over widely-used methods for the comparison of cortical maps, thereby opening new possibilities for the integration of diverse neuroimaging data. Copyright © 2018 Elsevier Inc. All rights reserved.
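The essence of the spin test can be sketched without surface meshes: rotate one map's coordinates on the sphere, resample it at the original locations by nearest neighbour, and rebuild the correlation under the null. Vertex counts and the maps below are simulated; the real method operates on registered cortical-surface spheres.

```python
import numpy as np

def random_rotation(rng):
    """Random 3-D rotation via QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.standard_normal((3, 3)))
    q = q * np.sign(np.diag(r))
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]                    # force a proper rotation
    return q

rng = np.random.default_rng(8)
n = 1000
verts = rng.standard_normal((n, 3))
verts /= np.linalg.norm(verts, axis=1, keepdims=True)   # spherical vertices
map_a = verts[:, 0] + 0.3 * rng.standard_normal(n)      # two spatially
map_b = verts[:, 0] + 0.3 * rng.standard_normal(n)      # smooth maps

observed = np.corrcoef(map_a, map_b)[0, 1]
null = []
for _ in range(1000):
    rotated = verts @ random_rotation(rng).T
    nearest = np.argmax(verts @ rotated.T, axis=1)  # resample rotated map
    null.append(np.corrcoef(map_a[nearest], map_b)[0, 1])
p = (1 + np.sum(np.abs(null) >= abs(observed))) / (1 + len(null))
print(observed, p)   # spatial smoothness is preserved under the null
```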
Evaluation of NMME temperature and precipitation bias and forecast skill for South Asia
NASA Astrophysics Data System (ADS)
Cash, Benjamin A.; Manganello, Julia V.; Kinter, James L.
2017-08-01
Systematic error and forecast skill for temperature and precipitation in two regions of Southern Asia are investigated using hindcasts initialized May 1 from the North American Multi-Model Ensemble. We focus on two contiguous but geographically and dynamically diverse regions: the Extended Indian Monsoon Rainfall (70-100E, 10-30 N) and the nearby mountainous area of Pakistan and Afghanistan (60-75E, 23-39 N). Forecast skill is assessed using the Sign test framework, a rigorous statistical method that can be applied to non-Gaussian variables such as precipitation and to different ensemble sizes without introducing bias. We find that models show significant systematic error in both precipitation and temperature for both regions. The multi-model ensemble mean (MMEM) consistently yields the lowest systematic error and the highest forecast skill for both regions and variables. However, we also find that the MMEM consistently provides a statistically significant increase in skill over climatology only in the first month of the forecast. While the MMEM tends to provide higher overall skill than climatology later in the forecast, the differences are not significant at the 95% level. We also find that MMEMs constructed with a relatively small number of ensemble members per model can equal or outperform MMEMs constructed with more members in skill. This suggests some ensemble members either provide no contribution to overall skill or even detract from it.
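The Sign test itself is simple: count the years in which one forecast beats the reference and compare against a fair coin. A minimal sketch with simulated verification data (the actual NMME evaluation compares ensembles, not the single series used here):

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(9)
n_years = 30
truth = rng.standard_normal(n_years)
forecast = truth + 0.7 * rng.standard_normal(n_years)  # a skilful-ish model
climatology = np.zeros(n_years)                        # reference forecast

# Under "no added skill", wins ~ Binomial(n_years, 0.5)
wins = int(np.sum(np.abs(forecast - truth) < np.abs(climatology - truth)))
print(wins, binomtest(wins, n_years, 0.5, alternative='greater').pvalue)
```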
Rigor Mortis: Statistical thoroughness in reporting and the making of truth.
Tal, Aner
2016-02-01
Should a uniform checklist be adopted for methodological and statistical reporting? The current article discusses this notion, with particular attention to the use of old versus new statistics, and a consideration of the arguments brought up by Von Roten. The article argues that an overly exhaustive checklist that is uniformly applied to all submitted papers may be unsuitable for multidisciplinary work, and would further result in undue clutter and potentially distract reviewers from pertinent considerations in their evaluation of research articles. © The Author(s) 2015.
Zonation in the deep benthic megafauna : Application of a general test.
Gardiner, Frederick P; Haedrich, Richard L
1978-01-01
A test based on Maxwell-Boltzmann statistics, instead of the formerly suggested but inappropriate Bose-Einstein statistics (Pielou and Routledge, 1976), examines the distribution of the boundaries of species' ranges distributed along a gradient, and indicates whether they are random or clustered (zoned). The test is most useful as a preliminary to the application of more instructive but less statistically rigorous methods such as cluster analysis. The test indicates zonation is marked in the deep benthic megafauna living between 200 and 3000 m, but below 3000 m little zonation may be found.
High-order asynchrony-tolerant finite difference schemes for partial differential equations
NASA Astrophysics Data System (ADS)
Aditya, Konduri; Donzis, Diego A.
2017-12-01
Synchronizations of processing elements (PEs) in massively parallel simulations, which arise due to communication or load imbalances between PEs, significantly affect the scalability of scientific applications. We have recently proposed a method based on finite-difference schemes to solve partial differential equations in an asynchronous fashion - synchronization between PEs is relaxed at a mathematical level. While standard schemes can maintain their stability in the presence of asynchrony, their accuracy is drastically affected. In this work, we present a general methodology to derive asynchrony-tolerant (AT) finite difference schemes of arbitrary order of accuracy, which can maintain their accuracy when synchronizations are relaxed. We show that there are several choices available in selecting a stencil to derive these schemes and discuss their effect on numerical and computational performance. We provide a simple classification of schemes based on the stencil and derive schemes that are representative of different classes. Their numerical error is rigorously analyzed within a statistical framework to obtain the overall accuracy of the solution. Results from numerical experiments are used to validate the performance of the schemes.
Dynamics of two-phase interfaces and surface tensions: A density-functional theory perspective
NASA Astrophysics Data System (ADS)
Yatsyshin, Petr; Sibley, David N.; Duran-Olivencia, Miguel A.; Kalliadasis, Serafim
2016-11-01
Classical density functional theory (DFT) is a statistical mechanical framework for the description of fluids at the nanoscale, where the inhomogeneity of the fluid structure needs to be carefully accounted for. By expressing the grand free-energy of the fluid as a functional of the one-body density, DFT offers a theoretically consistent and computationally accessible way to obtain two-phase interfaces and respective interfacial tensions in a ternary solid-liquid-gas system. The dynamic version of DFT (DDFT) can be rigorously derived from the Smoluchowski picture of the dynamics of colloidal particles in a solvent. It is generally agreed that DDFT can capture the diffusion-driven evolution of many soft-matter systems. In this context, we use DDFT to investigate the dynamic behaviour of two-phase interfaces in both equilibrium and dynamic wetting and discuss the possibility of defining a time-dependent surface tension, which remains under debate. We acknowledge financial support from the European Research Council via Advanced Grant No. 247031 and from the Engineering and Physical Sciences Research Council of the UK via Grants No. EP/L027186 and EP/L020564.
Dynamo-based scheme for forecasting the magnitude of solar activity cycles
NASA Technical Reports Server (NTRS)
Layden, A. C.; Fox, P. A.; Howard, J. M.; Sarajedini, A.; Schatten, K. H.
1991-01-01
This paper presents a general framework for forecasting the smoothed maximum level of solar activity in a given cycle, based on a simple understanding of the solar dynamo. This type of forecasting requires knowledge of the sun's polar magnetic field strength at the preceding activity minimum. Because direct measurements of this quantity are difficult to obtain, the quality of a number of proxy indicators already used by other authors is evaluated, which are physically related to the sun's polar field. These indicators are subjected to a rigorous statistical analysis, and the analysis technique for each indicator is specified in detail in order to simplify and systematize reanalysis for future use. It is found that several of these proxies are in fact poorly correlated or uncorrelated with solar activity, and thus are of little value for predicting activity maxima. Also presented is a scheme in which the predictions of the individual proxies are combined via an appropriately weighted mean to produce a compound prediction. The scheme is then applied to the current cycle 22, and a maximum smoothed international sunspot number of 171 ± 26 is estimated.
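One natural reading of the "appropriately weighted mean" is inverse-variance weighting, sketched below with invented proxy predictions; the authors' actual weights come from each proxy's own regression statistics.

```python
import numpy as np

# Hypothetical proxy predictions of the cycle maximum, each with a
# 1-sigma uncertainty from its own calibration against past cycles.
predictions = np.array([150.0, 185.0, 170.0, 160.0])
sigmas = np.array([30.0, 25.0, 40.0, 20.0])

w = 1.0 / sigmas ** 2                        # inverse-variance weights
compound = np.sum(w * predictions) / np.sum(w)
compound_sigma = np.sqrt(1.0 / np.sum(w))    # uncertainty of the weighted mean
print(round(compound, 1), round(compound_sigma, 1))
```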
A strategy to estimate unknown viral diversity in mammals.
Anthony, Simon J; Epstein, Jonathan H; Murray, Kris A; Navarrete-Macias, Isamara; Zambrana-Torrelio, Carlos M; Solovyov, Alexander; Ojeda-Flores, Rafael; Arrigo, Nicole C; Islam, Ariful; Ali Khan, Shahneaz; Hosseini, Parviez; Bogich, Tiffany L; Olival, Kevin J; Sanchez-Leon, Maria D; Karesh, William B; Goldstein, Tracey; Luby, Stephen P; Morse, Stephen S; Mazet, Jonna A K; Daszak, Peter; Lipkin, W Ian
2013-09-03
The majority of emerging zoonoses originate in wildlife, and many are caused by viruses. However, there are no rigorous estimates of total viral diversity (here termed "virodiversity") for any wildlife species, despite the utility of this to future surveillance and control of emerging zoonoses. In this case study, we repeatedly sampled a mammalian wildlife host known to harbor emerging zoonotic pathogens (the Indian Flying Fox, Pteropus giganteus) and used PCR with degenerate viral family-level primers to discover and analyze the occurrence patterns of 55 viruses from nine viral families. We then adapted statistical techniques used to estimate biodiversity in vertebrates and plants and estimated the total viral richness of these nine families in P. giganteus to be 58 viruses. Our analyses demonstrate proof-of-concept of a strategy for estimating viral richness and provide the first statistically supported estimate of the number of undiscovered viruses in a mammalian host. We used a simple extrapolation to estimate that there are a minimum of 320,000 mammalian viruses awaiting discovery within these nine families, assuming all species harbor a similar number of viruses, with minimal turnover between host species. We estimate the cost of discovering these viruses to be ~$6.3 billion (or ~$1.4 billion for 85% of the total diversity), which if annualized over a 10-year study time frame would represent a small fraction of the cost of many pandemic zoonoses. Recent years have seen a dramatic increase in viral discovery efforts. However, most lack rigorous systematic design, which limits our ability to understand viral diversity and its ecological drivers and reduces their value to public health intervention. Here, we present a new framework for the discovery of novel viruses in wildlife and use it to make the first-ever estimate of the number of viruses that exist in a mammalian host. As pathogens continue to emerge from wildlife, this estimate allows us to put preliminary bounds around the potential size of the total zoonotic pool and facilitates a better understanding of where best to allocate resources for the subsequent discovery of global viral diversity.
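Richness estimators of the kind adapted here are short to state; below is a sketch of the classic Chao1 lower bound on total richness from detection frequencies (the counts are invented, and the paper's estimator and correction terms may differ).

```python
def chao1(counts):
    """Chao1 lower-bound estimate of total richness from detection counts."""
    observed = sum(1 for c in counts if c > 0)
    f1 = sum(1 for c in counts if c == 1)    # viruses seen exactly once
    f2 = sum(1 for c in counts if c == 2)    # viruses seen exactly twice
    if f2 == 0:
        return observed + f1 * (f1 - 1) / 2.0
    return observed + f1 ** 2 / (2.0 * f2)

# Hypothetical detections per discovered virus across repeated sampling
detections = [14, 9, 7, 5, 5, 3, 2, 2, 1, 1, 1]
print(chao1(detections))   # estimate includes not-yet-discovered viruses
```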
Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J
2017-01-01
Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.
Model-based analysis of pattern motion processing in mouse primary visual cortex
Muir, Dylan R.; Roth, Morgane M.; Helmchen, Fritjof; Kampa, Björn M.
2015-01-01
Neurons in sensory areas of neocortex exhibit responses tuned to specific features of the environment. In visual cortex, information about features such as edges or textures with particular orientations must be integrated to recognize a visual scene or object. Connectivity studies in rodent cortex have revealed that neurons make specific connections within sub-networks sharing common input tuning. In principle, this sub-network architecture enables local cortical circuits to integrate sensory information. However, whether feature integration indeed occurs locally in rodent primary sensory areas has not been examined directly. We studied local integration of sensory features in primary visual cortex (V1) of the mouse by presenting drifting grating and plaid stimuli, while recording the activity of neuronal populations with two-photon calcium imaging. Using a Bayesian model-based analysis framework, we classified single-cell responses as being selective for either individual grating components or for moving plaid patterns. Rather than relying on trial-averaged responses, our model-based framework takes into account single-trial responses and can easily be extended to consider any number of arbitrary predictive models. Our analysis method was able to successfully classify significantly more responses than traditional partial correlation (PC) analysis, and provides a rigorous statistical framework to rank any number of models and reject poorly performing models. We also found a large proportion of cells that respond strongly to only one stimulus class. In addition, a quarter of selectively responding neurons had more complex responses that could not be explained by any simple integration model. Our results show that a broad range of pattern integration processes already take place at the level of V1. This diversity of integration is consistent with processing of visual inputs by local sub-networks within V1 that are tuned to combinations of sensory features. PMID:26300738
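The model-ranking idea in this abstract can be illustrated compactly. Below is a minimal sketch, not the authors' code: two hypothetical tuning curves stand in for "component" and "pattern" models, single-trial responses are scored by Gaussian log-likelihood under an assumed noise scale, and the higher-likelihood model wins. All names and numbers are invented for illustration.

```python
# A minimal sketch of single-trial model ranking, not the authors' exact
# framework: two hypothetical tuning models ("component" vs. "pattern")
# are scored by Gaussian log-likelihood on single-trial responses.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
directions = np.arange(0, 360, 30)                  # stimulus directions (deg)
trials = rng.normal(np.cos(np.radians(directions)) + 1.0, 0.3,
                    size=(20, directions.size))     # fake single-trial responses

def log_likelihood(pred, data, sigma):
    """Summed Gaussian log-likelihood of single-trial data under a model."""
    return norm.logpdf(data, loc=pred, scale=sigma).sum()

# Hypothetical predictions of two candidate models at each direction
component_pred = np.cos(np.radians(directions)) + 1.0
pattern_pred = np.cos(np.radians(directions - 60)) + 1.0

sigma = 0.3                                         # assumed noise scale
ll = {name: log_likelihood(pred, trials, sigma)
      for name, pred in [("component", component_pred),
                         ("pattern", pattern_pred)]}
# With equal parameter counts this reduces to a likelihood-ratio comparison.
best = max(ll, key=ll.get)
print(ll, "->", best)
```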
DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES
Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...
Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems
NASA Astrophysics Data System (ADS)
Sikkandar Basha, Nazareen
The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams at numerous levels of the organization, with interactions among large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interaction are required to elicit the requirements of the system within the organization. Since LSCES involve people and interactions between teams and interdisciplinary departments, they are socio-technical in nature. The elicitation of requirements for most large-scale system projects is subject to creep in time and cost because of the uncertainty and ambiguity of requirements during design and development. In an organization, cost and time overruns can occur at any level and iterate back and forth, further increasing cost and time. Past research has shown that rigorous approaches such as value-based design can be used to control such creep, but before rigorous approaches can be applied, the decision maker needs a proper understanding of requirements creep and of the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework, in the design and development of LSCES. It can aid understanding of the system and decision making, minimizing the value gap due to requirements creep by eliminating ambiguity that arises during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep in terms of cost and time using the Cynefin framework. These trials are repeated for different requirements and at different sub-system levels. The results show that the Cynefin framework can be used to improve the value of the system and can be used for predictive analysis. Decision makers can apply these findings, together with rigorous approaches, to improve the design of Large-Scale Complex Engineered Systems.
Hayat, Matthew J; Schmiege, Sarah J; Cook, Paul F
2014-04-01
Statistics knowledge is essential for understanding the nursing and health care literature, as well as for applying rigorous science in nursing research. Statistical consultants providing services to faculty and students in an academic nursing program have the opportunity to identify gaps and challenges in statistics education for nursing students. This information may be useful to curriculum committees and statistics educators. This article aims to provide perspective on statistics education stemming from the experiences of three experienced statistics educators who regularly collaborate and consult with nurse investigators. The authors share their knowledge and express their views about data management, data screening and manipulation, statistical software, types of scientific investigation, and advanced statistical topics not covered in the usual coursework. The suggestions provided promote a call for data to study these topics. Relevant data about statistics education can assist educators in developing comprehensive statistics coursework for nursing students. Copyright 2014, SLACK Incorporated.
A Theoretical Framework for Lagrangian Descriptors
NASA Astrophysics Data System (ADS)
Lopesino, C.; Balibrea-Iniesta, F.; García-Garrido, V. J.; Wiggins, S.; Mancho, A. M.
This paper provides a theoretical background for Lagrangian Descriptors (LDs). The goal of achieving rigorous proofs that justify the ability of LDs to detect invariant manifolds is simplified by introducing an alternative definition for LDs. The definition is stated for n-dimensional systems with general time dependence, however we rigorously prove that this method reveals the stable and unstable manifolds of hyperbolic points in four particular 2D cases: a hyperbolic saddle point for linear autonomous systems, a hyperbolic saddle point for nonlinear autonomous systems, a hyperbolic saddle point for linear nonautonomous systems and a hyperbolic saddle point for nonlinear nonautonomous systems. We also discuss further rigorous results which show the ability of LDs to highlight additional invariants sets, such as n-tori. These results are just a simple extension of the ergodic partition theory which we illustrate by applying this methodology to well-known examples, such as the planar field of the harmonic oscillator and the 3D ABC flow. Finally, we provide a thorough discussion on the requirement of the objectivity (frame-invariance) property for tools designed to reveal phase space structures and their implications for Lagrangian descriptors.
Methodological Developments in Geophysical Assimilation Modeling
NASA Astrophysics Data System (ADS)
Christakos, George
2005-06-01
This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to investigate critical issues related to knowledge reliability, such as uncertainty due to model structure error (conceptual uncertainty).
NASA Astrophysics Data System (ADS)
Harter, T.; Davis, R.; Smart, D. R.; Brown, P. H.; Dzurella, K.; Bell, A.; Kourakos, G.
2017-12-01
Nutrient fluxes to groundwater have been subject to regulatory assessment and control only in a limited number of countries, including those in the European Union, where the Water Framework Directive requires member countries to manage groundwater basins toward achieving "good status", and California, where irrigated lands will be subject to permitting, stringent nutrient monitoring requirements, and development of practices that are protective of groundwater. However, research activities to rigorously assess agricultural practices for their impact on groundwater have been limited, focusing instead on surface water protection. For groundwater-related assessment of agricultural practices, a wide range of modeling tools has been employed: vulnerability studies, nitrogen mass balance assessments, crop-soil-system models, and various statistical tools. These tools are predominantly used to identify high risk regions, practices, or crops. Here we present the development of a field site for rigorous in-situ evaluation of water and nutrient management practices in an irrigated agricultural setting. Integrating groundwater monitoring into agricultural practice assessment requires large research plots (on the order of 10s to 100s of hectares) and multi-year research time-frames - much larger than typical agricultural field research plots. Almonds are among the most common crops in California with intensive use of nitrogen fertilizer and were selected for their high water quality improvement potential. Availability of an orchard site with relatively vulnerable groundwater conditions (sandy soils, water table depth less than 10 m) was also important in site selection. Initial results show that shallow groundwater concentrations are commensurate with nitrogen leaching estimates obtained by considering historical, long-term field nitrogen mass balance and groundwater dynamics.
A Voice-Based E-Examination Framework for Visually Impaired Students in Open and Distance Learning
ERIC Educational Resources Information Center
Azeta, Ambrose A.; Inam, Itorobong A.; Daramola, Olawande
2018-01-01
Voice-based systems allow users access to information on the internet over a voice interface. Prior studies on Open and Distance Learning (ODL) e-examination systems that make use of voice interface do not sufficiently exhibit intelligent form of assessment, which diminishes the rigor of examination. The objective of this paper is to improve on…
ERIC Educational Resources Information Center
Wößmann, Ludger; Lüdemann, Elke; Schütz, Gabriela; West, Martin R.
2007-01-01
Accountability, autonomy, and choice play a leading role in recent school reforms in many countries. This report provides new evidence on whether students perform better in school systems that have such institutional measures in place. We implement an internationally comparative approach within a rigorous micro-econometric framework that accounts…
ERIC Educational Resources Information Center
McCready, John W.
2010-01-01
The purpose of this study was to examine use of decision-making tools and feedback in strategic planning in order to develop a rigorous process that would promote the efficiency of strategic planning for acquisitions in the United States Coast Guard (USCG). Strategic planning is critical to agencies such as the USCG in order to be effective…
A Reduced Basis Method with Exact-Solution Certificates for Symmetric Coercive Equations
2013-11-06
the energy associated with the infinite-dimensional weak solution of parametrized symmetric coercive partial differential equations with piecewise...builds bounds with respect to the infinite-dimensional weak solution, aims to entirely remove the issue of the "truth" within the certified reduced basis...framework. We in particular introduce a reduced basis method that provides rigorous upper and lower bounds
ERIC Educational Resources Information Center
Brodersen, R. Marc; Yanoski, David; Hyslop, Alisha; Imperatore, Catherine
2016-01-01
Career and technical education (CTE) programs of study are subject to rigorous state and federal accountability systems that provide information on key student outcomes. However, while these outcome measures can form a basis for identifying high- and low-performing programs, they are insufficient for answering underlying questions about how or why…
ERIC Educational Resources Information Center
Council of the Great City Schools, 2017
2017-01-01
In the ongoing effort to improve instructional standards in our nation's urban public schools, the Council of the Great City Schools has released resources to help districts determine the quality and alignment of instructional materials at each grade level; to ensure that materials for English language learners are rigorous and aligned to district…
Exploring the Influence of 21st Century Skills in a Dual Language Program: A Case Study
ERIC Educational Resources Information Center
Heinrichs, Christine R.
2016-01-01
Preparing students as 21st century learners is a key reform in education. The Partnership for 21st Century Skills developed a framework that identifies outcomes needed for successful implementation of rigorous standards. The Dual Language (DL) program was identified as a structure for reform with systems and practices which can be used to prepare…
Principals in the Pipeline: Districts Construct a Framework to Develop School Leadership
ERIC Educational Resources Information Center
Mendels, Pamela
2012-01-01
A diverse school district hugging the eastern border of Washington, D.C., Prince George's County, has introduced rigorous hiring methods and other practices to boost the quality of leadership in its 198 schools. In so doing, the district has also earned a spot among the pioneers in efforts nationally to ensure that public schools are led by the…
ERIC Educational Resources Information Center
Chan, Joseph; To, Ho-Pong; Chan, Elaine
2006-01-01
Despite its growing currency in academic and policy circles, social cohesion is a term in need of a clearer and more rigorous definition. This article provides a critical review of the ways social cohesion has been conceptualized in the literature; in many cases, definitions are too loosely made, with a common confusion between the content and the…
ERIC Educational Resources Information Center
New Teacher Project, 2011
2011-01-01
This "Rating a Teacher Observation Tool" identifies five simple questions and provides an easy-to-use scorecard to help policymakers decide whether an observation framework is likely to produce fair and accurate results. The five questions are: (1) Do the criteria and tools cover the classroom performance areas most connected to student outcomes?…
ERIC Educational Resources Information Center
Mwaniki, Munene
2014-01-01
Mother tongue education (MTE) has been a subject of rigorous debate for more than half a century, in both industrialised and developing societies. Despite disparate views on MTE, there is an uneasy consensus on its importance in educational systems, especially in the foundational years. Using the Language Management Framework, the article provides…
ERIC Educational Resources Information Center
Amador-Lankster, Clara
2018-01-01
The purpose of this article is to discuss a Fulbright Evaluation Framework and to analyze findings resulting from implementation of two contextualized measures designed as LEARNING BY DOING in response to achievement expectations from the National Education Ministry in Colombia in three areas. The goal of the Fulbright funded project was to…
Testing adaptive toolbox models: a Bayesian hierarchical approach.
Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
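To make the model-testing idea concrete, here is a minimal sketch under strong simplifying assumptions (two fixed strategies, one participant, a uniform prior over an application-error rate); it is not the authors' hierarchical implementation, and all data are simulated.

```python
# A minimal sketch (not the authors' full hierarchical model) of testing
# two candidate decision strategies against one participant's choices.
# Each strategy predicts a choice per trial; a free error rate eps is
# integrated out over a uniform prior grid to get each model's marginal
# likelihood, then posterior model probabilities.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 100
pred_A = rng.integers(0, 2, n_trials)      # hypothetical predictions, strategy A
pred_B = rng.integers(0, 2, n_trials)      # hypothetical predictions, strategy B
choices = np.where(rng.random(n_trials) < 0.85, pred_A, 1 - pred_A)  # data follow A

def marginal_likelihood(pred, data, grid=np.linspace(1e-3, 0.5, 500)):
    matches = (pred == data).sum()
    misses = len(data) - matches
    # p(data | eps) = (1-eps)^matches * eps^misses, averaged over the prior grid
    like = (1 - grid) ** matches * grid ** misses
    return like.mean()

mA = marginal_likelihood(pred_A, choices)
mB = marginal_likelihood(pred_B, choices)
print("P(A | data) =", mA / (mA + mB))     # close to 1: data favor strategy A
```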
Properties of field functionals and characterization of local functionals
NASA Astrophysics Data System (ADS)
Brouder, Christian; Dang, Nguyen Viet; Laurent-Gengoux, Camille; Rejzner, Kasia
2018-02-01
Functionals (i.e., functions of functions) are widely used in quantum field theory and solid-state physics. In this paper, functionals are given a rigorous mathematical framework and their main properties are described. The choice of the proper space of test functions (smooth functions) and of the relevant concept of differential (Bastiani differential) are discussed. The relation between the multiple derivatives of a functional and the corresponding distributions is described in detail. It is proved that, in a neighborhood of every test function, the support of a smooth functional is uniformly compactly supported and the order of the corresponding distribution is uniformly bounded. Relying on a recent work by Dabrowski, several spaces of functionals are furnished with a complete and nuclear topology. In view of physical applications, it is shown that most formal manipulations can be given a rigorous meaning. A new concept of local functionals is proposed and two characterizations of them are given: the first one uses the additivity (or Hammerstein) property, the second one is a variant of Peetre's theorem. Finally, the first step of a cohomological approach to quantum field theory is carried out by proving a global Poincaré lemma and defining multi-vector fields and graded functionals within our framework.
A Formal Framework for the Analysis of Algorithms That Recover From Loss of Separation
NASA Technical Reports Server (NTRS)
Butler, RIcky W.; Munoz, Cesar A.
2008-01-01
We present a mathematical framework for the specification and verification of state-based conflict resolution algorithms that recover from loss of separation. In particular, we propose rigorous definitions of horizontal and vertical maneuver correctness that yield horizontal and vertical separation, respectively, in a bounded amount of time. We also provide sufficient conditions for independent correctness, i.e., separation under the assumption that only one aircraft maneuvers, and for implicitly coordinated correctness, i.e., separation under the assumption that both aircraft maneuver. An important benefit of this approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).
Comparative effectiveness research methodology using secondary data: A starting user's guide.
Sun, Maxine; Lipsitz, Stuart R
2018-04-01
The use of secondary data, such as claims or administrative data, in comparative effectiveness research has grown tremendously in recent years. We believe that the current review can help investigators relying on secondary data to (1) gain insight into both the methodologies and statistical methods, (2) better understand the necessity of rigorous planning before initiating a comparative effectiveness investigation, and (3) optimize the quality of their investigations. Specifically, we review concepts of adjusted analyses and confounders, methods of propensity score analyses and instrumental variable analyses, risk prediction models (logistic and time-to-event), decision-curve analysis, and the interpretation of the P value and hypothesis testing. Overall, we hope that the current review can help investigators who rely on secondary data for comparative effectiveness research to understand the necessity of rigorous planning before a study starts and to gain better insight into the choice of statistical methods, so as to optimize the quality of the research. Copyright © 2017 Elsevier Inc. All rights reserved.
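As an illustration of one method the review covers, the following sketch runs a propensity-score analysis with inverse-probability-of-treatment weighting on simulated data; the variable names and the data-generating process are invented, and this is not the review's code.

```python
# A minimal propensity-score sketch on simulated data: treatment
# probability is modeled by logistic regression, and inverse-probability
# weights estimate the average treatment effect on the outcome.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
x = rng.normal(size=(n, 2))                          # confounders
p_treat = 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1])))
t = rng.binomial(1, p_treat)                         # nonrandom treatment
y = 2.0 * t + x[:, 0] + rng.normal(size=n)           # true effect = 2.0

ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]
w = np.where(t == 1, 1 / ps, 1 / (1 - ps))           # IPT weights
ate = (np.average(y[t == 1], weights=w[t == 1])
       - np.average(y[t == 0], weights=w[t == 0]))
print("IPTW estimate of treatment effect:", round(ate, 2))  # ~2.0
```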
NASA Astrophysics Data System (ADS)
Gottwald, Georg; Melbourne, Ian
2013-04-01
Whereas diffusion limits of stochastic multi-scale systems have a long and successful history, the case of constructing stochastic parametrizations of chaotic deterministic systems has been much less studied. We present rigorous results of convergence of a chaotic slow-fast system to a stochastic differential equation with multiplicative noise. Furthermore we present rigorous results for chaotic slow-fast maps, occurring as numerical discretizations of continuous time systems. This raises the issue of how to interpret certain stochastic integrals; surprisingly the resulting integrals of the stochastic limit system are generically neither of Stratonovich nor of Ito type in the case of maps. It is shown that the limit system of a numerical discretisation is different to the associated continuous time system. This has important consequences when interpreting the statistics of long time simulations of multi-scale systems - they may be very different to the one of the original continuous time system which we set out to study.
Using GIS to generate spatially balanced random survey designs for natural resource applications.
Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B
2007-07-01
Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demand survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
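The following sketch conveys the spirit of hierarchically randomized, spatially balanced sampling on a toy point population. It deliberately simplifies the published Reversed Randomized Quadrant-Recursive Raster (for instance, it draws one quadrant permutation per level rather than per cell), so treat it as an illustration of the concept, not the algorithm.

```python
# A simplified sketch in the spirit of spatially balanced sampling (not
# the published RRQRR algorithm): points receive hierarchical quadrant
# addresses with a random quadrant relabeling at every level, and the
# first n points in address order form a spatially spread sample.
import numpy as np

def quadrant_address(pts, levels, rng):
    lo, hi = pts.min(0), pts.max(0) + 1e-9
    scaled = (pts - lo) / (hi - lo)                  # normalize to [0, 1)
    addr = np.zeros(len(pts))
    for lvl in range(levels):
        cell = (scaled * 2).astype(int)              # 0/1 per axis -> quadrant 0..3
        quad = cell[:, 0] * 2 + cell[:, 1]
        perm = rng.permutation(4)                    # random relabeling per level
        addr += perm[quad] / 4 ** (lvl + 1)          # accumulate base-4 digits
        scaled = scaled * 2 - cell                   # recurse into the quadrant
    return addr

rng = np.random.default_rng(3)
population = rng.random((1000, 2))                   # candidate site coordinates
order = np.argsort(quadrant_address(population, levels=8, rng=rng))
sample = population[order[:30]]                      # spatially balanced-ish sample
print(sample.shape)
```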
Harnessing Implementation Science to Increase the Impact of Health Equity Research.
Chinman, Matthew; Woodward, Eva N; Curran, Geoffrey M; Hausmann, Leslie R M
2017-09-01
Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows 3 steps: detecting (phase 1), understanding (phase 2), and reducing (phase 3) disparities. Although disparities have narrowed over time, many remain. We argue that implementation science could enhance disparities research by broadening the scope of phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in phase 3 studies. We briefly review the focus of phase 2 and phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Most health disparities research emphasizes patient and provider factors as predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in phase 2 studies and, in turn, develop broader disparities-reducing implementation strategies in phase 3 studies. Many phase 3 studies of disparity-reducing implementation strategies are similar to case studies, whose designs are not able to fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real-world practice. Disparities can be considered a "special case" of implementation challenges: when evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on their own.
A traits-based approach for prioritizing species for monitoring and surrogacy selection
Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...
2016-11-28
The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.
PRISM offers a comprehensive genomic approach to transcription factor function prediction
Wenger, Aaron M.; Clarke, Shoa L.; Guturu, Harendra; Chen, Jenny; Schaar, Bruce T.; McLean, Cory Y.; Bejerano, Gill
2013-01-01
The human genome encodes 1500–2000 different transcription factors (TFs). ChIP-seq is revealing the global binding profiles of a fraction of TFs in a fraction of their biological contexts. These data show that the majority of TFs bind directly next to a large number of context-relevant target genes, that most binding is distal, and that binding is context specific. Because of the effort and cost involved, ChIP-seq is seldom used in search of novel TF function. Such exploration is instead done using expression perturbation and genetic screens. Here we propose a comprehensive computational framework for transcription factor function prediction. We curate 332 high-quality nonredundant TF binding motifs that represent all major DNA binding domains, and improve cross-species conserved binding site prediction to obtain 3.3 million conserved, mostly distal, binding site predictions. We combine these with 2.4 million facts about all human and mouse gene functions, in a novel statistical framework, in search of enrichments of particular motifs next to groups of target genes of particular functions. Rigorous parameter tuning and a harsh null are used to minimize false positives. Our novel PRISM (predicting regulatory information from single motifs) approach obtains 2543 TF function predictions in a large variety of contexts, at a false discovery rate of 16%. The predictions are highly enriched for validated TF roles, and 45 of 67 (67%) tested binding site regions in five different contexts act as enhancers in functionally matched cells. PMID:23382538
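The statistical core here, asking whether a motif's predicted targets are over-represented among genes of a given function, can be sketched with a hypergeometric tail probability. PRISM's conservation filtering, null model, and FDR control are far more elaborate than this, and all counts below are made up.

```python
# A minimal enrichment sketch of the core statistical question (motif
# targets vs. a gene-function set); not the PRISM pipeline itself.
from scipy.stats import hypergeom

genome_genes = 20000        # background gene universe (illustrative)
function_set = 300          # genes annotated with some function
motif_targets = 500         # genes with a conserved motif instance nearby
overlap = 25                # annotated genes among the motif targets

# P(overlap >= 25) under random draws of 500 genes from the universe
p = hypergeom.sf(overlap - 1, genome_genes, function_set, motif_targets)
print(f"enrichment p-value: {p:.3g}")
```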
Inferring the nature of anthropogenic threats from long-term abundance records.
Shoemaker, Kevin T; Akçakaya, H Resit
2015-02-01
Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark. In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.
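A stripped-down version of the diagnosis idea: fit competing decline mechanisms to a census series and weigh them against each other. The sketch below uses least squares and an AIC comparison rather than the authors' Bayesian framework, and both candidate models are invented stand-ins for exploitation and habitat loss.

```python
# A minimal sketch of threat diagnosis from a census time series (not
# the authors' Bayesian framework): fit two hypothetical decline models
# and compare them by AIC.
import numpy as np
from scipy.optimize import curve_fit

years = np.arange(30)
counts = 1000 * np.exp(-0.05 * years) + np.random.default_rng(4).normal(0, 20, 30)

def exploitation(t, n0, h):        # constant proportional harvest
    return n0 * np.exp(-h * t)

def habitat_loss(t, n0, k):        # carrying capacity eroding linearly
    return n0 * np.clip(1 - k * t, 0, None)

aics = {}
for name, model in [("exploitation", exploitation), ("habitat_loss", habitat_loss)]:
    p, _ = curve_fit(model, years, counts, p0=(1000, 0.01), maxfev=10000)
    rss = ((counts - model(years, *p)) ** 2).sum()
    aics[name] = len(years) * np.log(rss / len(years)) + 2 * len(p)  # Gaussian AIC
best = min(aics, key=aics.get)
print(aics, "->", best)            # lower AIC = better-supported mechanism
```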
A Computational Framework for Automation of Point Defect Calculations
NASA Astrophysics Data System (ADS)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration
A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials by design community to assess the impact of point defects on materials performance.
Cluster stability in the analysis of mass cytometry data.
Melchiotti, Rossella; Gracio, Filipe; Kordasti, Shahram; Todd, Alan K; de Rinaldis, Emanuele
2017-01-01
Manual gating has been traditionally applied to cytometry data sets to identify cells based on protein expression. The advent of mass cytometry allows for a higher number of proteins to be simultaneously measured on cells, therefore providing a means to define cell clusters in a high dimensional expression space. This enhancement, whilst opening unprecedented opportunities for single cell-level analyses, makes the incremental replacement of manual gating with automated clustering a compelling need. To this aim many methods have been implemented and their successful applications demonstrated in different settings. However, the reproducibility of automatically generated clusters is proving challenging and an analytical framework to distinguish spurious clusters from more stable entities, and presumably more biologically relevant ones, is still missing. One way to estimate cell clusters' stability is the evaluation of their consistent re-occurrence within- and between-algorithms, a metric that is commonly used to evaluate results from gene expression. Herein we report the usage and importance of cluster stability evaluations, when applied to results generated from three popular clustering algorithms - SPADE, FLOCK and PhenoGraph - run on four different data sets. These algorithms were shown to generate clusters with various degrees of statistical stability, many of them being unstable. By comparing the results of automated clustering with manually gated populations, we illustrate how information on cluster stability can assist towards a more rigorous and informed interpretation of clustering results. We also explore the relationships between statistical stability and other properties such as clusters' compactness and isolation, demonstrating that whilst cluster stability is linked to other properties it cannot be reliably predicted by any of them. Our study proposes the introduction of cluster stability as a necessary checkpoint for cluster interpretation and contributes to the construction of a more systematic and standardized analytical framework for the assessment of cytometry clustering results. © 2016 International Society for Advancement of Cytometry.
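One common stability check of the kind discussed here can be sketched in a few lines: recluster bootstrap resamples and score agreement with the full-data clustering using the adjusted Rand index. This is a generic illustration on synthetic blobs, not the paper's exact protocol or data.

```python
# A minimal bootstrap stability sketch: compare bootstrap reclusterings
# with the full-data clustering via the adjusted Rand index (ARI).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score

X, _ = make_blobs(n_samples=600, centers=5, random_state=0)
ref = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

rng = np.random.default_rng(5)
scores = []
for _ in range(20):
    idx = rng.integers(0, len(X), len(X))            # bootstrap resample
    boot = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X[idx])
    scores.append(adjusted_rand_score(ref[idx], boot))
print("mean ARI (stability):", np.mean(scores))      # near 1 => stable clusters
```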
Epoch of reionization window. II. Statistical methods for foreground wedge reduction
NASA Astrophysics Data System (ADS)
Liu, Adrian; Parsons, Aaron R.; Trott, Cathryn M.
2014-07-01
For there to be a successful measurement of the 21 cm epoch of reionization (EoR) power spectrum, it is crucial that strong foreground contaminants be robustly suppressed. These foregrounds come from a variety of sources (such as Galactic synchrotron emission and extragalactic point sources), but almost all share the property of being spectrally smooth and, when viewed through the chromatic response of an interferometer, occupy a signature "wedge" region in cylindrical (k⊥, k∥) Fourier space. The complement of the foreground wedge is termed the "EoR window" and is expected to be mostly foreground-free, allowing clean measurements of the power spectrum. This paper is a sequel to a previous paper that established a rigorous mathematical framework for describing the foreground wedge and the EoR window. Here, we use our framework to explore statistical methods by which the EoR window can be enlarged, thereby increasing the sensitivity of a power spectrum measurement. We adapt the Feldman-Kaiser-Peacock approximation (commonly used in galaxy surveys) for 21 cm cosmology and also compare the optimal quadratic estimator to simpler estimators that ignore covariances between different Fourier modes. The optimal quadratic estimator is found to suppress foregrounds by an extra factor of ~10^5 in power at the peripheries of the EoR window, boosting the detection of the cosmological signal from 12σ to 50σ at the midpoint of reionization in our fiducial models. If numerical issues can be finessed, decorrelation techniques allow the EoR window to be further enlarged, enabling measurements to be made deep within the foreground wedge. These techniques do not assume that foreground is Gaussian distributed, and we additionally prove that a final round of foreground subtraction can be performed after decorrelation in a way that is guaranteed to have no cosmological signal loss.
A statistical physics perspective on criticality in financial markets
NASA Astrophysics Data System (ADS)
Bury, Thomas
2013-11-01
Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood.
Integration of Technology into the Classroom: Case Studies.
ERIC Educational Resources Information Center
Johnson, D. LaMont, Ed.; Maddux, Cleborne D., Ed.; Liu, Leping, Ed.
This book contains the following case studies on the integration of technology in education: (1) "First Steps toward a Statistically Generated Information Technology Integration Model" (D. LaMont Johnson and Leping Liu); (2) "Case Studies: Are We Rejecting Rigor or Rediscovering Richness?" (Cleborne D. Maddux); (3)…
A Psychometric Evaluation of the Digital Logic Concept Inventory
ERIC Educational Resources Information Center
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2014-01-01
Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric…
Tunable intraparticle frameworks for creating complex heterostructured nanoparticle libraries
NASA Astrophysics Data System (ADS)
Fenton, Julie L.; Steimle, Benjamin C.; Schaak, Raymond E.
2018-05-01
Complex heterostructured nanoparticles with precisely defined materials and interfaces are important for many applications. However, rationally incorporating such features into nanoparticles with rigorous morphology control remains a synthetic bottleneck. We define a modular divergent synthesis strategy that progressively transforms simple nanoparticle synthons into increasingly sophisticated products. We introduce a series of tunable interfaces into zero-, one-, and two-dimensional copper sulfide nanoparticles using cation exchange reactions. Subsequent manipulation of these intraparticle frameworks yielded a library of 47 distinct heterostructured metal sulfide derivatives, including particles that contain asymmetric, patchy, porous, and sculpted nanoarchitectures. This generalizable mix-and-match strategy provides predictable retrosynthetic pathways to complex nanoparticle features that are otherwise inaccessible.
Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.
2013-01-01
Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231
ERIC Educational Resources Information Center
Reisman, Abby
2017-01-01
The Common Core State Standards (CCSS) call on science and social studies teachers to engage in literacy instruction that prepares students for the academic rigors of college. The Literacy Design Collaborative (LDC) designed a framework to address the challenge of literacy-content integration. At the heart of the intervention are fill-in-the-blank…
Self-report: psychology's four-letter word.
Haeffel, Gerald J; Howard, George S
2010-01-01
Self-report continues to be one of the most widely used measurement strategies in psychology despite longstanding concerns about its validity and scientific rigor. In this article, the merits of self-report are examined from a philosophy of science perspective. A framework is also provided for evaluating self-report measures. Specifically, four issues are presented that can be used as a decision aid when making choices about measurement.
A Statistical Framework for Analyzing Cyber Threats
...defender cares most about the attacks against certain ports or services). The grey-box statistical framework formulates a new methodology of cybersecurity...the design of prediction models. Our research showed that the grey-box framework is effective in predicting cybersecurity situational awareness.
Improved key-rate bounds for practical decoy-state quantum-key-distribution systems
NASA Astrophysics Data System (ADS)
Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng
2017-01-01
The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
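The inversion step behind such finite-size fluctuation analyses can be sketched generically: given an observed count and a target failure probability, solve the multiplicative Chernoff tail bounds for the extreme means still consistent with the observation. The paper's bounds are tighter and more careful; the sketch below shows only the generic construction, with made-up numbers.

```python
# A minimal sketch of Chernoff-style interval inversion as used in
# finite-key analyses (not the paper's improved bounds). Given an
# observed count x and failure probability eps, find the smallest and
# largest expected values mu consistent with the tail bounds.
import numpy as np
from scipy.optimize import brentq

def upper_tail(mu, x):    # log bound on Pr[X >= x], for x > mu
    d = (x - mu) / mu
    return -d * d * mu / (2 + d)

def lower_tail(mu, x):    # log bound on Pr[X <= x], for x < mu
    d = (mu - x) / mu
    return -d * d * mu / 2

x, eps = 1000, 1e-10
log_eps = np.log(eps)
# mu_low: smallest mean for which seeing x (or more) is not ruled out
mu_low = brentq(lambda m: upper_tail(m, x) - log_eps, 1e-6, x - 1e-6)
# mu_high: largest mean for which seeing x (or fewer) is not ruled out
mu_high = brentq(lambda m: lower_tail(m, x) - log_eps, x + 1e-6, 10 * x + 100)
print(mu_low, mu_high)    # interval for E[X], failing with prob ~eps
```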
Prompt assessment and management actions are required if we are to reduce the current rapid loss of habitat and biodiversity worldwide. Statistically valid quantification of the biota and habitat condition in water bodies are prerequisites for rigorous assessment of aquatic biodi...
The US Environmental Protection Agency (EPA) is revising its strategy to obtain the information needed to answer questions pertinent to water-quality management efficiently and rigorously at national scales. One tool of this revised strategy is use of statistically based surveys ...
Examining Multidimensional Middle Grade Outcomes after Early Elementary School Grade Retention
ERIC Educational Resources Information Center
Hwang, Sophia; Cappella, Elise; Schwartz, Kate
2016-01-01
Recently, researchers have begun to employ rigorous statistical methods and developmentally-informed theories to evaluate outcomes for students retained in non-kindergarten early elementary school. However, the majority of this research focuses on academic outcomes. Gaps remain regarding retention's effects on psychosocial outcomes important to…
1985-02-01
...Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems, its present stage of development embodies a...Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical
Beyond Composite Scores and Cronbach's Alpha: Advancing Methodological Rigor in Recreation Research
ERIC Educational Resources Information Center
Gagnon, Ryan J.; Stone, Garrett A.; Garst, Barry A.
2017-01-01
Critically examining common statistical approaches and their strengths and weaknesses is an important step in advancing recreation and leisure sciences. To continue this critical examination and to inform methodological decision making, this study compared three approaches to determine how alternative approaches may result in contradictory…
Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge
ERIC Educational Resources Information Center
Haines, Brenna
2015-01-01
The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…
Tipton, John; Hooten, Mevin B.; Goring, Simon
2017-01-01
Scientific records of temperature and precipitation have been kept for several hundred years, but for many areas, only a shorter record exists. To understand climate change, there is a need for rigorous statistical reconstructions of the paleoclimate using proxy data. Paleoclimate proxy data are often sparse, noisy, indirect measurements of the climate process of interest, making each proxy uniquely challenging to model statistically. We reconstruct spatially explicit temperature surfaces from sparse and noisy measurements recorded at historical United States military forts and other observer stations from 1820 to 1894. One common method for reconstructing the paleoclimate from proxy data is principal component regression (PCR). With PCR, one learns a statistical relationship between the paleoclimate proxy data and a set of climate observations that are used as patterns for potential reconstruction scenarios. We explore PCR in a Bayesian hierarchical framework, extending classical PCR in a variety of ways. First, we model the latent principal components probabilistically, accounting for measurement error in the observational data. Next, we extend our method to better accommodate outliers that occur in the proxy data. Finally, we explore alternatives to the truncation of lower-order principal components using different regularization techniques. One fundamental challenge in paleoclimate reconstruction efforts is the lack of out-of-sample data for predictive validation. Cross-validation is of potential value, but is computationally expensive and potentially sensitive to outliers in sparse data scenarios. To overcome the limitations that a lack of out-of-sample records presents, we test our methods using a simulation study, applying proper scoring rules including a computationally efficient approximation to leave-one-out cross-validation using the log score to validate model performance. The result of our analysis is a spatially explicit reconstruction of spatio-temporal temperature from a very sparse historical record.
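Classical PCR, the starting point the authors extend, can be sketched briefly: learn principal components from a reference field, regress proxies onto the PC scores, then invert back to the field. Everything below is synthetic, and none of the Bayesian machinery (errors-in-variables, outlier robustness, regularization) is shown.

```python
# A minimal classical-PCR sketch (not the paper's Bayesian hierarchical
# extensions): project a reference field onto leading PCs, regress
# sparse station records onto the PC scores, and reconstruct fields.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
ref_field = rng.normal(size=(100, 50))       # reference period: 100 times x 50 sites
pca = PCA(n_components=5).fit(ref_field)
pcs = pca.transform(ref_field)               # leading principal component scores

station_idx = rng.choice(50, size=8, replace=False)
stations = ref_field[:, station_idx] + rng.normal(0, 0.1, (100, 8))  # noisy proxies

reg = LinearRegression().fit(stations, pcs)  # learn proxy -> PC mapping
past_stations = rng.normal(size=(20, 8))     # sparse historical records
recon = pca.inverse_transform(reg.predict(past_stations))  # full-field estimate
print(recon.shape)                           # (20, 50)
```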
Frazin, Richard A
2016-04-01
A new generation of telescopes with mirror diameters of 20 m or more, called extremely large telescopes (ELTs), has the potential to provide unprecedented imaging and spectroscopy of exoplanetary systems, if the difficulties in achieving the extremely high dynamic range required to differentiate the planetary signal from the star can be overcome to a sufficient degree. Fully utilizing the potential of ELTs for exoplanet imaging will likely require simultaneous and self-consistent determination of both the planetary image and the unknown aberrations in multiple planes of the optical system, using statistical inference based on the wavefront sensor and science camera data streams. This approach promises to overcome the most important systematic errors inherent in the various schemes based on differential imaging, such as angular differential imaging and spectral differential imaging. This paper is the first in a series on this subject, in which a formalism is established for the exoplanet imaging problem, setting the stage for the statistical inference methods to follow in the future. Every effort has been made to be rigorous and complete, so that validity of approximations to be made later can be assessed. Here, the polarimetric image is expressed in terms of aberrations in the various planes of a polarizing telescope with an adaptive optics system. Further, it is shown that current methods that utilize focal plane sensing to correct the speckle field, e.g., electric field conjugation, rely on the tacit assumption that aberrations on multiple optical surfaces can be represented as aberration on a single optical surface, ultimately limiting their potential effectiveness for ground-based astronomy.
Ergodicity of Truncated Stochastic Navier Stokes with Deterministic Forcing and Dispersion
NASA Astrophysics Data System (ADS)
Majda, Andrew J.; Tong, Xin T.
2016-10-01
Turbulence in idealized geophysical flows is a very rich and important topic. The anisotropic effects of explicit deterministic forcing, dispersive effects from rotation due to the β-plane and F-plane, and topography together with random forcing all combine to produce a remarkable number of realistic phenomena. These effects have been studied through careful numerical experiments in the truncated geophysical models. These important results include transitions between coherent jets and vortices, and direct and inverse turbulence cascades as parameters are varied, and it is a contemporary challenge to explain these diverse statistical predictions. Here we contribute to these issues by proving with full mathematical rigor that for any values of the deterministic forcing, the β- and F-plane effects and topography, with minimal stochastic forcing, there is geometric ergodicity for any finite Galerkin truncation. This means that there is a unique smooth invariant measure which attracts all statistical initial data at an exponential rate. In particular, this rigorous statistical theory guarantees that there are no bifurcations to multiple stable and unstable statistical steady states as geophysical parameters are varied in contrast to claims in the applied literature. The proof utilizes a new statistical Lyapunov function to account for enstrophy exchanges between the statistical mean and the variance fluctuations due to the deterministic forcing. It also requires careful proofs of hypoellipticity with geophysical effects and uses geometric control theory to establish reachability. To illustrate the necessity of these conditions, a two-dimensional example is developed which has the square of the Euclidean norm as the Lyapunov function and is hypoelliptic with nonzero noise forcing, yet fails to be reachable or ergodic.
Machine learning in the string landscape
NASA Astrophysics Data System (ADS)
Carifio, Jonathan; Halverson, James; Krioukov, Dmitri; Nelson, Brent D.
2017-09-01
We utilize machine learning to study the string landscape. Deep data dives and conjecture generation are proposed as useful frameworks for utilizing machine learning in the landscape, and examples of each are presented. A decision tree accurately predicts the number of weak Fano toric threefolds arising from reflexive polytopes, each of which determines a smooth F-theory compactification, and linear regression generates a previously proven conjecture for the gauge group rank in an ensemble of 4/3 × 2.96 × 10^755 F-theory compactifications. Logistic regression generates a new conjecture for when E6 arises in the large ensemble of F-theory compactifications, which is then rigorously proven. This result may be relevant for the appearance of visible sectors in the ensemble. Through conjecture generation, machine learning is useful not only for numerics, but also for rigorous results.
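As a toy illustration of conjecture generation, the sketch below fits an interpretable classifier to made-up integer features containing a hidden linear rule and reads a candidate statement off the coefficients; the real analysis operates on F-theory ensemble data, which is not reproduced here.

```python
# A toy sketch of "conjecture generation": fit an interpretable model
# and inspect its coefficients for a candidate statement to prove.
# Features and the hidden rule are invented stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
features = rng.integers(0, 10, size=(2000, 3))               # fake geometric data
label = (features[:, 0] + features[:, 1] >= 9).astype(int)   # hidden linear rule

clf = LogisticRegression(max_iter=1000).fit(features, label)
print(clf.coef_, clf.intercept_)
# Large, roughly equal weights on the first two features (and ~0 on the
# third) suggest the conjecture "label = 1 iff f0 + f1 >= threshold",
# which one would then try to prove rigorously.
```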
Forecasting volatility with neural regression: a contribution to model adequacy.
Refenes, A N; Holt, W T
2001-01-01
Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
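As a concrete, much simpler illustration of residual-based misspecification testing, the classical Durbin-Watson statistic can be computed on the residuals of any fitted regressor, neural or otherwise; the generalized statistic derived in the paper is not reproduced here.

import numpy as np

def durbin_watson(residuals):
    """DW = sum((e_t - e_{t-1})^2) / sum(e_t^2); values near 2 indicate no
    first-order autocorrelation, values near 0 or 4 indicate positive or
    negative autocorrelation respectively."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# e.g. residuals = y_test - model.predict(x_test) for a fitted neural net
print(durbin_watson(np.random.default_rng(1).normal(size=200)))  # approx. 2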
Quantification of Covariance in Tropical Cyclone Activity across Teleconnected Basins
NASA Astrophysics Data System (ADS)
Tolwinski-Ward, S. E.; Wang, D.
2015-12-01
Rigorous statistical quantification of natural hazard covariance across regions has important implications for risk management, and is also of fundamental scientific interest. We present a multivariate Bayesian Poisson regression model for inferring the covariance in tropical cyclone (TC) counts across multiple ocean basins and across Saffir-Simpson intensity categories. Such covariability results from the influence of large-scale modes of climate variability on local environments that can alternately suppress or enhance TC genesis and intensification, and our model also simultaneously quantifies the covariance of TC counts with various climatic modes in order to deduce the source of inter-basin TC covariability. The model explicitly treats the time-dependent uncertainty in observed maximum sustained wind data, and hence the nominal intensity category of each TC. Differences in annual TC counts as measured by different agencies are also formally addressed. The probabilistic output of the model can be probed for probabilistic answers to such questions as:
- Does the relationship between different categories of TCs differ statistically by basin?
- Which climatic predictors have significant relationships with TC activity in each basin?
- Are the relationships between counts in different basins conditionally independent given the climatic predictors, or are there other factors at play affecting inter-basin covariability?
- How can a portfolio of insured property be optimized across space to minimize risk?
Although we present results of our model applied to TCs, the framework is generalizable to covariance estimation between multivariate counts of natural hazards across regions and/or across peril types.
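A highly simplified, non-Bayesian sketch of the count-regression core of such a model: an independent Poisson GLM of annual TC counts on a climate-mode index. The paper's model is multivariate and Bayesian with observation error, none of which is captured here; the data and index below are synthetic stand-ins.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_years = 50
enso = rng.normal(size=n_years)                 # synthetic stand-in climate index
X = sm.add_constant(enso)
counts = rng.poisson(np.exp(1.5 - 0.4 * enso))  # synthetic annual counts, one basin

fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.params)  # intercept and index sensitivity of the log TC rate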
Increased scientific rigor will improve reliability of research and effectiveness of management
Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.
2018-01-01
Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and unavoidable human biases. Offering only post hoc interpretations of statistical patterns (i.e., a posteriori hypotheses) adds to uncertainty because it increases the number of plausible biological explanations without determining which have the greatest support. Further, post hoc interpretations are strongly subject to human biases. Testing hypotheses maximizes the credibility of research findings, makes the strongest contributions to theory and management, and improves reproducibility of research. Management decisions based on rigorous research are most likely to result in effective conservation of wildlife resources.
Fast synthesis of topographic mask effects based on rigorous solutions
NASA Astrophysics Data System (ADS)
Yan, Qiliang; Deng, Zhijie; Shiely, James
2007-10-01
Topographic mask effects can no longer be ignored at technology nodes of 45 nm, 32 nm and beyond. As feature sizes become comparable to the mask topographic dimensions and the exposure wavelength, the popular thin mask model breaks down, because the mask transmission no longer follows the layout. A reliable mask transmission function has to be derived from Maxwell equations. Unfortunately, rigorous solutions of Maxwell equations are only manageable for limited field sizes, but impractical for full-chip optical proximity corrections (OPC) due to the prohibitive runtime. Approximation algorithms are in demand to achieve a balance between acceptable computation time and tolerable errors. In this paper, a fast algorithm is proposed and demonstrated to model topographic mask effects for OPC applications. The ProGen Topographic Mask (POTOMAC) model synthesizes the mask transmission functions out of small-sized Maxwell solutions from a finite-difference time-domain (FDTD) engine, an industry-leading rigorous simulator of topographic mask effects from SOLID-E. The integrated framework presents a seamless solution to the end user. Preliminary results indicate the overhead introduced by POTOMAC is contained within the same order of magnitude in comparison to the thin mask approach.
Enterprise and system of systems capability development life-cycle processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, David Franklin
2014-08-01
This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.
Gravitational Physics: the birth of a new era
NASA Astrophysics Data System (ADS)
Sakellariadou, Mairi
2017-11-01
We live in the golden age of cosmology, while the era of gravitational astronomy has finally begun. Still, fundamental puzzles remain. Standard cosmology is formulated within the framework of Einstein's General Theory of Relativity. However, General Relativity is not adequate to explain the earliest stages of cosmic existence, and cannot provide an explanation for the Big Bang itself. Modern early universe cosmology is in need of a rigorous underpinning in Quantum Gravity.
Kallio, Hanna; Pietilä, Anna-Maija; Johnson, Martin; Kangasniemi, Mari
2016-12-01
To produce a framework for the development of a qualitative semi-structured interview guide. Rigorous data collection procedures fundamentally influence the results of studies. The semi-structured interview is a common data collection method, but methodological research on the development of a semi-structured interview guide is sparse. Systematic methodological review. We searched PubMed, CINAHL, Scopus and Web of Science for methodological papers on semi-structured interview guides from October 2004-September 2014. Having examined 2,703 titles and abstracts and 21 full texts, we finally selected 10 papers. We analysed the data using the qualitative content analysis method. Our analysis resulted in new synthesized knowledge on the development of a semi-structured interview guide, including five phases: (1) identifying the prerequisites for using semi-structured interviews; (2) retrieving and using previous knowledge; (3) formulating the preliminary semi-structured interview guide; (4) pilot testing the guide; and (5) presenting the complete semi-structured interview guide. Rigorous development of a qualitative semi-structured interview guide contributes to the objectivity and trustworthiness of studies and makes the results more plausible. Researchers should consider using this five-step process to develop a semi-structured interview guide and justify the decisions made during it. © 2016 John Wiley & Sons Ltd.
When is good, good enough? Methodological pragmatism for sustainable guideline development.
Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C
2015-03-06
Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.
Culture and symptom reporting at menopause.
Melby, Melissa K; Lock, Margaret; Kaufert, Patricia
2005-01-01
The purpose of the present paper is to review recent research on the relationship of culture and menopausal symptoms and propose a biocultural framework that makes use of both biological and cultural parameters in future research. Medline was searched for English-language articles published from 2000 to 2004 using the keyword 'menopause' in the journals Menopause, Maturitas, Climacteric, Social Science and Medicine, Medical Anthropology Quarterly, Journal of Women's Health, Journal of the American Medical Association, American Journal of Epidemiology, Lancet and British Medical Journal, excluding articles concerning small clinical samples, surgical menopause or HRT. Additionally, references of retrieved articles and reviews were hand-searched. Although a large number of studies and publications exist, methodological differences limit attempts at comparison or systematic review. We outline a theoretical framework in which relevant biological and cultural variables can be operationalized and measured, making it possible for rigorous comparisons in the future. Several studies carried out in Japan, North America and Australia, using similar methodology but different culture/ethnic groups, indicate that differences in symptom reporting are real and highlight the importance of biocultural research. We suggest that both biological variation and cultural differences contribute to the menopausal transition, and that more rigorous data collection is required to elucidate how biology and culture interact in female ageing.
When Assessment Data Are Words: Validity Evidence for Qualitative Educational Assessments.
Cook, David A; Kuper, Ayelet; Hatala, Rose; Ginsburg, Shiphra
2016-10-01
Quantitative scores fail to capture all important features of learner performance. This awareness has led to increased use of qualitative data when assessing health professionals. Yet the use of qualitative assessments is hampered by incomplete understanding of their role in forming judgments, and lack of consensus in how to appraise the rigor of judgments therein derived. The authors articulate the role of qualitative assessment as part of a comprehensive program of assessment, and translate the concept of validity to apply to judgments arising from qualitative assessments. They first identify standards for rigor in qualitative research, and then use two contemporary assessment validity frameworks to reorganize these standards for application to qualitative assessment.Standards for rigor in qualitative research include responsiveness, reflexivity, purposive sampling, thick description, triangulation, transparency, and transferability. These standards can be reframed using Messick's five sources of validity evidence (content, response process, internal structure, relationships with other variables, and consequences) and Kane's four inferences in validation (scoring, generalization, extrapolation, and implications). Evidence can be collected and evaluated for each evidence source or inference. The authors illustrate this approach using published research on learning portfolios.The authors advocate a "methods-neutral" approach to assessment, in which a clearly stated purpose determines the nature of and approach to data collection and analysis. Increased use of qualitative assessments will necessitate more rigorous judgments of the defensibility (validity) of inferences and decisions. Evidence should be strategically sought to inform a coherent validity argument.
Inferring the source of evaporated waters using stable H and O isotopes
NASA Astrophysics Data System (ADS)
Bowen, G. J.; Putman, A.; Brooks, J. R.; Bowling, D. R.; Oerter, E.; Good, S. P.
2017-12-01
Stable isotope ratios of H and O are widely used to identify the source of water, e.g., in aquifers, river runoff, soils, plant xylem, and plant-based beverages. In situations where the sampled water is partially evaporated, its isotope values will have evolved along an evaporation line (EL) in δ2H/δ18O space, and back-correction along the EL to its intersection with a meteoric water line (MWL) has been used to estimate the source water's isotope ratios. Several challenges and potential pitfalls exist with traditional approaches to this problem, including potential for bias from a commonly used regression-based approach for EL slope estimation and incomplete estimation of uncertainty in most studies. We suggest the value of a model-based approach to EL estimation, and introduce a mathematical framework that eliminates the need to explicitly estimate the EL-MWL intersection, simplifying analysis and facilitating more rigorous uncertainty estimation. We apply this analysis framework to data from 1,000 lakes sampled in EPA's 2007 National Lakes Assessment. We find that data for most lakes are consistent with a water source similar to annual runoff, estimated from monthly precipitation and evaporation within the lake basin. Strong evidence for both summer- and winter-biased sources exists, however, with winter bias pervasive in most snow-prone regions. The new analytical framework should improve the rigor of source-water inference from evaporated samples in ecohydrology and related sciences, and our initial results from U.S. lakes suggest that previous interpretations of lakes as unbiased isotope integrators may only be valid in certain climate regimes.
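For concreteness, the traditional back-correction that the paper improves upon can be written in a few lines; the EL slope and the global MWL parameters below are illustrative assumptions, not values from the study.

def source_water(d18O, d2H, el_slope, mwl_slope=8.0, mwl_intercept=10.0):
    """Intersection of the EL through (d18O, d2H) with the MWL
    d2H = mwl_slope * d18O + mwl_intercept (defaults: global MWL)."""
    el_intercept = d2H - el_slope * d18O
    x = (mwl_intercept - el_intercept) / (el_slope - mwl_slope)
    return x, mwl_slope * x + mwl_intercept

# e.g. a lake sample at (-2.0, -20.0) per mil with an assumed EL slope of 5:
print(source_water(-2.0, -20.0, el_slope=5.0))  # more depleted inferred source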
Harnessing Implementation Science to Increase the Impact of Health Disparity Research
Chinman, Matthew; Woodward, Eva N.; Curran, Geoffrey M.; Hausmann, Leslie R. M.
2017-01-01
Background Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows three steps: detecting (Phase 1), understanding (Phase 2), and reducing (Phase 3) disparities. While disparities have narrowed over time, many remain. Objectives We argue that implementation science could enhance disparities research by broadening the scope of Phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in Phase 3 studies. Methods We briefly review the focus of Phase 2 and Phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Results Most health disparities research emphasizes patient and provider factors as predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in Phase 2 studies and, in turn, develop broader disparities-reducing implementation strategies in Phase 3 studies. Many Phase 3 studies of disparity-reducing implementation strategies are similar to case studies, whose designs are not able to fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real world practice. Conclusions Disparities can be considered a “special case” of implementation challenges—when evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on their own. PMID:28806362
Scaling up to address data science challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, Joanne R.
2017-04-27
Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
O'Sullivan, Finbarr; Muzi, Mark; Spence, Alexander M; Mankoff, David M; O'Sullivan, Janet N; Fitzgerald, Niall; Newman, George C; Krohn, Kenneth A
2009-06-01
Kinetic analysis is used to extract metabolic information from dynamic positron emission tomography (PET) uptake data. The theory of indicator dilutions, developed in the seminal work of Meier and Zierler (1954), provides a probabilistic framework for representation of PET tracer uptake data in terms of a convolution between an arterial input function and a tissue residue. The residue is a scaled survival function associated with tracer residence in the tissue. Nonparametric inference for the residue, a deconvolution problem, provides a novel approach to kinetic analysis, critically one that is not reliant on specific compartmental modeling assumptions. A practical computational technique based on regularized cubic B-spline approximation of the residence time distribution is proposed. Nonparametric residue analysis allows formal statistical evaluation of specific parametric models to be considered. This analysis needs to properly account for the increased flexibility of the nonparametric estimator. The methodology is illustrated using data from a series of cerebral studies with PET and fluorodeoxyglucose (FDG) in normal subjects. Comparisons are made between key functionals of the residue, tracer flux, flow, etc., resulting from a parametric (the standard two-compartment model of Phelps et al. 1979) and a nonparametric analysis. Strong statistical evidence against the compartment model is found. Primarily these differences relate to the representation of the early temporal structure of the tracer residence, largely a function of the vascular supply network. There are convincing physiological arguments against the representations implied by the compartmental approach but this is the first time that a rigorous statistical confirmation using PET data has been reported. The compartmental analysis produces suspect values for flow but, notably, the impact on the metabolic flux, though statistically significant, is limited to deviations on the order of 3%-4%. The general advantage of the nonparametric residue analysis is the ability to provide a valid kinetic quantitation in the context of studies where there may be heterogeneity or other uncertainty about the accuracy of a compartmental model approximation of the tissue residue.
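A schematic of the deconvolution idea, assuming a discrete time grid and using plain nonnegative least squares in place of the paper's regularized cubic B-spline estimator; the input function and residue below are toy curves, not FDG data.

import numpy as np
from scipy.optimize import nnls

dt = 0.5                               # minutes per frame (assumed)
t = np.arange(0.0, 60.0, dt)
n = t.size
cp = t * np.exp(-t / 4.0)              # toy arterial input function
true_residue = np.exp(-t / 20.0)       # toy survival-type residue

A = np.zeros((n, n))                   # convolution as a lower-triangular operator
for i in range(n):
    for j in range(i + 1):
        A[i, j] = cp[i - j] * dt

tissue = A @ true_residue + np.random.default_rng(3).normal(0.0, 0.01, n)
residue_hat, _ = nnls(A, tissue)       # nonnegativity is the only constraint here
print(residue_hat[:5])                 # recovered early residue values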
Statistics, Formation and Stability of Exoplanetary Systems
NASA Astrophysics Data System (ADS)
Silburt, Ari
Over the past two decades scientists have detected thousands of exoplanets, and their collective properties are now emerging. This thesis contributes to the exoplanet field by analyzing the statistics, formation and stability of exoplanetary systems. The first part of this thesis conducts a statistical reconstruction of the radius and period distributions of Kepler planets. Accounting for observation and detection biases, as well as measurement errors, we calculate the occurrence of planetary systems, including the prevalence of Earth-like planets. This calculation is compared to related works, finding both similarities and differences. Second, the formation of Kepler planets near mean motion resonance (MMR) is investigated. In particular, 27 Kepler systems near 2:1 MMR are analyzed to determine whether tides are a viable mechanism for transporting Kepler planets from MMR. We find that tides alone cannot transport near-resonant planets from exact 2:1 MMR to their observed locations, and other mechanisms must be invoked to explain their formation. Third, a new hybrid integrator HERMES is presented, which is capable of simulating N-bodies undergoing close encounters. HERMES is specifically designed for planets embedded in planetesimal disks, and includes an adaptive routine for optimizing the close encounter boundary to help maintain accuracy. We find the performance of HERMES comparable to other popular hybrid integrators. Fourth, the long-term stability of planetary systems is investigated using machine learning techniques. Typical studies of long-term stability require thousands of realizations to acquire statistically rigorous results, which can take weeks or months to perform. Here we find that a trained machine is capable of quickly and accurately classifying long-term planet stability. Finally, the planetary system HD155358, consisting of two Jovian-sized planets near 2:1 MMR, is investigated using previously collected radial velocity data. New orbital parameters are derived using a Bayesian framework, and we find a high likelihood that the planets are in MMR. In addition, formation and stability constraints are placed on the HD155358 system.
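A toy stand-in for the stability-classification idea mentioned above: train a classifier on cheap dynamical features to predict a long-term stability label that would otherwise require expensive N-body integrations. The features, labels, and decision rule here are synthetic; the thesis trains on real integrations.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(10)
features = rng.random((2000, 3))      # e.g. scaled separations, eccentricities
stable = (features[:, 0] - 0.5 * features[:, 1] > 0.2).astype(int)  # toy rule

X_tr, X_te, y_tr, y_te = train_test_split(features, stable, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))          # fast proxy for the expensive stability label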
Sex Differences in the Response of Children with ADHD to Once-Daily Formulations of Methylphenidate
ERIC Educational Resources Information Center
Sonuga-Barke, J. S.; Coghill, David; Markowitz, John S.; Swanson, James M.; Vandenberghe, Mieke; Hatch, Simon J.
2007-01-01
Objectives: Studies of sex differences in methylphenidate response by children with attention-deficit/hyperactivity disorder have lacked methodological rigor and statistical power. This paper reports an examination of sex differences based on further analysis of data from a comparison of two once-daily methylphenidate formulations (the COMACS…
The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research
ERIC Educational Resources Information Center
Harwell, Michael
2018-01-01
The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However the role of this software in facilitating supererogated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…
A Study of Statistics through Tootsie Pops
ERIC Educational Resources Information Center
Aaberg, Shelby; Vitosh, Jason; Smith, Wendy
2016-01-01
A classic TV commercial once asked, "How many licks does it take to get to the center of a Tootsie Roll Tootsie Pop?" The narrator claims, "The world may never know" (Tootsie Roll 2012), but an Internet search returns a multitude of answers, some of which include rigorous systematic approaches by academics to address the…
Meeting the needs of an ever-demanding market.
Rigby, Richard
2002-04-01
Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.
Traditionally, the EPA has monitored aquatic ecosystems using statistically rigorous sample designs and intensive field efforts which provide high quality datasets. But by their nature they leave many aquatic systems unsampled, follow a top down approach, have a long lag between ...
ERIC Educational Resources Information Center
Benton-Borghi, Beatrice Hope; Chang, Young Mi
2011-01-01
The National Center for Educational Statistics (NCES, 2010) continues to report substantial underachievement of diverse student populations in the nation's schools. After decades of focus on diversity and multicultural education, with integrating field and clinical practice, candidates continue to graduate without adequate knowledge, skills and…
State College- and Career-Ready High School Graduation Requirements. Updated
ERIC Educational Resources Information Center
Achieve, Inc., 2013
2013-01-01
Research by Achieve, ACT, and others suggests that for high school graduates to be prepared for success in a wide range of postsecondary settings, they need to take four years of challenging mathematics--covering Advanced Algebra; Geometry; and data, probability, and statistics content--and four years of rigorous English aligned with college- and…
High School Redesign. Diplomas Count, 2016. Education Week. Volume 35, Number 33
ERIC Educational Resources Information Center
Edwards, Virginia B., Ed.
2016-01-01
This year's report focuses on efforts to redesign high schools. Those include incorporating student voice, implementing a rigorous and relevant curriculum, embracing career exploration, and more. The report also includes the latest statistics on the nation's overall, on-time high school graduation rate. Articles include: (1) To Build a Better High…
Interactive visual analysis promotes exploration of long-term ecological data
T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst
2013-01-01
Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...
Transformation and model choice for RNA-seq co-expression analysis.
Rau, Andrea; Maugis-Rabusseau, Cathy
2018-05-01
Although a large number of clustering algorithms have been proposed to identify groups of co-expressed genes from microarray data, the question of whether and how such methods may be applied to RNA sequencing (RNA-seq) data remains unaddressed. In this work, we investigate the use of data transformations in conjunction with Gaussian mixture models for RNA-seq co-expression analyses, as well as a penalized model selection criterion to select both an appropriate transformation and number of clusters present in the data. This approach has the advantage of accounting for per-cluster correlation structures among samples, which can be strong in RNA-seq data. In addition, it provides a rigorous statistical framework for parameter estimation, an objective assessment of data transformations and number of clusters and the possibility of performing diagnostic checks on the quality and homogeneity of the identified clusters. We analyze four varied RNA-seq data sets to illustrate the use of transformations and model selection in conjunction with Gaussian mixture models. Finally, we propose a Bioconductor package coseq (co-expression of RNA-seq data) to facilitate implementation and visualization of the recommended RNA-seq co-expression analyses.
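The recommended workflow can be sketched in a few lines (coseq itself is an R/Bioconductor package; the sketch below uses Python, with an arcsine transformation and BIC purely as illustrative stand-ins for the paper's transformation and penalized criterion).

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
profiles = rng.dirichlet(np.ones(6), size=500)  # toy per-gene expression profiles
transformed = np.arcsin(np.sqrt(profiles))      # variance-stabilizing transform

fits = {k: GaussianMixture(n_components=k, covariance_type="full",
                           random_state=0).fit(transformed)
        for k in range(2, 8)}
best_k = min(fits, key=lambda k: fits[k].bic(transformed))
print(best_k, fits[best_k].bic(transformed))    # penalized criterion selects K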
NASA Astrophysics Data System (ADS)
Collins, P. C.; Haden, C. V.; Ghamarian, I.; Hayes, B. J.; Ales, T.; Penso, G.; Dixit, V.; Harlow, G.
2014-07-01
Electron beam direct manufacturing, synonymously known as electron beam additive manufacturing, along with other additive "3-D printing" manufacturing processes, are receiving widespread attention as a means of producing net-shape (or near-net-shape) components, owing to potential manufacturing benefits. Yet, materials scientists know that differences in manufacturing processes often significantly influence the microstructure of even widely accepted materials and, thus, impact the properties and performance of a material in service. It is important to accelerate the understanding of the processing-structure-property relationship of materials being produced via these novel approaches in a framework that considers the performance in a statistically rigorous way. This article describes the development of a process model, the assessment of key microstructural features to be incorporated into a microstructure simulation model, a novel approach to extract a constitutive equation to predict tensile properties in Ti-6Al-4V (Ti-64), and a probabilistic approach to measure the fidelity of the property model against real data. This integrated approach will provide designers a tool to vary process parameters and understand the influence on performance, enabling design and optimization for these highly visible manufacturing approaches.
Gong, Ting; Szustakowski, Joseph D
2013-04-15
For heterogeneous tissues, measurements of gene expression through mRNA-Seq data are confounded by relative proportions of cell types involved. In this note, we introduce an efficient pipeline: DeconRNASeq, an R package for deconvolution of heterogeneous tissues based on mRNA-Seq data. It adopts a globally optimized non-negative decomposition algorithm through quadratic programming for estimating the mixing proportions of distinctive tissue types in next-generation sequencing data. We demonstrated the feasibility and validity of DeconRNASeq across a range of mixing levels and sources using mRNA-Seq data mixed in silico at known concentrations. We validated our computational approach for various benchmark data, with high correlation between our predicted cell proportions and the real fractions of tissues. Our study provides a rigorous, quantitative and high-resolution tool as a prerequisite to use mRNA-Seq data. The modularity of package design allows an easy deployment of custom analytical pipelines for data from other high-throughput platforms. DeconRNASeq is written in R, and is freely available at http://bioconductor.org/packages. Supplementary data are available at Bioinformatics online.
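A minimal stand-in for the package's core computation (DeconRNASeq is an R package; the signature matrix and mixture below are synthetic, and nonnegative least squares with renormalization replaces the package's quadratic program with an explicit sum-to-one constraint).

import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(5)
signatures = rng.gamma(2.0, size=(1000, 3))  # genes x pure tissue-type profiles
true_prop = np.array([0.6, 0.3, 0.1])
mixture = signatures @ true_prop + rng.normal(0, 0.05, 1000)

raw, _ = nnls(signatures, mixture)
proportions = raw / raw.sum()                # enforce sum-to-one after the fact
print(np.round(proportions, 3))              # approx. [0.6, 0.3, 0.1]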
PARAGON: A Systematic, Integrated Approach to Aerosol Observation and Modeling
NASA Technical Reports Server (NTRS)
Diner, David J.; Kahn, Ralph A.; Braverman, Amy J.; Davies, Roger; Martonchik, John V.; Menzies, Robert T.; Ackerman, Thomas P.; Seinfeld, John H.; Anderson, Theodore L.; Charlson, Robert J.;
2004-01-01
Aerosols are generated and transformed by myriad processes operating across many spatial and temporal scales. Evaluation of climate models and their sensitivity to changes, such as in greenhouse gas abundances, requires quantifying natural and anthropogenic aerosol forcings and accounting for other critical factors, such as cloud feedbacks. High accuracy is required to provide sufficient sensitivity to perturbations, separate anthropogenic from natural influences, and develop confidence in inputs used to support policy decisions. Although many relevant data sources exist, the aerosol research community does not currently have the means to combine these diverse inputs into an integrated data set for maximum scientific benefit. Bridging observational gaps, adapting to evolving measurements, and establishing rigorous protocols for evaluating models are necessary, while simultaneously maintaining consistent, well understood accuracies. The Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) concept represents a systematic, integrated approach to global aerosol characterization, bringing together modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies to provide the machinery necessary for achieving a comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the Earth system. We outline a framework for integrating and interpreting observations and models and establishing an accurate, consistent and cohesive long-term data record.
Patel, Tulpesh; Blyth, Jacqueline C; Griffiths, Gareth; Kelly, Deirdre; Talcott, Joel B
2014-01-01
Proton Magnetic Resonance Spectroscopy ((1)H-MRS) is a non-invasive imaging technique that enables quantification of neurochemistry in vivo and thereby facilitates investigation of the biochemical underpinnings of human cognitive variability. Studies in the field of cognitive spectroscopy have commonly focused on relationships between measures of N-acetyl aspartate (NAA), a surrogate marker of neuronal health and function, and broad measures of cognitive performance, such as IQ. In this study, we used (1)H-MRS to interrogate single-voxels in occipitoparietal and frontal cortex, in parallel with assessments of psychometric intelligence, in a sample of 40 healthy adult participants. We found correlations between NAA and IQ that were within the range reported in previous studies. However, the magnitude of these effects was significantly modulated by the stringency of data screening and the extent to which outlying values contributed to statistical analyses. (1)H-MRS offers a sensitive tool for assessing neurochemistry non-invasively, yet the relationships between brain metabolites and broad aspects of human behavior such as IQ are subtle. We highlight the need to develop an increasingly rigorous analytical and interpretive framework for collecting and reporting data obtained from cognitive spectroscopy studies of this kind.
Optical methods in nano-biotechnology
NASA Astrophysics Data System (ADS)
Bruno, Luigi; Gentile, Francesco
2016-01-01
A scientific theory is not a mathematical paradigm. It is a framework that explains natural facts and may predict future observations. A scientific theory may be modified, improved, or rejected. Science is less a collection of theories and more the process that leads us either to deny some hypotheses, maintain or accept broadly held beliefs (or disbeliefs), or create new models that may improve or replace preceding theories. This process cannot be entrusted to common sense, personal experiences or anecdotes (many precepts in physics are indeed counterintuitive), but must rest on rigorous design, observation, and rational, statistical analysis of new experiments. Scientific results are always provisional: scientists rarely proclaim an absolute truth or absolute certainty. Uncertainty is inevitable at the frontiers of knowledge. Notably, this is the definition of the scientific method, and what we have written above echoes the opinion of Marcia McNutt, Editor of Science: 'Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not'. A new discovery, a new theory that explains that discovery, and the scientific method itself need observations and verifications, and are susceptible to falsification.
Naturalness of MSSM dark matter
NASA Astrophysics Data System (ADS)
Cabrera, María Eugenia; Casas, J. Alberto; Delgado, Antonio; Robles, Sandra; de Austri, Roberto Ruiz
2016-08-01
There exists a vast literature examining the electroweak (EW) fine-tuning problem in supersymmetric scenarios, but little concerned with the dark matter (DM) one, which should be combined with the former. In this paper, we study this problem in an, as much as possible, exhaustive and rigorous way. We have considered the MSSM framework, assuming that the LSP is the lightest neutralino, χ_1^0, and exploring the various possibilities for the mass and composition of χ_1^0, as well as different mechanisms for annihilation of the DM particles in the early Universe (well-tempered neutralinos, funnels and co-annihilation scenarios). We also present a discussion about the statistical meaning of the fine-tuning and how it should be computed for the DM abundance, and combined with the EW fine-tuning. The results are very robust and model-independent and favour some scenarios (like the h-funnel when M_{χ_1^0} is not too close to m_h/2) with respect to others (such as the pure wino case). These features should be taken into account when one explores "natural SUSY" scenarios and their possible signatures at the LHC and in DM detection experiments.
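For context, the sensitivity measure conventionally used for EW fine-tuning (the Barbieri-Giudice measure) is

\[
\Delta \;=\; \max_i \left| \frac{\partial \ln M_Z^{2}}{\partial \ln \theta_i} \right|,
\]

where the θ_i are input parameters of the model; the paper's contribution includes a statistical reinterpretation of such measures and their extension to, and combination with, the DM abundance.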
Clogging arches in grains, colloids, and pedestrians flowing through constrictions
NASA Astrophysics Data System (ADS)
Zuriguel, Iker
When a group of particles passes through a narrow orifice, the flow may become intermittent due to the development of clogs that obstruct the constriction. This effect has been observed in many different fields such as mining transport, microbial growth, crowd dynamics, colloids, granular and active matter. In this work we introduce a general framework that encompasses research in several of these scenarios. In particular, we analyze the statistical properties of the bottleneck flow in different experiments and simulations: granular media within vibrated silos, colloids, a flock of sheep and pedestrian evacuations. We reveal a common phenomenology that allows us to rigorously define a transition to a clogged state. Using this definition we explore the main variables involved, which are then grouped into three generic parameters. In addition, we will present results on the geometrical characteristics of the clogging arches, which are related to their stability against perturbations. We experimentally analyse the temporal evolution of the arches, revealing important differences between the structures that are easily destroyed and those that seem to resist forever (longer than the temporal window employed in our measurements). Ministerio de Economía y Competitividad (Spanish Government). Project No. FIS2014-57325.
A global inventory of small floating plastic debris
NASA Astrophysics Data System (ADS)
van Sebille, Erik; Wilcox, Chris; Lebreton, Laurent; Maximenko, Nikolai; Hardesty, Britta Denise; van Franeker, Jan A.; Eriksen, Marcus; Siegel, David; Galgani, Francois; Lavender Law, Kara
2015-12-01
Microplastic debris floating at the ocean surface can harm marine life. Understanding the severity of this harm requires knowledge of plastic abundance and distributions. Dozens of expeditions measuring microplastics have been carried out since the 1970s, but they have primarily focused on the North Atlantic and North Pacific accumulation zones, with much sparser coverage elsewhere. Here, we use the largest dataset of microplastic measurements assembled to date to assess the confidence we can have in global estimates of microplastic abundance and mass. We use a rigorous statistical framework to standardize a global dataset of plastic marine debris measured using surface-trawling plankton nets and coupled this with three different ocean circulation models to spatially interpolate the observations. Our estimates show that the accumulated number of microplastic particles in 2014 ranges from 15 to 51 trillion particles, weighing between 93 and 236 thousand metric tons, which is only approximately 1% of global plastic waste estimated to enter the ocean in the year 2010. These estimates are larger than previous global estimates, but vary widely because of the scarcity of data in most of the world ocean, differences in model formulations, and fundamental knowledge gaps in the sources, transformations and fates of microplastics in the ocean.
The effect of miniaturized body size on skeletal morphology in frogs.
Yeh, Jennifer
2002-03-01
Miniaturization has evolved numerous times and reached impressive extremes in the Anura. I compared the skeletons of miniature frog species to those of closely related larger species to assess patterns of morphological change, sampling 129 species from 12 families. Two types of morphological data were examined: (1) qualitative data on bone presence and absence; and (2) thin-plate spline morphometric descriptions of skull structure and bone shape. Phylogenetic comparative methods were used to address the shared history of species. Miniature anurans were more likely to lose skull bones and phalangeal elements of the limbs. Their skulls also showed consistent differences compared to those of their larger relatives, including relatively larger braincases and sensory capsules, verticalization of lateral elements, rostral displacement of the jaw joint, and reduction of some skull elements. These features are explained by functional constraints and by paedomorphosis. Variation among lineages in the morphological response to miniaturization was also explored. Certain lineages appear to be unusually resistant to the morphological trends that characterize miniature frogs as a whole. This study represents the first large-scale examination of morphology and miniaturization across a major, diverse group of organisms conducted in a phylogenetic framework and with statistical rigor.
NASA Astrophysics Data System (ADS)
Ipsen, Andreas; Ebbels, Timothy M. D.
2014-10-01
In a recent article, we derived a probability distribution that was shown to closely approximate that of the data produced by liquid chromatography time-of-flight mass spectrometry (LC/TOFMS) instruments employing time-to-digital converters (TDCs) as part of their detection system. The approach of formulating detailed and highly accurate mathematical models of LC/MS data via probability distributions that are parameterized by quantities of analytical interest does not appear to have been fully explored before. However, we believe it could lead to a statistically rigorous framework for addressing many of the data analytical problems that arise in LC/MS studies. In this article, we present new procedures for correcting for TDC saturation using such an approach and demonstrate that there is potential for significant improvements in the effective dynamic range of TDC-based mass spectrometers, which could make them much more competitive with the alternative analog-to-digital converters (ADCs). The degree of improvement depends on our ability to generate mass and chromatographic peaks that conform to known mathematical functions and our ability to accurately describe the state of the detector dead time—tasks that may be best addressed through engineering efforts.
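As a generic illustration of the saturation problem (not the authors' correction procedure), the textbook dead-time correction for a counting detector that registers at most one ion per bin per extraction follows from the Poisson assumption: if n of N extractions produced a count in a bin, the estimated true mean arrivals per extraction is -ln(1 - n/N), which diverges as the bin saturates (n approaching N).

import numpy as np

def tdc_corrected_rate(n_counts, n_extractions):
    """Poisson-based saturation correction: -ln(1 - n/N) per bin."""
    p = np.asarray(n_counts, dtype=float) / n_extractions
    return -np.log1p(-p)   # -ln(1 - p), numerically stable for small p

print(tdc_corrected_rate([10, 500, 990], 1000))  # correction grows with saturation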
Wetting of heterogeneous substrates. A classical density-functional-theory approach
NASA Astrophysics Data System (ADS)
Yatsyshin, Peter; Parry, Andrew O.; Rascón, Carlos; Duran-Olivencia, Miguel A.; Kalliadasis, Serafim
2017-11-01
Wetting is a nucleation of a third phase (liquid) on the interface between two different phases (solid and gas). In many experimentally accessible cases of wetting, the interplay between the substrate structure, and the fluid-fluid and fluid-substrate intermolecular interactions leads to the appearance of a whole ``zoo'' of exciting interface phase transitions, associated with the formation of nano-droplets/bubbles, and thin films. Practical applications of wetting at small scales are numerous and include the design of lab-on-a-chip devices and superhydrophobic surfaces. In this talk, we will use a fully microscopic approach to explore the phase space of a planar wall, decorated with patches of different hydrophobicity, and demonstrate the highly non-trivial behaviour of the liquid-gas interface near the substrate. We will present fluid density profiles, adsorption isotherms and wetting phase diagrams. Our analysis is based on a formulation of statistical mechanics, commonly known as classical density-functional theory. It provides a computationally-friendly and rigorous framework, suitable for probing small-scale physics of classical fluids and other soft-matter systems. EPSRC Grants No. EP/L027186,EP/K503733;ERC Advanced Grant No. 247031.
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
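The stochastic-sampling step shared by both methods can be sketched generically; the cross-section means, covariance, and "core model" below are toy placeholders, not values from the benchmark.

import numpy as np

rng = np.random.default_rng(6)
mean_xs = np.array([1.00, 0.30])           # toy cross-section means
cov_xs = np.array([[1e-4, 2e-5],
                   [2e-5, 4e-5]])          # toy covariance matrix

def core_model(xs):                        # stand-in for the core simulator
    return xs[0] / (xs[0] + xs[1])         # a k_eff-like output ratio

samples = rng.multivariate_normal(mean_xs, cov_xs, size=1000)
outputs = np.array([core_model(s) for s in samples])
print(outputs.mean(), outputs.std(ddof=1)) # propagated output uncertainty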
Alarms about structural alerts.
Alves, Vinicius; Muratov, Eugene; Capuzzi, Stephen; Politi, Regina; Low, Yen; Braga, Rodolpho; Zakharov, Alexey V; Sedykh, Alexander; Mokshyna, Elena; Farag, Sherif; Andrade, Carolina; Kuz'min, Victor; Fourches, Denis; Tropsha, Alexander
2016-08-21
Structural alerts are widely accepted in chemical toxicology and regulatory decision support as a simple and transparent means to flag potential chemical hazards or group compounds into categories for read-across. However, there has been a growing concern that alerts disproportionally flag too many chemicals as toxic, which questions their reliability as toxicity markers. Conversely, the rigorously developed and properly validated statistical QSAR models can accurately and reliably predict the toxicity of a chemical; however, their use in regulatory toxicology has been hampered by the lack of transparency and interpretability. We demonstrate that contrary to the common perception of QSAR models as "black boxes" they can be used to identify statistically significant chemical substructures (QSAR-based alerts) that influence toxicity. We show through several case studies, however, that the mere presence of structural alerts in a chemical, irrespective of the derivation method (expert-based or QSAR-based), should be perceived only as hypotheses of possible toxicological effect. We propose a new approach that synergistically integrates structural alerts and rigorously validated QSAR models for a more transparent and accurate safety assessment of new chemicals.
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
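One element discussed above, propagating calibration-curve uncertainty into a measured value, can be sketched as follows; a linear calibration with replicated standards is assumed, and all numbers are illustrative rather than from the paper.

import numpy as np

rng = np.random.default_rng(7)
x_std = np.repeat(np.linspace(0, 10, 6), 5)        # replicated calibration standards
y_obs = 2.0 * x_std + 1.0 + rng.normal(0, 0.1, x_std.size)

coef, cov = np.polyfit(x_std, y_obs, 1, cov=True)  # slope/intercept + fit covariance
x_new = 4.2
J = np.array([x_new, 1.0])                         # Jacobian of y w.r.t. (slope, intercept)
y_pred = np.polyval(coef, x_new)
u_cal = np.sqrt(J @ cov @ J)                       # calibration-curve uncertainty
print(y_pred, u_cal)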
ERIC Educational Resources Information Center
Thompson, Bruce
Web-based statistical instruction, like all statistical instruction, ought to focus on teaching the essence of the research endeavor: the exercise of reflective judgment. Using the framework of the recent report of the American Psychological Association (APA) Task Force on Statistical Inference (Wilkinson and the APA Task Force on Statistical…
NASA Astrophysics Data System (ADS)
Schunk, R. W.; Barakat, A. R.; Eccles, V.; Karimabadi, H.; Omelchenko, Y.; Khazanov, G. V.; Glocer, A.; Kistler, L. M.
2014-12-01
A Kinetic Framework for the Magnetosphere-Ionosphere-Plasmasphere-Polar Wind System is being developed in order to provide a rigorous approach to modeling the interactions of hot and cold particles. The framework will include ion and electron kinetic species in the ionosphere, plasmasphere and polar wind, and kinetic ion, super-thermal electron and fluid electron species in the magnetosphere. The framework is ideally suited to modeling ion outflow from the ionosphere and plasmasphere, where a wide range of fluid and kinetic processes are important. These include escaping ion interactions with (1) photoelectrons, (2) cusp/auroral waves, double layers, and field-aligned currents, (3) double layers in the polar cap due to the interaction of cold ionospheric and hot magnetospheric electrons, (4) counter-streaming ions, and (5) electromagnetic wave turbulence. The kinetic ion interactions are particularly strong during geomagnetic storms and substorms. The presentation will provide a brief description of the models involved and discuss the effect that kinetic processes have on the ion outflow.
Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.
Gomez, Christophe; Hartung, Niklas
2018-01-01
Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
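A minimal sketch of framework (1) above: metastatic emission times as a homogeneous Poisson process, simulated via exponential inter-arrival times. The rate and horizon are illustrative; the chapter also treats secondary emission and the deterministic size-structured framework, not shown here.

import numpy as np

def emission_times(rate, horizon, rng):
    """Event times of a homogeneous Poisson process on [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate)   # exponential inter-arrival times
        if t > horizon:
            return np.array(times)
        times.append(t)

rng = np.random.default_rng(8)
print(emission_times(rate=0.5, horizon=10.0, rng=rng))  # ~5 events expected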
Theory and applications of structured light single pixel imaging
NASA Astrophysics Data System (ADS)
Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.
2018-02-01
Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, as well as provide a foundation for developing more-advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and a decrease in acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
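A generic single-pixel measurement model, not the paper's frame-theoretic estimator: each reading is the inner product of the scene with a structured-light pattern, and with enough patterns least squares recovers the image. Sizes and patterns below are toy choices.

import numpy as np

rng = np.random.default_rng(9)
n = 16 * 16                                   # 16x16 toy scene, flattened
x = rng.random(n)
Phi = rng.choice([0.0, 1.0], size=(2 * n, n)) # random binary illumination patterns
y = Phi @ x + rng.normal(0, 0.01, 2 * n)      # noisy single-pixel readings

x_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(np.max(np.abs(x_hat - x)))              # small reconstruction error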
Algebraic signal processing theory: 2-D spatial hexagonal lattice.
Püschel, Markus; Rötteler, Martin
2007-06-01
We develop the framework for signal processing on a spatial, or undirected, 2-D hexagonal lattice for both an infinite and a finite array of signal samples. This framework includes the proper notions of z-transform, boundary conditions, filtering or convolution, spectrum, frequency response, and Fourier transform. In the finite case, the Fourier transform is called discrete triangle transform. Like the hexagonal lattice, this transform is nonseparable. The derivation of the framework makes it a natural extension of the algebraic signal processing theory that we recently introduced. Namely, we construct the proper signal models, given by polynomial algebras, bottom-up from a suitable definition of hexagonal space shifts using a procedure provided by the algebraic theory. These signal models, in turn, then provide all the basic signal processing concepts. The framework developed in this paper is related to Mersereau's early work on hexagonal lattices in the same way as the discrete cosine and sine transforms are related to the discrete Fourier transform, a fact that will be made rigorous in this paper.
Formally Verified Practical Algorithms for Recovery from Loss of Separation
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Munoz, Cesar A.
2009-01-01
In this paper, we develop and formally verify practical algorithms for recovery from loss of separation. The formal verification is performed in the context of a criteria-based framework. This framework provides rigorous definitions of horizontal and vertical maneuver correctness that guarantee divergence and achieve horizontal and vertical separation. The algorithms are shown to be independently correct, that is, separation is achieved when only one aircraft maneuvers, and implicitly coordinated, that is, separation is also achieved when both aircraft maneuver. In this paper we improve the horizontal criteria over our previous work. An important benefit of the criteria approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).
Tracing the foundations of a conceptual framework for a patient safety ontology.
Runciman, William B; Baker, G Ross; Michel, Philippe; Dovey, Susan; Lilford, Richard J; Jensen, Natasja; Flin, Rhona; Weeks, William B; Lewalle, Pierre; Larizgoitia, Itziar; Bates, David
2010-12-01
In work for the World Alliance for Patient Safety on research methods and measures and on defining key concepts for an International Classification for Patient Safety (ICPS), it became apparent that there was a need to try to understand how the meaning of patient safety and underlying concepts relate to the existing safety and quality frameworks commonly used in healthcare. To unfold the concept of patient safety and how it relates to safety and quality frameworks commonly used in healthcare and to trace the evolution of the ICPS framework as a basis of the electronic capture of the component elements of patient safety. The ICPS conceptual framework for patient safety has its origins in existing frameworks and an international consultation process. Although its 10 classes and their semantic relationships may be used as a reference model for different disciplines, it must remain dynamic in the ever-changing world of healthcare. By expanding the ICPS by examining data from all available sources, and ensuring rigorous compliance with the latest principles of informatics, a deeper interdisciplinary approach will progressively be developed to address the complex, refractory problem of reducing healthcare-associated harm.
Building clinical networks: a developmental evaluation framework.
Carswell, Peter; Manning, Benjamin; Long, Janet; Braithwaite, Jeffrey
2014-05-01
Clinical networks have been designed as a cross-organisational mechanism to plan and deliver health services. With recent concerns about the effectiveness of these structures, it is timely to consider an evidence-informed approach for how they can be developed and evaluated. To document an evaluation framework for clinical networks by drawing on the network evaluation literature and a 5-year study of clinical networks. We searched literature in three domains: network evaluation, factors that aid or inhibit network development, and on robust methods to measure network characteristics. This material was used to build a framework required for effective developmental evaluation. The framework's architecture identifies three stages of clinical network development; partner selection, network design and network management. Within each stage is evidence about factors that act as facilitators and barriers to network growth. These factors can be used to measure progress via appropriate methods and tools. The framework can provide for network growth and support informed decisions about progress. For the first time in one place a framework incorporating rigorous methods and tools can identify factors known to affect the development of clinical networks. The target user group is internal stakeholders who need to conduct developmental evaluation to inform key decisions along their network's developmental pathway.
Understanding and Evolving the ML Module System
2005-05-01
The ML module system stands as a high-water mark of programming language support for data abstraction. Nevertheless, it is not in a... language of part (3) using the framework of Harper and Stone, in which the meanings of "external" ML programs are interpreted by translation into an... researcher has been influenced to a large degree by their rigorous approach to programming language research and their profound sense of aesthetics.
Tetraneutron: Rigorous continuum calculation
NASA Astrophysics Data System (ADS)
Deltuva, A.
2018-07-01
The four-neutron system is studied using exact continuum equations for transition operators and solving them in the momentum-space framework. A resonant behavior is found for strongly enhanced interaction but not at the physical strength, indicating the absence of an observable tetraneutron resonance, in contrast to a number of earlier works. As the transition operators acquire large values at low energies, it is conjectured that this behavior may explain peaks in many-body reactions even without a resonance.
Peer Review Documents Related to the Evaluation of ...
BMDS is one of the Agency's premier tools for quantitative risk assessment; the validity and reliability of its statistical models are therefore of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS application and its models as they were developed and eventually released, documenting the rigorous review process undertaken to provide the best available science tools for statistical modeling.
A criterion for establishing life limits. [for Space Shuttle Main Engine service
NASA Technical Reports Server (NTRS)
Skopp, G. H.; Porter, A. A.
1990-01-01
The development of a rigorous statistical method that would utilize hardware-demonstrated reliability to evaluate hardware capability and provide ground rules for safe flight margin is discussed. A statistical-based method using the Weibull/Weibayes cumulative distribution function is described. Its advantages and inadequacies are pointed out. Another, more advanced procedure, Single Flight Reliability (SFR), determines a life limit which ensures that the reliability of any single flight is never less than a stipulated value at a stipulated confidence level. Application of the SFR method is illustrated.
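A sketch of the two reliability quantities discussed (the Weibull parameters and per-flight floor below are hypothetical, and the published method additionally works at a stipulated confidence level from hardware-demonstrated data, which this sketch omits): under a Weibull life model the reliability of surviving to time t is R(t) = exp(-(t/eta)^beta), and the single-flight reliability of flight k is the conditional probability R(k)/R(k-1).

import numpy as np

beta, eta = 2.0, 500.0                 # hypothetical Weibull shape and scale (flights)

def R(t):
    # Weibull survival (reliability) function.
    return np.exp(-(t / eta) ** beta)

def sfr_life_limit(floor):
    # Largest flight count k such that every single-flight
    # conditional reliability R(k)/R(k-1) stays above the floor.
    k = 1
    while R(k) / R(k - 1) >= floor:
        k += 1
    return k - 1

print(sfr_life_limit(0.999))           # life limit at a 0.999 per-flight floor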
Statistical label fusion with hierarchical performance models
Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.
2014-01-01
Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
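The flavor of statistical fusion can be seen in a toy per-voxel sketch (flat rater performance models; the paper's contribution is precisely to make such confusion matrices hierarchical over anatomically related labels):

import numpy as np

def fuse(votes, thetas, prior):
    # Per-voxel posterior over true labels.
    # votes:  (R,) observed label from each of R raters/atlases
    # thetas: (R, L, L) confusion matrices P(observed | true)
    # prior:  (L,) prior over true labels
    post = prior.copy()
    for r, v in enumerate(votes):
        post *= thetas[r][:, v]        # likelihood of rater r's vote under each truth
    return post / post.sum()

L = 3
good = 0.8 * np.eye(L) + 0.1 * (1 - np.eye(L))   # reliable rater
noisy = np.full((L, L), 1.0 / L)                 # uninformative rater
post = fuse(np.array([0, 0, 2]), np.stack([good, good, noisy]), np.ones(L) / L)
print(post)                            # posterior mass concentrates on label 0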
NASA Astrophysics Data System (ADS)
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
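The separation of uncertainty from variability mentioned above is commonly implemented as a two-dimensional Monte Carlo; a generic sketch follows (the distributions and parameters are made up for illustration, not the study's data): the outer loop samples epistemic parameters, the inner loop samples the variable population, so the spread across outer draws measures uncertainty while the spread within each inner set measures variability.

import numpy as np

rng = np.random.default_rng(1)
n_outer, n_inner = 200, 500            # uncertainty draws x variability draws

results = np.empty((n_outer, n_inner))
for i in range(n_outer):
    ef = rng.normal(1.0, 0.1)                    # uncertain emission factor (epistemic)
    plants = rng.lognormal(0.0, 0.5, n_inner)    # plant-to-plant variability
    results[i] = ef * plants                     # impact per functional unit

print(results.std(axis=1).mean())      # average within-loop spread: variability
print(results.mean(axis=1).std())      # spread of population means: uncertainty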
ERIC Educational Resources Information Center
Nitko, Anthony J.; Hsu, Tse-chi
Item analysis procedures appropriate for domain-referenced classroom testing are described. A conceptual framework within which item statistics can be considered and promising statistics in light of this framework are presented. The sampling fluctuations of the more promising item statistics for sample sizes comparable to the typical classroom…
Systematic review of the quality of prognosis studies in systemic lupus erythematosus.
Lim, Lily S H; Lee, Senq J; Feldman, Brian M; Gladman, Dafna D; Pullenayegum, Eleanor; Uleryk, Elizabeth; Silverman, Earl D
2014-10-01
Prognosis studies examine outcomes and/or seek to identify predictors or factors associated with outcomes. Many prognostic factors have been identified in systemic lupus erythematosus (SLE), but few have been consistently found across studies. We hypothesized that this is due to a lack of rigor of study designs. This study aimed to systematically assess the methodologic quality of prognosis studies in SLE. A search of prognosis studies in SLE was performed using MEDLINE and Embase, from January 1990 to June 2011. A representative sample of 150 articles was selected using a random number generator and assessed by 2 reviewers. Each study was assessed by a risk of bias tool according to 6 domains: study participation, study attrition, measurement of prognostic factors, measurement of outcomes, measurement/adjustment for confounders, and appropriateness of statistical analysis. Information about missing data was also collected. A cohort design was used in 71% of studies. High risk of bias was found in 65% of studies for confounders, 57% for study participation, 56% for attrition, 36% for statistical analyses, 20% for prognostic factors, and 18% for outcome. Missing covariate or outcome information was present in half of the studies. Only 6 studies discussed reasons for missing data and 2 imputed missing data. Lack of rigorous study design, especially in addressing confounding, study participation and attrition, and inadequately handled missing data, has limited the quality of prognosis studies in SLE. Future prognosis studies should be designed with consideration of these factors to improve methodologic rigor. Copyright © 2014 by the American College of Rheumatology.
Establishing Interventions via a Theory-Driven Single Case Design Research Cycle
ERIC Educational Resources Information Center
Kilgus, Stephen P.; Riley-Tillman, T. Chris; Kratochwill, Thomas R.
2016-01-01
Recent studies have suggested single case design (SCD) intervention research is subject to publication bias, wherein studies are more likely to be published if they possess large or statistically significant effects and use rigorous experimental methods. The nature of SCD and the purposes for which it might be used could suggest that large effects…
ERIC Educational Resources Information Center
Johnson, Donald M.; Shoulders, Catherine W.
2017-01-01
As members of a profession committed to the dissemination of rigorous research pertaining to agricultural education, authors publishing in the Journal of Agricultural Education (JAE) must seek methods to evaluate and, when necessary, improve their research methods. The purpose of this study was to describe how authors of manuscripts published in…
Sean Healey; Warren Cohen; Gretchen Moisen
2007-01-01
The need for current information about the effects of fires, harvest, and storms is evident in many areas of sustainable forest management. While there are several potential sources of this information, each source has its limitations. Generally speaking, the statistical rigor associated with traditional forest sampling is an important asset in any monitoring effort....
Preschool Center Care Quality Effects on Academic Achievement: An Instrumental Variables Analysis
ERIC Educational Resources Information Center
Auger, Anamarie; Farkas, George; Burchinal, Margaret R.; Duncan, Greg J.; Vandell, Deborah Lowe
2014-01-01
Much of child care research has focused on the effects of the quality of care in early childhood settings on children's school readiness skills. Although researchers increased the statistical rigor of their approaches over the past 15 years, researchers' ability to draw causal inferences has been limited because the studies are based on…
Statistical linearization for multi-input/multi-output nonlinearities
NASA Technical Reports Server (NTRS)
Lin, Ching-An; Cheng, Victor H. L.
1991-01-01
Formulas are derived for the computation of the random input-describing functions for MIMO nonlinearities; these straightforward and rigorous derivations are based on the optimal mean square linear approximation. The computations involve evaluations of multiple integrals. It is shown that, for certain classes of nonlinearities, multiple-integral evaluations are obviated and the computations are significantly simplified.
Slow off the Mark: Elementary School Teachers and the Crisis in STEM Education
ERIC Educational Resources Information Center
Epstein, Diana; Miller, Raegen T.
2011-01-01
Prospective teachers can typically obtain a license to teach elementary school without taking a rigorous college-level STEM class such as calculus, statistics, or chemistry, and without demonstrating a solid grasp of mathematics knowledge, scientific knowledge, or the nature of scientific inquiry. This is not a recipe for ensuring students have…
ERIC Educational Resources Information Center
Scrutton, Roger; Beames, Simon
2015-01-01
Outdoor adventure education (OAE) has a long history of being credited with the personal and social development (PSD) of its participants. PSD is notoriously difficult to measure quantitatively, yet stakeholders demand statistical evidence that given approaches to eliciting PSD are effective in their methods. Rightly or wrongly, many stakeholders…
Normalization, bias correction, and peak calling for ChIP-seq
Diaz, Aaron; Park, Kiyoub; Lim, Daniel A.; Song, Jun S.
2012-01-01
Next-generation sequencing is rapidly transforming our ability to profile the transcriptional, genetic, and epigenetic states of a cell. In particular, sequencing DNA from the immunoprecipitation of protein-DNA complexes (ChIP-seq) and methylated DNA (MeDIP-seq) can reveal the locations of protein binding sites and epigenetic modifications. These approaches contain numerous biases which may significantly influence the interpretation of the resulting data. Rigorous computational methods for detecting and removing such biases are still lacking. Also, multi-sample normalization still remains an important open problem. This theoretical paper systematically characterizes the biases and properties of ChIP-seq data by comparing 62 separate publicly available datasets, using rigorous statistical models and signal processing techniques. Statistical methods for separating ChIP-seq signal from background noise, as well as correcting enrichment test statistics for sequence-dependent and sonication biases, are presented. Our method effectively separates reads into signal and background components prior to normalization, improving the signal-to-noise ratio. Moreover, most peak callers currently use a generic null model which suffers from low specificity at the sensitivity level requisite for detecting subtle, but true, ChIP enrichment. The proposed method of determining a cell type-specific null model, which accounts for cell type-specific biases, is shown to be capable of achieving a lower false discovery rate at a given significance threshold than current methods. PMID:22499706
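The enrichment test at the heart of most peak callers can be sketched as a scaled Poisson comparison of ChIP and input counts in a candidate window (a generic null, i.e., exactly the kind of model the paper refines into a cell-type-specific one; the counts below are invented):

import numpy as np
from scipy.stats import poisson

def window_pvalue(chip_count, input_count, chip_total, input_total):
    # One-sided Poisson test of ChIP enrichment over library-size-scaled input.
    expected = input_count * chip_total / input_total
    return poisson.sf(chip_count - 1, expected)      # P(X >= chip_count)

print(window_pvalue(chip_count=58, input_count=20,
                    chip_total=2e7, input_total=2e7))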
Spatial Statistical Data Fusion (SSDF)
NASA Technical Reports Server (NTRS)
Braverman, Amy J.; Nguyen, Hai M.; Cressie, Noel
2013-01-01
As remote sensing for scientific purposes has transitioned from an experimental technology to an operational one, the selection of instruments has become more coordinated, so that the scientific community can exploit complementary measurements. However, technological and scientific heterogeneity across devices means that the statistical characteristics of the data they collect are different. The challenge addressed here is how to combine heterogeneous remote sensing data sets in a way that yields optimal statistical estimates of the underlying geophysical field, and provides rigorous uncertainty measures for those estimates. Different remote sensing data sets may have different spatial resolutions, different measurement error biases and variances, and other disparate characteristics. A state-of-the-art spatial statistical model was used to relate the true, but not directly observed, geophysical field to noisy, spatial aggregates observed by remote sensing instruments. The spatial covariances of the true field and the covariances of the true field with the observations were modeled. The observations are spatial averages of the true field values, over pixels, with different measurement noise superimposed. A kriging framework is used to infer optimal (minimum mean squared error and unbiased) estimates of the true field at point locations from pixel-level, noisy observations. A key feature of the spatial statistical model is the spatial mixed effects model that underlies it. The approach models the spatial covariance function of the underlying field using linear combinations of basis functions of fixed size. Approaches based on kriging require the inversion of very large spatial covariance matrices, and this is usually done by making simplifying assumptions about spatial covariance structure that simply do not hold for geophysical variables. In contrast, this method does not require these assumptions, and is also computationally much faster. This method is fundamentally different from other approaches to data fusion for remote sensing data because it is inferential rather than merely descriptive. All approaches combine data in a way that minimizes some specified loss function. Most of these are more or less ad hoc criteria based on what looks good to the eye, or some criteria that relate only to the data at hand.
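The computational point about fixed-size basis functions can be sketched in a few lines (generic fixed-rank kriging on a 1-D toy field with made-up Gaussian bases; the operational SSDF model is far richer): with r basis functions the data covariance is sig2*I + S K S^T, and the Woodbury identity reduces its inversion from O(n^3) to O(r^3).

import numpy as np

rng = np.random.default_rng(2)
n, r = 2000, 25                        # observations, basis functions
s = rng.uniform(0, 1, n)               # observation locations
centers = np.linspace(0, 1, r)
S = np.exp(-0.5 * ((s[:, None] - centers) / 0.05) ** 2)  # basis matrix (n x r)

K = np.eye(r)                          # basis-coefficient covariance (assumed)
sig2 = 0.1                             # measurement-noise variance
y = S @ rng.multivariate_normal(np.zeros(r), K) + rng.normal(0, sig2 ** 0.5, n)

# Woodbury: Sigma^{-1} y with Sigma = sig2 I + S K S^T, never forming n x n
M = np.linalg.inv(np.linalg.inv(K) + S.T @ S / sig2)     # only an r x r inverse
w = y / sig2 - S @ (M @ (S.T @ y)) / sig2 ** 2
y_hat = S @ (K @ (S.T @ w))            # kriging estimate of the smooth field
print(np.corrcoef(y_hat, y)[0, 1])     # estimate tracks the noisy data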
NASA Astrophysics Data System (ADS)
Cioaca, Alexandru
A deep scientific understanding of complex physical systems, such as the atmosphere, can be achieved neither by direct measurements nor by numerical simulations alone. Data assimilation is a rigorous procedure to fuse information from a priori knowledge of the system state, the physical laws governing the evolution of the system, and real measurements, all with associated error statistics. Data assimilation produces best (a posteriori) estimates of model states and parameter values, and results in considerably improved computer simulations. The acquisition and use of observations in data assimilation raises several important scientific questions related to optimal sensor network design, quantification of data impact, pruning redundant data, and identifying the most beneficial additional observations. These questions originate in operational data assimilation practice, and have started to attract considerable interest in the recent past. This dissertation advances the state of knowledge in four dimensional variational (4D-Var) data assimilation by developing, implementing, and validating a novel computational framework for estimating observation impact and for optimizing sensor networks. The framework builds on the powerful methodologies of second-order adjoint modeling and the 4D-Var sensitivity equations. Efficient computational approaches for quantifying the observation impact include matrix free linear algebra algorithms and low-rank approximations of the sensitivities to observations. The sensor network configuration problem is formulated as a meta-optimization problem. Best values for parameters such as sensor location are obtained by optimizing a performance criterion, subject to the constraint posed by the 4D-Var optimization. Tractable computational solutions to this "optimization-constrained" optimization problem are provided. The results of this work can be directly applied to the deployment of intelligent sensors and adaptive observations, as well as to reducing the operating costs of measuring networks, while preserving their ability to capture the essential features of the system under consideration.
NASA Astrophysics Data System (ADS)
Dufaux, Frederic
2011-06-01
The issue of privacy in video surveillance has drawn a lot of interest lately. However, thorough performance analysis and validation is still lacking, especially regarding the fulfillment of privacy-related requirements. In this paper, we first review recent Privacy Enabling Technologies (PET). Next, we discuss pertinent evaluation criteria for effective privacy protection. We then put forward a framework to assess the capacity of PET solutions to hide distinguishing facial information and to conceal identity. We conduct comprehensive and rigorous experiments to evaluate the performance of face recognition algorithms applied to images altered by PET. Results show the ineffectiveness of naïve PET such as pixelization and blur. Conversely, they demonstrate the effectiveness of more sophisticated scrambling techniques to foil face recognition.
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
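For reference, the linear core of the shape and color standardization being generalized is the Tripp relation (the coefficient values below are typical of the literature, not UNITY's fitted posteriors):

# Hypothetical standardization coefficients (illustrative only)
M0, alpha, beta = -19.3, 0.14, 3.1

def distance_modulus(m_B, x1, c):
    # Linear Tripp standardization: brighter-slower (x1) and
    # brighter-bluer (c) corrections to the apparent peak magnitude m_B.
    return m_B - M0 + alpha * x1 - beta * c

print(distance_modulus(m_B=24.1, x1=0.5, c=-0.05))

UNITY replaces the fixed linear alpha and beta with nonlinear relations and embeds them, together with outlier and selection-effect models, in a single Bayesian posterior.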
Status and Future of the Naval R&D Establishment
2010-09-23
[Figure residue: a "Framework for Assessment" quadrant chart relating customer base (US Government, other US military, US market, global allies) to cost, with guidance such as "requires new mechanisms to handle", "use sparingly", and "prioritize rigorously".]
Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells
2015-01-15
serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and... services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool... answer this question more rigorously, we conducted Pearson Correlation Analysis to test the dependency between the number of issues a file involves
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, Jeffrey R.
The design and characterization of new materials for hydrogen storage is an important area of research, as the ability to store hydrogen at lower pressures and higher temperatures than currently feasible would lower operating costs for small hydrogen fuel cell vehicles. In particular, metal-organic frameworks (MOFs) represent promising materials for use in storing hydrogen in this capacity. MOFs are highly porous, three-dimensional crystalline solids that are formed via linkages between metal ions (e.g., iron, nickel, and zinc) and organic molecules. MOFs can store hydrogen via strong adsorptive interactions between the gas molecules and the pores of the framework, providing a high surface area for gas adsorption and thus the opportunity to store hydrogen at significantly lower pressures than with current technologies. By lowering the energy required for hydrogen storage, these materials hold promise in rendering hydrogen a more viable fuel for motor vehicles, which is a highly desirable outcome given the clean nature of hydrogen fuel cells (water is the only byproduct of combustion) and the current state of global climate change resulting from the combustion of fossil fuels. The work presented in this report is the result of collaborative efforts between researchers at Lawrence Berkeley National Lab (LBNL), the National Institute of Standards and Technology (NIST), and General Motors Corporation (GM) to discover novel MOFs promising for H2 storage and characterize their properties. Described herein are several new framework systems with improved gravimetric and volumetric capacity to strongly bind H2 at temperatures relevant for vehicle storage. These materials were rigorously characterized using neutron diffraction, to determine the precise binding locations of hydrogen within the frameworks, and high-pressure H2 adsorption measurements, to provide a comprehensive picture of H2 adsorption at all relevant pressures. A rigorous understanding of experimental findings was further achieved via first-principles electronic structure calculations, which also supported synthetic efforts through predictions of additional novel frameworks with promising properties for vehicular H2 storage. The results of the computational efforts also helped to elucidate the fundamental principles governing the interaction of H2 with the frameworks, and in particular with exposed metal sites in the pores of these materials. Significant accomplishments from this project include the discovery of a metal-organic framework with a high H2 binding enthalpy and volumetric capacity at 25 °C and 100 bar, which surpasses the metrics of any other known metal-organic framework. Additionally, this material was designed to be extremely cost effective compared to most comparable adsorbents, which is imperative for eventual real-world applications. Progress toward synthesizing new frameworks containing multiple open coordination sites is also discussed, and appears to be the most promising future direction for hydrogen storage in these porous materials.
Increasing rigor in NMR-based metabolomics through validated and open source tools
Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L
2016-01-01
The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism’s phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic spectroscopy (NMR), mass spectrometry (MS), and the published literature, as processed by statistical approaches, are driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. PMID:27643760
Increasing rigor in NMR-based metabolomics through validated and open source tools.
Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L
2017-02-01
The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic spectroscopy (NMR), mass spectrometry (MS), and the published literature, as processed by statistical approaches, are driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.
Dynamic State Estimation of Terrestrial and Solar Plasmas
NASA Astrophysics Data System (ADS)
Kamalabadi, Farzad
A pervasive problem in virtually all branches of space science is the estimation of multi-dimensional state parameters of a dynamical system from a collection of indirect, often incomplete, and imprecise measurements. Subsequent scientific inference is predicated on rigorous analysis, interpretation, and understanding of physical observations and on the reliability of the associated quantitative statistical bounds and performance characteristics of the algorithms used. In this work, we focus on these dynamic state estimation problems and illustrate their importance in the context of two timely activities in space remote sensing. First, we discuss the estimation of multi-dimensional ionospheric state parameters from UV spectral imaging measurements anticipated to be acquired by the recently selected NASA Heliophysics mission, the Ionospheric Connection Explorer (ICON). Next, we illustrate that similar state-space formulations provide the means for the estimation of 3D, time-dependent densities and temperatures in the solar corona from a series of white-light and EUV measurements. We demonstrate that, while a general framework for the stochastic formulation of the state estimation problem is suited for systematic inference of the parameters of a hidden Markov process, several challenges must be addressed in the assimilation of an increasing volume and diversity of space observations. These challenges are: (1) the computational tractability when faced with voluminous and multimodal data, (2) the inherent limitations of the underlying models which assume, often incorrectly, linear dynamics and Gaussian noise, and (3) the unavailability or inaccuracy of transition probabilities and noise statistics. We argue that pursuing answers to these questions necessitates cross-disciplinary research that enables progress toward systematically reconciling observational and theoretical understanding of the space environment.
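The linear-Gaussian baseline that the abstract identifies as a limiting assumption is the Kalman filter; a minimal generic sketch (toy constant-velocity dynamics, not an ionospheric or coronal model):

import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    # One predict/update cycle for x' = F x + noise(Q), z = H x + noise(R).
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R                   # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

x, P = np.zeros(2), np.eye(2)
F = np.array([[1.0, 1.0], [0.0, 1.0]])         # position-velocity dynamics
H = np.array([[1.0, 0.0]])                     # observe position only
x, P = kalman_step(x, P, np.array([1.2]), F, H, 0.01 * np.eye(2), np.array([[0.5]]))
print(x)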
CORSSA: The Community Online Resource for Statistical Seismicity Analysis
Michael, Andrew J.; Wiemer, Stefan
2010-01-01
Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.
NASA Astrophysics Data System (ADS)
Militello, F.; Farley, T.; Mukhi, K.; Walkden, N.; Omotani, J. T.
2018-05-01
A statistical framework was introduced in Militello and Omotani [Nucl. Fusion 56, 104004 (2016)] to correlate the dynamics and statistics of L-mode and inter-ELM plasma filaments with the radial profiles of thermodynamic quantities they generate in the Scrape Off Layer. This paper extends the framework to cases in which the filaments are emitted from the separatrix at different toroidal positions and with a finite toroidal velocity. It is found that the toroidal velocity does not affect the profiles, while the toroidal distribution of filament emission renormalises the waiting time between two events. Experimental data collected by visual camera imaging are used to evaluate the statistics of the fluctuations, to inform the choice of the probability distribution functions used in the application of the framework. It is found that the toroidal separation of the filaments is exponentially distributed, thus suggesting the lack of a toroidal modal structure. Finally, using these measurements, the framework is applied to an experimental case and good agreement is found.
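The exponential-separation finding can be illustrated with a simple goodness-of-fit check (synthetic separations stand in for the camera data; a toroidal modal structure would instead produce a peaked, non-exponential distribution):

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
seps = rng.exponential(0.4, 2000)     # toy toroidal separations between filaments
scale = seps.mean()                   # MLE of the exponential scale
print(stats.kstest(seps, 'expon', args=(0, scale)).pvalue)   # large if consistent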
A Comparison of Alternate Approaches to Creating Indices of Academic Rigor. Research Report 2012-11
ERIC Educational Resources Information Center
Beatty, Adam S.; Sackett, Paul R.; Kuncel, Nathan R.; Kiger, Thomas B.; Rigdon, Jana L.; Shen, Winny; Walmsley, Philip T.
2013-01-01
In recent decades, there has been an increasing emphasis placed on college graduation rates and reducing attrition due to the social and economic benefits, at both the individual and national levels, proposed to accrue from a more highly educated population (Bureau of Labor Statistics, 2011). In the United States in particular, there is a concern…
Comparing the Rigor of Compressed Format Courses to Their Regular Semester Counterparts
ERIC Educational Resources Information Center
Lutes, Lyndell; Davies, Randall
2013-01-01
This study compared workloads of undergraduate courses taught in 16-week and 8-week sessions. A statistically significant difference in workload was found between the two. Based on survey data from approximately 29,000 students, on average students spent about 17 minutes more per credit per week on 16-week courses than on similar 8-week courses.…
Statistical tests and measures for the presence and influence of digit preference
Jay Beaman; Grenier Michel
1998-01-01
Digit preference, which is really a preference for certain numbers, has often been described as the heaping or rounding of responses to numbers ending in zero or five. Number preference, NP, has been a topic in the social science literature for some years. However, until recently concepts were not specified rigorously enough to allow, for example, the estimation of...
A new assessment of the alleged link between element 115 and element 117 decay chains
NASA Astrophysics Data System (ADS)
Forsberg, U.; Rudolph, D.; Fahlander, C.; Golubev, P.; Sarmiento, L. G.; Åberg, S.; Block, M.; Düllmann, Ch. E.; Heßberger, F. P.; Kratz, J. V.; Yakushev, A.
2016-09-01
A novel rigorous statistical treatment is applied to available data (May 9, 2016) from search and spectroscopy experiments on the elements with atomic numbers Z = 115 and Z = 117. The present analysis implies that the hitherto proposed cross-reaction link between α-decay chains associated with the isotopes ²⁹³117 and ²⁸⁹115 is highly improbable.
ERIC Educational Resources Information Center
Arbaugh, J. B.; Hwang, Alvin
2013-01-01
Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…
Statistical rigor in LiDAR-assisted estimation of aboveground forest biomass
Timothy G. Gregoire; Erik Næsset; Ronald E. McRoberts; Göran Ståhl; Hans Andersen; Terje Gobakken; Liviu Ene; Ross Nelson
2016-01-01
For many decades remotely sensed data have been used as a source of auxiliary information when conducting regional or national surveys of forest resources. In the past decade, airborne scanning LiDAR (Light Detection and Ranging) has emerged as a promising tool for sample surveys aimed at improving estimation of aboveground forest biomass. This technology is now...
ERIC Educational Resources Information Center
Stoneberg, Bert D.
2015-01-01
The National Center of Education Statistics conducted a mapping study that equated the percentage proficient or above on each state's NCLB reading and mathematics tests in grades 4 and 8 to the NAEP scale. Each "NAEP equivalent score" was labeled according to NAEP's achievement levels and used to compare state proficiency standards and…
Code of Federal Regulations, 2012 CFR
2012-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude.... .1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2013 CFR
2013-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude.... .1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2014 CFR
2014-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... fluids. 1.4 This method has been designed to show positive contamination for 5% of representative crude.... .1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2011 CFR
2011-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0 Pollution Prevention 14... oil contamination in drilling fluids. 1.4 This method has been designed to show positive contamination.... .1 Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Bayesian Inference: with ecological applications
Link, William A.; Barker, Richard J.
2010-01-01
This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies, as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
Interoperable Data Sharing for Diverse Scientific Disciplines
NASA Astrophysics Data System (ADS)
Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean
2016-04-01
For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base is translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.
Mascia, Michael B; Fox, Helen E; Glew, Louise; Ahmadia, Gabby N; Agrawal, Arun; Barnes, Megan; Basurto, Xavier; Craigie, Ian; Darling, Emily; Geldmann, Jonas; Gill, David; Holst Rice, Susie; Jensen, Olaf P; Lester, Sarah E; McConney, Patrick; Mumby, Peter J; Nenadovic, Mateja; Parks, John E; Pomeroy, Robert S; White, Alan T
2017-07-01
Environmental conservation initiatives, including marine protected areas (MPAs), have proliferated in recent decades. Designed to conserve marine biodiversity, many MPAs also seek to foster sustainable development. As is the case for many other environmental policies and programs, the impacts of MPAs are poorly understood. Social-ecological systems, impact evaluation, and common-pool resource governance are three complementary scientific frameworks for documenting and explaining the ecological and social impacts of conservation interventions. We review key components of these three frameworks and their implications for the study of conservation policy, program, and project outcomes. Using MPAs as an illustrative example, we then draw upon these three frameworks to describe an integrated approach for rigorous empirical documentation and causal explanation of conservation impacts. This integrated three-framework approach for impact evaluation of governance in social-ecological systems (3FIGS) accounts for alternative explanations, builds upon and advances social theory, and provides novel policy insights in ways that no single approach affords. Despite the inherent complexity of social-ecological systems and the difficulty of causal inference, the 3FIGS approach can dramatically advance our understanding of, and the evidentiary basis for, effective MPAs and other conservation initiatives. © 2017 New York Academy of Sciences.
Adult asthma disease management: an analysis of studies, approaches, outcomes, and methods.
Maciejewski, Matthew L; Chen, Shih-Yin; Au, David H
2009-07-01
Disease management has been implemented for patients with asthma in various ways. We describe the approaches to and components of adult asthma disease-management interventions, examine the outcomes evaluated, and assess the quality of published studies. We searched the MEDLINE, EMBASE, CINAHL, PsychInfo, and Cochrane databases for studies published in 1986 through 2008, on adult asthma management. With the studies that met our inclusion criteria, we examined the clinical, process, medication, economic, and patient-reported outcomes reported, and the study designs, provider collaboration during the studies, and statistical methods. Twenty-nine articles describing 27 studies satisfied our inclusion criteria. There was great variation in the content, extent of collaboration between physician and non-physician providers responsible for intervention delivery, and outcomes examined across the 27 studies. Because of limitations in the design of 22 of the 27 studies, the differences in outcomes assessed, and the lack of rigorous statistical adjustment, we could not draw definitive conclusions about the effectiveness or cost-effectiveness of the asthma disease-management programs or which approach was most effective. Few well-designed studies with rigorous evaluations have been conducted to evaluate disease-management interventions for adults with asthma. Current evidence is insufficient to recommend any particular intervention.
NASA Astrophysics Data System (ADS)
Bao, Zhenkun; Li, Xiaolong; Luo, Xiangyang
2017-01-01
Extracting informative statistical features is the most essential technical issue of steganalysis. Among various steganalysis methods, probability density function (PDF) and characteristic function (CF) moments are two important types of features due to their excellent ability to distinguish cover images from stego ones. The two types of features are quite similar in definition. The only difference is that the PDF moments are computed in the spatial domain, while the CF moments are computed in the Fourier-transformed domain. The comparison between PDF and CF moments is therefore an interesting question in steganalysis. Several theoretical results have been derived, and CF moments are proved better than PDF moments in some cases. However, in the log prediction error wavelet subband of wavelet decomposition, some experiments show that the result is the opposite, and a rigorous explanation has been lacking. To solve this problem, a comparison result based on rigorous proof is presented: the first-order PDF moment is proved better than the CF moment, while the second-order CF moment is better than the PDF moment. This work aims to open the theoretical discussion on steganalysis and the question of finding suitable statistical features.
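To fix notation, a sketch of the two feature types (published definitions vary in detail, e.g., absolute versus signed moments and the exact normalization): the n-th PDF moment is a moment of the subband histogram itself, while the n-th CF moment is the analogous moment of the magnitude of the histogram's discrete Fourier transform.

import numpy as np

def pdf_moment(samples, n, bins=256):
    h, edges = np.histogram(samples, bins=bins, density=True)
    x = 0.5 * (edges[:-1] + edges[1:])           # bin centers
    return np.sum(np.abs(x) ** n * h) / np.sum(h)

def cf_moment(samples, n, bins=256):
    h, _ = np.histogram(samples, bins=bins, density=True)
    H = np.abs(np.fft.rfft(h))                   # characteristic function magnitude
    f = np.arange(len(H))
    return np.sum(f ** n * H) / np.sum(H)

coeffs = np.random.default_rng(3).laplace(0, 1, 10000)   # wavelet-like statistics
print(pdf_moment(coeffs, 2), cf_moment(coeffs, 2))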
An ORCID based synchronization framework for a national CRIS ecosystem.
Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno
2015-01-01
PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles in the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects on how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholder and essential to certificate compliant services.
The integration of claims to health-care: a programming approach.
Anand, Paul
2003-09-01
The paper contributes to the use of social choice and welfare theory in health economics by developing and applying the integration of claims framework to health-care rationing. Related to Sen's critique of neo-classical welfare economics, the integration of claims framework recognises three primitive sources of claim: consequences, deontology and procedures. A taxonomy is presented with the aid of which it is shown that social welfare functions reflecting these claims individually or together, can be specified. Some of the resulting social choice rules can be regarded as generalisations of health-maximisation and all have normative justifications, though the justifications may not be universally acceptable. The paper shows how non-linear programming can be used to operationalise such choice rules and illustrates their differential impacts on the optimal provision of health-care. Following discussion of relations to the capabilities framework and the context in which rationing occurs, the paper concludes that the integration of claims provides a viable framework for modelling health-care rationing that is technically rigorous, general and tractable, as well as being consistent with relevant moral considerations and citizen preferences.
Doctor coach: a deliberate practice approach to teaching and learning clinical skills.
Gifford, Kimberly A; Fall, Leslie H
2014-02-01
The rapidly evolving medical education landscape requires restructuring the approach to teaching and learning across the continuum of medical education. The deliberate practice strategies used to coach learners in disciplines beyond medicine can also be used to train medical learners. However, these deliberate practice strategies are not explicitly taught in most medical schools or residencies. The authors designed the Doctor Coach framework and competencies in 2007-2008 to serve as the foundation for new faculty development and resident-as-teacher programs. In addition to teaching deliberate practice strategies, the programs model a deliberate practice approach that promotes the continuous integration of newly developed coaching competencies by participants into their daily teaching practice. Early evaluation demonstrated the feasibility and efficacy of implementing the Doctor Coach framework across the continuum of medical education. Additionally, the Doctor Coach framework has been disseminated through national workshops, which have resulted in additional institutions applying the framework and competencies to develop their own coaching programs. Design of a multisource evaluation tool based on the coaching competencies will enable more rigorous study of the Doctor Coach framework and training programs and provide a richer feedback mechanism for participants. The framework will also facilitate the faculty development needed to implement the milestones and entrustable professional activities in medical education.
ERIC Educational Resources Information Center
LeMire, Steven D.
2010-01-01
This paper proposes an argument framework for the teaching of null hypothesis statistical testing and its application in support of research. Elements of the Toulmin (1958) model of argument are used to illustrate the use of p values and Type I and Type II error rates in support of claims about statistical parameters and subject matter research…
Rigorous theoretical framework for particle sizing in turbid colloids using light refraction.
García-Valenzuela, Augusto; Barrera, Rubén G; Gutierrez-Reyes, Edahí
2008-11-24
Using a non-local effective-medium approach, we analyze the refraction of light in a colloidal medium. We discuss the theoretical grounds and all the necessary precautions to design and perform experiments to measure the effective refractive index in dilute colloids. As an application, we show that it is possible to retrieve the size of small dielectric particles in a colloid by measuring the complex effective refractive index and the volume fraction occupied by the particles.
Invariant Tori in the Secular Motions of the Three-body Planetary Systems
NASA Astrophysics Data System (ADS)
Locatelli, Ugo; Giorgilli, Antonio
We consider the problem of the applicability of the KAM theorem to a realistic problem of three bodies. In the framework of the averaged dynamics over the fast angles for the Sun-Jupiter-Saturn system we can prove the perpetual stability of the orbit. The proof is based on semi-numerical algorithms requiring both explicit algebraic manipulations of series and analytical estimates. The proof is made rigorous by using interval arithmetic in order to control the numerical errors.
Rohan, Annie J; Fullerton, Judith; Escallier, Lori A; Pati, Susmita
A novel, sustainable digital badge-awarding online course was developed to familiarize learners with patient navigation. Learners offered favorable endorsement of essentially all elements of the program, especially the utility of the Blackboard learning management software. Quality Matters standards provided a rigorous framework for the challenges of designing, implementing, and evaluating online curricula. Online education is an effective method for meeting the professional development needs of those seeking careers in care coordination/patient navigation.
Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems
NASA Astrophysics Data System (ADS)
Gogolin, Christian; Eisert, Jens
2016-05-01
We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.
NASA Astrophysics Data System (ADS)
Röpke, G.
2018-01-01
One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.
Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems.
Gogolin, Christian; Eisert, Jens
2016-05-01
We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.
Statistical Characterization and Classification of Edge-Localized Plasma Instabilities
NASA Astrophysics Data System (ADS)
Webster, A. J.; Dendy, R. O.
2013-04-01
The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
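The core statistical step described here, fitting a Weibull distribution to waiting times and checking the fit, is routine to reproduce. A minimal Python sketch on synthetic surrogate data (the parameters are invented, not JET values):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Surrogate "waiting times" standing in for ELM inter-event intervals.
waits = stats.weibull_min.rvs(c=1.8, scale=12.0, size=2000, random_state=rng)

# Fit a two-parameter Weibull (location pinned at zero)...
shape, loc, scale = stats.weibull_min.fit(waits, floc=0)

# ...and check it with a Kolmogorov-Smirnov statistic. Note the p-value
# is optimistic here because the parameters were estimated from the
# same data; a rigorous comparison (as in the paper) must correct for that.
ks = stats.kstest(waits, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.2f}  scale={scale:.2f}  KS p-value={ks.pvalue:.3f}")
```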
NASA Astrophysics Data System (ADS)
Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem
2018-05-01
In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time-domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.
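The gain from model-based averaging over a plain mean is easy to see on synthetic traces. The sketch below (my own toy, assuming one simple noise model, per-trace amplitude jitter plus additive noise, rather than the paper's four) fits a scalar amplitude per trace and compares the fitting errors:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(-5, 5, 512)
pulse = np.exp(-t**2) * np.cos(6 * t)      # idealized THz pulse shape

# Simulated traces: per-trace amplitude jitter plus additive noise.
amps = rng.normal(1.0, 0.15, size=100)
traces = amps[:, None] * pulse + rng.normal(0.0, 0.02, size=(100, t.size))

ref = traces.mean(axis=0)                  # the simple statistical average

# Statistical model: each trace = unknown scalar amplitude x common
# waveform. Estimate the amplitudes by least squares against ref.
a_hat = traces @ ref / (ref @ ref)

err_plain = np.sqrt(np.mean((traces - ref) ** 2))
err_model = np.sqrt(np.mean((traces - a_hat[:, None] * ref) ** 2))
print("plain-average fit RMS  :", err_plain)
print("amplitude-model fit RMS:", err_model)   # noticeably smaller
```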
NASA Astrophysics Data System (ADS)
Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.
2018-01-01
We describe the formalism of statistical irreversible thermodynamics constructed on the basis of Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The former can be constructed either on a heuristic basis or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.
NASA Astrophysics Data System (ADS)
Chern, J. D.; Tao, W. K.; Lang, S. E.; Matsui, T.; Mohr, K. I.
2014-12-01
Four six-month (March-August 2014) experiments with the Goddard Multi-scale Modeling Framework (MMF) were performed to study the impacts of different Goddard one-moment bulk microphysical schemes and large-scale forcings on the performance of the MMF. Recently, a new Goddard one-moment bulk microphysics scheme with four ice classes (cloud ice, snow, graupel, and frozen drops/hail) was developed based on cloud-resolving model simulations with large-scale forcings from field campaign observations. The new scheme has been successfully implemented in the MMF, and two MMF experiments were carried out with the new scheme and the old three-ice-class (cloud ice, snow, graupel) scheme. The MMF has global coverage and can rigorously evaluate microphysics performance across different cloud regimes. The results show that the MMF with the new scheme outperformed the old one. The MMF simulations are also strongly affected by the interaction between large-scale and cloud-scale processes. Two MMF sensitivity experiments, with and without nudging the large-scale forcings toward the ERA-Interim reanalysis, were carried out to study the impacts of large-scale forcings. The simulated means and variability of surface precipitation, cloud types, and cloud properties (such as cloud amount, hydrometeor vertical profiles, and cloud water contents) in different geographic locations and climate regimes are evaluated against GPM, TRMM, and CloudSat/CALIPSO satellite observations. The Goddard MMF has also been coupled with the Goddard Satellite Data Simulation Unit (G-SDSU), a system with multi-satellite, multi-sensor, and multi-spectrum satellite simulators. The statistics of MMF-simulated radiances and backscattering can be directly compared with satellite observations to assess the strengths and deficiencies of MMF simulations and provide guidance on how to improve the MMF and its microphysics.
Hierarchical fractional-step approximations and parallel kinetic Monte Carlo algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arampatzis, Giorgos, E-mail: garab@math.uoc.gr; Katsoulakis, Markos A., E-mail: markos@math.umass.edu; Plechac, Petr, E-mail: plechac@math.udel.edu
2012-10-01
We present a mathematical framework for constructing and analyzing parallel algorithms for lattice kinetic Monte Carlo (KMC) simulations. The resulting algorithms have the capacity to simulate a wide range of spatio-temporal scales in spatially distributed, non-equilibrium physicochemical processes with complex chemistry and transport micro-mechanisms. Rather than focusing on constructing the stochastic trajectories exactly, our approach relies on approximating the evolution of observables, such as density, coverage and correlations. More specifically, we develop a spatial domain decomposition of the Markov operator (generator) that describes the evolution of all observables according to the kinetic Monte Carlo algorithm. This domain decomposition corresponds to a decomposition of the Markov generator into a hierarchy of operators and can be tailored to specific hierarchical parallel architectures such as multi-core processors or clusters of Graphical Processing Units (GPUs). Based on this operator decomposition, we formulate parallel fractional-step kinetic Monte Carlo algorithms by employing the Trotter theorem and its randomized variants; these schemes (a) are partially asynchronous on each fractional-step time window, and (b) are characterized by their communication schedule between processors. The proposed mathematical framework allows us to rigorously justify the numerical and statistical consistency of the proposed algorithms, showing the convergence of our approximating schemes to the original serial KMC. The approach also provides a systematic evaluation of different processor communication schedules. We carry out a detailed benchmarking of the parallel KMC schemes using available exact solutions, for example in Ising-type systems, and we demonstrate the capabilities of the method to simulate complex spatially distributed reactions at very large scales on GPUs. Finally, we discuss work-load balancing between processors and propose a re-balancing scheme based on probabilistic mass transport methods.
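To make the fractional-step idea concrete, here is a deliberately tiny Python emulation (my own sketch, not the paper's algorithm): a 1D adsorption/desorption lattice is split into two colours of blocks, and KMC runs on each colour in alternating Trotter sub-steps. Because the rates here are site-independent, the splitting is exact; with neighbour-dependent rates the same structure incurs the controlled Trotter error the paper analyzes, and blocks of one colour could run concurrently.

```python
import numpy as np

rng = np.random.default_rng(3)

L = 64                      # lattice sites
state = np.zeros(L, int)    # 0 = empty, 1 = occupied
k_ads, k_des = 1.0, 0.5     # adsorption / desorption rates

def kmc_block(sites, t_end):
    """Serial Gillespie KMC restricted to `sites` for one time window.
    In a真ly parallel code each block would run on its own processor."""
    t = 0.0
    while True:
        rates = np.where(state[sites] == 0, k_ads, k_des)
        total = rates.sum()
        t += rng.exponential(1.0 / total)
        if t > t_end:
            return
        i = rng.choice(len(sites), p=rates / total)
        state[sites[i]] ^= 1          # flip occupancy of the chosen site

# Lie-Trotter fractional steps over a checkerboard of 8-site blocks.
blocks = [np.arange(L).reshape(-1, 8)[0::2].ravel(),   # "black" blocks
          np.arange(L).reshape(-1, 8)[1::2].ravel()]   # "white" blocks
dt = 0.1
for _ in range(100):
    for colour in blocks:
        kmc_block(colour, dt)

print("coverage:", state.mean(),
      "(detailed balance predicts", k_ads / (k_ads + k_des), ")")
```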
Validating clustering of molecular dynamics simulations using polymer models.
Phillips, Joshua L; Colvin, Michael E; Newsam, Shawn
2011-11-14
Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our knowledge, our framework is the first to utilize model polymers to rigorously test the utility of clustering algorithms for studying biopolymers.
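The validation logic, clustering data whose ground-truth states are known by construction, can be sketched in a few lines. Below is a minimal Python analogue (synthetic Gaussian "conformations" rather than the paper's polymer models) using scikit-learn's spectral clustering:

```python
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(4)

# Surrogate "conformations": noisy samples around two reference
# structures, standing in for frames of an MD trajectory.
ref_a = rng.normal(0, 1, size=30)          # flattened coordinates
ref_b = rng.normal(0, 1, size=30)
frames = np.vstack([ref_a + 0.1 * rng.normal(size=(200, 30)),
                    ref_b + 0.1 * rng.normal(size=(200, 30))])

labels = SpectralClustering(n_clusters=2, affinity="rbf",
                            random_state=0).fit_predict(frames)

# Ground truth is known by construction, so recovery is checkable --
# exactly the property that makes model systems useful for validation.
truth = np.repeat([0, 1], 200)
agreement = max(np.mean(labels == truth), np.mean(labels != truth))
print("fraction correctly assigned:", agreement)
```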
Validating clustering of molecular dynamics simulations using polymer models
2011-01-01
Background: Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. Results: We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. Conclusions: We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our knowledge, our framework is the first to utilize model polymers to rigorously test the utility of clustering algorithms for studying biopolymers. PMID:22082218
A Framework for Thinking about Informal Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie; Rubin, Andee
2009-01-01
Informal inferential reasoning has shown some promise in developing students' deeper understanding of statistical processes. This paper presents a framework to think about three key principles of informal inference--generalizations "beyond the data," probabilistic language, and data as evidence. The authors use primary school classroom…
NASA Astrophysics Data System (ADS)
Blanchard, Philippe; Hellmich, Mario; Ługiewicz, Piotr; Olkiewicz, Robert
Quantum mechanics is the greatest revision of our conception of the character of the physical world since Newton. Consequently, David Hilbert was very interested in quantum mechanics. He and John von Neumann discussed it frequently during von Neumann's residence in Göttingen. In 1932 von Neumann published his book Mathematical Foundations of Quantum Mechanics, in Hilbert's opinion the first exposition of quantum mechanics in a mathematically rigorous way. The pioneers of quantum mechanics, Heisenberg and Dirac, had neither use for rigorous mathematics nor much interest in it. Conceptually, quantum theory as developed by Bohr and Heisenberg is based on the positivism of Mach, as it describes only observable quantities. It first emerged as a result of experimental data in the form of statistical observations of quantum noise, the basic concept of quantum probability.
Patel, Tulpesh; Blyth, Jacqueline C.; Griffiths, Gareth; Kelly, Deirdre; Talcott, Joel B.
2014-01-01
Background: Proton Magnetic Resonance Spectroscopy (¹H-MRS) is a non-invasive imaging technique that enables quantification of neurochemistry in vivo and thereby facilitates investigation of the biochemical underpinnings of human cognitive variability. Studies in the field of cognitive spectroscopy have commonly focused on relationships between measures of N-acetyl aspartate (NAA), a surrogate marker of neuronal health and function, and broad measures of cognitive performance, such as IQ. Methodology/Principal Findings: In this study, we used ¹H-MRS to interrogate single voxels in occipitoparietal and frontal cortex, in parallel with assessments of psychometric intelligence, in a sample of 40 healthy adult participants. We found correlations between NAA and IQ that were within the range reported in previous studies. However, the magnitude of these effects was significantly modulated by the stringency of data screening and the extent to which outlying values contributed to statistical analyses. Conclusions/Significance: ¹H-MRS offers a sensitive tool for assessing neurochemistry non-invasively, yet the relationships between brain metabolites and broad aspects of human behavior such as IQ are subtle. We highlight the need to develop an increasingly rigorous analytical and interpretive framework for collecting and reporting data obtained from cognitive spectroscopy studies of this kind. PMID:24592224
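The sensitivity of correlations to screening stringency that the abstract reports is easy to demonstrate. A small Python sketch with invented data (40 points echoing the sample size, one deliberate outlier; not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)

# Hypothetical metabolite-vs-IQ data: a weak true association plus one
# outlying observation of the kind screening would remove.
naa = rng.normal(8.0, 0.5, 40)
iq = 100 + 4.0 * (naa - 8.0) + rng.normal(0, 7, 40)
naa[0], iq[0] = 10.5, 140.0          # outlier inflating the correlation

r_all, p_all = stats.pearsonr(naa, iq)

# Simple screening rule: drop points > 3 SD from either variable's mean.
z = lambda v: np.abs((v - v.mean()) / v.std())
keep = (z(naa) < 3) & (z(iq) < 3)
r_scr, p_scr = stats.pearsonr(naa[keep], iq[keep])

print(f"all data: r={r_all:.2f} (p={p_all:.3f})")
print(f"screened: r={r_scr:.2f} (p={p_scr:.3f})")
```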
Miller-Graff, Laura E; Campion, Karen
2016-03-01
In the past 15 years, there have been a substantial number of rigorous studies examining the effectiveness of various treatments for child trauma and posttraumatic stress disorder (PTSD). Although a number of review articles exist, many have focused on randomized controlled trials or specific treatment methodologies, both of which limit the ability to draw conclusions across studies and the statistical power to test the effect of particular treatment characteristics on treatment outcomes. The current study is a review and meta-analysis of 74 studies examining treatments for children exposed to violence. After reviewing the literature, we examined the relationship of a variety of treatment characteristics (e.g., group or individual treatments) and sample characteristics (e.g., average age) on treatment effect sizes. Results indicated that individual therapies and those with exposure paradigms within a cognitive-behavioral therapy or skills-building framework show the most promise, but treatment is somewhat less effective for those with more severe symptomology and for younger children. Future treatments should consider the developmental and social contexts that may impede treatment progress for young children and consider how best to develop the effectiveness of group interventions that can be readily delivered in settings of mass trauma. © 2015 Wiley Periodicals, Inc.
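Pooling effect sizes across 74 studies of this kind is typically done with inverse-variance weighting. A minimal Python sketch of standard fixed-effect and DerSimonian-Laird random-effects pooling (the per-study numbers are invented for illustration, and this is not the authors' exact analysis):

```python
import numpy as np

# Toy per-study effect sizes (e.g. Hedges' g) and their variances.
g = np.array([0.55, 0.30, 0.72, 0.41, 0.18])
v = np.array([0.04, 0.02, 0.09, 0.03, 0.05])

# Fixed-effect pooling: inverse-variance weights.
w = 1 / v
fixed = np.sum(w * g) / np.sum(w)

# DerSimonian-Laird estimate of between-study variance tau^2,
# then random-effects pooling.
Q = np.sum(w * (g - fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(g) - 1)) / c)
w_re = 1 / (v + tau2)
pooled = np.sum(w_re * g) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled g = {pooled:.2f} ± {1.96 * se:.2f}  (tau^2 = {tau2:.3f})")
```

Moderator effects such as treatment format or age would then be examined by meta-regression on study-level covariates.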
Decision-making for foot-and-mouth disease control: Objectives matter
Probert, William J. M.; Shea, Katriona; Fonnesbeck, Christopher J.; Runge, Michael C.; Carpenter, Tim E.; Durr, Salome; Garner, M. Graeme; Harvey, Neil; Stevenson, Mark A.; Webb, Colleen T.; Werkman, Marleen; Tildesley, Michael J.; Ferrari, Matthew J.
2016-01-01
Formal decision-analytic methods can be used to frame disease control problems, the first step of which is to define a clear and specific objective. We demonstrate the imperative of framing clearly-defined management objectives in finding optimal control actions for control of disease outbreaks. We illustrate an analysis that can be applied rapidly at the start of an outbreak when there are multiple stakeholders involved with potentially multiple objectives, and when there are also multiple disease models upon which to compare control actions. The output of our analysis frames subsequent discourse between policy-makers, modellers and other stakeholders, by highlighting areas of discord among different management objectives and also among different models used in the analysis. We illustrate this approach in the context of a hypothetical foot-and-mouth disease (FMD) outbreak in Cumbria, UK using outputs from five rigorously-studied simulation models of FMD spread. We present both relative rankings and relative performance of controls within each model and across a range of objectives. Results illustrate how control actions change across both the base metric used to measure management success and across the statistic used to rank control actions according to said metric. This work represents a first step towards reconciling the extensive modelling work on disease control problems with frameworks for structured decision making.
When Can Species Abundance Data Reveal Non-neutrality?
Al Hammal, Omar; Alonso, David; Etienne, Rampal S.; Cornell, Stephen J.
2015-01-01
Species abundance distributions (SAD) are probably ecology’s most well-known empirical pattern, and over the last decades many models have been proposed to explain their shape. There is no consensus over which model is correct, because the degree to which different processes can be discerned from SAD patterns has not yet been rigorously quantified. We present a power calculation to quantify our ability to detect deviations from neutrality using species abundance data. We study non-neutral stochastic community models, and show that the presence of non-neutral processes is detectable if sample size is large enough and/or the amplitude of the effect is strong enough. Our framework can be used for any candidate community model that can be simulated on a computer, and determines both the sampling effort required to distinguish between alternative processes, and a range for the strength of non-neutral processes in communities whose patterns are statistically consistent with neutral theory. We find that even data sets of the scale of the 50 Ha forest plot on Barro Colorado Island, Panama, are unlikely to be large enough to detect deviations from neutrality caused by competitive interactions alone, though the presence of multiple non-neutral processes with contrasting effects on abundance distributions may be detectable. PMID:25793889
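The power calculation the abstract describes is simulator-driven: estimate the null distribution of a test statistic under the neutral model, then the rejection rate under a non-neutral alternative. A generic Python sketch (both "models" and the statistic are stand-ins I chose for illustration, not the authors' community models):

```python
import numpy as np

rng = np.random.default_rng(5)

def neutral_sad(s):            # stand-in neutral SAD (log-series-like)
    return rng.geometric(p=0.01, size=s)

def alt_sad(s, w):             # non-neutral stand-in: with prob w a
    use_alt = rng.random(s) < w  # species follows a lognormal law
    return np.where(use_alt,
                    np.ceil(rng.lognormal(3.0, 1.2, s)),
                    rng.geometric(0.01, s)).astype(int)

def stat(abund):
    """Any summary separating the models; here, skewness of log abundance."""
    x = np.log(abund)
    return np.mean(((x - x.mean()) / x.std()) ** 3)

def power(s, w, reps=1000):
    null = np.sort([stat(neutral_sad(s)) for _ in range(reps)])
    lo, hi = null[int(0.025 * reps)], null[int(0.975 * reps)]
    return np.mean([not (lo <= stat(alt_sad(s, w)) <= hi)
                    for _ in range(reps)])

for s in (50, 200, 800):
    for w in (0.1, 0.3):
        print(f"species = {s:4d}  effect = {w:.1f}  power ≈ {power(s, w):.2f}")
```

As in the paper, power grows with both sample size and the amplitude of the non-neutral effect, and the same machinery bounds which effect strengths are consistent with apparently neutral data.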
From Maximum Entropy Models to Non-Stationarity and Irreversibility
NASA Astrophysics Data System (ADS)
Cofre, Rodrigo; Cessac, Bruno; Maldonado, Cesar
The maximum entropy distribution can be obtained from a variational principle. This is important as a matter of principle and for the purpose of finding approximate solutions. One can exploit this fact to obtain relevant information about the underlying stochastic process. We report here on recent progress in three aspects of this approach. (1) Biological systems are expected to show some degree of irreversibility in time. Based on the transfer matrix technique to find the spatio-temporal maximum entropy distribution, we build a framework to quantify the degree of irreversibility of any maximum entropy distribution. (2) The maximum entropy solution is characterized by a functional called the Gibbs free energy (the solution of the variational principle). The Legendre transform of this functional is the rate function, which controls the speed of convergence of empirical averages to their ergodic mean. We show how the correct description of this functional is determinant for a more rigorous characterization of first- and higher-order phase transitions. (3) We assess the impact of a weak time-dependent external stimulus on the collective statistics of spiking neuronal networks. We show how to evaluate this impact on any higher-order spatio-temporal correlation. RC supported by ERC advanced grant "Bridges"; BC: KEOPS ANR-CONICYT and Renvision; CM: CONICYT-FONDECYT No. 3140572.
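For point (1), a standard quantitative notion of irreversibility for a stationary Markov model is the entropy production rate, the per-step KL divergence between the forward path measure and its time reversal, which vanishes exactly when detailed balance holds. A minimal Python sketch on an invented 3-state chain (illustrative only; the paper works with spatio-temporal maximum entropy distributions via transfer matrices):

```python
import numpy as np

# Toy Markov chain over 3 states; rows of P sum to 1.
P = np.array([[0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6],
              [0.7, 0.1, 0.2]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Entropy production rate: zero iff pi_i P_ij = pi_j P_ji for all i, j.
ep = sum(pi[i] * P[i, j] * np.log((pi[i] * P[i, j]) / (pi[j] * P[j, i]))
         for i in range(3) for j in range(3)
         if P[i, j] > 0 and P[j, i] > 0)
print("entropy production per step:", ep)
```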
Tokunaga river networks: New empirical evidence and applications to transport problems
NASA Astrophysics Data System (ADS)
Tejedor, A.; Zaliapin, I. V.
2013-12-01
The Tokunaga self-similarity has proven to be an important constraint for observed river networks. Notably, various Horton laws are naturally satisfied by Tokunaga networks, which makes this model of considerable interest for theoretical analysis and modeling of environmental transport. Recall that Horton self-similarity is a weaker property of a tree graph that addresses its principal branching; it is a counterpart of the power-law size distribution for a system's elements. The stronger Tokunaga self-similarity addresses so-called side branching; it ensures that different levels of a hierarchy have the same probabilistic structure (in a sense that can be rigorously defined). We describe an improved statistical framework for testing self-similarity in a finite tree and estimating the related parameters. The developed inference is applied to the major river basins of the continental United States and the Iberian Peninsula. The results demonstrate the validity of the Tokunaga model for the majority of the examined networks, with a very narrow (universal) range of parameter values. Next, we explore possible relationships between Tokunaga parameter anomalies (deviations from the universal values) and the climatic and geomorphologic characteristics of a region. Finally, we apply the Tokunaga model to explore the vulnerability of river networks, defined via the reaction of river discharge to a storm.
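As background to the Horton laws mentioned above: the classic (pre-rigorous) way to check Horton self-similarity is a log-linear regression of branch counts against Horton-Strahler order, whose slope gives the bifurcation ratio. A small Python sketch with invented counts (the paper's framework replaces this heuristic with proper statistical inference):

```python
import numpy as np

# Branch counts by Horton-Strahler order for a hypothetical network.
orders = np.arange(1, 7)
counts = np.array([4096, 950, 230, 52, 13, 3])

# Horton's law: N_k ~ R_B^(-k), so log N_k is linear in order k and the
# (negated, exponentiated) slope recovers the bifurcation ratio R_B.
slope, intercept = np.polyfit(orders, np.log(counts), 1)
print("estimated bifurcation ratio R_B ≈", np.exp(-slope))
```

Testing the stronger Tokunaga property additionally requires the side-branch counts T_{i,j} between orders, which is where the paper's statistical machinery comes in.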
Nanosystem self-assembly pathways discovered via all-atom multiscale analysis.
Pankavich, Stephen D; Ortoleva, Peter J
2012-07-26
We consider the self-assembly of composite structures from a group of nanocomponents, each consisting of particles within an N-atom system. Self-assembly pathways and rates for nanocomposites are derived via a multiscale analysis of the classical Liouville equation. From a reduced statistical framework, rigorous stochastic equations for population levels of beginning, intermediate, and final aggregates are also derived. It is shown that the definition of an assembly type is a self-consistency criterion that must strike a balance between precision and the need for population levels to be slowly varying relative to the time scale of atomic motion. The deductive multiscale approach is complemented by a qualitative notion of multicomponent association and the ensemble of exact atomic-level configurations consistent with them. In processes such as viral self-assembly from proteins and RNA or DNA, there are many possible intermediates, so that it is usually difficult to predict the most efficient assembly pathway. However, in the current study, rates of assembly of each possible intermediate can be predicted. This avoids the need, as in a phenomenological approach, for recalibration with each new application. The method accounts for the feedback across scales in space and time that is fundamental to nanosystem self-assembly. The theory has applications to bionanostructures, geomaterials, engineered composites, and nanocapsule therapeutic delivery systems.
Kiebish, Michael A.; Yang, Kui; Han, Xianlin; Gross, Richard W.; Chuang, Jeffrey
2012-01-01
The regulation and maintenance of the cellular lipidome through biosynthetic, remodeling, and catabolic mechanisms are critical for biological homeostasis during development, health and disease. These complex mechanisms control the architectures of lipid molecular species, which have diverse yet highly regulated fatty acid chains at both the sn1 and sn2 positions. Phosphatidylcholine (PC) and phosphatidylethanolamine (PE) serve as the predominant biophysical scaffolds in membranes, acting as reservoirs for potent lipid signals and regulating numerous enzymatic processes. Here we report the first rigorous computational dissection of the mechanisms influencing PC and PE molecular architectures from high-throughput shotgun lipidomic data. Using novel statistical approaches, we have analyzed multidimensional mass spectrometry-based shotgun lipidomic data from developmental mouse heart and mature mouse heart, lung, brain, and liver tissues. We show that in PC and PE, sn1 and sn2 positions are largely independent, though for low abundance species regulatory processes may interact with both the sn1 and sn2 chain simultaneously, leading to cooperative effects. Chains with similar biochemical properties appear to be remodeled similarly. We also see that sn2 positions are more regulated than sn1, and that PC exhibits stronger cooperative effects than PE. A key aspect of our work is a novel statistically rigorous approach to determine cooperativity based on a modified Fisher's exact test using Markov Chain Monte Carlo sampling. This computational approach provides a novel tool for developing mechanistic insight into lipidomic regulation. PMID:22662143
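The paper's cooperativity test is an exact-style test on contingency tables sampled by MCMC. A simplified Python analogue, a Monte Carlo test of independence between sn1 and sn2 classes via label permutation, with a G (log-likelihood-ratio) statistic (the table is invented and this stand-in sampler is simpler than the paper's fixed-margin MCMC):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical counts of PC species cross-classified by sn1 chain (rows)
# and sn2 chain (columns).
table = np.array([[30,  5, 10],
                  [ 8, 40,  6],
                  [12,  9, 25]])

def g_statistic(t):
    """Log-likelihood-ratio statistic for independence."""
    expected = np.outer(t.sum(1), t.sum(0)) / t.sum()
    mask = t > 0
    return 2 * np.sum(t[mask] * np.log(t[mask] / expected[mask]))

obs = g_statistic(table)

# Monte Carlo null: permute sn2 labels against sn1 labels, which keeps
# both margins fixed in distribution.
rows = np.repeat(np.arange(3), table.sum(1))
cols = np.repeat(np.arange(3), table.sum(0))
null = []
for _ in range(5000):
    t = np.zeros((3, 3), int)
    np.add.at(t, (rows, rng.permutation(cols)), 1)
    null.append(g_statistic(t))

p = (1 + np.sum(np.array(null) >= obs)) / (1 + len(null))
print(f"observed G = {obs:.1f}   Monte Carlo p = {p:.4f}")
```

A small p indicates that sn1 and sn2 compositions are not independent, i.e. evidence of the cooperative remodeling effects the paper quantifies.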
Modeling tree crown dynamics with 3D partial differential equations.
Beyer, Robert; Letort, Véronique; Cournède, Paul-Henry
2014-01-01
We characterize a tree's spatial foliage distribution by the local leaf area density. Considering this spatially continuous variable allows us to describe the spatiotemporal evolution of the tree crown by means of 3D partial differential equations. These offer a framework to rigorously take locally and adaptively acting effects into account, notably growth toward light. Biomass production through photosynthesis and the allocation to foliage and wood are readily included in this model framework. The system of equations stands out due to its inherent dynamic property of self-organization and spontaneous adaptation, generating complex behavior from even only a few parameters. The density-based approach yields spatially structured tree crowns without relying on detailed geometry. We present the methodological fundamentals of such a modeling approach and discuss further prospects and applications.
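A one-dimensional caricature conveys the self-organizing mechanism: leaf area density spreads and grows where light is available, while the foliage itself shades the layers below. The Python sketch below (my own toy discretization with invented parameters, not the authors' 3D system) couples diffusion, Beer-Lambert light extinction, and light-limited logistic production:

```python
import numpy as np

nz, dz, dt = 100, 0.1, 0.01
rho = np.zeros(nz)                   # leaf area density on a vertical axis
rho[:10] = 0.1                       # young crown near the ground
D, r, k_ext, rho_max = 0.05, 1.0, 0.8, 1.0

for step in range(2000):
    # Light reaching each height: extinction by the leaf area above it.
    above = np.cumsum(rho[::-1])[::-1] * dz
    light = np.exp(-k_ext * above)
    # Diffusive spread plus light-limited logistic production.
    lap = (np.roll(rho, 1) - 2 * rho + np.roll(rho, -1)) / dz**2
    lap[0] = lap[-1] = 0.0           # crude no-flux boundaries
    rho += dt * (D * lap + r * light * rho * (1 - rho / rho_max))

print("crown top (highest z with rho > 0.05):",
      dz * np.max(np.where(rho > 0.05)[0]))
```

Run for longer and the dense foliage band migrates upward, the 1D shadow of the growth-toward-light behavior the paper obtains in 3D.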
Taking Stock of Parent Education in the Family Courts: Envisioning a Public Health Model
Salem, Peter; Sandler, Irwin; Wolchik, Sharlene
2012-01-01
The paper reviewed the development and current status of the parent education movement in the Family Courts. Parent education programs are now being implemented in courts throughout the United States and have a high level of public acceptance; however, a stronger research methodology to evaluate the effects and continued work to align the goals with the content and teaching strategies of these programs are needed. A new conceptual framework is proposed for parent education, which views divorce as a public health problem for children as well as a legal issue. The three-level framework uses concepts from public health to align the goals, content and format of parent education programs and to enable rigorous evaluations of the outcomes achieved by these programs. PMID:23641191
A Rigorous Theory of Many-Body Prethermalization for Periodically Driven and Closed Quantum Systems
NASA Astrophysics Data System (ADS)
Abanin, Dmitry; De Roeck, Wojciech; Ho, Wen Wei; Huveneers, François
2017-09-01
Prethermalization refers to the transient phenomenon where a system thermalizes according to a Hamiltonian that is not the generator of its evolution. We provide here a rigorous framework for quantum spin systems where prethermalization is exhibited for very long times. First, we consider quantum spin systems under periodic driving at high frequency ν. We prove that up to a quasi-exponential time τ* ∼ exp(cν/log³ν), the system barely absorbs energy. Instead, there is an effective local Hamiltonian D̂ that governs the time evolution up to τ*, and hence this effective Hamiltonian is a conserved quantity up to τ*. Next, we consider systems without driving, but with a separation of energy scales in the Hamiltonian. A prime example is the Fermi-Hubbard model where the interaction U is much larger than the hopping J. Here too we prove the emergence of an effective conserved quantity, different from the Hamiltonian, up to a time τ* that is (almost) exponential in U/J.
Physics of atmospheric luminous anomalies: a sieve for SETI?
NASA Astrophysics Data System (ADS)
Teodorani, M.
2004-06-01
Anomalous atmospheric light phenomena recur in many locations on Earth, some of which have become laboratory areas for a rigorous instrumented study of the physics involved. Three Italian missions to Hessdalen (Norway) furnished crucial multi-wavelength data, the analysis of which has recently permitted us to establish that the great majority of the light phenomena are caused by a geophysical mechanism producing light balls whose structure and radiant characteristics are very similar to those of ball lightning. While most light phenomena in Hessdalen and elsewhere can now be successfully explained within the framework of a natural mechanism, a residue of "locally overlapping data" remains unexplained at present. To investigate this residue, the ETV (Extraterrestrial Visitation) working hypothesis is also taken into account. It is shown how the search for ETV (SETV), consistent with the assumption of interstellar and galactic diffusion, can be carried out only through a rigorous screening of data originally acquired for the study of natural phenomena.
Statistical issues in the design, conduct and analysis of two large safety studies.
Gaffney, Michael
2016-10-01
The emergence, post approval, of serious medical events, which may be associated with the use of a particular drug or class of drugs, is an important public health and regulatory issue. The best method to address this issue is through a large, rigorously designed safety study. Therefore, it is important to elucidate the statistical issues involved in these large safety studies. Two such studies are PRECISION and EAGLES. PRECISION is the primary focus of this article. PRECISION is a non-inferiority design with a clinically relevant non-inferiority margin. Statistical issues in the design, conduct and analysis of PRECISION are discussed. Quantitative and clinical aspects of the selection of the composite primary endpoint, the determination and role of the non-inferiority margin in a large safety study and the intent-to-treat and modified intent-to-treat analyses in a non-inferiority safety study are shown. Protocol changes that were necessary during the conduct of PRECISION are discussed from a statistical perspective. Issues regarding the complex analysis and interpretation of the results of PRECISION are outlined. EAGLES is presented as a large, rigorously designed safety study when a non-inferiority margin was not able to be determined by a strong clinical/scientific method. In general, when a non-inferiority margin is not able to be determined, the width of the 95% confidence interval is a way to size the study and to assess the cost-benefit of relative trial size. A non-inferiority margin, when able to be determined by a strong scientific method, should be included in a large safety study. Although these studies could not be called "pragmatic," they are examples of best real-world designs to address safety and regulatory concerns. © The Author(s) 2016.
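The CI-width approach to sizing mentioned for EAGLES-type settings is straightforward to compute. A small Python sketch using the standard normal-approximation variance of a risk difference (the event rates and target width are invented for illustration):

```python
import numpy as np
from scipy import stats

def n_for_ci_width(p1, p2, width, conf=0.95):
    """Per-arm sample size so that the two-sided (conf) CI for the risk
    difference p1 - p2 has roughly the requested total width, using the
    normal approximation for two independent binomial proportions."""
    z = stats.norm.ppf(0.5 + conf / 2)
    var_unit = p1 * (1 - p1) + p2 * (1 - p2)   # variance for n = 1 per arm
    half = width / 2
    return int(np.ceil(z**2 * var_unit / half**2))

# E.g. event risks near 2% per arm, desired CI width of 1 percentage point:
print(n_for_ci_width(0.02, 0.02, 0.01))       # several thousand per arm
```

Varying the width input then directly exposes the cost-benefit of relative trial size that the article discusses; a pre-specified non-inferiority margin, when clinically justifiable, plays the role of the width.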
ERIC Educational Resources Information Center
Zhong, Hua; Schwartz, Jennifer
2010-01-01
Underage drinking is among the most serious of public health problems facing adolescents in the United States. Recent concerns have centered on young women, reflected in media reports and arrest statistics on their increasing problematic alcohol use. This study rigorously examined whether girls' alcohol use rose by applying time series methods to…
Decomposition of the Inequality of Income Distribution by Income Types—Application for Romania
NASA Astrophysics Data System (ADS)
Andrei, Tudorel; Oancea, Bogdan; Richmond, Peter; Dhesi, Gurjeet; Herteliu, Claudiu
2017-09-01
This paper identifies the salient factors that characterize the inequality of the income distribution for Romania. Data analysis is rigorously carried out using sophisticated techniques borrowed from classical statistics (Theil). A decomposition of the inequalities measured by the Theil index is also performed. This study relies on an exhaustive dataset (11.1 million records for 2014) of the total personal gross income of Romanian citizens.
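The Theil T index and its within/between decomposition are compact to implement. A Python sketch on synthetic lognormal incomes (the two groups here are illustrative stand-ins, not the paper's income types, and the parameters are invented):

```python
import numpy as np

def theil(x):
    """Theil T index of inequality for positive incomes."""
    x = np.asarray(x, float)
    mu = x.mean()
    return np.mean(x / mu * np.log(x / mu))

def theil_decomposition(groups):
    """Split the overall Theil T into within- and between-group parts:
    T = sum_g (n_g/n)(mu_g/mu) T_g + sum_g (n_g/n)(mu_g/mu) ln(mu_g/mu)."""
    all_x = np.concatenate(groups)
    mu, n = all_x.mean(), len(all_x)
    within = sum(len(g) / n * g.mean() / mu * theil(g) for g in groups)
    between = sum(len(g) / n * g.mean() / mu * np.log(g.mean() / mu)
                  for g in groups)
    return within, between

rng = np.random.default_rng(7)
urban = rng.lognormal(10.2, 0.6, 5000)   # hypothetical income samples
rural = rng.lognormal(9.6, 0.5, 5000)
w, b = theil_decomposition([urban, rural])
print(f"within = {w:.3f}  between = {b:.3f}  total = {w + b:.3f}")
```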
ERIC Educational Resources Information Center
Harwell, Michael
2014-01-01
Commercial data analysis software has been a fixture of quantitative analyses in education for more than three decades. Despite its apparent widespread use there is no formal evidence cataloging what software is used in educational research and educational statistics classes, by whom and for what purpose, and whether some programs should be…
Fault Management Design Strategies
NASA Technical Reports Server (NTRS)
Day, John C.; Johnson, Stephen B.
2014-01-01
Development of dependable systems relies on the ability of the system to determine and respond to off-nominal system behavior. Specification and development of these fault management capabilities must be done in a structured and principled manner to improve our understanding of these systems, and to make significant gains in dependability (safety, reliability and availability). Prior work has described a fundamental taxonomy and theory of System Health Management (SHM), and of its operational subset, Fault Management (FM). This conceptual foundation provides a basis for developing a framework to design and implement FM design strategies that protect mission objectives and account for system design limitations. Selection of an SHM strategy has implications for the functions required to perform the strategy, and it places constraints on the set of possible design solutions. The framework developed in this paper provides a rigorous and principled approach to classifying SHM strategies, as well as methods for determination and implementation of SHM strategies. An illustrative example is used to describe the application of the framework and the resulting benefits to system and FM design and dependability.
Deciphering Rashomon: an approach to verbal autopsies of maternal deaths.
Iyer, Aditi; Sen, Gita; Sreevathsa, Anuradha
2013-01-01
The paper discusses an approach to verbal autopsies that engages with the Rashomon phenomenon affecting ex post facto constructions of death and responds to the call for maternal safety. This method differs from other verbal autopsies in its approach to data collection and its framework of analysis. In our approach, data collection entails working with and triangulating multiple narratives, and minimising power inequalities in the investigation process. The framework of analysis focuses on the missed opportunities for death prevention as an alternative to (or deepening of) the Three Delays Model. This framework assesses the behavioural responses of health providers, as well as community and family members at each opportunity for death prevention and categorises them into four groups: non-actions, inadequate actions, inappropriate actions and unavoidably delayed actions. We demonstrate the application of this approach to show how verbal autopsies can delve beneath multiple narratives and rigorously identify health system, behavioural and cultural factors that contribute to avoidable maternal mortality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, J.; Liao, J.; Gyulassy, M.
Here, we build a new phenomenological framework that bridges the long-wavelength bulk viscous transport properties of the strongly coupled quark-gluon plasma (sQGP) and short-distance hard jet transport properties in the QGP. The full nonperturbative chromo-electric (E) and chromo-magnetic (M) structure of the near-"perfect fluid" sQGP in the critical transition region is integrated into a semi-Quark-Gluon-Monopole Plasma (sQGMP) model in a lattice-compatible way and implemented in the new CUJET3.0 jet quenching framework. All observables computed from CUJET3.0 are found to be consistent with available data at RHIC and LHC simultaneously. Moreover, a quantitative connection between the shear viscosity and the jet transport parameter is rigorously established within this framework. Finally, we deduce the T = 160-600 MeV dependence of the QGP's η/s: its near-vanishing value in the near-Tc regime is determined by the composition of E and M charges, it increases as T rises, and its high-T limit is fixed by color screening scales.
Tucker, Carole A.; Bevans, Katherine B.; Teneralli, Rachel E.; Smith, Ashley Wilder; Bowles, Heather R; Forrest, Christopher B.
2014-01-01
Purpose: Children's physical activity (PA) levels are commonly assessed in pediatric clinical research, but rigorous self-report assessment tools for children are scarce, and computer adaptive test implementations are rare. Our objective was to improve pediatric self-report measures of activity using semi-structured interviews with experts and children for the conceptualization of a child-informed framework. Methods: Semi-structured interviews were conducted to conceptualize physical activity, sedentary behaviors, and strengthening activities. We performed systematic literature reviews to identify item-level concepts used to assess these three domains. Results: We developed conceptual frameworks for each domain using words and phrases identified by children as relevant. Conclusions: Semi-structured interview methods provide valuable information about children's perspectives and the ways children recall previous activities. The conceptualized domains of physical activity are based on the literature and expert views that also reflect children's experiences and understanding, providing a basis for pediatric self-report instruments. PMID:25251789
The impact of contracting-out on health system performance: a conceptual framework.
Liu, Xingzhu; Hotchkiss, David R; Bose, Sujata
2007-07-01
Despite the increased popularity of contracting-out of health services in developing countries, its effectiveness on overall health system performance is not yet conclusive. Except for substantial evidence of contracting-out's positive effect on access to health services and some evidence on improved equity in access, there is little evidence of contracting-out's impact on quality and efficiency. Most studies on the subject evaluate specific contracting-out projects against narrowly specified project objectives, not against more broadly defined health system goals. For this reason, conclusions of positive effects pertaining to project level may not hold at system level. This paper presents a conceptual framework that is expected to facilitate comprehensive, rigorous, and standardized evaluation of contracting-out at health system level. Specifically, this framework supports: full and standardized description of contracting-out interventions, study of the determinants of effectiveness, examination of provider and purchaser responses, assessment of the impact of contracting-out on all dimensions of health system performance, and cross-project analyses.
Coast, Joanna; Flynn, Terry; Sutton, Eileen; Al-Janabi, Hareth; Vosper, Jane; Lavender, Sarita; Louviere, Jordan; Peters, Tim
2008-10-01
This paper deals with three concerns about the evaluative framework that is currently dominant within health economics. These concerns are: that the evaluative framework is concerned entirely with health; that the evaluative framework has an individualistic focus on patients alone; and that the methods used to estimate 'health' within the current evaluative framework could be improved both in terms of the generation of descriptive systems and in using valuation methods that rely less on people's ability to express their preferences on a cardinal scale. In exploring these issues the Investigating Choice Experiments for Preferences of Older People (ICEPOP) programme has explicitly focused on both the topic of older people and the methods of discrete choice experiments. A capability index has been developed and attributes for an economic measure of end-of-life care are currently being generated, providing the possibility of extending the evaluative framework beyond health alone. A measure of carer's experience and a framework for extending measurement in end-of-life care to loved ones are both also in development, thus extending the evaluative framework beyond the patient alone. Rigorous qualitative methods employing an iterative approach have been developed for use in constructing attributes, and best-worst scaling has been utilized to reduce task complexity and provide insights into heterogeneity. There are a number of avenues for further research in all these areas, but in particular there is need for greater attention to be paid to the theory underlying the evaluative framework within health economics.
Clark, Kevin B
2010-03-01
Fringe quantum biology theories often adopt the concept of Bose-Einstein condensation when explaining how consciousness, emotion, perception, learning, and reasoning emerge from operations of intact animal nervous systems and other computational media. However, controversial empirical evidence and mathematical formalism concerning decoherence rates of bioprocesses keep these frameworks from satisfactorily accounting for the physical nature of cognitive-like events. This study, inspired by the discovery that preferential attachment rules computed by complex technological networks obey Bose-Einstein statistics, is the first rigorous attempt to examine whether analogues of Bose-Einstein condensation precipitate learned decision making in live biological systems as bioenergetics optimization predicts. By exploiting the ciliate Spirostomum ambiguum's capacity to learn and store behavioral strategies advertising mating availability into heuristics of topologically invariant computational networks, three distinct phases of strategy use were found to map onto statistical distributions described by Bose-Einstein, Fermi-Dirac, and classical Maxwell-Boltzmann behavior. Ciliates that sensitized or habituated signaling patterns to emit brief periods of either deceptive 'harder-to-get' or altruistic 'easier-to-get' serial escape reactions began testing condensed on initially perceived fittest 'courting' solutions. When these ciliates switched from their first strategy choices, Bose-Einstein condensation of strategy use abruptly dissipated into a Maxwell-Boltzmann computational phase no longer dominated by a single fittest strategy. Recursive trial-and-error strategy searches annealed strategy use back into a condensed phase consistent with performance optimization. 'Social' decisions performed by ciliates showing no nonassociative learning were largely governed by Fermi-Dirac statistics, resulting in degenerate distributions of strategy choices. These findings corroborate previous work demonstrating ciliates with improving expertise search grouped 'courting' assurances at quantum efficiencies and verify efficient processing by primitive 'social' intelligences involves network forms of Bose-Einstein condensation coupled to preceding thermodynamic-sensitive computational phases. 2009 Elsevier Ireland Ltd. All rights reserved.
Separating intrinsic from extrinsic fluctuations in dynamic biological systems
Paulsson, Johan
2011-01-01
From molecules in cells to organisms in ecosystems, biological populations fluctuate due to the intrinsic randomness of individual events and the extrinsic influence of changing environments. The combined effect is often too complex for effective analysis, and many studies therefore make simplifying assumptions, for example ignoring either intrinsic or extrinsic effects to reduce the number of model assumptions. Here we mathematically demonstrate how two identical and independent reporters embedded in a shared fluctuating environment can be used to identify intrinsic and extrinsic noise terms, but also how these contributions are qualitatively and quantitatively different from what has been previously reported. Furthermore, we show for which classes of biological systems the noise contributions identified by dual-reporter methods correspond to the noise contributions predicted by correct stochastic models of either intrinsic or extrinsic mechanisms. We find that for broad classes of systems, the extrinsic noise from the dual-reporter method can be rigorously analyzed using models that ignore intrinsic stochasticity. In contrast, the intrinsic noise can be rigorously analyzed using models that ignore extrinsic stochasticity only under very special conditions that rarely hold in biology. Testing whether the conditions are met is rarely possible and the dual-reporter method may thus produce flawed conclusions about the properties of the system, particularly about the intrinsic noise. Our results contribute toward establishing a rigorous framework to analyze dynamically fluctuating biological systems. PMID:21730172
Separating intrinsic from extrinsic fluctuations in dynamic biological systems.
Hilfinger, Andreas; Paulsson, Johan
2011-07-19
From molecules in cells to organisms in ecosystems, biological populations fluctuate due to the intrinsic randomness of individual events and the extrinsic influence of changing environments. The combined effect is often too complex for effective analysis, and many studies therefore make simplifying assumptions, for example ignoring either intrinsic or extrinsic effects to reduce the number of model assumptions. Here we mathematically demonstrate how two identical and independent reporters embedded in a shared fluctuating environment can be used to identify intrinsic and extrinsic noise terms, but also how these contributions are qualitatively and quantitatively different from what has been previously reported. Furthermore, we show for which classes of biological systems the noise contributions identified by dual-reporter methods correspond to the noise contributions predicted by correct stochastic models of either intrinsic or extrinsic mechanisms. We find that for broad classes of systems, the extrinsic noise from the dual-reporter method can be rigorously analyzed using models that ignore intrinsic stochasticity. In contrast, the intrinsic noise can be rigorously analyzed using models that ignore extrinsic stochasticity only under very special conditions that rarely hold in biology. Testing whether the conditions are met is rarely possible and the dual-reporter method may thus produce flawed conclusions about the properties of the system, particularly about the intrinsic noise. Our results contribute toward establishing a rigorous framework to analyze dynamically fluctuating biological systems.
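The dual-reporter decomposition discussed here has a simple empirical form: intrinsic noise from the mean squared difference between the two reporters, extrinsic noise from their covariance (both in squared-CV units). A Python sketch on a simulated shared environment (the gamma/Poisson model is an invented stand-in; the paper's point is precisely when such estimates admit a rigorous mechanistic interpretation):

```python
import numpy as np

rng = np.random.default_rng(8)

# Two identical, independent reporters x and y sharing an extrinsic
# environment E: conditional on E, both have the same mean.
n = 100_000
E = rng.gamma(10.0, 1.0, n)        # fluctuating shared environment
x = rng.poisson(E)                 # reporter 1
y = rng.poisson(E)                 # reporter 2

# Dual-reporter estimators (normalized, squared-CV units):
intrinsic = np.mean((x - y) ** 2) / (2 * x.mean() * y.mean())
extrinsic = (np.mean(x * y) - x.mean() * y.mean()) / (x.mean() * y.mean())
total = x.var() / x.mean() ** 2
print(f"intrinsic={intrinsic:.4f}  extrinsic={extrinsic:.4f}  "
      f"sum={intrinsic + extrinsic:.4f}  total={total:.4f}")
```

In this construction intrinsic and extrinsic parts each come out near 0.1 and sum to the total, but, as the abstract cautions, matching such estimates to models that ignore one noise source is only valid under restrictive conditions.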
Structure-Specific Statistical Mapping of White Matter Tracts
Yushkevich, Paul A.; Zhang, Hui; Simon, Tony; Gee, James C.
2008-01-01
We present a new model-based framework for the statistical analysis of diffusion imaging data associated with specific white matter tracts. The framework takes advantage of the fact that several of the major white matter tracts are thin sheet-like structures that can be effectively modeled by medial representations. The approach involves segmenting major tracts and fitting them with deformable geometric medial models. The medial representation makes it possible to average and combine tensor-based features along directions locally perpendicular to the tracts, thus reducing data dimensionality and accounting for errors in normalization. The framework enables the analysis of individual white matter structures, and provides a range of possibilities for computing statistics and visualizing differences between cohorts. The framework is demonstrated in a study of white matter differences in pediatric chromosome 22q11.2 deletion syndrome. PMID:18407524
A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks
Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng
2009-01-01
Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review many existing models for many aspects of gene regulation; the pros and cons of each model are discussed. In addition, network inference algorithms are also surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions and the connections and differences among the algorithms are provided. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885
Sociomateriality: a theoretical framework for studying distributed medical education.
MacLeod, Anna; Kits, Olga; Whelan, Emma; Fournier, Cathy; Wilson, Keith; Power, Gregory; Mann, Karen; Tummons, Jonathan; Brown, Peggy Alexiadis
2015-11-01
Distributed medical education (DME) is a type of distance learning in which students participate in medical education from diverse geographic locations using Web conferencing, videoconferencing, e-learning, and similar tools. DME is becoming increasingly widespread in North America and around the world.Although relatively new to medical education, distance learning has a long history in the broader field of education and a related body of literature that speaks to the importance of engaging in rigorous and theoretically informed studies of distance learning. The existing DME literature is helpful, but it has been largely descriptive and lacks a critical "lens," that is, a theoretical perspective from which to rigorously conceptualize and interrogate DME's social (relationships, people) and material (technologies, tools) aspects.The authors describe DME and theories about distance learning and show that such theories focus on social, pedagogical, and cognitive considerations without adequately taking into account material factors. They address this gap by proposing sociomateriality as a theoretical framework allowing researchers and educators to study DME and (1) understand and consider previously obscured actors, infrastructure, and other factors that, on the surface, seem unrelated and even unimportant; (2) see clearly how the social and material components of learning are intertwined in fluid, messy, and often uncertain ways; and (3) perhaps think differently, even in ways that disrupt traditional approaches, as they explore DME. The authors conclude that DME brings with it substantial investments of social and material resources, and therefore needs careful study, using approaches that embrace its complexity.
Practice guidelines for program evaluation in community-based rehabilitation.
Grandisson, Marie; Hébert, Michèle; Thibeault, Rachel
2017-06-01
This paper proposes practice guidelines to evaluate community-based rehabilitation (CBR) programs. These were developed through a rigorous three-phase research process including a literature review on good practices in CBR program evaluation, a field study during which a South African CBR program was evaluated, and a Delphi study to generate consensus among a highly credible panel of CBR experts from a wide range of backgrounds and geographical areas. The 10 guidelines developed are summarized in a practice model highlighting key features of sound CBR program evaluation. They strongly indicate that sound CBR evaluations are those that give a voice and as much control as possible to the most affected groups, embrace the challenge of diversity, and foster use of evaluation processes and findings through a rigorous, collaborative and empowering approach. The practice guidelines should support CBR evaluation decisions about facilitating an evaluation process, using frameworks and designing methods. Implications for rehabilitation: Ten practice guidelines provide guidance to facilitate sound community-based rehabilitation (CBR) program evaluation decisions. Key indications of good practice include:
• being as participatory and empowering as possible;
• ensuring that all, including the most affected, have a real opportunity to share their thoughts;
• strongly considering mixed methods and participatory tools;
• adapting to fit the evaluation context, local culture and language(s);
• defining evaluation questions and reporting findings using shared CBR language when possible, which the framework offered may facilitate.
Wickham, Hadley; Hofmann, Heike
2011-12-01
We propose a new framework for visualising tables of counts, proportions and probabilities. We call our framework product plots, alluding to the computation of area as a product of height and width, and the statistical concept of generating a joint distribution from the product of conditional and marginal distributions. The framework, with extensions, is sufficient to encompass over 20 visualisations previously described in fields of statistical graphics and infovis, including bar charts, mosaic plots, treemaps, equal area plots and fluctuation diagrams. © 2011 IEEE
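The product construction is concrete: x-extents come from a marginal distribution, y-extents from the matching conditionals, so each cell's area equals its joint proportion (the mosaic-plot special case). A minimal Python sketch computing the rectangles for a 2×2 table of invented counts:

```python
import numpy as np

# A 2-way table of counts: rows = variable A, columns = variable B.
counts = np.array([[20, 30],
                   [25, 25]])
total = counts.sum()

# Product construction: width from the marginal P(A), height from the
# conditional P(B | A); cell area is then P(A) * P(B|A) = P(A, B).
x0 = 0.0
for a, row in enumerate(counts):
    width = row.sum() / total          # marginal proportion
    y0 = 0.0
    for b, c in enumerate(row):
        height = c / row.sum()         # conditional proportion
        print(f"cell A={a},B={b}: x=[{x0:.2f},{x0 + width:.2f}] "
              f"y=[{y0:.2f},{y0 + height:.2f}] area={width * height:.3f}")
        y0 += height
    x0 += width
```

Swapping which variable supplies the marginal, or chaining further conditionals, generates the other plot types (bar charts, treemaps, fluctuation diagrams) that the framework unifies.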
Weak value amplification considered harmful
NASA Astrophysics Data System (ADS)
Ferrie, Christopher; Combes, Joshua
2014-03-01
We show using statistically rigorous arguments that the technique of weak value amplification does not perform better than standard statistical techniques for the tasks of parameter estimation and signal detection. We show that using all data and considering the joint distribution of all measurement outcomes yields the optimal estimator. Moreover, we show that estimation using the maximum likelihood technique with weak values as small as possible produces better performance for quantum metrology. In doing so, we identify the optimal experimental arrangement to be the one which reveals the maximal eigenvalue of the square of system observables. We also show these conclusions do not change in the presence of technical noise.
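The statistical trade-off at the heart of this claim can be caricatured in a few lines: postselection amplifies the signal by roughly 1/ε while retaining only about ε² of the probes, so the information budget is at best unchanged. A toy Python simulation (a deliberately simplified Gaussian-meter model of my own, not the paper's full quantum analysis):

```python
import numpy as np

rng = np.random.default_rng(9)

def trial(theta, n=10_000, eps=0.1):
    """Each of n probes yields a unit-noise Gaussian meter reading.
    Without postselection the mean shift is theta; with postselection
    (success prob ~ eps^2) the shift is amplified to theta/eps."""
    # Strategy 1: use every probe, unamplified shift.
    est_all = rng.normal(theta, 1.0, n).mean()
    # Strategy 2: keep ~ eps^2 of the probes, amplified shift theta/eps.
    kept = rng.normal(theta / eps, 1.0, rng.binomial(n, eps**2))
    est_ps = kept.mean() * eps if len(kept) else 0.0
    return est_all, est_ps

theta = 1e-3
runs = np.array([trial(theta) for _ in range(2000)])
print("RMSE, all data    :", np.sqrt(np.mean((runs[:, 0] - theta) ** 2)))
print("RMSE, postselected:", np.sqrt(np.mean((runs[:, 1] - theta) ** 2)))
```

In this idealized setting the two RMSEs come out essentially equal, and any technical inefficiency in the postselected arm only tips the balance further toward using all the data, consistent with the paper's conclusion.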
Beckwith, Sue; Dickinson, Angela; Kendall, Sally
2008-12-01
This paper draws on the work of Paley and Duncan et al in order to extend and engender debate regarding the use of Concept Analysis frameworks. Despite the apparent plethora of Concept Analysis frameworks used in nursing studies we found that over half of those used were derived from the work of one author. This paper explores the suitability and use of these frameworks and is set at a time when the number of published concept analysis papers is increasing. For the purpose of this study thirteen commonly used frameworks, identified from the nursing journals 1993 to 2005, were explored to reveal their origins, ontological and philosophical stance, and any common elements. The frameworks were critiqued and links made between their antecedents. It was noted whether the articles contained discussion of any possible tensions between the ontological perspective of the framework used, the process of analysis, praxis and possible nursing theory developments. It was found that the thirteen identified frameworks are mainly based on hermeneutic propositions regarding understandings and are interpretive procedures founded on self-reflective modes of discovery. Six frameworks rely on or include the use of casuistry. Seven of the frameworks identified are predicated on, or adapt, the work of Wilson, a schoolmaster writing for his pupils. Wilson's framework has a simplistic eleven-step, binary and reductionist structure. Other frameworks identified include Morse et al's framework, which this article suggests employs a contestable theory of concept maturity. Based on the findings revealed through our exploration of the use of concept analysis frameworks in the nursing literature, concerns were raised regarding unjustified adaptations and alterations and the uncritical use of the frameworks. There is little evidence that these frameworks provide the necessary depth, rigor or replicability to enable the developments in nursing theory which they underpin.
AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading
NASA Astrophysics Data System (ADS)
Leggett, Charles; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; van Gemmeren, Peter; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; ATLAS Collaboration
2017-10-01
ATLAS’s current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single-threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run 2. After concluding a rigorous requirements phase, in which many design components were examined in detail, ATLAS has begun the migration to a new data-flow-driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread-unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread-safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying handling of features such as event- and time-dependent data, asynchronous callbacks, metadata, integration with the online High Level Trigger for partial processing in certain regions of interest, and concurrent I/O, as well as ensuring thread safety of core services. We also report on upgrading the framework to handle Algorithms that are fully re-entrant.
Johannesen, Kasper M; Claxton, Karl; Sculpher, Mark J; Wailoo, Allan J
2018-02-01
This paper presents a conceptual framework to analyse the design of the cost-effectiveness appraisal process of new healthcare technologies. The framework characterises the appraisal process as a diagnostic test aimed at identifying cost-effective (true positive) and non-cost-effective (true negative) technologies. Using the framework, factors that influence the value of operating an appraisal process, in terms of net gain to population health, are identified. The framework is used to gain insight into current policy questions including (a) how rigorous the process should be, (b) who should have the burden of proof, and (c) how optimal design changes when allowing for appeals, price reductions, resubmissions, and re-evaluations. The paper demonstrates that there is no one optimal appraisal process and that the process should be adapted over time and to the specific technology under assessment. Optimal design depends on country-specific features of (future) technologies, for example, effect, price, and size of the patient population, which might explain the difference in appraisal processes across countries. It is shown that the burden of proof should be placed on the producers and that the impact of price reductions and patient access schemes on the producer's price setting should be considered when designing the appraisal process. Copyright © 2017 John Wiley & Sons, Ltd.
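The diagnostic-test analogy invites a back-of-envelope calculation. The sketch below (all numbers invented for illustration, not taken from the paper) computes the expected net population health gain per appraised technology from an assumed prevalence of cost-effective submissions and assumed error rates of the process:

    # Hypothetical numbers for one appraisal cycle (all values invented):
    p_ce = 0.3          # share of submitted technologies that are truly cost-effective
    se, sp = 0.8, 0.9   # "sensitivity"/"specificity" of the appraisal process
    gain = 100.0        # net population health gain from approving a cost-effective technology
    loss = 80.0         # net health loss from approving a non-cost-effective one

    value = p_ce * se * gain - (1 - p_ce) * (1 - sp) * loss
    print(value)        # 0.3*0.8*100 - 0.7*0.1*80 = 24.0 - 5.6 = 18.4 per submission

Raising rigor trades higher se and sp against the cost and delay of the appraisal itself, which is exactly the design tension the paper formalises.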
ERIC Educational Resources Information Center
Holzer, Harry J.; Schanzenbach, Diane Whitmore; Duncan, Greg J.; Ludwig, Jens
2007-01-01
In this paper, we review a range of rigorous research studies that estimate the average statistical relationships between children growing up in poverty and their earnings, propensity to commit crime, and quality of health later in life. We also review estimates of the per-person costs that crime and poor health impose on the economy. Then we…
Enumerating Sparse Organisms in Ships’ Ballast Water: Why Counting to 10 Is Not So Easy
2011-01-01
To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships’ ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed. PMID:21434685
Understanding photon sideband statistics and correlation for determining phonon coherence
NASA Astrophysics Data System (ADS)
Ding, Ding; Yin, Xiaobo; Li, Baowen
2018-01-01
Generating and detecting coherent high-frequency heat-carrying phonons have been topics of great interest in recent years. Although there have been successful attempts to generate and observe coherent phonons, rigorous techniques to characterize and detect phonon coherence in a crystalline material have lagged behind what has been achieved for photons. One main challenge is a lack of detailed understanding of how detection signals for phonons can be related to coherence. The quantum theory of photoelectric detection has greatly advanced the ability to characterize photon coherence in the past century, and a similar theory for phonon detection is necessary. Here, we reexamine the optical sideband fluorescence technique that has been used to detect high-frequency phonons in materials with optically active defects. We propose a quantum theory of phonon detection using the sideband technique and find that there are distinct differences in sideband counting statistics between thermal and coherent phonons. We further propose a second-order correlation function unique to sideband signals that allows for a rigorous distinction between thermal and coherent phonons. Our theory is relevant to a correlation measurement with nontrivial response functions at the quantum level and can potentially bridge the gap of experimentally determining phonon coherence to be on par with that of photons.
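The counting-statistics distinction mirrors the photonic case, where the zero-delay second-order correlation g2(0) is 1 for a coherent state and 2 for a thermal state. A short simulation of the analogous counting experiment (a generic illustration, not the paper's sideband response model):

    import numpy as np

    rng = np.random.default_rng(1)
    N, nbar = 200_000, 2.0   # counting windows and mean count per window

    # Coherent excitation -> Poissonian counts; thermal -> Bose-Einstein (geometric).
    coherent = rng.poisson(nbar, N)
    thermal = rng.geometric(1.0 / (1.0 + nbar), N) - 1   # shift support to {0, 1, ...}

    def g2(n):
        """Zero-delay second-order correlation from counting statistics."""
        n = n.astype(float)
        return (n * (n - 1)).mean() / n.mean() ** 2

    print(f"g2 coherent: {g2(coherent):.3f} (expect 1)")
    print(f"g2 thermal:  {g2(thermal):.3f} (expect 2)")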
Enumerating sparse organisms in ships' ballast water: why counting to 10 is not so easy.
Miller, A Whitman; Frazier, Melanie; Smith, George E; Perry, Elgin S; Ruiz, Gregory M; Tamburri, Mario N
2011-04-15
To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships' ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed.
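The sampling problem the authors model can be illustrated with a Poisson power calculation (a generic sketch; the standard of 10 organisms per cubic metre matches the regulatory scenarios discussed, but the volumes and decision rule here are illustrative):

    from scipy import stats

    D = 10.0       # discharge standard: organisms per cubic metre
    alpha = 0.05   # type I error: chance of failing a ship exactly at the standard

    for V in (1.0, 3.0, 7.0):              # sampled volume in cubic metres (illustrative)
        # Smallest count k with P(X > k) <= alpha when the true concentration equals D:
        k = int(stats.poisson.ppf(1 - alpha, D * V))
        for c in (15.0, 20.0, 30.0):       # true, noncompliant concentrations
            power = 1 - stats.poisson.cdf(k, c * V)
            print(f"V={V:3.0f} m^3  c={c:4.0f}/m^3  reject if count>{k:3d}  power={power:.2f}")

Power against mildly noncompliant discharges rises only with sampled volume, which is the paper's rationale for a rigorous lower limit on sampling volume.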
Santos, Radleigh; Buying, Alcinette; Sabri, Nazila; Yu, John; Gringeri, Anthony; Bender, James; Janetzki, Sylvia; Pinilla, Clemencia; Judkowski, Valeria A.
2014-01-01
Immune monitoring of functional responses is a fundamental parameter to establish correlates of protection in clinical trials evaluating vaccines and therapies to boost antigen-specific responses. The IFNγ ELISPOT assay is a well-standardized and validated method for the determination of functional IFNγ-producing T-cells in peripheral blood mononuclear cells (PBMC); however, its performance greatly depends on the quality and integrity of the cryopreserved PBMC. Here, we investigate the effect of overnight (ON) resting of the PBMC on the detection of CD8-restricted peptide-specific responses by IFNγ ELISPOT. The study used PBMC from healthy donors to evaluate the CD8 T-cell response to five pooled or individual HLA-A2 viral peptides. The results were analyzed using a modification of the existing distribution-free resampling (DFR) method recommended for the analysis of ELISPOT data, to ensure the most rigorous possible standard of significance. The results of the study demonstrate that ON resting of PBMC samples prior to IFNγ ELISPOT increases both the magnitude and the statistical significance of the responses. In addition, a comparison of the results with a 13-day preculture of PBMC with the peptides before testing demonstrates that ON resting is sufficient for the efficient evaluation of immune functioning. PMID:25546016
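DFR is a specific published procedure; as a stand-in, the sketch below runs a generic permutation test on hypothetical spot counts from antigen and control wells (with triplicate wells there are only 20 distinct relabelings, which limits the attainable p-value):

    import numpy as np

    rng = np.random.default_rng(2)
    antigen = np.array([52, 61, 48])   # spots per well, peptide-stimulated (hypothetical)
    control = np.array([20, 25, 18])   # spots per well, medium only (hypothetical)

    obs = antigen.mean() - control.mean()
    pooled = np.concatenate([antigen, control])

    hits = 0
    for _ in range(10_000):
        perm = rng.permutation(pooled)   # relabel wells at random
        hits += (perm[:3].mean() - perm[3:].mean()) >= obs
    print(f"difference = {obs:.1f} spots, permutation p ~ {hits / 10_000:.3f}")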
Only marginal alignment of disc galaxies
NASA Astrophysics Data System (ADS)
Andrae, René; Jahnke, Knud
2011-12-01
Testing theories of angular-momentum acquisition of rotationally supported disc galaxies is the key to understanding the formation of this type of galaxy. The tidal-torque theory aims to explain this acquisition process in a cosmological framework and predicts positive autocorrelations of angular-momentum orientation and spiral-arm handedness, i.e. alignment of disc galaxies, on short distance scales of 1 Mpc h-1. This disc alignment can also cause systematic effects in weak-lensing measurements. Previous observations claimed to have discovered these correlations but were overly optimistic in the reported level of statistical significance of the detections. Errors in redshift, ellipticity and morphological classifications were not taken into account, although they have a significant impact. We explain how to rigorously propagate all the important errors through the estimation process. Analysing disc galaxies in the Sloan Digital Sky Survey (SDSS) data base, we find that positive autocorrelations of spiral-arm handedness and angular-momentum orientations on distance scales of 1 Mpc h-1 are plausible but not statistically significant. Current data appear to be too poor to constrain the parameters of the theory. This result agrees with a simple hypothesis test in the Local Group, where we also find no evidence for disc alignment. Moreover, we demonstrate that ellipticity estimates based on second moments are strongly biased by galactic bulges even for Scd galaxies, thereby corrupting correlation estimates and overestimating the impact of disc alignment on weak-lensing studies. Finally, we discuss the potential of future sky surveys. We argue that photometric redshifts have errors too large for this purpose, i.e. Pan-STARRS and LSST cannot be used. Conversely, the EUCLID project will not cover the relevant redshift regime. We also discuss the potentials and problems of front-edge classifications of galaxy discs in order to improve the autocorrelation estimates of angular-momentum orientation.
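One concrete way to propagate measurement errors through a correlation estimate is simple Monte Carlo perturbation. The sketch below (a generic toy, not the authors' pipeline) perturbs measured position angles by an assumed error and reports the induced spread in an alignment statistic:

    import numpy as np

    rng = np.random.default_rng(3)

    def alignment_stat(angles_deg):
        """Mean cos(2*delta) over all pairs; position angles are defined modulo 180 deg."""
        a = np.radians(angles_deg)
        i, j = np.triu_indices(len(a), k=1)
        return np.cos(2 * (a[i] - a[j])).mean()

    measured = rng.uniform(0, 180, 200)   # measured position angles (synthetic sample)
    sigma = 15.0                          # assumed per-galaxy measurement error, degrees

    obs = alignment_stat(measured)
    # Propagate the measurement error by re-estimating on perturbed angles:
    reps = [alignment_stat(measured + rng.normal(0, sigma, measured.size))
            for _ in range(500)]
    print(f"alignment statistic = {obs:+.4f} +/- {np.std(reps):.4f} (measurement error only)")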
Palmer, Lance E; Dejori, Mathaeus; Bolanos, Randall; Fasulo, Daniel
2010-01-15
With the rapid expansion of DNA sequencing databases, it is now feasible to identify relevant information from prior sequencing projects and completed genomes and apply it to de novo sequencing of new organisms. As an example, this paper demonstrates how such extra information can be used to improve de novo assemblies by augmenting the overlapping step. Finding all pairs of overlapping reads is a key task in many genome assemblers, and to this end, highly efficient algorithms have been developed to find alignments in large collections of sequences. It is well known that due to repeated sequences, many aligned pairs of reads nevertheless do not overlap. But no overlapping algorithm to date takes a rigorous approach to separating aligned but non-overlapping read pairs from true overlaps. We present an approach that extends the Minimus assembler by a data-driven step to classify overlaps as true or false prior to contig construction. We trained several different classification models within the Weka framework using various statistics derived from overlaps of reads available from prior sequencing projects. These statistics included percent mismatch and k-mer frequencies within the overlaps as well as a comparative genomics score derived from mapping reads to multiple reference genomes. We show that in real whole-genome sequencing data from the E. coli and S. aureus genomes, by providing a curated set of overlaps to the contigging phase of the assembler, we nearly doubled the median contig length (N50) without sacrificing coverage of the genome or increasing the number of mis-assemblies. Machine learning methods that use comparative and non-comparative features to classify overlaps as true or false can be used to improve the quality of a sequence assembly.
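The authors trained their classifiers in Weka; an equivalent experiment is easy to sketch in Python with scikit-learn on synthetic features (the feature distributions below are invented for illustration):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    n = 2000
    labels = rng.integers(0, 2, n)   # 1 = true overlap, 0 = repeat-induced false overlap

    # Synthetic features: true overlaps tend to show fewer mismatches, lower k-mer
    # repeat content and better agreement with reference genomes.
    mismatch = rng.normal(np.where(labels == 1, 1.0, 3.0), 1.0)
    kmer_repeat = rng.normal(np.where(labels == 1, 0.2, 0.8), 0.3)
    comparative = rng.normal(np.where(labels == 1, 0.9, 0.4), 0.3)
    X = np.column_stack([mismatch, kmer_repeat, comparative])

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("5-fold CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean().round(3))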
Investigating the Cosmic Web with Topological Data Analysis
NASA Astrophysics Data System (ADS)
Cisewski-Kehe, Jessi; Wu, Mike; Fasy, Brittany; Hellwing, Wojciech; Lovell, Mark; Rinaldo, Alessandro; Wasserman, Larry
2018-01-01
Data exhibiting complicated spatial structures are common in many areas of science (e.g. cosmology, biology), but can be difficult to analyze. Persistent homology is a popular approach within the area of Topological Data Analysis that offers a new way to represent, visualize, and interpret complex data by extracting topological features, which can be used to infer properties of the underlying structures. In particular, TDA may be useful for analyzing the large-scale structure (LSS) of the Universe, which is an intricate and spatially complex web of matter. In order to understand the physics of the Universe, theoretical and computational cosmologists develop large-scale simulations that allow for visualizing and analyzing the LSS under varying physical assumptions. Each point in the 3D data set represents a galaxy or a cluster of galaxies, and topological summaries ("persistence diagrams") can be obtained summarizing the different ordered holes in the data (e.g. connected components, loops, voids). The topological summaries are interesting and informative descriptors of the Universe on their own, but hypothesis tests using the topological summaries would provide a way to make more rigorous comparisons of LSS under different theoretical models. For example, the standard cosmological model includes cold dark matter (CDM); however, while the case for CDM is strong, there are some observational inconsistencies with this theory. Another possibility is warm dark matter (WDM). It is of interest to see if a CDM Universe and a WDM Universe produce LSS that is topologically distinct. We present several possible test statistics for two-sample hypothesis tests using the topological summaries, carry out a simulation study to investigate the suitability of the proposed test statistics using simulated data from a variation of the Voronoi foam model, and finally we apply the proposed inference framework to WDM vs. CDM cosmological simulation data.
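A two-sample permutation test on scalar summaries of persistence diagrams is one of the simplest statistics of this kind. The sketch below uses randomly generated diagrams standing in for real topological summaries, and total persistence as the summary; the paper's actual test statistics may differ:

    import numpy as np

    rng = np.random.default_rng(5)

    def total_persistence(diagram):
        """Sum of feature lifetimes (death - birth) for one persistence diagram."""
        return float((diagram[:, 1] - diagram[:, 0]).sum())

    # Stand-ins for diagrams from 20 CDM and 20 WDM simulations: (n_features, 2)
    # arrays of sorted birth/death pairs, generated randomly here.
    cdm = [np.sort(rng.uniform(0, 1, (30, 2)), axis=1) for _ in range(20)]
    wdm = [np.sort(rng.uniform(0, 1, (30, 2)), axis=1) for _ in range(20)]

    s = np.array([total_persistence(d) for d in cdm + wdm])
    obs = abs(s[:20].mean() - s[20:].mean())

    hits = 0
    for _ in range(10_000):
        rng.shuffle(s)
        hits += abs(s[:20].mean() - s[20:].mean()) >= obs
    print("two-sample permutation p-value:", hits / 10_000)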
NASA Astrophysics Data System (ADS)
Sundberg, R.; Moberg, A.; Hind, A.
2012-08-01
A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.
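Both stages of the proposed procedure, the preliminary correlation test and the quadratic distance measure used for ranking, can be mimicked in a few lines (a schematic toy with synthetic series, not the published statistical model, which handles multiple proxies, seasons and error structures jointly):

    import numpy as np

    rng = np.random.default_rng(6)
    T = 1000                                      # years in the common period
    forcing = np.cumsum(rng.normal(0, 0.05, T))   # shared externally forced signal
    proxy = forcing + rng.normal(0, 1.0, T)       # noisy proxy/instrumental series

    sims = {
        "forced run A": forcing + rng.normal(0, 0.5, T),   # shares the forced component
        "forced run B": forcing + rng.normal(0, 0.8, T),
        "unforced run": rng.normal(0, 0.5, T),             # internal variability only
    }

    for name, sim in sims.items():
        r = np.corrcoef(proxy, sim)[0, 1]   # preliminary correlation test statistic
        d = np.mean((proxy - sim) ** 2)     # quadratic distance used for ranking
        print(f"{name}: correlation = {r:+.2f}, distance = {d:.2f}")

Forced runs correlate with the proxy and sit closer in the quadratic distance; the unforced run does not, which is the basis for the ranking.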
Improved analyses using function datasets and statistical modeling
John S. Hogland; Nathaniel M. Anderson
2014-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...
How Does Teacher Knowledge in Statistics Impact on Teacher Listening?
ERIC Educational Resources Information Center
Burgess, Tim
2012-01-01
For teaching statistics investigations at primary school level, teacher knowledge has been identified using a framework developed from a classroom based study. Through development of the framework, three types of teacher listening problems were identified, each of which had potential impact on the students' learning. The three types of problems…
NASA Astrophysics Data System (ADS)
Bookstein, Fred L.
1995-08-01
Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for representation of stochastically invaginated surfaces do not conduce to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.
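For a surface given as a height field, the Riemannian metric tensor used here is the classical first fundamental form, and principal strains follow from its eigendecomposition. A minimal numerical sketch (a smooth synthetic patch, not cortical data):

    import numpy as np

    # Synthetic height field z = f(x, y) standing in for a smooth surface patch.
    x = np.linspace(-1, 1, 101)
    y = np.linspace(-1, 1, 101)
    X, Y = np.meshgrid(x, y, indexing="ij")
    Z = 0.3 * np.sin(3 * X) * np.cos(2 * Y)

    fx, fy = np.gradient(Z, x, y)   # partial derivatives of the height field

    # First fundamental form (Riemannian metric tensor) g = [[E, F], [F, G]]:
    E, F, G = 1 + fx**2, fx * fy, 1 + fy**2

    i, j = 50, 50                   # inspect the metric at the central grid point
    g = np.array([[E[i, j], F[i, j]], [F[i, j], G[i, j]]])
    evals, evecs = np.linalg.eigh(g)
    print("metric tensor:\n", g)
    print("principal strains (sqrt of eigenvalues):", np.sqrt(evals))
    print("smaller-strain direction:", evecs[:, 0])   # trajectories of this field trace sulci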
General Aviation Avionics Statistics : 1975
DOT National Transportation Integrated Search
1978-06-01
This report presents avionics statistics for the 1975 general aviation (GA) aircraft fleet and updates a previous publication, General Aviation Avionics Statistics: 1974. The statistics are presented in a capability group framework which enables one ...
NASA Astrophysics Data System (ADS)
Gillam, Thomas P. S.; Lester, Christopher G.
2014-11-01
We consider current and alternative approaches to setting limits on new physics signals having backgrounds from misidentified objects; for example, jets misidentified as leptons, b-jets or photons. Many ATLAS and CMS analyses have used a heuristic "matrix method" for estimating the background contribution from such sources. We demonstrate that the matrix method suffers from statistical shortcomings that can adversely affect its ability to set robust limits. A rigorous alternative method is discussed, and is seen to produce fake-rate estimates and limits with better properties, but is found to be too computationally costly to use. Having investigated the nature of the approximations used to derive the matrix method, we propose a third strategy that is seen to marry the speed of the matrix method to the performance and physicality of the more rigorous approach.
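In its simplest single-object form, the heuristic matrix method is a 2x2 linear inversion. A sketch with invented counts and efficiencies (real analyses measure r and f in control regions and typically work per event):

    # Measured event counts in the loose and tight selections (invented numbers):
    N_loose, N_tight = 10_000.0, 3_000.0
    r = 0.90   # probability for a real object to pass the tight requirement
    f = 0.20   # probability for a fake object to pass the tight requirement

    # Invert  N_loose = N_real + N_fake,  N_tight = r*N_real + f*N_fake :
    N_fake = (r * N_loose - N_tight) / (r - f)
    print(f"fakes in loose: {N_fake:.0f}; fake background in tight: {f * N_fake:.0f}")

The inversion can return negative or wildly fluctuating estimates when counts are small or r is close to f, which is the kind of statistical shortcoming the paper examines.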
NASA Astrophysics Data System (ADS)
Toman, Blaza; Nelson, Michael A.; Bedner, Mary
2017-06-01
Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model, which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random-effects meta-analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation, using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
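The GUM Supplement 1 approach is Monte Carlo propagation of distributions through the measurement equation, here augmented with a between-run random effect. A compact sketch (a generic external-standard equation with invented values, not the vitamin D3 analysis itself):

    import numpy as np

    rng = np.random.default_rng(7)
    M = 200_000   # Monte Carlo draws

    # Illustrative external-standard measurement equation with a between-run effect:
    #   c_sample = c_standard * (A_sample / A_standard) + run_effect
    c_std = rng.normal(1.000, 0.004, M)   # certified standard concentration
    A_smp = rng.normal(0.985, 0.006, M)   # sample signal (peak area, relative units)
    A_std = rng.normal(1.000, 0.006, M)   # standard signal
    run = rng.normal(0.000, 0.003, M)     # random-effects term from a meta-analysis

    c = c_std * A_smp / A_std + run
    lo, hi = np.percentile(c, [2.5, 97.5])
    print(f"c = {c.mean():.4f}, u(c) = {c.std(ddof=1):.4f}, 95% interval [{lo:.4f}, {hi:.4f}]")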
Calha, Nuno; Messias, Ana; Guerra, Fernando; Martinho, Beatriz; Neto, Maria Augusta; Nicolau, Pedro
2017-04-01
To evaluate the effect of geometry on the displacement and the strain distribution of anterior implant-supported zirconia frameworks under static load using the 3D digital image correlation method. Two groups (n=5) of 4-unit zirconia frameworks were produced by CAD/CAM for the implant-abutment assembly. Group 1 comprised five straight-configuration frameworks and group 2 consisted of five curved-configuration frameworks. Specimens were cemented and submitted to static load up to 200 N. Displacements were captured with two high-speed photographic cameras and analyzed with a video correlation system in three spatial axes U, V, W. Statistical analysis was performed using the nonparametric Mann-Whitney test. Up to 150 N loads, the vertical displacements (V axis) were statistically higher for curved frameworks (-267.83±23.76μm) when compared to the straight frameworks (-120.73±36.17μm) (p=0.008), as were anterior displacements in the transformed W axis (589.55±64.51μm vs 224.29±50.38μm for the curved and straight frameworks, respectively) (p=0.008). The mean von Mises strains over the framework surfaces were statistically higher for the curved frameworks under any load. Within the limitations of this in vitro study, it is possible to conclude that the geometric configuration influences the deformation of 4-unit anterior frameworks under static load. The higher strain distribution and micro-movements of the curved frameworks reflect less rigidity and an increased risk of the fractures associated with FPDs. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
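With n=5 per group, the exact Mann-Whitney test has a smallest attainable two-sided p of 2/252, about 0.008, which is exactly the value reported when the two groups separate completely. A sketch (displacement values invented to roughly match the reported group means, not the raw data):

    from scipy.stats import mannwhitneyu

    # Vertical displacements (micrometres) at 150 N, five frameworks per group:
    straight = [-118.2, -125.4, -90.1, -160.3, -109.7]
    curved = [-250.6, -270.9, -301.2, -240.4, -275.9]

    stat, p = mannwhitneyu(straight, curved, alternative="two-sided")
    print(f"U = {stat}, p = {p:.3f}")   # complete separation with n=5 gives p = 2/252 ~ 0.008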
Yusof, Maryati Mohd; Kuljis, Jasna; Papazafeiropoulou, Anastasia; Stergioulas, Lampros K
2008-06-01
The realization of Health Information Systems (HIS) requires rigorous evaluation that addresses technology, human and organization issues. Our review indicates that current evaluation methods evaluate different aspects of HIS and that they can be improved upon. A new evaluation framework, human, organization and technology-fit (HOT-fit), was developed after conducting a critical appraisal of the findings of existing HIS evaluation studies. HOT-fit builds on previous models of IS evaluation--in particular, the IS Success Model and the IT-Organization Fit Model. This paper introduces the new framework for HIS evaluation that incorporates comprehensive dimensions and measures of HIS and provides a technological, human and organizational fit. Methods: literature review of HIS and IS evaluation studies and pilot testing of the developed framework. The framework was used to evaluate a Fundus Imaging System (FIS) of a primary care organization in the UK. The case study was conducted through observation, interview and document analysis. The main findings show that having the right user attitude and skill base, together with good leadership, an IT-friendly environment and good communication, can have a positive influence on system adoption. The comprehensive, specific evaluation factors, dimensions and measures in the new framework (HOT-fit) are applicable to HIS evaluation. The use of such a framework is argued to be useful not only for comprehensive evaluation of the particular FIS system under investigation, but potentially also for any Health Information System in general.
Learning from Science and Sport - How we, Safety, "Engage with Rigor"
NASA Astrophysics Data System (ADS)
Herd, A.
2012-01-01
As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside of the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects from two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so, some useful insights will be given into the challenges we face, and solutions relevant to our everyday work of safety engineering may emerge. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two (opposing) teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are however not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams. The controlled engagement provides for an interaction between the two teams in a safe manner. The constraints arise from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The need for rigor may be expressed in the example of achieving a statistically relevant sample size, n, in order to assure the validity of analyses of the data pool. A failure to apply rigor could then place the entire study at risk of failing to have the respective paper published. This paper will consider the merits of these two different aspects, scientific rigor and sports engagement, and offer a reflective look at how this may provide a "modus operandi" for safety engineers at any level, whether at their desks (creating or reviewing safety assessments) or in a safety review meeting (providing a verbal critique of the presented safety case).
Towards a Framework for Evaluating Mobile Mental Health Apps.
Chan, Steven; Torous, John; Hinton, Ladson; Yellowlees, Peter
2015-12-01
Mobile phones are ubiquitous in society and owned by a majority of psychiatric patients, including those with severe mental illness. Their versatility as a platform can extend mental health services in the areas of communication, self-monitoring, self-management, diagnosis, and treatment. However, the efficacy and reliability of publicly available applications (apps) have yet to be demonstrated. Numerous articles have noted the need for rigorous evaluation of the efficacy and clinical utility of smartphone apps, which are largely unregulated. Professional clinical organizations do not provide guidelines for evaluating mobile apps. Guidelines and frameworks are needed to evaluate medical apps. Numerous frameworks and evaluation criteria exist from the engineering and informatics literature, as well as interdisciplinary organizations in similar fields such as telemedicine and healthcare informatics. We propose criteria for both patients and providers to use in assessing not just smartphone apps, but also wearable devices and smartwatch apps for mental health. Apps can be evaluated by their usefulness, usability, and integration and infrastructure. Apps can be categorized by their usability in one or more stages of a mental health provider's workflow. Ultimately, leadership is needed to develop a framework for describing apps, and guidelines are needed for both patients and mental health providers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suter II, G.W.
2003-06-18
The objective of this research is to provide the DoD with a framework based on a systematic, risk-based approach to assess impacts for management of natural resources in an ecosystem context. This risk assessment framework is consistent with, but extends beyond, the EPA's ecological risk assessment framework, and specifically addresses DoD activities and management needs. MERAF is intended to be consistent with existing procedures for environmental assessment and planning within DoD testing and training. The intention is to supplement these procedures rather than creating new procedural requirements. MERAF is suitable for use in training and testing area assessment and management. It does not include human health risks, nor does it address specific permitting or compliance requirements, although it may be useful in some of these cases. Use of MERAF fits into the National Environmental Policy Act (NEPA) process by providing a consistent and rigorous way of organizing and conducting the technical analysis for Environmental Impact Statements (EISs) (Sigal 1993; Carpenter 1995; Canter and Sadler 1997). It neither conflicts with, nor replaces, procedural requirements within the NEPA process or document management processes already in place within DoD.
Harder, Thomas; Takla, Anja; Eckmanns, Tim; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Rehfuess, Eva; Schünemann, Holger; Zuiderent-Jerak, Teun; de Carvalho Gomes, Helena; Wichmann, Ole
2017-10-01
Decisions in public health should be based on the best available evidence, reviewed and appraised using a rigorous and transparent methodology. The Project on a Framework for Rating Evidence in Public Health (PRECEPT) defined a methodology for evaluating and grading evidence in infectious disease epidemiology, prevention and control that takes different domains and question types into consideration. The methodology rates evidence in four domains: disease burden, risk factors, diagnostics and intervention. The framework guiding it has four steps going from overarching questions to an evidence statement. In step 1, approaches for identifying relevant key areas and developing specific questions to guide systematic evidence searches are described. In step 2, methodological guidance for conducting systematic reviews is provided; 15 study quality appraisal tools are proposed and an algorithm is given for matching a given study design with a tool. In step 3, a standardised evidence-grading scheme using the Grading of Recommendations Assessment, Development and Evaluation Working Group (GRADE) methodology is provided, whereby findings are documented in evidence profiles. Step 4 consists of preparing a narrative evidence summary. Users of this framework should be able to evaluate and grade scientific evidence from the four domains in a transparent and reproducible way.
Harder, Thomas; Takla, Anja; Eckmanns, Tim; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Rehfuess, Eva; Schünemann, Holger; Zuiderent-Jerak, Teun; de Carvalho Gomes, Helena; Wichmann, Ole
2017-01-01
Decisions in public health should be based on the best available evidence, reviewed and appraised using a rigorous and transparent methodology. The Project on a Framework for Rating Evidence in Public Health (PRECEPT) defined a methodology for evaluating and grading evidence in infectious disease epidemiology, prevention and control that takes different domains and question types into consideration. The methodology rates evidence in four domains: disease burden, risk factors, diagnostics and intervention. The framework guiding it has four steps going from overarching questions to an evidence statement. In step 1, approaches for identifying relevant key areas and developing specific questions to guide systematic evidence searches are described. In step 2, methodological guidance for conducting systematic reviews is provided; 15 study quality appraisal tools are proposed and an algorithm is given for matching a given study design with a tool. In step 3, a standardised evidence-grading scheme using the Grading of Recommendations Assessment, Development and Evaluation Working Group (GRADE) methodology is provided, whereby findings are documented in evidence profiles. Step 4 consists of preparing a narrative evidence summary. Users of this framework should be able to evaluate and grade scientific evidence from the four domains in a transparent and reproducible way. PMID:29019317
Conceptual framework for behavioral and social science in HIV vaccine clinical research
Lau, Chuen-Yen; Swann, Edith M.; Singh, Sagri; Kafaar, Zuhayr; Meissner, Helen I.; Stansbury, James P.
2011-01-01
HIV vaccine clinical research occurs within a context where biomedical science and social issues are interlinked. Previous HIV vaccine research has considered behavioral and social issues, but often treated them as independent of clinical research processes. Systematic attention to the intersection of behavioral and social issues within a defined clinical research framework is needed to address gaps, such as those related to participation in trials, completion of trials, and the overall research experience. Rigorous attention to these issues at project inception can inform trial design and conduct by matching research approaches to the context in which trials are to be conducted. Conducting behavioral and social sciences research concurrent with vaccine clinical research is important because it can help identify potential barriers to trial implementation, as well as ultimate acceptance and dissemination of trial results. We therefore propose a conceptual framework for behavioral and social science in HIV vaccine clinical research and use examples from the behavioral and social science literature to demonstrate how the model can facilitate identification of significant areas meriting additional exploration. Standardized use of the conceptual framework could improve HIV vaccine clinical research efficiency and relevance. PMID:21821083
Conceptual framework for behavioral and social science in HIV vaccine clinical research.
Lau, Chuen-Yen; Swann, Edith M; Singh, Sagri; Kafaar, Zuhayr; Meissner, Helen I; Stansbury, James P
2011-10-13
HIV vaccine clinical research occurs within a context where biomedical science and social issues are interlinked. Previous HIV vaccine research has considered behavioral and social issues, but often treated them as independent of clinical research processes. Systematic attention to the intersection of behavioral and social issues within a defined clinical research framework is needed to address gaps, such as those related to participation in trials, completion of trials, and the overall research experience. Rigorous attention to these issues at project inception can inform trial design and conduct by matching research approaches to the context in which trials are to be conducted. Conducting behavioral and social sciences research concurrent with vaccine clinical research is important because it can help identify potential barriers to trial implementation, as well as ultimate acceptance and dissemination of trial results. We therefore propose a conceptual framework for behavioral and social science in HIV vaccine clinical research and use examples from the behavioral and social science literature to demonstrate how the model can facilitate identification of significant areas meriting additional exploration. Standardized use of the conceptual framework could improve HIV vaccine clinical research efficiency and relevance. Published by Elsevier Ltd.
An Evolutionary Framework for Understanding the Origin of Eukaryotes.
Blackstone, Neil W
2016-04-27
Two major obstacles hinder the application of evolutionary theory to the origin of eukaryotes. The first is more apparent than real: the endosymbiosis that led to the mitochondrion is often described as "non-Darwinian" because it deviates from the incremental evolution championed by the modern synthesis. Nevertheless, endosymbiosis can be accommodated by a multi-level generalization of evolutionary theory, which Darwin himself pioneered. The second obstacle is more serious: all of the major features of eukaryotes were likely present in the last eukaryotic common ancestor, thus rendering comparative methods ineffective. In addition to a multi-level theory, the development of rigorous, sequence-based phylogenetic and comparative methods represents the greatest achievement of modern evolutionary theory. Nevertheless, the rapid evolution of major features in the eukaryotic stem group requires the consideration of an alternative framework. Such a framework, based on the contingent nature of these evolutionary events, is developed and illustrated with three examples: the putative intron proliferation leading to the nucleus and the cell cycle; conflict and cooperation in the origin of eukaryotic bioenergetics; and the inter-relationship between aerobic metabolism, sterol synthesis, membranes, and sex. The modern synthesis thus provides sufficient scope to develop an evolutionary framework to understand the origin of eukaryotes.
Model of dissolution in the framework of tissue engineering and drug delivery.
Sanz-Herrera, J A; Soria, L; Reina-Romo, E; Torres, Y; Boccaccini, A R
2018-05-22
Dissolution phenomena are ubiquitously present in biomaterials in many different fields. Despite the advantages of simulation-based design of biomaterials in medical applications, additional efforts are needed to derive reliable models which describe the process of dissolution. A phenomenologically based model, available for simulation of dissolution in biomaterials, is introduced in this paper. The model leads to a set of reaction-diffusion equations implemented in a finite element numerical framework. First, a parametric analysis is conducted in order to explore the role of model parameters on the overall dissolution process. Then, the model is calibrated and validated against a straightforward but rigorous experimental setup. Results show that the mathematical model macroscopically reproduces the main physicochemical phenomena that take place in the tests, corroborating its usefulness for the design of biomaterials in the tissue engineering and drug delivery research areas.
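The reaction-diffusion structure of such models is easy to illustrate in one dimension. The sketch below (an explicit finite-difference toy with schematic units, far simpler than the paper's finite element model) couples Fickian diffusion of the dissolved species to a first-order dissolution source that depletes the remaining solid:

    import numpy as np

    # 1D explicit toy:  dc/dt = D*d2c/dx2 + k*(c_s - c)  wherever solid remains.
    nx, L = 100, 1.0e-3                  # grid points, domain length (m)
    dx = L / (nx - 1)
    D, k, c_s = 1e-9, 0.5, 1.0           # diffusivity (m^2/s), rate (1/s), saturation (a.u.)
    dt = 0.4 * dx**2 / D                 # below the explicit stability limit

    c = np.zeros(nx)                     # dissolved concentration
    m = np.zeros(nx)                     # remaining solid (arbitrary units)
    m[: nx // 2] = 0.05                  # solid occupies the left half of the domain

    for _ in range(20_000):
        lap = np.zeros(nx)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        src = k * (c_s - c) * (m > 0)    # dissolution only where solid is left
        m = np.maximum(m - src * dt, 0.0)
        c += dt * (D * lap + src)
        c[0] = c[-1] = 0.0               # perfect-sink boundaries

    print(f"remaining solid: {m.sum():.4f}, peak concentration: {c.max():.3f}")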
Chen, Zheng; Liu, Liu; Mu, Lin
2017-05-03
In this paper, we consider the linear transport equation under diffusive scaling and with random inputs. The method is based on the generalized polynomial chaos approach in the stochastic Galerkin framework. Several theoretical aspects are addressed: in particular, uniform numerical stability with respect to the Knudsen number ϵ and a uniform-in-ϵ error estimate are given. For temporal and spatial discretizations, we apply the implicit–explicit scheme under the micro–macro decomposition framework and the discontinuous Galerkin method, as proposed in Jang et al. (SIAM J Numer Anal 52:2048–2072, 2014) for the deterministic problem. Lastly, we provide a rigorous proof of the stochastic asymptotic-preserving (sAP) property. Extensive numerical experiments that validate the accuracy and sAP of the method are conducted.
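A stochastic Galerkin discretization can be demonstrated on a scalar toy problem. The sketch below (a decay ODE with one uniform random input and a Legendre chaos basis; far simpler than the transport equation, but the same Galerkin projection mechanics) evolves the chaos coefficients and compares the mean against the closed form:

    import numpy as np
    from numpy.polynomial import legendre as Lg

    P = 6                        # highest retained Legendre chaos degree
    k0, k1 = 1.0, 0.5            # decay rate k(xi) = k0 + k1*xi, xi ~ Uniform(-1, 1)

    xq, wq = Lg.leggauss(32)     # quadrature for expectations under the uniform density
    wq = wq / 2.0

    def leg(i, x):               # evaluate the Legendre polynomial P_i at x
        return Lg.legval(x, np.eye(P + 1)[i])

    norm = 1.0 / (2 * np.arange(P + 1) + 1)                    # <P_i^2>
    e = np.array([[(wq * xq * leg(j, xq) * leg(i, xq)).sum() / norm[i]
                   for j in range(P + 1)] for i in range(P + 1)])

    u = np.zeros(P + 1)
    u[0] = 1.0                   # deterministic initial condition u(0, xi) = 1
    dt, nsteps = 1e-3, 1000      # forward Euler to t = 1 on the coupled Galerkin system
    for _ in range(nsteps):
        u = u + dt * (-k0 * u - k1 * (e @ u))

    mean = u[0]                               # E[u] is the zeroth coefficient
    var = (u[1:] ** 2 * norm[1:]).sum()       # variance from the higher modes
    exact = np.exp(-k0) * np.sinh(k1) / k1    # exact E[u(1, xi)]
    print(f"gPC mean {mean:.4f} vs exact {exact:.4f}; variance {var:.5f}")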
Completion of the universal I-Love-Q relations in compact stars including the mass
NASA Astrophysics Data System (ADS)
Reina, Borja; Sanchis-Gual, Nicolas; Vera, Raül; Font, José A.
2017-09-01
In a recent paper, we applied a rigorous perturbed matching framework to show the amendment of the mass of rotating stars in Hartle's model. Here, we apply this framework to the tidal problem in binary systems. Our approach fully accounts for the correction to the Love numbers needed to obtain the universal I-Love-Q relations. We compute the corrected mass versus radius configurations of rotating quark stars, revisiting a classical paper on the subject. These corrections allow us to find a universal relation involving the second-order contribution to the mass δM. We thus complete the set of universal relations for the tidal problem in binary systems, involving four perturbation parameters, namely I, Love, Q and δM. These relations can be used to obtain the perturbation parameters directly from observational data.
Schrom, Edward C; Graham, Andrea L
2017-12-01
Over recent years, extensive phenotypic variability and plasticity have been revealed among the T-helper cells of the mammalian adaptive immune system, even within clonal lineages of identical antigen specificity. This challenges the conventional view that T-helper cells assort into functionally distinct subsets following differential instruction by the innate immune system. We argue that the adaptive value of coping with uncertainty can reconcile the 'instructed subset' framework with T-helper variability and plasticity. However, we also suggest that T-helper cells might better be understood as agile swarms engaged in collective decision-making to promote host fitness. With rigorous testing, the 'agile swarms' framework may illuminate how variable and plastic individual T-helper cells interact to create coherent immunity. Copyright © 2017 Elsevier Ltd. All rights reserved.
Glycoconjugate Vaccines: The Regulatory Framework.
Jones, Christopher
2015-01-01
Most vaccines, including the currently available glycoconjugate vaccines, are administered to healthy infants, to prevent future disease. The safety of a prospective vaccine is a key prerequisite for approval. Undesired side effects would not only have the potential to damage the individual infant but also lead to a loss of confidence in the respective vaccine-or vaccines in general-on a population level. Thus, regulatory requirements, particularly with regard to safety, are extremely rigorous. This chapter highlights regulatory aspects on carbohydrate-based vaccines with an emphasis on analytical approaches to ensure the consistent quality of successive manufacturing lots.
Scaling Limit for a Generalization of the Nelson Model and its Application to Nuclear Physics
NASA Astrophysics Data System (ADS)
Suzuki, Akito
We study a mathematically rigorous derivation of a quantum mechanical Hamiltonian in a general framework. We derive such a Hamiltonian by taking a scaling limit for a generalization of the Nelson model, which is an abstract interaction model between particles and a Bose field with some internal degrees of freedom. Applying it to a model for the field of the nuclear force with isospins, we obtain a Schrödinger Hamiltonian with a matrix-valued potential, the one pion exchange potential, describing an effective interaction between nucleons.
Modeling and simulation of high dimensional stochastic multiscale PDE systems at the exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zabaras, Nicolas J.
2016-11-08
Predictive modeling of multiscale and multiphysics systems requires accurate data-driven characterization of the input uncertainties, and understanding of how they propagate across scales and alter the final solution. This project develops a rigorous mathematical framework and scalable uncertainty quantification algorithms to efficiently construct realistic low-dimensional input models, and surrogate low-complexity systems for the analysis, design, and control of physical systems represented by multiscale stochastic PDEs. The work can be applied to many areas including physical and biological processes, from climate modeling to systems biology.
A consortium-driven framework to guide the implementation of ICH M7 Option 4 control strategies.
Barber, Chris; Antonucci, Vincent; Baumann, Jens-Christoph; Brown, Roland; Covey-Crump, Elizabeth; Elder, David; Elliott, Eric; Fennell, Jared W; Gallou, Fabrice; Ide, Nathan D; Jordine, Guido; Kallemeyn, Jeffrey M; Lauwers, Dirk; Looker, Adam R; Lovelle, Lucie E; McLaughlin, Mark; Molzahn, Robert; Ott, Martin; Schils, Didier; Oestrich, Rolf Schulte; Stevenson, Neil; Talavera, Pere; Teasdale, Andrew; Urquhart, Michael W; Varie, David L; Welch, Dennie
2017-11-01
The ICH M7 Option 4 control of (potentially) mutagenic impurities is based on the use of scientific principles in lieu of routine analytical testing. This approach can reduce the burden of analytical testing without compromising patient safety, provided a scientifically rigorous approach is taken which is backed up by sufficient theoretical and/or analytical data. This paper introduces a consortium-led initiative and offers a proposal on the supporting evidence that could be presented in regulatory submissions. Copyright © 2017 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Council of Chief State School Officers, 2012
2012-01-01
With the advent of the development and mass adoption of the common core state standards for English language arts and mathematics, state and local agencies have now expressed a need to the Council of Chief State School Officers (CCSSO or the Council) for assistance as they upgrade existing social studies standards to meet the practical goal of…
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
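Loss functions make the link between posterior inference and action concrete: under squared-error loss the Bayes action is the posterior mean, while an asymmetric loss shifts the optimum toward a posterior quantile. A small sketch (generic posterior draws and invented loss weights, not the grassland-bird application):

    import numpy as np

    rng = np.random.default_rng(8)
    theta = rng.gamma(shape=2.0, scale=1.5, size=20_000)   # posterior draws (generic)

    grid = np.linspace(0.01, 15.0, 600)                    # candidate actions
    def bayes_action(loss):
        risks = [loss(a, theta).mean() for a in grid]      # expected posterior loss
        return grid[int(np.argmin(risks))]

    sq = bayes_action(lambda a, t: (a - t) ** 2)
    # Underestimation penalised 3x more than overestimation -> a cautious estimate:
    asym = bayes_action(lambda a, t: np.where(a < t, 3.0, 1.0) * np.abs(a - t))

    print("posterior mean:          ", theta.mean().round(2))
    print("Bayes action, squared:   ", sq)      # ~ the posterior mean
    print("Bayes action, asymmetric:", asym)    # ~ the 75th posterior percentile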
2012-01-01
Background: It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism in which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase chain reactions, microarrays and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. Latest-generation Affymetrix GeneChip Human Exon 1.0 ST Arrays offer a more detailed view of the gene expression profile, providing information on the AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons, and it allows performing analyses at both gene and exon level. In this paper we describe BEAT, an integrated user-friendly bioinformatics framework to store, analyze and visualize exon array datasets. It combines a data warehouse approach with rigorous statistical methods for assessing the AS of genes involved in diseases. Meta statistics are proposed as a novel approach to explore the analysis results. BEAT is available at http://beat.ba.itb.cnr.it.
Results: BEAT is a web tool which allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession IDs, Gene Ontology terms and biochemical pathway annotations are integrated with exon- and gene-level expression plots. The user can customize the results by choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis.
Conclusions: Despite exon array chips being widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable and interactive tables and graphics. PMID:22536968
The development of a digital logic concept inventory
NASA Astrophysics Data System (ADS)
Herman, Geoffrey Lindsay
Instructors in electrical and computer engineering and in computer science have developed innovative methods to teach digital logic circuits. These methods attempt to increase student learning, satisfaction, and retention. Although there are readily accessible and accepted means for measuring satisfaction and retention, there are no widely accepted means for assessing student learning. Rigorous assessment of learning is elusive because differences in topic coverage, curriculum and course goals, and exam content prevent direct comparison of two teaching methods when using tools such as final exam scores or course grades. Because of these difficulties, computing educators have issued a general call for the adoption of assessment tools to critically evaluate and compare the various teaching methods. Science, Technology, Engineering, and Mathematics (STEM) education researchers commonly measure students' conceptual learning to compare how much different pedagogies improve learning. Conceptual knowledge is often preferred because all engineering courses should teach a fundamental set of concepts even if they emphasize design or analysis to different degrees. Increasing conceptual learning is also important because students who can organize facts and ideas within a consistent conceptual framework are able to learn new information quickly and can apply what they know in new situations. If instructors can accurately assess their students' conceptual knowledge, they can target instructional interventions to remedy common problems. To properly assess conceptual learning, several researchers have developed concept inventories (CIs) for core subjects in the engineering sciences. CIs are multiple-choice assessment tools that evaluate how well a student's conceptual framework matches the accepted conceptual framework of a discipline or common faulty conceptual frameworks. We present how we created and evaluated the digital logic concept inventory (DLCI). We used a Delphi process to identify the important and difficult concepts to include in the DLCI. To discover and describe common student misconceptions, we interviewed students who had completed a digital logic course. Students vocalized their thoughts as they solved digital logic problems. We analyzed the interview data using a qualitative grounded theory approach. We have administered the DLCI at several institutions and have checked the validity, reliability, and bias of the DLCI with classical testing theory procedures. These procedures consisted of follow-up interviews with students, analysis of administration results with statistical procedures, and expert feedback. We discuss these results and present the DLCI's potential for providing a meaningful tool for comparing student learning at different institutions.
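The classical-test-theory checks mentioned here typically include item difficulty, item discrimination, and internal-consistency reliability. A sketch on simulated response data (a stand-in, not DLCI data):

    import numpy as np

    rng = np.random.default_rng(9)
    # Simulated 0/1 responses: 200 students x 12 items.
    ability = rng.normal(0, 1, (200, 1))
    difficulty = np.linspace(-1.5, 1.5, 12)
    responses = (rng.logistic(ability - difficulty, 1.0) > 0).astype(float)

    p = responses.mean(axis=0)   # item difficulty: proportion correct
    total = responses.sum(axis=1)
    # Item discrimination: point-biserial correlation with the rest-of-test score.
    r_pb = np.array([np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
                     for j in range(12)])
    # KR-20 internal-consistency reliability:
    k = 12
    kr20 = (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total.var(ddof=1))
    print("difficulties:", p.round(2))
    print("discriminations:", r_pb.round(2))
    print("KR-20:", round(float(kr20), 2))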
Innovation in neurosurgery: less than IDEAL? A systematic review.
Muskens, I S; Diederen, S J H; Senders, J T; Zamanipoor Najafabadi, A H; van Furth, W R; May, A M; Smith, T R; Bredenoord, A L; Broekman, M L D
2017-10-01
Surgical innovation is different from the introduction of novel pharmaceuticals. To help address this, in 2009 the IDEAL Collaboration (Idea, Development, Exploration, Assessment, Long-term follow-up) introduced the five-stage framework for surgical innovation. To evaluate the framework's feasibility for the introduction of novel neurosurgical procedures, two innovative surgical procedures were examined: the endoscopic endonasal approach for skull base meningiomas (EEMS) and the Woven EndoBridge (WEB) device for endovascular treatment of intracranial aneurysms. The published literature on EEMS and WEB devices was systematically reviewed. Identified studies were classified according to the IDEAL framework stage. Next, studies were evaluated for possible categorization according to the IDEAL framework. Five hundred seventy-six papers describing EEMS were identified, of which 26 papers were included. No prospective studies were identified, and no studies reported on ethical approval or patient informed consent for the innovative procedure. Therefore, no clinical studies could be categorized according to the IDEAL Framework. For WEB devices, 6229 articles were screened, of which 21 were included. In contrast to EEMS, two studies were categorized as stage 2a and two as stage 2b. The results of this systematic review demonstrate that both EEMS and WEB devices were not introduced according to the (later developed, in the case of EEMS) IDEAL framework. Elements of the framework such as informed consent, ethical approval, and rigorous outcome reporting are important and could serve to improve the quality of neurosurgical research. Alternative study designs and the use of big data could be useful modifications of the IDEAL framework for innovation in neurosurgery.
ERIC Educational Resources Information Center
Martin, James L.
This paper reports on attempts by the author to construct a theoretical framework of adult education participation using a theory development process and the corresponding multivariate statistical techniques. Two problems are identified: the lack of theoretical framework in studying problems, and the limiting of statistical analysis to univariate…
Teaching Introductory Business Statistics Using the DCOVA Framework
ERIC Educational Resources Information Center
Levine, David M.; Stephan, David F.
2011-01-01
Introductory business statistics students often receive little guidance on how to apply the methods they learn to further business objectives they may one day face. And those students may fail to see the continuity among the topics taught in an introductory course if they learn those methods outside a context that provides a unifying framework.…
Gershunov, A.; Barnett, T.P.; Cayan, D.R.; Tubbs, T.; Goddard, L.
2000-01-01
Three long-range forecasting methods have been evaluated for prediction and downscaling of seasonal and intraseasonal precipitation statistics in California. Full-statistical, hybrid dynamical-statistical, and full-dynamical approaches have been used to forecast El Niño-Southern Oscillation (ENSO)-related total precipitation, daily precipitation frequency, and average intensity anomalies during the January-March season. For El Niño winters, the hybrid approach emerges as the best performer, while La Niña forecasting skill is poor. The full-statistical forecasting method features reasonable forecasting skill for both La Niña and El Niño winters. The performance of the full-dynamical approach could not be evaluated as rigorously as that of the other two forecasting schemes. Although the full-dynamical forecasting approach is expected to outperform simpler forecasting schemes in the long run, evidence is presented to conclude that, at present, it is the least viable of the three, at least in California. The authors suggest that operational forecasting of any intraseasonal temperature, precipitation, or streamflow statistic derivable from the available records is possible now for ENSO-extreme years.
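The "full-statistical" approach above amounts, in its simplest form, to regressing a local precipitation statistic on a large-scale predictor such as an ENSO index. A minimal sketch with synthetic data follows (the index values, link strength, and noise level are assumptions for illustration, not the authors' scheme):

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic training data: an ENSO index vs. January-March wet-day frequency anomalies.
    nino = rng.normal(0.0, 1.0, 50)                    # seasonal ENSO index (std. anomalies)
    freq_anom = 0.8 * nino + rng.normal(0.0, 0.5, 50)  # assumed linear link plus noise

    # Full-statistical forecast: least-squares fit, then predict for a new ENSO state.
    slope, intercept = np.polyfit(nino, freq_anom, 1)
    print(f"predicted anomaly for a +2 sigma El Nino: {slope * 2.0 + intercept:.2f}")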
General Aviation Avionics Statistics : 1976
DOT National Transportation Integrated Search
1979-11-01
This report presents avionics statistics for the 1976 general aviation (GA) aircraft fleet and is the third in a series titled "General Aviation Avionics Statistics." The statistics are presented in a capability group framework which enables one to r...
General Aviation Avionics Statistics : 1978 Data
DOT National Transportation Integrated Search
1980-12-01
The report presents avionics statistics for the 1978 general aviation (GA) aircraft fleet and is the fifth in a series titled "General Aviation Avionics Statistics." The statistics are presented in a capability group framework which enables one to relate airb...
General Aviation Avionics Statistics : 1979 Data
DOT National Transportation Integrated Search
1981-04-01
This report presents avionics statistics for the 1979 general aviation (GA) aircraft fleet and is the sixth in a series titled "General Aviation Avionics Statistics." The statistics are presented in a capability group framework which enables one to relate...
All biology is computational biology.
Markowetz, Florian
2017-03-01
Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life: it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.
Fleisher, Linda; Wen, Kuang Yi; Miller, Suzanne M; Diefenbach, Michael; Stanton, Annette L; Ropka, Mary; Morra, Marion; Raich, Peter C
2015-11-01
Cancer patients and survivors are assuming active roles in decision-making, and digital patient support tools are widely used to facilitate patient engagement. As part of the Cancer Information Service Research Consortium's randomized controlled trials focused on the efficacy of eHealth interventions to promote informed treatment decision-making for newly diagnosed prostate and breast cancer patients, and post-treatment breast cancer, we conducted a rigorous process evaluation to examine the actual use and perceived benefits of two complementary communication channels: print and eHealth interventions. The three Virtual Cancer Information Service (V-CIS) interventions were developed through a rigorous developmental process, guided by self-regulatory theory, informed decision-making frameworks, and health communications best practices. Control arm participants received NCI print materials; experimental arm participants received the additional V-CIS patient support tool. Actual usage data from the web-based V-CIS were also obtained and reported. Print materials were highly used by all groups. About 60% of the experimental group reported using the V-CIS. Those who did use the V-CIS rated it highly on improvements in knowledge, patient-provider communication, and decision-making. The findings show that how patients actually use eHealth interventions, whether alone or within the context of other communication channels, is complex. Integrating rigorous best practices and theoretical foundations is essential, and multiple communication approaches should be considered to support patient preferences.
Spatially Controlled Relay Beamforming
NASA Astrophysics Data System (ADS)
Kalogerias, Dionysios
This thesis is about the fusion of optimal stochastic motion control and physical layer communications. Distributed, networked communication systems, such as relay beamforming networks (e.g., Amplify & Forward (AF)), are typically designed without explicitly considering how the positions of the respective nodes might affect the quality of the communication. Optimum placement of network nodes, which could potentially improve the quality of the communication, is not typically considered. However, in most practical settings in physical layer communications, such as relay beamforming, the Channel State Information (CSI) observed by each node per channel use, although it may be (modeled as) random, is both spatially and temporally correlated. It is, therefore, reasonable to ask if and how the performance of the system could be improved by (predictively) controlling the positions of the network nodes (e.g., the relays), based on causal side (CSI) information, exploiting the spatiotemporal dependencies of the wireless medium. In this work, we address this problem in the context of AF relay beamforming networks. This novel, cyber-physical-system approach to relay beamforming is termed "Spatially Controlled Relay Beamforming." First, we discuss wireless channel modeling in a rigorous Bayesian framework. Experimentally accurate and, at the same time, technically precise channel modeling is absolutely essential for designing and analyzing spatially controlled communication systems. In this work, we are interested in two distinct spatiotemporal statistical models for describing the behavior of the log-scale magnitude of the wireless channel: 1. Stationary Gaussian fields: In this case, the channel is assumed to evolve as a stationary Gaussian stochastic field in continuous space and discrete time (say, for instance, time slots). Under such assumptions, spatial and temporal statistical interactions are determined by a set of time- and space-invariant parameters, which completely determine the mean and covariance of the underlying Gaussian measure. This model is relatively simple to describe and can be sufficiently characterized, at least for our purposes, both statistically and topologically. Additionally, the model is rather versatile, and there is existing experimental evidence supporting its practical applicability. Our contributions lie in properly formulating the whole spatiotemporal model in a completely rigorous mathematical setting, under a convenient measure-theoretic framework. Such a framework greatly facilitates the formulation of meaningful stochastic control problems, where the wireless channel field (or a function of it) can be regarded as a stochastic optimization surface. 2. Conditionally Gaussian fields, when conditioned on a Markovian channel state: This is a completely novel approach to wireless channel modeling. In this approach, the communication medium is assumed to behave as a partially observable (or hidden) system, where a hidden, global, temporally varying underlying stochastic process, called the channel state, affects the spatial interactions of the actual channel magnitude, evaluated at any set of locations in the plane. More specifically, we assume that, conditioned on the channel state, the wireless channel constitutes an observable, conditionally Gaussian stochastic process. The channel state evolves in time according to a known, possibly nonstationary, non-Gaussian, low-dimensional Markov kernel.
Recognizing the intractability of general nonlinear state estimation, we advocate the use of grid-based approximate nonlinear filters as an effective and robust means for recursive tracking of the channel state. We also propose a sequential spatiotemporal predictor for tracking the channel gains at any point in time and space, providing real-time sequential estimates for the respective channel gain map. In this context, our contributions are multifold. Apart from the introduction of the layered channel model previously described, this line of research has resulted in a number of general asymptotic convergence results, advancing the theory of grid-based approximate nonlinear stochastic filtering. In particular, sufficient conditions ensuring asymptotic optimality are relaxed and, at the same time, the mode of convergence is strengthened. Although the need for such results originated as an attempt to theoretically characterize the performance of the proposed approximate methods for statistical inference under the proposed channel modeling approach, they turn out to be of fundamental importance in the areas of nonlinear estimation and stochastic control. The experimental validation of the proposed channel model, as well as the related parameter estimation problem, termed "Markovian Channel Profiling (MCP)" and fundamentally important for any practical deployment, is the subject of ongoing research. Second, adopting the first of the two aforementioned channel modeling approaches, we consider the spatially controlled relay beamforming problem for an AF network with a single source, a single destination, and multiple, controlled-at-will relay nodes. (Abstract shortened by ProQuest.)
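The grid-based filtering advocated above can be illustrated generically: discretize the hidden channel state onto a finite grid, push the current posterior through the Markov kernel, then reweight by the likelihood of the new observation. A minimal sketch of that recursion, assuming a three-point grid and a Gaussian observation model (neither taken from the thesis):

    import numpy as np

    def grid_filter_step(prior, P, y, obs_mean, obs_std):
        # Chapman-Kolmogorov prediction through the Markov kernel P[i, j] = Pr(j | i).
        predicted = prior @ P
        # Reweight by the (assumed Gaussian) likelihood of the new observation y.
        lik = np.exp(-0.5 * ((y - obs_mean) / obs_std) ** 2)
        posterior = predicted * lik
        return posterior / posterior.sum()

    # Toy three-point grid for the hidden channel state.
    P = np.array([[0.9, 0.1, 0.0],
                  [0.1, 0.8, 0.1],
                  [0.0, 0.1, 0.9]])
    belief = np.full(3, 1 / 3)
    for y in (0.2, 0.9, 1.1):
        belief = grid_filter_step(belief, P, y, obs_mean=np.array([0.0, 1.0, 2.0]), obs_std=0.5)
    print(belief.round(3))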
Time Series Expression Analyses Using RNA-seq: A Statistical Approach
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-decreasing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulation of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model time-course RNA-seq data, including the statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
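As a pointer to what the AR(1) approach involves for a single gene: regress expression at time t on expression at time t-1 and examine the temporal-dependence coefficient. A minimal synthetic sketch (illustrative only, not the authors' implementation):

    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic log-expression series for one gene across 12 time points.
    x = np.empty(12)
    x[0] = rng.normal()
    for t in range(1, 12):
        x[t] = 0.7 * x[t - 1] + rng.normal(0, 0.3)   # AR(1) dynamics, phi = 0.7

    # Time-lagged regression: fit x_t = c + phi * x_{t-1} by least squares.
    phi_hat, c_hat = np.polyfit(x[:-1], x[1:], 1)
    print(f"estimated autoregressive coefficient: {phi_hat:.2f}")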
Thermodynamics of ideal quantum gas with fractional statistics in D dimensions.
Potter, Geoffrey G; Müller, Gerhard; Karbach, Michael
2007-06-01
We present exact and explicit results for the thermodynamic properties (isochores, isotherms, isobars, response functions, velocity of sound) of a quantum gas in dimensions D ≥ 1 and with fractional exclusion statistics 0 ≤ g ≤ 1 connecting bosons (g=0) and fermions (g=1). In D=1 the results are equivalent to those of the Calogero-Sutherland model. Emphasis is given to the crossover between bosonlike and fermionlike features, caused by aspects of the statistical interaction that mimic long-range attraction and short-range repulsion. A phase transition along the isobar occurs at a nonzero temperature in all dimensions. The T dependence of the velocity of sound is simply related to isochores and isobars. The effects of soft container walls are accounted for rigorously for the case of a pure power-law potential.
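For orientation, the statistical interaction underlying fractional exclusion statistics is conventionally summarized by Wu's occupation-number formula (a standard result quoted here for context, not reproduced from the paper): the mean occupation of a level with energy \(\epsilon\) is

\[
\langle n(\epsilon)\rangle = \frac{1}{w(\epsilon) + g}, \qquad
w(\epsilon)^{\,g}\,\bigl[1 + w(\epsilon)\bigr]^{\,1-g} = e^{(\epsilon-\mu)/k_B T},
\]

which reduces to the Bose-Einstein distribution at g = 0 and to the Fermi-Dirac distribution at g = 1.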
Time series expression analyses using RNA-seq: a statistical approach.
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-decreasing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulation of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model time-course RNA-seq data, including the statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Öztürk, Hande; Noyan, I. Cevdet
A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance, and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears here as a special case limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.
Öztürk, Hande; Noyan, I. Cevdet
2017-08-24
A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance, and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears here as a special case limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.
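A caricature of the sampling issue treated in these records: if each of N illuminated crystallites satisfies the Bragg condition for a given reflection with some small probability p, the number of diffracting grains is binomial with mean Np and variance Np(1 - p), so the relative spread of that count grows as the number of favorably oriented grains shrinks. A toy Monte Carlo check of those moments (p and the grain counts are assumed numbers; this is not the authors' model, which also treats the diffracted intensity):

    import numpy as np

    rng = np.random.default_rng(3)
    p = 1e-4                                # assumed probability a grain can diffract
    for n_grains in (10**5, 10**7):         # two assumed illuminated-grain counts
        counts = rng.binomial(n_grains, p, size=10000)
        rel_sd = counts.std() / counts.mean()
        print(n_grains, counts.mean(), round(rel_sd, 3))   # mean ~ N*p; spread shrinks as N*p grows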
Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.
Mulvany, M J
1975-01-01
1. Methods have been developed for describing the length:tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning, either in the presence of Ca2+ (Ca-rigor) or in its absence (Ca-free rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free-rigor muscles stretched by 0.2 l0 was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistances and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023
A consistent framework for Horton regression statistics that leads to a modified Hack's law
Furey, P.R.; Troutman, B.M.
2008-01-01
A statistical framework is introduced that resolves important problems with the interpretation and use of traditional Horton regression statistics. The framework is based on a univariate regression model that leads to an alternative expression for the Horton ratio, connects Horton regression statistics to distributional simple scaling, and improves the accuracy in estimating Horton plot parameters. The model is used to examine data for drainage area A and mainstream length L from two groups of basins located in different physiographic settings. Results show that confidence intervals for the Horton plot regression statistics are quite wide. Nonetheless, an analysis of covariance shows that regression intercepts, but not regression slopes, can be used to distinguish between basin groups. The univariate model is generalized to include n > 1 dependent variables. For the case where the dependent variables represent ln A and ln L, the generalized model performs somewhat better at distinguishing between basin groups than two separate univariate models. The generalized model leads to a modification of Hack's law where L depends on both A and Strahler order ω. Data show that ω plays a statistically significant role in the modified Hack's law expression. © 2008 Elsevier B.V.
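The modified Hack's law can be sketched as a two-predictor regression in log space: ln L = b0 + b1 ln A + b2 ω. The snippet below fits that form to synthetic basins (the coefficients and noise level are assumptions for illustration, not the paper's data):

    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic basins: log drainage area, Strahler order, log mainstream length.
    logA = rng.uniform(0, 6, 200)
    order = rng.integers(1, 6, 200)
    logL = 0.1 + 0.55 * logA + 0.05 * order + rng.normal(0, 0.1, 200)  # assumed relation

    # Modified Hack's law fit: ln L = b0 + b1 ln A + b2 * order.
    X = np.column_stack([np.ones_like(logA), logA, order])
    b, *_ = np.linalg.lstsq(X, logL, rcond=None)
    print("b0, b1 (Hack exponent), b2 (order effect):", b.round(3))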
Turning great strategy into great performance.
Mankins, Michael C; Steele, Richard
2005-01-01
Despite the enormous time and energy that goes into strategy development, many companies have little to show for their efforts. Indeed, research by the consultancy Marakon Associates suggests that companies on average deliver only 63% of the financial performance their strategies promise. In this article, Michael Mankins and Richard Steele of Marakon present the findings of this research. They draw on their experience with high-performing companies like Barclays, Cisco, Dow Chemical, 3M, and Roche to establish some basic rules for setting and delivering strategy: Keep it simple, make it concrete. Avoid long, drawn-out descriptions of lofty goals and instead stick to clear language describing what your company will and won't do. Debate assumptions, not forecasts. Create cross-functional teams drawn from strategy, marketing, and finance to ensure the assumptions underlying your long-term plans reflect both the real economics of your company's markets and its actual performance relative to competitors. Use a rigorous analytic framework. Ensure that the dialogue between the corporate center and the business units about market trends and assumptions is conducted within a rigorous framework, such as that of "profit pools". Discuss resource deployments early. Create more realistic forecasts and more executable plans by discussing up front the level and timing of critical deployments. Clearly identify priorities. Prioritize tactics so that employees have a clear sense of where to direct their efforts. Continuously monitor performance. Track resource deployment and results against plan, using continuous feedback to reset assumptions and reallocate resources. Reward and develop execution capabilities. Motivate and develop staff. Following these rules strictly can help narrow the strategy-to-performance gap.
Akl, E A; Ward, K D; Bteddini, D; Khaliel, R; Alexander, A C; Lotfi, T; Alaouie, H; Afifi, R A
2015-01-01
Objective: The objective of this narrative review is to highlight the determinants of the epidemic rise in waterpipe tobacco smoking (WTS) among youth globally. The Ecological Model of Health Promotion (EMHP) was the guiding framework for the review. Data sources: The following electronic databases were searched: Cochrane Library, MEDLINE, EMBASE, PsycINFO, Web of Science and CINAHL Plus with Full Text. Search terms included waterpipe and its many variant terms. Study selection: Articles were included if they were published between 1990 and 2014, were in English, were available in full text and included the age group 10-29 years. Data extraction: Articles which analysed determinants of WTS at any of the levels of the EMHP were retained regardless of methodological rigour: 131 articles are included. Articles were coded in a standard template that abstracted methods as well as results. Data synthesis: The review found that methodologies used to assess determinants of WTS among youth were often conventional and lacked rigour: three-quarters of the studies were cross-sectional surveys and most enrolled non-representative samples. Within the framework, the review identified determinants of WTS at the intrapersonal, interpersonal, organisational, community and policy levels. Conclusions: The review suggests potential interventions to control WTS among youth, with emphasis on creative utilisation of social media, and tobacco control policies that include the specificities of WTS. The review further suggests the need for rigorous qualitative work to better contextualise determinants, and prospective observational and experimental studies that track and manipulate them to assess their viability as intervention targets. PMID:25618895
ERIC Educational Resources Information Center
Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben
2016-01-01
The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…
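Although the record above is truncated, the quantity it targets can always be approximated by Monte Carlo: simulate a two-level cluster-randomized trial with a treatment-by-moderator interaction, test the interaction, and count rejections. A simulation-based sketch (the effect sizes and the plain-OLS test are assumptions made to keep the example short; the authors' closed-form formulations, or a mixed model, would be used in practice):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    def sim_power(n_clusters=40, n_per=20, gamma=0.2, icc=0.1, reps=500):
        # Monte Carlo power for a treatment-by-moderator interaction in a 2-level CRT.
        hits = 0
        for _ in range(reps):
            treat = np.repeat(rng.permutation([0, 1] * (n_clusters // 2)), n_per)
            mod = rng.normal(size=n_clusters * n_per)                      # person-level moderator
            u = np.repeat(rng.normal(0, np.sqrt(icc), n_clusters), n_per)  # cluster effects
            y = 0.3 * treat + gamma * treat * mod + u + rng.normal(0, np.sqrt(1 - icc), len(u))
            X = np.column_stack([np.ones_like(mod), treat, mod, treat * mod])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            dof = len(y) - 4
            sigma2 = ((y - X @ beta) ** 2).sum() / dof
            se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[3, 3])
            hits += abs(beta[3] / se) > stats.t.ppf(0.975, dof)   # reject at the 5% level?
        return hits / reps

    print(sim_power())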
ERIC Educational Resources Information Center
McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley
2015-01-01
In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…
Multi-Disciplinary Knowledge Synthesis for Human Health Assessment on Earth and in Space
NASA Astrophysics Data System (ADS)
Christakos, G.
We discuss methodological developments in multi-disciplinary knowledge synthesis (KS) of human health assessment. A theoretical KS framework can provide the rational means for the assimilation of various information bases (general, site-specific etc.) that are relevant to the life system of interest. KS-based techniques produce a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, and generate informative health state predictions across space-time. The underlying epistemic cognition methodology is based on teleologic criteria and stochastic logic principles. The mathematics of KS involves a powerful and versatile spatiotemporal random field model that accounts rigorously for the uncertainty features of the life system and imposes no restriction on the shape of the probability distributions or the form of the predictors. KS theory is instrumental in understanding natural heterogeneities, assessing crucial human exposure correlations and laws of physical change, and explaining toxicokinetic mechanisms and dependencies in a spatiotemporal life system domain. It is hoped that a better understanding of KS fundamentals would generate multi-disciplinary models that are useful for the maintenance of human health on Earth and in Space.
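One concrete instance of the spatiotemporal random field prediction mentioned above is simple kriging under a separable space-time covariance. The sketch below assumes a known field mean, an exponential covariance, and made-up observations; it illustrates the generic predictor, not the KS methodology itself:

    import numpy as np

    def st_cov(dx, dt, sill=1.0, ls=2.0, lt=3.0):
        # Separable exponential space-time covariance (an assumed form).
        return sill * np.exp(-dx / ls) * np.exp(-dt / lt)

    # Made-up observations of a health-related field at (location, time) points.
    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
    z = np.array([1.2, 0.9, 1.0, 0.4])
    mean = 1.0                                  # assumed known field mean
    target = np.array([0.5, 0.5])               # prediction (location, time)

    dx = np.abs(pts[:, 0, None] - pts[None, :, 0])
    dt = np.abs(pts[:, 1, None] - pts[None, :, 1])
    C = st_cov(dx, dt)                          # data-to-data covariance
    c0 = st_cov(np.abs(pts[:, 0] - target[0]), np.abs(pts[:, 1] - target[1]))

    w = np.linalg.solve(C, c0)                  # simple-kriging weights
    pred = mean + w @ (z - mean)
    var = st_cov(0.0, 0.0) - w @ c0             # prediction (kriging) variance
    print(round(pred, 3), round(var, 3))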
PRO development: rigorous qualitative research as the crucial foundation.
Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen
2010-10-01
Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.
Qualitative research: the "what," "why," "who," and "how"!
Cypress, Brigitte S
2015-01-01
There has been a general view of qualitative research as a lower level form of inquiry and the diverse conceptualizations of what it is, its use or utility, its users, the process of how it is conducted, and its scientific merit. This fragmented understanding and varied ways in which qualitative research is conceived, synthesized, and presented have a myriad of implications in demonstrating and enhancing the utilization of its findings and the ways and skills required in transforming knowledge gained from it. The purpose of this article is to define qualitative research and discuss its significance in research, the questions it addresses, its characteristics, methods and criteria for rigor, and the type of results it can offer. A framework for understanding the "what," "why," "who," and "how" of qualitative research; the different approaches; and the strategies to achieve trustworthiness are presented. Qualitative research provides insights into health-related phenomena and seeks to understand and interpret subjective experience and thus humanizes health care and can enrich further research inquiries and be made clearer and more rigorous as it is relevant to the perspective and goals of nursing.
PRO development: rigorous qualitative research as the crucial foundation
Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen
2010-01-01
Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity. PMID:20512662
A data fusion framework for meta-evaluation of intelligent transportation system effectiveness
DOT National Transportation Integrated Search
This study presents a framework for the meta-evaluation of Intelligent Transportation System effectiveness. The framework is based on data fusion approaches that adjust for data biases and violations of other standard statistical assumptions. Operati...
Applications of statistics to medical science (1) Fundamental concepts.
Watanabe, Hiroshi
2011-01-01
The conceptual framework of statistical tests and statistical inferences are discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.
Galka, Andreas; Siniatchkin, Michael; Stephani, Ulrich; Groening, Kristina; Wolff, Stephan; Bosch-Bayard, Jorge; Ozaki, Tohru
2010-12-01
The analysis of time series obtained by functional magnetic resonance imaging (fMRI) may be approached by fitting predictive parametric models, such as nearest-neighbor autoregressive models with exogenous input (NNARX). As a part of the modeling procedure, it is possible to apply instantaneous linear transformations to the data. Spatial smoothing, a common preprocessing step, may be interpreted as such a transformation. The autoregressive parameters may be constrained such that they provide a response behavior that corresponds to the canonical haemodynamic response function (HRF). We present an algorithm for estimating the parameters of the linear transformations and of the HRF within a rigorous maximum-likelihood framework. Using this approach, an optimal amount of both the spatial smoothing and the HRF can be estimated simultaneously for a given fMRI data set. An example from a motor-task experiment is discussed. It is found that, for this data set, weak, but non-zero, spatial smoothing is optimal. Furthermore, it is demonstrated that activated regions can be estimated within the maximum-likelihood framework.
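To indicate what a rigorous maximum-likelihood fit involves at its simplest, the sketch below writes the exact Gaussian likelihood of an AR(1) series (the elementary building block of an NNARX model) and maximizes it numerically. Synthetic data; the paper's model additionally includes spatial coupling, exogenous input, the HRF constraint, and the transformation parameters:

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(6)
    x = np.empty(300)
    x[0] = rng.normal()
    for t in range(1, 300):
        x[t] = 0.6 * x[t - 1] + rng.normal(0, 0.5)   # synthetic AR(1) series

    def neg_loglik(theta):
        phi, log_s = theta
        s2 = np.exp(2 * log_s)
        e = x[1:] - phi * x[:-1]                     # one-step prediction errors
        # Exact likelihood: stationary density of x[0] plus the innovation terms.
        ll = -0.5 * (np.log(2 * np.pi * s2 / (1 - phi**2)) + x[0]**2 * (1 - phi**2) / s2)
        ll -= 0.5 * np.sum(np.log(2 * np.pi * s2) + e**2 / s2)
        return -ll

    res = minimize(neg_loglik, x0=[0.0, 0.0], bounds=[(-0.99, 0.99), (None, None)])
    print(res.x[0], np.exp(res.x[1]))   # estimated phi and innovation s.d.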
Quality assurance in transnational higher education: a case study of the tropEd network
2013-01-01
Introduction: Transnational or cross-border higher education has expanded rapidly since the 1980s. Together with that expansion, issues of quality assurance came to the fore. This article aims to identify key issues regarding quality assurance of transnational higher education and discusses the quality assurance of the tropEd Network for International Health in Higher Education in relation to these key issues. Methods: Literature review and review of documents. Results: From the literature, the following key issues regarding transnational quality assurance were identified and explored: comparability of quality assurance frameworks, true collaboration versus erosion of national education sovereignty, accreditation agencies, and transparency. The tropEd network developed a transnational quality assurance framework for the network. The network accredits modules through a rigorous process which has been accepted by major stakeholders. This process was a participatory learning process and, at the same time, had a positive effect on relations between the institutions. Discussion: The development of the quality assurance framework and the process provide a potential example for others. PMID:23537108
Causal Analysis After Haavelmo
Heckman, James; Pinto, Rodrigo
2014-01-01
Haavelmo's seminal 1943 and 1944 papers are the first rigorous treatment of causality. In them, he distinguished the definition of causal parameters from their identification. He showed that causal parameters are defined using hypothetical models that assign variation to some of the inputs determining outcomes while holding all other inputs fixed. He thus formalized and made operational Marshall's (1890) ceteris paribus analysis. We embed Haavelmo's framework into the recursive framework of Directed Acyclic Graphs (DAGs) used in one influential recent approach to causality (Pearl, 2000) and in the related literature on Bayesian nets (Lauritzen, 1996). We compare the simplicity of an analysis of causality based on Haavelmo's methodology with the complex and nonintuitive approach used in the causal literature of DAGs—the “do-calculus” of Pearl (2009). We discuss the severe limitations of DAGs and in particular of the do-calculus of Pearl in securing identification of economic models. We extend our framework to consider models for simultaneous causality, a central contribution of Haavelmo. In general cases, DAGs cannot be used to analyze models for simultaneous causality, but Haavelmo's approach naturally generalizes to cover them. PMID:25729123
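Haavelmo's distinction between conditioning and fixing can be stated in one line for a linear model (a textbook rendering for orientation, not a quotation from the paper): with outcome Y = βX + U and X possibly correlated with the unobserved U,

\[
E[Y \mid X = x] = \beta x + E[U \mid X = x],
\qquad
E[Y \mid \mathrm{fix}\; X = x] = \beta x + E[U],
\]

so the causal parameter β is defined by hypothetical external variation in X (fixing), not by passive observation (conditioning); the two coincide only when U is independent of X.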
Reference condition approach to restoration planning
Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.
2010-01-01
Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scalable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, best achievable condition (A_BAC), measured magnitude (MM_i, which can be determined at one or many times and places) and desired future condition (A_DFC), that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.
Development of a theoretical framework for analyzing cerebrospinal fluid dynamics
Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy
2009-01-01
Background: To date, hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure-volume models and electric circuit analogs introduced pressure into volume conservation, but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Methods: Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results: Several examples of meaningful intracranial control volumes and the particular measurement sets needed for the analysis are discussed. Conclusion: Control volume analysis provides a framework to guide the type and location of measurements and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
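For orientation, the identity at the heart of control volume analysis is the integral statement of mass conservation (standard fluid mechanics, quoted generically rather than from the paper):

\[
\frac{d}{dt}\int_{CV} \rho \, dV \;+\; \oint_{CS} \rho\,(\mathbf{u}\cdot\mathbf{n})\, dA \;=\; 0,
\]

i.e. the rate of change of mass inside a chosen intracranial control volume is balanced by the net flux across its bounding surface. Each term maps onto a clinically measurable quantity, e.g. volume change from imaging and transcranial flows from cine phase-contrast MRI.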
NASA Astrophysics Data System (ADS)
Brown, Bryan A.; Kloser, Matt
2009-12-01
We respond to Hwang and Kim and Yeo's critiques of the conceptual continuity framework in science education. First, we address the criticism that their analysis fails to recognize the situated perspective of learning by denying the dichotomy of formal and informal knowledge as a starting point in the learning process. Second, we address the critique that students' descriptions fail to meet the "gold standard" of science education, alignment with an authoritative source and generalizability, by highlighting some student-expert congruence that could serve as the foundation for future learning. Third, we address the critique that a conceptual continuity framework could lead to less rigorous science education goals by arguing that the ultimate goals do not change; rather, if the pathways that lead to the goals' achievement recognize existing lexical continuities, science teaching may become more efficient. In sum, we argue that a conceptual continuities framework provides an asset-based, not deficit-based, lexical perspective from which science teacher educators and science educators can begin to address and build complete science understandings.
Long wavelength perfect fluidity from short distance jet transport in quark-gluon plasmas
Xu, J.; Liao, J.; Gyulassy, M.
2015-12-01
Here, we build a new phenomenological framework that bridges the long-wavelength bulk viscous transport properties of the strongly coupled quark-gluon plasma (sQGP) and short-distance hard jet transport properties in the QGP. The full nonperturbative chromo-electric (E) and chromo-magnetic (M) structure of the near-"perfect fluid" sQGP in the critical transition region is integrated, in a lattice-compatible way, into a semi-Quark-Gluon-Monopole Plasma (sQGMP) model and implemented into the new CUJET3.0 jet quenching framework. All observables computed from CUJET3.0 are found to be consistent with available data at RHIC and LHC simultaneously. Moreover, a quantitative connection between the shear viscosity and the jet transport parameter is rigorously established within this framework. Finally, we deduce the T = 160-600 MeV dependence of the QGP's η/s: its near-vanishing value in the near-Tc regime is determined by the composition of E and M charges, it increases as T rises, and its high-T limit is fixed by color screening scales.
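For context only: one widely quoted weak-coupling estimate relating the two transport properties bridged here (due to Majumder, Müller and Wang; quoted for orientation, and not necessarily the relation established within the CUJET3.0 framework) is

\[
\frac{\eta}{s} \;\approx\; 1.25\,\frac{T^{3}}{\hat{q}},
\]

where \(\hat{q}\) is the jet transport parameter; on that estimate, near-perfect fluidity (small η/s) corresponds to strong jet quenching (large \(\hat{q}\)).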
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
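A minimal sketch of the Bayesian point estimation example: given draws from a posterior, the Bayes estimator under squared-error loss is the posterior mean, and under absolute-error loss the posterior median. The posterior below is an assumed stand-in, not the paper's grassland-bird application:

    import numpy as np

    rng = np.random.default_rng(7)
    draws = rng.gamma(shape=3.0, scale=0.5, size=100_000)  # assumed posterior draws

    def bayes_estimate(draws, loss="squared"):
        # Bayes estimator = action minimizing posterior expected loss.
        if loss == "squared":
            return draws.mean()      # posterior mean minimizes E[(a - theta)^2]
        if loss == "absolute":
            return np.median(draws)  # posterior median minimizes E[|a - theta|]
        raise ValueError(loss)

    print(bayes_estimate(draws), bayes_estimate(draws, "absolute"))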
Detection, isolation and diagnosability analysis of intermittent faults in stochastic systems
NASA Astrophysics Data System (ADS)
Yan, Rongyi; He, Xiao; Wang, Zidong; Zhou, D. H.
2018-02-01
Intermittent faults (IFs) have the properties of unpredictability, non-determinacy, inconsistency and repeatability, switching systems between faulty and healthy status. In this paper, the fault detection and isolation (FDI) problem for IFs in a class of linear stochastic systems is investigated. Detection and isolation of IFs involves: (1) detecting all the appearing and disappearing times of an IF; (2) detecting each appearing (disappearing) time of the IF before the subsequent disappearing (appearing) time; and (3) determining where the IFs happen. Based on the outputs of the observers we design, a novel set of residuals is constructed using the sliding-time-window technique, and two hypothesis tests are proposed to detect all the appearing and disappearing times of IFs. The isolation problem for IFs is also considered. Furthermore, within a statistical framework, a definition of the diagnosability of IFs is proposed, and a sufficient condition for the diagnosability of IFs is put forward. Quantitative performance analysis results for the false alarm rate and missing detection rate are discussed, and the influences of some key parameters of the proposed scheme on performance indices such as the false alarm rate and missing detection rate are analysed rigorously. The effectiveness of the proposed scheme is illustrated via a simulation example of an unmanned helicopter longitudinal control system.
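The sliding-time-window idea can be sketched generically: compute windowed means of a residual signal and declare the fault active whenever the windowed statistic crosses a threshold set by a target false-alarm rate; the appearing and disappearing times are then the edges of the activity indicator. Everything below (Gaussian residuals, a 3-sigma threshold) is an assumption for illustration; the paper's two hypothesis tests are more refined:

    import numpy as np

    rng = np.random.default_rng(8)

    # Synthetic residual: zero-mean noise with an intermittent fault active on two intervals.
    r = rng.normal(0, 1, 600)
    r[150:220] += 3.0
    r[400:450] += 3.0

    W = 20                                                 # sliding-window length
    stat = np.convolve(r, np.ones(W) / W, mode="valid")    # windowed mean residual
    thresh = 3 / np.sqrt(W)                                # 3-sigma bound for the mean of W N(0,1) samples
    active = stat > thresh

    # Estimated appearing/disappearing times: edges of the detected activity indicator.
    print(np.flatnonzero(np.diff(active.astype(int))))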
The Health Impact Assessment (HIA) Resource and Tool ...
Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process.
Roberts, Megan C; Clyne, Mindy; Kennedy, Amy E; Chambers, David A; Khoury, Muin J
2017-10-26
Purpose: Implementation science offers methods to evaluate the translation of genomic medicine research into practice. The extent to which the National Institutes of Health (NIH) human genomics grant portfolio includes implementation science is unknown. This brief report's objective is to describe recently funded implementation science studies in genomic medicine in the NIH grant portfolio, and identify remaining gaps. Methods: We identified investigator-initiated NIH research grants on implementation science in genomic medicine (funding initiated 2012-2016). A codebook was adapted from the literature, three authors coded grants, and descriptive statistics were calculated for each code. Results: Forty-two grants fit the inclusion criteria (~1.75% of investigator-initiated genomics grants). The majority of included grants proposed qualitative and/or quantitative methods with cross-sectional study designs, and described clinical settings and primarily white, non-Hispanic study populations. Most grants were in oncology and examined genetic testing for risk assessment. Finally, grants lacked the use of implementation science frameworks, and most examined uptake of genomic medicine and/or assessed patient-centeredness. Conclusion: We identified large gaps in implementation science studies in genomic medicine in the funded NIH portfolio over the past 5 years. To move the genomics field forward, investigator-initiated research grants should employ rigorous implementation science methods within diverse settings and populations. Genetics in Medicine advance online publication, 26 October 2017; doi:10.1038/gim.2017.180.
Frerichs, L; Ataga, O; Corbie-Smith, G; Tessler Lindau, S
2016-12-01
A growing number of childhood obesity interventions involve children and youth in participatory roles, but these types of interventions have not been systematically reviewed. We aimed to identify child and youth participatory interventions in the peer-reviewed literature in order to characterize the approaches and examine their impact on obesity and obesity-related lifestyle behaviours. We searched PubMed/MEDLINE, PsycINFO and ERIC for quasi-experimental and randomized trials conducted from the date of database initiation through May 2015 that engaged children or youth in implementing healthy eating, physical activity or weight management strategies. Eighteen studies met our eligibility criteria. Most (n = 14) trained youth to implement pre-defined strategies targeting their peers. A few (n = 4) assisted youth to plan and implement interventions that addressed environmental changes. Thirteen studies reported at least one statistically significant weight, physical activity or dietary change outcome. Participatory approaches have potential, but variation in strategies and outcomes leaves questions unanswered about the mechanisms through which child and youth engagement impacts childhood obesity. Future research should compare child- or youth-delivered to adult-delivered health promotion interventions and more rigorously evaluate natural experiments that engage youth to implement environmental changes. With careful attention to theoretical frameworks and process and outcome measures, these studies could strengthen the effectiveness of child and youth participatory approaches. © 2016 World Obesity Federation.
Good pharmacovigilance practices: technology enabled.
Nelson, Robert C; Palsulich, Bruce; Gogolak, Victor
2002-01-01
The assessment of spontaneous reports is most effective when it is conducted within a defined and rigorous process. The framework for good pharmacovigilance process (GPVP) is proposed as a subset of good postmarketing surveillance process (GPMSP), a functional structure for both a public health and a corporate risk management strategy. GPVP comprises good practices that implement each step within a defined process. These practices are designed to efficiently and effectively detect and alert the drug safety professional to new and potentially important information on drug-associated adverse reactions. They are enabled by applied technology designed specifically for the review and assessment of spontaneous reports. Specific practices include rules-based triage, active query prompts for severe organ insults, contextual single-case evaluation, statistical proportionality and correlational checks, case-series analyses, and templates for signal work-up and interpretation. These practices and the overall GPVP are supported by state-of-the-art web-based systems with powerful analytical engines, workflow and audit trails that allow validated systems support for valid drug safety signalling efforts. It is also important to understand that a process has a defined set of steps and that no single step can stand independently. Specifically, advanced use of technical alerting methods in isolation can mislead and allow one to misunderstand priorities and relative value. In the end, pharmacovigilance is a clinical art and a component process of the science of pharmacoepidemiology and risk management.
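The statistical proportionality checks mentioned above are typically disproportionality measures computed from report counts; a common one is the proportional reporting ratio (PRR). The counts below are made up, and the PRR is shown as a generic illustration, not as the product's actual algorithm:

    # 2x2 spontaneous-report table (hypothetical counts):
    #                      event of interest   all other events
    # drug of interest          a = 20             b = 480
    # all other drugs           c = 100            d = 99400
    a, b, c, d = 20, 480, 100, 99400

    prr = (a / (a + b)) / (c / (c + d))   # proportional reporting ratio
    print(round(prr, 1))                  # PRR >> 1 flags a signal for clinical work-up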
Comparison of advertising strategies between the indoor tanning and tobacco industries.
Greenman, Jennifer; Jones, David A
2010-04-01
The indoor tanning industry is large and continues to grow, with 2007 domestic sales in excess of $5 billion. Advertising is central to shaping the consumer's perception of indoor tanning as well as driving industry demand. This article aims to identify key drivers of consumer appeal by comparing tanning advertising strategies to those used by tobacco marketers. Tobacco advertising was selected as a reference framework because it is both well documented and designed to promote a product with known health hazards. Two thousand advertisements from 4 large tobacco advertisement databases were analyzed for the type of advertisement strategy used, and 4 advertising method categories were devised to incorporate the maximum number of advertisements reviewed. Subsequently, contemporary tanning advertisements were collected from industry magazines and salon websites and evaluated relative to the identified strategy profiles. Both industries have relied on similar advertising strategies, including mitigating health concerns, appealing to a sense of social acceptance, emphasizing psychotropic effects, and targeting specific population segments. This examination is a small observational study, conducted without rigorous statistical analysis and limited both by the number of advertisements and by the advertising strategies examined. Given the strong parallels between tobacco and tanning advertising methodologies, further consumer education and investigation into the public health risks of indoor tanning are needed. Copyright 2009 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
Software Verification of Orion Cockpit Displays
NASA Technical Reports Server (NTRS)
Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee
2017-01-01
NASA's latest spacecraft, Orion, is in development to take humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable the creation of scripts in an efficient fashion. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be used repeatedly in the verification of different displays.
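EggPlant scripts themselves are written in SenseTalk against captured images; the snippet below illustrates only the underlying image-based verification idea, in Python with OpenCV template matching, and is not EggPlant syntax (the filenames are hypothetical):

    import cv2  # OpenCV: pip install opencv-python

    def display_shows(screenshot_path, expected_glyph_path, threshold=0.95):
        # True if the expected display element appears on the captured screen image.
        screen = cv2.imread(screenshot_path, cv2.IMREAD_GRAYSCALE)
        glyph = cv2.imread(expected_glyph_path, cv2.IMREAD_GRAYSCALE)
        score = cv2.matchTemplate(screen, glyph, cv2.TM_CCOEFF_NORMED).max()
        return score >= threshold

    # Hypothetical check: did the display render the expected caution annunciator?
    print(display_shows("orion_screen.png", "caution_annunciator.png"))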