Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
NASA Technical Reports Server (NTRS)
Thomas-Keprta, Kathie L.; Clemett, Simon J.; Bazylinski, Dennis A.; Kirschvink, Joseph L.; McKay, David S.; Wentworth, Susan J.; Vali, H.; Gibson, Everett K.
2000-01-01
Here we use rigorous mathematical modeling to compare ALH84001 prismatic magnetites with those produced by terrestrial magnetotactic bacteria, MV-1. We find that this subset of the Martian magnetites appears to be statistically indistinguishable from those of MV-1.
Comparative effectiveness research methodology using secondary data: A starting user's guide.
Sun, Maxine; Lipsitz, Stuart R
2018-04-01
The use of secondary data, such as claims or administrative data, in comparative effectiveness research has grown tremendously in recent years. This review aims to help investigators relying on secondary data to (1) gain insight into the available methodologies and statistical methods, (2) better understand the need for rigorous planning before initiating a comparative effectiveness investigation, and (3) optimize the quality of their investigations. Specifically, we review the concepts of adjusted analyses and confounders, methods for propensity score and instrumental variable analyses, risk prediction models (logistic and time-to-event), decision-curve analysis, and the interpretation of the P value and hypothesis testing. Copyright © 2017 Elsevier Inc. All rights reserved.
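As a concrete illustration of one of the techniques this review covers, the following is a minimal sketch of a propensity score analysis using inverse-probability-of-treatment weighting. All data, variable names, and effect sizes are invented for illustration; it is not tied to any particular study or to the review's own examples.

```python
# Minimal sketch of a propensity-score analysis (inverse-probability-of-treatment
# weighting) on simulated observational data. Everything here is invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
age = rng.normal(65, 10, n)          # confounder
severity = rng.normal(0, 1, n)       # confounder
# Treatment assignment depends on confounders, as in claims/administrative data.
p_treat = 1 / (1 + np.exp(-(-0.05 * (age - 65) + 0.8 * severity)))
treat = rng.binomial(1, p_treat)
# Outcome depends on treatment and confounders; the true effect is protective.
p_out = 1 / (1 + np.exp(-(-0.5 * treat + 0.03 * (age - 65) + 0.6 * severity)))
y = rng.binomial(1, p_out)

# 1) Fit the propensity score model (probability of treatment given confounders).
X = sm.add_constant(np.column_stack([age, severity]))
ps = sm.Logit(treat, X).fit(disp=0).predict(X)

# 2) Weight each subject by the inverse probability of the treatment received.
w = np.where(treat == 1, 1 / ps, 1 / (1 - ps))

# 3) Compare weighted outcome rates in the confounder-balanced pseudo-population.
p1 = np.average(y[treat == 1], weights=w[treat == 1])
p0 = np.average(y[treat == 0], weights=w[treat == 0])
print("naive risk difference:   %.3f" % (y[treat == 1].mean() - y[treat == 0].mean()))
print("weighted risk difference: %.3f" % (p1 - p0))   # closer to the true protective effect
```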
Testing for Mutagens Using Fruit Flies.
ERIC Educational Resources Information Center
Liebl, Eric C.
1998-01-01
Describes a laboratory employed in undergraduate teaching that uses fruit flies to test student-selected compounds for their ability to cause mutations. Requires no prior experience with fruit flies, incorporates a student design component, and employs both rigorous controls and statistical analyses. (DDR)
Schaid, Daniel J
2010-01-01
Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
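To make the kernel requirement described above concrete, the following is a minimal sketch that builds a linear genomic similarity (kernel) matrix from simulated genotype dosages and checks that it is positive semidefinite. In a real analysis this kernel would be passed to, for example, a kernel machine score test or a linear mixed model; the data and scaling choices here are illustrative assumptions.

```python
# Minimal sketch: build a genomic similarity (kernel) matrix from genotype dosages
# and verify it is positive semidefinite, as required of a valid kernel.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_snps = 50, 200
maf = rng.uniform(0.05, 0.5, n_snps)
G = rng.binomial(2, maf, size=(n_subjects, n_snps)).astype(float)  # dosages 0/1/2

# Centre and scale each SNP, then form a linear (genetic relationship) kernel.
Z = (G - 2 * maf) / np.sqrt(2 * maf * (1 - maf))
K = Z @ Z.T / n_snps

# A kernel must be symmetric and positive semidefinite: all eigenvalues >= 0
# (up to numerical error). Larger K[i, j] means subjects i and j are more similar.
eigvals = np.linalg.eigvalsh(K)
print("symmetric:", np.allclose(K, K.T))
print("min eigenvalue:", eigvals.min())
```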
ERIC Educational Resources Information Center
Harwell, Michael
2014-01-01
Commercial data analysis software has been a fixture of quantitative analyses in education for more than three decades. Despite its apparent widespread use there is no formal evidence cataloging what software is used in educational research and educational statistics classes, by whom and for what purpose, and whether some programs should be…
First Monte Carlo analysis of fragmentation functions from single-inclusive e + e - annihilation
Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...
2016-12-02
Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates potential bias in traditional analyses based on single fits introduced by fixing parameters not well contrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific features of fragmentation functions using the new IMC methodology and those obtained from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.
NASA Astrophysics Data System (ADS)
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
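One standard way to separate uncertainty from variability, as the abstract describes for the power generation case study, is a nested (two-dimensional) Monte Carlo simulation. The sketch below is a generic illustration of that idea, not the authors' model; the parameter names and distributions are invented.

```python
# Generic two-dimensional (nested) Monte Carlo sketch: the outer loop samples
# *uncertain* model parameters, the inner loop samples *variability* across
# individual plants. Distributions and parameter names are invented.
import numpy as np

rng = np.random.default_rng(2)
n_uncertainty, n_variability = 200, 1000

results = np.empty((n_uncertainty, n_variability))
for i in range(n_uncertainty):
    # Uncertainty: imperfect knowledge of an emission factor (kg CO2 per kWh).
    emission_factor = rng.normal(0.9, 0.05)
    # Variability: plant-to-plant spread in heat rate (efficiency).
    heat_rate = rng.lognormal(mean=0.0, sigma=0.15, size=n_variability)
    results[i] = emission_factor * heat_rate

# Variability is the spread within one outer draw; uncertainty is the spread of
# summary statistics across outer draws.
medians = np.median(results, axis=1)
print("variability (P5-P95 within one draw):      ", np.percentile(results[0], [5, 95]))
print("uncertainty in the median (P5-P95 across): ", np.percentile(medians, [5, 95]))
```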
Sunspot activity and influenza pandemics: a statistical assessment of the purported association.
Towers, S
2017-10-01
Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all have purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus of this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls: inattention to analysis reproducibility and robustness assessment are common problems in the sciences that unfortunately are not noted often enough in review.
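The sketch below illustrates the spirit of an un-binned, permutation-based test of the kind advocated above: the sunspot level in pandemic years is compared with a null distribution built by randomly reassigning the same number of years. The annual series and the list of pandemic years are stand-ins, not the data analysed in the paper.

```python
# Minimal un-binned permutation test: is the mean sunspot number in pandemic
# years unusually high compared with randomly chosen years? All inputs invented.
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1700, 2017)
# Invented stand-in for an annual sunspot series (rough 11-year cycle plus noise).
sunspots = 80 + 70 * np.sin(2 * np.pi * (years - 1700) / 11) + rng.normal(0, 20, years.size)
pandemic_years = np.array([1729, 1781, 1830, 1889, 1918, 1957, 1968, 2009])

observed = sunspots[np.isin(years, pandemic_years)].mean()

# Null distribution: mean sunspot number for the same number of randomly drawn years.
n_perm = 20000
null = np.array([rng.choice(sunspots, size=pandemic_years.size, replace=False).mean()
                 for _ in range(n_perm)])
p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
print("observed mean: %.1f, permutation p-value: %.3f" % (observed, p_value))
```

No arbitrary binning of sunspot levels or pandemic dates is needed, so there is no binning choice whose variation could create or destroy apparent significance.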
a Critical Review of Automated Photogrammetric Processing of Large Datasets
NASA Astrophysics Data System (ADS)
Remondino, F.; Nocerino, E.; Toschi, I.; Menna, F.
2017-08-01
The paper reports some comparisons between commercial software able to automatically process image datasets for 3D reconstruction purposes. The main aspects investigated in the work are the capability to correctly orient large sets of images of complex environments, the metric quality of the results, replicability and redundancy. Different datasets are employed, each one featuring a diverse number of images, GSDs at cm and mm resolutions, and ground truth information to perform statistical analyses of the 3D results. A summary of (photogrammetric) terms is also given, in order to establish rigorous terms of reference for comparisons and critical analyses.
Can power-law scaling and neuronal avalanches arise from stochastic dynamics?
Touboul, Jonathan; Destexhe, Alain
2010-02-11
The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
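The following sketch reproduces the paper's qualitative point on synthetic data rather than on LFP recordings: event sizes drawn from a plainly non-critical (lognormal) distribution can yield a convincing straight line on log-log axes, yet a maximum-likelihood power-law fit checked with a Kolmogorov-Smirnov statistic is far less forgiving. The fitting constants and thresholds are illustrative choices.

```python
# Spurious power-law scaling: lognormal "event sizes" look linear on log-log
# axes but fail a more rigorous Kolmogorov-Smirnov check of a power-law fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
sizes = rng.lognormal(mean=1.0, sigma=1.0, size=5000)   # non-critical stochastic stand-in

# 1) Naive approach: straight-line fit to the log-log histogram.
bins = np.logspace(np.log10(sizes.min()), np.log10(sizes.max()), 30)
counts, edges = np.histogram(sizes, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])
mask = counts > 0
slope, intercept, r, _, _ = stats.linregress(np.log10(centers[mask]), np.log10(counts[mask]))
print("log-log slope %.2f, r^2 = %.3f" % (slope, r**2))   # often a deceptively good fit

# 2) More rigorous: maximum-likelihood power-law fit above a threshold xmin,
#    tested with a Kolmogorov-Smirnov statistic.
xmin = np.percentile(sizes, 50)
tail = sizes[sizes >= xmin]
alpha = 1 + tail.size / np.sum(np.log(tail / xmin))       # continuous power-law MLE
ks = stats.kstest(tail, lambda x: 1 - (x / xmin) ** (1 - alpha))
# Note: this p-value is approximate because alpha and xmin were estimated from
# the data; a bootstrap (as in Clauset-style analyses) gives a calibrated test.
print("alpha = %.2f, KS statistic = %.3f, p = %.3g" % (alpha, ks.statistic, ks.pvalue))
```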
Statistical ecology comes of age.
Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-12-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.
Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J
2017-01-01
Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.
Rigorous Science: a How-To Guide.
Casadevall, Arturo; Fang, Ferric C
2016-11-08
Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.
NASA Astrophysics Data System (ADS)
Gillam, Thomas P. S.; Lester, Christopher G.
2014-11-01
We consider current and alternative approaches to setting limits on new physics signals having backgrounds from misidentified objects; for example jets misidentified as leptons, b-jets or photons. Many ATLAS and CMS analyses have used a heuristic "matrix method" for estimating the background contribution from such sources. We demonstrate that the matrix method suffers from statistical shortcomings that can adversely affect its ability to set robust limits. A rigorous alternative method is discussed, and is seen to produce fake rate estimates and limits with better qualities, but is found to be too costly to use. Having investigated the nature of the approximations used to derive the matrix method, we propose a third strategy that is seen to marry the speed of the matrix method to the performance and physicality of the more rigorous approach.
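To make the heuristic being criticised concrete, the following is a minimal single-lepton sketch of the standard loose/tight counting relations behind a matrix method estimate. The efficiencies and event counts are invented, and the relations shown are the textbook form, not necessarily the exact implementation used in any given ATLAS or CMS analysis.

```python
# Single-lepton "matrix method" sketch: unfold loose/tight counts into real and
# fake components using real- and fake-lepton efficiencies. Numbers are invented.
import numpy as np

eff_real = 0.90   # probability a real lepton passing "loose" also passes "tight"
eff_fake = 0.15   # same probability for a misidentified (fake) lepton

N_loose, N_tight = 10000, 7200   # observed counts

# Counting relations:
#   N_loose = N_real + N_fake
#   N_tight = eff_real * N_real + eff_fake * N_fake
A = np.array([[1.0, 1.0],
              [eff_real, eff_fake]])
N_real, N_fake = np.linalg.solve(A, [N_loose, N_tight])

fake_in_tight = eff_fake * N_fake
print("estimated fakes in the tight sample: %.0f" % fake_in_tight)
# The point estimate can go negative or acquire a large variance when
# eff_real - eff_fake is small: the statistical shortcoming the paper addresses.
```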
Time Series Expression Analyses Using RNA-seq: A Statistical Approach
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
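As a small illustration of one of the approaches listed, the sketch below fits an AR(1) time-lagged regression to a simulated log-expression time course for a single gene. It shows the idea of modelling dependence across time points; it is not the authors' implementation, and the simulated dynamics are an assumption.

```python
# Minimal AR(1) time-lagged regression on a simulated log-expression time course.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
T = 30
log_expr = np.empty(T)
log_expr[0] = 5.0
for t in range(1, T):                          # genuine AR(1) dynamics
    log_expr[t] = 2.0 + 0.6 * log_expr[t - 1] + rng.normal(0, 0.2)

# Regress expression at time t on expression at time t-1.
y, x = log_expr[1:], sm.add_constant(log_expr[:-1])
fit = sm.OLS(y, x).fit()
print("estimated lag-1 coefficient: %.2f" % fit.params[1])   # roughly 0.6
```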
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual Strategy and Social Structure: An Evolutionary Theory of Institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al., Proceedings of the 44th Annual ACM Symposium on Theory of Computing (STOC 2012), 2012; Barmpalias et al., 55th Annual IEEE Symposium on Foundations of Computer Science, Philadelphia, 2014; J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (STOC 2012).
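For readers unfamiliar with the model class, the sketch below simulates one simple 2-dimensional, unperturbed Schelling variant: two agent types on a torus, where pairs of discontented agents of opposite types swap locations and no noise term is added. The grid size, tolerance, and swap rule are illustrative assumptions and do not reproduce the specific model analysed in the paper.

```python
# Minimal 2-dimensional unperturbed Schelling sketch (swap variant, no noise).
import numpy as np

rng = np.random.default_rng(6)
L, tau, steps = 50, 0.5, 100000
grid = rng.integers(0, 2, size=(L, L))          # two agent types, 0 and 1, on a torus

def like_fraction(g, i, j):
    """Fraction of the 8 neighbours (periodic boundaries) sharing (i, j)'s type."""
    t = g[i, j]
    neigh = [g[(i + di) % L, (j + dj) % L]
             for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    return sum(n == t for n in neigh) / 8.0

for _ in range(steps):
    i1, j1, i2, j2 = rng.integers(0, L, size=4)
    # Swap two randomly chosen discontented agents of opposite types.
    if (grid[i1, j1] != grid[i2, j2]
            and like_fraction(grid, i1, j1) < tau
            and like_fraction(grid, i2, j2) < tau):
        grid[i1, j1], grid[i2, j2] = grid[i2, j2], grid[i1, j1]

segregation = np.mean([like_fraction(grid, i, j) for i in range(L) for j in range(L)])
print("mean like-neighbour fraction: %.2f" % segregation)   # well above 0.5 => segregation
```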
Systematic review of the quality of prognosis studies in systemic lupus erythematosus.
Lim, Lily S H; Lee, Senq J; Feldman, Brian M; Gladman, Dafna D; Pullenayegum, Eleanor; Uleryk, Elizabeth; Silverman, Earl D
2014-10-01
Prognosis studies examine outcomes and/or seek to identify predictors or factors associated with outcomes. Many prognostic factors have been identified in systemic lupus erythematosus (SLE), but few have been consistently found across studies. We hypothesized that this is due to a lack of rigor of study designs. This study aimed to systematically assess the methodologic quality of prognosis studies in SLE. A search of prognosis studies in SLE was performed using MEDLINE and Embase, from January 1990 to June 2011. A representative sample of 150 articles was selected using a random number generator and assessed by 2 reviewers. Each study was assessed by a risk of bias tool according to 6 domains: study participation, study attrition, measurement of prognostic factors, measurement of outcomes, measurement/adjustment for confounders, and appropriateness of statistical analysis. Information about missing data was also collected. A cohort design was used in 71% of studies. High risk of bias was found in 65% of studies for confounders, 57% for study participation, 56% for attrition, 36% for statistical analyses, 20% for prognostic factors, and 18% for outcome. Missing covariate or outcome information was present in half of the studies. Only 6 studies discussed reasons for missing data and 2 imputed missing data. Lack of rigorous study design, especially in addressing confounding, study participation and attrition, and inadequately handled missing data, has limited the quality of prognosis studies in SLE. Future prognosis studies should be designed with consideration of these factors to improve methodologic rigor. Copyright © 2014 by the American College of Rheumatology.
Krawczyk, Christopher; Gradziel, Pat; Geraghty, Estella M.
2014-01-01
Objectives. We used a geographic information system and cluster analyses to determine locations in need of enhanced Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) Program services. Methods. We linked documented births in the 2010 California Birth Statistical Master File with the 2010 data from the WIC Integrated Statewide Information System. Analyses focused on the density of pregnant women who were eligible for but not receiving WIC services in California’s 7049 census tracts. We used incremental spatial autocorrelation and hot spot analyses to identify clusters of WIC-eligible nonparticipants. Results. We detected clusters of census tracts with higher-than-expected densities, compared with the state mean density of WIC-eligible nonparticipants, in 21 of 58 (36.2%) California counties (P < .05). In subsequent county-level analyses, we located neighborhood-level clusters of higher-than-expected densities of eligible nonparticipants in Sacramento, San Francisco, Fresno, and Los Angeles Counties (P < .05). Conclusions. Hot spot analyses provided a rigorous and objective approach to determine the locations of statistically significant clusters of WIC-eligible nonparticipants. Results helped inform WIC program and funding decisions, including the opening of new WIC centers, and offered a novel approach for targeting public health services. PMID:24354821
Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS
NASA Technical Reports Server (NTRS)
Long, S. M.; Grosfils, E. B.
2005-01-01
Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
A statistical anomaly indicates symbiotic origins of eukaryotic membranes
Bansal, Suneyna; Mittal, Aditya
2015-01-01
Compositional analyses of nucleic acids and proteins have shed light on possible origins of living cells. In this work, rigorous compositional analyses of ∼5000 plasma membrane lipid constituents of 273 species in the three life domains (archaea, eubacteria, and eukaryotes) revealed a remarkable statistical paradox, indicating symbiotic origins of eukaryotic cells involving eubacteria. For lipids common to plasma membranes of the three domains, the number of carbon atoms in eubacteria was found to be similar to that in eukaryotes. However, mutually exclusive subsets of same data show exactly the opposite—the number of carbon atoms in lipids of eukaryotes was higher than in eubacteria. This statistical paradox, called Simpson's paradox, was absent for lipids in archaea and for lipids not common to plasma membranes of the three domains. This indicates the presence of interaction(s) and/or association(s) in lipids forming plasma membranes of eubacteria and eukaryotes but not for those in archaea. Further inspection of membrane lipid structures affecting physicochemical properties of plasma membranes provides the first evidence (to our knowledge) on the symbiotic origins of eukaryotic cells based on the “third front” (i.e., lipids) in addition to the growing compositional data from nucleic acids and proteins. PMID:25631820
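The following is a small numerical illustration of Simpson's paradox in the same spirit as the lipid comparison above. The numbers are invented, not the paper's data; they simply show how a within-subset ordering can reverse when the subsets are pooled.

```python
# Simpson's paradox with invented carbon counts for two mutually exclusive
# lipid subsets; not the paper's data.
import numpy as np

# Subset A (shorter-chain lipids): dominated by eukaryotic lipids.
# Subset B (longer-chain lipids):  dominated by eubacterial lipids.
euba_A, euka_A = np.repeat(16, 10), np.repeat(17, 90)
euba_B, euka_B = np.repeat(21, 90), np.repeat(22, 10)

# Within each subset, eukaryotic lipids carry MORE carbon atoms.
print("subset A:", euba_A.mean(), "<", euka_A.mean())
print("subset B:", euba_B.mean(), "<", euka_B.mean())

# Pooled over both subsets the comparison flips, because the two domains are
# unevenly represented across the subsets: the hallmark of Simpson's paradox.
print("pooled  :", np.concatenate([euba_A, euba_B]).mean(),   # 20.5
      ">", np.concatenate([euka_A, euka_B]).mean())           # 17.5
```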
Methodological rigor and citation frequency in patient compliance literature.
Bruer, J T
1982-01-01
An exhaustive bibliography which assesses the methodological rigor of the patient compliance literature, and citation data from the Science Citation Index (SCI) are combined to determine if methodologically rigorous papers are used with greater frequency than substandard articles by compliance investigators. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant use of citation measures as indicators of rigor on a paper-by-paper basis. The data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies that employ other experimental designs. PMID:7114334
Peer Review of EPA's Draft BMDS Document: Exponential ...
BMDS is one of the Agency's premier tools for risk assessment; the validity and reliability of its statistical models are therefore of paramount importance. This page provides links to peer reviews of the BMDS applications and their models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.
Quality control and conduct of genome-wide association meta-analyses.
Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Mägi, Reedik; Ferreira, Teresa; Fall, Tove; Graff, Mariaelisa; Justice, Anne E; Luan, Jian'an; Gustafsson, Stefan; Randall, Joshua C; Vedantam, Sailaja; Workalemahu, Tsegaselassie; Kilpeläinen, Tuomas O; Scherag, André; Esko, Tonu; Kutalik, Zoltán; Heid, Iris M; Loos, Ruth J F
2014-05-01
Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
Stopka, Thomas J; Goulart, Michael A; Meyers, David J; Hutcheson, Marga; Barton, Kerri; Onofrey, Shauna; Church, Daniel; Donahue, Ashley; Chui, Kenneth K H
2017-04-20
Hepatitis C virus (HCV) infections have increased during the past decade but little is known about geographic clustering patterns. We used a unique analytical approach, combining geographic information systems (GIS), spatial epidemiology, and statistical modeling to identify and characterize HCV hotspots, statistically significant clusters of census tracts with elevated HCV counts and rates. We compiled sociodemographic and HCV surveillance data (n = 99,780 cases) for Massachusetts census tracts (n = 1464) from 2002 to 2013. We used a five-step spatial epidemiological approach, calculating incremental spatial autocorrelations and Getis-Ord Gi* statistics to identify clusters. We conducted logistic regression analyses to determine factors associated with the HCV hotspots. We identified nine HCV clusters, with the largest in Boston, New Bedford/Fall River, Worcester, and Springfield (p < 0.05). In multivariable analyses, we found that HCV hotspots were independently and positively associated with the percent of the population that was Hispanic (adjusted odds ratio [AOR]: 1.07; 95% confidence interval [CI]: 1.04, 1.09) and the percent of households receiving food stamps (AOR: 1.83; 95% CI: 1.22, 2.74). HCV hotspots were independently and negatively associated with the percent of the population that were high school graduates or higher (AOR: 0.91; 95% CI: 0.89, 0.93) and the percent of the population in the "other" race/ethnicity category (AOR: 0.88; 95% CI: 0.85, 0.91). We identified locations where HCV clusters were a concern, and where enhanced HCV prevention, treatment, and care can help combat the HCV epidemic in Massachusetts. GIS, spatial epidemiological and statistical analyses provided a rigorous approach to identify hotspot clusters of disease, which can inform public health policy and intervention targeting. Further studies that incorporate spatiotemporal cluster analyses, Bayesian spatial and geostatistical models, spatially weighted regression analyses, and assessment of associations between HCV clustering and the built environment are needed to expand upon our combined spatial epidemiological and statistical methods.
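The core of the hotspot step described above is the Getis-Ord Gi* statistic. The sketch below computes it for a single tract using the standard textbook form, invented case counts, and a binary spatial-weights row (the tract plus its neighbours); a full analysis computes this z-score for every tract and adjusts for multiple testing.

```python
# Getis-Ord Gi* hotspot statistic for one census tract (invented counts/weights).
import numpy as np

x = np.array([12, 8, 30, 45, 38, 9, 4, 22, 6, 11], dtype=float)  # cases per tract
n = x.size

# Binary weights for tract i = 3: itself plus its (invented) neighbours 2 and 4.
w = np.zeros(n)
w[[2, 3, 4]] = 1.0

x_bar = x.mean()
s = np.sqrt((x ** 2).mean() - x_bar ** 2)

numerator = w @ x - x_bar * w.sum()
denominator = s * np.sqrt((n * (w ** 2).sum() - w.sum() ** 2) / (n - 1))
gi_star = numerator / denominator
print("Gi* z-score for tract 3: %.2f" % gi_star)   # large positive => hotspot
```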
Chang, Chih-Cheng; Su, Jian-An; Tsai, Ching-Shu; Yen, Cheng-Fang; Liu, Jiun-Horng; Lin, Chung-Ying
2015-06-01
To examine the psychometrics of the Affiliate Stigma Scale using rigorous psychometric analysis: classical test theory (CTT) (traditional) and Rasch analysis (modern). Differential item functioning (DIF) items were also tested using Rasch analysis. Caregivers of relatives with mental illness (n = 453; mean age: 53.29 ± 13.50 years) were recruited from southern Taiwan. Each participant filled out four questionnaires: Affiliate Stigma Scale, Rosenberg Self-Esteem Scale, Beck Anxiety Inventory, and one background information sheet. CTT analyses showed that the Affiliate Stigma Scale had satisfactory internal consistency (α = 0.85-0.94) and concurrent validity (Rosenberg Self-Esteem Scale: r = -0.52 to -0.46; Beck Anxiety Inventory: r = 0.27-0.34). Rasch analyses supported the unidimensionality of three domains in the Affiliate Stigma Scale and indicated four DIF items (affect domain: 1; cognitive domain: 3) across gender. Our findings, based on rigorous statistical analysis, verified the psychometrics of the Affiliate Stigma Scale and reported its DIF items. We conclude that the three domains of the Affiliate Stigma Scale can be separately used and are suitable for measuring the affiliate stigma of caregivers of relatives with mental illness. Copyright © 2015 Elsevier Inc. All rights reserved.
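The classical-test-theory side of such an analysis starts with an internal consistency check. The sketch below computes Cronbach's alpha on simulated 4-point item responses; it is not the study's data, and the Rasch and DIF analyses described above require dedicated IRT software.

```python
# Cronbach's alpha on simulated 4-point Likert-type item responses.
import numpy as np

rng = np.random.default_rng(7)
n_persons, n_items = 300, 8
latent = rng.normal(0, 1, n_persons)
# Items load on one latent trait plus noise, then are cut into a 1-4 scale.
raw = latent[:, None] + rng.normal(0, 0.8, (n_persons, n_items))
items = np.digitize(raw, [-1, 0, 1]) + 1        # categories 1..4

def cronbach_alpha(data):
    k = data.shape[1]
    item_vars = data.var(axis=0, ddof=1)
    total_var = data.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

print("Cronbach's alpha: %.2f" % cronbach_alpha(items))   # around 0.9 for these items
```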
Statistical issues in the design, conduct and analysis of two large safety studies.
Gaffney, Michael
2016-10-01
The emergence, post approval, of serious medical events, which may be associated with the use of a particular drug or class of drugs, is an important public health and regulatory issue. The best method to address this issue is through a large, rigorously designed safety study. Therefore, it is important to elucidate the statistical issues involved in these large safety studies. Two such studies are PRECISION and EAGLES. PRECISION is the primary focus of this article. PRECISION is a non-inferiority design with a clinically relevant non-inferiority margin. Statistical issues in the design, conduct and analysis of PRECISION are discussed. Quantitative and clinical aspects of the selection of the composite primary endpoint, the determination and role of the non-inferiority margin in a large safety study and the intent-to-treat and modified intent-to-treat analyses in a non-inferiority safety study are shown. Protocol changes that were necessary during the conduct of PRECISION are discussed from a statistical perspective. Issues regarding the complex analysis and interpretation of the results of PRECISION are outlined. EAGLES is presented as a large, rigorously designed safety study when a non-inferiority margin was not able to be determined by a strong clinical/scientific method. In general, when a non-inferiority margin is not able to be determined, the width of the 95% confidence interval is a way to size the study and to assess the cost-benefit of relative trial size. A non-inferiority margin, when able to be determined by a strong scientific method, should be included in a large safety study. Although these studies could not be called "pragmatic," they are examples of best real-world designs to address safety and regulatory concerns. © The Author(s) 2016.
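A small worked example of the non-inferiority logic discussed above follows, with generic numbers rather than PRECISION's actual endpoint or margin: non-inferiority is concluded only if the upper confidence limit of the relative risk stays below the prespecified margin.

```python
# Generic non-inferiority check on a risk ratio (all numbers invented).
import numpy as np
from scipy import stats

margin = 1.33                       # prespecified non-inferiority margin (illustrative)
events_t, n_t = 180, 8000           # events / patients on the test drug
events_c, n_c = 170, 8000           # events / patients on the comparator

rr = (events_t / n_t) / (events_c / n_c)
se_log_rr = np.sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
z = stats.norm.ppf(0.975)
ci = np.exp(np.log(rr) + np.array([-z, z]) * se_log_rr)

print("risk ratio %.2f, 95%% CI (%.2f, %.2f)" % (rr, ci[0], ci[1]))
print("non-inferior at margin %.2f:" % margin, ci[1] < margin)
```

When no defensible margin exists, as the article notes for EAGLES, the same confidence-interval width can instead be used directly to size the study.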
Grey literature in meta-analyses.
Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J
2003-01-01
In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.
Practice-based evidence study design for comparative effectiveness research.
Horn, Susan D; Gassaway, Julie
2007-10-01
To describe a new, rigorous, comprehensive practice-based evidence for clinical practice improvement (PBE-CPI) study methodology, and compare its features, advantages, and disadvantages to those of randomized controlled trials and sophisticated statistical methods for comparative effectiveness research. PBE-CPI incorporates natural variation within data from routine clinical practice to determine what works, for whom, when, and at what cost. It uses the knowledge of front-line caregivers, who develop study questions and define variables as part of a transdisciplinary team. Its comprehensive measurement framework provides a basis for analyses of significant bivariate and multivariate associations between treatments and outcomes, controlling for patient differences, such as severity of illness. PBE-CPI studies can uncover better practices more quickly than randomized controlled trials or sophisticated statistical methods, while achieving many of the same advantages. We present examples of actionable findings from PBE-CPI studies in postacute care settings related to comparative effectiveness of medications, nutritional support approaches, incontinence products, physical therapy activities, and other services. Outcomes improved when practices associated with better outcomes in PBE-CPI analyses were adopted in practice.
Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).
Suzutani, T; Ishibashi, H; Takatori, T
1978-11-01
The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.
Mass spectrometry-based protein identification with accurate statistical significance assignment.
Alves, Gelio; Yu, Yi-Kuo
2015-03-01
Assigning statistical significance accurately has become increasingly important as metadata of many types, often assembled in hierarchies, are constructed and combined for further biological analyses. Statistical inaccuracy of metadata at any level may propagate to downstream analyses, undermining the validity of scientific conclusions thus drawn. From the perspective of mass spectrometry-based proteomics, even though accurate statistics for peptide identification can now be achieved, accurate protein level statistics remain challenging. We have constructed a protein ID method that combines peptide evidences of a candidate protein based on a rigorous formula derived earlier; in this formula the database P-value of every peptide is weighted, prior to the final combination, according to the number of proteins it maps to. We have also shown that this protein ID method provides accurate protein level E-value, eliminating the need of using empirical post-processing methods for type-I error control. Using a known protein mixture, we find that this protein ID method, when combined with the Sorić formula, yields accurate values for the proportion of false discoveries. In terms of retrieval efficacy, the results from our method are comparable with other methods tested. The source code, implemented in C++ on a linux system, is available for download at ftp://ftp.ncbi.nlm.nih.gov/pub/qmbp/qmbp_ms/RAId/RAId_Linux_64Bit. Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data
Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil
2014-01-01
Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202
GEOquery: a bridge between the Gene Expression Omnibus (GEO) and BioConductor.
Davis, Sean; Meltzer, Paul S
2007-07-15
Microarray technology has become a standard molecular biology tool. Experimental data have been generated on a huge number of organisms, tissue types, treatment conditions and disease states. The Gene Expression Omnibus (Barrett et al., 2005), developed by the National Center for Bioinformatics (NCBI) at the National Institutes of Health is a repository of nearly 140,000 gene expression experiments. The BioConductor project (Gentleman et al., 2004) is an open-source and open-development software project built in the R statistical programming environment (R Development core Team, 2005) for the analysis and comprehension of genomic data. The tools contained in the BioConductor project represent many state-of-the-art methods for the analysis of microarray and genomics data. We have developed a software tool that allows access to the wealth of information within GEO directly from BioConductor, eliminating many the formatting and parsing problems that have made such analyses labor-intensive in the past. The software, called GEOquery, effectively establishes a bridge between GEO and BioConductor. Easy access to GEO data from BioConductor will likely lead to new analyses of GEO data using novel and rigorous statistical and bioinformatic tools. Facilitating analyses and meta-analyses of microarray data will increase the efficiency with which biologically important conclusions can be drawn from published genomic data. GEOquery is available as part of the BioConductor project.
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
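The sketch below implements two commonly cited approximations of the kinds reviewed above: a range-based estimate of a missing SD and quartile-based estimates of a missing mean and SD. The specific constants follow widely used formulas (range/4 and quartile-based estimators) and may differ from the exact variants evaluated in the review.

```python
# Common approximations for missing summary statistics in meta-analysis.
def sd_from_range(minimum, maximum):
    # Practical approximation: roughly 95% of a normal sample spans about 4 SDs.
    return (maximum - minimum) / 4.0

def mean_from_quartiles(q1, median, q3):
    # Quartile-based estimator of the mean (reasonable for roughly symmetric data).
    return (q1 + median + q3) / 3.0

def sd_from_quartiles(q1, q3):
    # The interquartile range of a normal distribution is about 1.35 SDs.
    return (q3 - q1) / 1.35

# Example trial reporting only median (IQR) and range for length of stay (days):
q1, median, q3 = 5.0, 8.0, 14.0
minimum, maximum = 2.0, 30.0
print("estimated mean:       ", mean_from_quartiles(q1, median, q3))   # 9.0
print("estimated SD (range): ", sd_from_range(minimum, maximum))       # 7.0
print("estimated SD (IQR):   ", sd_from_quartiles(q1, q3))             # ~6.7
```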
Graphical Descriptives: A Way to Improve Data Transparency and Methodological Rigor in Psychology.
Tay, Louis; Parrigon, Scott; Huang, Qiming; LeBreton, James M
2016-09-01
Several calls have recently been issued to the social sciences for enhanced transparency of research processes and enhanced rigor in the methodological treatment of data and data analytics. We propose the use of graphical descriptives (GDs) as one mechanism for responding to both of these calls. GDs provide a way to visually examine data. They serve as quick and efficient tools for checking data distributions, variable relations, and the potential appropriateness of different statistical analyses (e.g., do data meet the minimum assumptions for a particular analytic method). Consequently, we believe that GDs can promote increased transparency in the journal review process, encourage best practices for data analysis, and promote a more inductive approach to understanding psychological data. We illustrate the value of potentially including GDs as a step in the peer-review process and provide a user-friendly online resource (www.graphicaldescriptives.org) for researchers interested in including data visualizations in their research. We conclude with suggestions on how GDs can be expanded and developed to enhance transparency. © The Author(s) 2016.
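The sketch below shows the kind of quick visual check the authors advocate, a distribution plot and a bivariate relation plot produced before any formal modelling. The data are simulated for illustration and the sketch is not affiliated with the graphicaldescriptives.org tool itself.

```python
# Minimal graphical-descriptives-style check: distribution and relation plots.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(9)
x = rng.normal(50, 10, 300)
y = 0.5 * x + 0.02 * (x - 50) ** 2 + rng.normal(0, 8, 300)   # mildly non-linear relation

fig, axes = plt.subplots(1, 2, figsize=(8, 3))
axes[0].hist(x, bins=30)                  # distribution check: skew, outliers
axes[0].set_title("Distribution of x")
axes[1].scatter(x, y, s=10, alpha=0.6)    # relation check: is linearity plausible?
axes[1].set_title("y vs x")
plt.tight_layout()
plt.show()
```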
Applying Sociocultural Theory to Teaching Statistics for Doctoral Social Work Students
ERIC Educational Resources Information Center
Mogro-Wilson, Cristina; Reeves, Michael G.; Charter, Mollie Lazar
2015-01-01
This article describes the development of two doctoral-level multivariate statistics courses utilizing sociocultural theory, an integrative pedagogical framework. In the first course, the implementation of sociocultural theory helps to support the students through a rigorous introduction to statistics. The second course involves students…
Authenticated DNA from Ancient Wood Remains
LIEPELT, SASCHA; SPERISEN, CHRISTOPH; DEGUILLOUX, MARIE-FRANCE; PETIT, REMY J.; KISSLING, ROY; SPENCER, MATTHEW; DE BEAULIEU, JACQUES-LOUIS; TABERLET, PIERRE; GIELLY, LUDOVIC; ZIEGENHAGEN, BIRGIT
2006-01-01
• Background: The reconstruction of biological processes and human activities during the last glacial cycle relies mainly on data from biological remains. Highly abundant tissues, such as wood, are candidates for a genetic analysis of past populations. While well-authenticated DNA has now been recovered from various fossil remains, the final 'proof' is still missing for wood, despite some promising studies.
• Scope: The goal of this study was to determine if ancient wood can be analysed routinely in studies of archaeology and palaeogenetics. An experiment was designed which included blind testing, independent replicates, extensive contamination controls and rigorous statistical tests. Ten samples of ancient wood from major European forest tree genera were analysed with plastid DNA markers.
• Conclusions: Authentic DNA was retrieved from wood samples up to 1000 years of age. A new tool for real-time vegetation history and archaeology is ready to use. PMID:16987920
diffHic: a Bioconductor package to detect differential genomic interactions in Hi-C data.
Lun, Aaron T L; Smyth, Gordon K
2015-08-19
Chromatin conformation capture with high-throughput sequencing (Hi-C) is a technique that measures the in vivo intensity of interactions between all pairs of loci in the genome. Most conventional analyses of Hi-C data focus on the detection of statistically significant interactions. However, an alternative strategy involves identifying significant changes in the interaction intensity (i.e., differential interactions) between two or more biological conditions. This is more statistically rigorous and may provide more biologically relevant results. Here, we present the diffHic software package for the detection of differential interactions from Hi-C data. diffHic provides methods for read pair alignment and processing, counting into bin pairs, filtering out low-abundance events and normalization of trended or CNV-driven biases. It uses the statistical framework of the edgeR package to model biological variability and to test for significant differences between conditions. Several options for the visualization of results are also included. The use of diffHic is demonstrated with real Hi-C data sets. Performance against existing methods is also evaluated with simulated data. On real data, diffHic is able to successfully detect interactions with significant differences in intensity between biological conditions. It also compares favourably to existing software tools on simulated data sets. These results suggest that diffHic is a viable approach for differential analyses of Hi-C data.
Statistical Methods Applied to Gamma-ray Spectroscopy Algorithms in Nuclear Security Missions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fagan, Deborah K.; Robinson, Sean M.; Runkle, Robert C.
2012-10-01
In a wide range of nuclear security missions, gamma-ray spectroscopy is a critical research and development priority. One particularly relevant challenge is the interdiction of special nuclear material, for which gamma-ray spectroscopy supports the goals of detecting and identifying gamma-ray sources. This manuscript examines the existing set of spectroscopy methods, attempts to categorize them by the statistical methods on which they rely, and identifies methods that have yet to be considered. Our examination shows that current methods effectively estimate the effect of counting uncertainty but in many cases do not address larger sources of decision uncertainty, ones that are significantly more complex. We thus explore the premise that significantly improving algorithm performance requires greater coupling between the problem physics that drives data acquisition and the statistical methods that analyze such data. Untapped statistical methods, such as Bayesian model averaging and hierarchical and empirical Bayes methods, have the potential to reduce decision uncertainty by more rigorously and comprehensively incorporating all sources of uncertainty. We expect that application of such methods will demonstrate progress in meeting the needs of nuclear security missions by improving on the existing numerical infrastructure for which these analyses have not been conducted.
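As a rough sketch of how Bayesian model averaging can be applied once candidate models have been fit, the following fragment computes BIC-based model weights; the likelihoods, parameter counts and observation count are hypothetical, and BIC weighting is only one standard large-sample approximation to full BMA.

```python
import numpy as np

def bic_model_weights(log_likelihoods, n_params, n_obs):
    """Approximate posterior model probabilities from BIC values,
    assuming equal prior model probabilities."""
    log_likelihoods = np.asarray(log_likelihoods, dtype=float)
    n_params = np.asarray(n_params, dtype=float)
    bic = n_params * np.log(n_obs) - 2.0 * log_likelihoods
    delta = bic - bic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical: three candidate source models fit to the same spectrum.
weights = bic_model_weights(log_likelihoods=[-1204.2, -1198.7, -1197.9],
                            n_params=[3, 5, 8], n_obs=1024)
print(weights)  # model-averaged inference would weight predictions by these
```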
NASA Astrophysics Data System (ADS)
Romenskyy, Maksym; Herbert-Read, James E.; Ward, Ashley J. W.; Sumpter, David J. T.
2017-04-01
While a rich variety of self-propelled particle models propose to explain the collective motion of fish and other animals, rigorous statistical comparison between models and data remains a challenge. Plausible models should be flexible enough to capture changes in the collective behaviour of animal groups at their different developmental stages and group sizes. Here, we analyse the statistical properties of schooling fish (Pseudomugil signifer) through a combination of experiments and simulations. We make novel use of a Boltzmann inversion method, usually applied in molecular dynamics, to identify the effective potential of the mean force of fish interactions. Specifically, we show that larger fish have a larger repulsion zone, but stronger attraction, resulting in greater alignment in their collective motion. We model the collective dynamics of schools using a self-propelled particle model, modified to include varying particle speed and a local repulsion rule. We demonstrate that the statistical properties of the fish schools are reproduced by our model, thereby capturing a number of features of the behaviour and development of schooling fish.
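A minimal sketch of Boltzmann inversion on a pair distribution function, assuming a precomputed g(r); the values below are synthetic and are not the fish data.

```python
import numpy as np

def boltzmann_inversion(r, g_r, kT=1.0):
    """Estimate an effective pairwise potential of mean force from a
    pair (radial) distribution function via U(r) = -kT * ln g(r)."""
    g_r = np.asarray(g_r, dtype=float)
    with np.errstate(divide="ignore"):
        U = -kT * np.log(g_r)
    return np.where(g_r > 0, U, np.inf)  # infinite repulsion where g(r) = 0

# Hypothetical distances (body lengths) and a toy g(r) with a preferred spacing.
r = np.linspace(0.1, 5.0, 50)
g_r = 1.0 + 0.8 * np.exp(-(r - 1.5) ** 2)
U = boltzmann_inversion(r, g_r)
print(U[:5])   # an attractive well appears around the preferred spacing
```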
Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows
NASA Astrophysics Data System (ADS)
Qi, Di; Majda, Andrew J.
2018-04-01
Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.
Spatial scaling and multi-model inference in landscape genetics: Martes americana in northern Idaho
Tzeidle N. Wasserman; Samuel A. Cushman; Michael K. Schwartz; David O. Wallin
2010-01-01
Individual-based analyses relating landscape structure to genetic distances across complex landscapes enable rigorous evaluation of multiple alternative hypotheses linking landscape structure to gene flow. We utilize two extensions to increase the rigor of the individual-based causal modeling approach to inferring relationships between landscape patterns and gene flow...
Kline, Joshua C.
2014-01-01
Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R
2017-07-05
Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell.
Hong, Bonnie; Du, Yingzhou; Mukerji, Pushkor; Roper, Jason M; Appenzeller, Laura M
2017-07-12
Regulatory-compliant rodent subchronic feeding studies are compulsory regardless of a hypothesis to test, according to recent EU legislation for the safety assessment of whole food/feed produced from genetically modified (GM) crops containing a single genetic transformation event (European Union Commission Implementing Regulation No. 503/2013). The Implementing Regulation refers to guidelines set forth by the European Food Safety Authority (EFSA) for the design, conduct, and analysis of rodent subchronic feeding studies. The set of EFSA recommendations was rigorously applied to a 90-day feeding study in Sprague-Dawley rats. After study completion, the appropriateness and applicability of these recommendations were assessed using a battery of statistical analysis approaches including both retrospective and prospective statistical power analyses as well as variance-covariance decomposition. In the interest of animal welfare considerations, alternative experimental designs were investigated and evaluated in the context of informing the health risk assessment of food/feed from GM crops.
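For illustration, a hedged sketch of the kind of prospective and retrospective power calculations mentioned above, using statsmodels for a simple two-group comparison; the effect size, alpha and group sizes are assumptions, not values from the study.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Prospective: sample size per group needed to detect a standardized
# effect of 0.5 with 80% power at alpha = 0.05 (values hypothetical).
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(round(n_per_group))

# Retrospective-style check: power achieved with 16 animals per group
# (a hypothetical subchronic design size) for the same effect.
power = analysis.power(effect_size=0.5, nobs1=16, alpha=0.05, ratio=1.0)
print(power)
```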
Handwriting Examination: Moving from Art to Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, K.H.; Hanlen, R.C.; Manzolillo, P.A.
In this document, we present a method for validating the premises and methodology of forensic handwriting examination. This method is intuitively appealing because it relies on quantitative measurements currently used qualitatively by forensic document examiners (FDEs) in making comparisons, and it is scientifically rigorous because it exploits the power of multivariate statistical analysis. This approach uses measures of both central tendency and variation to construct a profile for a given individual. (Central tendency and variation are important for characterizing an individual's writing, and both are currently used by FDEs in comparative analyses.) Once constructed, different profiles are then compared for individuality using cluster analysis; they are grouped so that profiles within a group cannot be differentiated from one another based on the measured characteristics, whereas profiles between groups can. The cluster analysis procedure used here exploits the power of multivariate hypothesis testing. The result is not only a profile grouping but also an indication of the statistical significance of the groups generated.
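A generic sketch of grouping writer profiles by hierarchical cluster analysis is shown below; the features and writers are simulated, and this simple clustering stands in for, but is not, the multivariate hypothesis-testing procedure described above.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical profiles: each row is a sample from a writer, columns are
# means and variabilities of measured characteristics (slant, height, ...).
rng = np.random.default_rng(7)
writer_a = rng.normal([10, 2.0, 5, 0.5], 0.2, size=(4, 4))   # repeated samples
writer_b = rng.normal([12, 1.5, 6, 0.8], 0.2, size=(4, 4))
profiles = np.vstack([writer_a, writer_b])

# Group profiles so that members of a group are not distinguishable on the
# measured characteristics; distinct groups suggest different writers.
Z = linkage(pdist(profiles, metric="euclidean"), method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)   # expected: samples from the same writer share a label
```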
Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.
Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P
2018-03-03
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kahveci, Ajda; Kahveci, Murat; Mansour, Nasser; Alarfaj, Maher Mohammed
2017-06-01
Teachers play a key role in moving reform-based science education practices into the classroom. Based on research that emphasizes the importance of teachers' affective states, this study aimed to explore the constructs pedagogical discontentment, science teaching self-efficacy, intentions to reform, and their correlations. It also aimed to provide empirical evidence in light of a previously proposed theoretical model while focusing on an entirely new context in the Middle East. Data were collected in Saudi Arabia from a total of 994 randomly selected science teachers, 656 of whom were female and 338 male. To collect the data, the Arabic versions of the Science Teachers' Pedagogical Discontentment scale, the Science Teaching Efficacy Beliefs Instrument and the Intentions to Reform Science Teaching scale were developed. To assure the validity of the instruments in a non-Western context, rigorous cross-cultural validation procedures were followed. Factor analyses were conducted for construct validation, and descriptive statistical analyses were performed, including frequency distributions and normality checks. Univariate analyses of variance were run to explore statistically significant differences between groups of teachers. Cross-tabulation and correlation analyses were conducted to explore relationships. The findings suggest an effect of teacher characteristics, such as age and professional development program attendance, on these affective states. The results demonstrate that teachers who attended a relatively higher number of programs had lower levels of intentions to reform, raising issues regarding the conduct and outcomes of professional development. Some of the findings concerning interrelationships among the three constructs challenge and serve to expand the previously proposed theoretical model.
Implementation errors in the GingerALE Software: Description and recommendations.
Eickhoff, Simon B; Laird, Angela R; Fox, P Mickle; Lancaster, Jack L; Fox, Peter T
2017-01-01
Neuroscience imaging is a burgeoning, highly sophisticated field the growth of which has been fostered by grant-funded, freely distributed software libraries that perform voxel-wise analyses in anatomically standardized three-dimensional space on multi-subject, whole-brain, primary datasets. Despite the ongoing advances made using these non-commercial computational tools, the replicability of individual studies is an acknowledged limitation. Coordinate-based meta-analysis offers a practical solution to this limitation and, consequently, plays an important role in filtering and consolidating the enormous corpus of functional and structural neuroimaging results reported in the peer-reviewed literature. In both primary data and meta-analytic neuroimaging analyses, correction for multiple comparisons is a complex but critical step for ensuring statistical rigor. Reports of errors in multiple-comparison corrections in primary-data analyses have recently appeared. Here, we report two such errors in GingerALE, a widely used, US National Institutes of Health (NIH)-funded, freely distributed software package for coordinate-based meta-analysis. These errors have given rise to published reports with more liberal statistical inferences than were specified by the authors. The intent of this technical report is threefold. First, we inform authors who used GingerALE of these errors so that they can take appropriate actions including re-analyses and corrective publications. Second, we seek to exemplify and promote an open approach to error management. Third, we discuss the implications of these and similar errors in a scientific environment dependent on third-party software. Hum Brain Mapp 38:7-11, 2017. © 2016 Wiley Periodicals, Inc.
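As a generic illustration of multiple-comparison correction (not the GingerALE algorithm itself), the following sketch applies Benjamini-Hochberg FDR and Bonferroni corrections to a set of hypothetical p-values using statsmodels.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical uncorrected p-values from a coordinate-based meta-analysis.
rng = np.random.default_rng(0)
p_uncorrected = np.concatenate([rng.uniform(0, 0.002, 5),   # true effects
                                rng.uniform(0, 1, 995)])    # null results

# False discovery rate (Benjamini-Hochberg) and Bonferroni corrections.
fdr_reject, p_fdr, _, _ = multipletests(p_uncorrected, alpha=0.05,
                                        method="fdr_bh")
bon_reject, p_bon, _, _ = multipletests(p_uncorrected, alpha=0.05,
                                        method="bonferroni")
print(fdr_reject.sum(), bon_reject.sum())  # liberal vs. conservative control
```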
Rigorous force field optimization principles based on statistical distance minimization
Vlcek, Lukas; Chialvo, Ariel A.
2015-10-12
We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
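A minimal sketch of the underlying idea, assuming the Bhattacharyya-angle definition of statistical distance between discrete distributions and a one-parameter toy model; this is an illustration of the principle, not the authors' force-field implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def statistical_distance(p, q):
    """Bhattacharyya angle between two discrete probability distributions,
    one common definition of statistical distance."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    overlap = np.clip(np.sum(np.sqrt(p * q)), 0.0, 1.0)
    return np.arccos(overlap)

# Target histogram (e.g., a measured structural observable) and a
# one-parameter family of model histograms (all hypothetical).
bins = np.arange(10)
target = np.exp(-0.5 * ((bins - 4.0) / 1.5) ** 2)

def model(width):
    return np.exp(-0.5 * ((bins - 4.0) / width) ** 2)

res = minimize_scalar(lambda w: statistical_distance(target, model(w)),
                      bounds=(0.5, 5.0), method="bounded")
print(res.x)   # recovers the target width (~1.5)
```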
Haslam, Divna; Filus, Ania; Morawska, Alina; Sanders, Matthew R; Fletcher, Renee
2015-06-01
This paper outlines the development and validation of the Work-Family Conflict Scale (WAFCS) designed to measure work-to-family conflict (WFC) and family-to-work conflict (FWC) for use with parents of young children. An expert informant and consumer feedback approach was utilised to develop and refine 20 items, which were subjected to a rigorous validation process using two separate samples of parents of 2-12 year old children (n = 305 and n = 264). As a result of statistical analyses several items were dropped resulting in a brief 10-item scale comprising two subscales assessing theoretically distinct but related constructs: FWC (five items) and WFC (five items). Analyses revealed both subscales have good internal consistency, construct validity as well as concurrent and predictive validity. The results indicate the WAFCS is a promising brief measure for the assessment of work-family conflict in parents. Benefits of the measure as well as potential uses are discussed.
The MR-Base platform supports systematic causal inference across the human phenome
Wade, Kaitlin H; Haberland, Valeriia; Baird, Denis; Laurin, Charles; Burgess, Stephen; Bowden, Jack; Langdon, Ryan; Tan, Vanessa Y; Yarmolinsky, James; Shihab, Hashem A; Timpson, Nicholas J; Evans, David M; Relton, Caroline; Martin, Richard M; Davey Smith, George
2018-01-01
Results from genome-wide association studies (GWAS) can be used to infer causal relationships between phenotypes, using a strategy known as 2-sample Mendelian randomization (2SMR) and bypassing the need for individual-level data. However, 2SMR methods are evolving rapidly and GWAS results are often insufficiently curated, undermining efficient implementation of the approach. We therefore developed MR-Base (http://www.mrbase.org): a platform that integrates a curated database of complete GWAS results (no restrictions according to statistical significance) with an application programming interface, web app and R packages that automate 2SMR. The software includes several sensitivity analyses for assessing the impact of horizontal pleiotropy and other violations of assumptions. The database currently comprises 11 billion single nucleotide polymorphism-trait associations from 1673 GWAS and is updated on a regular basis. Integrating data with software ensures more rigorous application of hypothesis-driven analyses and allows millions of potential causal relationships to be efficiently evaluated in phenome-wide association studies. PMID:29846171
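For illustration, a compact numpy sketch of the inverse-variance-weighted (IVW) estimator that underlies basic 2SMR; MR-Base itself is an R/web platform, and the summary statistics below are hypothetical.

```python
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Inverse-variance-weighted two-sample MR estimate.

    Each SNP contributes a Wald ratio (beta_outcome / beta_exposure),
    weighted by the precision of the outcome association."""
    bx = np.asarray(beta_exposure, dtype=float)
    by = np.asarray(beta_outcome, dtype=float)
    se = np.asarray(se_outcome, dtype=float)
    w = bx ** 2 / se ** 2
    ratio = by / bx
    estimate = np.sum(w * ratio) / np.sum(w)
    se_estimate = np.sqrt(1.0 / np.sum(w))
    return estimate, se_estimate

# Hypothetical summary statistics for three genetic instruments.
est, se = ivw_mr([0.10, 0.08, 0.12], [0.020, 0.018, 0.027],
                 [0.005, 0.006, 0.005])
print(est, se)
```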
ERIC Educational Resources Information Center
Perna, Laura W.; May, Henry; Yee, April; Ransom, Tafaya; Rodriguez, Awilda; Fester, Rachél
2015-01-01
This study explores whether students from low-income families and racial/ethnic minority groups have the opportunity to benefit in what is arguably the most rigorous type of credit-based transition program: the International Baccalaureate Diploma Programme (IBDP). The analyses first describe national longitudinal trends in characteristics of…
Gruber, Bernd; Unmack, Peter J; Berry, Oliver F; Georges, Arthur
2018-05-01
Although vast technological advances have been made and genetic software packages are growing in number, it is not a trivial task to analyse SNP data. We announce a new R package, dartr, enabling the analysis of single nucleotide polymorphism data for population genomic and phylogenomic applications. dartr provides user-friendly functions for data quality control and marker selection, and permits rigorous evaluations of conformity to Hardy-Weinberg equilibrium, gametic-phase disequilibrium and neutrality. The package reports standard descriptive statistics, permits exploration of patterns in the data through principal components analysis and conducts standard F-statistics, as well as basic phylogenetic analyses, population assignment and isolation by distance, and exports data to a variety of commonly used downstream applications (e.g., newhybrids, faststructure and phylogeny applications) outside of the R environment. The package serves two main purposes. First, it offers a user-friendly approach that lowers the hurdle to analysing such data; the package therefore comes with a detailed tutorial targeted at the R beginner to allow data analysis without requiring deep knowledge of R. Second, we use a single, well-established format (genlight, from the adegenet package) as input for all our functions to avoid data reformatting. By strictly using the genlight format, we hope to facilitate this format as the de facto standard of future software developments and hence reduce the format jungle of genetic data sets. The dartr package is available via the R CRAN network and GitHub. © 2017 John Wiley & Sons Ltd.
Origin of the spike-timing-dependent plasticity rule
NASA Astrophysics Data System (ADS)
Cho, Myoung Won; Choi, M. Y.
2016-08-01
A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.
Ipsen, Andreas
2015-02-03
Despite the widespread use of mass spectrometry (MS) in a broad range of disciplines, the nature of MS data remains very poorly understood, and this places important constraints on the quality of MS data analysis as well as on the effectiveness of MS instrument design. In the following, a procedure for calculating the statistical distribution of the mass peak intensity for MS instruments that use analog-to-digital converters (ADCs) and electron multipliers is presented. It is demonstrated that the physical processes underlying the data-generation process, from the generation of the ions to the signal induced at the detector, and on to the digitization of the resulting voltage pulse, result in data that can be well-approximated by a Gaussian distribution whose mean and variance are determined by physically meaningful instrumental parameters. This allows for a very precise understanding of the signal-to-noise ratio of mass peak intensities and suggests novel ways of improving it. Moreover, it is a prerequisite for being able to address virtually all data analytical problems in downstream analyses in a statistically rigorous manner. The model is validated with experimental data.
De Luca, Carlo J; Kline, Joshua C
2014-12-01
Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles--a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. Copyright © 2014 the American Physiological Society.
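As a heavily simplified stand-in for this kind of statistically based synchronization test (not SigMax itself), the sketch below counts near-coincident firings between two simulated spike trains and compares the count with its expectation under an independence (Poisson) null; all parameters are hypothetical.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
duration = 100.0                      # seconds
rate1, rate2 = 10.0, 12.0             # firings per second (hypothetical)
train1 = np.sort(rng.uniform(0, duration, int(rate1 * duration)))
train2 = np.sort(rng.uniform(0, duration, int(rate2 * duration)))

# Count near-coincident firings within +/- 5 ms.
window = 0.005
diffs = np.abs(train1[:, None] - train2[None, :])
observed = int((diffs <= window).sum())

# Under independence, the expected coincidence count is approximately
# 2 * window * rate1 * rate2 * duration.
expected = 2 * window * rate1 * rate2 * duration
p_value = poisson.sf(observed - 1, expected)   # P(X >= observed)
print(observed, expected, p_value)   # large p-value: no evidence of synchronization
```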
NASA Astrophysics Data System (ADS)
Calì, M.; Santarelli, M. G. L.; Leone, P.
Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit, built by Siemens-Westinghouse Power Corporation (SWPC), which is at present (May 2005) starting its operation and which will supply electric and thermal power to the GTT factory. In order to take better advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP100 demonstration at EDB/ELSAM in Westerwoort. The simulated tests were then performed as a computer experimental session, and the measurement uncertainties were simulated by imposing perturbations on the model's independent variables. The statistical methodology used for the computer experimental analysis is the factorial design (Yates' technique): using the ANOVA technique, the effect of the main independent variables (air utilization factor U_ox, fuel utilization factor U_F, internal fuel and air preheating, and anodic recycling flow rate) was investigated in a rigorous manner. The analysis accounts for the effects of the parameters on stack electric power, recovered thermal power, single cell voltage, cell operative temperature, consumed fuel flow and steam-to-carbon ratio. Each main effect and interaction effect of the parameters is shown, with particular attention to generated electric power and recovered stack heat.
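A minimal sketch of 2^k factorial (Yates-style) effect estimation is given below for two coded factors; the factor names follow the abstract, but the responses are hypothetical and the real analysis used ANOVA over more factors.

```python
import numpy as np
from itertools import product

# Coded factor levels (-1/+1) for a 2^2 design: air utilization (U_ox)
# and fuel utilization (U_F). Responses are hypothetical stack powers (kW).
design = np.array(list(product([-1, 1], repeat=2)))   # rows: (U_ox, U_F)
response = np.array([96.0, 101.0, 99.0, 108.0])

# Main effects: average response at +1 minus average response at -1.
for name, column in zip(["U_ox", "U_F"], design.T):
    effect = response[column == 1].mean() - response[column == -1].mean()
    print(name, effect)

# Interaction effect from the product of the coded columns.
interaction = design[:, 0] * design[:, 1]
print("U_ox x U_F", response[interaction == 1].mean()
      - response[interaction == -1].mean())
```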
Does McNemar's test compare the sensitivities and specificities of two diagnostic tests?
Kim, Soeun; Lee, Woojoo
2017-02-01
McNemar's test is often used in practice to compare the sensitivities and specificities for the evaluation of two diagnostic tests. For correct evaluation of accuracy, an intuitive recommendation is to test the diseased and the non-diseased groups separately so that the sensitivities can be compared among the diseased, and specificities can be compared among the healthy group of people. This paper provides a rigorous theoretical framework for this argument and studies the validity of McNemar's test regardless of the conditional independence assumption. We derive McNemar's test statistic under the null hypothesis considering both assumptions of conditional independence and conditional dependence. We then perform power analyses to show how the result is affected by the amount of conditional dependence under the alternative hypothesis.
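A short sketch of the recommended practice, assuming paired test results restricted to diseased subjects and hypothetical counts, using the statsmodels implementation of McNemar's test.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Paired results of two diagnostic tests restricted to diseased subjects
# (hypothetical counts). Rows: test A +/-, columns: test B +/-.
#                 B positive  B negative
table = np.array([[60,         15],      # A positive
                  [ 5,         20]])     # A negative

# The off-diagonal (discordant) counts drive the comparison of sensitivities;
# repeating the analysis in the non-diseased group compares specificities.
result = mcnemar(table, exact=True)
print(result.statistic, result.pvalue)
```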
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
Statistical hydrodynamics and related problems in spaces of probability measures
NASA Astrophysics Data System (ADS)
Dostoglou, Stamatios
2017-11-01
A rigorous theory of statistical solutions of the Navier-Stokes equations, suitable for exploring Kolmogorov's ideas, has been developed by M.I. Vishik and A.V. Fursikov, culminating in their monograph "Mathematical problems of Statistical Hydromechanics." We review some progress made in recent years following this approach, with emphasis on problems concerning the correlation of velocities and corresponding questions in the space of probability measures on Hilbert spaces.
Treetrimmer: a method for phylogenetic dataset size reduction.
Maruyama, Shinichiro; Eveleigh, Robert J M; Archibald, John M
2013-04-12
With rapid advances in genome sequencing and bioinformatics, it is now possible to generate phylogenetic trees containing thousands of operational taxonomic units (OTUs) from a wide range of organisms. However, use of rigorous tree-building methods on such large datasets is prohibitive and manual 'pruning' of sequence alignments is time consuming and raises concerns over reproducibility. There is a need for bioinformatic tools with which to objectively carry out such pruning procedures. Here we present 'TreeTrimmer', a bioinformatics procedure that removes unnecessary redundancy in large phylogenetic datasets, alleviating the size effect on more rigorous downstream analyses. The method identifies and removes user-defined 'redundant' sequences, e.g., orthologous sequences from closely related organisms and 'recently' evolved lineage-specific paralogs. Representative OTUs are retained for more rigorous re-analysis. TreeTrimmer reduces the OTU density of phylogenetic trees without sacrificing taxonomic diversity while retaining the original tree topology, thereby speeding up downstream computer-intensive analyses, e.g., Bayesian and maximum likelihood tree reconstructions, in a reproducible fashion.
Quantifying falsifiability of scientific theories
NASA Astrophysics Data System (ADS)
Nemenman, Ilya
I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem
2018-04-01
This study was conducted to evaluate the effects of pre- and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid and 2% NaCl+0.5 M sodium lactate. In this study marinade uptake, pH, free water, cooking loss, drip loss and color properties were analyzed. Injection time had a significant effect on the marinade uptake levels of the samples. Regardless of marinade formulation, marinade uptake of pre-rigor samples injected with marinade solutions was higher than that of post-rigor samples. Injection of sodium lactate increased the pH values of the samples, whereas lactic acid injection decreased pH. Marinade treatment and storage period had significant effects on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had significant effects on drip loss values. Drip loss in all samples increased during storage. During all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fading in pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p < 0.05). At days 0 and 3, the lowest CIE b* values were obtained from pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in the CIE b* values of any of the samples.
Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo
2018-03-15
RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible for the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes easy executions of RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing the retrieval versus the proportion of false discoveries. The results viewer displays and allows the users to download the analyses results. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html .
Scott, David J.; Winzor, Donald J.
2009-01-01
We have examined in detail analytical solutions of expressions for sedimentation equilibrium in the analytical ultracentrifuge to describe self-association under nonideal conditions. We find that those containing the radial dependence of total solute concentration that incorporate the Adams-Fujita assumption for composition-dependence of activity coefficients reveal potential shortcomings for characterizing such systems. Similar deficiencies are shown in the use of the NONLIN software incorporating the same assumption about the interrelationship between activity coefficients for monomer and polymer species. These difficulties can be overcome by iterative analyses incorporating expressions for the composition-dependence of activity coefficients predicted by excluded volume considerations. A recommendation is therefore made for the replacement of current software packages by programs that incorporate rigorous statistical-mechanical allowance for thermodynamic nonideality in sedimentation equilibrium distributions reflecting solute self-association. PMID:19651047
Trends in Mediation Analysis in Nursing Research: Improving Current Practice.
Hertzog, Melody
2018-06-01
The purpose of this study was to describe common approaches used by nursing researchers to test mediation models and evaluate them within the context of current methodological advances. MEDLINE was used to locate studies testing a mediation model and published from 2004 to 2015 in nursing journals. Design (experimental/correlation, cross-sectional/longitudinal, model complexity) and analysis (method, inclusion of test of mediated effect, violations/discussion of assumptions, sample size/power) characteristics were coded for 456 studies. General trends were identified using descriptive statistics. Consistent with findings of reviews in other disciplines, evidence was found that nursing researchers may not be aware of the strong assumptions and serious limitations of their analyses. Suggestions for strengthening the rigor of such studies and an overview of current methods for testing more complex models, including longitudinal mediation processes, are presented.
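For illustration, a minimal bootstrap test of the indirect effect in a single-mediator model, one of the more defensible current practices such reviews point toward; the data are simulated and the model is deliberately simple.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300
x = rng.normal(size=n)                        # predictor
m = 0.5 * x + rng.normal(size=n)              # mediator (true a path = 0.5)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # outcome  (true b path = 0.4)

def indirect_effect(x, m, y):
    """Product-of-coefficients estimate a*b for a single-mediator model."""
    a = np.polyfit(x, m, 1)[0]                        # slope of m on x
    design = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]  # slope of y on m, given x
    return a * b

# Percentile bootstrap confidence interval for the indirect effect.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
ci = np.percentile(boot, [2.5, 97.5])
print(indirect_effect(x, m, y), ci)   # an interval excluding 0 supports mediation
```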
Musicians, postural quality and musculoskeletal health: A literature's review.
Blanco-Piñeiro, Patricia; Díaz-Pereira, M Pino; Martínez, Aurora
2017-01-01
An analysis of the salient characteristics of research papers published between 1989 and 2015 that evaluate the relationship between postural quality during musical performance and various performance quality and health factors, with emphasis on musculoskeletal health variables. Searches of Medline, Scopus and Google Scholar for papers that analysed the subject of the study objective. The following MeSH descriptors were used: posture; postural balance; muscle, skeletal; task performance and analysis; back; and spine and music. A descriptive statistical analysis of their methodology (sample types, temporal design, and postural, health and other variables analysed) and findings has been made. The inclusion criterion was that the body postural quality of the musicians during performance was included among the target study variables. Forty-one relevant empirical studies were found, written in English. Comparison and analysis of their results was hampered by great disparities in measuring instruments and operationalization of variables. Despite the growing interest in the relationships among these variables, the empirical knowledge base still has many limitations, making rigorous comparative analysis difficult. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J
2017-12-01
qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
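A minimal sketch of the efficiency-weighted Cq idea defined above (log10(E) · Cq), with hypothetical Cq values and efficiencies; reference-gene normalization and blocking, which the full method supports, are omitted here.

```python
import numpy as np
from scipy import stats

def efficiency_weighted_cq(cq, efficiency):
    """Efficiency-weighted Cq value, log10(E) * Cq, keeping the data in
    log scale as in the Common Base Method."""
    return np.log10(efficiency) * np.asarray(cq, dtype=float)

# Hypothetical target-gene reactions for control and treated samples,
# with well-specific amplification efficiencies (E = 2 is perfect doubling).
control = efficiency_weighted_cq([22.1, 22.4, 21.9], [1.95, 1.98, 1.96])
treated = efficiency_weighted_cq([20.3, 20.6, 20.1], [1.97, 1.94, 1.96])

# Group comparison stays in log10 scale; back-transform only for reporting.
t, p = stats.ttest_ind(control, treated)
log10_difference = control.mean() - treated.mean()
print(log10_difference, 10 ** log10_difference, p)
```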
Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa
Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...
Nicolay, C R; Purkayastha, S; Greenhalgh, A; Benn, J; Chaturvedi, S; Phillips, N; Darzi, A
2012-03-01
The demand for the highest-quality patient care coupled with pressure on funding has led to the increasing use of quality improvement (QI) methodologies from the manufacturing industry. The aim of this systematic review was to identify and evaluate the application and effectiveness of these QI methodologies to the field of surgery. MEDLINE, the Cochrane Database, Allied and Complementary Medicine Database, British Nursing Index, Cumulative Index to Nursing and Allied Health Literature, Embase, Health Business(™) Elite, the Health Management Information Consortium and PsycINFO(®) were searched according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement. Empirical studies were included that implemented a described QI methodology to surgical care and analysed a named outcome statistically. Some 34 of 1595 articles identified met the inclusion criteria after consensus from two independent investigators. Nine studies described continuous quality improvement (CQI), five Six Sigma, five total quality management (TQM), five plan-do-study-act (PDSA) or plan-do-check-act (PDCA) cycles, five statistical process control (SPC) or statistical quality control (SQC), four Lean and one Lean Six Sigma; 20 of the studies were undertaken in the USA. The most common aims were to reduce complications or improve outcomes (11), to reduce infection (7), and to reduce theatre delays (7). There was one randomized controlled trial. QI methodologies from industry can have significant effects on improving surgical care, from reducing infection rates to increasing operating room efficiency. The evidence is generally of suboptimal quality, and rigorous randomized multicentre studies are needed to bring evidence-based management into the same league as evidence-based medicine. Copyright © 2011 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.
Chouteau, Mathieu; Whibley, Annabel; Joron, Mathieu; Llaurens, Violaine
2016-01-01
Identifying the genetic basis of adaptive variation is challenging in non-model organisms, and quantitative real-time PCR is a useful tool for validating predictions regarding the expression of candidate genes. However, comparing expression levels in different conditions requires rigorous experimental design and statistical analyses. Here, we focused on the neotropical passion-vine butterflies Heliconius, non-model species studied in evolutionary biology for their adaptive variation in wing color patterns involved in mimicry and in the signaling of their toxicity to predators. We aimed at selecting stable reference genes to be used for normalization of gene expression data in RT-qPCR analyses from developing wing discs according to the minimal guidelines described in the Minimum Information for publication of Quantitative Real-Time PCR Experiments (MIQE). To design internal RT-qPCR controls, we studied the stability of expression of nine candidate reference genes (actin, annexin, eF1α, FK506BP, PolyABP, PolyUBQ, RpL3, RPS3A, and tubulin) at two developmental stages (prepupal and pupal) using three widely used programs (GeNorm, NormFinder and BestKeeper). Results showed that, despite differences in statistical methods, the genes RpL3, eF1α, polyABP, and annexin were stably expressed in wing discs in late larval and pupal stages of Heliconius numata. This combination of genes may be used as a reference for a reliable study of differential expression in wings, for instance for genes involved in important phenotypic variation, such as wing color pattern variation. Through this example, we provide generally useful technical recommendations as well as relevant statistical strategies for evolutionary biologists aiming to identify candidate genes involved in adaptive variation in non-model organisms. PMID:27271971
Digital morphogenesis via Schelling segregation
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2018-04-01
Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, with prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
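A minimal simulation sketch of one Schelling-type variant (two agent types on a fully occupied torus with swap dynamics) is given below; the parameters are arbitrary and the code illustrates the model's qualitative behaviour rather than the rigorous analysis described.

```python
import numpy as np

rng = np.random.default_rng(0)
n, tolerance, steps = 50, 0.5, 200_000     # grid size, intolerance threshold, swaps
grid = rng.choice([0, 1], size=(n, n))     # two agent types, full occupancy

def unlike_fraction(grid, i, j):
    """Fraction of unlike agents in the 8-cell Moore neighbourhood
    (toroidal boundaries)."""
    neigh = [grid[(i + di) % n, (j + dj) % n]
             for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    return sum(v != grid[i, j] for v in neigh) / 8.0

for _ in range(steps):
    # Swap two randomly chosen agents of opposite type if the first is unhappy.
    (i1, j1), (i2, j2) = rng.integers(0, n, size=(2, 2))
    if grid[i1, j1] != grid[i2, j2] and unlike_fraction(grid, i1, j1) > tolerance:
        grid[i1, j1], grid[i2, j2] = grid[i2, j2], grid[i1, j1]

# Segregation index: average fraction of like-type neighbours (0.5 = random mix).
segregation = 1.0 - np.mean([unlike_fraction(grid, i, j)
                             for i in range(n) for j in range(n)])
print(segregation)
```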
QTest: Quantitative Testing of Theories of Binary Choice.
Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
Mindfulness Meditation for Chronic Pain: Systematic Review and Meta-analysis.
Hilton, Lara; Hempel, Susanne; Ewing, Brett A; Apaydin, Eric; Xenakis, Lea; Newberry, Sydne; Colaiaco, Ben; Maher, Alicia Ruelaz; Shanman, Roberta M; Sorbero, Melony E; Maglione, Margaret A
2017-04-01
Chronic pain patients increasingly seek treatment through mindfulness meditation. This study aims to synthesize evidence on efficacy and safety of mindfulness meditation interventions for the treatment of chronic pain in adults. We conducted a systematic review on randomized controlled trials (RCTs) with meta-analyses using the Hartung-Knapp-Sidik-Jonkman method for random-effects models. Quality of evidence was assessed using the GRADE approach. Outcomes included pain, depression, quality of life, and analgesic use. Thirty-eight RCTs met inclusion criteria; seven reported on safety. We found low-quality evidence that mindfulness meditation is associated with a small decrease in pain compared with all types of controls in 30 RCTs. Statistically significant effects were also found for depression symptoms and quality of life. While mindfulness meditation improves pain and depression symptoms and quality of life, additional well-designed, rigorous, and large-scale RCTs are needed to decisively provide estimates of the efficacy of mindfulness meditation for chronic pain.
Rate, Andrew W
2018-06-15
Urban environments are dynamic and highly heterogeneous, and multiple additions of potential contaminants are likely on timescales which are short relative to natural processes. The likely sources and locations of soil or sediment contamination in urban environments should therefore be detectable using multielement geochemical composition combined with rigorously applied multivariate statistical techniques. Soil, wetland sediment, and street dust were sampled along intersecting transects in Robertson Park in metropolitan Perth, Western Australia. Samples were analysed for near-total concentrations of multiple elements (including Cd, Ce, Co, Cr, Cu, Fe, Gd, La, Mn, Nd, Ni, Pb, Y, and Zn), as well as pH and electrical conductivity. Samples at some locations within Robertson Park had high concentrations of potentially toxic elements (Pb above Health Investigation Limits; As, Ba, Cu, Mn, Ni, Pb, V, and Zn above Ecological Investigation Limits). However, these concentrations carry low risk due to the main land use as recreational open space, the low proportion of samples exceeding guideline values, and a tendency for the highest concentrations to be located within the less accessible wetland basin. The different spatial distributions of different groups of contaminants were consistent with different inputs of contaminants related to changes in land use and technology over the history of the site. Multivariate statistical analyses reinforced the spatial information, with principal component analysis identifying geochemical associations of elements which were also spatially related. A multivariate linear discriminant model was able to discriminate samples into a priori types, and could predict sample type with 84% accuracy based on multielement composition. The findings suggest substantial advantages of characterising a site using multielement and multivariate analyses, an approach which could benefit investigations of other sites of concern. Copyright © 2018 Elsevier B.V. All rights reserved.
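For illustration, a sketch combining principal component analysis with a cross-validated linear discriminant model on simulated multielement data, using scikit-learn; the element concentrations and sample types are hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Simulated log-transformed multielement concentrations for three sample
# types (soil, wetland sediment, street dust); values are hypothetical.
rng = np.random.default_rng(5)
n_per_type, n_elements = 40, 12
centres = rng.normal(0, 1, size=(3, n_elements))
X = np.vstack([rng.normal(c, 0.7, size=(n_per_type, n_elements)) for c in centres])
y = np.repeat(["soil", "sediment", "dust"], n_per_type)

# Principal components summarise geochemical associations of elements.
scores = make_pipeline(StandardScaler(), PCA(n_components=3)).fit_transform(X)
print(scores[:3])

# Cross-validated linear discriminant model predicting sample type.
lda = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
print(cross_val_score(lda, X, y, cv=5).mean())   # classification accuracy
```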
2015-01-01
The impact of living abroad is a topic that has intrigued researchers for almost a century, if not longer. While many acculturation phenomena have been studied over this time, the development of new research methods and statistical software in recent years means that these can be revisited and examined in a more rigorous manner. In the present study we were able to follow approximately 2,500 intercultural exchange students situated in over 50 different countries worldwide, over time both before and during their travel, using online surveys. Advanced statistical analyses were employed to examine the course of sojourners' stress and adjustment over time, its antecedents and consequences. By comparing a sojourner sample with a control group of nonsojourning peers we were able to highlight the uniqueness of the sojourn experience in terms of stress variability over time. Using Latent Class Growth Analysis to examine the nature of this variability revealed 5 distinct patterns of change in stress experienced by sojourners over the course of their exchange: a reverse J-curve, inverse U-curve, mild stress, minor relief, and resilience pattern. Antecedent explanatory variables for stress variability were examined using both variable-centered and person-centered analyses, and evidence for the role of personality, empathy, cultural adaptation, and coping strategies was found in each case. Lastly, we examined the relationship between stress abroad and behavioral indicators of (mal)adjustment: number of family changes and early termination of the exchange program. PMID:26191963
Knowledge translation and implementation in spinal cord injury: a systematic review.
Noonan, V K; Wolfe, D L; Thorogood, N P; Park, S E; Hsieh, J T; Eng, J J
2014-08-01
To conduct a systematic review examining the effectiveness of knowledge translation (KT) interventions in changing clinical practice and patient outcomes. MEDLINE/PubMed, CINAHL, EMBASE and PsycINFO were searched for studies published from January 1980 to July 2012 that reported and evaluated an implemented KT intervention in spinal cord injury (SCI) care. We reviewed and summarized results from studies that documented the implemented KT intervention, its impact on changing clinician behavior and patient outcomes as well as the facilitators and barriers encountered during the implementation. A total of 13 articles featuring 10 studies were selected and abstracted from 4650 identified articles. KT interventions included developing and implementing patient care protocols, providing clinician education and incorporating outcome measures into clinical practice. The methods (or drivers) to facilitate the implementation included organizing training sessions for clinical staff, introducing computerized reminders and involving organizational leaders. The methodological quality of studies was mostly poor. Only 3 out of 10 studies evaluated the success of the implementation using statistical analyses, and all 3 reported significant behavior change. Out of the 10 studies, 6 evaluated the effect of the implementation on patient outcomes using statistical analyses, with 4 reporting significant improvements. The commonly cited facilitators and barriers were communication and resources, respectively. The field of KT in SCI is in its infancy with only a few relevant publications. However, there is some evidence that KT interventions may change clinician behavior and improve patient outcomes. Future studies should ensure rigorous study methods are used to evaluate KT interventions.
PCA as a practical indicator of OPLS-DA model reliability.
Worley, Bradley; Powers, Robert
Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
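A minimal sketch of the Monte Carlo idea, assuming simulated two-group spectra and using ordinary PLS regression as a stand-in for OPLS-DA (scikit-learn does not provide OPLS): with increasing added noise, the PCA scores-space group distance shrinks while the cross-validated Q2 deteriorates.

```python
# Add increasing Gaussian noise to a two-group data matrix and track (a) PCA scores-space
# separation and (b) cross-validated Q2 of a PLS-DA model (PLS used here instead of OPLS-DA).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
n, p = 40, 200
y = np.repeat([0.0, 1.0], n // 2)
X0 = rng.normal(size=(n, p)) + 1.5 * np.outer(y, rng.normal(size=p) > 1.5)

for noise_sd in (0.0, 1.0, 2.0, 4.0):
    X = X0 + rng.normal(scale=noise_sd, size=X0.shape)
    scores = PCA(n_components=2).fit_transform(X)
    sep = np.linalg.norm(scores[y == 0].mean(0) - scores[y == 1].mean(0))
    y_hat = cross_val_predict(PLSRegression(n_components=2), X, y, cv=7).ravel()
    q2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"noise sd {noise_sd:3.1f}: PCA group distance {sep:5.2f}, Q2 {q2:5.2f}")
```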
DOT National Transportation Integrated Search
1996-04-01
This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.
Rigor Mortis: Statistical thoroughness in reporting and the making of truth.
Tal, Aner
2016-02-01
Should a uniform checklist be adopted for methodological and statistical reporting? The current article discusses this notion, with particular attention to the use of old versus new statistics, and a consideration of the arguments brought up by Von Roten. The article argues that an overly exhaustive checklist that is uniformly applied to all submitted papers may be unsuitable for multidisciplinary work, and would further result in undue clutter and potentially distract reviewers from pertinent considerations in their evaluation of research articles. © The Author(s) 2015.
Zonation in the deep benthic megafauna : Application of a general test.
Gardiner, Frederick P; Haedrich, Richard L
1978-01-01
A test based on Maxwell-Boltzmann statistics, instead of the formerly suggested but inappropriate Bose-Einstein statistics (Pielou and Routledge, 1976), examines the distribution of the boundaries of species' ranges distributed along a gradient, and indicates whether they are random or clustered (zoned). The test is most useful as a preliminary to the application of more instructive but less statistically rigorous methods such as cluster analysis. The test indicates zonation is marked in the deep benthic megafauna living between 200 and 3000 m, but below 3000 m little zonation may be found.
Design and Analysis for Thematic Map Accuracy Assessment: Fundamental Principles
Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...
Hayat, Matthew J; Schmiege, Sarah J; Cook, Paul F
2014-04-01
Statistics knowledge is essential for understanding the nursing and health care literature, as well as for applying rigorous science in nursing research. Statistical consultants providing services to faculty and students in an academic nursing program have the opportunity to identify gaps and challenges in statistics education for nursing students. This information may be useful to curriculum committees and statistics educators. This article aims to provide perspective on statistics education stemming from the experiences of three experienced statistics educators who regularly collaborate and consult with nurse investigators. The authors share their knowledge and express their views about data management, data screening and manipulation, statistical software, types of scientific investigation, and advanced statistical topics not covered in the usual coursework. The suggestions provided promote a call for data to study these topics. Relevant data about statistics education can assist educators in developing comprehensive statistics coursework for nursing students. Copyright 2014, SLACK Incorporated.
Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun
2016-09-14
Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs . The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick
2006-07-01
Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricite de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable which significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCCM and ASME Code lower-bound K{sub IC} based on the indexing parameter RT{sub NDT}. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K{sub JC} data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways. Therefore we focused exclusively on the basic French reactor vessel metals of types A508 Class 3 and A533 Grade B Class 1, taking the sampling level and direction into account as well as the test specimen type. As for the second point, the emphasis is placed on the uncertainties in applying the master curve approach. For a toughness dataset based on different specimens of a single product, application of the master curve methodology requires the statistical estimation of one parameter: the reference temperature T{sub 0}. Because of the limited number of specimens, estimation of this temperature is uncertain. The ASTM standard provides a rough evaluation of this statistical uncertainty through an approximate confidence interval. In this paper, a thorough study is carried out to build more meaningful confidence intervals (for both mono-temperature and multi-temperature tests). These results ensure better control over uncertainty, and allow rigorous analysis of the impact of its influencing factors: the number of specimens and the temperatures at which they have been tested. (authors)
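As a loose illustration of the statistical uncertainty question, the sketch below fits the median master curve K_Jc(med) = 30 + 70 exp[0.019(T - T0)] by least squares and bootstraps a confidence interval for T0. This is not the ASTM E1921 maximum-likelihood procedure analysed in the article, and the toughness values are invented.

```python
# Simplified illustration of estimating the master curve reference temperature T0 and a
# bootstrap confidence interval. The median curve KJc_med(T) = 30 + 70*exp(0.019*(T - T0))
# (MPa*sqrt(m), T in deg C) is fitted by least squares; ASTM E1921 uses a maximum-likelihood
# estimator instead, so this only sketches the uncertainty question.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
T = np.array([-120, -110, -100, -90, -80, -70], dtype=float)   # test temperatures, deg C
KJc = np.array([45.0, 60.0, 70.0, 95.0, 120.0, 160.0])          # invented toughness data

def master_curve(T, T0):
    return 30.0 + 70.0 * np.exp(0.019 * (T - T0))

T0_hat, _ = curve_fit(master_curve, T, KJc, p0=[-90.0])
boot = []
for _ in range(2000):
    idx = rng.integers(0, T.size, T.size)                       # resample specimens
    est, _ = curve_fit(master_curve, T[idx], KJc[idx], p0=[-90.0])
    boot.append(est[0])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"T0 = {T0_hat[0]:.1f} C, 95% bootstrap CI [{lo:.1f}, {hi:.1f}] C")
```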
NASA Astrophysics Data System (ADS)
Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker
2018-04-01
A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as 'field' or 'global' significance. The block length for the local resampling tests is precisely determined to adequately account for the time series structure. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. The methodology is exemplarily applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Daily precipitation climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. While the downscaled precipitation distributions are statistically indistinguishable from the observed ones in most regions in summer, the biases of some distribution characteristics are significant over large areas in winter. WRF-NOAH generates appropriate stationary fine-scale climate features in the daily precipitation field over regions of complex topography in both seasons and appropriate transient fine-scale features almost everywhere in summer. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
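A toy version of the multiple-comparison step is sketched below: local tests at many grid cells with the proportion of false rejections controlled by the Benjamini-Hochberg procedure. The study's actual field significance test additionally uses block resampling to respect the time series structure; the grid, bias pattern, and daily differences here are simulated.

```python
# Toy illustration of 'field' significance: local tests at many grid cells, with the
# proportion of false rejections controlled by the Benjamini-Hochberg FDR procedure.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(4)
n_cells, n_days = 400, 200
bias = np.zeros(n_cells)
bias[:80] = 0.4                                     # 20% of cells have a real bias
# Simulated daily model-minus-observation differences per grid cell.
diffs = rng.normal(loc=bias[:, None], scale=1.0, size=(n_cells, n_days))

p_local = np.array([stats.ttest_1samp(d, 0.0).pvalue for d in diffs])
reject, _, _, _ = multipletests(p_local, alpha=0.05, method="fdr_bh")
print(f"locally significant cells after FDR control: {reject.sum()} / {n_cells}")
```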
Transformation and model choice for RNA-seq co-expression analysis.
Rau, Andrea; Maugis-Rabusseau, Cathy
2018-05-01
Although a large number of clustering algorithms have been proposed to identify groups of co-expressed genes from microarray data, the question of if and how such methods may be applied to RNA sequencing (RNA-seq) data remains unaddressed. In this work, we investigate the use of data transformations in conjunction with Gaussian mixture models for RNA-seq co-expression analyses, as well as a penalized model selection criterion to select both an appropriate transformation and number of clusters present in the data. This approach has the advantage of accounting for per-cluster correlation structures among samples, which can be strong in RNA-seq data. In addition, it provides a rigorous statistical framework for parameter estimation, an objective assessment of data transformations and number of clusters and the possibility of performing diagnostic checks on the quality and homogeneity of the identified clusters. We analyze four varied RNA-seq data sets to illustrate the use of transformations and model selection in conjunction with Gaussian mixture models. Finally, we propose a Bioconductor package coseq (co-expression of RNA-seq data) to facilitate implementation and visualization of the recommended RNA-seq co-expression analyses.
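The transformation-plus-mixture idea can be sketched as below, using BIC as a simple stand-in for the penalized criterion and ignoring the Jacobian correction needed to compare criteria across transformations (a point the proposed approach addresses). Counts are simulated; real analyses would use the coseq package in R.

```python
# Sketch of transformation + Gaussian mixture co-expression clustering: compare candidate
# transformations of expression profiles and numbers of clusters with a penalized criterion.
# BIC is used here for simplicity; strictly, criteria compared across transformations need a
# Jacobian correction, which the coseq methodology handles.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
counts = rng.poisson(lam=rng.gamma(2.0, 50.0, size=(1000, 8)))      # fake RNA-seq counts
profiles = counts / counts.sum(axis=1, keepdims=True)                # per-gene profiles

transforms = {
    "log": np.log(counts + 1.0),
    "arcsin": np.arcsin(np.sqrt(profiles)),
}
best = None
for name, X in transforms.items():
    for k in range(2, 9):
        gm = GaussianMixture(k, covariance_type="full", random_state=0).fit(X)
        score = gm.bic(X)
        if best is None or score < best[0]:
            best = (score, name, k)
print("selected transformation and cluster number:", best[1], best[2])
```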
A two-factor error model for quantitative steganalysis
NASA Astrophysics Data System (ADS)
Böhme, Rainer; Ker, Andrew D.
2006-02-01
Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.
NASA Astrophysics Data System (ADS)
Gottwald, Georg; Melbourne, Ian
2013-04-01
Whereas diffusion limits of stochastic multi-scale systems have a long and successful history, the case of constructing stochastic parametrizations of chaotic deterministic systems has been much less studied. We present rigorous results on the convergence of a chaotic slow-fast system to a stochastic differential equation with multiplicative noise. Furthermore, we present rigorous results for chaotic slow-fast maps, occurring as numerical discretizations of continuous-time systems. This raises the issue of how to interpret certain stochastic integrals; surprisingly, the resulting integrals of the stochastic limit system are generically neither of Stratonovich nor of Ito type in the case of maps. It is shown that the limit system of a numerical discretisation differs from that of the associated continuous-time system. This has important consequences when interpreting the statistics of long-time simulations of multi-scale systems: they may be very different from those of the original continuous-time system which we set out to study.
Using GIS to generate spatially balanced random survey designs for natural resource applications.
Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B
2007-07-01
Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demand survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
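A minimal sketch of spatially balanced sampling, assuming a square raster of cells: order cells along a quadrant-recursive (Morton/Z-order) curve and take a systematic sample with a random start. This mimics the spirit of the RRQRR/GRTS family of designs rather than the authors' exact GIS implementation.

```python
# Minimal sketch of spatially balanced sampling: order raster cells along a
# quadrant-recursive (Morton/Z-order) curve, then take a systematic random-start sample.
import numpy as np

def morton_index(x, y, bits=8):
    """Interleave the bits of integer cell coordinates (x, y)."""
    z = 0
    for b in range(bits):
        z |= ((x >> b) & 1) << (2 * b) | ((y >> b) & 1) << (2 * b + 1)
    return z

rng = np.random.default_rng(6)
size, n_sample = 64, 50
cells = [(x, y) for x in range(size) for y in range(size)]
order = sorted(cells, key=lambda c: morton_index(*c))          # quadrant-recursive order

step = len(order) / n_sample
start = rng.uniform(0, step)
sample = [order[int(start + i * step)] for i in range(n_sample)]
print("first five spatially balanced sample cells:", sample[:5])
```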
ERIC Educational Resources Information Center
Higginbotham, David L.
2013-01-01
This study leveraged the complementary nature of confirmatory factor (CFA), item response theory (IRT), and latent class (LCA) analyses to strengthen the rigor and sophistication of evaluation of two new measures of the Air Force Academy's "leader of character" definition--the Character Mosaic Virtues (CMV) and the Leadership Mosaic…
NASA Astrophysics Data System (ADS)
Lanfredi, M.; Simoniello, T.; Cuomo, V.; Macchiato, M.
2009-02-01
This study originated from recent results reported in literature, which support the existence of long-range (power-law) persistence in atmospheric temperature fluctuations on monthly and inter-annual scales. We investigated the results of Detrended Fluctuation Analysis (DFA) carried out on twenty-two historical daily time series recorded in Europe in order to evaluate the reliability of such findings in depth. More detailed inspections emphasized systematic deviations from power-law and high statistical confidence for functional form misspecification. Rigorous analyses did not support scale-free correlation as an operative concept for Climate modelling, as instead suggested in literature. In order to understand the physical implications of our results better, we designed a bivariate Markov process, parameterised on the basis of the atmospheric observational data by introducing a slow dummy variable. The time series generated by this model, analysed both in time and frequency domains, tallied with the real ones very well. They accounted for both the deceptive scaling found in literature and the correlation details enhanced by our analysis. Our results seem to evidence the presence of slow fluctuations from another climatic sub-system such as ocean, which inflates temperature variance up to several months. They advise more precise re-analyses of temperature time series before suggesting dynamical paradigms useful for Climate modelling and for the assessment of Climate Change.
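A minimal DFA implementation is sketched below for reference: build the profile, detrend within windows, and regress log F(n) on log n. The point of the study is that apparent straight-line scaling should be tested against alternative functional forms before claiming long-range persistence; the input series here is simulated white noise, not an observed temperature record.

```python
# Minimal detrended fluctuation analysis (DFA): integrate the series, detrend within
# windows of size n, and regress log F(n) on log n. A straight line with slope alpha
# suggests power-law scaling; deviations from it should be tested before claiming
# long-range persistence.
import numpy as np

def dfa(series, window_sizes):
    profile = np.cumsum(series - np.mean(series))
    fluctuations = []
    for n in window_sizes:
        n_windows = len(profile) // n
        segments = profile[: n_windows * n].reshape(n_windows, n)
        t = np.arange(n)
        resid_var = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)                 # local linear trend
            resid_var.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(resid_var)))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return slope

rng = np.random.default_rng(7)
temperature_anomaly = rng.normal(size=20000)             # white noise: expect alpha ~ 0.5
sizes = np.unique(np.logspace(1.0, 3.0, 15).astype(int))
print("estimated DFA exponent:", round(dfa(temperature_anomaly, sizes), 2))
```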
A statistical physics perspective on criticality in financial markets
NASA Astrophysics Data System (ADS)
Bury, Thomas
2013-11-01
Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood.
Integration of Technology into the Classroom: Case Studies.
ERIC Educational Resources Information Center
Johnson, D. LaMont, Ed.; Maddux, Cleborne D., Ed.; Liu, Leping, Ed.
This book contains the following case studies on the integration of technology in education: (1) "First Steps toward a Statistically Generated Information Technology Integration Model" (D. LaMont Johnson and Leping Liu); (2) "Case Studies: Are We Rejecting Rigor or Rediscovering Richness?" (Cleborne D. Maddux); (3)…
A Psychometric Evaluation of the Digital Logic Concept Inventory
ERIC Educational Resources Information Center
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2014-01-01
Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric…
Cheng, Ning; Rahman, Md Motiur; Alatawi, Yasser; Qian, Jingjing; Peissig, Peggy L; Berg, Richard L; Page, C David; Hansen, Richard A
2018-04-01
Several different types of drugs acting on the central nervous system (CNS) have previously been associated with an increased risk of suicide and suicidal ideation (broadly referred to as suicide). However, a differential association between brand and generic CNS drugs and suicide has not been reported. This study compares suicide adverse event rates for brand versus generic CNS drugs using multiple sources of data. Selected examples of CNS drugs (sertraline, gabapentin, zolpidem, and methylphenidate) were evaluated via the US FDA Adverse Event Reporting System (FAERS) for a hypothesis-generating study, and then via administrative claims and electronic health record (EHR) data for a more rigorous retrospective cohort study. Disproportionality analyses with reporting odds ratios and 95% confidence intervals (CIs) were used in the FAERS analyses to quantify the association between each drug and reported suicide. For the cohort studies, Cox proportional hazards models were used, controlling for demographic and clinical characteristics as well as the background risk of suicide in the insured population. The FAERS analyses found significantly lower suicide reporting rates for brands compared with generics for all four studied products (Breslow-Day P < 0.05). In the claims- and EHR-based cohort study, the adjusted hazard ratio (HR) was statistically significant only for sertraline (HR 0.58; 95% CI 0.38-0.88). Suicide reporting rates were disproportionately larger for generic than for brand CNS drugs in FAERS, and adjusted retrospective cohort analyses remained significant only for sertraline. However, even for sertraline, temporal confounding related to the close proximity of black box warnings and generic availability is possible. Additional analyses in larger data sources with additional drugs are needed.
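The disproportionality measure used in the FAERS analyses can be illustrated with a reporting odds ratio and its 95% confidence interval computed from a 2x2 table of spontaneous reports; the counts below are invented.

```python
# Reporting odds ratio (ROR) with a 95% CI from a 2x2 table of spontaneous reports:
#               suicide event   other events
#   drug A          a               b
#   all others      c               d
# Counts below are invented for illustration only.
import math

a, b, c, d = 120, 9880, 800, 150000
ror = (a / b) / (c / d)
se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(ror) - 1.96 * se_log)
hi = math.exp(math.log(ror) + 1.96 * se_log)
print(f"ROR = {ror:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```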
Nielsen, Rasmus Østergaard; Malisoux, Laurent; Møller, Merete; Theisen, Daniel; Parner, Erik Thorlund
2016-04-01
The etiological mechanism underpinning any sports-related injury is complex and multifactorial. Frequently, athletes perceive "excessive training" as the principal factor in their injury, an observation that is biologically plausible yet somewhat ambiguous. If the applied training load is suddenly increased, this may increase the risk for sports injury development, irrespective of the absolute amount of training. Indeed, little to no rigorous scientific evidence exists to support the hypothesis that fluctuations in training load, compared to absolute training load, are more important in explaining sports injury development. One reason for this could be that prospective data from scientific studies should be analyzed in a different manner. Time-to-event analysis is a useful statistical tool for analyzing the influence of changing exposures on injury risk. However, the potential of time-to-event analysis remains insufficiently exploited in sports injury research. Therefore, the purpose of the present article was to present and discuss measures of association used in time-to-event analyses and to introduce the advanced concept of time-varying exposures and outcomes. In the paper, different measures of association, such as cumulative relative risk, cumulative risk difference, and the classical hazard rate ratio, are presented in a nontechnical manner, and suggestions for interpretation of study results are provided. To summarize, time-to-event analysis complements the statistical arsenal of sports injury prevention researchers, because it enables them to analyze the complex and highly dynamic reality of injury etiology, injury recurrence, and time to recovery across a range of sporting contexts.
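A minimal sketch of a time-to-event analysis yielding a hazard rate ratio, assuming the lifelines package is available and using one row per athlete; the time-varying exposures discussed in the article require the extended counting-process data layout instead.

```python
# Sketch of a time-to-event analysis for sports injury data with the lifelines package
# (assumed installed): a Cox model giving a hazard rate ratio for a training-load change
# indicator. Data are simulated, with administrative censoring at 52 weeks.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 300
spike = rng.integers(0, 2, n)                          # 1 = sudden increase in training load
followup = rng.exponential(scale=np.where(spike == 1, 20.0, 35.0), size=n)
time = np.minimum(followup, 52.0)
event = (followup <= 52.0).astype(int)

df = pd.DataFrame({"weeks": time, "injured": event, "load_spike": spike})
cph = CoxPHFitter().fit(df, duration_col="weeks", event_col="injured")
print(cph.hazard_ratios_)                              # hazard rate ratio for load_spike
```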
Improved key-rate bounds for practical decoy-state quantum-key-distribution systems
NASA Astrophysics Data System (ADS)
Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng
2017-01-01
The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
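The practical difference between fluctuation bounds can be illustrated as below, comparing the deviation allowed by a heuristic Gaussian approximation with a rigorous Hoeffding-type bound at the same failure probability. The Chernoff-based bounds developed in this line of work are tighter than Hoeffding for rare events while remaining rigorous; their exact expressions are not reproduced here.

```python
# Why the choice of statistical fluctuation bound matters in finite-key decoy-state
# analysis: for N trials with click probability p and failure probability eps, compare
# the deviation allowed by a Gaussian approximation with a Hoeffding-type bound.
import numpy as np
from scipy.stats import norm

N, p, eps = 1e9, 1e-3, 1e-10
gaussian = norm.ppf(1 - eps) * np.sqrt(N * p * (1 - p))     # heuristic Gaussian half-width
hoeffding = np.sqrt(0.5 * N * np.log(1 / eps))              # rigorous but loose for small p

print(f"expected clicks:     {N * p:.3e}")
print(f"Gaussian deviation:  {gaussian:.3e}")
print(f"Hoeffding deviation: {hoeffding:.3e}")
```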
Estimation of the time since death--reconsidering the re-establishment of rigor mortis.
Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter
2013-01-01
In forensic medicine, the data background for the phenomenon of re-establishment of rigor mortis after mechanical loosening, which is thought to occur up to 8 h post-mortem and is used in establishing time since death in forensic casework, is poorly defined. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased individuals at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than thought so far. These findings have a major impact on the estimation of time since death in forensic casework.
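Because several joints come from the same body, observations are clustered. As a simple stand-in for the random intercept model used in the study, the sketch below fits a logistic GEE with an exchangeable working correlation in statsmodels; the data are simulated, not the study's measurements.

```python
# Repeated joints within the same body are clustered, so a naive logistic regression would
# understate uncertainty. This sketch uses a logistic GEE with an exchangeable working
# correlation as a simple alternative to a random intercept model; data are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n_bodies, joints_per_body = 79, 4
body = np.repeat(np.arange(n_bodies), joints_per_body)
hpm = np.repeat(rng.uniform(2, 21, n_bodies), joints_per_body)   # hours post-mortem
logit = 1.5 - 0.15 * hpm + np.repeat(rng.normal(0, 0.5, n_bodies), joints_per_body)
reestablished = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"reestablished": reestablished, "hpm": hpm, "body": body})
model = smf.gee("reestablished ~ hpm", groups="body", data=df,
                family=sm.families.Binomial(), cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary().tables[1])
```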
Prompt assessment and management actions are required if we are to reduce the current rapid loss of habitat and biodiversity worldwide. Statistically valid quantification of the biota and habitat condition in water bodies are prerequisites for rigorous assessment of aquatic biodi...
The US Environmental Protection Agency (EPA) is revising its strategy to obtain the information needed to answer questions pertinent to water-quality management efficiently and rigorously at national scales. One tool of this revised strategy is use of statistically based surveys ...
Examining Multidimensional Middle Grade Outcomes after Early Elementary School Grade Retention
ERIC Educational Resources Information Center
Hwang, Sophia; Cappella, Elise; Schwartz, Kate
2016-01-01
Recently, researchers have begun to employ rigorous statistical methods and developmentally-informed theories to evaluate outcomes for students retained in non-kindergarten early elementary school. However, the majority of this research focuses on academic outcomes. Gaps remain regarding retention's effects on psychosocial outcomes important to…
1985-02-01
Statistical Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems... its present stage of development embodies a... Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical...
Beyond Composite Scores and Cronbach's Alpha: Advancing Methodological Rigor in Recreation Research
ERIC Educational Resources Information Center
Gagnon, Ryan J.; Stone, Garrett A.; Garst, Barry A.
2017-01-01
Critically examining common statistical approaches and their strengths and weaknesses is an important step in advancing recreation and leisure sciences. To continue this critical examination and to inform methodological decision making, this study compared three approaches to determine how alternative approaches may result in contradictory…
Ergodicity of Truncated Stochastic Navier Stokes with Deterministic Forcing and Dispersion
NASA Astrophysics Data System (ADS)
Majda, Andrew J.; Tong, Xin T.
2016-10-01
Turbulence in idealized geophysical flows is a very rich and important topic. The anisotropic effects of explicit deterministic forcing, dispersive effects from rotation due to the β-plane and F-plane, and topography together with random forcing all combine to produce a remarkable number of realistic phenomena. These effects have been studied through careful numerical experiments in the truncated geophysical models. These important results include transitions between coherent jets and vortices, and direct and inverse turbulence cascades as parameters are varied, and it is a contemporary challenge to explain these diverse statistical predictions. Here we contribute to these issues by proving with full mathematical rigor that for any values of the deterministic forcing, the β- and F-plane effects and topography, with minimal stochastic forcing, there is geometric ergodicity for any finite Galerkin truncation. This means that there is a unique smooth invariant measure which attracts all statistical initial data at an exponential rate. In particular, this rigorous statistical theory guarantees that there are no bifurcations to multiple stable and unstable statistical steady states as geophysical parameters are varied, in contrast to claims in the applied literature. The proof utilizes a new statistical Lyapunov function to account for enstrophy exchanges between the statistical mean and the variance fluctuations due to the deterministic forcing. It also requires careful proofs of hypoellipticity with geophysical effects and uses geometric control theory to establish reachability. To illustrate the necessity of these conditions, a two-dimensional example is developed which has the square of the Euclidean norm as the Lyapunov function and is hypoelliptic with nonzero noise forcing, yet fails to be reachable or ergodic.
Jaspard, Emmanuel; Macherel, David; Hunault, Gilles
2012-01-01
Late Embryogenesis Abundant Proteins (LEAPs) are ubiquitous proteins expected to play major roles in desiccation tolerance. Little is known about their structure-function relationships because of the scarcity of 3-D structures for LEAPs. The previous building of LEAPdb, a database dedicated to LEAPs from plants and other organisms, led to the classification of 710 LEAPs into 12 non-overlapping classes with distinct properties. Using this resource, numerous physico-chemical properties of LEAPs and amino acid usage by LEAPs have been computed and statistically analyzed, revealing distinctive features for each class. This unprecedented analysis allowed a rigorous characterization of the 12 LEAP classes, which differed also in multiple structural and physico-chemical features. Although most LEAPs can be predicted as intrinsically disordered proteins, the analysis indicates that LEAP class 7 (PF03168) and probably LEAP class 11 (PF04927) are natively folded proteins. This study thus provides a detailed description of the structural properties of this protein family, opening the path toward further LEAP structure-function analysis. Finally, since each LEAP class can be clearly characterized by a unique set of physico-chemical properties, this will allow development of software to predict proteins as LEAPs. PMID:22615859
Lustyk, M Kathleen B; Gerrish, Winslow G; Shaver, Shelley; Keys, Shaunie L
2009-04-01
We systematically reviewed empirical studies that investigated the use of cognitive-behavioral therapy (CBT) for premenstrual syndrome (PMS) or premenstrual dysphoric disorder (PMDD). Our multi-database search identified seven published empirical reports. Three were identified as randomized controlled trials (RCTs). The methods utilized to investigate therapeutic efficacy of CBT in these studies varied widely from case reports to RCTs with pharmacotherapy comparison groups. Initially we provide a brief overview of CBT and justifications for its potential use to treat PMS/PMDD. Next, we provide critical evaluations of the analyses used in each study focusing on the detection of intervention effects assessed by statistically significant time by group interactions. When possible we calculate effect sizes to elucidate the clinical significance of results. Our review revealed a dearth of evidence providing statistically significant CBT intervention effects. Issues such as overall time investment, latency to treatment effects, and complementary and combined therapies are considered. We present a theoretical argument for applying mindfulness- and acceptance-based CBT interventions to PMS/PMDD and suggest future research in this area. In conclusion, to produce the necessary evidence-base support for PMS/PMDD given the limited empirical evidence reported here, researchers are called on to produce methodologically rigorous investigations of psychosocial interventions for PMS/PMDD.
Schlägel, Ulrike E; Lewis, Mark A
2016-12-01
Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when resolution of data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
NASA Astrophysics Data System (ADS)
Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew
2007-04-01
One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, what is lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical comparison of the accuracy and performance of the hybrid framework with pure Bayesian and Fuzzy systems and with an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference over other fusion tools.
Forecasting volatility with neural regression: a contribution to model adequacy.
Refenes, A N; Holt, W T
2001-01-01
Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
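The basic residual diagnostic discussed can be illustrated as follows: fit a small neural regression and compute the Durbin-Watson statistic of its residuals. The generalized statistic and influence matrix for neural estimators developed in the paper are not reproduced; the data are simulated and scikit-learn/statsmodels are assumed available.

```python
# Basic version of the residual diagnostic: fit a small neural regression and compute the
# Durbin-Watson statistic of its residuals (values near 2 suggest little first-order
# autocorrelation in the residual sequence).
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(10)
n = 500
X = rng.normal(size=(n, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=n)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(X, y)
residuals = y - net.predict(X)
print("Durbin-Watson statistic of the residuals:", round(durbin_watson(residuals), 2))
```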
Das, Abhik; Tyson, Jon; Pedroza, Claudia; Schmidt, Barbara; Gantz, Marie; Wallace, Dennis; Truog, William E; Higgins, Rosemary D
2016-10-01
Impressive advances in neonatology have occurred over the 30 years of life of The Eunice Kennedy Shriver National Institute of Child Health and Human Development Neonatal Research Network (NRN). However, substantial room for improvement remains in investigating and further developing the evidence base for improving outcomes among the extremely premature. We discuss some of the specific methodological challenges in the statistical design and analysis of randomized trials and observational studies in this population. Challenges faced by the NRN include designing trials for unusual or rare outcomes, accounting for and explaining center variations, identifying other subgroup differences, and balancing safety and efficacy concerns between short-term hospital outcomes and longer-term neurodevelopmental outcomes. In conclusion, the constellation of unique patient characteristics in neonates calls for broad understanding and careful consideration of the issues identified in this article for conducting rigorous studies in this population. Copyright © 2016 Elsevier Inc. All rights reserved.
Transcriptional response according to strength of calorie restriction in Saccharomyces cerevisiae.
Lee, Yae-Lim; Lee, Cheol-Koo
2008-09-30
To characterize gene expression that is dependent on the strength of calorie restriction (CR), we obtained transcriptome at different levels of glucose, which is a major energy and carbon source for budding yeast. To faithfully mimic mammalian CR in yeast culture, we reconstituted and grew seeding yeast cells in fresh 2% YPD media before inoculating into 2%, 1%, 0.5% and 0.25% YPD media to reflect different CR strengths. We collected and characterized 160 genes that responded to CR strength based on the rigorous statistical analyses of multiple test corrected ANOVA (adjusted p
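The kind of analysis described can be sketched as a per-gene one-way ANOVA across the four glucose levels followed by multiple-test correction; the expression matrix below is simulated rather than the study's microarray data.

```python
# Sketch: per-gene one-way ANOVA across the four glucose levels (2%, 1%, 0.5%, 0.25%)
# followed by Benjamini-Hochberg correction of the p-values. Expression values are simulated.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(11)
n_genes, reps = 2000, 3
levels = ["2%", "1%", "0.5%", "0.25%"]
expr = {lvl: rng.normal(size=(n_genes, reps)) for lvl in levels}
expr["0.25%"][:160] += 1.0                      # 160 genes truly responsive to CR strength

pvals = np.array([f_oneway(*(expr[lvl][g] for lvl in levels)).pvalue
                  for g in range(n_genes)])
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("genes called responsive to CR strength:", reject.sum())
```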
Waggoner, Jane; Carline, Jan D; Durning, Steven J
2016-05-01
The authors of this article reviewed the methodology of three common consensus methods: nominal group process, consensus development panels, and the Delphi technique. The authors set out to determine how a majority of researchers are conducting these studies, how they are analyzing results, and subsequently the manner in which they are reporting their findings. The authors conclude with a set of guidelines and suggestions designed to aid researchers who choose to use the consensus methodology in their work.Overall, researchers need to describe their inclusion criteria. In addition to this, on the basis of the current literature the authors found that a panel size of 5 to 11 members was most beneficial across all consensus methods described. Lastly, the authors agreed that the statistical analyses done in consensus method studies should be as rigorous as possible and that the predetermined definition of consensus must be included in the ultimate manuscript. More specific recommendations are given for each of the three consensus methods described in the article.
Using the Depression Anxiety Stress Scale 21 (DASS-21) across cultures.
Oei, Tian P S; Sawang, Sukanlaya; Goh, Yong Wah; Mukhtar, Firdaus
2013-01-01
The DASS-21 is a well-established instrument for measuring depression, anxiety, and stress with good reliability and validity reported from Hispanic American, British, and Australian adults. However, the lack of appropriate validation among Asian populations continues to pose concerns over the use of DASS-21 in Asian samples. Cultural variation may influence the individual's experience and emotional expression. Thus, when researchers and practitioners employ Western-based assessments with Asian populations by directly translating them without an appropriate validation, the process can be challenging. We conducted a series of rigorous statistical tests and minimized any potential confounds from the demographic information. Following factor analyses, we performed multigroup analysis across six nations to demonstrate consistency of our findings. The advantages of this revised DASS-18 stress scale are twofold. First, it possesses fewer items, which results in a cleaner factorial structure. Second, it has a smaller interfactor correlation. With these justifications, the revised DASS-18 stress scale is potentially more suitable for Asian populations. Nonetheless, given limitations, findings should be considered preliminary.
NASA Astrophysics Data System (ADS)
Nikolic, Sasha; Suesse, Thomas F.; McCarthy, Timothy J.; Goldfinch, Thomas L.
2017-11-01
Few research papers have investigated the use of student evaluations of the laboratory, a learning medium usually run by teaching assistants who have little control over the content, delivery and equipment. Finding the right mix of teaching assistants for the laboratory can be an onerous task due to the many skills required, including theoretical and practical know-how, troubleshooting, safety and class management. For larger classes with multiple teaching assistants, a team-based teaching (TBT) format may be advantageous. A rigorous three-year study across twenty-five courses over repetitive laboratory classes is analysed using a multi-level statistical model considering students, laboratory classes and courses. The study is used to investigate the effectiveness of the TBT format, and quantify the influence each demonstrator has on the laboratory experience. The study found that TBT is effective and the lead demonstrator most influential, influencing up to 55% of the laboratory experience evaluation.
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
Probabilistic risk analysis of building contamination.
Bolster, D T; Tartakovsky, D M
2008-10-01
We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
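The fault-tree step of a PRA can be illustrated with a tiny example that combines independent component failure probabilities through OR and AND gates; the system structure and probabilities are invented.

```python
# Tiny fault-tree calculation of the kind used in a PRA: component failure probabilities
# are combined through OR gates (union, assuming independence) and AND gates (intersection).
def p_or(*probs):       # at least one independent event occurs
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):      # all independent events occur
    q = 1.0
    for p in probs:
        q *= p
    return q

p_filter_failure = 0.02          # HVAC filter fails
p_fan_failure = 0.01             # ventilation fan fails
p_source_present = 0.05          # contaminant source is introduced
p_remediation_fails = 0.10       # remediation measure ineffective

p_ventilation_down = p_or(p_filter_failure, p_fan_failure)
p_contamination = p_and(p_source_present, p_or(p_ventilation_down, p_remediation_fails))
print(f"probability of building contamination: {p_contamination:.4f}")
```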
QTest: Quantitative Testing of Theories of Binary Choice
Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495
Increased scientific rigor will improve reliability of research and effectiveness of management
Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.
2018-01-01
Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and unavoidable human biases. Offering only post hoc interpretations of statistical patterns (i.e., a posteriori hypotheses) adds to uncertainty because it increases the number of plausible biological explanations without determining which have the greatest support. Further, post hoc interpretations are strongly subject to human biases. Testing hypotheses maximizes the credibility of research findings, makes the strongest contributions to theory and management, and improves reproducibility of research. Management decisions based on rigorous research are most likely to result in effective conservation of wildlife resources.
A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.
Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao
2015-06-15
ChIP-seq is a powerful technology to measure the protein binding or histone modification strength in the whole genome scale. Although there are a number of methods available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical method for quantitative comparison of multiple ChIP-seq datasets with the considerations of data from control experiment, signal to noise ratios, biological variations and multiple-factor experimental designs is under-developed. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then union them to form a single set of candidate regions. The read counts from IP experiment at the candidate regions are assumed to follow Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through the hypothesis testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results compared with existing ones. An R software package ChIPComp is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
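A minimal Python sketch of the general idea, not the ChIPComp implementation (which is an R package): read counts at one candidate region are modelled with a Poisson GLM, log library size enters as an offset, and the condition coefficient is tested. All counts and library sizes below are invented.

```python
# Minimal sketch (not the ChIPComp implementation): model IP read counts at one
# candidate region with a Poisson GLM, using log library size as an offset and
# testing the condition coefficient. Counts and library sizes are made up.
import numpy as np
import statsmodels.api as sm

counts = np.array([35, 42, 38, 90, 85, 97])            # reads in the region, 3 replicates per condition
libsize = np.array([1.0e7, 1.2e7, 0.9e7, 1.1e7, 1.0e7, 1.3e7])
condition = np.array([0, 0, 0, 1, 1, 1])               # 0 = control-like, 1 = treatment-like

X = sm.add_constant(condition.astype(float))
model = sm.GLM(counts, X, family=sm.families.Poisson(), offset=np.log(libsize))
fit = model.fit()
print(fit.summary())
print("p-value for differential binding:", fit.pvalues[1])
```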
Scaling up to address data science challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, Joanne R.
2017-04-27
Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
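As a concrete, hedged example of the sampling and data reduction methods mentioned above (not drawn from the article), reservoir sampling keeps a uniform random subsample of fixed size from a stream too large to hold in memory:

```python
# Illustrative only (not from the article): reservoir sampling keeps a uniform
# random sample of fixed size k from a data stream of unknown length, a simple
# data-reduction technique for streams too large to hold in memory.
import random

def reservoir_sample(stream, k, seed=0):
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = rng.randint(0, i)        # each item survives with probability k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

print(reservoir_sample(range(10**6), k=5))
```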
NASA Astrophysics Data System (ADS)
Craig, S. E.; Lee, Z.; Du, K.; Lin, J.
2016-02-01
An approach based on empirical orthogonal function (EOF) analysis of ocean colour spectra has been shown to accurately derive inherent optical properties (IOPs) and chlorophyll concentration in scenarios, such as optically complex waters, where standard algorithms often perform poorly. The algorithm has been successfully used in a number of regional applications, and has also shown promise in a global implementation based on the NASA NOMAD data set. Additionally, it has demonstrated the unique ability to derive ocean colour products from top of atmosphere (TOA) signals with either no or minimal atmospheric correction applied. Due to its high potential for use over coastal and inland waters, the EOF approach is currently being rigorously characterised as part of a suite of approaches that will be used to support the new NASA ocean colour mission, PACE (Pre-Aerosol, Clouds and ocean Ecosystem). A major component in this model characterisation is the generation of a synthetic TOA data set using a coupled ocean-atmosphere radiative transfer model, which has been run to mimic PACE spectral resolution, and under a wide range of geographical locations, water constituent concentrations, and sea surface and atmospheric conditions. The resulting multidimensional data set will be analysed, and results presented on the sensitivity of the model to various combinations of parameters, and preliminary conclusions made regarding the optimal implementation strategy of this promising approach (e.g. on a global, optical water type or regional basis). This will provide vital guidance for operational implementation of the model for both existing satellite ocean colour sensors and the upcoming PACE mission.
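A generic Python sketch of the EOF idea using synthetic data, not the operational PACE algorithm: mean-centred spectra are decomposed with an SVD and log-chlorophyll is regressed on the leading EOF scores.

```python
# Generic sketch of the EOF approach (synthetic data, not the operational
# algorithm): decompose mean-centred reflectance spectra with an SVD and
# regress log-chlorophyll on the leading EOF scores.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_bands = 200, 15
spectra = rng.normal(size=(n_samples, n_bands))       # stand-in for Rrs spectra
log_chl = rng.normal(size=n_samples)                  # stand-in for log10 chlorophyll

anomalies = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
scores = U[:, :4] * s[:4]                             # amplitudes of the 4 leading EOF modes

# Ordinary least squares of log-chlorophyll on the EOF scores (plus intercept).
design = np.column_stack([np.ones(n_samples), scores])
coef, *_ = np.linalg.lstsq(design, log_chl, rcond=None)
print("regression coefficients:", np.round(coef, 3))
```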
Latest Results From the QuakeFinder Statistical Analysis Framework
NASA Astrophysics Data System (ADS)
Kappler, K. N.; MacLean, L. S.; Schneider, D.; Bleier, T.
2017-12-01
Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. There have been many reports of anomalous variations in the earth's magnetic field preceding earthquakes. Specifically, several authors have drawn attention to apparent anomalous pulsations seen preceding earthquakes. Often studies in long term monitoring of seismic activity are limited by availability of event data. It is particularly difficult to acquire a large dataset for rigorous statistical analyses of the magnetic field near earthquake epicenters because large events are relatively rare. Since QF has acquired hundreds of earthquakes in more than 70 TB of data, we developed an automated approach for finding statistical significance of precursory behavior and developed an algorithm framework. Previously QF reported on the development of an Algorithmic Framework for data processing and hypothesis testing. The particular instance of algorithm we discuss identifies and counts magnetic variations from time series data and ranks each station-day according to the aggregate number of pulses in a time window preceding the day in question. If the hypothesis is true that magnetic field activity increases over some time interval preceding earthquakes, this should reveal itself by the station-days on which earthquakes occur receiving higher ranks than they would if the ranking scheme were random. This can be analysed using the Receiver Operating Characteristic test. In this presentation we give a status report of our latest results, largely focussed on reproducibility of results, robust statistics in the presence of missing data, and exploring optimization landscapes in our parameter space.
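A hedged sketch of the ranking-and-ROC logic on simulated data (not QF data): station-days are scored by pulse counts, and the ROC AUC summarises whether earthquake days receive systematically higher scores.

```python
# Hedged sketch of the ranking idea: score each station-day by a pulse count in
# a preceding window and ask whether earthquake days receive systematically
# higher scores, summarised with an ROC AUC. Data are simulated, not QF data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_days = 5000
is_eq_day = rng.random(n_days) < 0.01                  # rare "earthquake day" labels
pulse_score = rng.poisson(5, n_days) + 3 * is_eq_day   # assumed modest signal on eq days

auc = roc_auc_score(is_eq_day, pulse_score)
print(f"AUC = {auc:.3f}  (0.5 would indicate a ranking no better than random)")
```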
Sex Differences in the Response of Children with ADHD to Once-Daily Formulations of Methylphenidate
ERIC Educational Resources Information Center
Sonuga-Barke, J. S.; Coghill, David; Markowitz, John S.; Swanson, James M.; Vandenberghe, Mieke; Hatch, Simon J.
2007-01-01
Objectives: Studies of sex differences in methylphenidate response by children with attention-deficit/hyperactivity disorder have lacked methodological rigor and statistical power. This paper reports an examination of sex differences based on further analysis of data from a comparison of two once-daily methylphenidate formulations (the COMACS…
The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research
ERIC Educational Resources Information Center
Harwell, Michael
2018-01-01
The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However the role of this software in facilitating supererogated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…
A Study of Statistics through Tootsie Pops
ERIC Educational Resources Information Center
Aaberg, Shelby; Vitosh, Jason; Smith, Wendy
2016-01-01
A classic TV commercial once asked, "How many licks does it take to get to the center of a Tootsie Roll Tootsie Pop?" The narrator claims, "The world may never know" (Tootsie Roll 2012), but an Internet search returns a multitude of answers, some of which include rigorous systematic approaches by academics to address the…
Meeting the needs of an ever-demanding market.
Rigby, Richard
2002-04-01
Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.
Traditionally, the EPA has monitored aquatic ecosystems using statistically rigorous sample designs and intensive field efforts which provide high quality datasets. But by their nature they leave many aquatic systems unsampled, follow a top down approach, have a long lag between ...
ERIC Educational Resources Information Center
Benton-Borghi, Beatrice Hope; Chang, Young Mi
2011-01-01
The National Center for Educational Statistics (NCES, 2010) continues to report substantial underachievement of diverse student populations in the nation's schools. After decades of focus on diversity and multicultural education, with integrating field and clinical practice, candidates continue to graduate without adequate knowledge, skills and…
State College- and Career-Ready High School Graduation Requirements. Updated
ERIC Educational Resources Information Center
Achieve, Inc., 2013
2013-01-01
Research by Achieve, ACT, and others suggests that for high school graduates to be prepared for success in a wide range of postsecondary settings, they need to take four years of challenging mathematics--covering Advanced Algebra; Geometry; and data, probability, and statistics content--and four years of rigorous English aligned with college- and…
High School Redesign. Diplomas Count, 2016. Education Week. Volume 35, Number 33
ERIC Educational Resources Information Center
Edwards, Virginia B., Ed.
2016-01-01
This year's report focuses on efforts to redesign high schools. Those include incorporating student voice, implementing a rigorous and relevant curriculum, embracing career exploration, and more. The report also includes the latest statistics on the nation's overall, on-time high school graduation rate. Articles include: (1) To Build a Better High…
Interactive visual analysis promotes exploration of long-term ecological data
T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst
2013-01-01
Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...
Alarms about structural alerts.
Alves, Vinicius; Muratov, Eugene; Capuzzi, Stephen; Politi, Regina; Low, Yen; Braga, Rodolpho; Zakharov, Alexey V; Sedykh, Alexander; Mokshyna, Elena; Farag, Sherif; Andrade, Carolina; Kuz'min, Victor; Fourches, Denis; Tropsha, Alexander
2016-08-21
Structural alerts are widely accepted in chemical toxicology and regulatory decision support as a simple and transparent means to flag potential chemical hazards or group compounds into categories for read-across. However, there has been a growing concern that alerts disproportionally flag too many chemicals as toxic, which questions their reliability as toxicity markers. Conversely, the rigorously developed and properly validated statistical QSAR models can accurately and reliably predict the toxicity of a chemical; however, their use in regulatory toxicology has been hampered by the lack of transparency and interpretability. We demonstrate that contrary to the common perception of QSAR models as "black boxes" they can be used to identify statistically significant chemical substructures (QSAR-based alerts) that influence toxicity. We show through several case studies, however, that the mere presence of structural alerts in a chemical, irrespective of the derivation method (expert-based or QSAR-based), should be perceived only as hypotheses of possible toxicological effect. We propose a new approach that synergistically integrates structural alerts and rigorously validated QSAR models for a more transparent and accurate safety assessment of new chemicals.
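A hedged sketch of testing one candidate alert for association with toxicity; the binary flags below are simulated, and real work would derive substructure matches from chemical structures with a cheminformatics toolkit.

```python
# Hedged sketch of a QSAR-style alert check: given binary flags for the presence
# of a substructure and for observed toxicity, test the association with
# Fisher's exact test. The arrays are invented; real work would derive the
# substructure flags from chemical structures (e.g. with a cheminformatics kit).
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(2)
has_alert = rng.random(500) < 0.3
toxic = (rng.random(500) < 0.2) | (has_alert & (rng.random(500) < 0.3))

table = [[np.sum(has_alert & toxic),  np.sum(has_alert & ~toxic)],
         [np.sum(~has_alert & toxic), np.sum(~has_alert & ~toxic)]]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.2e}")
```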
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
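A hedged sketch of one standard propagation technique (Monte Carlo), not the specific derivations in the paper; the quadratic calibration coefficients and their covariance below are assumed values.

```python
# Hedged sketch of one standard technique (Monte Carlo propagation), not the
# specific derivations in the paper: sample calibration coefficients from their
# estimated covariance and propagate them through the calibration curve to get
# a 95% uncertainty interval for the measured quantity. Values are assumed.
import numpy as np

rng = np.random.default_rng(3)
coef_mean = np.array([0.05, 2.00, 0.015])        # assumed quadratic calibration a0 + a1*v + a2*v**2
coef_cov = np.diag([1e-4, 4e-4, 1e-6])           # assumed covariance from the calibration fit

voltage = 1.37                                   # a single sensor reading
draws = rng.multivariate_normal(coef_mean, coef_cov, size=20000)
values = draws[:, 0] + draws[:, 1] * voltage + draws[:, 2] * voltage**2

lo, hi = np.percentile(values, [2.5, 97.5])
print(f"measurand = {values.mean():.4f}, 95% interval = ({lo:.4f}, {hi:.4f})")
```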
NASA Astrophysics Data System (ADS)
Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean
2016-04-01
A framework is presented within which we provide rigorous estimations for seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimations of models and their uncertainties based on data information. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Hence reliable estimation of model parameters and their uncertainties is possible, thus avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data of the North Korean nuclear explosion tests. By the combination of new Bayesian techniques and the structural model, coupled with meaningful uncertainties related to each of the processes, more quantitative monitoring and discrimination of seismic events is possible.
Does climate directly influence NPP globally?
Chu, Chengjin; Bartlett, Megan; Wang, Youshi; He, Fangliang; Weiner, Jacob; Chave, Jérôme; Sack, Lawren
2016-01-01
The need for rigorous analyses of climate impacts has never been more crucial. Current textbooks state that climate directly influences ecosystem annual net primary productivity (NPP), emphasizing the urgent need to monitor the impacts of climate change. A recent paper challenged this consensus, arguing, based on an analysis of NPP for 1247 woody plant communities across global climate gradients, that temperature and precipitation have negligible direct effects on NPP and only perhaps have indirect effects by constraining total stand biomass (Mtot) and stand age (a). The authors of that study concluded that the length of the growing season (lgs) might have a minor influence on NPP, an effect they considered not to be directly related to climate. In this article, we describe flaws that affected that study's conclusions and present novel analyses to disentangle the effects of stand variables and climate in determining NPP. We re-analyzed the same database to partition the direct and indirect effects of climate on NPP, using three approaches: maximum-likelihood model selection, independent-effects analysis, and structural equation modeling. These new analyses showed that about half of the global variation in NPP could be explained by Mtot combined with climate variables and supported strong and direct influences of climate independently of Mtot, both for NPP and for net biomass change averaged across the known lifetime of the stands (ABC = average biomass change). We show that lgs is an important climate variable, intrinsically correlated with, and contributing to mean annual temperature and precipitation (Tann and Pann), all important climatic drivers of NPP. Our analyses provide guidance for statistical and mechanistic analyses of climate drivers of ecosystem processes for predictive modeling and provide novel evidence supporting the strong, direct role of climate in determining vegetation productivity at the global scale. © 2015 John Wiley & Sons Ltd.
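A hedged sketch of the maximum-likelihood model-selection step only, on synthetic data rather than the NPP database: nested OLS models are compared with AIC to ask whether climate adds explanatory power beyond stand biomass.

```python
# Hedged sketch of the model-selection step only (synthetic data, not the NPP
# database): fit nested OLS models and compare them with AIC to ask whether
# climate adds explanatory power beyond total stand biomass.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 300
mtot = rng.normal(size=n)                      # stand-in for log total biomass
temp = rng.normal(size=n)                      # stand-in for mean annual temperature
precip = rng.normal(size=n)                    # stand-in for annual precipitation
npp = 0.6 * mtot + 0.3 * temp + 0.2 * precip + rng.normal(scale=0.5, size=n)

def fit(X):
    return sm.OLS(npp, sm.add_constant(X)).fit()

m_stand = fit(np.column_stack([mtot]))
m_climate = fit(np.column_stack([temp, precip]))
m_full = fit(np.column_stack([mtot, temp, precip]))
for name, m in [("stand only", m_stand), ("climate only", m_climate), ("combined", m_full)]:
    print(f"{name:12s} AIC = {m.aic:8.1f}  R^2 = {m.rsquared:.2f}")
```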
Silvestre, Dolores; Fraga, Miriam; Gormaz, María; Torres, Ester; Vento, Máximo
2014-07-01
The variability of human milk (HM) composition renders analysis of its components essential for optimal nutrition of preterm infants fed either donor's or their own mother's milk. To fulfil this requirement, various analytical instruments have been subjected to scientific and clinical evaluation. The objective of this study was to evaluate the suitability of a rapid method for the analysis of macronutrients in HM as compared with the analytical methods applied by the cow's milk industry. Mature milk from 39 donors was analysed using an infrared human milk analyser (HMA) and compared with biochemical reference laboratory methods. The statistical analysis was based on the use of paired data tests. The use of an infrared HMA for the analysis of lipids, proteins and lactose in HM proved satisfactory as regards the rapidity, simplicity and the required sample volume. The instrument afforded good linearity and precision in application to all three nutrients. However, accuracy was not acceptable when compared with the reference methods, with overestimation of the lipid content and underestimation of the amounts of protein and lactose. The use of mid-infrared HMA might become the standard for rapid analysis of HM once standardisation and rigorous and systematic calibration are provided. © 2012 John Wiley & Sons Ltd.
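A hedged sketch of a paired-data comparison of this kind on simulated values: a paired t-test for systematic bias plus a Bland-Altman-style mean difference with limits of agreement.

```python
# Hedged sketch of a paired-data comparison between a rapid analyser and a
# reference method (values are simulated, not study data): a paired t-test for
# systematic bias plus Bland-Altman-style mean difference and limits of agreement.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(5)
reference = rng.normal(3.5, 0.6, size=39)              # e.g. lipid content, g/100 mL
rapid = reference * 1.08 + rng.normal(0, 0.15, 39)     # assumed overestimating instrument

t_stat, p_value = ttest_rel(rapid, reference)
diff = rapid - reference
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3g}")
print(f"bias = {diff.mean():.3f}, 95% limits of agreement = "
      f"({diff.mean() - 1.96 * diff.std(ddof=1):.3f}, {diff.mean() + 1.96 * diff.std(ddof=1):.3f})")
```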
Loescher, Lois J; Rains, Stephen A; Kramer, Sandra S; Akers, Chelsie; Moussa, Renee
2018-05-01
To systematically review healthy lifestyle interventions targeted to adolescents and delivered using text messaging (TM). PubMed, Embase, CINAHL, PsycINFO, and Web of Science databases. Study Inclusion Criteria: Research articles published during 2011 to 2014; analyses focused on interventions targeting adolescents (10-19 years), with healthy lifestyle behaviors as main variables, delivered via mobile phone-based TM. The authors extracted data from 27 of 281 articles using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses method. Adolescent and setting characteristics, study design and rigor, intervention effectiveness, challenges, and risk of bias. Across studies, 16 (59.3%) of 27 included non-Caucasians. The gender was split for 22 (81.5%) of 27 studies. Thirteen studies were randomized controlled trials. There was heterogeneity among targeted conditions, rigor of methods, and intervention effects. Interventions for monitoring/adherence (n = 8) reported more positive results than those for health behavior change (n = 19). Studies that only included messages delivered via TM (n = 14) reported more positive effects than studies integrating multiple intervention components. Interventions delivered using TM presented minimal challenges, but selection and performance bias were observed across studies. Interventions delivered using TM have the potential, under certain conditions, to improve healthy lifestyle behaviors in adolescents. However, the rigor of studies varies, and established theory and validated measures have been inconsistently incorporated.
Peer Review Documents Related to the Evaluation of ...
BMDS is one of the Agency's premier tools for conducting risk assessments; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS application and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.
A criterion for establishing life limits. [for Space Shuttle Main Engine service
NASA Technical Reports Server (NTRS)
Skopp, G. H.; Porter, A. A.
1990-01-01
The development of a rigorous statistical method that would utilize hardware-demonstrated reliability to evaluate hardware capability and provide ground rules for safe flight margin is discussed. A statistics-based method using the Weibull/Weibayes cumulative distribution function is described. Its advantages and inadequacies are pointed out. Another, more advanced procedure, Single Flight Reliability (SFR), determines a life limit which ensures that the reliability of any single flight is never less than a stipulated value at a stipulated confidence level. Application of the SFR method is illustrated.
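A hedged numerical sketch of the single-flight reliability idea with invented Weibull parameters: the reliability of one additional flight after surviving t flights is R(t+1)/R(t), and the life limit is the largest t keeping that ratio above the stipulated value.

```python
# Hedged sketch of the single-flight reliability idea (parameter values are
# invented, not SSME data): with a Weibull life model R(t) = exp(-(t/eta)**beta),
# the reliability of one additional flight after surviving t flights is
# R(t+1)/R(t); the life limit is the largest t keeping that ratio above a
# stipulated value.
import numpy as np

beta, eta = 2.5, 400.0          # assumed Weibull shape and characteristic life (flights)
required = 0.999                # stipulated single-flight reliability

def R(t):
    return np.exp(-(t / eta) ** beta)

t = np.arange(0, 300)
single_flight = R(t + 1) / R(t)
life_limit = t[single_flight >= required].max()
print("allowable life limit (flights):", life_limit)
```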
Flint, Lorraine E.; Flint, Alan L.
2012-01-01
The methodology, which includes a sequence of rigorous analyses and calculations, is intended to reduce the addition of uncertainty to the climate data as a result of the downscaling while providing the fine-scale climate information necessary for ecological analyses. It results in new but consistent data sets for the US at 4 km, the southwest US at 270 m, and California at 90 m and illustrates the utility of fine-scale downscaling to analyses of ecological processes influenced by topographic complexity.
NASA Astrophysics Data System (ADS)
Mulligan, T.; Blake, J.; Spence, H. E.; Jordan, A. P.; Shaul, D.; Quenby, J.
2007-12-01
On August 20, 2006 a Forbush decrease observed at Polar in the Earth's magnetosphere was also seen at the INTEGRAL spacecraft outside the magnetosphere during a very active time in the solar wind. Data from Polar HIST and from INTEGRAL's Ge detector saturation rate (GEDSAT), which measures the GCR background with a threshold of ~200 MeV, show similar, short-period GCR variations in and around the Forbush decrease. The solar wind magnetic field and plasma conditions during this time reveal three interplanetary shocks present in the days leading up to and including the Forbush decrease. The first two shocks are driven by interplanetary coronal mass ejections (ICMEs) and the last one by a high-speed stream. However, the solar wind following these shocks and during the Forbush decrease is not particularly geoeffective. The Forbush decrease, which begins at ~1200 UT on August 20, 2006, is the largest intensity change during this active time, but there are many others on a variety of timescales. Looking at more than 14 consecutive hours of INTEGRAL and Polar data on August 21, 2006 shows great similarities in the time history of the measurements made aboard the two satellites coupled with differences that must be due to GCR variability on a scale size on the order of, or less than, their separation distance. Despite the spacecraft separation of over 25 Re, many of the larger intensity fluctuations remain identical at both satellites. Autocorrelation and power spectral analyses have shown these are not AR(n) processes and that these fluctuations are statistically significant. Such analyses can be done with high confidence because both detectors aboard Polar and INTEGRAL have large geometric factors that generate high count rates on the order of 1000 particles per spin, ensuring rigorous, statistically significant samples.
Establishing Interventions via a Theory-Driven Single Case Design Research Cycle
ERIC Educational Resources Information Center
Kilgus, Stephen P.; Riley-Tillman, T. Chris; Kratochwill, Thomas R.
2016-01-01
Recent studies have suggested single case design (SCD) intervention research is subject to publication bias, wherein studies are more likely to be published if they possess large or statistically significant effects and use rigorous experimental methods. The nature of SCD and the purposes for which it might be used could suggest that large effects…
ERIC Educational Resources Information Center
Johnson, Donald M.; Shoulders, Catherine W.
2017-01-01
As members of a profession committed to the dissemination of rigorous research pertaining to agricultural education, authors publishing in the Journal of Agricultural Education (JAE) must seek methods to evaluate and, when necessary, improve their research methods. The purpose of this study was to describe how authors of manuscripts published in…
Sean Healey; Warren Cohen; Gretchen Moisen
2007-01-01
The need for current information about the effects of fires, harvest, and storms is evident in many areas of sustainable forest management. While there are several potential sources of this information, each source has its limitations. Generally speaking, the statistical rigor associated with traditional forest sampling is an important asset in any monitoring effort....
Preschool Center Care Quality Effects on Academic Achievement: An Instrumental Variables Analysis
ERIC Educational Resources Information Center
Auger, Anamarie; Farkas, George; Burchinal, Margaret R.; Duncan, Greg J.; Vandell, Deborah Lowe
2014-01-01
Much of child care research has focused on the effects of the quality of care in early childhood settings on children's school readiness skills. Although researchers increased the statistical rigor of their approaches over the past 15 years, researchers' ability to draw causal inferences has been limited because the studies are based on…
Statistical linearization for multi-input/multi-output nonlinearities
NASA Technical Reports Server (NTRS)
Lin, Ching-An; Cheng, Victor H. L.
1991-01-01
Formulas are derived for the computation of the random input-describing functions for MIMO nonlinearities; these straightforward and rigorous derivations are based on the optimal mean square linear approximation. The computations involve evaluations of multiple integrals. It is shown that, for certain classes of nonlinearities, multiple-integral evaluations are obviated and the computations are significantly simplified.
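A hedged numerical sketch of the single-input case: for a zero-mean Gaussian input, the statistical-linearization (describing-function) gain is E[f(x)x]/E[x^2], estimated here by Monte Carlo for a saturation nonlinearity rather than by evaluating the integral in closed form.

```python
# Hedged numerical sketch of the single-input case (not the MIMO formulas of the
# paper): for a zero-mean Gaussian input, the random-input describing function
# (statistical-linearization gain) is E[f(x) x] / E[x^2]; here it is estimated
# by Monte Carlo for a saturation nonlinearity instead of a closed-form integral.
import numpy as np

rng = np.random.default_rng(6)
sigma = 1.5
x = rng.normal(0.0, sigma, size=1_000_000)

def saturation(u, limit=1.0):
    return np.clip(u, -limit, limit)

gain = np.mean(saturation(x) * x) / np.mean(x**2)
print(f"describing-function gain for sigma = {sigma}: {gain:.4f}")
```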
Slow off the Mark: Elementary School Teachers and the Crisis in STEM Education
ERIC Educational Resources Information Center
Epstein, Diana; Miller, Raegen T.
2011-01-01
Prospective teachers can typically obtain a license to teach elementary school without taking a rigorous college-level STEM class such as calculus, statistics, or chemistry, and without demonstrating a solid grasp of mathematics knowledge, scientific knowledge, or the nature of scientific inquiry. This is not a recipe for ensuring students have…
ERIC Educational Resources Information Center
Scrutton, Roger; Beames, Simon
2015-01-01
Outdoor adventure education (OAE) has a long history of being credited with the personal and social development (PSD) of its participants. PSD is notoriously difficult to measure quantitatively, yet stakeholders demand statistical evidence that given approaches to eliciting PSD are effective in their methods. Rightly or wrongly, many stakeholders…
Cahn, David B; Handorf, Elizabeth A; Ghiraldi, Eric M; Ristau, Benjamin T; Geynisman, Daniel M; Churilla, Thomas M; Horwitz, Eric M; Sobczak, Mark L; Chen, David Y T; Viterbo, Rosalia; Greenberg, Richard E; Kutikov, Alexander; Uzzo, Robert G; Smaldone, Marc C
2017-11-15
The current study was performed to examine temporal trends and compare overall survival (OS) in patients undergoing radical cystectomy (RC) or bladder-preservation therapy (BPT) for muscle-invasive urothelial carcinoma of the bladder. The authors reviewed the National Cancer Data Base to identify patients with AJCC stage II to III urothelial carcinoma of the bladder from 2004 through 2013. Patients receiving BPT were stratified as having received any external-beam radiotherapy (any XRT), definitive XRT (50-80 grays), and definitive XRT with chemotherapy (CRT). Treatment trends and OS outcomes for the BPT and RC cohorts were evaluated using Cochran-Armitage tests, unadjusted Kaplan-Meier curves, adjusted Cox multivariate regression, and propensity score matching, using increasingly stringent selection criteria. A total of 32,300 patients met the inclusion criteria and were treated with RC (22,680 patients) or BPT (9620 patients). Of the patients treated with BPT, 26.4% (2540 patients) and 15.5% (1489 patients), respectively, were treated with definitive XRT and CRT. Improved OS was observed for RC in all groups. After adjustments with more rigorous statistical models controlling for confounders and with more restrictive BPT cohorts, the magnitude of the OS benefit became attenuated on multivariate (any XRT: hazard ratio [HR], 2.115 [95% confidence interval [95% CI], 2.045-2.188]; definitive XRT: HR, 1.870 [95% CI, 1.773-1.972]; and CRT: HR, 1.578 [95% CI, 1.474-1.691]) and propensity score (any XRT: HR, 2.008 [95% CI, 1.871-2.154]; definitive XRT: HR, 1.606 [95% CI, 1.453-1.776]; and CRT: HR, 1.406 [95% CI, 1.235-1.601]) analyses. In the National Cancer Data Base, receipt of BPT was associated with decreased OS compared with RC in patients with stage II to III urothelial carcinoma. Increasingly stringent definitions of BPT and more rigorous statistical methods adjusting for selection biases attenuated observed survival differences. Cancer 2017;123:4337-45. © 2017 American Cancer Society.
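A much-simplified, hedged sketch of propensity-score matching on simulated data (the NCDB analysis also involved Cox regression, which is omitted here): treatment propensity is estimated by logistic regression and treated patients are matched 1:1 to the nearest control on the logit scale.

```python
# Hedged, simplified sketch of propensity-score matching on simulated data (not
# the NCDB analysis): estimate the propensity to receive treatment with logistic
# regression, then form 1:1 nearest-neighbour matches (with replacement) on the
# logit of the propensity score.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000
age = rng.normal(70, 8, n)
stage3 = rng.random(n) < 0.4
treated = rng.random(n) < 1 / (1 + np.exp(-(0.05 * (age - 70) + 0.8 * stage3 - 0.5)))

X = np.column_stack([age, stage3.astype(float)])
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
logit = np.log(ps / (1 - ps))

controls = np.where(~treated)[0]
matches = [controls[np.argmin(np.abs(logit[controls] - logit[i]))]
           for i in np.where(treated)[0]]
print("matched pairs formed:", len(matches))
```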
NASA Astrophysics Data System (ADS)
Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker
2018-04-01
A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as "field" or "global" significance. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. The methodology is exemplarily applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Monthly temperature climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. In winter and in most regions in summer, the downscaled distributions are statistically indistinguishable from the observed ones. A systematic cold summer bias occurs in deep river valleys due to overestimated elevations, in coastal areas due probably to enhanced sea breeze circulation, and over large lakes due to the interpolation of water temperatures. Urban areas in concave topography forms have a warm summer bias due to the strong heat islands, not reflected in the observations. WRF-NOAH generates appropriate fine-scale features in the monthly temperature field over regions of complex topography, but over spatially homogeneous areas even small biases can lead to significant deteriorations relative to the driving reanalysis. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
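A hedged sketch in the spirit of the field-significance step described above: Benjamini-Hochberg false-discovery-rate control is applied to simulated grid-cell p-values so that the proportion of falsely rejected local tests is bounded.

```python
# Hedged sketch of a field-significance step in the spirit described above:
# apply Benjamini-Hochberg false-discovery-rate control to the grid-cell
# p-values so that the proportion of falsely rejected local tests is bounded.
# The p-values here are simulated, not WRF output.
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(8)
p_local = np.concatenate([rng.uniform(size=900),          # cells with no real difference
                          rng.beta(0.2, 5.0, size=100)])  # cells with a genuine signal

reject, p_adj, _, _ = multipletests(p_local, alpha=0.05, method="fdr_bh")
print("locally significant cells after FDR control:", reject.sum())
print("field significant (any rejection):", bool(reject.any()))
```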
A strategy to estimate unknown viral diversity in mammals.
Anthony, Simon J; Epstein, Jonathan H; Murray, Kris A; Navarrete-Macias, Isamara; Zambrana-Torrelio, Carlos M; Solovyov, Alexander; Ojeda-Flores, Rafael; Arrigo, Nicole C; Islam, Ariful; Ali Khan, Shahneaz; Hosseini, Parviez; Bogich, Tiffany L; Olival, Kevin J; Sanchez-Leon, Maria D; Karesh, William B; Goldstein, Tracey; Luby, Stephen P; Morse, Stephen S; Mazet, Jonna A K; Daszak, Peter; Lipkin, W Ian
2013-09-03
The majority of emerging zoonoses originate in wildlife, and many are caused by viruses. However, there are no rigorous estimates of total viral diversity (here termed "virodiversity") for any wildlife species, despite the utility of this to future surveillance and control of emerging zoonoses. In this case study, we repeatedly sampled a mammalian wildlife host known to harbor emerging zoonotic pathogens (the Indian Flying Fox, Pteropus giganteus) and used PCR with degenerate viral family-level primers to discover and analyze the occurrence patterns of 55 viruses from nine viral families. We then adapted statistical techniques used to estimate biodiversity in vertebrates and plants and estimated the total viral richness of these nine families in P. giganteus to be 58 viruses. Our analyses demonstrate proof-of-concept of a strategy for estimating viral richness and provide the first statistically supported estimate of the number of undiscovered viruses in a mammalian host. We used a simple extrapolation to estimate that there are a minimum of 320,000 mammalian viruses awaiting discovery within these nine families, assuming all species harbor a similar number of viruses, with minimal turnover between host species. We estimate the cost of discovering these viruses to be ~$6.3 billion (or ~$1.4 billion for 85% of the total diversity), which if annualized over a 10-year study time frame would represent a small fraction of the cost of many pandemic zoonoses. Recent years have seen a dramatic increase in viral discovery efforts. However, most lack rigorous systematic design, which limits our ability to understand viral diversity and its ecological drivers and reduces their value to public health intervention. Here, we present a new framework for the discovery of novel viruses in wildlife and use it to make the first-ever estimate of the number of viruses that exist in a mammalian host. As pathogens continue to emerge from wildlife, this estimate allows us to put preliminary bounds around the potential size of the total zoonotic pool and facilitates a better understanding of where best to allocate resources for the subsequent discovery of global viral diversity.
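A hedged sketch of a Chao1-style richness estimate from detection frequencies; the paper adapted several related estimators, and the counts below are invented.

```python
# Hedged sketch of a Chao1-style richness estimate from detection frequencies
# (the paper adapted several related estimators; the counts below are invented).
import numpy as np

detections = np.array([9, 7, 5, 4, 3, 3, 2, 2, 1, 1, 1, 1, 1])  # times each virus was seen
s_obs = len(detections)
f1 = np.sum(detections == 1)          # singletons
f2 = np.sum(detections == 2)          # doubletons

# Bias-corrected Chao1 (valid even when f2 = 0).
s_chao1 = s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))
print(f"observed richness = {s_obs}, estimated total richness ~ {s_chao1:.1f}")
```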
Normalization, bias correction, and peak calling for ChIP-seq
Diaz, Aaron; Park, Kiyoub; Lim, Daniel A.; Song, Jun S.
2012-01-01
Next-generation sequencing is rapidly transforming our ability to profile the transcriptional, genetic, and epigenetic states of a cell. In particular, sequencing DNA from the immunoprecipitation of protein-DNA complexes (ChIP-seq) and methylated DNA (MeDIP-seq) can reveal the locations of protein binding sites and epigenetic modifications. These approaches contain numerous biases which may significantly influence the interpretation of the resulting data. Rigorous computational methods for detecting and removing such biases are still lacking. Also, multi-sample normalization still remains an important open problem. This theoretical paper systematically characterizes the biases and properties of ChIP-seq data by comparing 62 separate publicly available datasets, using rigorous statistical models and signal processing techniques. Statistical methods for separating ChIP-seq signal from background noise, as well as correcting enrichment test statistics for sequence-dependent and sonication biases, are presented. Our method effectively separates reads into signal and background components prior to normalization, improving the signal-to-noise ratio. Moreover, most peak callers currently use a generic null model which suffers from low specificity at the sensitivity level requisite for detecting subtle, but true, ChIP enrichment. The proposed method of determining a cell type-specific null model, which accounts for cell type-specific biases, is shown to be capable of achieving a lower false discovery rate at a given significance threshold than current methods. PMID:22499706
Retrospective Analysis of a Classical Biological Control Programme
USDA-ARS?s Scientific Manuscript database
1. Classical biological control has been a key technology in the management of invasive arthropod pests globally for over 120 years, yet rigorous quantitative evaluations of programme success or failure are rare. Here, I used life table and matrix model analyses, and life table response experiments ...
Increasing rigor in NMR-based metabolomics through validated and open source tools
Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L
2016-01-01
The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism’s phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic spectroscopy (NMR), mass spectrometry (MS), and the published literature, as processed by statistical approaches, are driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. PMID:27643760
Prasifka, J R; Hellmich, R L; Dively, G P; Higgins, L S; Dixon, P M; Duan, J J
2008-02-01
One of the possible adverse effects of transgenic insecticidal crops is the unintended decline in the abundance of nontarget arthropods. Field trials designed to evaluate potential nontarget effects can be more complex than expected because decisions to conduct field trials and the selection of taxa to include are not always guided by the results of laboratory tests. Also, recent studies emphasize the potential for indirect effects (adverse impacts to nontarget arthropods without feeding directly on plant tissues), which are difficult to predict because of interactions among nontarget arthropods, target pests, and transgenic crops. As a consequence, field studies may attempt to monitor expansive lists of arthropod taxa, making the design of such broad studies more difficult and reducing the likelihood of detecting any negative effects that might be present. To improve the taxonomic focus and statistical rigor of future studies, existing field data and corresponding power analysis may provide useful guidance. Analysis of control data from several nontarget field trials using repeated-measures designs suggests that while detection of small effects may require considerable increases in replication, there are taxa from different ecological roles that are sampled effectively using standard methods. The use of statistical power to guide selection of taxa for nontarget trials reflects scientists' inability to predict the complex interactions among arthropod taxa, particularly when laboratory trials fail to provide guidance on which groups are more likely to be affected. However, scientists still may exercise judgment, including taxa that are not included in or supported by power analyses.
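A hedged sketch of the kind of calculation involved, using a simple two-group t-test power analysis (real repeated-measures designs need more specialised formulas): how many replicates are needed to detect small, medium, and large standardized effects with 80% power?

```python
# Hedged sketch of the kind of power calculation involved (a simple two-group
# t-test; real repeated-measures field designs need more specialised formulas):
# replicates required per group for 80% power at alpha = 0.05.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for effect_size in (0.2, 0.5, 0.8):          # small, medium, large standardized effects
    n = analysis.solve_power(effect_size=effect_size, alpha=0.05, power=0.8)
    print(f"effect size {effect_size}: ~{n:.0f} replicates per group")
```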
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumgartner, S.; Bieli, R.; Bergmann, U. C.
2012-07-01
An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse in collaboration with KKL and Axpo - operator and owner of the Leibstadt NPP - has developed an optimized CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set-up for the analyses of Anticipated Operation Occurrences (AOOs) and accidents. In the Monte Carlo approach a statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to all other uncertainties affecting CPR. Emphasis is put on quantifying the statistical distribution of channel bow throughout the core using measurement data. The optimized CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from the introduction at KKL. (authors)
Kilborn, Joshua P; Jones, David L; Peebles, Ernst B; Naar, David F
2017-04-01
Clustering data continues to be a highly active area of data analysis, and resemblance profiles are being incorporated into ecological methodologies as a hypothesis testing-based approach to clustering multivariate data. However, these new clustering techniques have not been rigorously tested to determine the performance variability based on the algorithm's assumptions or any underlying data structures. Here, we use simulation studies to estimate the statistical error rates for the hypothesis test for multivariate structure based on dissimilarity profiles (DISPROF). We concurrently tested a widely used algorithm that employs the unweighted pair group method with arithmetic mean (UPGMA) to estimate the proficiency of clustering with DISPROF as a decision criterion. We simulated unstructured multivariate data from different probability distributions with increasing numbers of objects and descriptors, and grouped data with increasing overlap, overdispersion for ecological data, and correlation among descriptors within groups. Using simulated data, we measured the resolution and correspondence of clustering solutions achieved by DISPROF with UPGMA against the reference grouping partitions used to simulate the structured test datasets. Our results highlight the dynamic interactions between dataset dimensionality, group overlap, and the properties of the descriptors within a group (i.e., overdispersion or correlation structure) that are relevant to resemblance profiles as a clustering criterion for multivariate data. These methods are particularly useful for multivariate ecological datasets that benefit from distance-based statistical analyses. We propose guidelines for using DISPROF as a clustering decision tool that will help future users avoid potential pitfalls during the application of methods and the interpretation of results.
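A hedged sketch of the UPGMA step only (the DISPROF test itself is not implemented in SciPy): an average-linkage dendrogram is built from Bray-Curtis dissimilarities of simulated multivariate samples and cut into groups.

```python
# Hedged sketch of the UPGMA step only (the DISPROF test itself is not
# implemented in SciPy): build an average-linkage (UPGMA) dendrogram from
# Bray-Curtis dissimilarities of simulated multivariate samples and cut it
# into groups.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(9)
group_a = rng.poisson(5, size=(20, 12))
group_b = rng.poisson(12, size=(20, 12))
abundances = np.vstack([group_a, group_b]).astype(float)

dissim = pdist(abundances, metric="braycurtis")
tree = linkage(dissim, method="average")          # UPGMA
labels = fcluster(tree, t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```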
CORSSA: The Community Online Resource for Statistical Seismicity Analysis
Michael, Andrew J.; Wiemer, Stefan
2010-01-01
Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.
Wafer, Lucas; Kloczewiak, Marek; Luo, Yin
2016-07-01
Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique.
Validation of AIRS/AMSU Cloud Retrievals Using MODIS Cloud Analyses
NASA Technical Reports Server (NTRS)
Molnar, Gyula I.; Susskind, Joel
2005-01-01
The AIRS/AMSU (flying on the EOS-AQUA satellite) sounding retrieval methodology allows for the retrieval of key atmospheric/surface parameters under partially cloudy conditions (Susskind et al.). In addition, cloud parameters are also derived from the AIRS/AMSU observations. Within each AIRS footprint, cloud parameters at up to 2 cloud layers are determined with differing cloud top pressures and effective (product of infrared emissivity at 11 microns and physical cloud fraction) cloud fractions. However, so far the AIRS cloud product has not been rigorously evaluated/validated. Fortunately, collocated/coincident radiances measured by MODIS/AQUA (at a much lower spectral resolution but roughly an order-of-magnitude higher spatial resolution than that of AIRS) are used to determine analogous cloud products from MODIS. This affords us a rather rare and interesting possibility: the intercomparison and mutual validation of imager- vs. sounder-based cloud products obtained from the same satellite positions. First, we present results of small-scale (granules) instantaneous intercomparisons. Next, we will evaluate differences of temporally averaged (monthly) means as well as the representation of inter-annual variability of cloud parameters as presented by the two cloud data sets. In particular, we present statistical differences in the retrieved parameters of cloud fraction and cloud top pressure. We will investigate what type of cloud systems are retrieved most consistently (if any) with both retrieval schemes, and attempt to assess reasons behind statistically significant differences.
Genetic Epidemiology of Glucose-6-Phosphate Dehydrogenase Deficiency in the Arab World.
Doss, C George Priya; Alasmar, Dima R; Bux, Reem I; Sneha, P; Bakhsh, Fadheela Dad; Al-Azwani, Iman; Bekay, Rajaa El; Zayed, Hatem
2016-11-17
A systematic search was implemented using four literature databases (PubMed, Embase, Science Direct and Web of Science) to capture all the causative mutations of Glucose-6-phosphate dehydrogenase (G6PD) deficiency (G6PDD) in the 22 Arab countries. Our search yielded 43 studies that captured 33 mutations (23 missense, one silent, two deletions, and seven intronic mutations) in 3,430 Arab patients with G6PDD. The 23 missense mutations were then subjected to phenotypic classification using in silico prediction tools, which were compared to the WHO pathogenicity scale as a reference. These in silico tools were tested for their predictive efficiency using rigorous statistical analyses. Of the 23 missense mutations, p.S188F, p.I48T, p.N126D, and p.V68M were identified as the most common mutations among Arab populations, but were not unique to the Arab world. Interestingly, our search strategy found four other mutations (p.N135T, p.S179N, p.R246L, and p.Q307P) that are unique to Arabs. These mutations were subjected to structural analysis and molecular dynamics simulation analysis (MDSA), which predicted that these mutant forms potentially affect enzyme function. The combination of the MDSA, structural analysis, in silico predictions, and statistical tools used here will provide a platform for more accurate future prediction of the pathogenicity of genetic mutations.
This research will quantify the extent of de facto reuse of untreated wastewater at the global scale. Through the integration of multiple existing spatial data sources, this project will produce rigorous analyses assessing the relationship between wastewater irrigation, hea...
Gender bias affects forests worldwide
Marlène Elias; Susan S Hummel; Bimbika S Basnett; Carol J.P. Colfer
2017-01-01
Gender biases persist in forestry research and practice. These biases result in reduced scientific rigor and inequitable, ineffective, and less efficient policies, programs, and interventions. Drawing from a two-volume collection of current and classic analyses on gender in forests, we outline five persistent and inter-related themes: gendered governance, tree tenure,...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Spencer; Rodrigues, George, E-mail: george.rodrigues@lhsc.on.ca; Department of Epidemiology/Biostatistics, University of Western Ontario, London
2013-01-01
Purpose: To perform a rigorous technological assessment and statistical validation of a software technology for anatomic delineations of the prostate on MRI datasets. Methods and Materials: A 3-phase validation strategy was used. Phase I consisted of anatomic atlas building using 100 prostate cancer MRI data sets to provide training data sets for the segmentation algorithms. In phase II, 2 experts contoured 15 new MRI prostate cancer cases using 3 approaches (manual, N points, and region of interest). In phase III, 5 new physicians with variable MRI prostate contouring experience segmented the same 15 phase II datasets using 3 approaches: manual, N points with no editing, and full autosegmentation with user editing allowed. Statistical analyses for time and accuracy (using Dice similarity coefficient) endpoints used traditional descriptive statistics, analysis of variance, analysis of covariance, and pooled Student t test. Results: In phase I, average (SD) total and per slice contouring time for the 2 physicians was 228 (75), 17 (3.5), 209 (65), and 15 (3.9) seconds, respectively. In phase II, statistically significant differences in physician contouring time were observed based on physician, type of contouring, and case sequence. The N points strategy resulted in superior segmentation accuracy when initial autosegmented contours were compared with final contours. In phase III, statistically significant differences in contouring time were observed based on physician, type of contouring, and case sequence again. The average relative time savings for N points and autosegmentation were 49% and 27%, respectively, compared with manual contouring. The N points and autosegmentation strategies resulted in average Dice values of 0.89 and 0.88, respectively. Pre- and postedited autosegmented contours demonstrated a higher average Dice similarity coefficient of 0.94. Conclusion: The software provided robust contours with minimal editing required. Observed time savings were seen for all physicians irrespective of experience level and baseline manual contouring speed.
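A hedged sketch with toy masks rather than MRI data: the Dice similarity coefficient reported above is 2|A ∩ B| / (|A| + |B|) for two binary segmentation masks.

```python
# Hedged sketch (toy masks, not MRI data): the Dice similarity coefficient used
# above is 2|A ∩ B| / (|A| + |B|) for two binary segmentation masks.
import numpy as np

def dice(mask_a, mask_b):
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(10)
auto = rng.random((64, 64)) > 0.6
edited = auto.copy()
edited[rng.random((64, 64)) > 0.97] ^= True      # simulate small manual edits
print(f"Dice = {dice(auto, edited):.3f}")
```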
A Comparison of Alternate Approaches to Creating Indices of Academic Rigor. Research Report 2012-11
ERIC Educational Resources Information Center
Beatty, Adam S.; Sackett, Paul R.; Kuncel, Nathan R.; Kiger, Thomas B.; Rigdon, Jana L.; Shen, Winny; Walmsley, Philip T.
2013-01-01
In recent decades, there has been an increasing emphasis placed on college graduation rates and reducing attrition due to the social and economic benefits, at both the individual and national levels, proposed to accrue from a more highly educated population (Bureau of Labor Statistics, 2011). In the United States in particular, there is a concern…
Comparing the Rigor of Compressed Format Courses to Their Regular Semester Counterparts
ERIC Educational Resources Information Center
Lutes, Lyndell; Davies, Randall
2013-01-01
This study compared workloads of undergraduate courses taught in 16-week and 8-week sessions. A statistically significant difference in workload was found between the two. Based on survey data from approximately 29,000 students, on average students spent about 17 minutes more per credit per week on 16-week courses than on similar 8-week courses.…
Statistical tests and measures for the presence and influence of digit preference
Jay Beaman; Grenier Michel
1998-01-01
Digit preference, which is in essence a preference for reporting certain numbers, has often been described as the heaping or rounding of responses to numbers ending in zero or five. Number preference (NP) has been a topic in the social science literature for some years. However, until recently the concepts were not specified rigorously enough to allow, for example, the estimation of...
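A common first check for digit preference is a chi-square goodness-of-fit test on the terminal digits of reported values, under the null hypothesis that all ten digits are equally likely. The sketch below illustrates only that generic idea; it is not the specific tests and measures developed by the authors.

```python
import numpy as np
from scipy.stats import chisquare

def terminal_digit_test(values):
    """Chi-square test for uniformity of terminal digits (0-9)."""
    digits = np.abs(np.asarray(values, dtype=int)) % 10
    observed = np.bincount(digits, minlength=10)
    stat, p = chisquare(observed)  # expected counts default to uniform
    return observed, stat, p

# Example with simulated heaping on multiples of 5 (hypothetical survey responses).
rng = np.random.default_rng(1)
responses = rng.integers(1, 200, size=500)
heaped = np.where(rng.random(500) < 0.3, (responses // 5) * 5, responses)
counts, stat, p = terminal_digit_test(heaped)
print(counts, f"chi2={stat:.1f}", f"p={p:.3g}")
```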
A new assessment of the alleged link between element 115 and element 117 decay chains
NASA Astrophysics Data System (ADS)
Forsberg, U.; Rudolph, D.; Fahlander, C.; Golubev, P.; Sarmiento, L. G.; Åberg, S.; Block, M.; Düllmann, Ch. E.; Heßberger, F. P.; Kratz, J. V.; Yakushev, A.
2016-09-01
A novel rigorous statistical treatment is applied to available data (May 9, 2016) from search and spectroscopy experiments on the elements with atomic numbers Z = 115 and Z = 117. The present analysis implies that the hitherto proposed cross-reaction link between α-decay chains associated with the isotopes ²⁹³117 and ²⁸⁹115 is highly improbable.
ERIC Educational Resources Information Center
Arbaugh, J. B.; Hwang, Alvin
2013-01-01
Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…
Statistical rigor in LiDAR-assisted estimation of aboveground forest biomass
Timothy G. Gregoire; Erik Næsset; Ronald E. McRoberts; Göran Ståhl; Hans Andersen; Terje Gobakken; Liviu Ene; Ross Nelson
2016-01-01
For many decades remotely sensed data have been used as a source of auxiliary information when conducting regional or national surveys of forest resources. In the past decade, airborne scanning LiDAR (Light Detection and Ranging) has emerged as a promising tool for sample surveys aimed at improving estimation of aboveground forest biomass. This technology is now...
ERIC Educational Resources Information Center
Stoneberg, Bert D.
2015-01-01
The National Center for Education Statistics conducted a mapping study that equated the percentage proficient or above on each state's NCLB reading and mathematics tests in grades 4 and 8 to the NAEP scale. Each "NAEP equivalent score" was labeled according to NAEP's achievement levels and used to compare state proficiency standards and…
Code of Federal Regulations, 2012 CFR
2012-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0Pollution Prevention 14... fluids. 1.4This method has been designed to show positive contamination for 5% of representative crude....1Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2013 CFR
2013-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0Pollution Prevention 14... fluids. 1.4This method has been designed to show positive contamination for 5% of representative crude....1Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2014 CFR
2014-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0Pollution Prevention 14... fluids. 1.4This method has been designed to show positive contamination for 5% of representative crude....1Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Code of Federal Regulations, 2011 CFR
2011-07-01
... rigorous statistical experimental design and interpretation (Reference 16.4). 14.0Pollution Prevention 14... oil contamination in drilling fluids. 1.4This method has been designed to show positive contamination....1Sample collection bottles/jars—New, pre-cleaned bottles/jars, lot-certified to be free of artifacts...
Bayesian Inference: with ecological applications
Link, William A.; Barker, Richard J.
2010-01-01
This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics. The book opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
ERIC Educational Resources Information Center
Horne, Lela M.; Rachal, John R.; Shelley, Kyna
2012-01-01
A mixed methods framework utilized quantitative and qualitative data to determine whether statistically significant differences existed between high school and GED[R] student perceptions of credential value. An exploratory factor analysis (n=326) extracted four factors and then a MANOVA procedure was performed with a stratified quota sample…
Sonuga-Barke, Edmund J S; Brandeis, Daniel; Cortese, Samuele; Daley, David; Ferrin, Maite; Holtmann, Martin; Stevenson, Jim; Danckaerts, Marina; van der Oord, Saskia; Döpfner, Manfred; Dittmann, Ralf W; Simonoff, Emily; Zuddas, Alessandro; Banaschewski, Tobias; Buitelaar, Jan; Coghill, David; Hollis, Chris; Konofal, Eric; Lecendreux, Michel; Wong, Ian C K; Sergeant, Joseph
2013-03-01
Nonpharmacological treatments are available for attention deficit hyperactivity disorder (ADHD), although their efficacy remains uncertain. The authors undertook meta-analyses of the efficacy of dietary (restricted elimination diets, artificial food color exclusions, and free fatty acid supplementation) and psychological (cognitive training, neurofeedback, and behavioral interventions) ADHD treatments. Using a common systematic search and a rigorous coding and data extraction strategy across domains, the authors searched electronic databases to identify published randomized controlled trials that involved individuals who were diagnosed with ADHD (or who met a validated cutoff on a recognized rating scale) and that included an ADHD outcome. Fifty-four of the 2,904 nonduplicate screened records were included in the analyses. Two different analyses were performed. When the outcome measure was based on ADHD assessments by raters closest to the therapeutic setting, all dietary (standardized mean differences=0.21-0.48) and psychological (standardized mean differences=0.40-0.64) treatments produced statistically significant effects. However, when the best probably blinded assessment was employed, effects remained significant for free fatty acid supplementation (standardized mean difference=0.16) and artificial food color exclusion (standardized mean difference=0.42) but were substantially attenuated to nonsignificant levels for other treatments. Free fatty acid supplementation produced small but significant reductions in ADHD symptoms even with probably blinded assessments, although the clinical significance of these effects remains to be determined. Artificial food color exclusion produced larger effects but often in individuals selected for food sensitivities. Better evidence for efficacy from blinded assessments is required for behavioral interventions, neurofeedback, cognitive training, and restricted elimination diets before they can be supported as treatments for core ADHD symptoms.
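The standardized mean differences pooled above are typically computed per trial as Hedges' g, that is, Cohen's d with a small-sample correction. A minimal sketch of that calculation follows; the group means, standard deviations, and sample sizes are invented for illustration and are not taken from the included trials.

```python
import numpy as np

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Hedges' g) with its approximate variance."""
    sp = np.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp                     # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n_t + n_c) - 9.0)      # small-sample correction factor
    g = j * d
    var_g = j**2 * ((n_t + n_c) / (n_t * n_c) + d**2 / (2.0 * (n_t + n_c)))
    return g, var_g

# Hypothetical symptom scores (lower = fewer symptoms) for one trial.
print(hedges_g(mean_t=18.2, sd_t=6.1, n_t=40, mean_c=21.0, sd_c=6.4, n_c=38))
```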
Adult asthma disease management: an analysis of studies, approaches, outcomes, and methods.
Maciejewski, Matthew L; Chen, Shih-Yin; Au, David H
2009-07-01
Disease management has been implemented for patients with asthma in various ways. We describe the approaches to and components of adult asthma disease-management interventions, examine the outcomes evaluated, and assess the quality of published studies. We searched the MEDLINE, EMBASE, CINAHL, PsychInfo, and Cochrane databases for studies published in 1986 through 2008, on adult asthma management. With the studies that met our inclusion criteria, we examined the clinical, process, medication, economic, and patient-reported outcomes reported, and the study designs, provider collaboration during the studies, and statistical methods. Twenty-nine articles describing 27 studies satisfied our inclusion criteria. There was great variation in the content, extent of collaboration between physician and non-physician providers responsible for intervention delivery, and outcomes examined across the 27 studies. Because of limitations in the design of 22 of the 27 studies, the differences in outcomes assessed, and the lack of rigorous statistical adjustment, we could not draw definitive conclusions about the effectiveness or cost-effectiveness of the asthma disease-management programs or which approach was most effective. Few well-designed studies with rigorous evaluations have been conducted to evaluate disease-management interventions for adults with asthma. Current evidence is insufficient to recommend any particular intervention.
NASA Astrophysics Data System (ADS)
Bao, Zhenkun; Li, Xiaolong; Luo, Xiangyang
2017-01-01
Extracting informative statistical features is the most essential technical issue in steganalysis. Among the various steganalysis methods, probability density function (PDF) and characteristic function (CF) moments are two important types of features because of their excellent ability to distinguish cover images from stego images. The two types of features are quite similar in definition; the only difference is that PDF moments are computed in the spatial domain, while CF moments are computed in the Fourier-transformed domain. The comparison between PDF and CF moments is therefore an interesting question in steganalysis. Several theoretical results have been derived, and CF moments have been proved better than PDF moments in some cases. However, in the log prediction error wavelet subband of a wavelet decomposition, some experiments show the opposite result, which has lacked a rigorous explanation. To solve this problem, a comparison result based on a rigorous proof is presented: the first-order PDF moment is proved better than the CF moment, while the second-order CF moment is better than the PDF moment. This work aims to open a theoretical discussion on steganalysis and on the question of finding suitable statistical features.
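For readers unfamiliar with the two feature families, the sketch below computes an n-th order PDF moment and CF moment from the histogram of a coefficient array, following one common formulation in the steganalysis literature. It is an illustration rather than the authors' exact feature definitions, and the synthetic Laplacian coefficients stand in for a real wavelet subband.

```python
import numpy as np

def pdf_and_cf_moments(coeffs, order=2, bins=201):
    """Illustrative n-th order PDF and CF moments of a coefficient histogram."""
    hist, edges = np.histogram(coeffs, bins=bins)
    pdf = hist / hist.sum()                      # empirical probability mass
    centers = 0.5 * (edges[:-1] + edges[1:])
    pdf_moment = np.sum(np.abs(centers) ** order * pdf)

    cf = np.fft.fft(hist)                        # characteristic function via DFT of the histogram
    half = len(cf) // 2
    mag = np.abs(cf[1:half])                     # drop DC term, keep positive frequencies
    freqs = np.arange(1, half)
    cf_moment = np.sum(freqs ** order * mag) / np.sum(mag)
    return pdf_moment, cf_moment

# Example on synthetic Laplacian-like coefficients (stand-in for a wavelet subband).
rng = np.random.default_rng(0)
coeffs = rng.laplace(scale=2.0, size=10_000)
print(pdf_and_cf_moments(coeffs, order=1))
print(pdf_and_cf_moments(coeffs, order=2))
```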
The Relationship Between Professional Burnout and Quality and Safety in Healthcare: A Meta-Analysis.
Salyers, Michelle P; Bonfils, Kelsey A; Luther, Lauren; Firmin, Ruth L; White, Dominique A; Adams, Erin L; Rollins, Angela L
2017-04-01
Healthcare provider burnout is considered a factor in quality of care, yet little is known about the consistency and magnitude of this relationship. This meta-analysis examined relationships between provider burnout (emotional exhaustion, depersonalization, and reduced personal accomplishment) and the quality (perceived quality, patient satisfaction) and safety of healthcare. Publications were identified through targeted literature searches in Ovid MEDLINE, PsycINFO, Web of Science, CINAHL, and ProQuest Dissertations & Theses through March of 2015. Two coders extracted data to calculate effect sizes and potential moderators. We calculated Pearson's r for all independent relationships between burnout and quality measures, using a random effects model. Data were assessed for potential impact of study rigor, outliers, and publication bias. Eighty-two studies including 210,669 healthcare providers were included. Statistically significant negative relationships emerged between burnout and quality (r = -0.26, 95 % CI [-0.29, -0.23]) and safety (r = -0.23, 95 % CI [-0.28, -0.17]). In both cases, the negative relationship implied that greater burnout among healthcare providers was associated with poorer-quality healthcare and reduced safety for patients. Moderators for the quality relationship included dimension of burnout, unit of analysis, and quality data source. Moderators for the relationship between burnout and safety were safety indicator type, population, and country. Rigor of the study was not a significant moderator. This is the first study to systematically, quantitatively analyze the links between healthcare provider burnout and healthcare quality and safety across disciplines. Provider burnout shows consistent negative relationships with perceived quality (including patient satisfaction), quality indicators, and perceptions of safety. Though the effects are small to medium, the findings highlight the importance of effective burnout interventions for healthcare providers. Moderator analyses suggest contextual factors to consider for future study.
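Pooling correlation coefficients under a random-effects model, as in this meta-analysis, is commonly done by Fisher z-transforming each study's r and estimating the between-study variance with the DerSimonian-Laird method. The sketch below illustrates that standard procedure with invented study values; it is not the authors' dataset or exact analysis pipeline.

```python
import numpy as np

def random_effects_pooled_r(r, n):
    """DerSimonian-Laird random-effects pooling of Pearson correlations via Fisher z."""
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)                          # Fisher z-transform
    v = 1.0 / (n - 3.0)                        # within-study variance of z
    w = 1.0 / v
    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)         # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(z) - 1)) / c)    # between-study variance
    w_star = 1.0 / (v + tau2)
    z_pooled = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
    return np.tanh(z_pooled), ci

# Hypothetical burnout-quality correlations from five studies.
print(random_effects_pooled_r(r=[-0.30, -0.22, -0.28, -0.18, -0.25],
                              n=[120, 450, 90, 300, 210]))
```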
Zeni, Mary Beth
2012-03-01
The purpose of this study was to evaluate if paediatric asthma educational intervention studies included in the Cochrane Collaboration database incorporated concepts of health literacy. Inclusion criteria were established to identify review categories in the Cochrane Collaboration database specific to paediatric asthma educational interventions. Articles that met the inclusion criteria were selected from the Cochrane Collaboration database in 2010. The health literacy definition from Healthy People 2010 was used to develop a 4-point a priori rating scale to determine the extent a study reported aspects of health literacy in the development of an educational intervention for parents and/or children. Five Cochrane review categories met the inclusion criteria; 75 studies were rated for health literacy content regarding educational interventions with families and children living with asthma. A priori criteria were used for the rating process. While 52 (69%) studies had no information pertaining to health literacy, 23 (31%) reported an aspect of health literacy. Although all studies maintained the rigorous standards of randomized clinical trials, a model of health literacy was not reported regarding the design and implementation of interventions. While a more comprehensive health literacy model for the development of educational interventions with families and children may have been available after the reviewed studies were conducted, general literacy levels still could have been addressed. The findings indicate a need to incorporate health literacy in the design of client-centred educational interventions and in the selection criteria of relevant Cochrane reviews. Inclusion assures that health literacy is as important as randomization and statistical analyses in the research design of educational interventions and may even assure participation of people with literacy challenges. © 2012 The Author. International Journal of Evidence-Based Healthcare © 2012 The Joanna Briggs Institute.
Chiu, Grace S; Wu, Margaret A; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.
NASA Astrophysics Data System (ADS)
Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.
2015-12-01
Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
Overview of Aro Program on Network Science for Human Decision Making
NASA Astrophysics Data System (ADS)
West, Bruce J.
This program brings together researchers from disparate disciplines to work on a complex research problem that defies confinement within any single discipline. Consequently, not only are new and rewarding solutions sought and obtained for a problem of importance to society and the Army, that is, the human dimension of complex networks, but, in addition, collaborations are established that would not otherwise have formed given the traditional disciplinary compartmentalization of research. This program develops the basic research foundation of a science of networks supporting the linkage between the physical and human (cognitive and social) domains as they relate to human decision making. The strategy is to extend the recent methods of non-equilibrium statistical physics to non-stationary, renewal stochastic processes that appear to be characteristic of the interactions among nodes in complex networks. We also pursue understanding of the phenomenon of synchronization, whose mathematical formulation has recently provided insight into how complex networks reach accommodation and cooperation. The theoretical analyses of complex networks, although mathematically rigorous, often elude analytic solutions and require computer simulation and computation to analyze the underlying dynamic process.
Intranasal Oxytocin: Myths and Delusions.
Leng, Gareth; Ludwig, Mike
2016-02-01
Despite widespread reports that intranasal application of oxytocin has a variety of behavioral effects, very little of the huge amounts applied intranasally appears to reach the cerebrospinal fluid. However, peripheral concentrations are increased to supraphysiologic levels, with likely effects on diverse targets including the gastrointestinal tract, heart, and reproductive tract. The wish to believe in the effectiveness of intranasal oxytocin appears to be widespread and needs to be guarded against with scepticism and rigor. Preregistering trials, declaring primary and secondary outcomes in advance, specifying the statistical methods to be applied, and making all data openly available should minimize problems of publication bias and questionable post hoc analyses. Effects of intranasal oxytocin also need proper dose-response studies, and such studies need to include control subjects for peripheral effects, by administering oxytocin peripherally and by blocking peripheral actions with antagonists. Reports in the literature of oxytocin measurements include many that have been made with discredited methodology. Claims that peripheral measurements of oxytocin reflect central release are questionable at best. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems
NASA Astrophysics Data System (ADS)
Gogolin, Christian; Eisert, Jens
2016-05-01
We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.
NASA Astrophysics Data System (ADS)
Röpke, G.
2018-01-01
One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.
Statistical Characterization and Classification of Edge-Localized Plasma Instabilities
NASA Astrophysics Data System (ADS)
Webster, A. J.; Dendy, R. O.
2013-04-01
The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
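Fitting a Weibull distribution to inter-ELM waiting times and checking the fit can be sketched as follows. The waiting times here are simulated stand-ins rather than JET data, and note that a Kolmogorov-Smirnov test against parameters estimated from the same sample is only an approximate check; the paper's comparison is more rigorous.

```python
import numpy as np
from scipy import stats

# Hypothetical ELM waiting times (seconds); real data would come from JET diagnostics.
rng = np.random.default_rng(42)
waiting_times = stats.weibull_min.rvs(c=1.8, scale=0.02, size=500, random_state=rng)

# Fit a two-parameter Weibull (location fixed at zero) and test the fit.
shape, loc, scale = stats.weibull_min.fit(waiting_times, floc=0)
ks_stat, p_value = stats.kstest(waiting_times, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.2f}, scale={scale:.4f}, KS p-value={p_value:.3f}")
```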
NASA Astrophysics Data System (ADS)
Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem
2018-05-01
In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on the experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in the fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.
Risk and return: evaluating Reverse Tracing of Precursors earthquake predictions
NASA Astrophysics Data System (ADS)
Zechar, J. Douglas; Zhuang, Jiancang
2010-09-01
In 2003, the Reverse Tracing of Precursors (RTP) algorithm attracted the attention of seismologists and international news agencies when researchers claimed two successful predictions of large earthquakes. These researchers had begun applying RTP to seismicity in Japan, California, the eastern Mediterranean and Italy; they have since applied it to seismicity in the northern Pacific, Oregon and Nevada. RTP is a pattern recognition algorithm that uses earthquake catalogue data to declare alarms, and these alarms indicate that RTP expects a moderate to large earthquake in the following months. The spatial extent of alarms is highly variable and each alarm typically lasts 9 months, although the algorithm may extend alarms in time and space. We examined the record of alarms and outcomes since the prospective application of RTP began, and in this paper we report on the performance of RTP to date. To analyse these predictions, we used a recently developed approach based on a gambling score, and we used a simple reference model to estimate the prior probability of target earthquakes for each alarm. Formally, we believe that RTP investigators did not rigorously specify the first two `successful' predictions in advance of the relevant earthquakes; because this issue is contentious, we consider analyses with and without these alarms. When we included contentious alarms, RTP predictions demonstrate statistically significant skill. Under a stricter interpretation, the predictions are marginally unsuccessful.
Tilov, Boris; Dimitrova, Donka; Stoykova, Maria; Tornjova, Bianka; Foreva, Gergana; Stoyanov, Drozdstoj
2012-12-01
Health-care professions have long been considered prone to work-related stress, yet recent research in Bulgaria indicates alarmingly high levels of burnout. Cloninger's inventory is used to analyse and evaluate the correlation between personality characteristics and the degree of burnout syndrome manifestation among the risk categories of health-care professionals. The primary goal of this study was to test the conceptual validity and cross-cultural applicability of the revised TCI (TCI-R), developed in the United States, in a culturally, socially and economically diverse setting. Linguistic validation, test-retest studies, and statistical and expert analyses were performed to assess the cross-cultural applicability of the revised Cloninger Temperament and Character Inventory in Bulgarian, as well as its reliability, internal consistency and construct validity. The overall internal consistency of the TCI-R and its scales, together with the interscale and test-retest correlations, shows that the translated version of the questionnaire is acceptable and cross-culturally applicable for the purposes of studying organizational stress and burnout risk in health-care professionals. In general, the cross-cultural adaptation process, even when carried out rigorously, does not always lead to the best target version; this suggests it would be useful to develop new scales specific to each culture and, at the same time, to consider trans-cultural adaptation. © 2012 Blackwell Publishing Ltd.
Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene
2016-04-01
Many widely-used impact evaluation designs, including randomized controlled trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences (UCs) of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives; the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies 9 ways in which UCs can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Dialyzer Reuse with Peracetic Acid Does Not Impact Patient Mortality
Bond, T. Christopher; Krishnan, Mahesh; Wilson, Steven M.; Mayne, Tracy
2011-01-01
Background and objectives: Numerous studies have shown the overall benefits of dialysis filter reuse, including superior biocompatibility and decreased nonbiodegradable medical waste generation, without increased risk of mortality. A recent study reported that dialyzer reprocessing was associated with decreased patient survival; however, it did not control for sources of potential confounding. We sought to determine the effect of dialyzer reprocessing with peracetic acid on patient mortality using contemporary outcomes data and rigorous analytical techniques. Design, setting, participants, & measurements: We conducted a series of analyses of hemodialysis patients examining the effects of reuse on mortality using three techniques to control for potential confounding: instrumental variables, propensity-score matching, and time-dependent survival analysis. Results: In the instrumental variables analysis, patients at high reuse centers had 16.2 versus 15.9 deaths/100 patient-years in nonreuse centers. In the propensity-score matched analysis, patients with reuse had a lower death rate per 100 patient-years than those without reuse (15.2 versus 15.5). The risk ratios for the time-dependent survival analyses were 0.993 (per percent of sessions with reuse) and 0.995 (per unit of last reuse), respectively. Over the study period, 13.8 million dialyzers were saved, representing 10,000 metric tons of medical waste. Conclusions: Despite the large sample size, powered to detect minuscule effects, neither the instrumental variables nor propensity-matched analyses were statistically significant. The time-dependent survival analysis showed a protective effect of reuse. These data are consistent with the preponderance of evidence showing reuse limits medical waste generation without negatively affecting clinical outcomes. PMID:21566107
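One of the three confounding-control techniques used above, propensity-score matching, can be sketched in a few lines: model the probability of exposure from covariates, then match exposed to unexposed subjects with similar scores. The example below uses simulated data and a greedy 1:1 matcher; it is not the study's actual model, covariates, or matching algorithm.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def nearest_neighbor_match(ps, treated):
    """Greedy 1:1 nearest-neighbour matching on the propensity score."""
    treated_idx = np.where(treated == 1)[0]
    control_idx = np.where(treated == 0)[0]
    available = set(control_idx)
    pairs = []
    for i in treated_idx:
        if not available:
            break
        j = min(available, key=lambda k: abs(ps[k] - ps[i]))
        pairs.append((i, j))
        available.remove(j)
    return pairs

# Simulated example: X = patient covariates, t = reuse exposure, y = outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))
t = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.3 * X[:, 1]))))
y = 0.2 * X[:, 0] + rng.normal(size=2000)          # no true treatment effect

ps = LogisticRegression().fit(X, t).predict_proba(X)[:, 1]
pairs = nearest_neighbor_match(ps, t)
effect = np.mean([y[i] - y[j] for i, j in pairs])
print(f"matched-pair mean difference: {effect:.3f}")
```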
NASA Astrophysics Data System (ADS)
Blanchard, Philippe; Hellmich, Mario; Ługiewicz, Piotr; Olkiewicz, Robert
Quantum mechanics is the greatest revision of our conception of the character of the physical world since Newton. Consequently, David Hilbert was very interested in quantum mechanics. He and John von Neumann discussed it frequently during von Neumann's residence in Göttingen. Von Neumann published his book Mathematical Foundations of Quantum Mechanics in 1932; in Hilbert's opinion it was the first mathematically rigorous exposition of quantum mechanics. The pioneers of quantum mechanics, Heisenberg and Dirac, had neither use for rigorous mathematics nor much interest in it. Conceptually, quantum theory as developed by Bohr and Heisenberg is based on the positivism of Mach, as it describes only observable quantities. It first emerged from experimental data in the form of statistical observations of quantum noise, the basic concept of quantum probability.
Kiebish, Michael A.; Yang, Kui; Han, Xianlin; Gross, Richard W.; Chuang, Jeffrey
2012-01-01
The regulation and maintenance of the cellular lipidome through biosynthetic, remodeling, and catabolic mechanisms are critical for biological homeostasis during development, health and disease. These complex mechanisms control the architectures of lipid molecular species, which have diverse yet highly regulated fatty acid chains at both the sn1 and sn2 positions. Phosphatidylcholine (PC) and phosphatidylethanolamine (PE) serve as the predominant biophysical scaffolds in membranes, acting as reservoirs for potent lipid signals and regulating numerous enzymatic processes. Here we report the first rigorous computational dissection of the mechanisms influencing PC and PE molecular architectures from high-throughput shotgun lipidomic data. Using novel statistical approaches, we have analyzed multidimensional mass spectrometry-based shotgun lipidomic data from developmental mouse heart and mature mouse heart, lung, brain, and liver tissues. We show that in PC and PE, sn1 and sn2 positions are largely independent, though for low abundance species regulatory processes may interact with both the sn1 and sn2 chain simultaneously, leading to cooperative effects. Chains with similar biochemical properties appear to be remodeled similarly. We also see that sn2 positions are more regulated than sn1, and that PC exhibits stronger cooperative effects than PE. A key aspect of our work is a novel statistically rigorous approach to determine cooperativity based on a modified Fisher's exact test using Markov Chain Monte Carlo sampling. This computational approach provides a novel tool for developing mechanistic insight into lipidomic regulation. PMID:22662143
Quality and Rigor of the Concept Mapping Methodology: A Pooled Study Analysis
ERIC Educational Resources Information Center
Rosas, Scott R.; Kane, Mary
2012-01-01
The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative…
Immigrants as New Speakers in Galicia and Wales: Issues of Integration, Belonging and Legitimacy
ERIC Educational Resources Information Center
Bermingham, Nicola; Higham, Gwennan
2018-01-01
Immigrant integration in nation states increasingly focuses on the importance of learning the national state language. This is evidenced by increased emphasis on rigorous language testing and tighter citizenship regulations. This paper analyses immigrant integration in two sub-state contexts, Galicia and Wales, where presence of a national…
ERIC Educational Resources Information Center
Kinsler, Paul; Favaro, Alberto; McCall, Martin W.
2009-01-01
The Poynting vector is an invaluable tool for analysing electromagnetic problems. However, even a rigorous stress-energy tensor approach can still leave us with the question: is it best defined as E x H or as D x B? Typical electromagnetic treatments provide yet another perspective: they regard E x B as the appropriate definition, because E and B…
An IRT Analysis of Preservice Teacher Self-Efficacy in Technology Integration
ERIC Educational Resources Information Center
Browne, Jeremy
2011-01-01
The need for rigorously developed measures of preservice teacher traits regarding technology integration training has been acknowledged (Kay 2006), but such instruments are still extremely rare. The Technology Integration Confidence Scale (TICS) represents one such measure, but past analyses of its functioning have been limited by sample size and…
ERIC Educational Resources Information Center
Zhong, Hua; Schwartz, Jennifer
2010-01-01
Underage drinking is among the most serious of public health problems facing adolescents in the United States. Recent concerns have centered on young women, reflected in media reports and arrest statistics on their increasing problematic alcohol use. This study rigorously examined whether girls' alcohol use rose by applying time series methods to…
Decomposition of the Inequality of Income Distribution by Income Types—Application for Romania
NASA Astrophysics Data System (ADS)
Andrei, Tudorel; Oancea, Bogdan; Richmond, Peter; Dhesi, Gurjeet; Herteliu, Claudiu
2017-09-01
This paper identifies the salient factors that characterize the inequality income distribution for Romania. Data analysis is rigorously carried out using sophisticated techniques borrowed from classical statistics (Theil). Decomposition of the inequalities measured by the Theil index is also performed. This study relies on an exhaustive (11.1 million records for 2014) data-set for total personal gross income of Romanian citizens.
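The Theil T index and its within/between decomposition can be computed directly from microdata. The sketch below decomposes inequality across population subgroups, a standard textbook form; the paper's decomposition by income types follows a related but distinct procedure, and the incomes and labels here are simulated.

```python
import numpy as np

def theil_t(x):
    """Theil T inequality index for strictly positive incomes."""
    x = np.asarray(x, float)
    mu = x.mean()
    return np.mean((x / mu) * np.log(x / mu))

def theil_decomposition(x, groups):
    """Within-group and between-group components of the Theil T index."""
    x = np.asarray(x, float)
    groups = np.asarray(groups)
    mu = x.mean()
    within = between = 0.0
    for g in np.unique(groups):
        xg = x[groups == g]
        share = xg.sum() / x.sum()               # group's share of total income
        within += share * theil_t(xg)
        between += share * np.log(xg.mean() / mu)
    return within, between

# Hypothetical incomes labelled by source (wage vs. self-employment).
rng = np.random.default_rng(3)
income = np.concatenate([rng.lognormal(8.0, 0.4, 700), rng.lognormal(8.6, 0.9, 300)])
labels = np.array(["wage"] * 700 + ["self"] * 300)
w, b = theil_decomposition(income, labels)
print(f"total={theil_t(income):.4f}, within={w:.4f}, between={b:.4f}")
```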
NASA Astrophysics Data System (ADS)
Anishchenko, V. S.; Boev, Ya. I.; Semenova, N. I.; Strelkova, G. I.
2015-07-01
We review rigorous and numerical results on the statistics of Poincaré recurrences which are related to the modern development of the Poincaré recurrence problem. We analyze and describe the rigorous results which are achieved both in the classical (local) approach and in the recently developed global approach. These results are illustrated by numerical simulation data for simple chaotic and ergodic systems. It is shown that the basic theoretical laws can be applied to noisy systems if the probability measure is ergodic and stationary. Poincaré recurrences are studied numerically in nonautonomous systems. Statistical characteristics of recurrences are analyzed in the framework of the global approach for the cases of positive and zero topological entropy. We show that for positive entropy, there is a relationship between the Afraimovich-Pesin dimension, Lyapunov exponents and the Kolmogorov-Sinai entropy both without and in the presence of external noise. The case of zero topological entropy is exemplified by numerical results for the Poincaré recurrence statistics in the circle map. We prove that the dependence of minimal recurrence times on the return region size demonstrates universal properties for the golden and the silver ratio. The behavior of Poincaré recurrences is analyzed at the critical point of Feigenbaum attractor birth. We explore Poincaré recurrences for an ergodic set which is generated in the stroboscopic section of a nonautonomous oscillator and is similar to a circle shift. Based on the obtained results we show how the Poincaré recurrence statistics can be applied for solving a number of nonlinear dynamics issues. We propose and illustrate alternative methods for diagnosing effects of external and mutual synchronization of chaotic systems in the context of the local and global approaches. The properties of the recurrence time probability density can be used to detect the stochastic resonance phenomenon. We also discuss how the fractal dimension of chaotic attractors can be estimated using the Poincaré recurrence statistics.
A Bayesian nonparametric method for prediction in EST analysis
Lijoi, Antonio; Mena, Ramsés H; Prünster, Igor
2007-01-01
Background: Expressed sequence tag (EST) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed with sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results: In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; b) the number of new unique genes to be observed in a future sample; c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion: The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample. PMID:17868445
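To give the flavour of such predictions, the sketch below uses the simplest Bayesian nonparametric prior, a one-parameter Dirichlet process, for which the expected number of new genes in a further sample has a closed form. The paper employs richer Gibbs-type priors, and the read and gene counts here are hypothetical.

```python
import numpy as np
from scipy.special import digamma

def expected_new_genes_dp(theta, n, m):
    """E[number of new distinct genes] in m further reads under a Dirichlet process prior."""
    return theta * (digamma(theta + n + m) - digamma(theta + n))

def estimate_theta_dp(n, k, grid=np.linspace(0.1, 5000, 20000)):
    """Crude moment-matching estimate of the DP concentration from n reads with k distinct genes."""
    # E[K | n] = theta * (digamma(theta + n) - digamma(theta)); pick theta matching k.
    expected_k = grid * (digamma(grid + n) - digamma(grid))
    return grid[np.argmin(np.abs(expected_k - k))]

# Hypothetical EST library: 10,000 reads so far revealing 2,400 distinct genes.
theta = estimate_theta_dp(n=10_000, k=2_400)
print(f"theta ≈ {theta:.0f}")
print(f"expected new genes in 5,000 more reads ≈ "
      f"{expected_new_genes_dp(theta, 10_000, 5_000):.0f}")
```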
Weak value amplification considered harmful
NASA Astrophysics Data System (ADS)
Ferrie, Christopher; Combes, Joshua
2014-03-01
We show using statistically rigorous arguments that the technique of weak value amplification does not perform better than standard statistical techniques for the tasks of parameter estimation and signal detection. We show that using all data and considering the joint distribution of all measurement outcomes yields the optimal estimator. Moreover, we show estimation using the maximum likelihood technique with weak values as small as possible produces better performance for quantum metrology. In doing so, we identify the optimal experimental arrangement to be the one which reveals the maximal eigenvalue of the square of system observables. We also show these conclusions do not change in the presence of technical noise.
Maximum entropy models as a tool for building precise neural controls.
Savin, Cristina; Tkačik, Gašper
2017-10-01
Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.
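The surrogate-ensemble idea can be illustrated with the first-order (independent) maximum entropy model, which preserves each neuron's firing rate while destroying correlations; observed statistics can then be compared against this null. The sketch below covers only that minimal case with simulated spike trains; pairwise and higher-order MaxEnt fits require iterative algorithms not shown here.

```python
import numpy as np

def rate_preserving_surrogates(spikes, n_surrogates=100, rng=None):
    """Surrogate ensembles from the independent (first-order) maximum entropy model.

    `spikes` is a (time_bins, neurons) binary array; each surrogate preserves every
    neuron's firing rate but destroys correlations, giving a null for pairwise tests.
    """
    rng = np.random.default_rng(rng)
    rates = spikes.mean(axis=0)                   # per-neuron firing probabilities
    t, n = spikes.shape
    return rng.random((n_surrogates, t, n)) < rates

# Example: compare an observed pairwise correlation with the independent-model null.
rng = np.random.default_rng(7)
data = (rng.random((5000, 2)) < 0.1).astype(int)
data[:, 1] |= (rng.random(5000) < 0.5) * data[:, 0]   # inject a correlation
obs = np.corrcoef(data.T)[0, 1]
null = np.array([np.corrcoef(s.T.astype(int))[0, 1]
                 for s in rate_preserving_surrogates(data, 200, rng=rng)])
print(f"observed r={obs:.3f}, null 97.5th percentile={np.quantile(null, 0.975):.3f}")
```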
A spatial-dynamic value transfer model of economic losses from a biological invasion
Thomas P. Holmes; Andrew M. Liebhold; Kent F. Kovacs; Betsy Von Holle
2010-01-01
Rigorous assessments of the economic impacts of introduced species at broad spatial scales are required to provide credible information to policy makers. We propose that economic models of aggregate damages induced by biological invasions need to link microeconomic analyses of site-specific economic damages with spatial-dynamic models of value change associated with...
What Do We Know and How Well Do We Know It? Identifying Practice-Based Insights in Education
ERIC Educational Resources Information Center
Miller, Barbara; Pasley, Joan
2012-01-01
Knowledge derived from practice forms a significant portion of the knowledge base in the education field, yet is not accessible using existing empirical research methods. This paper describes a systematic, rigorous, grounded approach to collecting and analysing practice-based knowledge using the authors' research in teacher leadership as an…
ERIC Educational Resources Information Center
Wharton, Tracy
2017-01-01
Dissemination of research is the most challenging aspect of building the evidence base. Despite peer review, evidence suggests that a substantial proportion of papers leave out details that are necessary to judge bias, consider replication, or initiate meta-analyses and systematic reviews. Reporting guidelines were created to ensure minimally…
ERIC Educational Resources Information Center
Bremer, Emily; Crozier, Michael; Lloyd, Meghann
2016-01-01
The purpose of this review was to systematically search and critically analyse the literature pertaining to behavioural outcomes of exercise interventions for individuals with autism spectrum disorder aged ≤16 years. This systematic review employed a comprehensive peer-reviewed search strategy, two-stage screening process and rigorous critical…
ERIC Educational Resources Information Center
Holzer, Harry J.; Schanzenbach, Diane Whitmore; Duncan, Greg J.; Ludwig, Jens
2007-01-01
In this paper, we review a range of rigorous research studies that estimate the average statistical relationships between children growing up in poverty and their earnings, propensity to commit crime, and quality of health later in life. We also review estimates of the costs that crime and poor health per person impose on the economy. Then we…
Understanding photon sideband statistics and correlation for determining phonon coherence
NASA Astrophysics Data System (ADS)
Ding, Ding; Yin, Xiaobo; Li, Baowen
2018-01-01
Generating and detecting coherent high-frequency heat-carrying phonons have been topics of great interest in recent years. Although there have been successful attempts in generating and observing coherent phonons, rigorous techniques to characterize and detect phonon coherence in a crystalline material have been lagging compared to what has been achieved for photons. One main challenge is a lack of detailed understanding of how detection signals for phonons can be related to coherence. The quantum theory of photoelectric detection has greatly advanced the ability to characterize photon coherence in the past century, and a similar theory for phonon detection is necessary. Here, we reexamine the optical sideband fluorescence technique that has been used to detect high-frequency phonons in materials with optically active defects. We propose a quantum theory of phonon detection using the sideband technique and found that there are distinct differences in sideband counting statistics between thermal and coherent phonons. We further propose a second-order correlation function unique to sideband signals that allows for a rigorous distinction between thermal and coherent phonons. Our theory is relevant to a correlation measurement with nontrivial response functions at the quantum level and can potentially bridge the gap of experimentally determining phonon coherence to be on par with that of photons.
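For photon counting, the zero-delay second-order correlation g2(0) = <n(n-1)>/<n>^2 separates thermal light (g2 = 2) from coherent light (g2 = 1). The sketch below demonstrates this with simulated count records; the sideband correlation function proposed in the paper is analogous but defined through its own detection theory.

```python
import numpy as np

def g2_zero_delay(counts):
    """Second-order correlation g2(0) = <n(n-1)> / <n>^2 from count records."""
    counts = np.asarray(counts, float)
    return np.mean(counts * (counts - 1.0)) / np.mean(counts) ** 2

rng = np.random.default_rng(5)
mean_n = 2.0
coherent = rng.poisson(mean_n, size=100_000)                       # coherent field: Poissonian counts
thermal = rng.geometric(1.0 / (1.0 + mean_n), size=100_000) - 1    # Bose-Einstein counts
print(f"coherent g2 ≈ {g2_zero_delay(coherent):.2f}")              # expect ~1
print(f"thermal  g2 ≈ {g2_zero_delay(thermal):.2f}")               # expect ~2
```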
Enumerating sparse organisms in ships' ballast water: why counting to 10 is not so easy.
Miller, A Whitman; Frazier, Melanie; Smith, George E; Perry, Elgin S; Ruiz, Gregory M; Tamburri, Mario N
2011-04-15
To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships' ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed.
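The statistical power calculation described above can be illustrated with a simple Poisson sampling model: counts in a sampled volume are Poisson-distributed, the pass/fail threshold is set to control the type I error at the discharge standard, and power is the probability of exceeding that threshold at a noncompliant concentration. The numbers below (a standard of 10 organisms per cubic metre and a true concentration of twice the standard) are illustrative, not the regulatory scenarios analysed in the paper.

```python
from scipy.stats import poisson

def detection_power(true_conc, standard=10.0, volume_m3=1.0, alpha=0.05):
    """Power to detect a noncompliant discharge from a Poisson count in `volume_m3`.

    The pass/fail threshold is the smallest count whose exceedance probability is
    below `alpha` when the true concentration equals the standard (type I control).
    """
    threshold = poisson.ppf(1 - alpha, mu=standard * volume_m3)
    return poisson.sf(threshold, mu=true_conc * volume_m3)

# Standard of 10 organisms/m^3; true concentration twice the standard.
for v in (0.5, 1, 3, 10):
    print(f"volume={v:>4} m^3  power={detection_power(20.0, volume_m3=v):.2f}")
```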
Santos, Radleigh; Buying, Alcinette; Sabri, Nazila; Yu, John; Gringeri, Anthony; Bender, James; Janetzki, Sylvia; Pinilla, Clemencia; Judkowski, Valeria A.
2014-01-01
Immune monitoring of functional responses is a fundamental parameter to establish correlates of protection in clinical trials evaluating vaccines and therapies to boost antigen-specific responses. The IFNγ ELISPOT assay is a well-standardized and validated method for the determination of functional IFNγ-producing T-cells in peripheral blood mononuclear cells (PBMC); however, its performance greatly depends on the quality and integrity of the cryopreserved PBMC. Here, we investigate the effect of overnight (ON) resting of the PBMC on the detection of CD8-restricted peptide-specific responses by IFNγ ELISPOT. The study used PBMC from healthy donors to evaluate the CD8 T-cell response to five pooled or individual HLA-A2 viral peptides. The results were analyzed using a modification of the existing distribution free resampling (DFR) recommended for the analysis of ELISPOT data to ensure the most rigorous possible standard of significance. The results of the study demonstrate that ON resting of PBMC samples prior to IFNγ ELISPOT increases both the magnitude and the statistical significance of the responses. In addition, a comparison of the results with a 13-day preculture of PBMC with the peptides before testing demonstrates that ON resting is sufficient for the efficient evaluation of immune functioning. PMID:25546016
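The distribution-free resampling (DFR) comparison of antigen-stimulated and control wells is, at its core, a resampling test on replicate spot counts. The sketch below shows a simplified one-sided permutation test of that kind; the published DFR method additionally imposes a fold-change criterion and adjusts for multiple peptides, and the spot counts here are hypothetical.

```python
import numpy as np

def permutation_response_test(antigen_wells, control_wells, n_perm=10_000, rng=None):
    """One-sided permutation test: are antigen-well counts higher than control wells?

    A simplified stand-in for the distribution-free resampling (DFR) approach.
    """
    rng = np.random.default_rng(rng)
    a, c = np.asarray(antigen_wells, float), np.asarray(control_wells, float)
    observed = a.mean() - c.mean()
    pooled = np.concatenate([a, c])
    perm_diffs = np.empty(n_perm)
    for i in range(n_perm):
        rng.shuffle(pooled)
        perm_diffs[i] = pooled[:len(a)].mean() - pooled[len(a):].mean()
    return observed, np.mean(perm_diffs >= observed)

# Hypothetical spot counts per well (triplicates) after overnight rest.
print(permutation_response_test([62, 71, 58], [14, 9, 17], rng=1))
```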
NASA Astrophysics Data System (ADS)
Bookstein, Fred L.
1995-08-01
Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for representation of stochastically invaginated surfaces do not conduce to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.
NASA Astrophysics Data System (ADS)
Toman, Blaza; Nelson, Michael A.; Bedner, Mary
2017-06-01
Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
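The GUM Supplement 1 (Monte Carlo) element mentioned above propagates input distributions directly through the measurement equation. The sketch below does this for a hypothetical single-point calibration; the input values, their uncertainties, and the simple measurement equation are invented for illustration and are much simpler than the LC-UV and LC-IDMS models in the paper.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

# Toy measurement equation for an analyte mass fraction: w = (A_s / A_c) * c_cal * V / m.
# Inputs are drawn from distributions encoding their standard uncertainties (all values
# are illustrative, not the certified reference-material data).
area_ratio = rng.normal(0.982, 0.004, n)        # sample/calibrant peak-area ratio
c_cal = rng.normal(25.06, 0.08, n)              # calibrant concentration, ug/mL
volume = rng.normal(10.00, 0.02, n)             # extraction volume, mL
mass = rng.normal(1.0005, 0.0004, n)            # sample mass, g

w = area_ratio * c_cal * volume / mass          # propagate by direct simulation

mean = w.mean()
u = w.std(ddof=1)                               # standard uncertainty (Monte Carlo estimate)
lo, hi = np.percentile(w, [2.5, 97.5])          # 95 % coverage interval
print(f"w = {mean:.2f} ug/g, u = {u:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```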
Learning from Science and Sport - How we, Safety, "Engage with Rigor"
NASA Astrophysics Data System (ADS)
Herd, A.
2012-01-01
As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should remind ourselves to be open to the possibility that data, knowledge or experience from outside the spaceflight community may provide constructive alternative perspectives. This paper assesses aspects of two seemingly tangential fields, science and sport, and aligns them with the world of safety. In doing so, it offers useful insights into the challenges we face and may suggest solutions relevant to our everyday work in safety engineering. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two opposing teams: a professional interaction that must be accurately timed and positioned to achieve a desired outcome. These interactions, whilst an essential part of the game, are not without their constraints. The rugby scrum has constraints on the formation and engagement of the two teams; the controlled engagement allows the two teams to interact in a safe manner. These constraints arise from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The need for rigor may be expressed in the example of achieving a statistically relevant sample size, n, in order to assure the validity of analyses of the data pool. A failure to apply rigor could place the entire study at risk, and the respective paper may fail to be published. This paper considers the merits of these two different aspects, scientific rigor and sports engagement, and offers a reflective look at how they may provide a "modus operandi" for safety engineers at any level, whether at their desks (creating or reviewing safety assessments) or in a safety review meeting (providing a verbal critique of the presented safety case).
Lima, Tarcisio Brandão; Dias, Josilainne Marcelino; Mazuquin, Bruno Fles; da Silva, Carla Tassiana; Nogueira, Regiane Mazzarioli Pereira; Marques, Amélia Pasqual; Lavado, Edson Lopes; Cardoso, Jefferson Rosa
2013-10-01
To assess the effectiveness of aquatic physical therapy in the treatment of fibromyalgia. The search strategy was undertaken using the following databases, from 1950 to December 2012: MEDLINE, EMBASE, CINAHL, LILACS, SCIELO, WEB OF SCIENCE, SCOPUS, SPORTDiscus, Cochrane Library Controlled Trials Register, Cochrane Disease Group Trials Register, PEDro and DARE. The studies were separated into groups: Group I - aquatic physical therapy × no treatment, Group II - aquatic physical therapy × land-based exercises and Group III - aquatic physical therapy × other treatments. Seventy-two abstracts were found, 27 of which met the inclusion criteria. For functional ability (Fibromyalgia Impact Questionnaire), three studies with a treatment time of more than 20 weeks were pooled, yielding a mean difference (MD) of -1.35 [-2.04; -0.67], P = 0.0001, in favour of aquatic physical therapy versus no treatment. Similar results were identified for stiffness and the 6-minute walk test, for which two studies each were pooled, with an MD of -1.58 [-2.58; -0.58], P = 0.002 and 43.5 metres [3.8; 83.2], P = 0.03, respectively. Three meta-analyses showed statistically significant results in favour of aquatic physical therapy (Fibromyalgia Impact Questionnaire, stiffness and the 6-minute walk test) over treatment periods longer than 20 weeks. Due to the low methodological rigor of the included studies, the results were insufficient to demonstrate statistical and clinical differences in most of the outcomes.
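The pooled mean differences quoted above are the product of standard inverse-variance weighting. The sketch below pools two hypothetical study-level mean differences whose 95% confidence intervals are reported, recovering the standard errors from the interval widths; the input numbers are invented and this is not a re-analysis of the review's data.

```python
import numpy as np
from scipy import stats

def pool_fixed_effect(md, ci_low, ci_high):
    """Inverse-variance fixed-effect pooling of mean differences whose
    95% CIs are reported (SE recovered as CI width / (2 * 1.96))."""
    md = np.asarray(md, float)
    se = (np.asarray(ci_high, float) - np.asarray(ci_low, float)) / (2 * 1.96)
    w = 1.0 / se**2
    pooled = np.sum(w * md) / np.sum(w)
    se_pooled = np.sqrt(1.0 / np.sum(w))
    z = pooled / se_pooled
    p = 2 * stats.norm.sf(abs(z))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled), p

# Hypothetical study-level stiffness MDs (not the review's actual data)
print(pool_fixed_effect([-1.2, -2.0], [-2.3, -3.6], [-0.1, -0.4]))
```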
Renault, Nisa K E; Pritchett, Sonja M; Howell, Robin E; Greer, Wenda L; Sapienza, Carmen; Ørstavik, Karen Helene; Hamilton, David C
2013-01-01
In eutherian mammals, one X-chromosome in every XX somatic cell is transcriptionally silenced through the process of X-chromosome inactivation (XCI). Females are thus functional mosaics, where some cells express genes from the paternal X, and the others from the maternal X. The relative abundance of the two cell populations (X-inactivation pattern, XIP) can have significant medical implications for some females. In mice, the ‘choice' of which X to inactivate, maternal or paternal, in each cell of the early embryo is genetically influenced. In humans, the timing of XCI choice and whether choice occurs completely randomly or under a genetic influence is debated. Here, we explore these questions by analysing the distribution of XIPs in large populations of normal females. Models were generated to predict XIP distributions resulting from completely random or genetically influenced choice. Each model describes the discrete primary distribution at the onset of XCI, and the continuous secondary distribution accounting for changes to the XIP as a result of development and ageing. Statistical methods are used to compare models with empirical data from Danish and Utah populations. A rigorous data treatment strategy maximises information content and allows for unbiased use of unphased XIP data. The Anderson–Darling goodness-of-fit statistics and likelihood ratio tests indicate that a model of genetically influenced XCI choice better fits the empirical data than models of completely random choice. PMID:23652377
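A toy version of the model-comparison logic can be written in a few lines: simulate X-inactivation patterns (XIPs) under completely random choice among a small pool of embryonic precursor cells, add a secondary drift term for development and ageing, and compare an "observed" sample against model draws with the k-sample Anderson-Darling statistic. The precursor-pool sizes, drift magnitude, and sample sizes are assumptions, and the snippet is not the paper's unphased-data treatment.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def simulate_xip(n_females, n_precursors=16, drift_sd=0.05, rng=rng):
    """XIP = fraction of cells with the paternal X active.
    Primary distribution: binomial choice among n_precursors cells;
    secondary distribution: Gaussian drift from development/ageing."""
    primary = rng.binomial(n_precursors, 0.5, n_females) / n_precursors
    return np.clip(primary + rng.normal(0, drift_sd, n_females), 0, 1)

observed = simulate_xip(500, n_precursors=8)      # stand-in for empirical data
model_random = simulate_xip(5000, n_precursors=16)  # candidate random-choice model

# k-sample Anderson-Darling: a small statistic means the two samples are
# compatible with having come from the same distribution
res = stats.anderson_ksamp([observed, model_random])
print(res.statistic, res.significance_level)
```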
Hershman, Dawn L; Unger, Joseph M; Crew, Katherine D; Till, Cathee; Greenlee, Heather; Minasian, Lori M; Moinpour, Carol M; Lew, Danika L; Fehrenbacher, Louis; Wade, James L; Wong, Siu-Fun; Fisch, Michael J; Lynn Henry, N; Albain, Kathy S
2018-06-01
Chemotherapy-induced peripheral neuropathy (CIPN) is a common and disabling side effect of taxanes. Acetyl-L-carnitine (ALC) was unexpectedly found to increase CIPN in a randomized trial. We investigated the long-term patterns of CIPN among patients in this trial. S0715 was a randomized, double-blind, multicenter trial comparing ALC (1000 mg three times a day) with placebo for 24 weeks in women undergoing adjuvant taxane-based chemotherapy for breast cancer. CIPN was measured by the 11-item neurotoxicity (NTX) component of the FACT-Taxane scale at weeks 12, 24, 36, 52, and 104. We examined NTX scores over two years using linear mixed models for longitudinal data. Individual time points were examined using linear regression. Regression analyses included stratification factors and the baseline score as covariates. All statistical tests were two-sided. Four-hundred nine subjects were eligible for evaluation. Patients receiving ALC had a statistically significantly (P = .01) greater reduction in NTX scores (worse CIPN) of -1.39 points (95% confidence interval [CI] = -2.48 to -0.30) than the placebo group. These differences were particularly evident at weeks 24 (-1.68, 95% CI = -3.02 to -0.33), 36 (-1.37, 95% CI = -2.69 to -0.04), and 52 (-1.83, 95% CI = -3.35 to -0.32). At 104 weeks, 39.5% on the ALC arm and 34.4% on the placebo arm reported a five-point (10%) decrease from baseline. For both treatment groups, 104-week NTX scores were statistically significantly different compared with baseline (P < .001). For both groups, NTX scores were reduced from baseline and remained persistently low. Twenty-four weeks of ALC therapy resulted in statistically significantly worse CIPN over two years. Understanding the mechanism of this persistent effect may inform prevention and treatment strategies. Until then, the potential efficacy and harms of commonly used supplements should be rigorously studied.
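The longitudinal analysis described (linear mixed models of repeated NTX scores, adjusting for baseline) can be sketched with the statsmodels formula interface on simulated data. The variable names, arm effect, and noise levels are made-up assumptions; this is not the S0715 analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
weeks = [12, 24, 36, 52, 104]
n_per_arm = 200

rows = []
for pid in range(2 * n_per_arm):
    arm = "ALC" if pid < n_per_arm else "placebo"
    baseline = rng.normal(40, 4)          # hypothetical baseline NTX score
    subject_effect = rng.normal(0, 2)     # random intercept per patient
    for w in weeks:
        drop = 4 + (1.4 if arm == "ALC" else 0.0)  # assumed worse CIPN on ALC
        ntx = baseline - drop + subject_effect + rng.normal(0, 3)
        rows.append({"patient": pid, "arm": arm, "week": w,
                     "baseline": baseline, "ntx": ntx})
df = pd.DataFrame(rows)

# Random-intercept linear mixed model of repeated NTX measurements,
# adjusting for baseline score
model = smf.mixedlm("ntx ~ arm + week + baseline", df, groups=df["patient"])
print(model.fit().summary())
```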
Walach, Harald; Falkenberg, Torkel; Fønnebø, Vinjar; Lewith, George; Jonas, Wayne B
2006-01-01
Background The reasoning behind evaluating medical interventions is that a hierarchy of methods exists which successively produce improved and therefore more rigorous evidence-based medicine upon which to make clinical decisions. At the foundation of this hierarchy are case studies, retrospective and prospective case series, followed by cohort studies with historical and concomitant non-randomized controls. Open-label randomized controlled studies (RCTs), and finally blinded, placebo-controlled RCTs, which offer the most internal validity, are considered the most reliable evidence. Rigorous RCTs remove bias. Evidence from RCTs forms the basis of meta-analyses and systematic reviews. This hierarchy, founded on a pharmacological model of therapy, is generalized to other interventions which may be complex and non-pharmacological (healing, acupuncture and surgery). Discussion The hierarchical model is valid for limited questions of efficacy, for instance for regulatory purposes and newly devised products and pharmacological preparations. It is inadequate for the evaluation of complex interventions such as physiotherapy, surgery and complementary and alternative medicine (CAM). This has to do with the essential tension between internal validity (rigor and the removal of bias) and external validity (generalizability). Summary Instead of an Evidence Hierarchy, we propose a Circular Model. This would imply a multiplicity of methods, using different designs, counterbalancing their individual strengths and weaknesses to arrive at pragmatic but equally rigorous evidence which would provide significant assistance in clinical and health systems innovation. Such evidence would better inform national health care technology assessment agencies and promote evidence-based health reform. PMID:16796762
NASA Astrophysics Data System (ADS)
Šprlák, M.; Han, S.-C.; Featherstone, W. E.
2017-12-01
Rigorous modelling of the spherical gravitational potential spectra from the volumetric density and geometry of an attracting body is discussed. Firstly, we derive mathematical formulas for the spatial analysis of spherical harmonic coefficients. Secondly, we present a numerically efficient algorithm for rigorous forward modelling. We consider the finite-amplitude topographic modelling methods as special cases, with additional postulates on the volumetric density and geometry. Thirdly, we implement our algorithm in the form of computer programs and test their correctness with respect to the finite-amplitude topography routines. For this purpose, synthetic and realistic numerical experiments, applied to the gravitational field and geometry of the Moon, are performed. We also investigate the optimal choice of input parameters for the finite-amplitude modelling methods. Fourthly, we exploit the rigorous forward modelling for the determination of the spherical gravitational potential spectra inferred by lunar crustal models with uniform, laterally variable, radially variable, and spatially (3D) variable bulk density. Also, we analyse these four different crustal models in terms of their spectral characteristics and band-limited radial gravitation. We demonstrate the applicability of the rigorous forward modelling using currently available computational resources up to degree and order 2519 of the spherical harmonic expansion, which corresponds to a resolution of 2.2 km on the surface of the Moon. Computer codes, a user manual and scripts developed for the purposes of this study are publicly available to potential users.
Orbital State Uncertainty Realism
NASA Astrophysics Data System (ADS)
Horwood, J.; Poore, A. B.
2012-09-01
Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The proper characterization of uncertainty in the orbital state of a space object is a common requirement for many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of the state, the covariance, and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state probability density function (PDF) is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the same computational cost as the traditional unscented Kalman filter, with the former able to maintain a proper characterization of the uncertainty for up to ten times as long as the latter. The filter correction step also furnishes a statistically rigorous prediction error which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.
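The chi-square gating idea that the new metric generalizes can be shown with the classical Gaussian version used to score whether an observation plausibly belongs to a predicted track. The residual, covariance, and gate probability below are arbitrary assumptions; the paper's extended metric for non-Gaussian, "banana"-shaped PDFs is not reproduced here.

```python
import numpy as np
from scipy import stats

def mahalanobis_gate(residual, S, prob=0.99):
    """Classical chi-square gating test for associating an observation with
    a predicted orbital state: residual = z - H x_pred, S = innovation
    covariance. Returns (d^2, gate threshold, accept?)."""
    d2 = float(residual @ np.linalg.solve(S, residual))
    threshold = stats.chi2.ppf(prob, df=len(residual))
    return d2, threshold, d2 <= threshold

# Hypothetical 2-D angles-only residual (radians) and innovation covariance
residual = np.array([1.2e-4, -0.8e-4])
S = np.diag([1e-8, 1e-8])
print(mahalanobis_gate(residual, S))
```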
Eggshell Porosity Provides Insight on Evolution of Nesting in Dinosaurs.
Tanaka, Kohei; Zelenitsky, Darla K; Therrien, François
2015-01-01
Knowledge about the types of nests built by dinosaurs can provide insight into the evolution of nesting and reproductive behaviors among archosaurs. However, the low preservation potential of their nesting materials and nesting structures means that most information can only be gleaned indirectly through comparison with extant archosaurs. Two general nest types are recognized among living archosaurs: 1) covered nests, in which eggs are incubated while fully covered by nesting material (as in crocodylians and megapodes), and 2) open nests, in which eggs are exposed in the nest and brooded (as in most birds). Previously, dinosaur nest types had been inferred by estimating the water vapor conductance (i.e., diffusive capacity) of their eggs, based on the premise that high conductance corresponds to covered nests and low conductance to open nests. However, a lack of statistical rigor and inconsistencies in this method render its application problematic and its validity questionable. As an alternative we propose a statistically rigorous approach to infer nest type based on large datasets of eggshell porosity and egg mass compiled for over 120 extant archosaur species and 29 archosaur extinct taxa/ootaxa. The presence of a strong correlation between eggshell porosity and nest type among extant archosaurs indicates that eggshell porosity can be used as a proxy for nest type, and thus discriminant analyses can help predict nest type in extinct taxa. Our results suggest that: 1) covered nests are likely the primitive condition for dinosaurs (and probably archosaurs), and 2) open nests first evolved among non-avian theropods more derived than Lourinhanosaurus and were likely widespread in non-avian maniraptorans, well before the appearance of birds. Although taphonomic evidence suggests that basal open nesters (i.e., oviraptorosaurs and troodontids) were potentially the first dinosaurs to brood their clutches, they still partially buried their eggs in sediment. Open nests with fully exposed eggs only became widespread among Euornithes. A potential co-evolution of open nests and brooding behavior among maniraptorans may have freed theropods from the ground-based restrictions inherent to covered nests and allowed the exploitation of alternate nesting locations. These changes in nesting styles and behaviors thus may have played a role in the evolutionary success of maniraptorans (including birds).
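The discriminant step can be sketched as follows: fit a linear discriminant on (log eggshell porosity, log egg mass) for extant species with known nest types, then predict the nest type and posterior probability for an extinct ootaxon. All values are invented for illustration and do not come from the study's dataset.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)

# Hypothetical training data for extant archosaurs:
# columns = [log10 eggshell porosity, log10 egg mass]
covered = np.column_stack([rng.normal(1.0, 0.25, 60), rng.normal(1.8, 0.4, 60)])
open_ = np.column_stack([rng.normal(0.2, 0.25, 60), rng.normal(1.6, 0.4, 60)])
X = np.vstack([covered, open_])
y = np.array(["covered"] * 60 + ["open"] * 60)

lda = LinearDiscriminantAnalysis().fit(X, y)

# Predict nest type (and posterior probability) for an extinct ootaxon
fossil = np.array([[0.9, 2.6]])
print(lda.predict(fossil), lda.predict_proba(fossil))
```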
Gershunov, A.; Barnett, T.P.; Cayan, D.R.; Tubbs, T.; Goddard, L.
2000-01-01
Three long-range forecasting methods have been evaluated for prediction and downscaling of seasonal and intraseasonal precipitation statistics in California. Full-statistical, hybrid dynamical-statistical and full-dynamical approaches have been used to forecast El Niño-Southern Oscillation (ENSO)-related total precipitation, daily precipitation frequency, and average intensity anomalies during the January-March season. For El Niño winters, the hybrid approach emerges as the best performer, while La Niña forecasting skill is poor. The full-statistical forecasting method features reasonable forecasting skill for both La Niña and El Niño winters. The performance of the full-dynamical approach could not be evaluated as rigorously as that of the other two forecasting schemes. Although the full-dynamical forecasting approach is expected to outperform simpler forecasting schemes in the long run, evidence is presented to conclude that, at present, the full-dynamical forecasting approach is the least viable of the three, at least in California. The authors suggest that operational forecasting of any intraseasonal temperature, precipitation, or streamflow statistic derivable from the available records is possible now for ENSO-extreme years.
Quality and rigor of the concept mapping methodology: a pooled study analysis.
Rosas, Scott R; Kane, Mary
2012-05-01
The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative characteristics and estimates of quality and rigor that may guide future studies are lacking. To address this gap, we conducted a pooled analysis of 69 concept mapping studies to describe characteristics across study phases, generate specific indicators of validity and reliability, and examine the relationship between select study characteristics and quality indicators. Individual study characteristics and estimates were pooled and quantitatively summarized, describing the distribution, variation and parameters for each. In addition, variation in the concept mapping data collection in relation to characteristics and estimates was examined. Overall, results suggest concept mapping yields strong internal representational validity and very strong sorting and rating reliability estimates. Validity and reliability were consistently high despite variation in participation and task completion percentages across data collection modes. The implications of these findings as a practical reference for assessing the quality and rigor of future concept mapping studies are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
Iterative categorization (IC): a systematic technique for analysing qualitative data
2016-01-01
Abstract The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155
Analyses of Alternatives: Toward a More Rigorous Determination of Scope
2014-04-30
ERIC Educational Resources Information Center
Hinnerich, Bjorn Tyrefors; Höglin, Erik; Johannesson, Magnus
2015-01-01
We rigorously test for discrimination against students with foreign backgrounds in high school grading in Sweden. We analyse a random sample of national tests in the Swedish language graded both non-blindly by the student's own teacher and blindly without any identifying information. The increase in the test score due to non-blind grading is…
Phytoremediation of Hazardous Wastes
1995-07-26
Authors: Steven C. McCutcheon, N. Lee Wolfe, Laura H. Carreria and Tse-Yuan Ou. A new approach to phytoremediation (the use of plants to degrade hazardous contaminants) was developed; the approach involves rigorous pathway analyses. Subject terms: phytoremediation, nitroreductase, laccase enzymes, SERDP.
Benefits of remote real-time side-effect monitoring systems for patients receiving cancer treatment.
Kofoed, Sarah; Breen, Sibilah; Gough, Karla; Aranda, Sanchia
2012-03-05
In Australia, the incidence of cancer diagnoses is rising along with an aging population. Cancer treatments, such as chemotherapy, are increasingly being provided in the ambulatory care setting. Cancer treatments are commonly associated with distressing and serious side-effects and patients often struggle to manage these themselves without specialized real-time support. Unlike chronic disease populations, few systems for the remote real-time monitoring of cancer patients have been reported. However, several prototype systems have been developed and have received favorable reports. This review aimed to identify and detail systems that reported statistical analyses of changes in patient clinical outcomes, health care system usage or health economic analyses. Five papers were identified that met these criteria. There was wide variation in the design of the monitoring systems in terms of data input method, clinician alerting and response, groups of patients targeted and clinical outcomes measured. The majority of studies had significant methodological weaknesses. These included no control group comparisons, small sample sizes, poor documentation of clinical interventions or measures of adherence to the monitoring systems. In spite of the limitations, promising results emerged in terms of improved clinical outcomes (e.g. pain, depression, fatigue). Health care system usage was assessed in two papers with inconsistent results. No studies included health economic analyses. The diversity in systems described, outcomes measured and methodological issues all limited between-study comparisons. Given the acceptability of remote monitoring and the promising outcomes from the few studies analyzing patient or health care system outcomes, future research is needed to rigorously trial these systems to enable greater patient support and safety in the ambulatory setting.
Validity of Meta-analysis in Diabetes: We Need to Be Aware of Its Limitations
Home, Philip D.
2013-01-01
To deliver high-quality clinical care to patients with diabetes and other chronic conditions, clinicians must understand the evidence available from studies that have been performed to address important clinical management questions. In an evidence-based approach to clinical care, the evidence from clinical research should be integrated with clinical expertise, pathophysiological knowledge, and an understanding of patient values. As such, in an effort to provide information from many studies, the publication of diabetes meta-analyses has increased markedly in the recent past, using either observational or clinical trial data. In this regard, guidelines have been developed to direct the performance of meta-analysis to provide consistency among contributions. Thus, when done appropriately, meta-analysis can provide estimates from clinically and statistically homogeneous but underpowered studies and is useful in supporting clinical decisions, guidelines, and cost-effectiveness analysis. However, often these conditions are not met, the data considered are unreliable, and the results should not be assumed to be any more valid than the data underlying the included studies. To provide an understanding of both sides of the argument, we provide a discussion of this topic as part of this two-part point-counterpoint narrative. In the point narrative as presented below, Dr. Home provides his opinion and review of the data to date showing that we need to carefully evaluate meta-analyses and to learn what results are reliable. In the counterpoint narrative following Dr. Home’s contribution, Drs. Golden and Bass emphasize that an effective system exists to guide meta-analysis and that rigorously conducted, high-quality systematic reviews and meta-analyses are an indispensable tool in evidence synthesis despite their limitations. —William T. Cefalu, MD Editor in Chief, Diabetes Care PMID:24065844
All biology is computational biology.
Markowetz, Florian
2017-03-01
Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.
Thermodynamics of ideal quantum gas with fractional statistics in D dimensions.
Potter, Geoffrey G; Müller, Gerhard; Karbach, Michael
2007-06-01
We present exact and explicit results for the thermodynamic properties (isochores, isotherms, isobars, response functions, velocity of sound) of a quantum gas in dimensions D ≥ 1 and with fractional exclusion statistics 0 ≤ g ≤ 1 connecting bosons (g=0) and fermions (g=1). In D=1 the results are equivalent to those of the Calogero-Sutherland model. Emphasis is given to the crossover between bosonlike and fermionlike features, caused by aspects of the statistical interaction that mimic long-range attraction and short-range repulsion. A phase transition along the isobar occurs at a nonzero temperature in all dimensions. The T dependence of the velocity of sound is in simple relation to isochores and isobars. The effects of soft container walls are accounted for rigorously for the case of a pure power-law potential.
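One standard way to see the boson-fermion crossover as g varies is Wu's occupation-number function for fractional exclusion statistics, solved numerically for the auxiliary variable w. This is a generic textbook construction under the stated assumptions, not the paper's D-dimensional thermodynamic calculation.

```python
import numpy as np
from scipy.optimize import brentq

def occupation(x, g):
    """Mean occupation n(x) for exclusion statistics g, where
    x = (epsilon - mu) / kT.  w solves w^g (1+w)^(1-g) = exp(x),
    and n = 1 / (w + g).  g=0 recovers Bose, g=1 recovers Fermi."""
    if g == 0.0:
        return 1.0 / np.expm1(x)          # Bose-Einstein (valid for x > 0)
    if g == 1.0:
        return 1.0 / (np.exp(x) + 1.0)    # Fermi-Dirac
    f = lambda w: g * np.log(w) + (1 - g) * np.log1p(w) - x
    w = brentq(f, 1e-12, 1e12)            # bracket chosen wide enough for moderate x
    return 1.0 / (w + g)

for g in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(g, occupation(0.5, g))
```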
A traits-based approach for prioritizing species for monitoring and surrogacy selection
Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...
2016-11-28
The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.
Bansal, Ravi; Peterson, Bradley S
2018-06-01
Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control for false positive findings in these multiple hypotheses testing. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construct rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but cannot be distinguished from true positive findings without further analyses, including assessing whether fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods in detecting true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences in MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, less than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.
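The heavy-tailed cluster-size behaviour described above can be reproduced in miniature by thresholding a smoothed two-dimensional Gaussian random field and recording connected-component sizes. The field size, smoothing width, threshold, and number of realizations below are arbitrary choices, and the snippet is not the authors' simulation code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label

rng = np.random.default_rng(5)

def cluster_sizes(shape=(256, 256), smooth_sigma=4.0, z_thresh=2.5, rng=rng):
    """Simulate a smooth Gaussian random field, threshold it, and return
    the sizes of the suprathreshold clusters (connected components)."""
    field = gaussian_filter(rng.standard_normal(shape), smooth_sigma)
    field /= field.std()                    # re-standardize to unit variance
    labels, n = label(field > z_thresh)
    return np.bincount(labels.ravel())[1:]  # drop the background label 0

sizes = np.concatenate([cluster_sizes() for _ in range(200)])
print("mean cluster size :", sizes.mean())
print("largest / mean    :", sizes.max() / sizes.mean())
```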
Calculations vs. measurements of remnant dose rates for SNS spent structures
NASA Astrophysics Data System (ADS)
Popova, I. I.; Gallmeier, F. X.; Trotter, S.; Dayton, M.
2018-06-01
Residual dose rate measurements were conducted on target vessel #13 and proton beam window #5 after extraction from their service locations. These measurements were used to verify the calculation methods for radionuclide inventory assessment that are typically performed for nuclear waste characterization and transportation of these structures. Neutronics analyses for predicting residual dose rates were carried out using the transport code MCNPX and the transmutation code CINDER90. For the transport analyses, a complex and rigorous geometry model of the structures and their surroundings was applied. The neutronics analyses were carried out using the Bertini and CEM high-energy physics models for simulating particle interactions. The preliminary calculated results were analysed and compared to the measured dose rates, showing good overall agreement, within 40% on average.
SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS
NASA Technical Reports Server (NTRS)
Brownlow, J. D.
1994-01-01
The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Barlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
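The time- and frequency-domain quantities SPA computes (mean, variance, RMS, power spectrum, cross spectrum, coherence, transfer function, windowing) map directly onto routines in modern scientific Python. The sketch below reproduces a small subset on simulated two-channel data; it is an illustrative stand-in, not the original FORTRAN program, and the signals and parameters are assumptions.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(6)
fs = 100.0                                                     # sampling frequency, Hz
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 5 * t) + rng.standard_normal(t.size)    # input channel
y = 0.5 * np.roll(x, 3) + 0.1 * rng.standard_normal(t.size)    # output channel

# Time-domain statistics (cf. mean, variance, RMS in the text)
print("mean", x.mean(), "var", x.var(ddof=1), "rms", np.sqrt(np.mean(x**2)))

# Frequency-domain estimates with a Hann window (cf. the Hann-Tukey window)
f, Pxx = signal.welch(x, fs=fs, window="hann", nperseg=1024)       # power spectrum
f, Pxy = signal.csd(x, y, fs=fs, window="hann", nperseg=1024)      # cross spectrum
f, Cxy = signal.coherence(x, y, fs=fs, window="hann", nperseg=1024)
H = Pxy / Pxx           # transfer function estimate (one common convention)
print("peak frequency   :", f[np.argmax(Pxx)], "Hz")
print("coherence at peak:", Cxy[np.argmax(Pxx)])
print("amplitude ratio  :", np.abs(H[np.argmax(Pxx)]))
```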
Öztürk, Hande; Noyan, I. Cevdet
2017-08-24
A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears as a special case, limited to large crystallite sizes, here. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.
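The qualitative effect of crystallite size on counting statistics can be illustrated with a toy binomial model: if N grains fit in the sampled volume and each satisfies the Bragg condition with a small probability p, the number of diffracting grains is Binomial(N, p) and its relative spread grows as the crystallites get larger and N drops. The probability, volumes, and sizes below are arbitrary assumptions, and the snippet does not reproduce the paper's derivation or its Lorentz-probability analysis.

```python
import numpy as np

rng = np.random.default_rng(7)

def relative_intensity_spread(crystallite_um, sample_volume_mm3=1.0,
                              p_diffract=1e-5, n_trials=2000, rng=rng):
    """Toy model: N grains of a given size fit in the sample; each grain is
    correctly oriented for a Bragg reflection with probability p.  Returns
    the mean and relative standard deviation of the diffracting-grain count."""
    grain_volume_mm3 = (crystallite_um * 1e-3) ** 3
    n_grains = int(sample_volume_mm3 / grain_volume_mm3)
    counts = rng.binomial(n_grains, p_diffract, n_trials)
    return counts.mean(), counts.std(ddof=1) / counts.mean()

for size in (0.1, 1.0, 10.0):   # crystallite edge length in micrometres
    print(size, relative_intensity_spread(size))
```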
Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.
Mulvany, M J
1975-01-01
1. Methods have been developed for describing the length: tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 l0 was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistances and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023
Geographic profiling applied to testing models of bumble-bee foraging.
Raine, Nigel E; Rossmo, D Kim; Le Comber, Steven C
2009-03-06
Geographic profiling (GP) was originally developed as a statistical tool to help police forces prioritize lists of suspects in investigations of serial crimes. GP uses the location of related crime sites to make inferences about where the offender is most likely to live, and has been extremely successful in criminology. Here, we show how GP is applicable to experimental studies of animal foraging, using the bumble-bee Bombus terrestris. GP techniques enable us to simplify complex patterns of spatial data down to a small number of parameters (2-3) for rigorous hypothesis testing. Combining computer model simulations and experimental observation of foraging bumble-bees, we demonstrate that GP can be used to discriminate between foraging patterns resulting from (i) different hypothetical foraging algorithms and (ii) different food item (flower) densities. We also demonstrate that combining experimental and simulated data can be used to elucidate animal foraging strategies: specifically that the foraging patterns of real bumble-bees can be reliably discriminated from three out of nine hypothetical foraging algorithms. We suggest that experimental systems, like foraging bees, could be used to test and refine GP model predictions, and that GP offers a useful technique to analyse spatial animal behaviour data in both the laboratory and field.
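A deliberately simplified scoring surface conveys the core of the approach: each observed foraging location contributes a distance-decay kernel, and the summed score ranks candidate origin points. This uses a plain exponential kernel rather than the full criminal geographic targeting function, and the visit coordinates, grid, and decay rate are invented for illustration.

```python
import numpy as np

def gp_score_surface(sites, grid_size=100, extent=10.0, decay=1.0):
    """Score every grid cell by summed distance-decay contributions from the
    observed sites; a higher score means a more plausible origin.  This is a
    simplified kernel, not the full criminal geographic targeting model."""
    xs = np.linspace(0, extent, grid_size)
    gx, gy = np.meshgrid(xs, xs)
    score = np.zeros_like(gx)
    for sx, sy in sites:
        d = np.hypot(gx - sx, gy - sy)
        score += np.exp(-decay * d)       # exponential distance decay
    return xs, score

# Hypothetical flower-visit locations for one bee (arbitrary units)
visits = [(2.0, 3.0), (2.5, 4.1), (3.2, 2.8), (4.0, 3.5)]
xs, score = gp_score_surface(visits)
iy, ix = np.unravel_index(np.argmax(score), score.shape)
print("most plausible origin:", (xs[ix], xs[iy]))
```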
Neurofeedback with fMRI: A critical systematic review.
Thibault, Robert T; MacPherson, Amanda; Lifshitz, Michael; Roth, Raquel R; Raz, Amir
2018-05-15
Neurofeedback relying on functional magnetic resonance imaging (fMRI-nf) heralds new prospects for self-regulating brain and behavior. Here we provide the first comprehensive review of the fMRI-nf literature and the first systematic database of fMRI-nf findings. We synthesize information from 99 fMRI-nf experiments-the bulk of currently available data. The vast majority of fMRI-nf findings suggest that self-regulation of specific brain signatures seems viable; however, replication of concomitant behavioral outcomes remains sparse. To disentangle placebo influences and establish the specific effects of neurofeedback, we highlight the need for double-blind placebo-controlled studies alongside rigorous and standardized statistical analyses. Before fMRI-nf can join the clinical armamentarium, research must first confirm the sustainability, transferability, and feasibility of fMRI-nf in patients as well as in healthy individuals. Whereas modulating specific brain activity promises to mold cognition, emotion, thought, and action, reducing complex mental health issues to circumscribed brain regions may represent a tenuous goal. We can certainly change brain activity with fMRI-nf. However, it remains unclear whether such changes translate into meaningful behavioral improvements in the clinical domain. Copyright © 2017 Elsevier Inc. All rights reserved.
Patel, Tulpesh; Blyth, Jacqueline C; Griffiths, Gareth; Kelly, Deirdre; Talcott, Joel B
2014-01-01
Proton Magnetic Resonance Spectroscopy ((1)H-MRS) is a non-invasive imaging technique that enables quantification of neurochemistry in vivo and thereby facilitates investigation of the biochemical underpinnings of human cognitive variability. Studies in the field of cognitive spectroscopy have commonly focused on relationships between measures of N-acetyl aspartate (NAA), a surrogate marker of neuronal health and function, and broad measures of cognitive performance, such as IQ. In this study, we used (1)H-MRS to interrogate single-voxels in occipitoparietal and frontal cortex, in parallel with assessments of psychometric intelligence, in a sample of 40 healthy adult participants. We found correlations between NAA and IQ that were within the range reported in previous studies. However, the magnitude of these effects was significantly modulated by the stringency of data screening and the extent to which outlying values contributed to statistical analyses. (1)H-MRS offers a sensitive tool for assessing neurochemistry non-invasively, yet the relationships between brain metabolites and broad aspects of human behavior such as IQ are subtle. We highlight the need to develop an increasingly rigorous analytical and interpretive framework for collecting and reporting data obtained from cognitive spectroscopy studies of this kind.
A Novel Estimator for the Rate of Information Transfer by Continuous Signals
Takalo, Jouni; Ignatova, Irina; Weckström, Matti; Vähäsöyrinki, Mikko
2011-01-01
The information transfer rate provides an objective and rigorous way to quantify how much information is being transmitted through a communications channel whose input and output consist of time-varying signals. However, current estimators of information content in continuous signals are typically based on assumptions about the system's linearity and signal statistics, or they require prohibitive amounts of data. Here we present a novel information rate estimator without these limitations that is also optimized for computational efficiency. We validate the method with a simulated Gaussian information channel and demonstrate its performance with two example applications. Information transfer between the input and output signals of a nonlinear system is analyzed using a sensory receptor neuron as the model system. Then, a climate data set is analyzed to demonstrate that the method can be applied to a system based on two outputs generated by interrelated random processes. These analyses also demonstrate that the new method offers consistent performance in situations where classical methods fail. In addition to these examples, the method is applicable to a wide range of continuous time series commonly observed in the natural sciences, economics and engineering. PMID:21494562
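For the Gaussian-channel validation mentioned, the classical coherence-based rate, R = -∫ log2(1 - γ²(f)) df, is the usual reference point. The sketch estimates it for a simulated linear channel with additive noise using scipy's coherence routine; the filter, noise level, and spectral parameters are assumptions, and this is the classical estimator, not the paper's new one.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(8)
fs = 1000.0
t = np.arange(0, 30, 1 / fs)
stimulus = rng.standard_normal(t.size)                    # channel input
response = signal.lfilter([1.0], [1.0, -0.7], stimulus)   # simple linear channel
response += 0.5 * rng.standard_normal(t.size)             # additive noise

f, coh = signal.coherence(stimulus, response, fs=fs, nperseg=4096)

# Coherence-based information rate (bits/s) for a Gaussian channel:
# R = -integral of log2(1 - gamma^2(f)) over frequency
rate = -np.trapz(np.log2(1.0 - coh), f)
print(f"information rate ~ {rate:.1f} bits/s")
```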
Wagner, Katrin; Mendieta-Leiva, Glenda; Zotz, Gerhard
2015-01-01
Information on the degree of host specificity is fundamental for an understanding of the ecology of structurally dependent plants such as vascular epiphytes. Starting with the seminal paper of A.F.W. Schimper on epiphyte ecology in the late 19th century over 200 publications have dealt with the issue of host specificity in vascular epiphytes. We review and critically discuss this extensive literature. The available evidence indicates that host ranges of vascular epiphytes are largely unrestricted while a certain host bias is ubiquitous. However, tree size and age and spatial autocorrelation of tree and epiphyte species have not been adequately considered in most statistical analyses. More refined null expectations and adequate replication are needed to allow more rigorous conclusions. Host specificity could be caused by a large number of tree traits (e.g. bark characteristics and architectural traits), which influence epiphyte performance. After reviewing the empirical evidence for their relevance, we conclude that future research should use a more comprehensive approach by determining the relative importance of various potential mechanisms acting locally and by testing several proposed hypotheses regarding the relative strength of host specificity in different habitats and among different groups of structurally dependent flora. PMID:25564514
Clogging arches in grains, colloids, and pedestrians flowing through constrictions
NASA Astrophysics Data System (ADS)
Zuriguel, Iker
When a group of particles passes through a narrow orifice, the flow might become intermittent due to the development of clogs that obstruct the constriction. This effect has been observed in many different fields such as mining transport, microbial growth, crowd dynamics, colloids, granular and active matter. In this work we introduce a general framework in which research on several of these scenarios can be encompassed. In particular, we analyze the statistical properties of the bottleneck flow in different experiments and simulations: granular media within vibrated silos, colloids, a flock of sheep and pedestrian evacuations. We reveal a common phenomenology that allows us to rigorously define a transition to a clogged state. Using this definition we explore the main variables involved, which are then grouped into three generic parameters. In addition, we present results on the geometrical characteristics of the clogging arches, which are related to their stability against perturbations. We experimentally analyse the temporal evolution of the arches, evidencing important differences between the structures that are easily destroyed and those that seem to resist forever (longer than the temporal window employed in our measurements). Ministerio de Economía y Competitividad (Spanish Government). Project No. FIS2014-57325.
Filtering Meteoroid Flights Using Multiple Unscented Kalman Filters
NASA Astrophysics Data System (ADS)
Sansom, E. K.; Bland, P. A.; Rutten, M. G.; Paxman, J.; Towner, M. C.
2016-11-01
Estimator algorithms are immensely versatile and powerful tools that can be applied to any problem where a dynamic system can be modeled by a set of equations and where observations are available. A well designed estimator enables system states to be optimally predicted and errors to be rigorously quantified. Unscented Kalman filters (UKFs) and interactive multiple models can be found in methods from satellite tracking to self-driving cars. The luminous trajectory of the Bunburra Rockhole fireball was observed by the Desert Fireball Network in mid-2007. The recorded data set is used in this paper to examine the application of these two techniques as a viable approach to characterizing fireball dynamics. The nonlinear, single-body system of equations, used to model meteoroid entry through the atmosphere, is challenged by gross fragmentation events that may occur. The incorporation of the UKF within an interactive multiple model smoother provides a likely solution for when fragmentation events may occur as well as providing a statistical analysis of the state uncertainties. In addition to these benefits, another advantage of this approach is its automatability for use within an image processing pipeline to facilitate large fireball data analyses and meteorite recoveries.
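At the heart of the UKF is the unscented transform: a small set of deterministically chosen sigma points carries the state mean and covariance through the nonlinear dynamics. The sketch below propagates a toy two-state (altitude, speed) model with a quadratic drag-like term through one time step; the model, parameters, and initial covariance are invented and this is not the Bunburra Rockhole processing pipeline.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1.0, beta=2.0, kappa=None):
    """Propagate (mean, cov) through a nonlinear function f via sigma points."""
    n = mean.size
    if kappa is None:
        kappa = 3.0 - n                   # classical heuristic choice
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # 2n+1 sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    y = np.array([f(s) for s in sigma])
    y_mean = wm @ y
    d = y - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov

# Toy entry model: state = [altitude (km), speed (km/s)], one 1-second step
def step(x, dt=1.0, drag=0.02):
    h, v = x
    return np.array([h - v * dt, v - drag * v**2 * dt])

m, P = np.array([80.0, 15.0]), np.diag([0.5**2, 0.2**2])
print(unscented_transform(m, P, step))
```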
Seven common mistakes in population genetics and how to avoid them.
Meirmans, Patrick G
2015-07-01
As the data resulting from modern genotyping tools are astoundingly complex, genotyping studies require great care in the sampling design, genotyping, data analysis and interpretation. Such care is necessary because, with data sets containing thousands of loci, small biases can easily become strongly significant patterns. Such biases may already arise in routine tasks that are part of almost every genotyping study. Here, I discuss seven common mistakes that are frequently encountered in the genotyping literature: (i) giving more attention to genotyping than to sampling, (ii) failing to perform or report experimental randomization in the laboratory, (iii) equating geopolitical borders with biological borders, (iv) testing significance of clustering output, (v) misinterpreting Mantel's r statistic, (vi) only interpreting a single value of k and (vii) forgetting that only a small portion of the genome will be associated with climate. For each of these issues, I give suggestions on how to avoid the mistake. Overall, I argue that genotyping studies would benefit from establishing a more rigorous experimental design, involving proper sampling design, randomization and a better distinction between a priori hypotheses and exploratory analyses. © 2015 John Wiley & Sons Ltd.
An Overview of Meta-Analyses of Danhong Injection for Unstable Angina.
Zhang, Xiaoxia; Wang, Hui; Chang, Yanxu; Wang, Yuefei; Lei, Xiang; Fu, Shufei; Zhang, Junhua
2015-01-01
Objective. To systematically collect evidence and evaluate the effects of Danhong injection (DHI) for unstable angina (UA). Methods. A comprehensive search was conducted in seven electronic databases up to January 2015. The methodological and reporting quality of included studies was assessed using AMSTAR and PRISMA. Result. Five articles were included. The conclusions suggest that DHI plus conventional medicine treatment was effective for UA pectoris treatment, alleviating symptoms of angina and improving electrocardiographic findings. Flaws of the original studies and systematic reviews weaken the strength of evidence. Limitations of methodological quality include an incomplete literature search, a lack of detailed study characteristics, neglect of clinical heterogeneity, and failure to assess publication bias and other forms of bias. The flaws in the reporting of the systematic reviews include the absence of a structured summary and of a standardized search strategy. For the pooled findings, researchers took statistical heterogeneity into consideration, but clinical and methodological heterogeneity were ignored. Conclusion. DHI plus conventional medicine treatment generally appears to be effective for UA treatment. However, the evidence is not strong enough due to methodological flaws in the original clinical trials and systematic reviews. Furthermore, rigorously designed randomized controlled trials are also needed. The methodological and reporting quality of systematic reviews should be improved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tregillis, Ian Lee
This document examines the performance of a generic flat-mirror multimonochromatic imager (MMI), with special emphasis on existing instruments at NIF and Omega. We begin by deriving the standard equation for the mean number of photons detected per resolution element. The pinhole energy bandwidth is a contributing factor; this is dominated by the finite size of the source and may be considerable. The most common method for estimating the spatial resolution of such a system (quadrature addition) is, technically, mathematically invalid for this case. However, under the proper circumstances it may produce good estimates compared to a rigorous calculation based on the convolution of point-spread functions. Diffraction is an important contribution to the spatial resolution. Common approximations based on Fraunhofer (far-field) diffraction may be inappropriate and misleading, as the instrument may reside in multiple regimes depending upon its configuration or the energy of interest. It is crucial to identify the correct diffraction regime; Fraunhofer and Fresnel (near-field) diffraction profiles are substantially different, the latter being considerably wider. Finally, we combine the photonics and resolution analyses to derive an expression for the minimum signal level such that the resulting images are not dominated by photon statistics. This analysis is consistent with observed performance of the NIF MMI.
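The point about quadrature addition versus a rigorous convolution of point-spread functions can be illustrated numerically. In the hedged sketch below (pure NumPy, illustrative blur widths), two Gaussian contributions add exactly in quadrature, whereas a Gaussian combined with a top-hat (pinhole-like) blur does not.

```python
# Compare quadrature addition of resolution terms with a convolution of PSFs.
# Gaussian blurs add in quadrature exactly; a top-hat blur does not.
import numpy as np

x = np.linspace(-200.0, 200.0, 4001)   # micrometres, illustrative grid

def fwhm(profile):
    above = x[profile >= profile.max() / 2.0]
    return above[-1] - above[0]

def gaussian(w_fwhm):
    sigma = w_fwhm / 2.3548
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

def tophat(width):
    t = (np.abs(x) <= width / 2.0).astype(float)
    return t / t.sum()

# Case 1: two Gaussian contributions -> quadrature is exact.
g = np.convolve(gaussian(30.0), gaussian(40.0), mode="same")
print(fwhm(g), np.hypot(30.0, 40.0))          # ~50 vs 50

# Case 2: Gaussian detector blur combined with a top-hat pinhole projection.
h = np.convolve(gaussian(30.0), tophat(40.0), mode="same")
print(fwhm(h), np.hypot(30.0, 40.0))          # noticeably different
```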
Chiu, Grace S.; Wu, Margaret A.; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general. PMID:23785443
The Independent Technical Analysis Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.
2007-04-13
The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.
Statistics for NAEG: past efforts, new results, and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.
A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.
Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan
2015-01-01
Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is to justify a biowaiver for post-approval changes, which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to, or in many cases superior to, the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
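For context, the conventional f2 similarity factor that the proposed F2 parameter extends can be computed directly from two dissolution profiles measured at common time points; the profiles below are illustrative placeholders.

```python
# The conventional f2 similarity factor for dissolution profiles
# (the paper's Bayesian F2 parameter builds on this idea).
import numpy as np

reference = np.array([18.0, 35.0, 56.0, 74.0, 85.0, 92.0])  # % dissolved
test      = np.array([15.0, 32.0, 52.0, 70.0, 83.0, 90.0])

def f2(ref, tst):
    msd = np.mean((ref - tst) ** 2)                 # mean squared difference
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

print(round(f2(reference, test), 1))  # f2 >= 50 is usually read as "similar"
```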
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purposes of the verification project are to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
2014-06-01
lower density compared with aramid fibers such as Kevlar and Twaron. Numerical modeling is used to design more effective fiber-based composite armor...in measuring fibers and doing experiments. Aramid fibers such as Kevlar (DuPont) and Twaron...methyl methacrylate blocks. The efficacy of this method to grip Kevlar fibers has been rigorously studied using a variety of statistical methods at
2011-06-01
Committee Meeting. 23 June 2008. Bjorkman, Eileen A. and Frank B. Gray. “Testing in a Joint Environment 2004-2008: Findings, Conclusions and...the LVC joint test environment to evaluate system performance and joint mission effectiveness (Bjorkman and Gray 2009a). The LVC battlespace...attack (Bjorkman and Gray 2009b). [Figure 3: JTEM Methodology (Bjorkman 2008)] A key INTEGRAL FIRE lesson learned was realizing the need for each
NASA Astrophysics Data System (ADS)
Cenek, Martin; Dahl, Spencer K.
2016-11-01
Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
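A hedged sketch of the general idea, clustering per-time-step agent feature vectors into behavior patterns and estimating the probability of switching between patterns, is given below. It uses synthetic data and scikit-learn; it is not the authors' network-based pipeline.

```python
# Cluster per-step agent feature vectors into "behavior patterns", then
# estimate the probability of an agent switching between patterns.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_agents, n_steps, n_features, n_patterns = 50, 200, 3, 4

# features[t, a, :] could be speed, heading change, local density, etc.
features = rng.normal(size=(n_steps, n_agents, n_features))

labels = KMeans(n_clusters=n_patterns, n_init=10, random_state=0).fit_predict(
    features.reshape(-1, n_features)
).reshape(n_steps, n_agents)

# Count transitions between consecutive time steps and row-normalize.
transitions = np.zeros((n_patterns, n_patterns))
for t in range(n_steps - 1):
    np.add.at(transitions, (labels[t], labels[t + 1]), 1)
transition_probs = transitions / transitions.sum(axis=1, keepdims=True)
print(np.round(transition_probs, 2))
```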
NASA Astrophysics Data System (ADS)
Shearer, P.; Jawed, M. K.; Raines, J. M.; Lepri, S. T.; Gilbert, J. A.; von Steiger, R.; Zurbuchen, T.
2013-12-01
The SWICS instruments aboard ACE and Ulysses have performed in situ measurements of individual solar wind ions for a period spanning over two decades. Solar wind composition is determined by accumulating the measurements into an ion count histogram in which each species appears as a distinct peak. Assigning counts to the appropriate species is a challenging statistical problem because of the limited counts for some species and overlap between some peaks. We show that the most commonly used count assignment methods can suffer from significant bias when a highly abundant species overlaps with a much less abundant one. For ACE/SWICS data, this bias results in an overestimated Ne/O ratio. Bias is greatly reduced by switching to a rigorous maximum likelihood count assignment method, resulting in a 30-50% reduction in the estimated Ne abundance. We will discuss the new Ne/O values and put them in context with the solar system abundances for Ne derived from other techniques, such as in situ collection from Genesis and its heritage instrument, the Solar Foil experiment during the Apollo era. The new count assignment method is currently being applied to reanalyze the archived ACE and Ulysses data and obtain revised abundances of C, N, O, Ne, Mg, Si, S, and Fe, leading to revised datasets that will be made publicly available.
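The count-assignment problem can be illustrated with a hedged sketch: a Poisson maximum-likelihood fit of two overlapping Gaussian-shaped peaks, one abundant and one rare, to a synthetic count histogram. Peak positions, widths, and counts are placeholders, not SWICS instrument values.

```python
# Maximum-likelihood assignment of counts to two overlapping peaks, assuming
# known Gaussian peak shapes and Poisson counting statistics.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

bins = np.arange(0.0, 10.0, 0.1)
mu1, sig1 = 4.0, 0.5        # abundant species peak (assumed known shape)
mu2, sig2 = 5.0, 0.5        # rare species peak

def peak(mu, sig):
    p = np.exp(-0.5 * ((bins - mu) / sig) ** 2)
    return p / p.sum()

rng = np.random.default_rng(1)
true_n1, true_n2 = 50000.0, 1500.0
counts = rng.poisson(true_n1 * peak(mu1, sig1) + true_n2 * peak(mu2, sig2))

def neg_log_like(log_n):
    lam = np.exp(log_n[0]) * peak(mu1, sig1) + np.exp(log_n[1]) * peak(mu2, sig2)
    return -poisson.logpmf(counts, lam).sum()

fit = minimize(neg_log_like, x0=np.log([counts.sum(), counts.sum() * 0.1]),
               method="Nelder-Mead")
print(np.exp(fit.x))   # estimated counts per species; compare to (50000, 1500)
```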
The benefits of health information exchange: an updated systematic review.
Menachemi, Nir; Rahurkar, Saurabh; Harle, Christopher A; Vest, Joshua R
2018-04-28
Widespread health information exchange (HIE) is a national objective motivated by the promise of improved care and a reduction in costs. Previous reviews have found little rigorous evidence that HIE positively affects these anticipated benefits. However, early studies of HIE were methodologically limited. The purpose of the current study is to review the recent literature on the impact of HIE. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines to conduct our systematic review. PubMed and Scopus databases were used to identify empirical articles that evaluated HIE in the context of a health care outcome. Our search strategy identified 24 articles that included 63 individual analyses. The majority of the studies were from the United States, representing 9 states; about 40% of the included analyses occurred in a handful of HIEs from the state of New York. Seven of the 24 studies used designs suitable for causal inference and all reported some beneficial effect from HIE; none reported adverse effects. The current systematic review found that studies with more rigorous designs all reported benefits from HIE. Such benefits include fewer duplicated procedures, reduced imaging, lower costs, and improved patient safety. We also found that studies evaluating community HIEs were more likely to find benefits than studies that evaluated enterprise HIEs or vendor-mediated exchanges. Overall, these findings bode well for the ability of HIE to deliver on anticipated improvements in care delivery and reductions in costs.
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2015-08-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
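As a simplified stand-in for the sequential screening described above, the sketch below applies a Morris-style one-at-a-time elementary-effects screening to a toy model: parameters with negligible mean absolute effect are flagged as non-informative. This illustrates the screening idea only; it is not the authors' algorithm.

```python
# Elementary-effects screening on a toy model standing in for an expensive
# hydrologic model: only a few of the parameters actually matter.
import numpy as np

rng = np.random.default_rng(2)
n_params, n_trajectories, delta = 8, 20, 0.1

def model(p):                       # toy model: only p[0], p[2], p[5] matter
    return 3.0 * p[0] + p[2] ** 2 + np.sin(2 * np.pi * p[5])

effects = np.zeros((n_trajectories, n_params))
for t in range(n_trajectories):
    base = rng.uniform(0.0, 1.0 - delta, size=n_params)
    y0 = model(base)
    for i in range(n_params):       # perturb one parameter at a time
        stepped = base.copy()
        stepped[i] += delta
        effects[t, i] = (model(stepped) - y0) / delta

mu_star = np.abs(effects).mean(axis=0)   # mean absolute elementary effect
informative = np.where(mu_star > 0.1 * mu_star.max())[0]
print(mu_star.round(2), informative)     # expect indices 0, 2, 5
```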
NASA Astrophysics Data System (ADS)
Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2016-04-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
Hendrickson, Carolyn M; Dobbins, Sarah; Redick, Brittney J; Greenberg, Molly D; Calfee, Carolyn S; Cohen, Mitchell Jay
2015-09-01
Adherence to rigorous research protocols for identifying adult respiratory distress syndrome (ARDS) after trauma is variable. To examine how misclassification of ARDS may bias observational studies in trauma populations, we evaluated the agreement of two methods for adjudicating ARDS after trauma: the current gold standard, direct review of chest radiographs, and review of dictated radiology reports, a commonly used alternative. This nested cohort study included 123 mechanically ventilated patients between 2005 and 2008, with at least one PaO2/FIO2 less than 300 within the first 8 days of admission. Two blinded physician investigators adjudicated ARDS by two methods. The investigators directly reviewed all chest radiographs to evaluate for bilateral infiltrates. Several months later, blinded to their previous assessments, they adjudicated ARDS using a standardized rubric to classify radiology reports. A κ statistic was calculated. Regression analyses quantified the associations of established risk factors and important clinical outcomes with ARDS as determined by each method, as well as with hypoxemia as a surrogate marker. The κ was 0.47 for the observed agreement between ARDS adjudicated by direct review of chest radiographs and ARDS adjudicated by review of radiology reports. Both the magnitude and direction of bias on the estimates of association between ARDS and established risk factors as well as clinical outcomes varied by method of adjudication. Classification of ARDS by review of dictated radiology reports had only moderate agreement with the current gold standard, ARDS adjudicated by direct review of chest radiographs. While the misclassification of ARDS had varied effects on the estimates of associations with established risk factors, it tended to weaken the association of ARDS with important clinical outcomes. A standardized approach to ARDS adjudication after trauma by direct review of chest radiographs will minimize misclassification bias in future observational studies. Diagnostic study, level II.
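The agreement statistic used here, Cohen's κ, can be computed directly; the adjudication vectors below are illustrative, not the study data.

```python
# Agreement between two binary adjudication methods via Cohen's kappa.
from sklearn.metrics import cohen_kappa_score

ards_by_radiographs = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0]
ards_by_reports     = [1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0]

print(round(cohen_kappa_score(ards_by_radiographs, ards_by_reports), 2))
```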
Song, Dawei; Meng, Bin; Gan, Minfeng; Niu, Junjie; Li, Shiyan; Chen, Hao; Yuan, Chenxi; Yang, Huilin
2015-08-01
Percutaneous vertebroplasty (PVP) and balloon kyphoplasty (BKP) are minimally invasive and effective vertebral augmentation techniques for managing osteoporotic vertebral compression fractures (OVCFs). Recent meta-analyses have compared the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques or conservative treatment; however, the selection of included studies was not sufficiently thorough or rigorous, and the effects of each technique on the incidence of secondary vertebral fractures remain unclear. To perform an updated systematic review and meta-analysis, with more rigorous inclusion criteria, of the effects of vertebral augmentation techniques and conservative treatment for OVCF on the incidence of secondary vertebral fractures. PubMed, MEDLINE, EMBASE, SpringerLink, Web of Science, and the Cochrane Library database were searched for relevant original articles comparing the incidence of secondary vertebral fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. Randomized controlled trials (RCTs) and prospective non-randomized controlled trials (NRCTs) were identified. The methodological qualities of the studies were evaluated, relevant data were extracted and recorded, and an appropriate meta-analysis was conducted. A total of 13 articles were included. The pooled results from the included studies showed no statistically significant differences in the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques and conservative treatment. Subgroup analyses comparing different study designs, durations of symptoms, follow-up times, races of patients, and techniques were conducted, and no significant differences in the incidence of secondary fractures were identified (P > 0.05). No obvious publication bias was detected by either Begg's test (P = 0.360 > 0.05) or Egger's test (P = 0.373 > 0.05). Despite current thinking in the field that vertebral augmentation procedures may increase the incidence of secondary fractures, we found no differences in the incidence of secondary fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. © The Foundation Acta Radiologica 2014.
GIA Model Statistics for GRACE Hydrology, Cryosphere, and Ocean Science
NASA Astrophysics Data System (ADS)
Caron, L.; Ivins, E. R.; Larour, E.; Adhikari, S.; Nilsson, J.; Blewitt, G.
2018-03-01
We provide a new analysis of glacial isostatic adjustment (GIA) with the goal of assembling the model uncertainty statistics required for rigorously extracting trends in surface mass from the Gravity Recovery and Climate Experiment (GRACE) mission. Such statistics are essential for deciphering sea level, ocean mass, and hydrological changes because the latter signals can be relatively small (≤2 mm/yr water height equivalent) over very large regions, such as major ocean basins and watersheds. With abundant new >7 year continuous measurements of vertical land motion (VLM) reported by Global Positioning System stations on bedrock and new relative sea level records, our new statistical evaluation of GIA uncertainties incorporates Bayesian methodologies. A unique aspect of the method is that both the ice history and 1-D Earth structure vary through a total of 128,000 forward models. We find that best fit models poorly capture the statistical inferences needed to correctly invert for lower mantle viscosity and that GIA uncertainty exceeds the uncertainty ascribed to trends from 14 years of GRACE data in polar regions.
Output statistics of laser anemometers in sparsely seeded flows
NASA Technical Reports Server (NTRS)
Edwards, R. V.; Jensen, A. S.
1982-01-01
It is noted that until very recently, research on this topic concentrated on the particle arrival statistics and the influence of the optical parameters on them. Little attention has been paid to the influence of subsequent processing on the measurement statistics. There is also controversy over whether the effects of the particle statistics can be measured. It is shown here that some of the confusion derives from a lack of understanding of the experimental parameters that are to be controlled or known. A rigorous framework is presented for examining the measurement statistics of such systems. To provide examples, two problems are then addressed. The first has to do with a sample and hold processor, the second with what is called a saturable processor. The sample and hold processor converts the output to a continuous signal by holding the last reading until a new one is obtained. The saturable system is one where the maximum processable rate is set by the dead time of some unit in the system. At high particle rates, the processed rate is determined by the dead time.
Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.
Carmichael, Owen; Sakhanenko, Lyudmila
2015-05-15
We develop statistical methodology for HARDI, a popular brain imaging technique, based on the high-order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high-order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of deterministic tractography methods and delivers the same information as probabilistic tractography methods. Our method is computationally cheap and provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions, and tensors can be studied in a systematic and rigorous way.
Topological Isomorphisms of Human Brain and Financial Market Networks
Vértes, Petra E.; Nicol, Ruth M.; Chapman, Sandra C.; Watkins, Nicholas W.; Robertson, Duncan A.; Bullmore, Edward T.
2011-01-01
Although metaphorical and conceptual connections between the human brain and the financial markets have often been drawn, rigorous physical or mathematical underpinnings of this analogy remain largely unexplored. Here, we apply a statistical and graph theoretic approach to the study of two datasets – the time series of 90 stocks from the New York stock exchange over a 3-year period, and the fMRI-derived time series acquired from 90 brain regions over the course of a 10-min-long functional MRI scan of resting brain function in healthy volunteers. Despite the many obvious substantive differences between these two datasets, graphical analysis demonstrated striking commonalities in terms of global network topological properties. Both the human brain and the market networks were non-random, small-world, modular, hierarchical systems with fat-tailed degree distributions indicating the presence of highly connected hubs. These properties could not be trivially explained by the univariate time series statistics of stock price returns. This degree of topological isomorphism suggests that brains and markets can be regarded broadly as members of the same family of networks. The two systems, however, were not topologically identical. The financial market was more efficient and more modular – more highly optimized for information processing – than the brain networks; but also less robust to systemic disintegration as a result of hub deletion. We conclude that the conceptual connections between brains and markets are not merely metaphorical; rather these two information processing systems can be rigorously compared in the same mathematical language and turn out often to share important topological properties in common to some degree. There will be interesting scientific arbitrage opportunities in further work at the graph-theoretically mediated interface between systems neuroscience and the statistical physics of financial markets. PMID:22007161
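A hedged sketch of the kind of global topology metrics compared in this study is shown below, computed for a synthetic correlation-threshold network with NetworkX; with real data the graph would be built from thresholded stock-return or fMRI correlation matrices.

```python
# Global topology metrics for a correlation-threshold network.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
series = rng.normal(size=(90, 500))            # 90 nodes, 500 time points
corr = np.corrcoef(series)
np.fill_diagonal(corr, 0.0)

threshold = np.quantile(np.abs(corr), 0.9)     # keep the strongest ~10% of edges
G = nx.from_numpy_array((np.abs(corr) >= threshold).astype(int))

print("clustering coefficient:", round(nx.average_clustering(G), 3))
if nx.is_connected(G):
    print("characteristic path length:",
          round(nx.average_shortest_path_length(G), 3))
degrees = [d for _, d in G.degree()]
print("max degree (hub candidate):", max(degrees))
```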
Analyzing thematic maps and mapping for accuracy
Rosenfield, G.H.
1982-01-01
Two problems exist when attempting to test the accuracy of thematic maps and mapping: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both of these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table, sometimes called a classification error matrix. Usually the rows represent the interpretation and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors of commission, and the remaining elements of the columns represent errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification. The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by either the row totals or the column totals from the original classification error matrices. In hypothesis testing, when the results of tests of multiple sample cases prove to be significant, some form of statistical test must be used to separate any results that differ significantly from the others. In the past, many analyses of the data in this error matrix were made by comparing the relative magnitudes of the percentage of correct classifications, for either individual categories, the entire map, or both. More rigorous analyses have used data transformations and (or) two-way classification analysis of variance. A more sophisticated step in data analysis techniques would be to use the entire classification error matrices with the methods of discrete multivariate analysis or of multivariate analysis of variance.
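The accuracy estimates derived from a classification error matrix can be computed as follows; the counts are illustrative.

```python
# Accuracy estimates from an error matrix: rows = interpretation, columns =
# verification, diagonal = correct classifications.
import numpy as np

error_matrix = np.array([[50,  3,  2],
                         [ 5, 40,  5],
                         [ 2,  4, 39]])

overall_accuracy = np.trace(error_matrix) / error_matrix.sum()
users_accuracy = np.diag(error_matrix) / error_matrix.sum(axis=1)      # commission
producers_accuracy = np.diag(error_matrix) / error_matrix.sum(axis=0)  # omission

print(round(overall_accuracy, 3),
      users_accuracy.round(3),
      producers_accuracy.round(3))
```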
Statistical model of exotic rotational correlations in emergent space-time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogan, Craig; Kwon, Ohkyung; Richardson, Jonathan
2017-06-06
A statistical model is formulated to compute exotic rotational correlations that arise as inertial frames and causal structure emerge on large scales from entangled Planck scale quantum systems. Noncommutative quantum dynamics are represented by random transverse displacements that respect causal symmetry. Entanglement is represented by covariance of these displacements in Planck scale intervals defined by future null cones of events on an observer's world line. Light that propagates in a nonradial direction inherits a projected component of the exotic rotational correlation that accumulates as a random walk in phase. A calculation of the projection and accumulation leads to exact predictions for statistical properties of exotic Planck scale correlations in an interferometer of any configuration. The cross-covariance for two nearly co-located interferometers is shown to depart only slightly from the autocovariance. Specific examples are computed for configurations that approximate realistic experiments, and show that the model can be rigorously tested.
The log-periodic-AR(1)-GARCH(1,1) model for financial crashes
NASA Astrophysics Data System (ADS)
Gazola, L.; Fernandes, C.; Pizzinga, A.; Riera, R.
2008-02-01
This paper addresses recent calls for more rigorous statistical methodology within the econophysics literature. To this end, we consider an econometric approach to investigate the outcomes of the log-periodic model of price movements, which has been widely used to forecast financial crashes. In order to accomplish reliable statistical inference for the unknown parameters, we incorporate an autoregressive dynamic and a conditional heteroskedasticity structure in the error term of the original model, yielding the log-periodic-AR(1)-GARCH(1,1) model. Both the original and the extended models are fitted to financial indices of the U.S. market, namely the S&P500 and the NASDAQ. Our analyses reveal two main points: (i) the log-periodic-AR(1)-GARCH(1,1) model has residuals with better statistical properties and (ii) the estimation of the parameter concerning the time of the financial crash has been improved.
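A hedged sketch of the AR(1)-GARCH(1,1) error structure, fitted here with the `arch` package (assumed available) to placeholder returns, is shown below; in the paper this structure is attached to the residuals of the log-periodic price model, whose nonlinear fit is not shown.

```python
# AR(1) mean with GARCH(1,1) volatility fitted to placeholder return data.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(4)
returns = rng.standard_t(df=5, size=1500)     # placeholder heavy-tailed returns

am = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.params)     # AR(1) coefficient plus omega, alpha[1], beta[1]
```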
Robust Statistical Detection of Power-Law Cross-Correlation.
Blythe, Duncan A J; Nikulin, Vadim V; Müller, Klaus-Robert
2016-06-02
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
Statistical methods for thermonuclear reaction rates and nucleosynthesis simulations
NASA Astrophysics Data System (ADS)
Iliadis, Christian; Longland, Richard; Coc, Alain; Timmes, F. X.; Champagne, Art E.
2015-03-01
Rigorous statistical methods for estimating thermonuclear reaction rates and nucleosynthesis are becoming increasingly established in nuclear astrophysics. The main challenge being faced is that experimental reaction rates are highly complex quantities derived from a multitude of different measured nuclear parameters (e.g., astrophysical S-factors, resonance energies and strengths, particle and γ-ray partial widths). We discuss the application of the Monte Carlo method to two distinct, but related, questions. First, given a set of measured nuclear parameters, how can one best estimate the resulting thermonuclear reaction rates and associated uncertainties? Second, given a set of appropriate reaction rates, how can one best estimate the abundances from nucleosynthesis (i.e., reaction network) calculations? The techniques described here provide probability density functions that can be used to derive statistically meaningful reaction rates and final abundances for any desired coverage probability. Examples are given for applications to s-process neutron sources, core-collapse supernovae, classical novae, and Big Bang nucleosynthesis.
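The Monte Carlo idea can be sketched for a single narrow resonance: sample the measured inputs from their assumed distributions and read off percentiles of the resulting rate. The values below are illustrative, not evaluated nuclear inputs.

```python
# Monte Carlo propagation of input uncertainties into a narrow-resonance
# reaction rate (proportional to the resonance strength times a Boltzmann
# factor in the resonance energy).
import numpy as np

rng = np.random.default_rng(5)
n_samples = 100_000
kT = 8.6173e-11 * 0.5e9        # kT in MeV at T = 0.5 GK (k in MeV/K)

E_r = rng.normal(0.300, 0.005, n_samples)                   # energy [MeV], Gaussian
omega_gamma = 1.0e-6 * rng.lognormal(0.0, 0.2, n_samples)   # strength, lognormal factor

rate = omega_gamma * np.exp(-E_r / kT)                      # proportionality only

low, med, high = np.percentile(rate, [16, 50, 84])
print(med, (low, high))   # median rate with a 68% coverage interval
```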
Prediction of coefficients of thermal expansion for unidirectional composites
NASA Technical Reports Server (NTRS)
Bowles, David E.; Tompkins, Stephen S.
1989-01-01
Several analyses for predicting the longitudinal, alpha(1), and transverse, alpha(2), coefficients of thermal expansion of unidirectional composites were compared with each other, and with experimental data on different graphite fiber reinforced resin, metal, and ceramic matrix composites. Analytical and numerical analyses that accurately accounted for Poisson restraining effects in the transverse direction were in consistently better agreement with experimental data for alpha(2), than the less rigorous analyses. All of the analyses predicted similar values of alpha(1), and were in good agreement with the experimental data. A sensitivity analysis was conducted to determine the relative influence of constituent properties on the predicted values of alpha(1), and alpha(2). As would be expected, the prediction of alpha(1) was most sensitive to longitudinal fiber properties and the prediction of alpha(2) was most sensitive to matrix properties.
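For reference, simple micromechanics estimates of alpha(1) and alpha(2), a stiffness-weighted rule of mixtures and a Schapery-type transverse expression that includes Poisson restraint, can be written in a few lines; the constituent properties below are illustrative, not the measured values used in the comparisons.

```python
# Rule-of-mixtures longitudinal CTE and a Schapery-type transverse estimate
# for a unidirectional ply. Constituent properties are illustrative.
Ef, Em = 230e9, 3.5e9          # fiber / matrix moduli [Pa]
af, am = -0.5e-6, 55e-6        # fiber / matrix CTEs [1/K]
nuf, num = 0.20, 0.35          # Poisson's ratios
Vf = 0.60                      # fiber volume fraction
Vm = 1.0 - Vf

alpha1 = (Ef * Vf * af + Em * Vm * am) / (Ef * Vf + Em * Vm)
nu12 = nuf * Vf + num * Vm
alpha2 = (1 + nuf) * af * Vf + (1 + num) * am * Vm - alpha1 * nu12

print(alpha1, alpha2)   # longitudinal is fiber-dominated, transverse matrix-dominated
```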
Using cancer to make cellular reproduction rigorous and relevant
NASA Astrophysics Data System (ADS)
Duncan, Cynthia F.
The 1983 report A Nation at Risk highlighted the fact that test scores of American students were far below those of competing nations and that educational standards were being lowered. This trend has continued, and studies have also shown that students are not entering college ready for success. This trend can be reversed. Students can better understand and retain biology content expectations if they are taught in a way that is both rigorous and relevant. In the past, students have learned the details of cellular reproduction with little knowledge of why it is important to their everyday lives; the material is learned only for the test. Knowing the details of cellular reproduction is crucial for understanding cancer, a topic that will likely affect all of my students at some point in their lives. Students used hands-on activities, including simulations, labs, and models, to learn about cellular reproduction with cancer as a theme throughout. Students were challenged to learn how to use the rigorous biology content expectations to think about cancer, including stem cell research. Students who will one day be college students, voting citizens, and parents will become better learners. Students were assessed before and after the completion of the unit to determine whether learning occurred. Students did learn the material and became more critical thinkers. Statistical analysis was completed to ensure confidence in the results.
Effectiveness of Culturally Appropriate Adaptations to Juvenile Justice Services
Vergara, Andrew T.; Kathuria, Parul; Woodmass, Kyler; Janke, Robert; Wells, Susan J.
2017-01-01
Despite efforts to increase cultural competence of services within juvenile justice systems, disproportional minority contact (DMC) persists throughout Canada and the United States. Commonly cited approaches to decreasing DMC include large-scale systemic changes as well as enhancement of the cultural relevance and responsiveness of services delivered. Cultural adaptations to service delivery focus on prevention, decision-making, and treatment services to reduce initial contact, minimize unnecessary restraint, and reduce recidivism. Though locating rigorous testing of these approaches compared to standard interventions is difficult, this paper identifies and reports on such research. The Cochrane guidelines for systematic literature reviews and meta-analyses served as a foundation for study methodology. Databases such as Legal Periodicals and Books were searched through June 2015. Three studies were sufficiently rigorous to identify the effect of the cultural adaptations, and three studies that are making potentially important contributions to the field were also reviewed. PMID:29468092
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webster, Anthony J.; CCFE, Culham Science Centre, Abingdon OX14 3DB
2014-11-15
The generic question is considered: How can we determine the probability of an otherwise quasi-random event having been triggered by an external influence? A specific problem is the quantification of the success of techniques to trigger, and hence control, edge-localised plasma instabilities (ELMs) in magnetically confined fusion (MCF) experiments. The development of such techniques is essential to ensure tolerable heat loads on components in large MCF fusion devices, and is necessary for their development into economically successful power plants. Bayesian probability theory is used to rigorously formulate the problem and to provide a formal solution. Accurate but pragmatic methods are developed to estimate triggering probabilities, and are illustrated with experimental data. These allow results from experiments to be quantitatively assessed, and rigorously quantified conclusions to be formed. Example applications include assessing whether triggering of ELMs is a statistical or deterministic process, and the establishment of thresholds to ensure that ELMs are reliably triggered.
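A hedged sketch of this kind of inference: with a Beta prior, the posterior for the per-attempt triggering probability follows directly from the number of trigger attempts that were followed by an ELM within a short window, and can be compared against the coincidence probability implied by the natural ELM rate. The counts are illustrative and this is not the paper's exact formulation.

```python
# Beta-Binomial posterior for a triggering probability, compared with the
# chance-coincidence level implied by the natural ELM rate.
from scipy.stats import beta

n_attempts, n_followed = 120, 96            # illustrative experimental counts
natural_rate_hz, window_s = 30.0, 0.002     # natural ELM rate and trigger window

posterior = beta(1 + n_followed, 1 + n_attempts - n_followed)  # Beta(1,1) prior
p_chance = natural_rate_hz * window_s       # expected coincidences by chance

print("posterior mean:", round(posterior.mean(), 3))
print("95% credible interval:",
      [round(q, 3) for q in posterior.ppf([0.025, 0.975])])
print("P(trigger prob > chance level):", round(1 - posterior.cdf(p_chance), 3))
```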
Probability bounds analysis for nonlinear population ecology models.
Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A
2015-09-01
Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds though nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals. Copyright © 2015. Published by Elsevier Inc.
Li, Jia; Lam, Edmund Y
2014-04-21
Mask topography effects need to be taken into consideration for a more accurate solution of source mask optimization (SMO) in advanced optical lithography. However, rigorous 3D mask models generally involve intensive computation, and conventional SMO fails to manipulate the mask-induced undesired phase errors that degrade the usable depth of focus (uDOF) and process yield. In this work, an optimization approach incorporating pupil wavefront aberrations into the SMO procedure is developed as an alternative way to maximize the uDOF. We first design the pupil wavefront function by adding primary and secondary spherical aberrations through the coefficients of the Zernike polynomials, and then apply the conjugate gradient method to achieve an optimal source-mask pair under the condition of an aberrated pupil. We also use a statistical model to determine the Zernike coefficients for the phase control and adjustment. Rigorous simulations of thick masks show that this approach compensates for mask topography effects by improving the pattern fidelity and increasing the uDOF.
A framework for grouping nanoparticles based on their measurable characteristics.
Sayes, Christie M; Smith, P Alex; Ivanov, Ivan V
2013-01-01
There is a need to take a broader look at nanotoxicological studies. Eventually, the field will demand that some generalizations be made. To begin to address this issue, we posed a question: are metal colloids on the nanometer-size scale a homogeneous group? In general, most people can agree that the physicochemical properties of nanomaterials can be linked and related to their induced toxicological responses. The focus of this study was to determine how a set of selected physicochemical properties of five specific metal-based colloidal materials on the nanometer-size scale - silver, copper, nickel, iron, and zinc - could be used as nanodescriptors that facilitate the grouping of these metal-based colloids. The example of the framework pipeline processing provided in this paper shows the utility of specific statistical and pattern recognition techniques in grouping nanoparticles based on experimental data about their physicochemical properties. Interestingly, the results of the analyses suggest that a seemingly homogeneous group of nanoparticles could be separated into sub-groups depending on interdependencies observed in their nanodescriptors. These particles represent an important category of nanomaterials that are currently mass produced. Each has been reputed to induce toxicological and/or cytotoxicological effects. Here, we propose an experimental methodology coupled with mathematical and statistical modeling that can serve as a prototype for a rigorous framework that aids in the ability to group nanomaterials together and to facilitate the subsequent analysis of trends in data based on quantitative modeling of nanoparticle-specific structure-activity relationships. The computational part of the proposed framework is rather general and can be applied to other groups of nanomaterials as well.
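A minimal sketch of the grouping step, standardizing a descriptor matrix, reducing it with PCA, and clustering, is shown below with scikit-learn; the descriptor values are placeholders, not the measured properties reported in the study.

```python
# Group nanoparticles from a matrix of physicochemical descriptors.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

particles = ["Ag", "Cu", "Ni", "Fe", "Zn"]
# columns: size [nm], zeta potential [mV], dissolution [%], surface area [m2/g]
descriptors = np.array([[20.0, -25.0, 12.0, 30.0],
                        [25.0, -15.0, 30.0, 28.0],
                        [30.0, -10.0,  5.0, 20.0],
                        [40.0, -12.0,  4.0, 18.0],
                        [22.0, -18.0, 35.0, 26.0]])

z = StandardScaler().fit_transform(descriptors)
scores = PCA(n_components=2).fit_transform(z)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(dict(zip(particles, labels)))   # sub-groups suggested by the descriptors
```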
In situ antimicrobial behavior of materials with copper-based additives in a hospital environment.
Palza, Humberto; Nuñez, Mauricio; Bastías, Roberto; Delgado, Katherine
2018-06-01
Copper and its alloys are effective antimicrobial surface materials in the laboratory and in clinical trials. Copper has been used in the healthcare setting to reduce environmental contamination, and thus prevent healthcare-associated infections, complementing traditional protocols. The addition of copper nanoparticles to polymer/plastic matrices can also produce antimicrobial materials, as confirmed under laboratory conditions. However, there is a lack of studies validating the antimicrobial effects of these nanocomposite materials in clinical trials. To address this issue, plastic waiting room chairs with embedded metal copper nanoparticles, and metal hospital IV poles coated with an organic paint containing nanostructured zeolite/copper particles, were produced and tested in a hospital environment. These prototypes were sampled once weekly for 10 weeks and the viable microorganisms were analysed and compared with the copper-free materials. In the waiting rooms, chairs with copper reduced the total viable microorganisms present by around 73%, showing activity regardless of the microorganism tested. Although there were only low levels of microorganisms on the IV poles installed in operating rooms because of rigorous hygiene protocols, samples with copper presented lower total viable microorganisms than unfilled materials. Some results did not reach statistical significance because of the low load of microorganisms; however, for at least three weeks the IV poles with copper showed a statistically significant 50% reduction in microorganisms. These findings show for the first time the feasibility of utilizing the antimicrobial property of copper by adding nanosized fillers to other materials in a hospital environment. Copyright © 2018 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.
Yang, Harry; Novick, Steven; Burdick, Richard K
Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, the U.S. Food and Drug Administration (FDA) currently recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk, and approaches of varying statistical rigor are subsequently used for the three tiers of quality attributes. Key to the analyses of Tier 1 and Tier 2 quality attributes is the establishment of the equivalence acceptance criterion and the quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σR, where σR is the reference product variability estimated by the sample standard deviation SR from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄R ± K × σR, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of the test product must fall in the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation SR underestimates the true reference product variability σR. As a result, substituting SR for σR in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed. A biosimilar is a generic version of an original biological drug product. A key component of biosimilar development is the demonstration of analytical similarity between the biosimilar and the reference product. Such demonstration relies on the application of statistical methods to establish a similarity margin and an appropriate test for equivalence between the two products. This paper discusses statistical issues with the demonstration of analytical similarity and provides alternative approaches to potentially mitigate these problems. © PDA, Inc. 2016.
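A hedged sketch of a Tier 1-style equivalence assessment using two one-sided tests (TOST) with margin ±1.5·SR is given below; the lot values are placeholders, and the paper's point is precisely that SR can understate σR when lots are correlated.

```python
# Two one-sided tests (TOST) for equivalence of means with margin 1.5 * S_R.
import numpy as np
from scipy import stats

reference = np.array([100.2, 99.5, 101.1, 100.8, 99.9, 100.4, 98.9, 100.6])
test      = np.array([100.9, 101.4, 100.1, 101.8, 100.7, 101.2])

margin = 1.5 * reference.std(ddof=1)
diff = test.mean() - reference.mean()
se = np.sqrt(test.var(ddof=1) / len(test) + reference.var(ddof=1) / len(reference))
df = len(test) + len(reference) - 2          # simple (not Welch) approximation

t_lower = (diff + margin) / se               # H0: diff <= -margin
t_upper = (diff - margin) / se               # H0: diff >= +margin
p_lower = 1 - stats.t.cdf(t_lower, df)
p_upper = stats.t.cdf(t_upper, df)

print("difference:", round(diff, 2), "margin:", round(margin, 2))
print("TOST p-values:", round(p_lower, 4), round(p_upper, 4))
# Equivalence is concluded only if both one-sided p-values are below 0.05.
```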
Statistical Analysis of Protein Ensembles
NASA Astrophysics Data System (ADS)
Máté, Gabriell; Heermann, Dieter
2014-04-01
As 3D protein-configuration data piles up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of currently available methods rely heavily on heuristics. We propose an analysis framework that stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
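A rough sketch of the barcode-comparison idea, assuming the third-party Python packages ripser (persistent homology) and persim (diagram distances) are installed; the random point clouds stand in for real atomic coordinates, and the bottleneck distance is only one of several ways to compare barcodes, not necessarily the statistic used by the authors.

```python
import numpy as np
from ripser import ripser          # persistent homology ("barcode") computation
from persim import bottleneck      # distance between persistence diagrams

rng = np.random.default_rng(1)
conf_a = rng.normal(size=(120, 3))          # hypothetical conformation A (N x 3 coordinates)
conf_b = rng.normal(size=(120, 3)) * 1.3    # hypothetical conformation B

# Persistence diagrams: dgms[0] = connected components, dgms[1] = loops.
dgm_a = ripser(conf_a, maxdim=1)['dgms'][1]
dgm_b = ripser(conf_b, maxdim=1)['dgms'][1]

# One scalar summarising how different the sets of topological features are.
print("bottleneck distance (H1):", bottleneck(dgm_a, dgm_b))
```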
Hickey, Graeme L; Blackstone, Eugene H
2016-08-01
Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Escape rates over potential barriers: variational principles and the Hamilton-Jacobi equation
NASA Astrophysics Data System (ADS)
Cortés, Emilio; Espinosa, Francisco
We describe a rigorous formalism to study some extrema statistics problems, like maximum-probability events or escape-rate processes, by taking into account that the Hamilton-Jacobi equation completes, in a natural way, the required set of boundary conditions of the Euler-Lagrange equation for this kind of variational problem. We apply this approach to a one-dimensional stochastic process, driven by colored noise, for a double-parabola potential, where we have one stable and one unstable steady state.
Demodulation of messages received with low signal to noise ratio
NASA Astrophysics Data System (ADS)
Marguinaud, A.; Quignon, T.; Romann, B.
The implementation of this all-digital demodulator is derived from maximum-likelihood considerations applied to an analytical representation of the received signal. Traditional matched filters and phase-locked loops are replaced by minimum-variance estimators and hypothesis tests. These statistical tests become very simple when working on the phase signal. These methods, combined with rigorous control of the data representation, allow significant computation savings compared with conventional realizations. Nominal operation has been verified down to a signal-to-noise ratio of -3 dB on a QPSK demodulator.
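As an illustration of why the statistical tests become simple when working on the phase signal, the sketch below implements a plain maximum-likelihood QPSK symbol decision in the phase domain under additive white Gaussian noise; the constellation, noise level, and symbol mapping are illustrative assumptions, not the demodulator described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)
sent = rng.integers(0, 4, 1000)                                 # random QPSK symbol indices
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * sent))           # constellation points at pi/4 + k*pi/2
noise_std = 10 ** (3 / 20)                                      # total noise power 3 dB above signal power
received = symbols + noise_std / np.sqrt(2) * (rng.normal(size=1000) + 1j * rng.normal(size=1000))

# For equal-energy, equiprobable symbols in AWGN, the ML decision reduces to
# choosing the constellation point with the closest phase.
phase = np.angle(received)
decided = np.floor((phase - np.pi / 4) / (np.pi / 2) + 0.5).astype(int) % 4
print("symbol error rate:", np.mean(decided != sent))
```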
Rapid Creation and Quantitative Monitoring of High Coverage shRNA Libraries
Bassik, Michael C.; Lebbink, Robert Jan; Churchman, L. Stirling; Ingolia, Nicholas T.; Patena, Weronika; LeProust, Emily M.; Schuldiner, Maya; Weissman, Jonathan S.; McManus, Michael T.
2009-01-01
Short hairpin RNA (shRNA) libraries are limited by the low efficacy of many shRNAs, giving false negatives, and off-target effects, giving false positives. Here we present a strategy for rapidly creating expanded shRNA pools (∼30 shRNAs/gene) that are analyzed by deep-sequencing (EXPAND). This approach enables identification of multiple effective target-specific shRNAs from a complex pool, allowing a rigorous statistical evaluation of whether a gene is a true hit. PMID:19448642
Ray-optical theory of broadband partially coherent emission
NASA Astrophysics Data System (ADS)
Epstein, Ariel; Tessler, Nir; Einziger, Pinchas D.
2013-04-01
We present a rigorous formulation of the effects of spectral broadening on emission of partially coherent source ensembles embedded in multilayered formations with arbitrarily shaped interfaces, provided geometrical optics is valid. The resulting ray-optical theory, applicable to a variety of optical systems from terahertz lenses to photovoltaic cells, quantifies the fundamental interplay between bandwidth and layer dimensions, and sheds light on common practices in optical analysis of statistical fields, e.g., disregarding multiple reflections or neglecting interference cross terms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kupper, L.L.; Setzer, R.W.; Schwartzbaum, J.
1987-07-01
This document reports on a reevaluation of data obtained in a previous report on occupational factors associated with the development of malignant melanomas at Lawrence Livermore National Laboratory. The current report reduces the number of these factors from five to three based on a rigorous statistical analysis of the original data. Recommendations include restructuring the original questionnaire and trying to contact more individuals who worked with volatile photographic chemicals. 17 refs., 7 figs., 22 tabs. (TEM)
ERIC Educational Resources Information Center
Plonsky, Luke
2011-01-01
I began this study with two assumptions. Assumption 1: Study quality matters. If the means by which researchers design, carry out, and report on their studies lack rigor or transparency, theory and practice are likely to be misguided or at least decelerated. Assumption 2 is an implication of Assumption 1: Quality should be measured rather than…
Development of rigor mortis is not affected by muscle volume.
Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H
2001-04-01
There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.
Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I
2015-01-01
High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®
German translation, cultural adaptation, and validation of the Health Literacy Questionnaire (HLQ).
Nolte, Sandra; Osborne, Richard H; Dwinger, Sarah; Elsworth, Gerald R; Conrad, Melanie L; Rose, Matthias; Härter, Martin; Dirmaier, Jörg; Zill, Jördis M
2017-01-01
The Health Literacy Questionnaire (HLQ), developed in Australia in 2012 using a 'validity-driven' approach, has been rapidly adopted and is being applied in many countries and languages. It is a multidimensional measure comprising nine distinct domains that may be used for surveys, needs assessment, evaluation and outcomes assessment as well as for informing service improvement and the development of interventions. The aim of this paper is to describe the German translation of the HLQ and to present the results of the validation of the culturally adapted version. The HLQ comprises 44 items, which were translated and culturally adapted to the German context. This study uses data collected from a sample of 1,058 persons with chronic conditions. Statistical analyses include descriptive and confirmatory factor analyses. In one-factor congeneric models, all scales demonstrated good fit after a few model adjustments. In a single, highly restrictive nine-factor model (no cross-loadings, no correlated errors), replication of the original English-language version was achieved with fit indices and psychometric properties similar to the original HLQ. Reliability for all scales was excellent, with a Cronbach's alpha of at least 0.77. High to very high correlations between some HLQ factors were observed, suggesting that higher-order factors may be present. Our rigorous development and validation protocol, as well as strict adaptation processes, have generated a remarkable reproduction of the HLQ in German. The results of this validation provide evidence that the HLQ is robust and can be recommended for use in German-speaking populations. German Clinical Trial Registration (DRKS): DRKS00000584. Registered 23 March 2011.
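For readers unfamiliar with the reliability figure quoted above, this is a minimal sketch of Cronbach's alpha computed from item-level responses for a single scale; the data, scale size, and variable names are simulated for illustration and do not reproduce the HLQ analysis.

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2D array, rows = respondents, columns = items of one scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the scale total
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(500, 1))                       # simulated 'true' trait level
responses = latent + 0.8 * rng.normal(size=(500, 5))     # five noisy items of one hypothetical scale
print(round(cronbach_alpha(responses), 2))               # comfortably above 0.77 for this simulation
```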
Brabencová, Sylva; Ihnatová, Ivana; Potěšil, David; Fojtová, Miloslava; Fajkus, Jiří; Zdráhal, Zbyněk; Lochmanová, Gabriela
2017-01-01
Inter-individual variability of conspecific plants is governed by differences in their genetically determined growth and development traits, environmental conditions, and adaptive responses under epigenetic control involving histone post-translational modifications. The apparent variability in histone modifications among plants might be increased by technical variation introduced in sample processing during epigenetic analyses. Thus, to detect true variations in epigenetic histone patterns associated with given factors, the basal variability among samples that is not associated with them must be estimated. To improve knowledge of relative contribution of biological and technical variation, mass spectrometry was used to examine histone modification patterns (acetylation and methylation) among Arabidopsis thaliana plants of ecotypes Columbia 0 (Col-0) and Wassilewskija (Ws) homogenized by two techniques (grinding in a cryomill or with a mortar and pestle). We found little difference in histone modification profiles between the ecotypes. However, in comparison of the biological and technical components of variability, we found consistently higher inter-individual variability in histone mark levels among Ws plants than among Col-0 plants (grown from seeds collected either from single plants or sets of plants). Thus, more replicates of Ws would be needed for rigorous analysis of epigenetic marks. Regarding technical variability, the cryomill introduced detectably more heterogeneity in the data than the mortar and pestle treatment, but mass spectrometric analyses had minor apparent effects. Our study shows that it is essential to consider inter-sample variance and estimate suitable numbers of biological replicates for statistical analysis for each studied organism when investigating changes in epigenetic histone profiles. PMID:29270186
Reduced incidence of Prevotella and other fermenters in intestinal microflora of autistic children.
Kang, Dae-Wook; Park, Jin Gyoon; Ilhan, Zehra Esra; Wallstrom, Garrick; Labaer, Joshua; Adams, James B; Krajmalnik-Brown, Rosa
2013-01-01
High proportions of autistic children suffer from gastrointestinal (GI) disorders, implying a link between autism and abnormalities in gut microbial functions. Increasing evidence from recent high-throughput sequencing analyses indicates that disturbances in composition and diversity of gut microbiome are associated with various disease conditions. However, microbiome-level studies on autism are limited and mostly focused on pathogenic bacteria. Therefore, here we aimed to define systemic changes in gut microbiome associated with autism and autism-related GI problems. We recruited 20 neurotypical and 20 autistic children accompanied by a survey of both autistic severity and GI symptoms. By pyrosequencing the V2/V3 regions in bacterial 16S rDNA from fecal DNA samples, we compared gut microbiomes of GI symptom-free neurotypical children with those of autistic children mostly presenting GI symptoms. Unexpectedly, the presence of autistic symptoms, rather than the severity of GI symptoms, was associated with less diverse gut microbiomes. Further, rigorous statistical tests with multiple testing corrections showed significantly lower abundances of the genera Prevotella, Coprococcus, and unclassified Veillonellaceae in autistic samples. These are intriguingly versatile carbohydrate-degrading and/or fermenting bacteria, suggesting a potential influence of unusual diet patterns observed in autistic children. However, multivariate analyses showed that autism-related changes in both overall diversity and individual genus abundances were correlated with the presence of autistic symptoms but not with their diet patterns. Taken together, autism and accompanying GI symptoms were characterized by distinct and less diverse gut microbial compositions with lower levels of Prevotella, Coprococcus, and unclassified Veillonellaceae.
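A small sketch of the kind of per-genus testing with multiple-testing correction mentioned above, using a Mann-Whitney test and Benjamini-Hochberg false discovery rate control as generic stand-ins for the study's actual statistical procedures; the abundance tables and genus labels are simulated.

```python
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(4)
genera = [f"genus_{i}" for i in range(50)]                     # hypothetical genus labels
neurotypical = rng.lognormal(0.0, 1.0, size=(20, 50))          # simulated relative abundances (20 children)
autistic = rng.lognormal(0.0, 1.0, size=(20, 50))
autistic[:, :3] *= 0.3                                         # build in a few depleted genera

pvals = [mannwhitneyu(neurotypical[:, j], autistic[:, j]).pvalue for j in range(50)]
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for name, q, r in zip(genera, qvals, reject):
    if r:
        print(f"{name}: FDR-adjusted p = {q:.3g}")
```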
NASA Astrophysics Data System (ADS)
Brovelli, M. A.; Oxoli, D.; Zurbarán, M. A.
2016-06-01
During the past years, Web 2.0 technologies have led to the emergence of platforms where users can share data related to their activities, which in some cases are then publicly released with open licenses. Popular categories include community platforms where users can upload GPS tracks collected during slow travel activities (e.g. hiking, biking and horse riding) and platforms where users share their geolocated photos. However, due to the high heterogeneity of the information available on the Web, using this user-generated content alone makes it an ambitious challenge to understand slow mobility flows as well as to detect the most visited locations in a region. Exploiting the data available on community sharing websites makes it possible to collect near real-time open data streams and enables rigorous spatial-temporal analysis. This work presents an approach for collecting, unifying and analysing pointwise geolocated open data available from different sources with the aim of identifying the main locations and destinations of slow mobility activities. For this purpose, we collected pointwise open data from the Wikiloc platform, Twitter, Flickr and Foursquare. The analysis was confined to the data uploaded in the Lombardy Region (Northern Italy), corresponding to millions of pointwise data records. The collected data were processed using Free and Open Source Software (FOSS) in order to organize them into a suitable database. This allowed us to run statistical analyses on the data distribution in both time and space, enabling the detection of users' slow mobility preferences as well as places of interest at a regional scale.
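A compact sketch of one possible aggregation step: binning heterogeneous geolocated points onto a regular grid and ranking cells by visit counts. The bounding box, grid size, source names, and random points are illustrative assumptions, not the project's actual pipeline or data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
# Hypothetical pointwise records (lon, lat, source) roughly inside a Lombardy-like bounding box.
points = pd.DataFrame({
    "lon": rng.uniform(8.5, 11.4, 20000),
    "lat": rng.uniform(44.7, 46.6, 20000),
    "source": rng.choice(["wikiloc", "twitter", "flickr", "foursquare"], 20000),
})

cell = 0.05  # grid step in degrees (roughly 5 km in latitude; for illustration only)
points["cell_lon"] = (points["lon"] // cell) * cell
points["cell_lat"] = (points["lat"] // cell) * cell

# The ten most visited grid cells across all sources.
hotspots = (points.groupby(["cell_lat", "cell_lon"])
                  .size()
                  .sort_values(ascending=False)
                  .head(10))
print(hotspots)
```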
Consumption Patterns and Perception Analyses of Hangwa
Kwock, Chang Geun; Lee, Min A; Park, So Hyun
2012-01-01
Hangwa is a traditional Korean food that needs marketing strategies matched to current consumption trends in order to extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8∼23, 2009, and the data from 231 respondents were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'for presents' (39.8%) and the main reasons for buying it were its 'traditional image' (33.3%) and 'taste' (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were 'a sanitary process', 'a rigorous quality mark' and 'taste', which were related to product quality. In addition, those with high importance but low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of price'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote its consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price. PMID:24471065
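A short sketch of the quadrant step of an importance-performance analysis, using hypothetical attribute means; the cut-offs here are the grand means, which is one common convention but not necessarily the one used in the study.

```python
import pandas as pd

# Hypothetical mean importance/performance scores on a 5-point scale.
ipa = pd.DataFrame({
    "attribute": ["sanitary process", "quality mark", "taste", "advertisement", "reasonable price"],
    "importance": [4.6, 4.4, 4.5, 4.2, 4.3],
    "performance": [4.1, 4.0, 4.2, 2.9, 3.0],
})

imp_mid, perf_mid = ipa["importance"].mean(), ipa["performance"].mean()

def quadrant(row):
    # Classic IPA grid: split attributes by the importance and performance midpoints.
    if row.importance >= imp_mid:
        return "keep up the good work" if row.performance >= perf_mid else "concentrate here"
    return "possible overkill" if row.performance >= perf_mid else "low priority"

ipa["quadrant"] = ipa.apply(quadrant, axis=1)
print(ipa[["attribute", "quadrant"]])
```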
Protecting genomic data analytics in the cloud: state of the art and opportunities.
Tang, Haixu; Jiang, Xiaoqian; Wang, Xiaofeng; Wang, Shuang; Sofia, Heidi; Fox, Dov; Lauter, Kristin; Malin, Bradley; Telenti, Amalio; Xiong, Li; Ohno-Machado, Lucila
2016-10-13
The outsourcing of genomic data into public cloud computing settings raises concerns over privacy and security. Significant advancements in secure computation methods have emerged over the past several years, but such techniques need to be rigorously evaluated for their ability to support the analysis of human genomic data in an efficient and cost-effective manner. With respect to public cloud environments, there are concerns about the inadvertent exposure of human genomic data to unauthorized users. In analyses involving multiple institutions, there is additional concern about data being used beyond the agreed research scope and being processed in untrusted computational environments, which may not satisfy institutional policies. To systematically investigate these issues, the NIH-funded National Center for Biomedical Computing iDASH (integrating Data for Analysis, 'anonymization' and SHaring) hosted the second Critical Assessment of Data Privacy and Protection competition to assess the capacity of cryptographic technologies for protecting computation over human genomes in the cloud and promoting cross-institutional collaboration. Data scientists were challenged to design and engineer practical algorithms for secure outsourcing of genome computation tasks in working software, whereby analyses are performed only on encrypted data. They were also challenged to develop approaches to enable secure collaboration on data from genomic studies generated by multiple organizations (e.g., medical centers) to jointly compute aggregate statistics without sharing individual-level records. The results of the competition indicated that secure computation techniques can enable comparative analysis of human genomes, but greater efficiency (in terms of compute time and memory utilization) is needed before they are sufficiently practical for real-world environments.
Gray, Clark; Frankenberg, Elizabeth; Gillespie, Thomas; Sumantri, Cecep; Thomas, Duncan
2014-01-01
Understanding of human vulnerability to environmental change has advanced in recent years, but measuring vulnerability and interpreting mobility across many sites differentially affected by change remains a significant challenge. Drawing on longitudinal data collected on the same respondents who were living in coastal areas of Indonesia before the 2004 Indian Ocean tsunami and were re-interviewed after the tsunami, this paper illustrates how the combination of population-based survey methods, satellite imagery and multivariate statistical analyses has the potential to provide new insights into vulnerability, mobility and impacts of major disasters on population well-being. The data are used to map and analyze vulnerability to post-tsunami displacement across the provinces of Aceh and North Sumatra and to compare patterns of migration after the tsunami between damaged areas and areas not directly affected by the tsunami. The comparison reveals that migration after a disaster is less selective overall than migration in other contexts. Gender and age, for example, are strong predictors of moving from undamaged areas but are not related to displacement in areas experiencing damage. In our analyses traditional predictors of vulnerability do not always operate in expected directions. Low levels of socioeconomic status and education were not predictive of moving after the tsunami, although for those who did move, they were predictive of displacement to a camp rather than a private home. This survey-based approach, though not without difficulties, is broadly applicable to many topics in human-environment research, and potentially opens the door to rigorous testing of new hypotheses in this literature. PMID:24839300
Iterative categorization (IC): a systematic technique for analysing qualitative data.
Neale, Joanne
2016-06-01
The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
How Confident can we be in Flood Risk Assessments?
NASA Astrophysics Data System (ADS)
Merz, B.
2017-12-01
Flood risk management should be based on risk analyses quantifying the risk and its reduction for different risk reduction strategies. However, validating risk estimates by comparing model simulations with past observations is hardly possible, since the assessment typically encompasses extreme events and their impacts that have not been observed before. Hence, risk analyses are strongly based on assumptions and expert judgement. This situation opens the door for cognitive biases, such as 'illusion of certainty', 'overconfidence' or 'recency bias'. Such biases operate specifically in complex situations with many factors involved, when uncertainty is high and events are probabilistic, or when close learning feedback loops are missing - aspects that all apply to risk analyses. This contribution discusses how confident we can be in flood risk assessments, and reflects on more rigorous approaches towards their validation.
A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses
Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert
2011-01-01
Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325
Greaves, Paul; Clear, Andrew; Coutinho, Rita; Wilson, Andrew; Matthews, Janet; Owen, Andrew; Shanyinde, Milensu; Lister, T. Andrew; Calaminici, Maria; Gribben, John G.
2013-01-01
Purpose: The immune microenvironment is key to the pathophysiology of classical Hodgkin lymphoma (CHL). Twenty percent of patients experience failure of their initial treatment, and others receive excessively toxic treatment. Prognostic scores and biomarkers have yet to influence outcomes significantly. Previous biomarker studies have been limited by the extent of tissue analyzed, statistical inconsistencies, and failure to validate findings. We aimed to overcome these limitations by validating recently identified microenvironment biomarkers (CD68, FOXP3, and CD20) in a new patient cohort with a greater extent of tissue and by using rigorous statistical methodology. Patients and Methods: Diagnostic tissue from 122 patients with CHL was microarrayed and stained, and positive cells were counted across 10 to 20 high-powered fields per patient by using an automated system. Two statistical analyses were performed: a categorical analysis with test/validation set-defined cut points and Kaplan-Meier estimated outcome measures of 5-year overall survival (OS), disease-specific survival (DSS), and freedom from first-line treatment failure (FFTF), and an independent multivariate analysis of absolute uncategorized counts. Results: Increased CD20 expression confers superior OS. Increased FOXP3 expression confers superior OS, and increased CD68 confers inferior FFTF and OS. FOXP3 varies independently of CD68 expression and retains significance when analyzed as a continuous variable in multivariate analysis. A simple score combining FOXP3 and CD68 discriminates three groups: FFTF 93%, 62%, and 47% (P < .001); DSS 93%, 82%, and 63% (P = .03); and OS 93%, 82%, and 59% (P = .002). Conclusion: We have independently validated CD68, FOXP3, and CD20 as prognostic biomarkers in CHL, and we demonstrate, to the best of our knowledge for the first time, that combining FOXP3 and CD68 may further improve prognostic stratification. PMID:23045593
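A minimal sketch of the survival comparison underlying such biomarker groupings, assuming the third-party lifelines package is installed; follow-up times, event indicators, and the two biomarker groups are simulated, not the study's data, and Kaplan-Meier plus a log-rank test stand in for the fuller categorical and multivariate analyses described above.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(6)
# Hypothetical follow-up times (years) and event indicators for CD68-low vs CD68-high groups.
t_low, e_low = rng.exponential(12, 60), rng.integers(0, 2, 60)
t_high, e_high = rng.exponential(7, 62), rng.integers(0, 2, 62)

kmf = KaplanMeierFitter()
kmf.fit(t_low, event_observed=e_low, label="CD68 low")
print(kmf.survival_function_.tail())     # 5-year survival can be read off the estimated curve

result = logrank_test(t_low, t_high, event_observed_A=e_low, event_observed_B=e_high)
print("log-rank p-value:", result.p_value)
```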
Wang, An-Lu; Chen, Zhuo; Luo, Jing; Shang, Qing-Hua; Xu, Hao
2016-01-01
This systematic review evaluated the efficacy and safety of Chinese herbal medicines (CHMs) in patients with coronary heart disease (CHD) complicated with depression. All databases were searched up to September 30, 2014. Randomized controlled trials (RCTs) comparing CHMs with placebo or conventional Western medicine were retrieved. Data extraction, analyses and quality assessment were performed according to Cochrane standards. RevMan 5.3 was used to synthesize the results. Thirteen RCTs enrolling 1,095 patients were included. Subgroup analysis was used to assess the data. In reducing the degree of depression, CHMs showed no statistically significant difference at week 4 [mean difference (MD) = -1.06; 95% confidence interval (CI) -2.38 to 0.26; n=501; I(2)=73%], but a statistically significant difference at week 8 (MD = -1.00; 95% CI -1.64 to -0.36; n=436; I(2)=48%). Meanwhile, the combination therapy (CHMs together with antidepressants) showed statistically significant differences both at week 4 (MD = -1.99; 95% CI -3.80 to -0.18; n=90) and at week 8 (MD = -5.61; 95% CI -6.26 to -4.97; n=242; I(2)=87%). In CHD-related clinical evaluation, 3 trials reported that the intervention group was superior to the control group. Four trials reported fewer adverse events in the intervention group than in the control group. CHMs showed potential benefits for patients with CHD complicated with depression. Moreover, the effect of CHMs may be similar to or better than that of antidepressants in certain respects but with fewer side effects. However, because of the small sample sizes and potential bias of most trials, this result should be interpreted with caution. More rigorous trials with larger sample sizes and higher quality are warranted to provide high-quality evidence supporting the use of CHMs for CHD complicated with depression.
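A small sketch of the fixed-effect inverse-variance pooling and I² heterogeneity calculation behind summaries like those above, with invented per-trial mean differences and standard errors; RevMan's random-effects options and exact conventions are not reproduced here.

```python
import numpy as np

# Hypothetical per-trial mean differences in depression score and their standard errors.
md = np.array([-1.2, -0.2, -2.3, -0.9])
se = np.array([0.60, 0.50, 0.55, 0.50])

w = 1 / se**2                                   # fixed-effect inverse-variance weights
md_pooled = np.sum(w * md) / np.sum(w)
se_pooled = np.sqrt(1 / np.sum(w))
ci = (md_pooled - 1.96 * se_pooled, md_pooled + 1.96 * se_pooled)

Q = np.sum(w * (md - md_pooled) ** 2)           # Cochran's Q
df = len(md) - 1
I2 = max(0.0, (Q - df) / Q) * 100               # heterogeneity as a percentage

print(f"MD = {md_pooled:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}, I^2 = {I2:.0f}%")
```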
Preserving pre-rigor meat functionality for beef patty production.
Claus, J R; Sørheim, O
2006-06-01
Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.
[Comparative analysis of quality labels of health websites].
Padilla-Garrido, N; Aguado-Correa, F; Huelva-López, L; Ortega-Moreno, M
2016-01-01
The search for health-related information on the Internet is a growing phenomenon, but its main drawback is the lack of reliability of the information consulted. The aim of this study was to analyse and compare existing quality labels for health websites. A cross-sectional study was performed by searching Medline, IBECS, Google, and Yahoo, in both English and Spanish, between 8 and 9 March, 2015. Different keywords were used depending on whether the search was conducted in medical databases or generic search engines. The quality labels were classified according to their origin, analysing their character, year of implementation, the existence of an accreditation process, number of categories, criteria and standards, possibility of self-assessment, number of levels of certification, certification scope, validity, analytical quality of content, fee, results of the accreditation process, application and number of websites granted the seal, and quality labels obtained by the accrediting organisation. Seven quality labels, five of Spanish origin (WMA, PAWS, WIS, SEAFORMEC and M21) and two international ones (HONcode and Health Web Site Accreditation), were analysed. There was disparity in how the accreditation processes were carried out, with some not detailing key aspects of the process, or providing incomplete, outdated, or even inaccurate information. The most rigorous labels guaranteed a level of confidence in how websites handle information, but none verified the quality of the content itself. Although rigorous quality labels may become useful, the deficiencies in some of them cast doubt on their current usefulness. Copyright © 2015 SECA. Published by Elsevier España, S.L.U. All rights reserved.
Systematic reviews and meta-analyses on treatment of asthma: critical evaluation
Jadad, Alejandro R; Moher, Michael; Browman, George P; Booker, Lynda; Sigouin, Christopher; Fuentes, Mario; Stevens, Robert
2000-01-01
Objective To evaluate the clinical, methodological, and reporting aspects of systematic reviews and meta-analyses on the treatment of asthma and to compare those published by the Cochrane Collaboration with those published in paper based journals. Design Analysis of studies identified from Medline, CINAHL, HealthSTAR, EMBASE, Cochrane Library, personal collections, and reference lists. Studies Articles describing a systematic review or a meta-analysis of the treatment of asthma that were published as a full report, in any language or format, in a peer reviewed journal or the Cochrane Library. Main outcome measures General characteristics of studies reviewed and methodological characteristics (sources of articles; language restrictions; format, design, and publication status of studies included; type of data synthesis; and methodological quality). Results 50 systematic reviews and meta-analyses were included. More than half were published in the past two years. Twelve reviews were published in the Cochrane Library and 38 were published in 22 peer reviewed journals. Forced expiratory volume in one second was the most frequently used outcome, but few reviews evaluated the effect of treatment on costs or patient preferences. Forty reviews were judged to have serious or extensive flaws. All six reviews associated with industry were in this group. Seven of the 10 most rigorous reviews were published in the Cochrane Library. Conclusions Most reviews published in peer reviewed journals or funded by industry have serious methodological flaws that limit their value to guide decisions. Cochrane reviews are more rigorous and better reported than those published in peer reviewed journals. PMID:10688558
NASA Technical Reports Server (NTRS)
Pulkkinen, A.; Rastaetter, L.; Kuznetsova, M.; Singer, H.; Balch, C.; Weimer, D.; Toth, G.; Ridley, A.; Gombosi, T.; Wiltberger, M.;
2013-01-01
In this paper we continue the community-wide rigorous modern space weather model validation efforts carried out within the GEM, CEDAR and SHINE programs. In this particular effort, in coordination among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), modelers, and the science community, we focus on studying the models' capability to reproduce observed ground magnetic field fluctuations, which are closely related to the geomagnetically induced current phenomenon. One of the primary motivations of the work is to support NOAA SWPC in their selection of the next numerical model that will be transitioned into operations. Six geomagnetic events and 12 geomagnetic observatories were selected for validation. While modeled and observed magnetic field time series are available for all 12 stations, the primary metrics analysis is based on six stations that were selected to represent high-latitude and mid-latitude locations. Event-based analysis and the corresponding contingency tables were built for each event and each station. The elements in the contingency table were then used to calculate Probability of Detection (POD), Probability of False Detection (POFD) and Heidke Skill Score (HSS) for rigorous quantification of the models' performance. In this paper the summary results of the metrics analyses are reported in terms of POD, POFD and HSS. More detailed analyses can be carried out using the event-by-event contingency tables provided as an online appendix. An online interface built at CCMC and described in the supporting information is also available for more detailed time series analyses.
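A minimal sketch of the three skill metrics computed from a 2x2 contingency table of hits, misses, false alarms and correct negatives; the counts are hypothetical, not taken from the paper's station-by-station tables.

```python
def skill_scores(hits, misses, false_alarms, correct_negatives):
    a, c, b, d = hits, misses, false_alarms, correct_negatives
    pod = a / (a + c)                     # Probability of Detection
    pofd = b / (b + d)                    # Probability of False Detection
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))  # Heidke Skill Score
    return pod, pofd, hss

# Hypothetical event counts for one station/event pair.
pod, pofd, hss = skill_scores(hits=34, misses=11, false_alarms=9, correct_negatives=146)
print(f"POD={pod:.2f}  POFD={pofd:.2f}  HSS={hss:.2f}")
```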
ERIC Educational Resources Information Center
Pearce, Jone L.
2016-01-01
Arbaugh, Fornaciari, and Hwang (2016) are to be commended for undertaking two worthy tasks: demonstrating the value of citation counts in the business and management education (BME) field and attracting new scholars to the field by drawing on rigorous citation analyses. In this commentary, Jone Pearce first addresses the use of citation counts in…
Machkovech, Heather M.; Bedford, Trevor; Suchard, Marc A.
2015-01-01
Numerous experimental studies have demonstrated that CD8+ T cells contribute to immunity against influenza by limiting viral replication. It is therefore surprising that rigorous statistical tests have failed to find evidence of positive selection in the epitopes targeted by CD8+ T cells. Here we use a novel computational approach to test for selection in CD8+ T-cell epitopes. We define all epitopes in the nucleoprotein (NP) and matrix protein (M1) with experimentally identified human CD8+ T-cell responses and then compare the evolution of these epitopes in parallel lineages of human and swine influenza viruses that have been diverging since roughly 1918. We find a significant enrichment of substitutions that alter human CD8+ T-cell epitopes in NP of human versus swine influenza virus, consistent with the idea that these epitopes are under positive selection. Furthermore, we show that epitope-altering substitutions in human influenza virus NP are enriched on the trunk versus the branches of the phylogenetic tree, indicating that viruses that acquire these mutations have a selective advantage. However, even in human influenza virus NP, sites in T-cell epitopes evolve more slowly than do nonepitope sites, presumably because these epitopes are under stronger inherent functional constraint. Overall, our work demonstrates that there is clear selection from CD8+ T cells in human influenza virus NP and illustrates how comparative analyses of viral lineages from different hosts can identify positive selection that is otherwise obscured by strong functional constraint. Importance: There is a strong interest in correlates of anti-influenza immunity that are protective against diverse virus strains. CD8+ T cells provide such broad immunity, since they target conserved viral proteins. An important question is whether T-cell immunity is sufficiently strong to drive influenza virus evolution. Although many studies have shown that T cells limit viral replication in animal models and are associated with decreased symptoms in humans, no studies have proven with statistical significance that influenza virus evolves under positive selection to escape T cells. Here we use comparisons of human and swine influenza viruses to rigorously demonstrate that human influenza virus evolves under pressure to fix mutations in the nucleoprotein that promote escape from T cells. We further show that viruses with these mutations have a selective advantage since they are preferentially located on the “trunk” of the phylogenetic tree. Overall, our results show that CD8+ T cells targeting nucleoprotein play an important role in shaping influenza virus evolution. PMID:26311880
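As an illustration of the kind of enrichment comparison described, a sketch of a one-sided Fisher's exact test on a 2x2 table of epitope-altering versus other substitutions in the two lineages; the counts are invented and the paper's own test statistic may differ.

```python
from scipy.stats import fisher_exact

# Hypothetical substitution counts (not the paper's actual numbers):
#                  epitope-altering   non-epitope
human_lineage = [25, 80]
swine_lineage = [10, 95]

# One-sided test: are epitope-altering substitutions enriched in the human lineage?
odds_ratio, p_value = fisher_exact([human_lineage, swine_lineage], alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.3g}")
```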
DOE Office of Scientific and Technical Information (OSTI.GOV)
More, R.M.
A new statistical model (the quantum-statistical model (QSM)) was recently introduced by Kalitkin and Kuzmina for the calculation of thermodynamic properties of compressed matter. This paper examines the QSM and gives (i) a numerical QSM calculation of pressure and energy for aluminum and comparison to existing augmented-plane-wave data; (ii) display of separate kinetic, exchange, and quantum pressure terms; (iii) a study of electron density at the nucleus; (iv) a study of the effects of the Kirzhnitz-Weizsacker parameter controlling the gradient terms; (v) an analytic expansion for very high densities; and (vi) rigorous pressure theorems including a general version of the virial theorem which applies to an arbitrary microscopic volume. It is concluded that the QSM represents the most accurate and consistent theory of the Thomas-Fermi type.
Do climate extreme events foster violent civil conflicts? A coincidence analysis
NASA Astrophysics Data System (ADS)
Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.
2014-05-01
Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and might help to identify hot-spot regions for potential climate-triggered violent social conflicts.
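A toy sketch of a coincidence-rate test of the general sort described: count conflict onsets preceded by a climate extreme within a short window and compare against a permutation null obtained by placing the extremes at random. The window length, event counts, and significance recipe are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

rng = np.random.default_rng(7)
months = 240                                                 # 20 years of monthly data
extremes = np.sort(rng.choice(months, 30, replace=False))    # hypothetical extreme-event months
conflicts = np.sort(rng.choice(months, 25, replace=False))   # hypothetical conflict-onset months

def coincidence_rate(extremes, conflicts, window=2):
    """Fraction of conflicts preceded by an extreme within `window` months."""
    hits = sum(np.any((c - extremes >= 0) & (c - extremes <= window)) for c in conflicts)
    return hits / len(conflicts)

observed = coincidence_rate(extremes, conflicts)

# Null distribution: keep conflict dates fixed, place the same number of extremes at random.
null = np.array([
    coincidence_rate(np.sort(rng.choice(months, len(extremes), replace=False)), conflicts)
    for _ in range(2000)
])
p_value = np.mean(null >= observed)
print(f"observed rate = {observed:.2f}, permutation p = {p_value:.3f}")
```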
NASA Astrophysics Data System (ADS)
Kushnir, A. F.; Troitsky, E. V.; Haikin, L. M.; Dainty, A.
1999-06-01
A semi-automatic procedure has been developed to achieve statistically optimum discrimination between earthquakes and explosions at local or regional distances based on a learning set specific to a given region. The method is used for step-by-step testing of candidate discrimination features to find the optimum (combination) subset of features, with the decision taken on a rigorous statistical basis. Linear (LDF) and Quadratic (QDF) Discriminant Functions based on Gaussian distributions of the discrimination features are implemented and statistically grounded; the features may be transformed by the Box-Cox transformation z = (y^α − 1)/α to make them more Gaussian. Tests of the method were successfully conducted on seismograms from the Israel Seismic Network using features consisting of spectral ratios between and within phases. Results showed that the QDF was more effective than the LDF and required five features out of 18 candidates for the optimum set. It was found that discrimination improved with increasing distance within the local range, and that eliminating transformation of the features and failing to correct for noise led to degradation of discrimination.
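A compact sketch of that discriminant pipeline: Box-Cox-transform positive spectral-ratio features toward Gaussianity, then fit linear and quadratic discriminant functions. The features are simulated, and scikit-learn's implementations stand in for the authors' LDF/QDF code; for simplicity the transform is fitted on all data rather than inside the cross-validation loop.

```python
import numpy as np
from scipy.stats import boxcox
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
# Simulated positive spectral-ratio features for 120 earthquakes and 120 explosions.
eq = rng.lognormal(mean=0.0, sigma=0.5, size=(120, 5))
ex = rng.lognormal(mean=0.4, sigma=0.5, size=(120, 5))
X_raw = np.vstack([eq, ex])
y = np.array([0] * 120 + [1] * 120)

# Box-Cox each feature toward Gaussianity: z = (y**lam - 1) / lam, with lam estimated per feature.
X = np.column_stack([boxcox(X_raw[:, j])[0] for j in range(X_raw.shape[1])])

for name, clf in [("LDF", LinearDiscriminantAnalysis()), ("QDF", QuadraticDiscriminantAnalysis())]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.2f}")
```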
Statistical moments of the Strehl ratio
NASA Astrophysics Data System (ADS)
Yaitskova, Natalia; Esselborn, Michael; Gladysz, Szymon
2012-07-01
Knowledge of the statistical characteristics of the Strehl ratio is essential for the performance assessment of existing and future adaptive optics systems. For a full assessment, not only the mean value of the Strehl ratio but also higher statistical moments are important. Variance is related to the stability of an image, and skewness reflects the chance of having, in a set of short-exposure images, more or fewer images whose quality exceeds the mean. Skewness is a central parameter in the domain of lucky imaging. We present a rigorous theory for the calculation of the mean value, the variance and the skewness of the Strehl ratio. In our approach we represent the residual wavefront as being formed by independent cells. The level of the adaptive optics correction defines the number of cells and the variance of the cells, which are the two main parameters of our theory. The deliverables are the values of the three moments as functions of the correction level. We make no further assumptions except for the statistical independence of the cells.
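A Monte Carlo sketch of the independent-cell picture, not the paper's analytical derivation: the residual wavefront is modelled as N independent phase cells, the instantaneous Strehl ratio is approximated by |mean(exp(i phi))|^2, and the first three moments are estimated over many short exposures. The cell count and per-cell phase variance are illustrative parameters.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(9)
n_cells = 50          # number of independent wavefront cells (sets the correction level)
sigma_phi = 0.8       # per-cell residual phase standard deviation in radians
n_frames = 20000      # number of short-exposure realisations

phases = rng.normal(0.0, sigma_phi, size=(n_frames, n_cells))
strehl = np.abs(np.exp(1j * phases).mean(axis=1)) ** 2   # instantaneous Strehl per frame

print(f"mean = {strehl.mean():.3f}, variance = {strehl.var():.4f}, skewness = {skew(strehl):.2f}")
```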
How Do You Determine Whether The Earth Is Warming Up?
NASA Astrophysics Data System (ADS)
Restrepo, J. M.; Comeau, D.; Flaschka, H.
2012-12-01
How does one determine whether the extreme summer temperatures in the Northeast of the US, or in Moscow during the summer of 2010, were an extreme weather fluctuation or the result of a systematic global climate warming trend? It is only under exceptional circumstances that one can determine whether an observational climate signal belongs to a particular statistical distribution. In fact, observed climate signals are rarely "statistical" and thus there is usually no way to rigorously obtain enough field data to produce a trend or tendency based upon data alone. Furthermore, this type of data is often multi-scale. We propose a trend or tendency methodology that does not make use of parametric or statistical assumptions. The most important feature of this trend strategy is that it is defined in very precise mathematical terms. The tendency is easily understood and practical, and its algorithmic realization is fairly robust. In addition to proposing a trend, the methodology can be adapted to generate surrogate statistical models, useful in reduced filtering schemes of time-dependent processes.
Tan, Ming T; Liu, Jian-ping; Lao, Lixing
2012-08-01
Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
Statistical modeling of natural backgrounds in hyperspectral LWIR data
NASA Astrophysics Data System (ADS)
Truslow, Eric; Manolakis, Dimitris; Cooley, Thomas; Meola, Joseph
2016-09-01
Hyperspectral sensors operating in the long wave infrared (LWIR) have a wealth of applications including remote material identification and rare target detection. While statistical models for modeling surface reflectance in visible and near-infrared regimes have been well studied, models for the temperature and emissivity in the LWIR have not been rigorously investigated. In this paper, we investigate modeling hyperspectral LWIR data using a statistical mixture model for the emissivity and surface temperature. Statistical models for the surface parameters can be used to simulate surface radiances and at-sensor radiance which drives the variability of measured radiance and ultimately the performance of signal processing algorithms. Thus, having models that adequately capture data variation is extremely important for studying performance trades. The purpose of this paper is twofold. First, we study the validity of this model using real hyperspectral data, and compare the relative variability of hyperspectral data in the LWIR and visible and near-infrared (VNIR) regimes. Second, we illustrate how materials that are easily distinguished in the VNIR, may be difficult to separate when imaged in the LWIR.
NASA Astrophysics Data System (ADS)
Bovier, Anton
2006-06-01
Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. The book offers a comprehensive introduction to an active and fascinating area of research, with a clear exposition that builds to the state of the art in the mathematics of spin glasses, written by a well-known and active researcher in the field.
Student peer assessment in evidence-based medicine (EBM) searching skills training: an experiment
Eldredge, Jonathan D.; Bear, David G.; Wayne, Sharon J.; Perea, Paul P.
2013-01-01
Background: Student peer assessment (SPA) has been used intermittently in medical education for more than four decades, particularly in connection with skills training. SPA generally has not been rigorously tested, so medical educators have limited evidence about SPA effectiveness. Methods: Experimental design: Seventy-one first-year medical students were stratified by previous test scores into problem-based learning tutorial groups, and then these assigned groups were randomized further into intervention and control groups. All students received evidence-based medicine (EBM) training. Only the intervention group members received SPA training, practice with assessment rubrics, and then application of anonymous SPA to assignments submitted by other members of the intervention group. Results: Students in the intervention group had higher mean scores on the formative test with a potential maximum score of 49 points than did students in the control group, 45.7 and 43.5, respectively (P = 0.06). Conclusions: SPA training and the application of these skills by the intervention group resulted in higher scores on formative tests compared to those in the control group, a difference approaching statistical significance. The extra effort expended by librarians, other personnel, and medical students must be factored into the decision to use SPA in any specific educational context. Implications: SPA has not been rigorously tested, particularly in medical education. Future, similarly rigorous studies could further validate use of SPA so that librarians can optimally make use of limited contact time for information skills training in medical school curricula. PMID:24163593
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical exercise aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
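A minimal sketch of the solution-verification step via Richardson extrapolation on three grids with refinement ratio 2: it estimates the observed order of accuracy and the discretisation error of the finest-grid solution. The grid values are hypothetical, not GBS output.

```python
import math

def observed_order(f_h, f_2h, f_4h, ratio=2.0):
    """Observed order of accuracy from solutions on three grids (h, 2h, 4h)."""
    return math.log(abs(f_4h - f_2h) / abs(f_2h - f_h)) / math.log(ratio)

def richardson_error(f_h, f_2h, p, ratio=2.0):
    """Estimated discretisation error of the finest-grid solution."""
    return (f_h - f_2h) / (ratio**p - 1)

# Hypothetical values of some scalar output computed on three successively refined grids.
f_h, f_2h, f_4h = 1.0012, 1.0050, 1.0205
p = observed_order(f_h, f_2h, f_4h)
print(f"observed order ~ {p:.2f}, finest-grid error estimate ~ {richardson_error(f_h, f_2h, p):.2e}")
```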
Facial patterns in a tropical social wasp correlate with colony membership
NASA Astrophysics Data System (ADS)
Baracchi, David; Turillazzi, Stefano; Chittka, Lars
2016-10-01
Social insects excel in discriminating nestmates from intruders, typically relying on colony odours. Remarkably, some wasp species achieve such discrimination using visual information. However, while it is universally accepted that odours mediate group-level recognition, the ability to recognise colony members visually has been considered possible only via individual recognition by which wasps discriminate 'friends' and 'foes'. Using geometric morphometric analysis, which is a technique based on a rigorous statistical theory of shape allowing quantitative multivariate analyses on structure shapes, we first quantified facial marking variation of Liostenogaster flavolineata wasps. We then compared this facial variation with that of chemical profiles (generated by cuticular hydrocarbons) within and between colonies. Principal component analysis and discriminant analysis applied to sets of variables containing pure shape information showed that despite appreciable intra-colony variation, the faces of females belonging to the same colony resemble one another more than those of outsiders. This colony-specific variation in facial patterns was on a par with that observed for odours. While the occurrence of face discrimination at the colony level remains to be tested by behavioural experiments, overall our results suggest that, in this species, wasp faces display adequate information that might be potentially perceived and used by wasps for colony-level recognition.
Cardiff, Robert D; Hubbard, Neil E; Engelberg, Jesse A; Munn, Robert J; Miller, Claramae H; Walls, Judith E; Chen, Jane Q; Velásquez-García, Héctor A; Galvez, Jose J; Bell, Katie J; Beckett, Laurel A; Li, Yue-Ju; Borowsky, Alexander D
2013-01-01
Quantitative Image Analysis (QIA) of digitized whole slide images for morphometric parameters and immunohistochemistry of breast cancer antigens was used to evaluate the technical reproducibility, biological variability, and intratumoral heterogeneity in three transplantable mouse mammary tumor models of human breast cancer. The relative preservation of structure and immunogenicity of the three mouse models and three human breast cancers was also compared when fixed with representatives of four distinct classes of fixatives. The three mouse mammary tumor cell models were an ER+/PR+ model (SSM2), a Her2+ model (NDL), and a triple negative model (MET1). The four breast cancer antigens were ER, PR, Her2, and Ki67. The fixatives included examples of (1) strong cross-linkers, (2) weak cross-linkers, (3) coagulants, and (4) combination fixatives. Each parameter was quantitatively analyzed using modified Aperio Technologies ImageScope algorithms. Careful pre-analytical adjustments to the algorithms were required to provide accurate results. The QIA permitted rigorous statistical analysis of results and grading by rank order. The analyses suggested excellent technical reproducibility and confirmed biological heterogeneity within each tumor. The strong cross-linker fixatives, such as formalin, consistently ranked higher than weak cross-linker, coagulant and combination fixatives in both the morphometric and immunohistochemical parameters. PMID:23399853
Promoting the Multidimensional Character of Scientific Reasoning.
Bradshaw, William S; Nelson, Jennifer; Adams, Byron J; Bell, John D
2017-04-01
This study reports part of a long-term program to help students improve scientific reasoning using higher-order cognitive tasks set in the discipline of cell biology. This skill was assessed using problems requiring the construction of valid conclusions drawn from authentic research data. We report here efforts to confirm the hypothesis that data interpretation is a complex, multifaceted exercise. Confirmation was obtained using a statistical treatment showing that various such problems rank students differently; each contains a unique set of cognitive challenges. Additional analyses of performance results have allowed us to demonstrate that individuals differ in their capacity to navigate five independent generic elements that constitute successful data interpretation: biological context, connection to course concepts, experimental protocols, data inference, and integration of isolated experimental observations into a coherent model. We offer these aspects of scientific thinking as a "data analysis skills inventory," along with usable sample problems that illustrate each element. Additionally, we show that this kind of reasoning is rigorous in that it is difficult for most novice students, who are unable to intuitively implement strategies for improving these skills. Instructors armed with knowledge of the specific challenges presented by different types of problems can provide specific helpful feedback during formative practice. The use of this instructional model is most likely to require changes in traditional classroom instruction.
Patel, Tulpesh; Blyth, Jacqueline C.; Griffiths, Gareth; Kelly, Deirdre; Talcott, Joel B.
2014-01-01
Background: Proton Magnetic Resonance Spectroscopy (1H-MRS) is a non-invasive imaging technique that enables quantification of neurochemistry in vivo and thereby facilitates investigation of the biochemical underpinnings of human cognitive variability. Studies in the field of cognitive spectroscopy have commonly focused on relationships between measures of N-acetyl aspartate (NAA), a surrogate marker of neuronal health and function, and broad measures of cognitive performance, such as IQ. Methodology/Principal Findings: In this study, we used 1H-MRS to interrogate single voxels in occipitoparietal and frontal cortex, in parallel with assessments of psychometric intelligence, in a sample of 40 healthy adult participants. We found correlations between NAA and IQ that were within the range reported in previous studies. However, the magnitude of these effects was significantly modulated by the stringency of data screening and the extent to which outlying values contributed to statistical analyses. Conclusions/Significance: 1H-MRS offers a sensitive tool for assessing neurochemistry non-invasively, yet the relationships between brain metabolites and broad aspects of human behavior such as IQ are subtle. We highlight the need to develop an increasingly rigorous analytical and interpretive framework for collecting and reporting data obtained from cognitive spectroscopy studies of this kind. PMID:24592224
Breast Cancer Status in Iran: Statistical Analysis of 3010 Cases between 1998 and 2014
Akbari, Mohammad Esmaeil; Sayad, Saed; Khayamzadeh, Maryam; Shojaee, Leila; Shormeji, Zeynab; Amiri, Mojtaba
2017-01-01
Background: Breast cancer is the 5th leading cause of cancer death in Iranian women. This study analyzed 3010 women with breast cancer who had been referred to a cancer research center in Tehran between 1998 and 2014. Methods: In this retrospective study, we analyzed 3010 breast cancer cases with 32 clinical and paraclinical attributes. We checked the data quality rigorously and removed any invalid values or records. The method was data mining (problem definition, data preparation, data exploration, modeling, evaluation, and deployment); however, only descriptive results for these variables are presented in this article. To our knowledge, this is the most comprehensive study of breast cancer status in Iran. Results: A typical Iranian breast cancer patient was a 40–50-year-old married woman with two children, who had a high school diploma and no history of abortion, smoking, or diabetes. Most patients were estrogen and progesterone receptor positive, human epidermal growth factor receptor 2 (HER2) negative, and P53 negative. Most cases were detected in stage 2 with intermediate grade. Conclusion: This study revealed original findings that can be used in national policymaking to find the best early detection method and to improve care quality and breast cancer prevention in Iran. PMID:29201466
Changing Work and Work-Family Conflict: Evidence from the Work, Family, and Health Network*
Kelly, Erin L.; Moen, Phyllis; Oakes, J. Michael; Fan, Wen; Okechukwu, Cassandra; Davis, Kelly D.; Hammer, Leslie; Kossek, Ellen; King, Rosalind Berkowitz; Hanson, Ginger; Mierzwa, Frank; Casper, Lynne
2013-01-01
Schedule control and supervisor support for family and personal life are work resources that may help employees manage the work-family interface. However, existing data and designs have made it difficult to conclusively identify the effects of these work resources. This analysis utilizes a group-randomized trial in which some units in an information technology workplace were randomly assigned to participate in an initiative, called STAR, that targeted work practices, interactions, and expectations by (a) training supervisors on the value of demonstrating support for employees’ personal lives and (b) prompting employees to reconsider when and where they work. We find statistically significant, though modest, improvements in employees’ work-family conflict and family time adequacy and larger changes in schedule control and supervisor support for family and personal life. We find no evidence that this intervention increased work hours or perceived job demands, as might have happened with increased permeability of work across time and space. Subgroup analyses suggest the intervention brings greater benefits to employees more vulnerable to work-family conflict. This study advances our understanding of the impact of social structures on individual lives by investigating deliberate organizational changes and their effects on work resources and the work-family interface with a rigorous design. PMID:25349460
NASA Astrophysics Data System (ADS)
Mills, G.; Buse, A.; Gimeno, B.; Bermejo, V.; Holland, M.; Emberson, L.; Pleijel, H.
Crop-response data from over 700 published papers and conference proceedings have been analysed with the aim of establishing ozone dose-response functions for a wide range of European agricultural and horticultural crops. Data that met rigorous selection criteria (e.g. field-based, ozone concentrations within the European range, full-season exposure period) were used to derive AOT40-yield response functions for 19 crops by first converting the published ozone concentration data into AOT40 (AOT40 is the sum of the hourly mean ozone concentrations in excess of a threshold of 40 ppb, accumulated over daylight hours; units ppm h). For any individual crop, there were no significant differences in the linear response functions derived for experiments conducted in the USA or Europe, or for individual cultivars. Three statistically independent groups were identified: ozone-sensitive crops (wheat, water melon, pulses, cotton, turnip, tomato, onion, soybean and lettuce); moderately sensitive crops (sugar beet, potato, oilseed rape, tobacco, rice, maize, grape and broccoli); and ozone-resistant crops (barley, and fruit represented by plum and strawberry). Critical levels of a 3-month AOT40 of 3 ppm h and a 3.5-month AOT40 of 6 ppm h were derived from the functions for wheat and tomato, respectively.
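For concreteness, the sketch below computes the AOT40 index exactly as defined above (hourly exceedances above 40 ppb accumulated over daylight hours, converted to ppm h) for a hypothetical hourly ozone record; the diurnal cycle and the daylight window used here are illustrative assumptions only.

```python
import numpy as np

def aot40(ozone_ppb, daylight_mask, threshold_ppb=40.0):
    """AOT40: sum of hourly exceedances above the threshold during daylight
    hours, expressed in ppm*h (hourly mean concentrations assumed)."""
    ozone = np.asarray(ozone_ppb, dtype=float)
    exceed = np.clip(ozone - threshold_ppb, 0.0, None)     # ppb above threshold
    exceed[~np.asarray(daylight_mask, dtype=bool)] = 0.0   # ignore night hours
    return exceed.sum() / 1000.0                            # ppb*h -> ppm*h

# Hypothetical 3-month hourly record: a fixed diurnal cycle peaking at 60 ppb
hours = np.arange(24 * 92)
ozone = 35 + 25 * np.clip(np.sin((hours % 24 - 6) / 12 * np.pi), 0, None)
daylight = (hours % 24 >= 8) & (hours % 24 <= 19)
print(f"3-month AOT40 ~ {aot40(ozone, daylight):.1f} ppm h")
```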
NASA Astrophysics Data System (ADS)
Widge, Alik S.; Moritz, Chet T.
2014-04-01
Objective. There is great interest in closed-loop neurostimulators that sense and respond to a patient's brain state. Such systems may have value for neurological and psychiatric illnesses where symptoms have high intraday variability. Animal models of closed-loop stimulators would aid preclinical testing. We therefore sought to demonstrate that rodents can directly control a closed-loop limbic neurostimulator via a brain-computer interface (BCI). Approach. We trained rats to use an auditory BCI controlled by single units in prefrontal cortex (PFC). The BCI controlled electrical stimulation in the medial forebrain bundle, a limbic structure involved in reward-seeking. Rigorous offline analyses were performed to confirm volitional control of the neurostimulator. Main results. All animals successfully learned to use the BCI and neurostimulator, with closed-loop control of this challenging task demonstrated at 80% of PFC recording locations. Analysis across sessions and animals confirmed statistically robust BCI control and specific, rapid modulation of PFC activity. Significance. Our results provide a preliminary demonstration of a method for emotion-regulating closed-loop neurostimulation. They further suggest that activity in PFC can be used to control a BCI without pre-training on a predicate task. This offers the potential for BCI-based treatments in refractory neurological and mental illness.
The impact of hyperglycemia on survival in glioblastoma: A systematic review and meta-analysis.
Lu, Victor M; Goyal, Anshit; Vaughan, Lachlin S; McDonald, Kerrie L
2018-07-01
In the management of glioblastoma (GBM), there is a considerable predisposition to hyperglycemia due to significant integration of corticosteroid therapy to treat predictable clinical sequelae following diagnosis and treatment. The aim of this study was to quantify the effect of hyperglycemia during the management of GBM on overall survival (OS). Searches of seven electronic databases from inception to January 2018 were conducted following Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. There were 1475 articles identified for screening. Prognostic hazard ratios (HRs) derived from multivariate regression analysis were extracted and analyzed using meta-analysis of proportions and linear regression. Six observational studies reporting prognostic HRs in 10 cohorts were included. They described 1481 GBM diagnoses, all surveyed for hyperglycemia during management. Hyperglycemia was found to be associated with a statistically significantly poorer OS outcome (HR, 1.671; p < 0.001). This trend and its significance were not modified by study year, size, or proportion of pre-diagnostic diabetes mellitus. Hyperglycemia in GBM is an independent poor prognostic factor for OS. Heterogeneity in clinical course limits inter-study comparability. Future prospective, randomized studies should validate the findings of this study and ascertain the potential benefit of more rigorous monitoring for hyperglycemia and glycemic control. Copyright © 2018 Elsevier B.V. All rights reserved.
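As a generic illustration of how study-level hazard ratios of this kind are commonly combined, the sketch below pools log hazard ratios with fixed-effect inverse-variance weights, recovering standard errors from the 95% confidence limits. The per-cohort values are hypothetical and are not taken from the studies in this meta-analysis, whose pooling approach may differ.

```python
import numpy as np

def pool_hazard_ratios(hrs, ci_lows, ci_highs):
    """Fixed-effect inverse-variance pooling of log hazard ratios.
    Standard errors are recovered from 95% confidence limits."""
    log_hr = np.log(hrs)
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)
    w = 1.0 / se**2
    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

# Hypothetical per-cohort hazard ratios for hyperglycemia (not the study data)
hr, ci = pool_hazard_ratios(np.array([1.5, 1.8, 1.6]),
                            np.array([1.1, 1.2, 1.0]),
                            np.array([2.0, 2.7, 2.6]))
print(f"pooled HR = {hr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```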
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper reviews two motivations for conducting "what if" analyses using Excel and "R" to understand statistical significance tests in the context of sample size. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
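The sketch below shows the flavor of such a "what if" exercise, written in Python rather than the Excel or "R" tools the paper itself discusses: a fixed standardized effect size is held constant while the two-sample t-test p value is recomputed for hypothetical per-group sample sizes, making the role of sample size in "significance" explicit.

```python
import numpy as np
from scipy import stats

def what_if_p_values(cohens_d, sample_sizes):
    """Two-sample t-test p values for a fixed standardized mean difference
    (Cohen's d) under hypothetical per-group sample sizes."""
    results = {}
    for n in sample_sizes:
        t = cohens_d * np.sqrt(n / 2.0)          # t statistic for equal groups
        df = 2 * n - 2
        results[n] = 2 * stats.t.sf(abs(t), df)  # two-sided p value
    return results

# The same modest effect (d = 0.30) crosses p < .05 purely by adding subjects
for n, p in what_if_p_values(0.30, [20, 50, 100, 200]).items():
    print(f"n per group = {n:3d}  ->  p = {p:.3f}")
```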
Trends in Study Methods Used in Undergraduate Medical Education Research, 1969–2007
Baernstein, Amy; Liss, Hillary K.; Carney, Patricia A.; Elmore, Joann G.
2011-01-01
Context Evidence-based medical education requires rigorous studies appraising educational efficacy. Objectives To assess trends over time in methods used to evaluate undergraduate medical education interventions and to identify whether participation of medical education departments or centers is associated with more rigorous methods. Data Sources The PubMed, Cochrane Controlled Trials Registry, Campbell Collaboration, and ERIC databases (January 1966–March 2007) were searched using terms equivalent to students, medical and education, medical crossed with all relevant study designs. Study Selection We selected publications in all languages from every fifth year, plus the most recent 12 months, that evaluated an educational intervention for undergraduate medical students. Four hundred seventy-two publications met criteria for review. Data Extraction Data were abstracted on number of participants; types of comparison groups; whether outcomes assessed were objective, subjective, and/or validated; timing of outcome assessments; funding; and participation of medical education departments and centers. Ten percent of publications were independently abstracted by 2 authors to assess validity of the data abstraction. Results The annual number of publications increased over time from 1 (1969–1970) to 147 (2006–2007). In the most recent year, there was a mean of 145 medical student participants; 9 (6%) recruited participants from multiple institutions; 80 (54%) used comparison groups; 37 (25%) used randomized control groups; 91 (62%) had objective outcomes; 23 (16%) had validated outcomes; 35 (24%) assessed an outcome more than 1 month later; 21 (14%) estimated statistical power; and 66 (45%) reported funding. In 2006–2007, medical education department or center participation, reported in 46 (31%) of the recent publications, was associated only with enrolling more medical student participants (P = .04); for all studies from 1969 to 2007, it was associated only with measuring an objective outcome (P = .048). Between 1969 and 2007, the percentage of publications reporting statistical power and funding increased; percentages did not change for other study features. Conclusions The annual number of published studies of undergraduate medical education interventions demonstrating methodological rigor has been increasing. However, considerable opportunities for improvement remain. PMID:17785648
Accelerated recovery of Atlantic salmon (Salmo salar) from effects of crowding by swimming.
Veiseth, Eva; Fjaera, Svein Olav; Bjerkeng, Bjørn; Skjervold, Per Olav
2006-07-01
The effects of post-crowding swimming velocity (0, 0.35, and 0.70 m/s) and recovery time (1.5, 6, and 12 h) on physiological recovery and processing quality parameters of adult Atlantic salmon (Salmo salar) were determined. Atlantic salmon crowded to a density similar to that of a commercial slaughter process (>200 kg/m^3, 40 min) were transferred to a swimming chamber for recovery treatment. Osmolality and concentrations of cortisol, glucose and lactate in blood plasma were used as physiological stress indicators, whereas image analyses of extent and duration of rigor contraction, and fillet gaping were used as measures of processing quality. Crowded salmon had a 5.8-fold higher plasma cortisol concentration than control salmon (P<0.05). The elevated plasma cortisol concentration was reduced by increasing the swimming velocity, and had returned to control levels after 6 h recovery at high water velocity. Similar effects of swimming velocity were observed for plasma osmolality and lactate concentration. A lower plasma glucose concentration was present in crowded than in control fish (P<0.05), although a typical post-stress elevation in plasma glucose was observed after the recovery treatments. Lower muscle pH was found in crowded compared with control salmon (P<0.05), but muscle pH returned to control levels after 6 h recovery at intermediate and high swimming velocities and after 12 h in the low velocity group. Crowding caused an early onset of rigor mortis contraction. However, subjecting crowded salmon to active swimming for 6 h before slaughter delayed the onset of rigor mortis contraction from 2.5 to 7.5 h post mortem. The extent of rigor mortis contraction was also affected by crowding and post-stress swimming activity (P<0.05), and the largest degree of contraction was found in crowded salmon. In conclusion, active swimming accelerated the return of plasma cortisol, hydromineral balance, and the energy metabolism of adult Atlantic salmon to pre-stress levels. Moreover, an active swimming period delayed the onset of rigor mortis contraction, which has a positive technological implication for the salmon processing industry.
Benn, Emma K T; Tu, Chengcheng; Palermo, Ann-Gel S; Borrell, Luisa N; Kiernan, Michaela; Sandre, Mary; Bagiella, Emilia
2017-08-01
As clinical researchers at academic medical institutions across the United States increasingly manage complex clinical databases and registries, they often lack the statistical expertise to utilize the data for research purposes. This statistical inadequacy prevents junior investigators from disseminating clinical findings in peer-reviewed journals and from obtaining research funding, thereby hindering their potential for promotion. Underrepresented minorities, in particular, confront unique challenges as clinical investigators stemming from a lack of methodologically rigorous research training in their graduate medical education. This creates a ripple effect for them with respect to acquiring full-time appointments, obtaining federal research grants, and promotion to leadership positions in academic medicine. To fill this major gap in the statistical training of junior faculty and fellows, the authors developed the Applied Statistical Independence in Biological Systems (ASIBS) Short Course. The overall goal of ASIBS is to provide formal applied statistical training, via a hybrid distance and in-person learning format, to junior faculty and fellows actively involved in research at US academic medical institutions, with a special emphasis on underrepresented minorities. The authors present an overview of the design and implementation of ASIBS, along with a short-term evaluation of its impact for the first cohort of ASIBS participants.
Haring, Bernhard; Wu, Chunyuan; Mossavar-Rahmani, Yasmin; Snetselaar, Linda; Brunner, Robert; Wallace, Robert B.; Neuhouser, Marian L.; Wassertheil-Smoller, Sylvia
2015-01-01
Background: Data on the association between dietary patterns and age-related cognitive decline are inconsistent. Objective: To determine whether dietary patterns assessed by the alternate Mediterranean diet score (aMED), the Healthy Eating Index (HEI) 2010, the Alternate Healthy Eating Index (AHEI) 2010 or the Dietary Approach to Stop Hypertension (DASH) diet score are associated with cognitive decline in older women. To examine if dietary patterns modify the risk for cognitive decline in hypertensive women. Design: Prospective, longitudinal cohort study. Food frequency questionnaires (FFQs) were used to derive dietary patterns at baseline. Hypertension was defined as self-report of current drug therapy for hypertension or clinic measurement of SBP ≥ 140 mm Hg or DBP ≥ 90 mm Hg. Participants/Setting: Postmenopausal women (N=6,425) aged 65 to 79 years who participated in the Women’s Health Initiative Memory Study (WHIMS) and were cognitively intact at baseline. Main Outcome Measures: Cognitive decline was defined as cases of mild cognitive impairment (MCI) or probable dementia (PD). Cases were identified through rigorous screening and expert adjudication. Statistical Analyses Performed: Cox proportional hazards models with multivariable adjustment were used to estimate the relative risk for developing MCI or PD. Results: During a median follow-up of 9.11 years, we documented 499 cases of MCI and 390 of PD. In multivariable analyses we did not detect any statistically significant relationships across quintiles of aMED, HEI-2010, DASH and AHEI-2010 scores and MCI or PD (p-trend = 0.30, 0.44, 0.23 and 0.45). In hypertensive women we found no significant association between dietary patterns and cognitive decline (p-trend = 0.19, 0.08, 0.07 and 0.60). Conclusions: Dietary patterns characterized by the aMED, HEI-2010, AHEI-2010 or DASH dietary score were not associated with cognitive decline in older women. Adherence to a healthy dietary pattern did not modify the risk for cognitive decline in hypertensive women. PMID:27050728
Surface-water radon-222 distribution along the west-central Florida shelf
Smith, C.G.; Robbins, L.L.
2012-01-01
In February 2009 and August 2009, the spatial distribution of radon-222 in surface water was mapped along the west-central Florida shelf as a collaboration between the Response of Florida Shelf Ecosystems to Climate Change project and a U.S. Geological Survey Mendenhall Research Fellowship project. This report summarizes the surface distribution of radon-222 from two cruises and evaluates potential physical controls on radon-222 fluxes. Radon-222 is an inert gas produced overwhelmingly in sediment and has a short half-life of 3.8 days; activities in surface water ranged between 30 and 170 becquerels per cubic meter. Overall, radon-222 activities were enriched in nearshore surface waters relative to offshore waters. Dilution in offshore waters is expected to be the cause of the low offshore activities. While thermal stratification of the water column during the August survey may explain higher radon-222 activities relative to the February survey, radon-222 activity and integrated surface-water inventories decreased exponentially from the shoreline during both cruises. By estimating radon-222 evasion by wind from nearby buoy data and accounting for internal production from dissolved radium-226, its radiogenic long-lived parent, a simple one-dimensional model was implemented to determine the role that offshore mixing, benthic influx, and decay have on the distribution of excess radon-222 inventories along the west Florida shelf. For multiple statistically based boundary condition scenarios (first quartile, median, third quartile, and maximum radon-222 inshore of 5 kilometers), the cross-shelf mixing rates and average nearshore submarine groundwater discharge (SGD) rates varied from 10^0.38 to 10^-3.4 square kilometers per day and 0.00 to 1.70 centimeters per day, respectively. This dataset and modeling provide the first attempt to assess cross-shelf mixing and SGD on such a large spatial scale. Such estimates help scale up SGD rates that are often made at 1- to 10-meter resolution to a coarser but more regionally applicable scale of 1- to 10-kilometer resolution. More stringent analyses and model evaluation are required, but results and analyses presented in this report provide the foundation for conducting a more rigorous statistical assessment.
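One common simplification of such a one-dimensional balance, shown below purely as a sketch, treats the offshore decline of excess radon-222 inventory as exponential, with the decay constant fixed by the 3.8-day half-life and a horizontal mixing coefficient recovered from the e-folding scale. The functional form, the transect values, and the fitted numbers are illustrative assumptions and are not taken from the report's model.

```python
import numpy as np
from scipy.optimize import curve_fit

LAMBDA_RN222 = np.log(2) / 3.8           # radon-222 decay constant, 1/day

def inventory_model(x_km, inv0, kh_km2_per_day):
    """Assumed steady-state excess Rn-222 inventory vs. distance offshore:
    radioactive decay balanced by horizontal eddy mixing (exponential decline)."""
    return inv0 * np.exp(-x_km * np.sqrt(LAMBDA_RN222 / kh_km2_per_day))

# Hypothetical cross-shelf transect (distance in km, excess inventory in Bq/m^2)
x = np.array([1, 5, 10, 20, 40, 80], dtype=float)
inv = np.array([950, 700, 520, 300, 110, 20], dtype=float)

(inv0_fit, kh_fit), _ = curve_fit(inventory_model, x, inv, p0=[1000.0, 50.0])
print(f"fitted nearshore inventory ~ {inv0_fit:.0f} Bq/m^2, Kh ~ {kh_fit:.1f} km^2/day")
```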
2012-01-01
Background It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism in which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase Chain Reactions, microarrays and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. Last generation Affymetrix GeneChip Human Exon 1.0 ST Arrays offer a more detailed view of the gene expression profile providing information on the AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons, and it allows performing analyses at both gene and exon level. In this paper we describe BEAT, an integrated user-friendly bioinformatics framework to store, analyze and visualize exon arrays datasets. It combines a data warehouse approach with some rigorous statistical methods for assessing the AS of genes involved in diseases. Meta statistics are proposed as a novel approach to explore the analysis results. BEAT is available at http://beat.ba.itb.cnr.it. Results BEAT is a web tool which allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession Ids, Gene Ontology terms and biochemical pathways annotations are integrated with exon and gene level expression plots. The user can customize the results choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis. Conclusions Despite exon array chips being widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable and interactive tables and graphics. PMID:22536968
NASA Technical Reports Server (NTRS)
Payne, M. H.
1973-01-01
The bounds for the normalized associated Legendre functions P_nm were studied to provide a rational basis for the truncation of the geopotential series in spherical harmonics in various orbital analyses. The conjecture is made that the largest maximum of the normalized associated Legendre function lies in the interval which indicates the greatest integer function. A procedure is developed for verifying this conjecture. An on-line algebraic manipulator, IAM, is used to implement the procedure, and the verification is carried out for all n equal to or less than 2m, for m = 1 through 6. A rigorous proof of the conjecture is not available.
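As a modern numerical counterpart to the symbolic verification described above, the sketch below evaluates normalized associated Legendre functions on a fine colatitude grid and locates the largest maximum. The 4-pi (geodesy) full normalization is an assumed convention, and this brute-force grid search is not the procedure Payne implemented in IAM.

```python
import numpy as np
from math import factorial
from scipy.special import lpmv

def normalized_plm(n, m, theta):
    """Fully normalized associated Legendre function P_nm(cos theta),
    assuming the 4-pi (geodesy) normalization convention."""
    norm = np.sqrt((2 - (m == 0)) * (2 * n + 1) * factorial(n - m) / factorial(n + m))
    return norm * lpmv(m, n, np.cos(theta))

def largest_maximum(n, m, num=20001):
    """Locate the colatitude of the largest |P_nm| on a fine grid."""
    theta = np.linspace(0.0, np.pi, num)
    vals = np.abs(normalized_plm(n, m, theta))
    i = np.argmax(vals)
    return theta[i], vals[i]

for n, m in [(4, 2), (8, 4), (12, 6)]:   # n = 2m cases, as in the verification
    th, v = largest_maximum(n, m)
    print(f"n={n:2d} m={m:2d}: largest max {v:.3f} at theta = {np.degrees(th):.1f} deg")
```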
Reproducible analyses of microbial food for advanced life support systems
NASA Technical Reports Server (NTRS)
Petersen, Gene R.
1988-01-01
The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.
OCT Amplitude and Speckle Statistics of Discrete Random Media.
Almasian, Mitra; van Leeuwen, Ton G; Faber, Dirk J
2017-11-01
Speckle, amplitude fluctuations in optical coherence tomography (OCT) images, contains information on sub-resolution structural properties of the imaged sample. Speckle statistics could therefore be utilized in the characterization of biological tissues. However, a rigorous theoretical framework relating OCT speckle statistics to structural tissue properties has yet to be developed. As a first step, we present a theoretical description of OCT speckle, relating the OCT amplitude variance to size and organization for samples of discrete random media (DRM). Starting the calculations from the size and organization of the scattering particles, we analytically find expressions for the OCT amplitude mean, amplitude variance, the backscattering coefficient and the scattering coefficient. We assume fully developed speckle and verify the validity of this assumption by experiments on controlled samples of silica microspheres suspended in water. We show that the OCT amplitude variance is sensitive to sub-resolution changes in size and organization of the scattering particles. Experimentally determined and theoretically calculated optical properties are compared and in good agreement.
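The fully developed speckle assumption verified above has a simple numerical illustration: summing many unit phasors with independent random phases yields a circular-Gaussian field whose amplitude is Rayleigh distributed, for which the variance-to-squared-mean ratio is 4/pi - 1 (about 0.27). The sketch below is a generic simulation of that limit, not the paper's discrete-random-media model.

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle_amplitudes(n_scatterers=100, n_pixels=20000):
    """Amplitude of a sum of unit phasors with independent random phases,
    the standard model of fully developed speckle."""
    phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_pixels, n_scatterers))
    return np.abs(np.exp(1j * phases).sum(axis=1))

amp = speckle_amplitudes()
ratio = amp.var() / amp.mean() ** 2
print(f"var/mean^2 = {ratio:.3f}  (Rayleigh prediction: {4 / np.pi - 1:.3f})")
```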
Statistical Model Selection for TID Hardness Assurance
NASA Technical Reports Server (NTRS)
Ladbury, R.; Gorelick, J. L.; McClure, S.
2010-01-01
Radiation Hardness Assurance (RHA) methodologies against Total Ionizing Dose (TID) degradation impose rigorous statistical treatments for data from a part's Radiation Lot Acceptance Test (RLAT) and/or its historical performance. However, no similar methods exist for using "similarity" data - that is, data for similar parts fabricated in the same process as the part under qualification. This is despite the greater difficulty and potential risk in interpreting similarity data. In this work, we develop methods to disentangle part-to-part, lot-to-lot and part-type-to-part-type variation. The methods we develop apply not just for qualification decisions, but also for quality control and detection of process changes and other "out-of-family" behavior. We begin by discussing the data used in the study and the challenges of developing a statistic providing a meaningful measure of degradation across multiple part types, each with its own performance specifications. We then develop analysis techniques and apply them to the different data sets.
Mourning dove hunting regulation strategy based on annual harvest statistics and banding data
Otis, D.L.
2006-01-01
Although managers should strive to base game bird harvest management strategies on mechanistic population models, monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data can serve as the basis for making hunting regulation decisions based on population growth rates derived from these estimates. I present a statistically rigorous approach for regulation decision-making using a hypothesis-testing framework and an assumed framework of 3 hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
NASA Technical Reports Server (NTRS)
Westwater, Ed R.; Falls, M. J.; Fionda, E.
1992-01-01
A limited network of four dual-channel microwave radiometers, with frequencies of 20.6 and 31.65 GHz, was operated in the front range of eastern Colorado from 1985 to 1988. Data from November 1987 through October 1988 are analyzed to determine both single-station and joint-station brightness temperature and attenuation statistics. Only zenith observations were made. The spatial separations of the stations varied from 50 km to 190 km. Before the statistics were developed, the data were screened by rigorous quality control methods. One such method, that of 20.6 vs. 31.65 GHz scatter plots, is analyzed in detail, and comparisons are made of measured vs. calculated data. At 20.6 and 31.65 GHz, vertical attenuations of 5 and 8 dB are exceeded 0.01 percent of the time. For these four stations and at the same 0.01 percent level, diversity gains from 6 to 8 dB are possible with the 50 to 190 km separations.
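The sketch below illustrates how site-diversity gain is typically read off joint attenuation statistics: the attenuation exceeded at a given time percentage for a single site, minus that exceeded when the better of two sites is selected. The synthetic, partially correlated attenuation series are placeholders for the radiometer-derived data described above, not a reproduction of it.

```python
import numpy as np

def exceeded_at(attenuation_db, percent):
    """Attenuation level (dB) exceeded for the given percentage of time."""
    return np.percentile(attenuation_db, 100.0 - percent)

def diversity_gain(site_a_db, site_b_db, percent=0.01):
    """Single-site attenuation exceeded at a time percentage minus the
    best-site (joint) attenuation exceeded at the same percentage."""
    single = exceeded_at(site_a_db, percent)
    joint = exceeded_at(np.minimum(site_a_db, site_b_db), percent)
    return single - joint

# Synthetic, partially correlated zenith attenuation series (dB) for two sites
rng = np.random.default_rng(1)
common = rng.lognormal(-1.0, 1.0, size=500000)
site_a = 0.5 * common + rng.lognormal(-1.0, 1.0, size=common.size)
site_b = 0.5 * common + rng.lognormal(-1.0, 1.0, size=common.size)
print(f"diversity gain at 0.01% of time ~ {diversity_gain(site_a, site_b):.1f} dB")
```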
Economic evaluations and usefulness of standardized nursing terminologies.
Stone, Patricia W; Lee, Nam-Ju; Giannini, Melinna; Bakken, Suzanne
2004-01-01
To review different types of economic analyses commonly found in healthcare literature, discuss methodologic considerations in framing economic analyses, identify useful resources for economic evaluations, and describe the current and potential roles of standardized nursing terminologies in providing cost and outcome data for economic analysis. The Advanced Billing Concepts Code Resource-based Relative Value Scale and Nursing Outcomes Classification. Using case studies, the applicability of standardized nursing terminologies in cost-effectiveness analysis is demonstrated. While there is potential to inform specific questions, comparisons across analyses are limited because of the many outcome measures. Including a standardized quality-of-life measure in nursing terminologies would allow for the calculation of accepted outcome measures and dollars per quality adjusted life years gained. The nurse's ability to assess and contribute to all aspects of rigorous economic evidence is an essential competency for responsible practice.
The transition of oncologic imaging from its “industrial era” to its “information era” demands analytical methods that 1) extract clinically and biologically relevant information from these data; 2) integrate imaging, clinical, and genomic data via rigorous statistical and computational methodologies in order to derive models valuable for understanding cancer mechanisms, diagnosis, prognostic assessment, response evaluation, and personalized treatment management; 3) are available to the biomedical community for easy use and application, with the aim of understanding, diagnosing, an
Exclusion Bounds for Extended Anyons
NASA Astrophysics Data System (ADS)
Larson, Simon; Lundholm, Douglas
2018-01-01
We introduce a rigorous approach to the many-body spectral theory of extended anyons, that is, quantum particles confined to two dimensions that interact via attached magnetic fluxes of finite extent. Our main results are many-body magnetic Hardy inequalities and local exclusion principles for these particles, leading to estimates for the ground-state energy of the anyon gas over the full range of the parameters. This brings out further non-trivial aspects in the dependence on the anyonic statistics parameter, and also gives improvements in the ideal (non-extended) case.
Clopper-Pearson bounds from HEP data cuts
NASA Astrophysics Data System (ADS)
Berg, B. A.
2001-08-01
For the measurement of N_s signals in N events, rigorous confidence bounds on the true signal probability p_exact were established in a classical paper by Clopper and Pearson [Biometrika 26, 404 (1934)]. Here, their bounds are generalized to the HEP situation where cuts on the data tag signals with probability P_s and background data with likelihood P_b.
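For reference, the classical Clopper-Pearson bounds themselves can be computed from beta-distribution quantiles, as in the sketch below; the generalization to cut efficiencies P_s and P_b described in the abstract is not reproduced here.

```python
from scipy.stats import beta

def clopper_pearson(n_signal, n_events, confidence=0.95):
    """Exact (conservative) confidence bounds on a binomial probability."""
    alpha = 1.0 - confidence
    lower = beta.ppf(alpha / 2, n_signal, n_events - n_signal + 1) if n_signal > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, n_signal + 1, n_events - n_signal) if n_signal < n_events else 1.0
    return lower, upper

lo, hi = clopper_pearson(n_signal=7, n_events=100)
print(f"95% bounds on the true signal probability: [{lo:.3f}, {hi:.3f}]")
```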
NASA Astrophysics Data System (ADS)
Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg
2014-06-01
A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.
Power-law ansatz in complex systems: Excessive loss of information.
Tsai, Sun-Ting; Chang, Chin-De; Chang, Ching-Hao; Tsai, Meng-Xue; Hsu, Nan-Jung; Hong, Tzay-Ming
2015-12-01
The ubiquity of power-law relations in empirical data displays physicists' love of simple laws and of uncovering common causes among seemingly unrelated phenomena. However, many reported power laws lack statistical support and mechanistic backing, not to mention that discrepancies with real data are often explained away as corrections due to finite size or other variables. We propose a simple experiment and rigorous statistical procedures to look into these issues. Making use of the fact that the occurrence rate and pulse intensity of crumple sound obey a power law with an exponent that varies with material, we simulate a complex system with two driving mechanisms by crumpling two different sheets together. The probability function of the crumple sound is found to transit from two power-law terms to a bona fide power law as compaction increases. In addition to showing the vicinity of these two distributions in the phase space, this observation nicely demonstrates the effect of interactions in bringing about a subtle change in macroscopic behavior, and more information may be retrieved if the data are subject to sorting. Our analyses are based on the Akaike information criterion, which is a direct measurement of information loss and emphasizes the need to strike a balance between model simplicity and goodness of fit. As a show of force, the Akaike information criterion also found the Gutenberg-Richter law for earthquakes and the scale-free model for a brain functional network, a two-dimensional sandpile, and solar flare intensity to suffer an excessive loss of information. They resemble more the crumpled-together ball at low compactions in that there appear to be two driving mechanisms that take turns occurring.
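To make the model-selection idea concrete, the sketch below fits a continuous power law above a cutoff by maximum likelihood and compares its AIC against a simple lognormal competitor on simulated data. The lognormal stand-in (fitted without a truncation correction) and all numbers are illustrative assumptions; the paper's own comparison is between one- and two-term power laws.

```python
import numpy as np

def powerlaw_aic(x, xmin):
    """AIC of a continuous power law p(x) ~ x^-alpha for x >= xmin (1 parameter)."""
    x = x[x >= xmin]
    n = x.size
    alpha = 1.0 + n / np.sum(np.log(x / xmin))            # maximum-likelihood exponent
    loglik = n * np.log((alpha - 1) / xmin) - alpha * np.sum(np.log(x / xmin))
    return 2 * 1 - 2 * loglik, alpha

def lognormal_aic(x, xmin):
    """AIC of a lognormal fit to the same tail (2 parameters), used here as a
    simple competitor and ignoring the truncation at xmin for brevity."""
    x = x[x >= xmin]
    logx = np.log(x)
    mu, sigma = logx.mean(), logx.std(ddof=0)
    loglik = np.sum(-np.log(x * sigma * np.sqrt(2 * np.pi)) - (logx - mu) ** 2 / (2 * sigma**2))
    return 2 * 2 - 2 * loglik, (mu, sigma)

rng = np.random.default_rng(2)
xmin = 1.0
data = xmin * (1 - rng.random(5000)) ** (-1 / 1.5)        # true power law, alpha = 2.5
aic_pl, alpha = powerlaw_aic(data, xmin)
aic_ln, _ = lognormal_aic(data, xmin)
print(f"alpha_hat = {alpha:.2f}; AIC power law = {aic_pl:.1f}, AIC lognormal = {aic_ln:.1f}")
```

Lower AIC indicates less information loss, which is the sense in which the abstract speaks of an "excessive loss of information" for a mis-specified power-law ansatz.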
NASA Astrophysics Data System (ADS)
Kaplan, D. A.; Casey, S. T.; Cohen, M. J.; Acharya, S.; Jawitz, J. W.
2016-12-01
A century of hydrologic modification has altered the physical and biological drivers of landscape processes in the Everglades (Florida, USA). Restoring the ridge-slough patterned landscape, a dominant feature of the historical system, is a priority, but requires an understanding of pattern genesis and degradation mechanisms. Physical experiments to evaluate alternative pattern formation mechanisms are limited by the long time scales of peat accumulation and loss, necessitating model-based comparisons, where support for a particular mechanism is based on model replication of extant patterning and trajectories of degradation. However, multiple mechanisms yield patch elongation in the direction of historical flow (a central feature of ridge-slough patterning), limiting the utility of that characteristic for discriminating among alternatives. Using data from vegetation maps, we investigated the statistical features of ridge-slough spatial patterning (ridge density, patch perimeter, elongation, patch-size distributions, and spatial periodicity) to establish more rigorous criteria for evaluating model performance and to inform controls on pattern variation across the contemporary system. Two independent analyses (2-D periodograms and patch size distributions) provide strong evidence against regular patterning, with the landscape exhibiting neither a characteristic wavelength nor a characteristic patch size, both of which are expected under conditions that produce regular patterns. Rather, landscape properties suggest robust scale-free patterning, indicating genesis from the coupled effects of local facilitation and a global negative feedback operating uniformly at the landscape-scale. This finding challenges widespread invocation of scale-dependent negative feedbacks for explaining ridge-slough pattern origins. These results help discern among genesis mechanisms and provide an improved statistical description of the landscape that can be used to compare among model outputs, as well as to assess the success of future restoration projects.
Huang, Frederick Y; Chung, Henry; Kroenke, Kurt; Delucchi, Kevin L; Spitzer, Robert L
2006-06-01
The Patient Health Questionnaire depression scale (PHQ-9) is a well-validated, Diagnostic and Statistical Manual of Mental Disorders- Fourth Edition (DSM-IV) criterion-based measure for diagnosing depression, assessing severity and monitoring treatment response. The performance of most depression scales including the PHQ-9, however, has not been rigorously evaluated in different racial/ethnic populations. Therefore, we compared the factor structure of the PHQ-9 between different racial/ethnic groups as well as the rates of endorsement and differential item functioning (DIF) of the 9 items of the PHQ-9. The presence of DIF would indicate that responses to an individual item differ significantly between groups, controlling for the level of depression. A combined dataset from 2 separate studies of 5,053 primary care patients including non-Hispanic white (n=2,520), African American (n=598), Chinese American (n=941), and Latino (n=974) patients was used for our analysis. Exploratory principal components factor analysis was used to derive the factor structure of the PHQ-9 in each of the 4 racial/ethnic groups. A generalized Mantel-Haenszel statistic was used to test for DIF. One main factor that included all PHQ-9 items was found in each racial/ethnic group with alpha coefficients ranging from 0.79 to 0.89. Although endorsement rates of individual items were generally similar among the 4 groups, evidence of DIF was found for some items. Our analyses indicate that in African American, Chinese American, Latino, and non-Hispanic white patient groups the PHQ-9 measures a common concept of depression and can be effective for the detection and monitoring of depression in these diverse populations.
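The alpha coefficients reported above are Cronbach's alpha, which has a simple closed form; the sketch below computes it for a simulated 9-item, 0-3 response matrix. The latent-factor simulation is purely illustrative and is unrelated to the study's patient data.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical PHQ-9-style responses (0-3) for 200 respondents and 9 items
rng = np.random.default_rng(3)
severity = rng.normal(size=(200, 1))                        # shared latent factor
items = np.clip(np.round(1.2 + severity + rng.normal(scale=0.7, size=(200, 9))), 0, 3)
print(f"Cronbach's alpha ~ {cronbach_alpha(items):.2f}")
```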
Only marginal alignment of disc galaxies
NASA Astrophysics Data System (ADS)
Andrae, René; Jahnke, Knud
2011-12-01
Testing theories of angular-momentum acquisition of rotationally supported disc galaxies is the key to understanding the formation of this type of galaxy. The tidal-torque theory aims to explain this acquisition process in a cosmological framework and predicts positive autocorrelations of angular-momentum orientation and spiral-arm handedness, i.e. alignment of disc galaxies, on short distance scales of 1 Mpc h^-1. This disc alignment can also cause systematic effects in weak-lensing measurements. Previous observations claimed to have discovered these correlations but are overly optimistic in the reported level of statistical significance of the detections. Errors in redshift, ellipticity and morphological classifications were not taken into account, although they have a significant impact. We explain how to rigorously propagate all the important errors through the estimation process. Analysing disc galaxies in the Sloan Digital Sky Survey (SDSS) data base, we find that positive autocorrelations of spiral-arm handedness and angular-momentum orientations on distance scales of 1 Mpc h^-1 are plausible but not statistically significant. Current data appear insufficient to constrain the parameters of the theory. This result agrees with a simple hypothesis test in the Local Group, where we also find no evidence for disc alignment. Moreover, we demonstrate that ellipticity estimates based on second moments are strongly biased by galactic bulges even for Scd galaxies, thereby corrupting correlation estimates and overestimating the impact of disc alignment on weak-lensing studies. Finally, we discuss the potential of future sky surveys. We argue that photometric redshifts have errors that are too large, i.e. PanSTARRS and LSST cannot be used. Conversely, the EUCLID project will not cover the relevant redshift regime. We also discuss the potentials and problems of front-edge classifications of galaxy discs in order to improve the autocorrelation estimates of angular-momentum orientation.
Yang, Li-Hua; Du, Shi-Zheng; Sun, Jin-Fang; Mei, Si-Juan; Wang, Xiao-Qing; Zhang, Yuan-Yuan
2014-01-01
Abstract Objectives: To assess the clinical evidence of auriculotherapy for constipation treatment and to identify the efficacy of groups using Semen vaccariae or magnetic pellets as taped objects in managing constipation. Methods: Databases were searched, including five English-language databases (the Cochrane Library, PubMed, Embase, CINAHL, and AMED) and four Chinese medical databases. Only randomized controlled trials were included in the review process. Critical appraisal was conducted using the Cochrane risk of bias tool. Results: Seventeen randomized, controlled trials (RCTs) met the inclusion criteria, of which 2 had low risk of bias. The primary outcome measures were the improvement rate and total effective rate. A meta-analysis of 15 RCTs showed a moderate, significant effect of auriculotherapy in managing constipation compared with controls (relative risk [RR], 2.06; 95% confidence interval [CI], 1.52– 2.79; p<0.00001). The 15 RCTs also showed a moderate, significant effect of auriculotherapy in relieving constipation (RR, 1.28; 95% CI, 1.13–1.44; p<0.0001). For other symptoms associated with constipation, such as abdominal distension or anorexia, results of the meta-analyses showed no statistical significance. Subgroup analysis revealed that use of S. vaccariae and use of magnetic pellets were both statistically favored over the control in relieving constipation. Conclusions: Current evidence illustrated that auriculotherapy, a relatively safe strategy, is probably beneficial in managing constipation. However, most of the eligible RCTs had a high risk of bias, and all were conducted in China. No definitive conclusion can be made because of cultural and geographic differences. Further rigorous RCTs from around the world are warranted to confirm the effect and safety of auriculotherapy for constipation. PMID:25020089
Marko, Nicholas F.; Weil, Robert J.
2012-01-01
Introduction Gene expression data is often assumed to be normally-distributed, but this assumption has not been tested rigorously. We investigate the distribution of expression data in human cancer genomes and study the implications of deviations from the normal distribution for translational molecular oncology research. Methods We conducted a central moments analysis of five cancer genomes and performed empiric distribution fitting to examine the true distribution of expression data both on the complete-experiment and on the individual-gene levels. We used a variety of parametric and nonparametric methods to test the effects of deviations from normality on gene calling, functional annotation, and prospective molecular classification using a sixth cancer genome. Results Central moments analyses reveal statistically-significant deviations from normality in all of the analyzed cancer genomes. We observe as much as 37% variability in gene calling, 39% variability in functional annotation, and 30% variability in prospective, molecular tumor subclassification associated with this effect. Conclusions Cancer gene expression profiles are not normally-distributed, either on the complete-experiment or on the individual-gene level. Instead, they exhibit complex, heavy-tailed distributions characterized by statistically-significant skewness and kurtosis. The non-Gaussian distribution of this data affects identification of differentially-expressed genes, functional annotation, and prospective molecular classification. These effects may be reduced in some circumstances, although not completely eliminated, by using nonparametric analytics. This analysis highlights two unreliable assumptions of translational cancer gene expression analysis: that “small” departures from normality in the expression data distributions are analytically-insignificant and that “robust” gene-calling algorithms can fully compensate for these effects. PMID:23118863
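As a generic illustration of a central-moments screen of this kind, the sketch below computes per-gene skewness and excess kurtosis and applies the D'Agostino-Pearson omnibus normality test across samples. The expression matrix is simulated (half Gaussian, half heavy-tailed), not one of the cancer genomes analyzed in the study.

```python
import numpy as np
from scipy import stats

def moment_screen(expression, alpha=0.05):
    """Per-gene skewness, excess kurtosis, and D'Agostino-Pearson normality
    test across samples; returns the fraction of genes rejecting normality."""
    skew = stats.skew(expression, axis=1)
    kurt = stats.kurtosis(expression, axis=1)
    _, pvals = stats.normaltest(expression, axis=1)
    return skew, kurt, np.mean(pvals < alpha)

# Simulated log-expression: half the genes Gaussian, half heavy-tailed (t with 3 df)
rng = np.random.default_rng(4)
genes_normal = rng.normal(size=(500, 60))
genes_heavy = rng.standard_t(df=3, size=(500, 60))
expr = np.vstack([genes_normal, genes_heavy])
_, _, frac = moment_screen(expr)
print(f"fraction of genes deviating from normality at alpha=0.05: {frac:.2f}")
```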
Jorgenson, Andrew K; Clark, Brett
2013-01-01
This study examines the regional and temporal differences in the statistical relationship between national-level carbon dioxide emissions and national-level population size. The authors analyze panel data from 1960 to 2005 for a diverse sample of nations, and employ descriptive statistics and rigorous panel regression modeling techniques. Initial descriptive analyses indicate that all regions experienced overall increases in carbon emissions and population size during the 45-year period of investigation, but with notable differences. For carbon emissions, the sample of countries in Asia experienced the largest percent increase, followed by countries in Latin America, Africa, and lastly the sample of relatively affluent countries in Europe, North America, and Oceania combined. For population size, the sample of countries in Africa experienced the largest percent increase, followed by countries in Latin America, Asia, and the combined sample of countries in Europe, North America, and Oceania. Findings for two-way fixed effects panel regression elasticity models of national-level carbon emissions indicate that the estimated elasticity coefficient for population size is much smaller for nations in Africa than for nations in other regions of the world. Regarding potential temporal changes, from 1960 to 2005 the estimated elasticity coefficient for population size decreased by 25% for the sample of African countries, 14% for the sample of Asian countries, and 6.5% for the sample of Latin American countries, but remained the same in size for the sample of countries in Europe, North America, and Oceania. Overall, while population size continues to be the primary driver of total national-level anthropogenic carbon dioxide emissions, the findings of this study highlight the need for future research and policies to recognize that the actual impacts of population size on national-level carbon emissions differ across both time and region.
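The sketch below shows one common way to estimate such a population elasticity with a two-way fixed effects specification: regress log emissions on log population with country and year dummies, so the coefficient on log population is the elasticity. The panel is simulated with a known elasticity of 0.8; the country count, periods, and noise levels are arbitrary assumptions, not the authors' data or exact estimator.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated panel: 40 countries observed every 5 years, 1960-2005, true elasticity = 0.8
rng = np.random.default_rng(5)
rows = []
for c in [f"c{i}" for i in range(40)]:
    country_effect = rng.normal()
    ln_pop0 = rng.normal(16.0, 1.0)
    growth = rng.uniform(0.0, 0.06)          # country-specific population growth per period
    for t, year in enumerate(range(1960, 2010, 5)):
        ln_pop = ln_pop0 + growth * t
        ln_co2 = 0.8 * ln_pop + country_effect + 0.05 * t + rng.normal(scale=0.05)
        rows.append({"country": c, "year": year, "ln_pop": ln_pop, "ln_co2": ln_co2})
panel = pd.DataFrame(rows)

# Two-way fixed effects: country and year dummies absorb unit and period effects,
# so the coefficient on ln_pop is the population elasticity of emissions
fit = smf.ols("ln_co2 ~ ln_pop + C(country) + C(year)", data=panel).fit()
print(f"estimated elasticity: {fit.params['ln_pop']:.2f} (true value 0.8)")
```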
Graupner, Katharina; Scherlach, Kirstin; Bretschneider, Tom; Lackner, Gerald; Roth, Martin; Gross, Harald; Hertweck, Christian
2012-12-21
Caught in the act: imaging mass spectrometry of a button mushroom infected with the soft rot pathogen Janthinobacterium agaricidamnosum in conjunction with genome mining revealed jagaricin as a highly antifungal virulence factor that is not produced under standard cultivation conditions. The structure of jagaricin was rigorously elucidated by a combination of physicochemical analyses, chemical derivatization, and bioinformatics. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flinn, D.G.; Hall, S.; Morris, J.
This volume describes the background research, the application of the proposed loss evaluation techniques, and the results. The research identified present loss calculation methods as appropriate, provided care was taken to represent the various system elements in sufficient detail. The literature search of past methods and typical data revealed that extreme caution in using typical values (load factor, etc.) should be taken to ensure that all factors were referred to the same time base (daily, weekly, etc.). The performance of the method (and computer program) proposed in this project was determined by comparison of results with a rigorous evaluation of losses on the Salt River Project system. This rigorous evaluation used statistical modeling of the entire system as well as explicit enumeration of all substation and distribution transformers. Further tests were conducted at Public Service Electric and Gas of New Jersey to check the appropriateness of the methods in a northern environment. Finally, sensitivity tests indicated the data elements whose inaccuracy would most affect the determination of losses using the method developed in this project.
Sequence-based heuristics for faster annotation of non-coding RNA families.
Weinberg, Zasha; Ruzzo, Walter L
2006-01-01
Non-coding RNAs (ncRNAs) are functional RNA molecules that do not code for proteins. Covariance Models (CMs) are a useful statistical tool to find new members of an ncRNA gene family in a large genome database, using both sequence and, importantly, RNA secondary structure information. Unfortunately, CM searches are extremely slow. Previously, we created rigorous filters, which provably sacrifice none of a CM's accuracy, while making searches significantly faster for virtually all ncRNA families. However, these rigorous filters make searches slower than heuristics could be. In this paper we introduce profile HMM-based heuristic filters. We show that their accuracy is usually superior to heuristics based on BLAST. Moreover, we compared our heuristics with those used in tRNAscan-SE, whose heuristics incorporate a significant amount of work specific to tRNAs, whereas our heuristics are generic to any ncRNA. Performance was roughly comparable, so we expect that our heuristics provide a high-quality solution that, unlike family-specific solutions, can scale to hundreds of ncRNA families. The source code is available under the GNU Public License at the supplementary web site.
Numerical proof of stability of roll waves in the small-amplitude limit for inclined thin film flow
NASA Astrophysics Data System (ADS)
Barker, Blake
2014-10-01
We present a rigorous numerical proof based on interval arithmetic computations categorizing the linearized and nonlinear stability of periodic viscous roll waves of the KdV-KS equation modeling weakly unstable flow of a thin fluid film on an incline in the small-amplitude KdV limit. The argument proceeds by verification of a stability condition derived by Bar-Nepomnyashchy and Johnson-Noble-Rodrigues-Zumbrun involving inner products of various elliptic functions arising through the KdV equation. One key point in the analysis is a bootstrap argument balancing the extremely poor sup norm bounds for these functions against the extremely good convergence properties for analytic interpolation in order to obtain a feasible computation time. Another is the way of handling analytic interpolation in several variables by a two-step process carving up the parameter space into manageable pieces for rigorous evaluation. These and other general aspects of the analysis should serve as blueprints for more general analyses of spectral stability.
Ezenwa, Miriam O; Suarez, Marie L; Carrasco, Jesus D; Hipp, Theresa; Gill, Anayza; Miller, Jacob; Shea, Robert; Shuey, David; Zhao, Zhongsheng; Angulo, Veronica; McCurry, Timothy; Martin, Joanna; Yao, Yingwei; Molokie, Robert E; Wang, Zaijie Jim; Wilkie, Diana J
2017-07-01
The purpose of this article is to describe how we adhere to the Patient-Centered Outcomes Research Institute's (PCORI) methodology standards relevant to the design and implementation of our PCORI-funded study, the PAIN RelieveIt Trial. We present details of the PAIN RelieveIt Trial organized by the PCORI methodology standards and components that are relevant to our study. The PAIN RelieveIt Trial adheres to four PCORI standards and 21 subsumed components. The four standards include standards for formulating research questions, standards associated with patient centeredness, standards for data integrity and rigorous analyses, and standards for preventing and handling missing data. In the past 24 months, we screened 2,837 cancer patients and their caregivers; 874 dyads were eligible; 223.5 dyads consented and provided baseline data. Only 55 patients were lost to follow-up, a 25% attrition rate. The design and implementation of the PAIN RelieveIt Trial adhered to PCORI's methodology standards for research rigor.
Phytoremediation of hazardous wastes. Technical report, 23--26 July 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCutcheon, S.C.; Wolfe, N.L.; Carreria, L.H.
1995-07-26
A new and innovative approach to phytoremediation (the use of plants to degrade hazardous contaminants) was developed. The new approach to phytoremediation involves rigorous pathway analyses, mass balance determinations, and identification of specific enzymes that break down trinitrotoluene (TNT), other explosives (RDX and HMX), nitrobenzene, and chlorinated solvents (e.g., TCE and PCE) (EPA 1994). As a good example, TNT is completely and rapidly degraded by nitroreductase and laccase enzymes. The aromatic ring is broken and the carbon in the ring fragments is incorporated into new plant fiber, as part of the natural lignification process. Half-lives for TNT degradation approach 1 hr or less under ideal laboratory conditions. Continuous-flow pilot studies indicate that scale-up residence times in created wetlands may be two to three times longer than in laboratory batch studies. The use of created wetlands and land farming techniques guided by rigorous field biochemistry and ecology promises to be a vital part of a newly evolving field, ecological engineering.
Phytoremediation of hazardous wastes
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCutcheon, S.C.; Wolfe, N.L.; Carreria, L.H.
1995-11-01
A new and innovative approach to phytoremediation (the use of plants to degrade hazardous contaminants) was developed. The new approach to phytoremediation involves rigorous pathway analyses, mass balance determinations, and identification of specific enzymes that break down trinitrotoluene (TNT), other explosives (RDX and HMX), nitrobenzene, and chlorinated solvents (e.g., TCE and PCE) (EPA 1994). As a good example, TNT is completely and rapidly degraded by nitroreductase and laccase enzymes. The aromatic ring is broken and the carbon in the ring fragments is incorporated into new plant fiber, as part of the natural lignification process. Half-lives for TNT degradation approach 1 hr or less under ideal laboratory conditions. Continuous-flow pilot studies indicate that scale-up residence times in created wetlands may be two to three times longer than in laboratory batch studies. The use of created wetlands and land farming techniques guided by rigorous field biochemistry and ecology promises to be a vital part of a newly evolving field, ecological engineering.
Rohwer, Anke; Schoonees, Anel; Young, Taryn
2014-11-02
This paper describes the process, our experience and the lessons learnt in doing document reviews of health science curricula. Since we could not find relevant literature to guide us on how to approach these reviews, we feel that sharing our experience would benefit researchers embarking on similar projects. We followed a rigorous, transparent, pre-specified approach that included the preparation of a protocol, a pre-piloted data extraction form and coding schedule. Data were extracted, analysed and synthesised. Quality checks were included at all stages of the process. The main lessons we learnt related to time and project management, continuous quality assurance, selecting the software that meets the needs of the project, involving experts as needed and disseminating the findings to relevant stakeholders. A complete curriculum evaluation comprises, apart from a document review, interviews with students and lecturers to assess the learnt and taught curricula respectively. Rigorous methods must be used to ensure an objective assessment.
Raychaudhuri, Soumya; Korn, Joshua M.; McCarroll, Steven A.; Altshuler, David; Sklar, Pamela; Purcell, Shaun; Daly, Mark J.
2010-01-01
Investigators have linked rare copy number variants (CNVs) to neuropsychiatric diseases, such as schizophrenia. One hypothesis is that CNV events cause disease by affecting genes with specific brain functions. Under these circumstances, we expect that CNV events in cases should impact brain-function genes more frequently than those events in controls. Previous publications have applied “pathway” analyses to genes within neuropsychiatric case CNVs to show enrichment for brain functions. While such analyses have been suggestive, they often have not rigorously compared the rates of CNVs impacting genes with brain function in cases to controls, and therefore do not address important confounders such as the large size of brain genes and overall differences in rates and sizes of CNVs. To demonstrate the potential impact of confounders, we genotyped rare CNV events in 2,415 unaffected controls with Affymetrix 6.0; we then applied standard pathway analyses using four sets of brain-function genes and observed an apparently highly significant enrichment for each set. The enrichment is simply driven by the large size of brain-function genes. Instead, we propose a case-control statistical test, cnv-enrichment-test, to compare the rate of CNVs impacting specific gene sets in cases versus controls. With simulations, we demonstrate that cnv-enrichment-test is robust to case-control differences in CNV size, CNV rate, and systematic differences in gene size. Finally, we apply cnv-enrichment-test to rare CNV events published by the International Schizophrenia Consortium (ISC). This approach reveals nominal evidence of case-association in the neuronal-activity and learning gene sets, but not the other two examined gene sets. The neuronal-activity genes have been associated in a separate set of schizophrenia cases and controls; however, testing in independent samples is necessary to definitively confirm this association. Our method is implemented in the PLINK software package. PMID:20838587
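The paper's cnv-enrichment-test is implemented in PLINK; as an illustrative analogue only, the sketch below regresses case/control status on the number of CNVs hitting a brain-function gene set while adjusting for each subject's total CNV burden, the kind of confounding the abstract highlights. The dataset and column names are hypothetical.

```python
# Illustrative analogue of a case-control CNV gene-set comparison, not the PLINK test itself.
import pandas as pd
import statsmodels.formula.api as smf

cnv = pd.read_csv("cnv_subject_summary.csv")  # hypothetical per-subject summary table

# Adjust for overall CNV count and total CNV length so the gene-set term is not
# simply picking up subjects who carry more or larger CNVs.
fit = smf.logit(
    "is_case ~ brain_gene_hits + total_cnv_count + total_cnv_kb",
    data=cnv,
).fit()
print(fit.summary())  # a positive, significant brain_gene_hits term suggests enrichment
```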
Cowling, Krycia; Thow, Anne Marie; Pollack Porter, Keshia
2018-05-24
A key mechanism through which globalization has impacted health is the liberalization of trade and investment, yet relatively few studies to date have used quantitative methods to investigate the impacts of global trade and investment policies on non-communicable diseases and risk factors. Recent reviews of this literature have found heterogeneity in results and a range of quality across studies, which may be in part attributable to a lack of conceptual clarity and methodological inconsistencies. This study is a critical review of methodological approaches used in the quantitative literature on global trade and investment and diet, tobacco, alcohol, and related health outcomes, with the objective of developing recommendations and providing resources to guide future robust, policy relevant research. A review of reviews, expert review, and reference tracing were employed to identify relevant studies, which were evaluated using a novel quality assessment tool designed for this research. Eight review articles and 34 quantitative studies were identified for inclusion. Important ways to improve this literature were identified and discussed: clearly defining exposures of interest and not conflating trade and investment; exploring mechanisms of broader relationships; increasing the use of individual-level data; ensuring consensus and consistency in key confounding variables; utilizing more sector-specific versus economy-wide trade and investment indicators; testing and adequately adjusting for autocorrelation and endogeneity when using longitudinal data; and presenting results from alternative statistical models and sensitivity analyses. To guide the development of future analyses, recommendations for international data sources for selected trade and investment indicators, as well as key gaps in the literature, are presented. More methodologically rigorous and consistent approaches in future quantitative studies on the impacts of global trade and investment policies on non-communicable diseases and risk factors can help to resolve inconsistencies of existing research and generate useful information to guide policy decisions.
Krompecher, T
1981-01-01
Objective measurements were carried out to study the evolution of rigor mortis in rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.
Ambler, Graeme K; Gohel, Manjit S; Mitchell, David C; Loftus, Ian M; Boyle, Jonathan R
2015-01-01
Accurate adjustment of surgical outcome data for risk is vital in an era of surgeon-level reporting. Current risk prediction models for abdominal aortic aneurysm (AAA) repair are suboptimal. We aimed to develop a reliable risk model for in-hospital mortality after intervention for AAA, using rigorous contemporary statistical techniques to handle missing data. Using data collected during a 15-month period in the United Kingdom National Vascular Database, we applied multiple imputation methodology together with stepwise model selection to generate preoperative and perioperative models of in-hospital mortality after AAA repair, using two thirds of the available data. Model performance was then assessed on the remaining third of the data by receiver operating characteristic curve analysis and compared with existing risk prediction models. Model calibration was assessed by Hosmer-Lemeshow analysis. A total of 8088 AAA repair operations were recorded in the National Vascular Database during the study period, of which 5870 (72.6%) were elective procedures. Both preoperative and perioperative models showed excellent discrimination, with areas under the receiver operating characteristic curve of .89 and .92, respectively. This was significantly better than any of the existing models (area under the receiver operating characteristic curve for best comparator model, .84 and .88; P < .001 and P = .001, respectively). Discrimination remained excellent when only elective procedures were considered. There was no evidence of miscalibration by Hosmer-Lemeshow analysis. We have developed accurate models to assess risk of in-hospital mortality after AAA repair. These models were carefully developed with rigorous statistical methodology and significantly outperform existing methods for both elective cases and overall AAA mortality. These models will be invaluable for both preoperative patient counseling and accurate risk adjustment of published outcome data. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
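The workflow described above (handling missing data, fitting a logistic model, and assessing discrimination on held-out data by the area under the ROC curve) can be sketched roughly as follows. This is not the authors' National Vascular Database model: the dataset, column names, and the use of a single stochastic imputation rather than pooled multiple imputation with stepwise selection are simplifying assumptions.

```python
# Rough sketch of an impute-fit-validate pipeline for in-hospital mortality risk modeling.
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

aaa = pd.read_csv("aaa_repairs.csv")          # hypothetical registry extract
y = aaa.pop("in_hospital_death")

# Two thirds for development, one third for validation, as in the abstract.
X_train, X_test, y_train, y_test = train_test_split(aaa, y, test_size=1/3, random_state=0)

imputer = IterativeImputer(random_state=0)     # single imputation; true MI would pool several
X_train_imp = imputer.fit_transform(X_train)
X_test_imp = imputer.transform(X_test)

clf = LogisticRegression(max_iter=1000).fit(X_train_imp, y_train)
print(roc_auc_score(y_test, clf.predict_proba(X_test_imp)[:, 1]))  # discrimination (AUC)
```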
ERIC Educational Resources Information Center
Francis, Clay
2018-01-01
Historic notions of academic rigor usually follow from critiques of the system--we often define our goals for academically rigorous work through the lens of our shortcomings. This chapter discusses how the Truman Commission in 1947 and the Spellings Commission in 2006 shaped the way we think about academic rigor in today's context.
What's with all this peer-review stuff anyway?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warner, J. S.
2010-01-01
The Journal of Physical Security was ostensibly started to deal with a perceived lack of peer-reviewed journals related to the field of physical security. In fact, concerns have been expressed that the field of physical security is scarcely a field at all. A typical, well-developed field might include the following: multiple peer-reviewed journals devoted to the subject, rigor and critical thinking, metrics, fundamental principles, models and theories, effective standards and guidelines, R and D conferences, professional societies, certifications, its own academic department (or at least numerous academic experts), widespread granting of degrees in the field from 4-year research universities, mechanisms for easily spotting 'snake oil' products and services, and the practice of professionals organizing to police themselves, provide quality control, and determine best practices. Physical security seems to come up short in a number of these areas. Many of these attributes are difficult to quantify. This paper seeks to focus on one area that is quantifiable: the number of peer-reviewed journals dedicated to the field of physical security. In addition, I want to examine the number of overall periodicals (peer-reviewed and non-peer-reviewed) dedicated to physical security, as well as the number of papers published each year about physical security. These are potentially useful analyses because one can often infer how healthy or active a given field is from its publishing activity. For example, there are 2,754 periodicals dedicated to the (very healthy and active) field of physics. This paper concentrates on trade journals versus peer-reviewed journals. Trade journals typically focus on practice-related topics. A paper appropriate for a trade journal is usually based more on practical experience than rigorous studies or research. Models, theories, or rigorous experimental research results will usually not be included. A trade journal typically targets a specific market in an industry or trade. Such journals are often considered to be news magazines and may contain industry-specific advertisements and/or job ads. A peer-reviewed journal, a.k.a. 'refereed journal', in contrast, contains peer-reviewed papers. A peer-reviewed paper is one that has been vetted by the peer review process. In this process, the paper is typically sent to independent experts for review and consideration. A peer-reviewed paper might cover experimental results, and/or a rigorous study, analyses, research efforts, theory, models, or one of many other scholarly endeavors.
NASA Astrophysics Data System (ADS)
Qian, Hong; Kjelstrup, Signe; Kolomeisky, Anatoly B.; Bedeaux, Dick
2016-04-01
Nonequilibrium thermodynamics (NET) investigates processes in systems out of global equilibrium. On a mesoscopic level, it provides a statistical dynamic description of various complex phenomena such as chemical reactions, ion transport, diffusion, thermochemical, thermomechanical and mechanochemical fluxes. In the present review, we introduce a mesoscopic stochastic formulation of NET by analyzing entropy production in several simple examples. The fundamental role of nonequilibrium steady-state cycle kinetics is emphasized. The statistical mechanics of Onsager’s reciprocal relations in this context is elucidated. Chemomechanical, thermomechanical, and enzyme-catalyzed thermochemical energy transduction processes are discussed. It is argued that mesoscopic stochastic NET in phase space provides a rigorous mathematical basis of fundamental concepts needed for understanding complex processes in chemistry, physics and biology. This theory is also relevant for nanoscale technological advances.
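As a small numerical illustration of the entropy-production bookkeeping that such mesoscopic formulations rely on (not an example drawn from the review itself), the sketch below computes the steady-state entropy production rate of a three-state cycle from its master equation, in units of k_B; all rate constants are made up.

```python
# Steady-state entropy production of a three-state cycle (Schnakenberg form), units of k_B.
import numpy as np

# w[i, j] is the rate of the transition j -> i; arbitrary positive numbers.
w = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 3.0],
              [4.0, 1.0, 0.0]])

# Master-equation generator dp/dt = L p (columns of L sum to zero).
L = w - np.diag(w.sum(axis=0))

# Stationary distribution: solve L p = 0 subject to sum(p) = 1.
A = np.vstack([L, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
p, *_ = np.linalg.lstsq(A, b, rcond=None)

# Entropy production rate: nonnegative, zero only under detailed balance.
sigma = 0.0
for i in range(3):
    for j in range(3):
        if i != j:
            sigma += 0.5 * (w[i, j] * p[j] - w[j, i] * p[i]) * np.log(
                (w[i, j] * p[j]) / (w[j, i] * p[i])
            )
print(sigma)
```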
How Osmolytes Counteract Pressure Denaturation on a Molecular Scale.
Shimizu, Seishi; Smith, Paul E
2017-08-18
Life in the deep sea exposes enzymes to high hydrostatic pressure, which decreases their stability. For survival, deep sea organisms tend to accumulate various osmolytes, most notably trimethylamine N-oxide used by fish, to counteract pressure denaturation. However, exactly how these osmolytes work remains unclear. Here, a rigorous statistical thermodynamics approach is used to clarify the mechanism of osmoprotection. It is shown that the weak, nonspecific, and dynamic interactions of water and osmolytes with proteins can be characterized only statistically, and that the competition between protein-osmolyte and protein-water interactions is crucial in determining conformational stability. Osmoprotection is driven by a stronger exclusion of osmolytes from the denatured protein than from the native conformation, and water distribution has no significant effect on these changes at low osmolyte concentrations. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Piotrowski, T; Rodrigues, G; Bajon, T; Yartsev, S
2014-03-01
Multi-institutional collaborations allow for more information to be analyzed but the data from different sources may vary in the subgroup sizes and/or conditions of measuring. Rigorous statistical analysis is required for pooling the data in a larger set. Careful comparison of all the components of the data acquisition is indispensable: identical conditions allow for enlargement of the database with improved statistical analysis, clearly defined differences provide opportunity for establishing a better practice. The optimal sequence of required normality, asymptotic normality, and independence tests is proposed. An example of analysis of six subgroups of position corrections in three directions obtained during image guidance procedures for 216 prostate cancer patients from two institutions is presented. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
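A rough sketch of the kind of pre-pooling checks discussed above is given below; the exact sequence of normality, asymptotic-normality, and independence tests proposed in the paper may differ, and the file and column names are assumptions.

```python
# Pre-pooling checks for two institutions' position-correction subgroups (hypothetical data).
import pandas as pd
from scipy import stats

shifts = pd.read_csv("couch_shifts.csv")       # hypothetical per-fraction corrections
a = shifts.loc[shifts.institution == "A", "shift_ap_mm"]
b = shifts.loc[shifts.institution == "B", "shift_ap_mm"]

print(stats.shapiro(a), stats.shapiro(b))      # normality within each subgroup
print(stats.levene(a, b))                      # equality of variances
print(stats.ttest_ind(a, b, equal_var=False))  # if roughly normal: Welch's t-test
print(stats.mannwhitneyu(a, b))                # otherwise: nonparametric comparison
```

If the subgroups are statistically indistinguishable, the data can be pooled into a larger set; clearly defined differences instead become the object of comparison.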
Weak Value Amplification is Suboptimal for Estimation and Detection
NASA Astrophysics Data System (ADS)
Ferrie, Christopher; Combes, Joshua
2014-01-01
We show by using statistically rigorous arguments that the technique of weak value amplification does not perform better than standard statistical techniques for the tasks of single parameter estimation and signal detection. Specifically, we prove that postselection, a necessary ingredient for weak value amplification, decreases estimation accuracy and, moreover, arranging for anomalously large weak values is a suboptimal strategy. In doing so, we explicitly provide the optimal estimator, which in turn allows us to identify the optimal experimental arrangement to be the one in which all outcomes have equal weak values (all as small as possible) and the initial state of the meter is the maximal eigenvalue of the square of the system observable. Finally, we give precise quantitative conditions for when weak measurement (measurements without postselection or anomalously large weak values) can mitigate the effect of uncharacterized technical noise in estimation.
Efficacy of Curcuma for Treatment of Osteoarthritis
Perkins, Kimberly; Sahy, William; Beckett, Robert D.
2016-01-01
The objective of this review is to identify, summarize, and evaluate clinical trials to determine the efficacy of curcuma in the treatment of osteoarthritis. A literature search for interventional studies assessing the efficacy of curcuma was performed, resulting in 8 clinical trials. Studies have investigated the effect of curcuma on pain, stiffness, and functionality in patients with knee osteoarthritis. Curcuma-containing products consistently demonstrated statistically significant improvement in osteoarthritis-related endpoints compared with placebo, with one exception. When compared with active control, curcuma-containing products were similar to nonsteroidal anti-inflammatory drugs, and potentially to glucosamine. While statistically significant differences in outcomes were reported in a majority of studies, the small magnitude of effect and presence of major study limitations hinder the application of these results. Further rigorous studies are needed prior to recommending curcuma as an effective alternative therapy for knee osteoarthritis. PMID:26976085
Guidelines for a graph-theoretic implementation of structural equation modeling
Grace, James B.; Schoolmaster, Donald R.; Guntenspergen, Glenn R.; Little, Amanda M.; Mitchell, Brian R.; Miller, Kathryn M.; Schweiger, E. William
2012-01-01
Structural equation modeling (SEM) is increasingly being chosen by researchers as a framework for gaining scientific insights from the quantitative analyses of data. New ideas and methods emerging from the study of causality, influences from the field of graphical modeling, and advances in statistics are expanding the rigor, capability, and even purpose of SEM. Guidelines for implementing the expanded capabilities of SEM are currently lacking. In this paper we describe new developments in SEM that we believe constitute a third-generation of the methodology. Most characteristic of this new approach is the generalization of the structural equation model as a causal graph. In this generalization, analyses are based on graph theoretic principles rather than analyses of matrices. Also, new devices such as metamodels and causal diagrams, as well as an increased emphasis on queries and probabilistic reasoning, are now included. Estimation under a graph theory framework permits the use of Bayesian or likelihood methods. The guidelines presented start from a declaration of the goals of the analysis. We then discuss how theory frames the modeling process, requirements for causal interpretation, model specification choices, selection of estimation method, model evaluation options, and use of queries, both to summarize retrospective results and for prospective analyses. The illustrative example presented involves monitoring data from wetlands on Mount Desert Island, home of Acadia National Park. Our presentation walks through the decision process involved in developing and evaluating models, as well as drawing inferences from the resulting prediction equations. In addition to evaluating hypotheses about the connections between human activities and biotic responses, we illustrate how the structural equation (SE) model can be queried to understand how interventions might take advantage of an environmental threshold to limit Typha invasions. The guidelines presented provide for an updated definition of the SEM process that subsumes the historical matrix approach under a graph-theory implementation. The implementation is also designed to permit complex specifications and to be compatible with various estimation methods. Finally, they are meant to foster the use of probabilistic reasoning in both retrospective and prospective considerations of the quantitative implications of the results.
Catalá-López, Ferrán; Hutton, Brian; Driver, Jane A; Page, Matthew J; Ridao, Manuel; Valderas, José M; Alonso-Arroyo, Adolfo; Forés-Martos, Jaume; Martínez, Salvador; Gènova-Maleras, Ricard; Macías-Saint-Gerons, Diego; Crespo-Facorro, Benedicto; Vieta, Eduard; Valencia, Alfonso; Tabarés-Seisdedos, Rafael
2017-04-04
The objective of this study will be to synthesize the epidemiological evidence and evaluate the validity of the associations between central nervous system disorders and the risk of developing or dying from cancer. We will perform an umbrella review of systematic reviews and conduct updated meta-analyses of observational studies (cohort and case-control) investigating the association between central nervous system disorders and the risk of developing or dying from any cancer or specific types of cancer. Searches involving PubMed/MEDLINE, EMBASE, SCOPUS and Web of Science will be used to identify systematic reviews and meta-analyses of observational studies. In addition, online databases will be checked for observational studies published outside the time frames of previous reviews. Eligible central nervous system disorders will be Alzheimer's disease, anorexia nervosa, amyotrophic lateral sclerosis, autism spectrum disorders, bipolar disorder, depression, Down's syndrome, epilepsy, Huntington's disease, multiple sclerosis, Parkinson's disease and schizophrenia. The primary outcomes will be cancer incidence and cancer mortality in association with a central nervous system disorder. Secondary outcome measures will be site-specific cancer incidence and mortality, respectively. Two reviewers will independently screen references identified by the literature search, as well as potentially relevant full-text articles. Data will be abstracted, and study quality/risk of bias will be appraised by two reviewers independently. Conflicts at all levels of screening and abstraction will be resolved through discussion. Random-effects meta-analyses of primary observational studies will be conducted where appropriate. Parameters for exploring statistical heterogeneity are pre-specified. The World Cancer Research Fund (WCRF)/American Institute for Cancer Research (AICR) criteria and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach will be used for determining the quality of evidence for cancer outcomes. Our study will establish the extent of the epidemiological evidence underlying the associations between central nervous system disorders and cancer and will provide a rigorous and updated synthesis of a range of important site-specific cancer outcomes. PROSPERO CRD42016052762.
Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.
Lee, Jae Hee; Jung, Koo Young
2012-07-01
Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.
Winters, Zoë Ellen; Benson, John R; Pusic, Andrea L
2010-12-01
Advances in breast cancer diagnosis and management have produced significant improvements in disease-free and breast cancer-related survival. Consequently, there is increasing focus on the quality of long-term cancer survivorship. Of the 44,000 women diagnosed annually in the United Kingdom, 30% to 40% are required to undergo mastectomy. During the past 30 years, significant technical advances in breast reconstruction have increased performance of this surgical practice as a means to potentially improve health-related quality of life (HRQoL) for breast cancer survivors. Breast reconstruction studies increasingly aim to assess more discriminating outcomes based on the patients' own perception of the surgical result and its effect on HRQoL. This incremental output in HRQoL evaluation is being fuelled by both healthcare providers and official bodies such as the Food and Drug Administration, together with demands for more comprehensive comparative effectiveness data to permit fully informed consent by patients. In this systematic review, the authors apply inclusion and exclusion criteria to effectively screen 1012 abstracts identified in the field of HRQoL in breast reconstruction between 1978 and 2009. Each study was evaluated with respect to its design and statistical methodology. Each was reviewed with a recommended standard checklist of methodological requirements as described by Efficace et al (J Clin Oncol. 2003;21:3502-3511). A total of 34 papers that included HRQoL outcomes in breast reconstruction were identified and reviewed in detail. The majority of studies were retrospective in nature with significant inherent limitations. Specifically, they were compromised by potentially biased patient recall. Most of these studies lacked both an a priori outcome of interest and statistical rigor, jeopardizing estimations of potential effect size. In addition, more than 90% of the studies failed to report or describe missing data. Thirteen studies provided level I (n = 2) and II (n = 11) evidence. While these studies benefited from more robust design, the majority used generic instruments such as the 36-item short form (SF-36), which may not be sufficiently sensitive to measure changes consequent to breast reconstruction (ie, effect on body image or psychosocial well-being). Furthermore, these studies were generally underpowered to detect meaningful clinical differences or to permit subgroup analyses. Further limitations included reliance on single-center design, which may negatively impact generalizability, and deficiencies in reporting the number and types of surgical complications, which potentially have an effect on HRQoL outcomes. This systematic review reveals a tendency for sound scientific methodology in HRQoL to be undermined by poorly designed and underpowered studies. In the current healthcare environment, patients and providers increasingly seek meaningful data to guide clinical decisions; policy makers are similarly in need of rigorous patient-centered, comparative effectiveness data to inform national-level decision-making. In light of this and the limitations of the existing published data, there is a pressing need for further Level I and II evidence in the form of randomized controlled trials as well as well-designed, multicenter prospective longitudinal studies in breast reconstruction. Such studies should incorporate sensitive and condition-specific patient-reported outcome measures, provide adequate sample sizes, and respect established guidelines for rigorous HRQoL methodology.
Real, Jordi; Forné, Carles; Roso-Llorach, Albert; Martínez-Sánchez, Jose M
2016-05-01
Controlling for confounders is a crucial step in analytical observational studies, and multivariable models are widely used as statistical adjustment techniques. However, the validation of the assumptions of the multivariable regression models (MRMs) should be made clear in scientific reporting. The objective of this study is to review the quality of statistical reporting of the most commonly used MRMs (logistic, linear, and Cox regression) that were applied in analytical observational studies published between 2003 and 2014 by journals indexed in MEDLINE. Review of a representative sample of articles indexed in MEDLINE (n = 428) with observational design and use of MRMs (logistic, linear, and Cox regression). We assessed the quality of reporting about: model assumptions and goodness-of-fit, interactions, sensitivity analysis, crude and adjusted effect estimate, and specification of more than 1 adjusted model. The tests of underlying assumptions or goodness-of-fit of the MRMs used were described in 26.2% (95% CI: 22.0-30.3) of the articles and 18.5% (95% CI: 14.8-22.1) reported the interaction analysis. Reporting of all items assessed was higher in articles published in journals with a higher impact factor. A low percentage of articles indexed in MEDLINE that used multivariable techniques provided information demonstrating rigorous application of the model selected as an adjustment method. Given the importance of these methods to the final results and conclusions of observational studies, greater rigor is required in reporting the use of MRMs in the scientific literature.
The average receiver operating characteristic curve in multireader multicase imaging studies
Samuelson, F W
2014-01-01
Objective: In multireader, multicase (MRMC) receiver operating characteristic (ROC) studies for evaluating medical imaging systems, the area under the ROC curve (AUC) is often used as a summary metric. Owing to the limitations of AUC, plotting the average ROC curve to accompany the rigorous statistical inference on AUC is recommended. The objective of this article is to investigate methods for generating the average ROC curve from ROC curves of individual readers. Methods: We present both a non-parametric method and a parametric method for averaging ROC curves that produce a ROC curve, the area under which is equal to the average AUC of individual readers (a property we call area preserving). We use hypothetical examples, simulated data and a real-world imaging data set to illustrate these methods and their properties. Results: We show that our proposed methods are area preserving. We also show that the method of averaging the ROC parameters, either the conventional bi-normal parameters (a, b) or the proper bi-normal parameters (c, d_a), is generally not area preserving and may produce a ROC curve that is intuitively not an average of multiple curves. Conclusion: Our proposed methods are useful for making plots of average ROC curves in MRMC studies as a companion to the rigorous statistical inference on the AUC end point. The software implementing these methods is freely available from the authors. Advances in knowledge: Methods for generating the average ROC curve in MRMC ROC studies are formally investigated. The area-preserving criterion we defined is useful to evaluate such methods. PMID:24884728
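One simple way to pool reader-level ROC curves is to interpolate each reader's curve on a shared false-positive-rate grid and average the true-positive rates; on a fine grid this approximately preserves the mean AUC. The paper's own non-parametric and bi-normal area-preserving constructions may differ in detail, so the sketch below is only illustrative.

```python
# Illustrative averaging of per-reader ROC curves on a common FPR grid.
import numpy as np
from sklearn.metrics import roc_curve, auc

def average_roc(scores_per_reader, labels, grid_points=1000):
    """Average several readers' ROC curves; returns the grid, mean TPR, and its AUC."""
    grid = np.linspace(0.0, 1.0, grid_points)
    tprs = []
    for scores in scores_per_reader:              # one score vector per reader, same cases
        fpr, tpr, _ = roc_curve(labels, scores)
        tprs.append(np.interp(grid, fpr, tpr))    # reader's TPR on the shared FPR grid
    mean_tpr = np.mean(tprs, axis=0)
    return grid, mean_tpr, auc(grid, mean_tpr)    # AUC of the averaged curve
```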
2017-01-01
Background The home environment is where young children spend most of their time, and is critically important to supporting behaviors that promote health and prevent obesity. However, the home environment and lifestyle patterns remain understudied, and few interventions have investigated parent-led makeovers designed to create home environments that are supportive of optimal child health and healthy child weights. Objective The aim of the HomeStyles randomized controlled trial (RCT) is to determine whether the Web-based HomeStyles intervention enables and motivates parents to shape the weight-related aspects of their home environments and lifestyle behavioral practices (diet, exercise, and sleep) to be more supportive of their preschool children’s optimal health and weight. Methods A rigorous RCT utilizing an experimental group and an attention control group, receiving a bona fide contemporaneous treatment equal in nonspecific treatment effects and differing only in subject matter content, will test the effect of HomeStyles on a diverse sample of families with preschool children. This intervention is based on social cognitive theory and uses a social ecological framework, and will assess: intrapersonal characteristics (dietary intake, physical activity level, and sleep) of parents and children; family interpersonal or social characteristics related to diet, physical activity, media use, and parental values and self-efficacy for obesity-preventive practices; and home environment food availability, physical activity space and supports in and near the home, and media availability and controls in the home. Results Enrollment for this study has been completed and statistical data analyses are currently underway. Conclusions This paper describes the HomeStyles intervention with regards to: rationale, the intervention’s logic model, sample eligibility criteria and recruitment, experimental group and attention control intervention content, study design, instruments, data management, and planned analyses. PMID:28442452
Sweat, Michael D; Denison, Julie; Kennedy, Caitlin; Tedrow, Virginia; O'Reilly, Kevin
2012-08-01
To examine the relationship between condom social marketing programmes and condom use. Standard systematic review and meta-analysis methods were followed. The review included studies of interventions in which condoms were sold, in which a local brand name(s) was developed for condoms, and in which condoms were marketed through a promotional campaign to increase sales. A definition of intervention was developed and standard inclusion criteria were followed in selecting studies. Data were extracted from each eligible study, and a meta-analysis of the results was carried out. Six studies with a combined sample size of 23,048 met the inclusion criteria. One was conducted in India and five in sub-Saharan Africa. All studies were cross-sectional or serial cross-sectional. Three studies had a comparison group, although all lacked equivalence in sociodemographic characteristics across study arms. All studies randomly selected participants for assessments, although none randomly assigned participants to intervention arms. The random-effects pooled odds ratio for condom use was 2.01 (95% confidence interval, CI: 1.42-2.84) for the most recent sexual encounter and 2.10 (95% CI: 1.51-2.91) for a composite of all condom use outcomes. Tests for heterogeneity yielded significant results for both meta-analyses. The evidence base for the effect of condom social marketing on condom use is small because few rigorous studies have been conducted. Meta-analyses showed a positive and statistically significant effect on increasing condom use, and all individual studies showed positive trends. The cumulative effect of condom social marketing over multiple years could be substantial. We strongly encourage more evaluations of these programmes with study designs of high rigour.
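The pooled odds ratio and heterogeneity test reported above come from standard random-effects machinery; the sketch below shows a DerSimonian-Laird calculation of that kind on six placeholder studies (the numbers are illustrative, not the review's data).

```python
# DerSimonian-Laird random-effects pooling of log odds ratios (placeholder inputs).
import numpy as np
from scipy import stats

log_or = np.log(np.array([1.8, 2.3, 1.5, 2.6, 1.9, 2.2]))   # hypothetical study ORs
se = np.array([0.25, 0.30, 0.20, 0.35, 0.28, 0.22])          # hypothetical standard errors

w_fixed = 1.0 / se**2
q = np.sum(w_fixed * (log_or - np.average(log_or, weights=w_fixed)) ** 2)  # Cochran's Q
df = len(log_or) - 1
tau2 = max(0.0, (q - df) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
pooled = np.average(log_or, weights=w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))

print("pooled OR:", np.exp(pooled))
print("95% CI:", np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled))
print("heterogeneity p:", 1 - stats.chi2.cdf(q, df))
```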
A Fast Multiple-Kernel Method With Applications to Detect Gene-Environment Interaction.
Marceau, Rachel; Lu, Wenbin; Holloway, Shannon; Sale, Michèle M; Worrall, Bradford B; Williams, Stephen R; Hsu, Fang-Chi; Tzeng, Jung-Ying
2015-09-01
Kernel machine (KM) models are a powerful tool for exploring associations between sets of genetic variants and complex traits. Although most KM methods use a single kernel function to assess the marginal effect of a variable set, KM analyses involving multiple kernels have become increasingly popular. Multikernel analysis allows researchers to study more complex problems, such as assessing gene-gene or gene-environment interactions, incorporating variance-component based methods for population substructure into rare-variant association testing, and assessing the conditional effects of a variable set adjusting for other variable sets. The KM framework is robust, powerful, and provides efficient dimension reduction for multifactor analyses, but requires the estimation of high dimensional nuisance parameters. Traditional estimation techniques, including regularization and the "expectation-maximization (EM)" algorithm, have a large computational cost and are not scalable to large sample sizes needed for rare variant analysis. Therefore, under the context of gene-environment interaction, we propose a computationally efficient and statistically rigorous "fastKM" algorithm for multikernel analysis that is based on a low-rank approximation to the nuisance effect kernel matrices. Our algorithm is applicable to various trait types (e.g., continuous, binary, and survival traits) and can be implemented using any existing single-kernel analysis software. Through extensive simulation studies, we show that our algorithm has similar performance to an EM-based KM approach for quantitative traits while running much faster. We also apply our method to the Vitamin Intervention for Stroke Prevention (VISP) clinical trial, examining gene-by-vitamin effects on recurrent stroke risk and gene-by-age effects on change in homocysteine level. © 2015 WILEY PERIODICALS, INC.
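The computational trick named above, a low-rank approximation to a nuisance kernel matrix, can be sketched as follows. This illustrates the general idea rather than the fastKM implementation; the RBF kernel and the chosen rank are arbitrary assumptions.

```python
# Low-rank factor of a kernel matrix via truncated eigendecomposition.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def low_rank_kernel(X, rank):
    K = rbf_kernel(X)                          # n x n nuisance kernel
    vals, vecs = np.linalg.eigh(K)             # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:rank]        # keep the top-rank components
    U = vecs[:, top] * np.sqrt(np.clip(vals[top], 0.0, None))
    return U                                   # K is approximated by U @ U.T

X = np.random.default_rng(0).normal(size=(500, 20))
U = low_rank_kernel(X, rank=30)
print(U.shape)   # (500, 30): downstream algebra scales with the rank, not the sample size
```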
Chehab, E F; Andriacchi, T P; Favre, J
2017-06-14
The increased use of gait analysis has raised the need for a better understanding of how walking speed and demographic variations influence asymptomatic gait. Previous analyses mainly reported relationships between subsets of gait features and demographic measures, rendering it difficult to assess whether gait features are affected by walking speed or other demographic measures. The purpose of this study was to conduct a comprehensive analysis of the kinematic and kinetic profiles during ambulation that tests for the effect of walking speed in parallel to the effects of age, sex, and body mass index. This was accomplished by recruiting a population of 121 asymptomatic subjects and analyzing characteristic 3-dimensional kinematic and kinetic features at the ankle, knee, hip, and pelvis during walking trials at slow, normal, and fast speeds. Mixed effects linear regression models were used to identify how each of 78 discrete gait features is affected by variations in walking speed, age, sex, and body mass index. As expected, nearly every feature was associated with variations in walking speed. Several features were also affected by variations in demographic measures, including age affecting sagittal-plane knee kinematics, body mass index affecting sagittal-plane pelvis and hip kinematics, body mass index affecting frontal-plane knee kinematics and kinetics, and sex affecting frontal-plane kinematics at the pelvis, hip, and knee. These results could aid in the design of future studies, as well as clarify how walking speed, age, sex, and body mass index may act as potential confounders in studies with small populations or in populations with insufficient demographic variations for thorough statistical analyses. Copyright © 2017 Elsevier Ltd. All rights reserved.
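A mixed-effects model of the type described, one gait feature regressed on walking speed and demographics with a random intercept per subject, might look like the sketch below; the variable names are assumptions rather than the study's dataset.

```python
# Mixed-effects regression of a single gait feature on speed and demographics (hypothetical data).
import pandas as pd
import statsmodels.formula.api as smf

gait = pd.read_csv("gait_features.csv")   # hypothetical: one row per walking trial

m = smf.mixedlm(
    "peak_knee_flexion ~ walking_speed + age + C(sex) + bmi",
    data=gait,
    groups=gait["subject_id"],            # random intercept for repeated trials per subject
).fit()
print(m.summary())
```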
Denison, Julie; Kennedy, Caitlin; Tedrow, Virginia; O'Reilly, Kevin
2012-01-01
Abstract Objective To examine the relationship between condom social marketing programmes and condom use. Methods Standard systematic review and meta-analysis methods were followed. The review included studies of interventions in which condoms were sold, in which a local brand name(s) was developed for condoms, and in which condoms were marketed through a promotional campaign to increase sales. A definition of intervention was developed and standard inclusion criteria were followed in selecting studies. Data were extracted from each eligible study, and a meta-analysis of the results was carried out. Findings Six studies with a combined sample size of 23 048 met the inclusion criteria. One was conducted in India and five in sub-Saharan Africa. All studies were cross-sectional or serial cross-sectional. Three studies had a comparison group, although all lacked equivalence in sociodemographic characteristics across study arms. All studies randomly selected participants for assessments, although none randomly assigned participants to intervention arms. The random-effects pooled odds ratio for condom use was 2.01 (95% confidence interval, CI: 1.42–2.84) for the most recent sexual encounter and 2.10 (95% CI: 1.51–2.91) for a composite of all condom use outcomes. Tests for heterogeneity yielded significant results for both meta-analyses. Conclusion The evidence base for the effect of condom social marketing on condom use is small because few rigorous studies have been conducted. Meta-analyses showed a positive and statistically significant effect on increasing condom use, and all individual studies showed positive trends. The cumulative effect of condom social marketing over multiple years could be substantial. We strongly encourage more evaluations of these programmes with study designs of high rigour. PMID:22893745
Walton, David M; Putos, Joseph; Beattie, Tyler; MacDermid, Joy C
2016-07-01
The Brief Pain Inventory (BPI-SF) is a widely used generic pain interference scale; however, its factor structure remains unclear. An expanded 10-item version of the Interference subscale has been proposed, but the additional value of the 3 extra items has not been rigorously evaluated. The purpose of this study was to evaluate and contrast the factorial and concurrent validity of the original 7-item and 10-item versions of the BPI-SF in a large heterogeneous sample of patients with chronic pain. Exploratory and confirmatory factor analyses were conducted on independent subsets of the sample, and concurrent correlations with scales capturing similar constructs were evaluated. Two independent exploratory factor analyses (n=500 each) supported a single interference factor in both the 7- and 10-item versions, while confirmatory factor analysis (N=1000) suggested that a 2-factor structure (Physical and Affective) provided better fit. A 3-factor model, in which sleep interference was the third factor, improved model fit further. There was no significant difference in model fit between the 7- and 10-item versions. Concurrent associations with measures of general health, pain intensity, and pain-related cognitions were all in the anticipated direction and magnitude and did not differ by version of the BPI-SF. The addition of 3 extra items to the original 7-item Interference subscale of the BPI-SF did not improve psychometric properties. The combined results lead us to endorse a 3-factor structure (Physical, Affective, and Sleep Interference) as the more statistically and conceptually sound option. Copyright © 2016 Scandinavian Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehresman, David J.; Froehlich, John W.; Olsen, Geary W.
2007-02-15
Interest in human exposure to perfluorinated acids, including perfluorobutanesulfonate (PFBS), perfluorohexanesulfonate (PFHS), perfluorooctanesulfonate (PFOS), and perfluorooctanoate (PFOA) has led to their measurement in whole blood, plasma and serum. Comparison of measurements in these different blood-based matrices, however, has not been rigorously investigated to allow for across-matrix comparisons. This research evaluated concentrations of PFBS, PFHS, PFOS, and PFOA in whole blood collected in heparin (lithium) and ethylenediamine tetraacetic acid (EDTA), plasma samples collected in heparin and EDTA, and serum (from whole blood allowed to clot). Blood samples were collected from 18 voluntary participants employed at 3M Company. Solid phase extraction methods were used for all analytical sample preparations, and analyses were completed using high-pressure liquid chromatography/tandem mass spectrometry methods. Serum concentrations ranged from: limit of quantitation (LOQ, 5 ng/mL) to 25 ng/mL for PFBS; LOQ (5 ng/mL) to 75 ng/mL for PFHS; LOQ (5 ng/mL) to 880 ng/mL for PFOS; and LOQ (5 or 10 ng/mL) to 7320 ng/mL for PFOA. Values less than the LOQ were not included in the statistical analyses of the mean of the ratios of individual values for the matrices. PFBS was not quantifiable in most samples. Serum to plasma ratios for PFHS, PFOS, and PFOA were 1:1 and this ratio was independent of the level of concentrations measured. Serum or plasma to whole blood ratios, regardless of the anticoagulant used, approximated 2:1. The difference between plasma and serum and whole blood corresponded to volume displacement by red blood cells, suggesting that the fluorochemicals are not found intracellularly or attached to the red blood cells.
Rivaroxaban real-world evidence: Validating safety and effectiveness in clinical practice.
Beyer-Westendorf, Jan; Camm, A John; Coleman, Craig I; Tamayo, Sally
2016-09-28
Randomised controlled trials (RCTs) are considered the gold standard of clinical research as they use rigorous methodologies, detailed protocols, pre-specified statistical analyses and well-defined patient cohorts. However, RCTs do not take into account the complexity of real-world clinical decision-making. To tackle this, real-world data are being increasingly used to evaluate the long-term safety and effectiveness of a given therapy in routine clinical practice and in patients who may not be represented in RCTs, addressing key clinical questions that may remain. Real-world evidence plays a substantial role in supporting the use of non-vitamin K antagonist (VKA) oral anticoagulants (NOACs) in clinical practice. By providing data on patient profiles and the use of anticoagulation therapies in routine clinical practice, real-world evidence expands the current awareness of NOACs, helping to ensure that clinicians are well-informed on their use to implement patient-tailored clinical decisions. There are various issues with current anticoagulation strategies, including under- or overtreatment and frequent monitoring with VKAs. Real-world studies have demonstrated that NOAC use is increasing (Dresden NOAC registry and Global Anticoagulant Registry in the FIELD-AF [GARFIELD-AF]), as well as reaffirming the safety and effectiveness of rivaroxaban previously observed in RCTs (XArelto on preveNtion of sTroke and non-central nervoUS system systemic embolism in patients with non-valvular atrial fibrillation [XANTUS] and IMS Disease Analyzer). This article will describe the latest updates in real-world evidence across a variety of methodologies, such as non-interventional studies (NIS), registries and database analyses studies. It is anticipated that these studies will provide valuable clinical insights into the management of thromboembolism, and enhance the current knowledge on anticoagulant use and outcomes for patients.
Birkeland, S; Akse, L
2010-01-01
Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at time of processing on quality traits (color, texture, sensory, and microbiological) in injection-salted and cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.001) contraction (-7.9% ± 0.9%) on the caudal-cranial axis. No significant differences in instrumental color (a*, b*, C*, or h*), texture (hardness), or sensory traits (aroma, color, taste, and texture) were observed between pre- or post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had a significantly (P>0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.01) lighter, L*, (39.7 ± 1.0) than post-rigor fillets (37.8 ± 0.8) and had significantly lower (P<0.05) aerobic plate count (APC), 1.4 ± 0.4 log CFU/g against 2.6 ± 0.6 log CFU/g, and psychrotrophic count (PC), 2.1 ± 0.2 log CFU/g against 3.0 ± 0.5 log CFU/g, than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®
pytc: Open-Source Python Software for Global Analyses of Isothermal Titration Calorimetry Data.
Duvvuri, Hiranmayi; Wheeler, Lucas C; Harms, Michael J
2018-05-08
Here we describe pytc, an open-source Python package for global fits of thermodynamic models to multiple isothermal titration calorimetry experiments. Key features include simplicity, the ability to implement new thermodynamic models, a robust maximum likelihood fitter, a fast Bayesian Markov-Chain Monte Carlo sampler, rigorous implementation, extensive documentation, and full cross-platform compatibility. pytc fitting can be done using an application program interface or via a graphical user interface. It is available for download at https://github.com/harmslab/pytc .
2014-09-01
The NATO Science and Technology Organization. Science and Technology (S&T) in the NATO context is defined as the selective and rigorous ... generation and application of state-of-the-art, validated knowledge for defence and security purposes. S&T activities embrace scientific research ... engineering, operational research and analysis, synthesis, integration and validation of knowledge derived through the scientific method. In NATO, S&T is ...
Assessing significance in a Markov chain without mixing.
Chikina, Maria; Frieze, Alan; Pegden, Wesley
2017-03-14
We present a statistical test to detect that a presented state of a reversible Markov chain was not chosen from a stationary distribution. In particular, given a value function for the states of the Markov chain, we would like to show rigorously that the presented state is an outlier with respect to the values, by establishing a p value under the null hypothesis that it was chosen from a stationary distribution of the chain. A simple heuristic used in practice is to sample ranks of states from long random trajectories on the Markov chain and compare these with the rank of the presented state; if the presented state is a 0.1% outlier compared with the sampled ranks (its rank is in the bottom 0.1% of sampled ranks), then this observation should correspond to a p value of 0.001. This significance is not rigorous, however, without good bounds on the mixing time of the Markov chain. Our test is the following: Given the presented state in the Markov chain, take a random walk from the presented state for any number of steps. We prove that observing that the presented state is an ε-outlier on the walk is significant at p = 2ε under the null hypothesis that the state was chosen from a stationary distribution. We assume nothing about the Markov chain beyond reversibility and show that significance at p ≈ ε is best possible in general. We illustrate the use of our test with a potential application to the rigorous detection of gerrymandering in Congressional districting.
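A toy version of the test described above can be written in a few lines: start a random walk at the presented state, record the values of visited states, and if the presented state's value falls in the bottom ε of those values, report significance at p = 2ε. The chain and value function below are fabricated for illustration; the theorem assumes only reversibility.

```python
# Toy epsilon-outlier test on a reversible random walk (fabricated chain and values).
import numpy as np

rng = np.random.default_rng(1)
n = 200
W = rng.random((n, n))
W = W + W.T                               # symmetric weights -> reversible walk
P = W / W.sum(axis=1, keepdims=True)      # row-stochastic transition matrix
value = rng.random(n)                     # value function on states

def epsilon_outlier(P, value, start, steps=10_000, seed=2):
    """Walk from the presented state; eps = share of visited states with lower value."""
    walk_rng = np.random.default_rng(seed)
    state, visited = start, []
    for _ in range(steps):
        state = walk_rng.choice(len(value), p=P[state])
        visited.append(value[state])
    eps = float(np.mean(np.array(visited) < value[start]))
    return eps, 2 * eps                   # epsilon and the certified level p = 2*epsilon

eps, p = epsilon_outlier(P, value, start=0)
print(eps, p)
```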
NASA Astrophysics Data System (ADS)
Vereecken, Luc; Peeters, Jozef
2003-09-01
The rigorous implementation of transition state theory (TST) for a reaction system with multiple reactant rotamers and multiple transition state conformers is discussed by way of a statistical rate analysis of the 1,5-H-shift in 1-butoxy radicals, a prototype reaction for the important class of H-shift reactions in atmospheric chemistry. Several approaches for deriving a multirotamer TST expression are treated: oscillator versus (hindered) internal rotor models; distinguishable versus indistinguishable atoms; and direct count methods versus degeneracy factors calculated by (simplified) direct count methods or from symmetry numbers and number of enantiomers, where applicable. It is shown that the various treatments are fully consistent, even if the TST expressions themselves appear different. The 1-butoxy H-shift reaction is characterized quantum chemically using B3LYP-DFT; the performance of this level of theory is compared to other methods. Rigorous application of the multirotamer TST methodology in a harmonic oscillator approximation based on this data yields a rate coefficient of k(298 K, 1 atm) = 1.4×10⁵ s⁻¹, and an Arrhenius expression k(T, 1 atm) = 1.43×10¹¹ exp(-8.17 kcal mol⁻¹/RT) s⁻¹, which both closely match the experimental recommendations in the literature. The T-dependence is substantially influenced by the multirotamer treatment, as well as by the tunneling and fall-off corrections.
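As a quick arithmetic check of the quoted Arrhenius expression, evaluating k(T) = 1.43×10¹¹ exp(-8.17 kcal mol⁻¹/RT) s⁻¹ at 298 K does reproduce the reported k(298 K, 1 atm) of about 1.4×10⁵ s⁻¹:

```python
# Evaluate the Arrhenius expression quoted in the abstract at T = 298 K.
import math

A = 1.43e11            # pre-exponential factor, s^-1
Ea = 8.17              # activation energy, kcal/mol
R = 1.987e-3           # gas constant, kcal/(mol K)
T = 298.0              # temperature, K

k = A * math.exp(-Ea / (R * T))
print(f"{k:.2e} s^-1")   # ~1.45e5 s^-1, consistent with the reported 1.4e5 s^-1
```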
Assessing significance in a Markov chain without mixing
Chikina, Maria; Frieze, Alan; Pegden, Wesley
2017-01-01
We present a statistical test to detect that a presented state of a reversible Markov chain was not chosen from a stationary distribution. In particular, given a value function for the states of the Markov chain, we would like to show rigorously that the presented state is an outlier with respect to the values, by establishing a p value under the null hypothesis that it was chosen from a stationary distribution of the chain. A simple heuristic used in practice is to sample ranks of states from long random trajectories on the Markov chain and compare these with the rank of the presented state; if the presented state is a 0.1% outlier compared with the sampled ranks (its rank is in the bottom 0.1% of sampled ranks), then this observation should correspond to a p value of 0.001. This significance is not rigorous, however, without good bounds on the mixing time of the Markov chain. Our test is the following: Given the presented state in the Markov chain, take a random walk from the presented state for any number of steps. We prove that observing that the presented state is an ε-outlier on the walk is significant at p=2ε under the null hypothesis that the state was chosen from a stationary distribution. We assume nothing about the Markov chain beyond reversibility and show that significance at p≈ε is best possible in general. We illustrate the use of our test with a potential application to the rigorous detection of gerrymandering in Congressional districting. PMID:28246331
Increased mortality attributed to Chagas disease: a systematic review and meta-analysis.
Cucunubá, Zulma M; Okuwoga, Omolade; Basáñez, María-Gloria; Nouvellet, Pierre
2016-01-27
The clinical outcomes associated with Chagas disease remain poorly understood. In addition to the burden of morbidity, the burden of mortality due to Trypanosoma cruzi infection can be substantial, yet its quantification has eluded rigorous scrutiny. This is partly due to considerable heterogeneity between studies, which can influence the resulting estimates. There is a pressing need for accurate estimates of mortality due to Chagas disease that can be used to improve mathematical modelling, burden of disease evaluations, and cost-effectiveness studies. A systematic literature review was conducted to select observational studies comparing mortality in populations with and without a diagnosis of Chagas disease using the PubMed, MEDLINE, EMBASE, Web of Science and LILACS databases, without restrictions on language or date of publication. The primary outcome of interest was mortality (as all-cause mortality, sudden cardiac death, heart transplant or cardiovascular deaths). Data were analysed using a random-effects model to obtain the relative risk (RR) of mortality, the attributable risk percent (ARP), and the annual mortality rates (AMR). The I² statistic (proportion of variance in the meta-analysis due to study heterogeneity) was calculated. Sensitivity analyses and a publication bias test were also conducted. Twenty-five studies were selected for quantitative analysis, providing data on 10,638 patients, 53,346 patient-years of follow-up, and 2739 events. Pooled estimates revealed that Chagas disease patients have significantly higher AMR compared with non-Chagas disease patients (0.18 versus 0.10; RR = 1.74, 95% CI 1.49-2.03). Substantial heterogeneity was found among studies (I² = 67.3%). The ARP above background mortality was 42.5%. In a sub-analysis, patients were classified by clinical group (severe, moderate, asymptomatic). While RR did not differ significantly between clinical groups, important differences in AMR were found: AMR = 0.43 in Chagas vs. 0.29 in non-Chagas patients (RR = 1.40, 95% CI 1.21-1.62) in the severe group; AMR = 0.16 (Chagas) vs. 0.08 (non-Chagas) (RR = 2.10, 95% CI 1.52-2.91) in the moderate group, and AMR = 0.02 vs. 0.01 (RR = 1.42, 95% CI 1.14-1.77) in the asymptomatic group. Meta-regression showed no evidence of an effect of study-level covariates on the effect size. Publication bias was not statistically significant (Egger's test p=0.08). The results indicate a statistically significant excess of mortality due to Chagas disease that is shared among both symptomatic and asymptomatic populations.
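A hedged sketch of the kind of random-effects pooling behind reported RR and I² values, using the standard DerSimonian-Laird estimator on invented study-level data (not the review's data).

```python
import numpy as np

# Per-study log relative risks and standard errors (illustrative values only).
log_rr = np.log(np.array([1.5, 2.1, 1.4, 1.9]))
se = np.array([0.20, 0.25, 0.15, 0.30])

w_fixed = 1.0 / se**2                                  # inverse-variance (fixed-effect) weights
theta_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)

# Cochran's Q and the DerSimonian-Laird estimate of between-study variance tau^2
Q = np.sum(w_fixed * (log_rr - theta_fixed) ** 2)
k = len(log_rr)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

w_rand = 1.0 / (se**2 + tau2)                          # random-effects weights
theta = np.sum(w_rand * log_rr) / np.sum(w_rand)
se_theta = np.sqrt(1.0 / np.sum(w_rand))
I2 = max(0.0, (Q - (k - 1)) / Q) * 100                 # heterogeneity statistic I^2 (%)

rr = np.exp(theta)
lo, hi = np.exp(theta - 1.96 * se_theta), np.exp(theta + 1.96 * se_theta)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {I2:.0f}%")
```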
Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence
2016-06-01
observable to military law enforcement personnel. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or... indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses were conducted to test for...
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing risk analyses and the use of time-dependent covariates, by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and assure that the statistical process is overseen by a supervisor.
Video analysis of head blows leading to concussion in competition Taekwondo.
Koh, Jae O; Watkinson, E Jane; Yoon, Yong-Jin
2004-12-01
To analyse the situational and contextual factors surrounding concussions and head blows in Taekwondo. Prospective design. Direct observation, subject interview and videotape recording used. A total of 2328 competitors participated in the 2001 tournament, South Korea. All matches were recorded on videotape. All recipients of head blows were interviewed by athletic therapists and the researcher immediately after the match. The videotapes of concussions and head blows were analysed. A total of 1009 head blows including concussions were analysed. Head blows and concussions were most evident when the attacker was situated in a closed stance and received a single roundhouse kick. The most frequent anatomical site of the head impact was the temporal region. The frequency of head blows and concussions is high in Taekwondo. Development of blocking skills, safety education, rigorous enforcement of the competition rules and improvement of head-gear are recommended.
Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei
2015-01-01
This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH in chicken breast muscles at 24 h post-mortem. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, NaCl concentrations of 3% and 4% produced no great differences in the physicochemical and textural properties attributable to pre-rigor salting (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is minimally required to ensure a definite pre-rigor salting effect on chicken breast muscle.
Choi, Yun-Sang
2015-01-01
This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH in chicken breast muscles at 24 h post-mortem. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, NaCl concentrations of 3% and 4% produced no great differences in the physicochemical and textural properties attributable to pre-rigor salting (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is minimally required to ensure a definite pre-rigor salting effect on chicken breast muscle. PMID:26761884
New statistical potential for quality assessment of protein models and a survey of energy functions
2010-01-01
Background Scoring functions, such as molecular mechanics forcefields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility, on torsion angles, accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality. PMID:20226048
Eng, Kevin H; Schiller, Emily; Morrell, Kayla
2015-11-03
Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
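A minimal sketch of the restricted mean survival (RMS) idea promoted above, computed as the area under a Kaplan-Meier curve up to a horizon tau; the survival times below are invented, and this is not the authors' R appendix code.

```python
import numpy as np

time = np.array([5, 8, 12, 12, 20, 25, 31, 40, 44, 60], dtype=float)   # months
event = np.array([1, 1, 0, 1, 1, 0, 1, 0, 1, 0])                        # 1 = death observed
tau = 48.0                                                               # restriction horizon

order = np.argsort(time)
time, event = time[order], event[order]

# Kaplan-Meier: the survival curve steps down at each distinct event time.
surv, times, steps = 1.0, [0.0], [1.0]
for t in np.unique(time):
    d = np.sum((time == t) & (event == 1))       # deaths at t
    n = np.sum(time >= t)                        # number at risk just before t
    if d > 0:
        surv *= (1.0 - d / n)
        times.append(t)
        steps.append(surv)

# RMS(tau): integrate the step function S(t) from 0 to tau.
times, steps = np.array(times), np.array(steps)
grid = np.append(times[times < tau], tau)
rms = np.sum(np.diff(grid) * steps[: len(grid) - 1])
print(f"restricted mean survival up to {tau:.0f} months: {rms:.1f} months")
```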
Estimating maize production in Kenya using NDVI: Some statistical considerations
Lewis, J.E.; Rowland, James; Nadeau, A.
1998-01-01
A regression model approach using a normalized difference vegetation index (NDVI) has the potential for estimating crop production in East Africa. However, before production estimation can become a reality, the underlying model assumptions and statistical nature of the sample data (NDVI and crop production) must be examined rigorously. Annual maize production statistics from 1982-90 for 36 agricultural districts within Kenya were used as the dependent variable; median area NDVI (independent variable) values from each agricultural district and year were extracted from the annual maximum NDVI data set. The input data and the statistical association of NDVI with maize production for Kenya were tested systematically for the following items: (1) homogeneity of the data when pooling the sample, (2) gross data errors and influence points, (3) serial (time) correlation, (4) spatial autocorrelation and (5) stability of the regression coefficients. The results of using a simple regression model with NDVI as the only independent variable are encouraging (r = 0.75, p < 0.05) and illustrate that NDVI can be a responsive indicator of maize production, especially in areas of high NDVI spatial variability, which coincide with areas of production variability in Kenya.
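A hedged sketch of this kind of screening, on invented district-level numbers: a simple NDVI regression followed by a Durbin-Watson check for serial correlation, one of the items listed above.

```python
import numpy as np
from scipy import stats

# Illustrative values only, not the Kenya data.
ndvi = np.array([0.42, 0.48, 0.51, 0.39, 0.55, 0.47, 0.60, 0.44, 0.52, 0.58])
production = np.array([1.1, 1.4, 1.6, 0.9, 1.9, 1.3, 2.2, 1.2, 1.7, 2.0])  # e.g. 10^5 tonnes

fit = stats.linregress(ndvi, production)
print(f"slope = {fit.slope:.2f}, r = {fit.rvalue:.2f}, p = {fit.pvalue:.3f}")

# Durbin-Watson on residuals ordered in time flags first-order serial
# correlation; values near 2 indicate little autocorrelation.
resid = production - (fit.intercept + fit.slope * ndvi)
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
print(f"Durbin-Watson = {dw:.2f}")
```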
Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R
1983-07-01
The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.
RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.
Meltzer, S J; Auer, J
1908-01-01
Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions-nearly equimolecular to "physiological" solutions of sodium chloride-are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.
RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT
Meltzer, S. J.; Auer, John
1908-01-01
Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions—nearly equimolecular to "physiological" solutions of sodium chloride—are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle. PMID:19867124
Zhang, Harrison G; Ying, Gui-Shuang
2018-02-09
The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at ocular level, one study (1%) analysed the overall summary of ocular findings per individual and three (3%) studies used the paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
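A hedged simulation sketch (not from the review) of why ignoring the intereye correlation matters: under the null, a naive two-sample t-test on individual eyes rejects too often, while a test on per-patient summaries holds its nominal level.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_pat, rho, alpha, reps = 30, 0.7, 0.05, 2000
cov = np.array([[1.0, rho], [rho, 1.0]])          # intereye correlation rho

naive_rej = summary_rej = 0
for _ in range(reps):
    g1 = rng.multivariate_normal([0, 0], cov, size=n_pat)   # group 1: n_pat patients x 2 eyes
    g2 = rng.multivariate_normal([0, 0], cov, size=n_pat)   # group 2, same null mean
    # naive: treat every eye as an independent observation
    p_naive = stats.ttest_ind(g1.ravel(), g2.ravel()).pvalue
    # per-patient summary: average the two eyes before testing
    p_summary = stats.ttest_ind(g1.mean(axis=1), g2.mean(axis=1)).pvalue
    naive_rej += p_naive < alpha
    summary_rej += p_summary < alpha

print(f"type I error, eyes as independent: {naive_rej / reps:.3f}")
print(f"type I error, per-patient means:   {summary_rej / reps:.3f}")
```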
Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
Long persistence of rigor mortis at constant low temperature.
Varetto, Lorenzo; Curto, Ombretta
2005-01-06
We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; and in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor and in the two cadavers that were kept under observation "à outrance" we found the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis that is much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this datum must be considered when a corpse is found in those environmental conditions so that, when estimating the time of death, we are not misled by the long persistence of rigor mortis.
THE OPTICS OF REFRACTIVE SUBSTRUCTURE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Michael D.; Narayan, Ramesh, E-mail: mjohnson@cfa.harvard.edu
2016-08-01
Newly recognized effects of refractive scattering in the ionized interstellar medium have broad implications for very long baseline interferometry (VLBI) at extreme angular resolutions. Building upon work by Blandford and Narayan, we present a simplified, geometrical optics framework, which enables rapid, semi-analytic estimates of refractive scattering effects. We show that these estimates exactly reproduce previous results based on a more rigorous statistical formulation. We then derive new expressions for the scattering-induced fluctuations of VLBI observables such as closure phase, and we demonstrate how to calculate the fluctuations for arbitrary quantities of interest using a Monte Carlo technique.
Mathematical Analysis of a Coarsening Model with Local Interactions
NASA Astrophysics Data System (ADS)
Helmers, Michael; Niethammer, Barbara; Velázquez, Juan J. L.
2016-10-01
We consider particles on a one-dimensional lattice whose evolution is governed by nearest-neighbor interactions where particles that have reached size zero are removed from the system. Concentrating on configurations with infinitely many particles, we prove existence of solutions under a reasonable density assumption on the initial data and show that the vanishing of particles and the localized interactions can lead to non-uniqueness. Moreover, we provide a rigorous upper coarsening estimate and discuss generic statistical properties as well as some non-generic behavior of the evolution by means of heuristic arguments and numerical observations.
Bhatt, Divesh; Zuckerman, Daniel M.
2010-01-01
We performed “weighted ensemble” path-sampling simulations of adenylate kinase, using several semi-atomistic protein models. The models have an all-atom backbone with various levels of residue interactions. The primary result is that full statistically rigorous path sampling required only a few weeks of single-processor computing time with these models, indicating the addition of further chemical detail should be readily feasible. Our semi-atomistic path ensembles are consistent with previous biophysical findings: the presence of two distinct pathways, identification of intermediates, and symmetry of forward and reverse pathways. PMID:21660120
Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Landman, Drew
2015-01-01
Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.
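A minimal sketch of a response-surface fit of the kind described above, assuming invented test points and factors (angle of attack and wing tilt) rather than the actual tunnel data.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha = rng.uniform(-10, 10, 40)        # factor 1: angle of attack (deg), invented
tilt = rng.uniform(0, 90, 40)           # factor 2: wing tilt (deg), invented
CL = 0.08 * alpha + 0.01 * tilt - 0.0004 * alpha * tilt + rng.normal(0, 0.05, 40)

# Full quadratic response surface: intercept, linear, interaction, pure quadratic.
X = np.column_stack([np.ones_like(alpha), alpha, tilt,
                     alpha * tilt, alpha**2, tilt**2])
coef, *_ = np.linalg.lstsq(X, CL, rcond=None)
pred = X @ coef
print("coefficients:", np.round(coef, 4))
print("residual std:", np.round(np.std(CL - pred), 4))
```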
Recommendations for research design of telehealth studies.
Chumbler, Neale R; Kobb, Rita; Brennan, David M; Rabinowitz, Terry
2008-11-01
Properly designed randomized controlled trials (RCTs) are the gold standard to use when examining the effectiveness of telehealth interventions on clinical outcomes. Some published telehealth studies have employed well-designed RCTs. However, such methods are not always feasible and practical in particular settings. This white paper addresses not only the need for properly designed RCTs, but also offers alternative research designs, such as quasi-experimental designs, and statistical techniques that can be employed to rigorously assess the effectiveness of telehealth studies. This paper further offers design and measurement recommendations aimed at and relevant to administrative decision-makers, policymakers, and practicing clinicians.
Goodpaster, Aaron M.; Kennedy, Michael A.
2015-01-01
Currently, no standard metrics are used to quantify cluster separation in PCA or PLS-DA scores plots for metabonomics studies or to determine if cluster separation is statistically significant. Lack of such measures makes it virtually impossible to compare independent or inter-laboratory studies and can lead to confusion in the metabonomics literature when authors putatively identify metabolites distinguishing classes of samples based on visual and qualitative inspection of scores plots that exhibit marginal separation. While previous papers have addressed quantification of cluster separation in PCA scores plots, none have advocated routine use of a quantitative measure of separation that is supported by a standard and rigorous assessment of whether or not the cluster separation is statistically significant. Here quantification and statistical significance of separation of group centroids in PCA and PLS-DA scores plots are considered. The Mahalanobis distance is used to quantify the distance between group centroids, and the two-sample Hotelling's T² test is computed for the data, related to an F-statistic, and then an F-test is applied to determine if the cluster separation is statistically significant. We demonstrate the value of this approach using four datasets containing various degrees of separation, ranging from groups that had no apparent visual cluster separation to groups that had no visual cluster overlap. Widespread adoption of such concrete metrics to quantify and evaluate the statistical significance of PCA and PLS-DA cluster separation would help standardize reporting of metabonomics data. PMID:26246647
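A hedged sketch of the proposed metric on simulated scores: Mahalanobis distance between centroids, the two-sample Hotelling's T² statistic, and the associated F test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
g1 = rng.normal([0.0, 0.0], 1.0, size=(20, 2))    # group 1 scores (PC1, PC2), simulated
g2 = rng.normal([1.5, 0.5], 1.0, size=(25, 2))    # group 2 scores, simulated

n1, n2, p = len(g1), len(g2), g1.shape[1]
diff = g1.mean(axis=0) - g2.mean(axis=0)
S_pooled = ((n1 - 1) * np.cov(g1, rowvar=False) +
            (n2 - 1) * np.cov(g2, rowvar=False)) / (n1 + n2 - 2)

mahal2 = diff @ np.linalg.inv(S_pooled) @ diff            # squared Mahalanobis distance
T2 = (n1 * n2) / (n1 + n2) * mahal2                       # two-sample Hotelling's T^2
F = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * T2          # convert to an F statistic
p_value = stats.f.sf(F, p, n1 + n2 - p - 1)

print(f"Mahalanobis distance = {np.sqrt(mahal2):.2f}, T^2 = {T2:.2f}, "
      f"F = {F:.2f}, p = {p_value:.2g}")
```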
Rigor Made Easy: Getting Started
ERIC Educational Resources Information Center
Blackburn, Barbara R.
2012-01-01
Bestselling author and noted rigor expert Barbara Blackburn shares the secrets to getting started, maintaining momentum, and reaching your goals. Learn what rigor looks like in the classroom, understand what it means for your students, and get the keys to successful implementation. Learn how to use rigor to raise expectations, provide appropriate…
Close Early Learning Gaps with Rigorous DAP
ERIC Educational Resources Information Center
Brown, Christopher P.; Mowry, Brian
2015-01-01
Rigorous DAP (developmentally appropriate practices) is a set of 11 principles of instruction intended to help close early childhood learning gaps. Academically rigorous learning environments create the conditions for children to learn at high levels. While academic rigor focuses on one dimension of education--academic--DAP considers the whole…
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
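A hedged sketch of the mixed-effects approach advocated above, assuming the statsmodels and pandas packages and invented repeated parasitaemia measurements, with a random intercept per host so that within-host correlation is not ignored.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_hosts, n_days = 20, 8
host = np.repeat(np.arange(n_hosts), n_days)
day = np.tile(np.arange(n_days), n_hosts)
treated = (host % 2).astype(float)                     # half the hosts "treated" (invented)
host_effect = rng.normal(0, 1.0, n_hosts)[host]        # induces within-host correlation
y = 5 + 0.3 * day - 0.5 * treated * day + host_effect + rng.normal(0, 0.5, len(host))

df = pd.DataFrame({"y": y, "day": day, "treated": treated, "host": host})
# Random intercept per host; fixed effects for day, treatment and their interaction.
model = smf.mixedlm("y ~ day * treated", df, groups=df["host"])
result = model.fit()
print(result.summary())
```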
NASA Technical Reports Server (NTRS)
Lau, K.-M.; Chan, P. H.
1983-01-01
Attention is given to the low-frequency variability of outgoing longwave radiation (OLR) fluctuations, their possible correlations over different parts of the globe, and their relationships with teleconnections obtained from other meteorological parameters, for example, geopotential and temperature fields. Simultaneous relationships with respect to the Southern Oscillation (Namais, 1978; Barnett, 1981) signal and the reference OLR fluctuation over the equatorial central Pacific are investigated. Emphasis is placed on the relative importance of the Southern Oscillation (SO) signal over preferred regions. Using lag cross-correlation statistics, possible lagged relationships between the tropics and midlatitudes and their relationships with the SO are then investigated. Only features that are consistent with present knowledge of the dynamics of the system are emphasized. Certain features which may not meet rigorous statistical significance tests but yet are either expected a priori from independent observations or are predicted from dynamical theories are also explored.
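A minimal sketch of a lag cross-correlation calculation of the kind used in such analyses, on synthetic series (not OLR or Southern Oscillation data).

```python
import numpy as np

rng = np.random.default_rng(5)
n, true_lag = 240, 3                                   # e.g. 20 years of monthly anomalies
x = rng.normal(size=n + true_lag)
so_index = x[true_lag:]                                # "reference" series
olr = 0.8 * x[:n] + 0.3 * rng.normal(size=n)           # lags the reference by true_lag steps

def lag_corr(a, b, lag):
    """Correlation of a(t) with b(t + lag)."""
    if lag >= 0:
        return np.corrcoef(a[: n - lag], b[lag:])[0, 1]
    return np.corrcoef(a[-lag:], b[: n + lag])[0, 1]

# The correlation should peak near lag +3, i.e. the second series follows the
# reference by about three steps.
for lag in range(-6, 7):
    print(f"lag {lag:+d}: r = {lag_corr(so_index, olr, lag):+.2f}")
```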
The importance of early investigation and publishing in an emergent health and environment crisis.
Murase, Kaori
2016-10-01
To minimize the damage resulting from a long-term environmental disaster such as the 2011 Fukushima nuclear accident in Japan, early disclosure of research data by scientists and prompt decision making by government authorities are required in place of careful, time-consuming research and deliberation about the consequences and cause of the accident. A Bayesian approach with flexible statistical modeling helps scientists and encourages government authorities to make decisions based on environmental data available in the early stages of a disaster. It is evident from Fukushima and similar accidents that classical research methods involving statistical methodologies that require rigorous experimental design and complex data sets are too cumbersome and delay important actions that may be critical in the early stages of an environmental disaster. Integr Environ Assess Manag 2016;12:680-682. © 2016 SETAC.
A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.
Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas
2018-02-23
We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results on some canonical dynamical models (stochastic processes with and without correlations, chaotic processes) complemented by extensive numerical simulations for a range of processes which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state of the art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines including cases in neurobiology, finance and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics and a graph-theoretic extraction procedure allows to use these features for statistical learning purposes.
Space-Time Data fusion for Remote Sensing Applications
NASA Technical Reports Server (NTRS)
Braverman, Amy; Nguyen, H.; Cressie, N.
2011-01-01
NASA has been collecting massive amounts of remote sensing data about Earth's systems for more than a decade. Missions are selected to be complementary in quantities measured, retrieval techniques, and sampling characteristics, so these datasets are highly synergistic. To fully exploit this, a rigorous methodology for combining data with heterogeneous sampling characteristics is required. For scientific purposes, the methodology must also provide quantitative measures of uncertainty that propagate input-data uncertainty appropriately. We view this as a statistical inference problem. The true but not directly observed quantities form a vector-valued field continuous in space and time. Our goal is to infer those true values or some function of them, and to provide uncertainty quantification for those inferences. We use a spatiotemporal statistical model that relates the unobserved quantities of interest at point-level to the spatially aggregated, observed data. We describe and illustrate our method using CO2 data from two NASA data sets.
On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method
Roux, Benoît; Weare, Jonathan
2013-01-01
An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method. PMID:23464140
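A hedged sketch of maximum entropy reweighting for the simple case of a single observable: solve for the Lagrange multiplier that makes the reweighted ensemble average match a target value. The samples and target below are synthetic, not taken from the paper.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(6)
A = rng.normal(2.0, 1.0, 10000)      # observable evaluated on prior-ensemble samples (synthetic)
A_exp = 2.4                          # target (e.g. experimental) average, invented

def reweighted_mean(lam):
    w = np.exp(-lam * (A - A.mean()))          # centred for numerical stability
    return np.sum(w * A) / np.sum(w)

# Solve <A>_lambda = A_exp by bracketing the root of the residual.
lam = brentq(lambda l: reweighted_mean(l) - A_exp, -10.0, 10.0)
w = np.exp(-lam * (A - A.mean()))
w /= w.sum()
print(f"lambda = {lam:.3f}, reweighted <A> = {np.sum(w * A):.3f}")
```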
Dependence of exponents on text length versus finite-size scaling for word-frequency distributions
NASA Astrophysics Data System (ADS)
Corral, Álvaro; Font-Clos, Francesc
2017-08-01
Some authors have recently argued that a finite-size scaling law for the text-length dependence of word-frequency distributions cannot be conceptually valid. Here we give solid quantitative evidence for the validity of this scaling law, using both careful statistical tests and analytical arguments based on the generalized central-limit theorem applied to the moments of the distribution (and obtaining a novel derivation of Heaps' law as a by-product). We also find that the picture of word-frequency distributions with power-law exponents that decrease with text length [X. Yan and P. Minnhagen, Physica A 444, 828 (2016), 10.1016/j.physa.2015.10.082] does not stand with rigorous statistical analysis. Instead, we show that the distributions are perfectly described by power-law tails with stable exponents, whose values are close to 2, in agreement with the classical Zipf's law. Some misconceptions about scaling are also clarified.
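A hedged sketch of the standard maximum-likelihood (Hill-type) estimator for a power-law tail exponent, applied to synthetic Pareto data with exponent 2; this illustrates the kind of estimate behind the claim of stable exponents close to 2, not the authors' exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(7)
gamma_true, xmin = 2.0, 5.0
u = rng.random(50000)
freqs = xmin * (1.0 - u) ** (-1.0 / (gamma_true - 1.0))    # Pareto sample, density ~ x^(-gamma)

tail = freqs[freqs >= xmin]
gamma_hat = 1.0 + len(tail) / np.sum(np.log(tail / xmin))  # MLE of the tail exponent
se = (gamma_hat - 1.0) / np.sqrt(len(tail))                # asymptotic standard error
print(f"estimated exponent = {gamma_hat:.3f} +/- {se:.3f}")
```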
Statistical significance of combinatorial regulations
Terada, Aika; Okada-Hatakeyama, Mariko; Tsuda, Koji; Sese, Jun
2013-01-01
More than three transcription factors often work together to enable cells to respond to various signals. The detection of combinatorial regulation by multiple transcription factors, however, is not only computationally nontrivial but also extremely unlikely because of multiple testing correction. The exponential growth in the number of tests forces us to set a strict limit on the maximum arity. Here, we propose an efficient branch-and-bound algorithm called the “limitless arity multiple-testing procedure” (LAMP) to count the exact number of testable combinations and calibrate the Bonferroni factor to the smallest possible value. LAMP lists significant combinations without any limit, whereas the family-wise error rate is rigorously controlled under the threshold. In the human breast cancer transcriptome, LAMP discovered statistically significant combinations of as many as eight binding motifs. This method may contribute to uncover pathways regulated in a coordinated fashion and find hidden associations in heterogeneous data. PMID:23882073
Asteroid orbital error analysis: Theory and application
NASA Technical Reports Server (NTRS)
Muinonen, K.; Bowell, Edward
1992-01-01
We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation does give the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
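A minimal sketch of the law of error propagation mentioned above: mapping an orbital-element covariance matrix through a position Jacobian to obtain an uncertainty ellipsoid. The Jacobian and covariance here are illustrative placeholders, not a real ephemeris calculation.

```python
import numpy as np

rng = np.random.default_rng(8)

cov_elements = np.diag([1e-8, 1e-8, 1e-7, 1e-7, 1e-7, 1e-6])  # 6x6 element covariance (placeholder)
J = rng.normal(0, 1.0, (3, 6))                                # d(position)/d(elements) at some epoch

cov_position = J @ cov_elements @ J.T                         # 3x3 positional covariance
eigvals, _ = np.linalg.eigh(cov_position)
print("1-sigma ellipsoid semi-axes:", np.sqrt(np.maximum(eigvals, 0)))
```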
Quantum-statistical theory of microwave detection using superconducting tunnel junctions
NASA Astrophysics Data System (ADS)
Deviatov, I. A.; Kuzmin, L. S.; Likharev, K. K.; Migulin, V. V.; Zorin, A. B.
1986-09-01
A quantum-statistical theory of microwave and millimeter-wave detection using superconducting tunnel junctions is developed, with a rigorous account of quantum, thermal, and shot noise arising from fluctuation sources associated with the junctions, signal source, and matching circuits. The problem of the noise characterization in the quantum sensitivity range is considered and a general noise parameter Theta(N) is introduced. This parameter is shown to be an adequate figure of merit for most receivers of interest while some devices can require a more complex characterization. Analytical expressions and/or numerically calculated plots for Theta(N) are presented for the most promising detection modes including the parametric amplification, heterodyne mixing, and quadratic videodetection, using both the quasiparticle-current and the Cooper-pair-current nonlinearities. Ultimate minimum values of Theta(N) for each detection mode are compared and found to be in agreement with limitations imposed by the quantum-mechanical uncertainty principle.
Patounakis, George; Hill, Micah J
2018-06-01
The purpose of the current review is to describe the common pitfalls in design and statistical analysis of reproductive medicine studies. It serves to guide both authors and reviewers toward reducing the incidence of spurious statistical results and erroneous conclusions. The large amount of data gathered in IVF cycles leads to problems with multiplicity, multicollinearity, and overfitting of regression models. Furthermore, the use of the word 'trend' to describe nonsignificant results has increased in recent years. Finally, methods to accurately account for female age in infertility research models are becoming more common and necessary. The pitfalls of study design and analysis reviewed provide a framework for authors and reviewers to approach clinical research in the field of reproductive medicine. By providing a more rigorous approach to study design and analysis, the literature in reproductive medicine will have more reliable conclusions that can stand the test of time.
NASA Technical Reports Server (NTRS)
Talpe, Matthieu J.; Nerem, R. Steven; Forootan, Ehsan; Schmidt, Michael; Lemoine, Frank G.; Enderlin, Ellyn M.; Landerer, Felix W.
2017-01-01
We construct long-term time series of Greenland and Antarctic ice sheet mass change from satellite gravity measurements. A statistical reconstruction approach is developed based on a principal component analysis (PCA) to combine high-resolution spatial modes from the Gravity Recovery and Climate Experiment (GRACE) mission with the gravity information from conventional satellite tracking data. Uncertainties of this reconstruction are rigorously assessed; they include temporal limitations for short GRACE measurements, spatial limitations for the low-resolution conventional tracking data measurements, and limitations of the estimated statistical relationships between low- and high-degree potential coefficients reflected in the PCA modes. Trends of mass variations in Greenland and Antarctica are assessed against a number of previous studies. The resulting time series for Greenland show a higher rate of mass loss than other methods before 2000, while the Antarctic ice sheet appears heavily influenced by interannual variations.
Efficacy of Curcuma for Treatment of Osteoarthritis.
Perkins, Kimberly; Sahy, William; Beckett, Robert D
2017-01-01
The objective of this review is to identify, summarize, and evaluate clinical trials to determine the efficacy of curcuma in the treatment of osteoarthritis. A literature search for interventional studies assessing efficacy of curcuma was performed, resulting in 8 clinical trials. Studies have investigated the effect of curcuma on pain, stiffness, and functionality in patients with knee osteoarthritis. Curcuma-containing products consistently demonstrated statistically significant improvement in osteoarthritis-related endpoints compared with placebo, with one exception. When compared with active control, curcuma-containing products were similar to nonsteroidal anti-inflammatory drugs, and potentially to glucosamine. While statistically significant differences in outcomes were reported in a majority of studies, the small magnitude of effect and presence of major study limitations hinder application of these results. Further rigorous studies are needed prior to recommending curcuma as an effective alternative therapy for knee osteoarthritis. © The Author(s) 2016.
Statistical mechanics of an ideal active fluid confined in a channel
NASA Astrophysics Data System (ADS)
Wagner, Caleb; Baskaran, Aparna; Hagan, Michael
The statistical mechanics of ideal active Brownian particles (ABPs) confined in a channel is studied by obtaining the exact solution of the steady-state Smoluchowski equation for the 1-particle distribution function. The solution is derived using results from the theory of two-way diffusion equations, combined with an iterative procedure that is justified by numerical results. Using this solution, we quantify the effects of confinement on the spatial and orientational order of the ensemble. Moreover, we rigorously show that both the bulk density and the fraction of particles on the channel walls obey simple scaling relations as a function of channel width. By considering a constant-flux steady state, an effective diffusivity for ABPs is derived which shows signatures of the persistent motion that characterizes ABP trajectories. Finally, we discuss how our techniques generalize to other active models, including systems whose activity is modeled in terms of an Ornstein-Uhlenbeck process.
Promoting the Multidimensional Character of Scientific Reasoning †
Bradshaw, William S.; Nelson, Jennifer; Adams, Byron J.; Bell, John D.
2017-01-01
This study reports part of a long-term program to help students improve scientific reasoning using higher-order cognitive tasks set in the discipline of cell biology. This skill was assessed using problems requiring the construction of valid conclusions drawn from authentic research data. We report here efforts to confirm the hypothesis that data interpretation is a complex, multifaceted exercise. Confirmation was obtained using a statistical treatment showing that various such problems rank students differently—each contains a unique set of cognitive challenges. Additional analyses of performance results have allowed us to demonstrate that individuals differ in their capacity to navigate five independent generic elements that constitute successful data interpretation: biological context, connection to course concepts, experimental protocols, data inference, and integration of isolated experimental observations into a coherent model. We offer these aspects of scientific thinking as a “data analysis skills inventory,” along with usable sample problems that illustrate each element. Additionally, we show that this kind of reasoning is rigorous in that it is difficult for most novice students, who are unable to intuitively implement strategies for improving these skills. Instructors armed with knowledge of the specific challenges presented by different types of problems can provide specific helpful feedback during formative practice. The use of this instructional model is most likely to require changes in traditional classroom instruction. PMID:28512524
Detection, isolation and diagnosability analysis of intermittent faults in stochastic systems
NASA Astrophysics Data System (ADS)
Yan, Rongyi; He, Xiao; Wang, Zidong; Zhou, D. H.
2018-02-01
Intermittent faults (IFs) have the properties of unpredictability, non-determinacy, inconsistency and repeatability, switching systems between faulty and healthy status. In this paper, the fault detection and isolation (FDI) problem of IFs in a class of linear stochastic systems is investigated. The detection and isolation of IFs involves: (1) detecting all the appearing and disappearing times of an IF; (2) detecting each appearing (disappearing) time of the IF before the subsequent disappearing (appearing) time; and (3) determining where the IFs happen. Based on the outputs of the observers we designed, a novel set of residuals is constructed by using the sliding-time window technique, and two hypothesis tests are proposed to detect all the appearing and disappearing times of IFs. The isolation problem of IFs is also considered. Furthermore, within a statistical framework, the definition of the diagnosability of IFs is proposed, and a sufficient condition is brought forward for the diagnosability of IFs. Quantitative performance analysis results for the false alarm rate and missing detection rate are discussed, and the influences of some key parameters of the proposed scheme on performance indices such as the false alarm rate and missing detection rate are analysed rigorously. The effectiveness of the proposed scheme is illustrated via a simulation example of an unmanned helicopter longitudinal control system.
A Biomarker Bakeoff in Early Stage Pancreatic Cancer — EDRN Public Portal
Previous research in EDRN laboratories and elsewhere has produced several candidate biomarker(s) for the detection of early-stage pancreatic ductal adenocarcinoma (PDAC), many of which show promise for significantly improving upon the performance of the current best marker, CA19-9. As yet, the relative performance of the markers in combination is not known because a rigorous comparison using a common sample set has not been performed. A direct comparison of the potential biomarkers in a comparative study (“biomarker bakeoff”) would enable an objective determination of which candidates should move forward for further validation, as well as an assessment of the potential value of using novel combinations of the biomarkers. The gastrointestinal collaborative group within the EDRN is in an optimal position to carry out such a study given its shared resources and interactive structure. In this project, the two pancreatic CVCs in the EDRN will provide samples to be distributed to four laboratories with promising biomarkers. The laboratories will run their own assays and perform initial analyses on the blinded PDAC and control samples. Our biostatistical collaborator, Dr. Huang at FHCRC, will perform the statistical evaluations. Biomarkers meeting the predetermined performance criteria will move forward to further validation using the EDRN reference set. In addition, we will determine whether any novel combinations of biomarkers should be further tested.
Allebeck, Peter; Mastekaasa, Arne
2004-01-01
Extensive information is available from official statistics and descriptive studies on the association between different socio-demographic background factors and sickness absence. This information addresses age, gender, place of residence, and socio-economic status. However, few studies have thoroughly analysed these background factors, and rigorous scientific evidence on the causal relationship between these factors and sick leave is lacking. Regarding the family, we found no scientific evidence that marital status or children living at home were associated with sickness absence. However, we found limited scientific evidence for an effect of divorce. Regarding work-related factors, we found limited scientific evidence for an effect of physically stressful work, and moderate scientific evidence for low psychological control over the work situation. We found limited scientific evidence for a correlation in time between unemployment and sickness absence, but insufficient scientific evidence for the causes of the association. There was moderate scientific evidence that the amount of sickness absence is influenced by the design of the social insurance system, but insufficient evidence on the magnitude of change required to influence the level of sickness absence. Essentially the same results apply to disability pension, although the number of studies is small. However, we found moderate scientific evidence for the effects of socio-economic status, which could be explained partly by childhood experiences.
Nikolaus, Cassandra J; Muzaffar, Henna; Nickols-Richardson, Sharon M
2016-09-01
To evaluate evidence regarding grocery store tours as an effective nutrition education medium for improving nutrition knowledge and food-related behaviors. A systematic literature review of studies published from 1984 to 2015 concerning grocery store (or supermarket) tours and impact on nutrition knowledge and behaviors. Three investigators independently reviewed articles, extracted details, and assessed the quality of each study. Of 307 citations identified, 8 were reviewed and 6 were of neutral quality. Increases in nutrition knowledge were reported in 4 studies, as evaluated by investigator-designed quizzes, with short intervals between tours and assessments. Six programs assessed behavior change using subjective reports or objective purchasing behavior measures; 2 studies did not perform statistical analyses. The 6 studies that reported positive health-related outcomes had varying topics, tour lengths, and target audiences. Grocery store tours are increasingly used as an avenue for nutrition education to improve knowledge and/or alter food selection behaviors and may result in positive outcomes, but it is unknown whether these outcomes persist for longer than 3 months after the tour and whether there are common attributes of effective grocery store tours. More rigorous studies with uniform methodology in study design and outcome measures are needed to confirm the effectiveness of supermarket tours. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
The MWA Transients Survey (MWATS).
NASA Astrophysics Data System (ADS)
Bell, M.; Murphy, T.; Kaplan, D. L.; Croft, S. D.; Hancock, P.; Rowlinson, A.; Wayth, R.; Gaensler, B.; Hurley-Walker, N.; Offringa, A.; Loi, C.; Bannister, K.; Trott, C.; Marquart, J.
2017-01-01
We propose the continuation of the MWA transients survey to search for and monitor low frequency transient and variable radio sources in the southern sky. This proposal is aimed at commensally utilising data from the GLEAM-X (G0008) project in semester 2017-A. The aim of this commensal data acquisition is to commission long baseline observations for transient science. In particular this will involve studying the impact of the ionosphere on calibration and imaging, and developing the techniques needed to produce science quality data products. The proposed drift scans with LST locking (see G0008 proposal) are particularly exciting as we can test image subtraction for transient and variable identification. This survey is targeted at studying objects such as AGN (intrinsic and extrinsic variability), long duration synchrotron emitters, pulsars and transients of unknown origin. The maps generated from this survey will be analysed with the Variables and Slow Transients (VAST) detection pipeline. The motivation for this survey is as follows: (i) To obtain temporal data on an extremely large and robust sample of low frequency sources to explore and quantify both intrinsic and extrinsic variability; (ii) To search and find new classes of low frequency radio transients that previously remained undetected and obscured from multi-wavelength discovery; (iii) To place rigorous statistics on the occurrence of both transients and variables prior to the Australian SKA era.
Good pharmacovigilance practices: technology enabled.
Nelson, Robert C; Palsulich, Bruce; Gogolak, Victor
2002-01-01
The assessment of spontaneous reports is most effective when it is conducted within a defined and rigorous process. The framework for good pharmacovigilance process (GPVP) is proposed as a subset of good postmarketing surveillance process (GPMSP), a functional structure for both a public health and corporate risk management strategy. GPVP has good practices that implement each step within a defined process. These practices are designed to efficiently and effectively detect and alert the drug safety professional to new and potentially important information on drug-associated adverse reactions. These practices are enabled by applied technology designed specifically for the review and assessment of spontaneous reports. Specific practices include rules-based triage, active query prompts for severe organ insults, contextual single case evaluation, statistical proportionality and correlational checks, case-series analyses, and templates for signal work-up and interpretation. These practices and the overall GPVP are supported by state-of-the-art web-based systems with powerful analytical engines, workflow and audit trails to allow validated systems support for valid drug safety signalling efforts. It is also important to understand that a process has a defined set of steps and no one step can stand independently. Specifically, advanced use of technical alerting methods in isolation can mislead and allow one to misunderstand priorities and relative value. In the end, pharmacovigilance is a clinical art and a component process to the science of pharmacoepidemiology and risk management.
Nalin, David R
2002-02-22
Evidence based vaccinology (EBV) is the identification and use of the best evidence in making and implementing decisions during all of the stages of the life of a vaccine, including pre-licensure vaccine development and post-licensure manufacture and research, and utilization of the vaccine for disease control. Vaccines, unlike most pharmaceuticals, are in a continuous process of development both before and after licensure. Changes in biologics manufacturing technology and changes that vaccines induce in population and disease biology lead to periodic review of regimens (and sometimes dosage) based on changing immunologic data or public perceptions relevant to vaccine safety and effectiveness. EBV includes the use of evidence based medicine (EBM) both in clinical trials and in national disease containment programs. The rationale for EBV is that the highest evidentiary standards are required to maintain a rigorous scientific basis of vaccine quality control in manufacture and to ensure valid determination of vaccine efficacy, field effectiveness and safety profiles (including post-licensure safety monitoring), cost-benefit analyses, and risk:benefit ratios. EBV is increasingly based on statistically validated, clearly defined laboratory, manufacturing, clinical and epidemiological research methods and procedures, codified as good laboratory practices (GLP), good manufacturing practices (GMP), good clinical research practices (GCRP) and in clinical and public health practice (good vaccination practices, GVP). Implementation demands many data-driven decisions made by a spectrum of specialists pre- and post-licensure, and is essential to maintaining public confidence in vaccines.
Krompecher, T; Bergerioux, C
1988-01-01
The influence of electrocution on the evolution of rigor mortis was studied on rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, complete rigor develops as early as 1 h post-mortem (p.m.), compared with 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In cases of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower compared with the ante-mortem electrocution cases. The results are complemented by two practical observations on human electrocution cases.
Machkovech, Heather M; Bedford, Trevor; Suchard, Marc A; Bloom, Jesse D
2015-11-01
Numerous experimental studies have demonstrated that CD8(+) T cells contribute to immunity against influenza by limiting viral replication. It is therefore surprising that rigorous statistical tests have failed to find evidence of positive selection in the epitopes targeted by CD8(+) T cells. Here we use a novel computational approach to test for selection in CD8(+) T-cell epitopes. We define all epitopes in the nucleoprotein (NP) and matrix protein (M1) with experimentally identified human CD8(+) T-cell responses and then compare the evolution of these epitopes in parallel lineages of human and swine influenza viruses that have been diverging since roughly 1918. We find a significant enrichment of substitutions that alter human CD8(+) T-cell epitopes in NP of human versus swine influenza virus, consistent with the idea that these epitopes are under positive selection. Furthermore, we show that epitope-altering substitutions in human influenza virus NP are enriched on the trunk versus the branches of the phylogenetic tree, indicating that viruses that acquire these mutations have a selective advantage. However, even in human influenza virus NP, sites in T-cell epitopes evolve more slowly than do nonepitope sites, presumably because these epitopes are under stronger inherent functional constraint. Overall, our work demonstrates that there is clear selection from CD8(+) T cells in human influenza virus NP and illustrates how comparative analyses of viral lineages from different hosts can identify positive selection that is otherwise obscured by strong functional constraint. There is a strong interest in correlates of anti-influenza immunity that are protective against diverse virus strains. CD8(+) T cells provide such broad immunity, since they target conserved viral proteins. An important question is whether T-cell immunity is sufficiently strong to drive influenza virus evolution. Although many studies have shown that T cells limit viral replication in animal models and are associated with decreased symptoms in humans, no studies have proven with statistical significance that influenza virus evolves under positive selection to escape T cells. Here we use comparisons of human and swine influenza viruses to rigorously demonstrate that human influenza virus evolves under pressure to fix mutations in the nucleoprotein that promote escape from T cells. We further show that viruses with these mutations have a selective advantage since they are preferentially located on the "trunk" of the phylogenetic tree. Overall, our results show that CD8(+) T cells targeting nucleoprotein play an important role in shaping influenza virus evolution. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
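The study's actual test works on substitutions mapped onto viral phylogenies; as a simplified stand-in, the enrichment comparison itself can be illustrated with a Fisher's exact test on a 2x2 table of hypothetical substitution counts (not the paper's data):

```python
from scipy import stats

# Hypothetical counts for illustration only:
# rows = human lineage, swine lineage; columns = epitope-altering, non-epitope substitutions.
table = [[28, 72],
         [12, 88]]

odds_ratio, p_value = stats.fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.3g}")
```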
Rigorous Schools and Classrooms: Leading the Way
ERIC Educational Resources Information Center
Williamson, Ronald; Blackburn, Barbara R.
2010-01-01
Turn your school into a student-centered learning environment, where rigor is at the heart of instruction in every classroom. From the bestselling author of "Rigor is Not a Four-Letter Word," Barbara Blackburn, and award-winning educator Ronald Williamson, this comprehensive guide to establishing a schoolwide culture of rigor is for principals and…
Rigor Revisited: Scaffolding College Student Learning by Incorporating Their Lived Experiences
ERIC Educational Resources Information Center
Castillo-Montoya, Milagros
2018-01-01
This chapter explores how students' lived experiences contribute to the rigor of their thinking. Insights from research indicate faculty can enhance rigor by accounting for the many ways it may surface in the classroom. However, to see this type of rigor, we must revisit the way we conceptualize it for higher education.
Mungure, Tanyaradzwa E; Bekhit, Alaa El-Din A; Birch, E John; Stewart, Ian
2016-04-01
The effects of rigor temperature (5, 15, 20 and 25°C), ageing (3, 7, 14, and 21 days) and display time on meat quality and lipid oxidative stability of hot boned beef M. Semimembranosus (SM) muscle were investigated. Ultimate pH (pH(u)) was rapidly attained at higher rigor temperatures. Electrical conductivity increased with rigor temperature (p<0.001). Tenderness, purge and cooking losses were not affected by rigor temperature; however purge loss and tenderness increased with ageing (p<0.01). Lightness (L*) and redness (a*) of the SM increased as rigor temperature increased (p<0.01). Lipid oxidation was assessed using (1)H NMR where changes in aliphatic to olefinic (R(ao)) and diallylmethylene (R(ad)) proton ratios can be rapidly monitored. R(ad), R(ao), PUFA and TBARS were not affected by rigor temperature, however ageing and display increased lipid oxidation (p<0.05). This study shows that rigor temperature manipulation of hot boned beef SM muscle does not have adverse effects on lipid oxidation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kantardjiev, Alexander A
2015-04-05
A cluster of strongly interacting ionization groups in protein molecules with irregular ionization behavior is suggestive of a specific structure-function relationship. However, their computational treatment is unconventional (e.g., a naive self-consistent iterative algorithm fails to converge). A stringent treatment requires evaluation of Boltzmann-averaged statistical mechanics sums and estimation of the electrostatic energy of each microstate. irGPU: Irregular strong interactions in proteins--a GPU solver is a novel solution to a versatile problem in protein biophysics--atypical protonation behavior of coupled groups. The computational severity of the problem is alleviated by parallelization (via GPU kernels), which is applied to the electrostatic interaction evaluation (including explicit electrostatics via the fast multipole method) as well as to estimation of the statistical mechanics sums (partition function). Special attention is given to ease of use and encapsulation of theoretical details without sacrificing the rigor of the computational procedures. irGPU is not just a solution-in-principle but a promising practical application with the potential to entice the community into a deeper understanding of the principles governing biomolecular mechanisms. © 2015 Wiley Periodicals, Inc.
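irGPU's internal algorithms are not detailed in the abstract; the sketch below only illustrates the underlying statistical-mechanics sum that such a solver parallelizes: enumerating the protonation microstates of a few coupled sites, weighting each by its Boltzmann factor, and averaging per-site protonation. All pKa values, couplings, and constants are made-up illustrative numbers.

```python
import itertools
import math

# Illustrative parameters for three coupled titratable sites (all values hypothetical).
kT_ln10 = 1.364                      # kcal/mol at 298 K: converts pKa units to energy
kT = 0.593                           # kcal/mol at 298 K
pKa_intrinsic = [4.0, 6.5, 7.2]      # intrinsic pKa of each site
coupling = {(0, 1): 1.5, (0, 2): 0.3, (1, 2): 2.0}   # pairwise interaction energies (kcal/mol)
pH = 7.0

def microstate_energy(state):
    """Energy of a protonation microstate (1 = protonated, 0 = deprotonated)."""
    e = sum(s * kT_ln10 * (pH - pKa_intrinsic[i]) for i, s in enumerate(state))
    e += sum(g for (i, j), g in coupling.items() if state[i] and state[j])
    return e

states = list(itertools.product([0, 1], repeat=3))
weights = [math.exp(-microstate_energy(s) / kT) for s in states]
Z = sum(weights)                     # partition function over all 2^N microstates

for site in range(3):
    frac = sum(w for s, w in zip(states, weights) if s[site]) / Z
    print(f"site {site}: average protonation = {frac:.3f}")
```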
Machine learning to analyze images of shocked materials for precise and accurate measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dresselhaus-Cooper, Leora; Howard, Marylesa; Hock, Margaret C.
A supervised machine learning algorithm, called locally adaptive discriminant analysis (LADA), has been developed to locate boundaries between identifiable image features that have varying intensities. LADA is an adaptation of image segmentation, which includes techniques that find the positions of image features (classes) using statistical intensity distributions for each class in the image. In order to place a pixel in the proper class, LADA considers the intensity at that pixel and the distribution of intensities in local (nearby) pixels. This paper presents the use of LADA to provide, with statistical uncertainties, the positions and shapes of features within ultrafast images of shock waves. We demonstrate the ability to locate image features including crystals, density changes associated with shock waves, and material jetting caused by shock waves. This algorithm can analyze images that exhibit a wide range of physical phenomena because it does not rely on comparison to a model. LADA enables analysis of images from shock physics with statistical rigor independent of underlying models or simulations.
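LADA itself is not reproduced here; the sketch below only illustrates the general idea described above, classifying each pixel by combining its own intensity with a local-neighborhood statistic under per-class Gaussian intensity models. The class parameters, window size, and blending weight are hypothetical.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def classify_pixels(image, class_means, class_stds, window=7, local_weight=0.5):
    """Assign each pixel to the class whose Gaussian intensity model best explains
    a blend of the pixel's own intensity and its local-neighborhood mean."""
    local_mean = uniform_filter(image.astype(float), size=window)
    blended = (1 - local_weight) * image + local_weight * local_mean
    # Negative log-likelihood of each class at every pixel.
    nll = np.stack([
        0.5 * ((blended - m) / s) ** 2 + np.log(s)
        for m, s in zip(class_means, class_stds)
    ])
    return np.argmin(nll, axis=0)

# Synthetic two-class example: dark background, bright circular feature, plus noise.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[:128, :128]
truth = ((xx - 64) ** 2 + (yy - 64) ** 2 < 30 ** 2).astype(float)
image = 0.2 + 0.6 * truth + rng.normal(0, 0.1, truth.shape)

labels = classify_pixels(image, class_means=[0.2, 0.8], class_stds=[0.1, 0.1])
print("fraction labelled as feature:", labels.mean())
```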
Extended q-Gaussian and q-exponential distributions from gamma random variables
NASA Astrophysics Data System (ADS)
Budini, Adrián A.
2015-05-01
The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
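The paper's two-gamma construction is not reproduced here; the sketch below checks a closely related superstatistical route that the abstract also mentions: an exponential variable whose rate is itself gamma distributed is marginally Lomax distributed, which corresponds to a q-exponential with q = (alpha + 2)/(alpha + 1). Parameter values are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha, beta = 3.0, 2.0                       # gamma shape and rate (illustrative values)

# Superstatistics: exponential variable with a gamma-distributed rate.
rates = rng.gamma(shape=alpha, scale=1.0 / beta, size=200_000)
x = rng.exponential(scale=1.0 / rates)

# The marginal is Lomax(alpha, scale=beta), i.e. a q-exponential with q = (alpha+2)/(alpha+1).
ks = stats.kstest(x, stats.lomax(alpha, scale=beta).cdf)
print("q =", (alpha + 2) / (alpha + 1), "| KS statistic =", round(ks.statistic, 4))
```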
Karl Pearson and eugenics: personal opinions and scientific rigor.
Delzell, Darcie A P; Poliak, Cathy D
2013-09-01
The influence of personal opinions and biases on scientific conclusions is a threat to the advancement of knowledge. Expertise and experience do not render one immune to this temptation. In this work, one of the founding fathers of statistics, Karl Pearson, is used as an illustration of how even the most talented among us can produce misleading results when inferences are made without caution or reference to potential bias and other analysis limitations. A study performed by Pearson on British Jewish schoolchildren is examined in light of ethical and professional statistical practice. The methodology used and inferences made by Pearson and his coauthor are sometimes questionable and offer insight into how Pearson's support of eugenics and his own British nationalism could have potentially influenced his often careless and far-fetched inferences. A short background on Pearson's work and beliefs is provided, along with an in-depth examination of the authors' overall experimental design and statistical practices. In addition, portions of the study regarding intelligence and tuberculosis are discussed in more detail, along with historical reactions to their work.
Skelly, Daniel A.; Johansson, Marnie; Madeoy, Jennifer; Wakefield, Jon; Akey, Joshua M.
2011-01-01
Variation in gene expression is thought to make a significant contribution to phenotypic diversity among individuals within populations. Although high-throughput cDNA sequencing offers a unique opportunity to delineate the genome-wide architecture of regulatory variation, new statistical methods need to be developed to capitalize on the wealth of information contained in RNA-seq data sets. To this end, we developed a powerful and flexible hierarchical Bayesian model that combines information across loci to allow both global and locus-specific inferences about allele-specific expression (ASE). We applied our methodology to a large RNA-seq data set obtained in a diploid hybrid of two diverse Saccharomyces cerevisiae strains, as well as to RNA-seq data from an individual human genome. Our statistical framework accurately quantifies levels of ASE with specified false-discovery rates, achieving high reproducibility between independent sequencing platforms. We pinpoint loci that show unusual and biologically interesting patterns of ASE, including allele-specific alternative splicing and transcription termination sites. Our methodology provides a rigorous, quantitative, and high-resolution tool for profiling ASE across whole genomes. PMID:21873452
Trasande, Leonardo; Vandenberg, Laura N; Bourguignon, Jean-Pierre; Myers, John Peterson; Slama, Remy; Saal, Frederick vom; Zoeller, Robert Thomas
2017-01-01
Evidence increasingly confirms that synthetic chemicals disrupt the endocrine system and contribute to disease and disability across the lifespan. Despite a United Nations Environment Programme/WHO report affirmed by over 100 countries at the Fourth International Conference on Chemicals Management, ‘manufactured doubt’ continues to be cast as a cloud over rigorous, peer-reviewed and independently funded scientific data. This study describes the sources of doubt and their social costs, and suggested courses of action by policymakers to prevent disease and disability. The problem is largely based on the available data, which are all too limited. Rigorous testing programmes should not simply focus on oestrogen, androgen and thyroid. Tests should have proper statistical power. ‘Good laboratory practice’ (GLP) hardly represents a proper or even gold standard for laboratory studies of endocrine disruption. Studies should be evaluated with regard to the contamination of negative controls, responsiveness to positive controls and dissection techniques. Flaws in many GLP studies have been identified, yet regulatory agencies rely on these flawed studies. Peer-reviewed and unbiased research, rather than ‘sound science’, should be used to evaluate endocrine-disrupting chemicals. PMID:27417427
NASA Technical Reports Server (NTRS)
Zhang, Z.; Meyer, K.; Platnick, S.; Oreopoulos, L.; Lee, D.; Yu, H.
2013-01-01
This paper describes an efficient and unique method for computing the shortwave direct radiative effect (DRE) of aerosol residing above low-level liquid-phase clouds using CALIOP and MODIS data. It accounts for the overlapping of aerosol and cloud rigorously by utilizing the joint histogram of cloud optical depth and cloud top pressure. Effects of sub-grid scale cloud and aerosol variations on DRE are accounted for. It is computationally efficient through using grid-level cloud and aerosol statistics, instead of pixel-level products, and a pre-computed look-up table in radiative transfer calculations. We verified that for smoke over the southeast Atlantic Ocean the method yields a seasonal mean instantaneous shortwave DRE that generally agrees with more rigorous pixel-level computation within 4%. We have also computed the annual mean instantaneous shortwave DRE of light-absorbing aerosols (i.e., smoke and polluted dust) over global ocean based on 4 yr of CALIOP and MODIS data. We found that the variability of the annual mean shortwave DRE of above-cloud light-absorbing aerosol is mainly driven by the optical depth of the underlying clouds.
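The actual CALIOP/MODIS look-up table and joint histograms are not reproduced here; the sketch below only shows the bookkeeping step the method rests on, weighting per-bin DRE values from a pre-computed table by a grid cell's joint histogram of cloud optical depth and cloud top pressure. All numbers are hypothetical.

```python
import numpy as np

# Hypothetical joint histogram for one grid cell: rows = cloud optical depth bins,
# columns = cloud top pressure bins; entries are cloud fractions summing to <= 1.
joint_hist = np.array([[0.05, 0.10, 0.02],
                       [0.15, 0.30, 0.08],
                       [0.05, 0.10, 0.05]])

# Hypothetical pre-computed look-up table: above-cloud aerosol DRE (W m^-2) for the
# same bins, at this grid cell's aerosol optical depth and solar geometry.
dre_lut = np.array([[-2.0, -1.5, -1.0],
                    [ 1.0,  2.5,  3.5],
                    [ 4.0,  6.0,  8.0]])

cloudy_dre = np.sum(joint_hist * dre_lut)          # contribution from cloudy sub-columns
clear_fraction = 1.0 - joint_hist.sum()
clear_dre = -3.0 * clear_fraction                  # hypothetical clear-sky aerosol DRE
print("grid-cell mean aerosol DRE:", cloudy_dre + clear_dre, "W m^-2")
```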
NASA Technical Reports Server (NTRS)
Zhang, Z.; Meyer, K.; Platnick, S.; Oreopoulos, L.; Lee, D.; Yu, H.
2014-01-01
This paper describes an efficient and unique method for computing the shortwave direct radiative effect (DRE) of aerosol residing above low-level liquid-phase clouds using CALIOP and MODIS data. It accounts for the overlapping of aerosol and cloud rigorously by utilizing the joint histogram of cloud optical depth and cloud top pressure. Effects of sub-grid scale cloud and aerosol variations on DRE are accounted for. It is computationally efficient through using grid-level cloud and aerosol statistics, instead of pixel-level products, and a pre-computed look-up table in radiative transfer calculations. We verified that for smoke over the southeast Atlantic Ocean the method yields a seasonal mean instantaneous shortwave DRE that generally agrees with more rigorous pixel-level computation within 4%. We have also computed the annual mean instantaneous shortwave DRE of light-absorbing aerosols (i.e., smoke and polluted dust) over global ocean based on 4 yr of CALIOP and MODIS data. We found that the variability of the annual mean shortwave DRE of above-cloud light-absorbing aerosol is mainly driven by the optical depth of the underlying clouds.
Rigorous Proof of the Boltzmann-Gibbs Distribution of Money on Connected Graphs
NASA Astrophysics Data System (ADS)
Lanchier, Nicolas
2017-04-01
Models in econophysics, i.e., the emerging field of statistical physics that applies the main concepts of traditional physics to economics, typically consist of large systems of economic agents who are characterized by the amount of money they have. In the simplest model, at each time step, one agent gives one dollar to another agent, with both agents being chosen independently and uniformly at random from the system. Numerical simulations of this model suggest that, at least when the number of agents and the average amount of money per agent are large, the distribution of money converges to an exponential distribution reminiscent of the Boltzmann-Gibbs distribution of energy in physics. The main objective of this paper is to give a rigorous proof of this result and show that the convergence to the exponential distribution holds more generally when the economic agents are located on the vertices of a connected graph and interact locally with their neighbors rather than globally with all the other agents. We also study a closely related model where, at each time step, agents buy with a probability proportional to the amount of money they have, and prove that in this case the limiting distribution of money is Poissonian.
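A minimal Monte Carlo sketch of the dynamics described above, with agents on a cycle graph giving one dollar per step to a random neighbor. Agent counts and step counts are arbitrary, and a short run gives only rough agreement with the limiting distribution.

```python
import numpy as np

rng = np.random.default_rng(42)
n_agents, avg_money, n_steps = 200, 5, 400_000
money = np.full(n_agents, avg_money)

for _ in range(n_steps):
    giver = rng.integers(n_agents)
    if money[giver] == 0:
        continue                                            # an agent with no money gives nothing
    receiver = (giver + rng.choice([-1, 1])) % n_agents      # random neighbor on a cycle graph
    money[giver] -= 1
    money[receiver] += 1

# The discrete analogue of the exponential (Boltzmann-Gibbs) law is geometric
# with mean equal to the average money per agent.
p = 1.0 / (avg_money + 1.0)
for m in range(6):
    print(f"P(money = {m}): empirical {np.mean(money == m):.3f}, "
          f"Boltzmann-Gibbs (geometric) {p * (1 - p) ** m:.3f}")
```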
Closed loop statistical performance analysis of N-K knock controllers
NASA Astrophysics Data System (ADS)
Peyton Jones, James C.; Shayestehmanesh, Saeed; Frey, Jesse
2017-09-01
The closed loop performance of engine knock controllers cannot be rigorously assessed from single experiments or simulations because knock behaves as a random process and therefore the response belongs to a random distribution also. In this work a new method is proposed for computing the distributions and expected values of the closed loop response, both in steady state and in response to disturbances. The method takes as its input the control law, and the knock propensity characteristic of the engine which is mapped from open loop steady state tests. The method is applicable to the 'n-k' class of knock controllers in which the control action is a function only of the number of cycles n since the last control move, and the number k of knock events that have occurred in this time. A Cumulative Summation (CumSum) based controller falls within this category, and the method is used to investigate the performance of the controller in a deeper and more rigorous way than has previously been possible. The results are validated using onerous Monte Carlo simulations, which confirm both the validity of the method and its high computational efficiency.
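The paper's analytic method and its CumSum control law are not given in the abstract; the sketch below only illustrates the Monte Carlo baseline such a method is validated against, using a hypothetical retard-on-knock, advance-after-clean-cycles controller and a made-up logistic knock-propensity map. Repeated runs expose the run-to-run spread that motivates a statistical treatment of the closed-loop response.

```python
import numpy as np

rng = np.random.default_rng(7)

def knock_probability(advance_deg):
    """Hypothetical knock-propensity map: per-cycle knock probability vs. spark advance."""
    return 1.0 / (1.0 + np.exp(-(advance_deg - 20.0)))

def simulate(n_cycles=5_000, retard=1.0, advance_step=0.25, clean_cycles_to_advance=20):
    """Event-driven controller: retard on a knock event, creep forward after clean cycles."""
    spark, clean, knocks, spark_sum = 15.0, 0, 0, 0.0
    for _ in range(n_cycles):
        if rng.random() < knock_probability(spark):
            spark -= retard
            clean = 0
            knocks += 1
        else:
            clean += 1
            if clean >= clean_cycles_to_advance:
                spark += advance_step
                clean = 0
        spark_sum += spark
    return knocks / n_cycles, spark_sum / n_cycles

# A single run is one draw from the closed-loop response distribution, so repeat and summarise.
runs = np.array([simulate() for _ in range(50)])
print(f"knock rate: mean {runs[:, 0].mean():.4f}, std {runs[:, 0].std():.4f}")
print(f"mean spark advance: {runs[:, 1].mean():.2f} deg")
```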
Statistical Inference for Data Adaptive Target Parameters.
Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J
2016-05-01
Consider one observes n i.i.d. copies of a random variable with a probability distribution that is known to be an element of a particular statistical model. In order to define our statistical target we partition the sample in V equal size sub-samples, and use this partitioning to define V splits in an estimation sample (one of the V subsamples) and corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V-sample specific target parameters. We present an estimator (and corresponding central limit theorem) of this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed within this paper provides a new impetus for a greater involvement of statistical inference into problems that are being increasingly addressed by clever, yet ad hoc pattern finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules are shown and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
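A minimal sketch of the sample-splitting scheme described above: each parameter-generating sample adaptively picks a target (here, hypothetically, the covariate most correlated with the outcome), that target is then estimated on the complementary estimation sample, and the V estimates are averaged. The data and the target-choosing algorithm are illustrative only.

```python
import numpy as np
from sklearn.model_selection import KFold

rng = np.random.default_rng(3)
n, p = 1000, 5
X = rng.normal(size=(n, p))
y = 0.8 * X[:, 2] + rng.normal(size=n)          # covariate 2 is the truly predictive one

V = 5
estimates = []
for gen_idx, est_idx in KFold(n_splits=V, shuffle=True, random_state=0).split(X):
    # Parameter-generating sample: choose the target adaptively
    # (here: the covariate most correlated with the outcome).
    cors = [abs(np.corrcoef(X[gen_idx, j], y[gen_idx])[0, 1]) for j in range(p)]
    j_star = int(np.argmax(cors))
    # Estimation sample: estimate the chosen target parameter (its correlation with y).
    estimates.append(np.corrcoef(X[est_idx, j_star], y[est_idx])[0, 1])

print("data-adaptive target estimate (average over V splits):", np.mean(estimates))
```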
High and low rigor temperature effects on sheep meat tenderness and ageing.
Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J
2002-02-01
Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LT's was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18 rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.001). The mean sarcomere length values of meat samples for 18 and 35°C rigor at each ageing time were significantly different (P<0.001), the samples at 35°C being shorter. When the short sarcomere length values and corresponding shear force values were removed for further data analysis, the shear force values for the 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.
Adachi, Tetsuya; Pezzotti, Giuseppe; Yamamoto, Toshiro; Ichioka, Hiroaki; Boffelli, Marco; Zhu, Wenliang; Kanamura, Narisato
2015-05-01
A systematic investigation, based on highly spectrally resolved Raman spectroscopy, was undertaken to examine the efficacy of vibrational assessments in locating chemical and crystallographic fingerprints for the characterization of dental caries and the early detection of non-cavitated carious lesions. Raman results published by other authors have indicated possible approaches for this method. However, they conspicuously lacked physical insight at the molecular scale and, thus, the rigor necessary to prove the efficacy of this spectroscopy method. After solving basic physical challenges in a companion paper, we apply those solutions here in the form of newly developed Raman algorithms for practical dental research. Relevant differences in mineral crystallite (average) orientation and texture distribution were revealed for diseased enamel at different stages compared with healthy mineralized enamel. Clear spectroscopic features could be directly translated into a rigorous and quantitative classification of the crystallographic and chemical characteristics of diseased enamel structures. The Raman procedure enabled us to trace back otherwise invisible characteristics in early caries, in the translucent zone (i.e., the advancing front of the disease) and in the body of the lesion of cavitated caries.
Bhopal, Raj
2017-03-01
Rigorous evaluation of associations in epidemiology is essential, especially given big data, data mining, and hypothesis-free analyses. There is a precedent in making judgments on associations in the monographs of the International Agency for Research on Cancer; however, only the carcinogenic effects of exposures are examined. The idea of a World Council of Epidemiology and Causality to undertake rigorous, independent, comprehensive examination of associations has been debated, including in a workshop at the International Epidemiology Association's 20th World Congress of Epidemiology, 2014. The objective of the workshop was both to, briefly, debate the idea and set out further questions and next steps. The principal conclusion from feedback including from the workshop is that the World Council of Epidemiology and Causality idea, notwithstanding challenges, has promise and deserves more debate. The preferred model is for a small independent body working closely with relevant partners with a distributed approach to tasks. Recommendations are contextualized in contemporary approaches in causal thinking in epidemiology. Copyright © 2017 Elsevier Inc. All rights reserved.
1992-10-01
[Report front matter, list of tables: summary statistics (N=8) and results of statistical analyses for impact tests performed on the forefoot of unworn footwear; summary statistics (N=8) and results for the forefoot of worn footwear; summary statistics (N=4) and results of statistical analyses for impact tests.] ... used tests to assess heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978). Later, the number of tests was
2011-01-01
Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well-known Q and I² statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I² statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
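A minimal sketch of the standard Q and I² computation from study-level effect estimates and standard errors (the generalised Q statistic discussed above modifies the weighting, but the structure is the same). The effect estimates below are hypothetical.

```python
import numpy as np

# Hypothetical study-level log hazard ratios and standard errors.
effects = np.array([-0.25, -0.10, -0.40, 0.05, -0.30])
ses = np.array([0.12, 0.15, 0.20, 0.18, 0.10])

w = 1.0 / ses**2                          # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)  # fixed-effect pooled estimate
Q = np.sum(w * (effects - pooled) ** 2)
df = len(effects) - 1
I2 = max(0.0, (Q - df) / Q) * 100

# DerSimonian-Laird between-study variance implied by Q.
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%, tau^2 = {tau2:.4f}")
```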
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
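An example of the a priori power analysis the review found under-reported: the sample size per group needed to detect Cohen's small, medium, and large standardized effects with 80% power at alpha = 0.05 in a two-sample t test.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    n_per_group = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80)
    print(f"{label} effect (d={d}): ~{n_per_group:.0f} participants per group")
```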
Exploring Student Perceptions of Rigor Online: Toward a Definition of Rigorous Learning
ERIC Educational Resources Information Center
Duncan, Heather E.; Range, Bret; Hvidston, David
2013-01-01
Technological advances in the last decade have impacted delivery methods of university courses. More and more courses are offered in a variety of formats. While academic rigor is a term often used, its definition is less clear. This mixed-methods study explored graduate student conceptions of rigor in the online learning environment embedded…
Calhelha, Ricardo C; Martínez, Mireia A; Prieto, M A; Ferreira, Isabel C F R
2017-10-23
The development of convenient tools for describing and quantifying the effects of standard and novel therapeutic agents is essential for the research community, to perform more precise evaluations. Although mathematical models and quantification criteria have been exchanged in the last decade between different fields of study, there are relevant methodologies that lack proper mathematical descriptions and standard criteria to quantify their responses. As a result, part of the relevant information that could be drawn from the experimental results, and the quantification of its statistical reliability, is lost. Despite its relevance, there is no standard form of the in vitro endpoint tumor cell line assays (TCLA) that enables the evaluation of the cytotoxic dose-response effects of anti-tumor drugs. The analysis of all the specific problems associated with the diverse nature of the available TCLA used is infeasible. However, since most TCLA share the main objectives and similar operative requirements, we have chosen the sulforhodamine B (SRB) colorimetric assay for cytotoxicity screening of tumor cell lines as an experimental case study. In this work, the common biological and practical non-linear dose-response mathematical models are tested against experimental data and, following several statistical analyses, the model based on the Weibull distribution was confirmed as a convenient approximation to test the cytotoxic effectiveness of anti-tumor compounds. Then, the advantages and disadvantages of all the different parametric criteria derived from the model, which enable the quantification of the dose-response drug effects, are extensively discussed. A model and standard criteria for easily performing comparisons between different compounds are therefore established. The advantages include a simple application, provision of parametric estimations that characterize the response as standard criteria, economization of experimental effort, and the ability to perform rigorous comparisons among the effects of different compounds and experimental approaches. In all experimental data fitted, the calculated parameters were always statistically significant, the equations proved to be consistent and the coefficient of determination was, in most of the cases, higher than 0.98.
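The paper's exact Weibull parameterization and the SRB data are not reproduced in the abstract; the sketch below fits one common Weibull-type dose-response form to synthetic data, purely to illustrate the kind of parametric criteria (e.g., the dose giving a 50% response) such a model yields.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_response(dose, d50, shape):
    """Weibull-type dose-response: fraction of cells killed as dose increases.

    One common parameterization (not necessarily the paper's):
    response = 1 - exp(-ln(2) * (dose / d50) ** shape), so response(d50) = 0.5.
    """
    return 1.0 - np.exp(-np.log(2.0) * (dose / d50) ** shape)

# Synthetic dose-response data for illustration only.
rng = np.random.default_rng(5)
dose = np.array([0.5, 1, 2, 4, 8, 16, 32, 64], dtype=float)
true = weibull_response(dose, d50=6.0, shape=1.4)
observed = np.clip(true + rng.normal(0, 0.03, dose.size), 0, 1)

params, cov = curve_fit(weibull_response, dose, observed, p0=[5.0, 1.0])
errs = np.sqrt(np.diag(cov))
print(f"d50 = {params[0]:.2f} +/- {errs[0]:.2f},  shape = {params[1]:.2f} +/- {errs[1]:.2f}")
```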
Anticonvulsants for alcohol withdrawal.
Minozzi, Silvia; Amato, Laura; Vecchi, Simona; Davoli, Marina
2010-03-17
Alcohol abuse and dependence represent a most serious health problem worldwide with major social, interpersonal and legal implications. Besides benzodiazepines, anticonvulsants are often used for the treatment of alcohol withdrawal symptoms. Anticonvulsant drugs are indicated for the treatment of alcohol withdrawal syndrome, alone or in combination with benzodiazepine treatments. In spite of the wide use, the exact role of the anticonvulsants for the treatment of alcohol withdrawal has not yet been adequately assessed. To evaluate the effectiveness and safety of anticonvulsants in the treatment of alcohol withdrawal. We searched the Cochrane Drugs and Alcohol Group's Register of Trials (December 2009), PubMed, EMBASE, CINAHL (1966 to December 2009), EconLIT (1969 to December 2009). Parallel searches on web sites of health technology assessment and related agencies, and their databases. Randomized controlled trials (RCTs) examining the effectiveness, safety and overall risk-benefit of anticonvulsants in comparison with a placebo or other pharmacological treatment. All patients were included regardless of age, gender, nationality, and outpatient or inpatient therapy. Two authors independently screened and extracted data from studies. Fifty-six studies, with a total of 4076 participants, met the inclusion criteria. Comparing anticonvulsants with placebo, there were no statistically significant differences for the six outcomes considered. Comparing anticonvulsants versus other drugs (19 outcomes considered), results favour anticonvulsants only in the comparison of carbamazepine versus benzodiazepines (oxazepam and lorazepam) for alcohol withdrawal symptoms (CIWA-Ar score): 3 studies, 262 participants, MD -1.04 (-1.89 to -0.20); none of the other comparisons reached statistical significance. Comparing different anticonvulsants, there were no statistically significant differences in the two outcomes considered. Comparing anticonvulsants plus other drugs versus other drugs (3 outcomes considered), results from one study, 72 participants, favour paraldehyde plus chloral hydrate versus chlordiazepoxide for severe, life-threatening side effects, RR 0.12 (0.03 to 0.44). Results of this review do not provide sufficient evidence in favour of anticonvulsants for the treatment of AWS. There are some suggestions that carbamazepine may actually be more effective in treating some aspects of alcohol withdrawal when compared to benzodiazepines, the current first-line regimen for alcohol withdrawal syndrome. Anticonvulsants seem to have limited side effects, although adverse effects are not rigorously reported in the analysed trials.
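A minimal sketch of how a risk ratio and its 95% CI of the kind quoted above (RR 0.12, 0.03 to 0.44) are computed from 2x2 trial counts; the counts below are hypothetical and chosen only to give a similar ratio, not taken from the included study.

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b):
    """Risk ratio of group A vs group B with an approximate 95% CI on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical counts: 2/36 severe events with treatment vs 16/36 with comparator.
print(risk_ratio_ci(2, 36, 16, 36))   # RR ~ 0.12 with a wide confidence interval
```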
The Keys to Success in Doctoral Studies: A Preimmersion Course.
Salani, Deborah; Albuja, Laura Dean; Azaiza, Khitam
2016-01-01
This article will review an innovative on-line preimmersion course for a hybrid doctor of nursing practice (DNP) program and a traditional face-to-face doctor of philosophy nursing program. The doctoral candidates include both postbaccalaureate and postmaster's students. The authors of the preimmersion course developed and initiated the course in order to address various issues that have surfaced in discussions between students and faculty. Examples of common themes identified include writing skills, statistics, life-work-school balance, and navigating instructional technology. Doctoral studies may pose challenges to students studying nursing, in regard to academic rigor and experiencing on-line education for the first time, especially for students who have been out of school for an extended amount of time or are not accustomed to a nontraditional classroom; thus, having a preimmersion course established may facilitate a smooth transition to rigorous academic studies in a hybrid program. The course, which was developed and delivered through Blackboard, a learning management system, includes the following 9 preimmersion modules: academic strategies (learning styles, creating an effective PowerPoint presentation), library support (introduction to the university library, literature review tutorial, and citation styles), mindfulness, wellness, statistics essentials, writing express, DNP capstone, netiquette, and DNP/doctor of philosophy mentorship. Each module consists of various tools that may promote student success in specific courses and the programs in general. The purpose of designing the preimmersion course is to decrease attrition rates and increase success of the students. While the majority of students have succeeded in their coursework and been graduated from the program, the authors of this article found that many students struggled with the work, life, and school balance. Future work will include the evaluation of results from graduate students enrolled in the program. Copyright © 2016 Elsevier Inc. All rights reserved.
Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N
2018-03-01
Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced using complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset) and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of which 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study in comparison with multiple imputation which included all 6117 patients. The 2 methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.
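The study's NSQIP variables and imputation model are not reproduced here; the sketch below contrasts complete case analysis with a single iterative imputation (a stand-in for full multiple imputation, which would generate several imputed datasets and pool results with Rubin's rules) on synthetic data where missingness depends on age.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 4000
age = rng.normal(65, 10, n)
albumin = 4.2 - 0.01 * (age - 65) + rng.normal(0, 0.4, n)
outcome = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 - 0.8 * (albumin - 4.2)))))

# Younger patients are more likely to be missing labs (missingness is not completely at random).
missing = rng.random(n) < 1 / (1 + np.exp((age - 60) / 5))
albumin_obs = albumin.copy()
albumin_obs[missing] = np.nan

# Complete case analysis: drop every record with a missing albumin value.
cc = ~np.isnan(albumin_obs)
fit_cc = sm.Logit(outcome[cc], sm.add_constant(albumin_obs[cc])).fit(disp=0)

# Imputation: fill albumin from the other covariate, then refit on all records.
X = np.column_stack([age, albumin_obs])
imputed = IterativeImputer(random_state=0).fit_transform(X)
fit_mi = sm.Logit(outcome, sm.add_constant(imputed[:, 1])).fit(disp=0)

print("complete case albumin coefficient:", round(fit_cc.params[1], 3))
print("imputed data albumin coefficient:", round(fit_mi.params[1], 3))
```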
Zhang, Yiming; Jin, Quan; Wang, Shuting; Ren, Ren
2011-05-01
The mobile behavior of 1481 peptides in ion mobility spectrometry (IMS), which are generated by protease digestion of the Drosophila melanogaster proteome, is modeled and predicted based on two different types of characterization methods, i.e., a sequence-based approach and a structure-based approach. In this procedure, the sequence-based approach considers both the amino acid composition of a peptide and the local environment profile of each amino acid in the peptide; the structure-based approach is performed with the CODESSA protocol, which regards a peptide as a common organic compound and generates more than 200 statistically significant variables to characterize the whole structure profile of a peptide molecule. Subsequently, the nonlinear support vector machine (SVM) and Gaussian process (GP) as well as linear partial least squares (PLS) regression are employed to correlate the structural parameters of the characterizations with the IMS drift times of these peptides. The obtained quantitative structure-spectrum relationship (QSSR) models are evaluated rigorously and investigated systematically via both one-deep and two-deep cross-validations as well as the rigorous Monte Carlo cross-validation (MCCV). We also give a comprehensive comparison of the resulting statistics arising from the different combinations of variable types with modeling methods and find that the sequence-based approach can give QSSR models with better fitting ability and predictive power but worse interpretability than the structure-based approach. In addition, because QSSR modeling with the sequence-based approach does not require preparing energy-minimized peptide structures before modeling, it is considerably more efficient than modeling with the structure-based approach. Copyright © 2011 Elsevier Ltd. All rights reserved.
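The CODESSA descriptors and IMS drift times are not available here; the sketch below only shows the general modelling and validation pattern named above: a support vector regression scored by repeated random train/test splits, one simple form of Monte Carlo cross-validation, applied to synthetic descriptor data.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import ShuffleSplit, cross_val_score

rng = np.random.default_rng(2)
n_peptides, n_descriptors = 300, 40
X = rng.normal(size=(n_peptides, n_descriptors))            # stand-in structural descriptors
drift_time = X[:, :5] @ rng.normal(size=5) + rng.normal(0, 0.5, n_peptides)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))

# Monte Carlo cross-validation: many random 80/20 splits rather than a single K-fold pass.
mccv = ShuffleSplit(n_splits=100, test_size=0.2, random_state=0)
scores = cross_val_score(model, X, drift_time, cv=mccv, scoring="r2")
print(f"MCCV R^2: mean {scores.mean():.3f}, std {scores.std():.3f}")
```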
Maru, Shoko; Byrnes, Joshua; Carrington, Melinda J; Stewart, Simon; Scuffham, Paul A
2015-01-01
Substantial variation in economic analyses of cardiovascular disease management programs hinders not only the proper assessment of cost-effectiveness but also the identification of heterogeneity of interest such as patient characteristics. The authors discuss the impact of reporting and methodological variation on the cost-effectiveness of cardiovascular disease management programs by introducing issues that could lead to different policy or clinical decisions, followed by the challenges associated with net intervention effects and generalizability. The authors conclude with practical suggestions to mitigate the identified issues. Improved transparency through standardized reporting practice is the first step to advance beyond one-off experiments (limited applicability outside the study itself). Transparent reporting is a prerequisite for rigorous cost-effectiveness analyses that provide unambiguous implications for practice: what type of program works for whom and how.
Goeke-Morey, Marcie C; Taylor, Laura K; Merrilees, Christine E; Shirlow, Peter; Cummings, E Mark
2014-12-01
A growing literature supports the importance of understanding the link between religiosity and youths' adjustment and development, but in the absence of rigorous, longitudinal designs, questions remain about the direction of effect and the role of family factors. This paper investigates the bidirectional association between adolescents' relationship with God and their internalizing adjustment. Results from 2-wave, SEM cross-lag analyses of data from 667 mother/adolescent dyads in Belfast, Northern Ireland (50% male, M age = 15.75 years old) support a risk model suggesting that greater internalizing problems predict a weaker relationship with God 1 year later. Significant moderation analyses suggest that a stronger relationship with God predicted fewer depression and anxiety symptoms for youth whose mothers used more religious coping.
Goeke-Morey, Marcie C.; Taylor, Laura K.; Merrilees, Christine E.; Shirlow, Peter; Cummings, E. Mark
2015-01-01
A growing literature supports the importance of understanding the link between religiosity and youths’ adjustment and development, but in the absence of rigorous, longitudinal designs, questions remain about the direction of effect and the role of family factors. This paper investigates the bi-directional association between adolescents’ relationship with God and their internalizing adjustment. Results from two-wave, SEM cross-lag analyses of data from 667 mother/adolescent dyads in Belfast, Northern Ireland (50% male, M age = 15.75 years old) support a risk model suggesting that greater internalizing problems predict a weaker relationship with God one year later. Significant moderation analyses suggest that a stronger relationship with God predicted fewer depression and anxiety symptoms for youth whose mothers used more religious coping. PMID:24955590
Alborz, Alison; McNally, Rosalind
2004-12-01
To develop methods to facilitate the 'systematic' review of evidence from a range of methodologies on diffuse or 'soft' topics, as exemplified by 'access to health care'. Twenty-eight bibliographic databases, research registers, organizational websites or library catalogues. Reference lists from identified studies. Contact with experts and service users. Current awareness and contents alerting services in the area of learning disabilities. Inclusion criteria were English language literature from 1980 onwards, relating to people with learning disabilities of any age and all study designs. The main criterion for assessment was relevance to Guillifords' model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesized. Quality assessment was by an initial set of 'generic' quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Eighty-two studies were fully evaluated. Five studies were rated 'highly rigorous', 22 'rigorous' and 46 'less rigorous'; nine 'poor' papers were retained as the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or 'soft' topics. Synthesis can be facilitated further by using software, such as the Microsoft Access database, for managing information.
Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien
2018-03-30
Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigour in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from academic institutions, funding agencies, and animal ethics authorities. © 2018 American Heart Association, Inc.
Single toxin dose-response models revisited
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demidenko, Eugene, E-mail: eugened@dartmouth.edu
The goal of this paper is to offer a rigorous analysis of the sigmoid-shaped single toxin dose-response relationship. The toxin efficacy function is introduced and four special points, including maximum toxin efficacy and inflection points, on the dose-response curve are defined. The special points define three phases of the toxin effect on mortality: (1) toxin concentrations smaller than the first inflection point or (2) larger than the second inflection point imply a low mortality rate, and (3) concentrations between the first and the second inflection points imply a high mortality rate. Probabilistic interpretation and mathematical analysis for each of the four models, Hill, logit, probit, and Weibull, is provided. Two general model extensions are introduced: (1) the multi-target hit model that accounts for the existence of several vital receptors affected by the toxin, and (2) a model with nonzero mortality at zero concentration to account for natural mortality. Special attention is given to statistical estimation in the framework of the generalized linear model with the binomial dependent variable as the mortality count in each experiment, contrary to the widespread nonlinear regression treating the mortality rate as a continuous variable. The models are illustrated using standard EPA Daphnia acute (48 h) toxicity tests with mortality as a function of NiCl or CuSO4 toxin. - Highlights: • The paper offers a rigorous study of a sigmoid dose-response relationship. • The concentration with the highest mortality rate is rigorously defined. • A table with four special points for five mortality curves is presented. • Two new sigmoid dose-response models have been introduced. • The generalized linear model is advocated for estimation of the sigmoid dose-response relationship.
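A minimal sketch of the estimation approach advocated above: a generalized linear model with a binomial dependent variable (mortality counts out of the number exposed) and a probit link, fitted to made-up concentration-mortality data. The LC50 back-calculation assumes a log-concentration covariate.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical acute toxicity data: concentration, number exposed, number dead.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
n_exposed = np.full(6, 20)
n_dead = np.array([1, 3, 7, 13, 18, 20])

X = sm.add_constant(np.log(conc))
y = np.column_stack([n_dead, n_exposed - n_dead])      # binomial counts, not rates

probit_glm = sm.GLM(y, X, family=sm.families.Binomial(link=sm.families.links.Probit()))
result = probit_glm.fit()
b0, b1 = result.params
print(result.summary().tables[1])
print("estimated LC50:", np.exp(-b0 / b1))             # probit(0.5) = 0 => log-conc = -b0/b1
```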