Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
Research of Extension of the Life Cycle of Helicopter Rotor Blade in Hungary
2003-02-01
Radiography (DXR), and (iii) Vibration Diagnostics (VD) with Statistical Energy Analysis (SEA) were semi-simultaneously applied [1]. The three methods used... 2.2. Vibration Diagnostics (VD). In parallel to the NDT measurements, Statistical Energy Analysis (SEA) was applied as a vibration diagnostic tool... noises were analysed with a dual-channel real-time frequency analyser (BK2035). In addition to the Statistical Energy Analysis measurement a small
permGPU: Using graphics processing units in RNA microarray association studies.
Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros
2010-06-16
Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, which are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA-based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
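The permutation resampling that permGPU accelerates can be illustrated with a minimal serial sketch. This is not the permGPU implementation (which runs six test statistics on the GPU); it is a plain-Python two-sample permutation test on the difference of means, with illustrative data.

```python
import random

def perm_test_mean_diff(x, y, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference of means.

    Returns the two-sided permutation p-value: the fraction of label
    shuffles whose |mean - mean| is at least the observed difference.
    """
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    n_x = len(x)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # relabel the pooled observations at random
        diff = abs(sum(pooled[:n_x]) / n_x
                   - sum(pooled[n_x:]) / (len(pooled) - n_x))
        if diff >= observed:
            hits += 1
    # add-one correction keeps the estimated p-value away from zero
    return (hits + 1) / (n_perm + 1)

# clearly separated groups -> small p-value
p = perm_test_mean_diff([5.1, 5.3, 5.0, 5.2], [1.1, 0.9, 1.0, 1.2])
```

Each shuffle is independent of the others, which is what makes the problem "embarrassingly parallel": the permGPU approach assigns batches of shuffles to GPU threads.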
Learning investment indicators through data extension
NASA Astrophysics Data System (ADS)
Dvořák, Marek
2017-07-01
Stock prices in the form of time series were analysed using univariate and multivariate statistical methods. After simple data preprocessing in the form of logarithmic differences, we augmented this univariate time series to a multivariate representation. This method uses sliding windows to calculate several dozen new variables, using simple statistical tools such as first and second moments as well as more complicated statistics, such as auto-regression coefficients and residual analysis, followed by an optional quadratic transformation that was further used for data extension. These were used as explanatory variables in a regularized logistic LASSO regression that tried to estimate the Buy-Sell Index (BSI) from real stock market data.
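The preprocessing and windowing steps described above can be sketched as follows. This is a hedged illustration, not the paper's pipeline: it computes log-differences and, per sliding window, only three of the "several dozen" variables (mean, variance, lag-1 autocorrelation); the prices are invented.

```python
import math

def log_diffs(prices):
    """Log-returns: the simple preprocessing step described above."""
    return [math.log(b / a) for a, b in zip(prices, prices[1:])]

def window_features(series, w):
    """For each sliding window of length w, compute mean, variance and
    lag-1 autocorrelation -- a small subset of the derived variables."""
    feats = []
    for i in range(len(series) - w + 1):
        win = series[i:i + w]
        m = sum(win) / w
        var = sum((v - m) ** 2 for v in win) / w
        if var > 0:
            ac1 = sum((win[j] - m) * (win[j + 1] - m)
                      for j in range(w - 1)) / (w * var)
        else:
            ac1 = 0.0
        feats.append((m, var, ac1))
    return feats

prices = [100, 101, 103, 102, 105, 107, 106, 108]
rets = log_diffs(prices)          # 7 returns from 8 prices
feats = window_features(rets, 4)  # 4 windows of length 4
```

In the paper the resulting feature matrix feeds a regularized logistic LASSO regression; any standard L1-penalized solver could play that role.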
Dissecting the genetics of complex traits using summary association statistics.
Pasaniuc, Bogdan; Price, Alkes L
2017-02-01
During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.
Impact of animal health programmes on poverty reduction and sustainable livestock development.
Pradere, J P
2017-04-01
Based on data from publications and field observations, this study analyses the interactions between animal health, rural poverty and the performance and environmental impact of livestock farming in low-income countries and middle-income countries. There are strong statistical correlations between the quality of Veterinary Services, livestock productivity and poverty rates. In countries with effective Veterinary Services, livestock growth stems mainly from productivity gains and poverty rates are the lowest. Conversely, these analyses identify no statistical link between the quality of Veterinary Services and increased livestock production volumes. However, where animal diseases are poorly controlled, productivity is low and livestock growth is extensive, based mainly on a steady increase in animal numbers. Extensive growth is less effective than intensive growth in reducing poverty and aggravates the pressure of livestock production on natural resources and the climate.
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
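The kind of computation G*Power performs for test (1) can be sketched with the standard Fisher z approximation. This is an illustrative stand-in, not G*Power's exact algorithm: under the z transformation, atanh of the sample correlation is approximately normal with standard deviation 1/sqrt(n - 3).

```python
from statistics import NormalDist
from math import atanh, sqrt

def power_pearson_r(r, n, alpha=0.05):
    """Approximate power of the two-sided test of H0: rho = 0 against a
    true correlation r, via the Fisher z transformation."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    delta = atanh(r) * sqrt(n - 3)  # noncentrality on the z scale
    return (1 - z.cdf(z_crit - delta)) + z.cdf(-z_crit - delta)

# textbook case: detecting r = 0.3 with n = 84 gives roughly 80% power
pw = power_pearson_r(0.3, 84)
```

Increasing n (or r) raises the power, which is the trade-off such programs let users explore.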
ERIC Educational Resources Information Center
Douglas, Jeff; Kim, Hae-Rim; Roussos, Louis; Stout, William; Zhang, Jinming
An extensive nonparametric dimensionality analysis of latent structure was conducted on three forms of the Law School Admission Test (LSAT) (December 1991, June 1992, and October 1992) using the DIMTEST model in confirmatory analyses and using DIMTEST, FAC, DETECT, HCA, PROX, and a genetic algorithm in exploratory analyses. Results indicate that…
A new statistical method for design and analyses of component tolerance
NASA Astrophysics Data System (ADS)
Movahedi, Mohammad Mehdi; Khounsiavash, Mohsen; Otadi, Mahmood; Mosleh, Maryam
2017-03-01
Tolerancing conducted by design engineers to meet customers' needs is a prerequisite for producing high-quality products. Engineers use handbooks to conduct tolerancing. While the use of statistical methods for tolerancing is not new, engineers often assume known distributions, including the normal distribution. Yet, if the statistical distribution of the given variable is unknown, a new statistical method is needed to design tolerances. In this paper, we use the generalized lambda distribution for the design and analysis of component tolerance. We use the percentile method (PM) to estimate the distribution parameters. The findings indicated that, when the distribution of the component data is unknown, the proposed method can be used to expedite the design of component tolerance. Moreover, in the case of assembled sets, a more extensive tolerance for each component with the same target performance can be utilized.
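The spirit of a percentile-based approach can be shown with a much simpler, distribution-free sketch. This is not the paper's generalized lambda fit: it merely reads tolerance limits off empirical percentiles of measured component dimensions, with simulated data.

```python
import random

def percentile(sorted_xs, q):
    """Linear-interpolation percentile (q in [0, 1]) of a sorted sample."""
    idx = q * (len(sorted_xs) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_xs) - 1)
    frac = idx - lo
    return sorted_xs[lo] * (1 - frac) + sorted_xs[hi] * frac

def tolerance_limits(data, coverage=0.95):
    """Empirical limits covering the central `coverage` fraction of the
    measured dimension -- a distribution-free stand-in for limits
    derived from a fitted distribution."""
    xs = sorted(data)
    tail = (1 - coverage) / 2
    return percentile(xs, tail), percentile(xs, 1 - tail)

# simulated component dimension: nominal 10.0, sd 0.05
rng = random.Random(1)
sample = [10.0 + rng.gauss(0, 0.05) for _ in range(500)]
lo, hi = tolerance_limits(sample, coverage=0.95)
```

A fitted flexible distribution such as the generalized lambda gives smoother limits from smaller samples, which is the paper's motivation.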
Helmholtz and Gibbs ensembles, thermodynamic limit and bistability in polymer lattice models
NASA Astrophysics Data System (ADS)
Giordano, Stefano
2017-12-01
Representing polymers by random walks on a lattice is a fruitful approach, largely exploited to study the configurational statistics of polymer chains and to develop efficient Monte Carlo algorithms. Nevertheless, the stretching and the folding/unfolding of polymer chains within the Gibbs (isotensional) and the Helmholtz (isometric) ensembles of statistical mechanics have not yet been thoroughly analysed by means of the lattice methodology. This topic, motivated by the recent introduction of several single-molecule force spectroscopy techniques, is investigated in the present paper. In particular, we analyse the force-extension curves under the Gibbs and Helmholtz conditions and we give a proof of the equivalence of the ensembles in the thermodynamic limit for polymers represented by a standard random walk on a lattice. Then, we generalize these concepts for lattice polymers that can undergo conformational transitions or, equivalently, for chains composed of bistable or two-state elements (which can be either folded or unfolded). In this case, the isotensional condition leads to a plateau-like force-extension response, whereas the isometric condition causes a sawtooth-like force-extension curve, as predicted by numerous experiments. The equivalence of the ensembles is finally proved also for lattice polymer systems exhibiting conformational transitions.
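The plateau-like Gibbs (isotensional) response of a two-state chain can be reproduced with a minimal sketch, assuming independent elements (no lattice walk, unlike the paper). Each element is folded (length l_f) or unfolded (length l_u, free-energy penalty dg); all parameter values below are illustrative.

```python
from math import exp

def mean_extension(f, n, l_f, l_u, dg, kT=1.0):
    """Average extension of n independent two-state elements held at
    force f (Gibbs/isotensional ensemble). The single-element partition
    function is z(f) = exp(f*l_f/kT) + exp((f*l_u - dg)/kT)."""
    w_f = exp(f * l_f / kT)
    w_u = exp((f * l_u - dg) / kT)
    return n * (l_f * w_f + l_u * w_u) / (w_f + w_u)

# illustrative parameters: folded length 1, unfolded length 3, barrier 4 kT
N, LF, LU, DG = 10, 1.0, 3.0, 4.0
low = mean_extension(0.1, N, LF, LU, DG)    # mostly folded
high = mean_extension(10.0, N, LF, LU, DG)  # mostly unfolded
# the transition force f* = dg / (l_u - l_f) = 2 marks the plateau midpoint
mid = mean_extension(2.0, N, LF, LU, DG)
```

At the transition force both states are equally weighted, so the mean extension sits exactly halfway between N*l_f and N*l_u; the sharp crossover around f* is the plateau discussed in the abstract.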
DOT National Transportation Integrated Search
2016-08-01
This study conducted an analysis of the SCDOT HMA specification. A Research Steering Committee provided oversight of the process. The research process included extensive statistical analyses of test data supplied by SCDOT. A total of 2,789 AC tes...
Conceptual and statistical problems associated with the use of diversity indices in ecology.
Barrantes, Gilbert; Sandoval, Luis
2009-09-01
Diversity indices, particularly the Shannon-Wiener index, have been used extensively in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all the information needed to answer even a simple question. However, multivariate analyses, such as cluster analyses or multiple regressions, could be used instead of diversity indices. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on the environmental variables associated with the presence and abundance of species in a community. In addition, particular hypotheses associated with changes in species richness across localities, or changes in the abundance of one species or a group of species, can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved to be robust for standardizing all samples to a common size. Even the simplest method, such as reporting the number of species per taxonomic category, possibly provides more information than a diversity index value.
An Experimental Ecological Study of a Garden Compost Heap.
ERIC Educational Resources Information Center
Curds, Tracy
1985-01-01
A quantitative study of the fauna of a garden compost heap shows it to be similar to that of organisms found in soil and leaf litter. Materials, methods, and results are discussed and extensive tables of fauna lists, wet/dry masses, and statistical analyses are presented. (Author/DH)
Open Doors 1991/92. Report on International Educational Exchange.
ERIC Educational Resources Information Center
Zikopoulos, Marianthi, Ed.; And Others
1992-01-01
This report provides statistical data on 419,600 foreign students from over 200 countries studying at U.S. higher educational institutions. The report identifies trends in student mobility and migration, national origin, sources of financial support, fields of study, enrollments, and rates of growth. The book's extensive tables and analyses are…
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-01-01
Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
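The Bland-Altman and regression comparisons described above can be sketched in a few lines. This is an illustrative stand-in, not the authors' code: it computes the bias and 95% limits of agreement, plus an ordinary least-squares fit (Deming regression, used in the paper for assay comparisons, additionally models error in both variables). The assay values are synthetic, with a constant error of +0.5.

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired assays:
    bias = mean difference; limits = bias +/- 1.96 * sd(differences)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

def ols(x, y):
    """Simple linear regression slope and intercept (least squares)."""
    mx, my = mean(x), mean(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

ref = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]   # validated method
new = [v + 0.5 for v in ref]            # new assay: constant error only
bias, lo, hi = bland_altman(new, ref)
slope, intercept = ols(ref, new)
```

A nonzero intercept with slope 1 flags a constant error; a slope away from 1 would flag a proportional error. R2 alone would report perfect agreement here, which is the paper's point.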
Investigating Participation in Advanced Level Mathematics: A Study of Student Drop-Out
ERIC Educational Resources Information Center
Noyes, Andrew; Sealey, Paula
2012-01-01
There has, for some years, been a growing concern about participation in university-entrance level mathematics in England and across the developed world. Extensive statistical analyses present the decline but offer little to help us understand the causes. In this paper we explore a concern which cannot be explored through national data-sets,…
Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano
2011-01-01
The most common study design in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA-damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies, since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it allows the detection of possible errors in the dataset to be analysed and checking of the validity of assumptions required for more complex analyses. Basic issues dealing with the statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
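A minimal Poisson-flavoured comparison of MN frequencies between two groups can be sketched with a Wald test on the log rate ratio. This is an illustrative simplification of the full Poisson regression the review recommends (which would also adjust for age, gender and smoking); the counts below are invented.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def poisson_rate_ratio(x1, n1, x2, n2):
    """Wald comparison of two Poisson rates, e.g. micronucleus counts
    x over n thousand cells scored in exposed vs control subjects.
    Returns (rate ratio, 95% CI low, 95% CI high, two-sided p-value)."""
    rr = (x1 / n1) / (x2 / n2)
    se = sqrt(1 / x1 + 1 / x2)          # se of log(rate ratio)
    z = log(rr) / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return rr, rr * exp(-1.96 * se), rr * exp(1.96 * se), p

# exposed: 48 MN in 20 thousand cells; control: 20 MN in 20 thousand
rr, lo, hi, p = poisson_rate_ratio(48, 20, 20, 20)
```

A full model (Poisson or negative binomial regression) would express log(rate) as a linear function of exposure and the confounders listed in the review.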
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
Lee, Dae-Hee; Shin, Young-Soo; Jeon, Jin-Ho; Suh, Dong-Won; Han, Seung-Beom
2014-08-01
The aim of this study was to investigate the mechanism underlying the development of gap differences in total knee arthroplasty using the navigation-assisted gap technique and to assess whether these gap differences have statistical significance. Ninety-two patients (105 knees) implanted with cruciate-retaining prostheses using the navigation-assisted gap balancing technique were prospectively analysed. Medial extension and flexion gaps and lateral extension and flexion gaps were measured at full extension and at 90° of flexion. Repeated measures analysis of variance was used to compare the mean values of these four gaps. The correlation coefficient between each pair of gaps was assessed using Pearson's correlation analysis. Mean intra-operative medial and lateral extension gaps were 20.6 ± 2.1 and 21.7 ± 2.2 mm, respectively, and mean intra-operative medial and lateral flexion gaps were 21.6 ± 2.7 and 22.1 ± 2.5 mm, respectively. The pairs of gaps differed significantly (P < 0.05 each), except for the difference between the medial flexion and lateral extension gaps (n.s.). All four gaps were significantly correlated with each other, with the highest correlation between the medial and lateral flexion gaps (r = 0.890, P < 0.001) and the lowest between the medial flexion and lateral extension gaps (r = 0.701, P < 0.001). Medial and lateral flexion and extension gaps created using the navigation-assisted gap technique differed significantly, although the differences between them were <2 mm, and the gaps were closely correlated. These narrow ranges of statistically acceptable gap differences and the strong correlations between gaps should be considered by surgeons, as should the risks of soft tissue over-release or unintentional increases in extension or flexion gap after preparation of the other gap.
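The correlation analysis used above can be sketched with a plain Pearson coefficient; the gap values below are illustrative, not the study's data.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired gap series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# illustrative medial/lateral flexion gaps in mm (invented values)
medial = [20.1, 21.5, 22.0, 23.2, 21.0, 19.8]
lateral = [20.9, 22.0, 22.8, 23.9, 21.6, 20.5]
r = pearson_r(medial, lateral)
```

In the study the strongest such correlation (r = 0.890) was between the medial and lateral flexion gaps.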
Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V
2017-07-21
DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.
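One of the simplest per-locus DML tests applied to bisulfite data is Fisher's exact test on methylated/unmethylated read counts. A self-contained sketch (the dedicated tools surveyed above use more sophisticated models that borrow information across sites and replicates):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]],
    e.g. methylated/unmethylated read counts at one CpG in two
    conditions. Sums the probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def hyper(k):  # P(first cell = k) under fixed margins
        return comb(row1, k) * comb(row2, col1 - k) / comb(n, col1)

    p_obs = hyper(a)
    lo = max(0, col1 - row2)
    hi = min(col1, row1)
    return sum(hyper(k) for k in range(lo, hi + 1)
               if hyper(k) <= p_obs * (1 + 1e-12))

# strongly differential CpG: 1/10 reads methylated vs 11/14 methylated
p = fisher_exact_two_sided(1, 9, 11, 3)
```

Because invertebrate and plant methylomes are sparser and more bimodal than mammalian ones, such count-based per-site tests can behave differently across taxa, which is the article's central caution.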
Campos-Filho, N; Franco, E L
1989-02-01
A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
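The unconditional branch of the analysis can be sketched with a minimal Newton-Raphson logistic fit. This is not the Pascal program described above, just a one-covariate maximum-likelihood sketch on invented case-control data; the conditional (matched) likelihood requires a different, stratified computation.

```python
from math import exp

def logistic_fit(xs, ys, iters=25):
    """Unconditional ML logistic regression with one covariate:
    fits P(y=1|x) = 1/(1+exp(-(b0 + b1*x))) by Newton-Raphson."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        g0 = g1 = 0.0            # score (gradient of log-likelihood)
        h00 = h01 = h11 = 0.0    # observed information matrix
        for x, y in zip(xs, ys):
            p = 1 / (1 + exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
            w = p * (1 - p)
            h00 += w
            h01 += w * x
            h11 += w * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det   # solve H * delta = g
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# synthetic exposure data: disease risk rises with exposure level x
xs = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]
ys = [0, 0, 0, 1, 0, 1, 1, 1, 1, 1]
b0, b1 = logistic_fit(xs, ys)
odds_ratio = exp(b1)   # relative risk estimate per unit of exposure
```

Running both the unconditional and the conditional fit on the same data, as the program does, lets the analyst check whether conditioning on the matching variables materially changes the odds ratios.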
Nitrifying biomass characterization and monitoring during bioaugmentation in a membrane bioreactor.
D'Anteo, Sibilla; Mannucci, Alberto; Meliani, Matteo; Verni, Franco; Petroni, Giulio; Munz, Giulio; Lubello, Claudio; Mori, Gualtiero; Vannini, Claudia
2015-01-01
A membrane bioreactor (MBR), fed with domestic wastewater, was bioaugmented with nitrifying biomass selected in a side-stream MBR fed with a synthetic high nitrogen-loaded influent. The evolution of the microbial communities was monitored and comparatively analysed through an extensive bio-molecular investigation (16S rRNA gene library construction and terminal-restriction fragment length polymorphism techniques), followed by statistical analyses. As expected, a highly specialized nitrifying biomass was selected in the side-stream reactor fed with high-strength ammonia synthetic wastewater. The bioaugmentation process caused an increase of nitrifying bacteria of the genera Nitrosomonas (up to more than 30%) and Nitrobacter in the inoculated MBR reactor. The overall structure of the microbial community changed in the mainstream MBR as a result of bioaugmentation. The effect of bioaugmentation on the shift of the microbial community was also verified through statistical analysis.
NASA Astrophysics Data System (ADS)
Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio
2015-04-01
The work presented here concerns a case study in which a complete multidisciplinary workflow has been applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario: vertical and fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3d laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by geologist rock climbers; c) processing of 3d data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geo-statistical and spatial analyses and mapping of the whole set of data; g) 3D rockfall analysis. The main goals of the study have been a) to set up an investigation method to achieve a complete and thorough characterization of the slope stability conditions and b) to provide a detailed base for an accurate definition of the reinforcement and mitigation systems. For this purpose, the most up-to-date methods of field surveying, remote sensing, 3d modelling and geospatial data analysis have been integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach has been applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extension of the studied area (nearly 200,000 m2) without compromising the high accuracy of the results. The deterministic phase, based on field characterization of single instabilities and their further analysis on 3d models, has been applied to delineate the peculiarity of each single feature.
The statistical approach, based on geostructural field mapping and on punctual geomechanical data from scan-line surveying, allowed partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3d models. All data resulting from both approaches have been referenced and filed in a single spatial database and considered in global geo-statistical analyses to derive a fully modelled and comprehensive evaluation of the rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological and mechanical features, along with the expected failure modes; b) a high-resolution characterization of the whole slope's rockslide susceptibility, based on partitioning of the area according to the stability and mechanical conditions, which can be directly related to specific hazard mitigation systems; c) the exact extension of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.
Sharma, Swarkar; Saha, Anjana; Rai, Ekta; Bhat, Audesh; Bamezai, Ramesh
2005-01-01
We have analysed the hypervariable regions (HVR I and II) of human mitochondrial DNA (mtDNA) in individuals from Uttar Pradesh (UP), Bihar (BI) and Punjab (PUNJ), belonging to the Indo-European linguistic group, and from South India (SI), which has its linguistic roots in the Dravidian language family. Our analysis revealed the presence of known and novel mutations in both hypervariable regions in the studied population groups. Median-joining network analyses based on mtDNA showed extensive overlap in mtDNA lineages despite the extensive cultural and linguistic diversity. MDS plot analysis based on Fst distances suggested increased maternal genetic proximity among the studied population groups compared with other world populations. Mismatch distribution curves, the respective neighbour-joining trees and other statistical analyses showed that there were significant expansions. The study revealed an ancient common ancestry for the studied population groups, most probably through common founder female lineage(s), and also indicated that human migrations occurred (maybe across and within the Indian subcontinent) even after the initial phase of female migration to India.
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
IBA Techniques to Study Renaissance Pottery Techniques
NASA Astrophysics Data System (ADS)
Bouquillon, A.; Castaing, J.; Salomon, J.; Zucchiatti, A.; Lucarelli, F.; Mando, P. A.; Prati, P.; Lanterna, G.; Vaccari, M. G.
2001-09-01
The application of Ion Beam Analysis, associated with Scanning Electron Microscopy, is examined in connection with an extensive program of structural and chemical analyses of glazed terracottas from the Italian Renaissance, launched by a French-Italian collaboration within the framework of the European COST-G1 scientific action. The objectives of the collaboration are reviewed. The compatibility of data from different specimens and various laboratories is discussed. Examples of the PIXE and statistical analyses of some artefacts of the "Robbiesche" type, supplied by the Louvre Museum in Paris and the Opificio delle Pietre Dure in Florence, are given to illustrate the performance of IBA in this particular field.
Using GIS to analyze animal movements in the marine environment
Hooge, Philip N.; Eichenlaub, William M.; Solomon, Elizabeth K.; Kruse, Gordon H.; Bez, Nicolas; Booth, Anthony; Dorn, Martin W.; Hills, Susan; Lipcius, Romuald N.; Pelletier, Dominique; Roy, Claude; Smith, Stephen J.; Witherell, David B.
2001-01-01
Advanced methods for analyzing animal movements have been little used in the aquatic research environment compared to the terrestrial. In addition, despite obvious advantages of integrating geographic information systems (GIS) with spatial studies of animal movement behavior, movement analysis tools have not been integrated into GIS for either aquatic or terrestrial environments. We therefore developed software that integrates one of the most commonly used GIS programs (ArcView®) with a large collection of animal movement analysis tools. This application, the Animal Movement Analyst Extension (AMAE), can be loaded as an extension to ArcView® under multiple operating system platforms (PC, Unix, and Mac OS). It contains more than 50 functions, including parametric and nonparametric home range analyses, random walk models, habitat analyses, point and circular statistics, tests of complete spatial randomness, tests for autocorrelation and sample size, point and line manipulation tools, and animation tools. This paper describes the use of these functions in analyzing animal location data; some limited examples are drawn from a sonic-tracking study of Pacific halibut (Hippoglossus stenolepis) in Glacier Bay, Alaska. The extension is available on the Internet at www.absc.usgs.gov/glba/gistools/index.htm.
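One of the classic estimators the Animal Movement Analyst Extension provides is the minimum convex polygon (MCP) home range. A minimal sketch of the idea in Python, using made-up relocation fixes (not the Glacier Bay halibut data):

```python
# Minimum convex polygon (MCP) home range: the smallest convex polygon
# containing all animal relocations; its area is the home-range estimate.
# Coordinates below are hypothetical, for illustration only.

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(vertices):
    """Shoelace formula for the area of a simple polygon."""
    s = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1*y2 - x2*y1
    return abs(s) / 2.0

fixes = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1), (1, 2)]  # hypothetical relocations
mcp = convex_hull(fixes)
print(polygon_area(mcp))  # home-range area in (map units)^2
```

The extension also offers nonparametric estimators (e.g. kernel home ranges) that avoid the MCP's sensitivity to outlying fixes; this sketch covers only the simplest case.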
Towards interoperable and reproducible QSAR analyses: Exchange of datasets.
Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es
2010-06-30
QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply.
This makes it easy to join, extend, and combine datasets and hence work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community.
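The key idea, pinning each descriptor value to a versioned software implementation, can be sketched with Python's standard XML tooling. The element and attribute names below are hypothetical illustrations of the concept, not the actual QSAR-ML schema:

```python
# Sketch of a versioned, reproducible QSAR dataset in XML.
# NOTE: element/attribute names here are invented for illustration;
# consult the QSAR-ML specification for the real schema.
import xml.etree.ElementTree as ET

root = ET.Element("qsarDataset")
ET.SubElement(root, "descriptor",
              id="XLogP",             # ontology identifier (assumed)
              implementation="CDK",   # software providing the descriptor
              version="1.2.3")        # pinned version makes the setup reproducible
s = ET.SubElement(root, "structure", inchi="InChI=1S/CH4/h1H4")
ET.SubElement(s, "value", descriptorRef="XLogP").text = "1.09"

xml_text = ET.tostring(root, encoding="unicode")

# Round-trip: a collaborator can recover exactly which descriptor
# implementation and version produced each value.
parsed = ET.fromstring(xml_text)
print(parsed.find("descriptor").get("version"))
```

The point of the format is precisely this round-trip: the dataset file itself records which component, at which version, computed each number.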
Towards interoperable and reproducible QSAR analyses: Exchange of datasets
2010-01-01
Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply.
This makes it easy to join, extend, and combine datasets and hence work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community. PMID:20591161
Smith, Paul F.
2017-01-01
Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins. PMID:29371855
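Two of the issues the review lists, unequal variances between groups and the over-emphasis on significance rather than effect size, can be addressed directly. A pure-Python sketch on made-up data (not from the review): Welch's t statistic, which drops the equal-variance assumption, reported alongside Cohen's d:

```python
# Welch's t statistic and Cohen's d on illustrative (synthetic) group data.
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t: does not assume equal variances between groups."""
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / sqrt(va/len(a) + vb/len(b))

def cohens_d(a, b):
    """Standardized effect size (pooled SD); report this, not only the p-value."""
    na, nb = len(a), len(b)
    sp = sqrt(((na-1)*variance(a) + (nb-1)*variance(b)) / (na + nb - 2))
    return (mean(a) - mean(b)) / sp

control = [4.1, 3.9, 4.3, 4.0, 4.2]   # tight variance (hypothetical)
lesion  = [5.1, 6.8, 4.9, 7.2, 5.5]   # much larger variance (hypothetical)
print(round(welch_t(control, lesion), 2))
print(round(cohens_d(control, lesion), 2))
```

Here the lesion group's variance is roughly forty times the control's, exactly the situation where a classical equal-variance t-test would be the wrong tool.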
Smith, Paul F
2017-01-01
Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... Statistics Service Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection AGENCY: National Agricultural Statistics Service, USDA. ACTION: Notice and request for comments... National Agricultural Statistics Service (NASS) to request revision and extension of a currently approved...
MetalPDB in 2018: a database of metal sites in biological macromolecular structures.
Putignano, Valeria; Rosato, Antonio; Banci, Lucia; Andreini, Claudia
2018-01-04
MetalPDB (http://metalweb.cerm.unifi.it/) is a database providing information on metal-binding sites detected in the three-dimensional (3D) structures of biological macromolecules. MetalPDB represents such sites as 3D templates, called Minimal Functional Sites (MFSs), which describe the local environment around the metal(s) independently of the larger context of the macromolecular structure. The 2018 update of MetalPDB includes new contents and tools. A major extension is the inclusion of proteins whose structures do not contain metal ions although their sequences potentially contain a known MFS. In addition, MetalPDB now provides extensive statistical analyses addressing several aspects of general metal usage within the PDB, across protein families and in catalysis. Users can also query MetalPDB to extract statistical information on structural aspects associated with individual metals, such as preferred coordination geometries or the amino acid environment. A further major improvement is the functional annotation of MFSs; the annotation is manually performed via a password-protected annotator interface. At present, ∼50% of all MFSs have such a functional annotation. Other noteworthy improvements are bulk query functionality, through the upload of a list of PDB identifiers, and ftp access to MetalPDB contents, allowing users to carry out in-depth analyses on their own computational infrastructure. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Statistical analysis of fNIRS data: a comprehensive review.
Tak, Sungho; Ye, Jong Chul
2014-01-15
Functional near-infrared spectroscopy (fNIRS) is a non-invasive method to measure brain activities using the changes of optical absorption in the brain through the intact skull. fNIRS has many advantages over other neuroimaging modalities such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), or magnetoencephalography (MEG), since it can directly measure blood oxygenation level changes related to neural activation with high temporal resolution. However, fNIRS signals are highly corrupted by measurement noises and physiology-based systemic interference. Careful statistical analyses are therefore required to extract neuronal activity-related signals from fNIRS data. In this paper, we provide an extensive review of historical developments of statistical analyses of fNIRS signal, which include motion artifact correction, short source-detector separation correction, principal component analysis (PCA)/independent component analysis (ICA), false discovery rate (FDR), serially-correlated errors, as well as inference techniques such as the standard t-test, F-test, analysis of variance (ANOVA), and statistical parameter mapping (SPM) framework. In addition, to provide a unified view of various existing inference techniques, we explain a linear mixed effect model with restricted maximum likelihood (ReML) variance estimation, and show that most of the existing inference methods for fNIRS analysis can be derived as special cases. Some of the open issues in statistical analysis are also described. Copyright © 2013 Elsevier Inc. All rights reserved.
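The GLM inference at the core of the review reduces, in the simplest case, to regressing the measured signal on a predicted hemodynamic response and testing the slope. A minimal sketch with synthetic numbers (real fNIRS analyses add drift and noise regressors and correct for serially-correlated errors, e.g. by prewhitening):

```python
# One-regressor GLM: fit y = b0 + b1*x by ordinary least squares and
# form the t statistic for b1 (the activation estimate).
# The "signal" and "model" values below are synthetic illustrations.
from math import sqrt

def ols_t(x, y):
    n = len(x)
    mx, my = sum(x)/n, sum(y)/n
    sxx = sum((xi - mx)**2 for xi in x)
    b1 = sum((xi - mx)*(yi - my) for xi, yi in zip(x, y)) / sxx
    b0 = my - b1*mx
    resid = [yi - (b0 + b1*xi) for xi, yi in zip(x, y)]
    s2 = sum(r*r for r in resid) / (n - 2)    # residual variance
    return b1, b1 / sqrt(s2 / sxx)            # slope estimate and its t value

hrf_model = [0.0, 0.2, 0.8, 1.0, 0.7, 0.3, 0.1, 0.0]   # predicted response
signal    = [0.1, 0.3, 0.9, 1.2, 0.8, 0.4, 0.2, 0.1]   # measured channel
b1, t = ols_t(hrf_model, signal)
print(round(b1, 3), round(t, 2))
```

Because the synthetic signal closely tracks the model, the t value is large; with the physiological interference described above, the residual variance (and hence the inference) would look very different.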
Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu
2016-01-01
Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analysis. Our study aims to investigate epidemiological characteristics, conduct of the literature search, methodological quality and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework. Of these, more than half did not report assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report assessment of similarity (86.89%) and did not use the GRADE tool to assess quality of evidence (95.08%). 43 NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of NMAs. Only 4.65% of NMAs described the details of handling multi-group trials, and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997
Dynamic properties of small-scale solar wind plasma fluctuations.
Riazantseva, M O; Budaev, V P; Zelenyi, L M; Zastenker, G N; Pavlos, G P; Safrankova, J; Nemecek, Z; Prech, L; Nemec, F
2015-05-13
The paper presents the latest results of the studies of small-scale fluctuations in a turbulent flow of solar wind (SW) using measurements with extremely high temporal resolution (up to 0.03 s) of the bright monitor of SW (BMSW) plasma spectrometer operating on astrophysical SPECTR-R spacecraft at distances up to 350,000 km from the Earth. The spectra of SW ion flux fluctuations in the range of scales between 0.03 and 100 s are systematically analysed. The difference of slopes in low- and high-frequency parts of spectra and the frequency of the break point between these two characteristic slopes was analysed for different conditions in the SW. The statistical properties of the SW ion flux fluctuations were thoroughly analysed on scales less than 10 s. A high level of intermittency is demonstrated. The extended self-similarity of SW ion flux turbulent flow is constantly observed. The approximation of non-Gaussian probability distribution function of ion flux fluctuations by the Tsallis statistics shows the non-extensive character of SW fluctuations. Statistical characteristics of ion flux fluctuations are compared with the predictions of a log-Poisson model. The log-Poisson parametrization of the structure function scaling has shown that well-defined filament-like plasma structures are, as a rule, observed in the turbulent SW flows. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
CSB: a Python framework for structural bioinformatics.
Kalev, Ivan; Mechelke, Martin; Kopec, Klaus O; Holder, Thomas; Carstens, Simeon; Habeck, Michael
2012-11-15
Computational Structural Biology Toolbox (CSB) is a cross-platform Python class library for reading, storing and analyzing biomolecular structures with rich support for statistical analyses. CSB is designed for reusability and extensibility and comes with a clean, well-documented API following good object-oriented engineering practice. Stable release packages are available for download from the Python Package Index (PyPI) as well as from the project's website http://csb.codeplex.com. ivan.kalev@gmail.com or michael.habeck@tuebingen.mpg.de
Access to pedestrian roads, daily activities, and physical performance of adolescents.
Sjolie, A N
2000-08-01
A cross-sectional study using a questionnaire and physical tests was performed. To study how access to pedestrian roads and daily activities are related to low back strength, low back mobility, and hip mobility in adolescents. Although many authorities express concern about the passive lifestyle of adolescents, little is known about associations between daily activities and physical performance. This study compared 38 youths in a community lacking access to pedestrian roads with 50 youths in a nearby area providing excellent access to pedestrian roads. A standardized questionnaire was used to obtain data about pedestrian roads, school journeys, and activities from the local authorities and the pupils. Low back strength was tested as static endurance strength, low back mobility by modified Schober techniques, and hip mobility by goniometer. For statistical analyses, a P value of 0.05 or less determined significance. In the area using school buses, the pupils had less low back extension, less hamstring flexibility, and less hip abduction, flexion, and extension than pupils in the area with pedestrian roads. Multivariate analyses showed no associations between walking or bicycling to school and anatomic function, but regular walking or bicycling to leisure-time activities was positively associated with low back strength, low back extension, hip flexion, and hip extension. Distance by school bus was negatively associated with hip abduction, hip flexion, hip extension, and hamstring flexibility (P<0.001). Time spent on television or computers was negatively but insignificantly associated with low back strength, hamstring flexibility, hip abduction, and hip flexion (P<0.1). The results indicate that access to pedestrian roads and other lifestyle factors are associated with physical performance.
Identifying currents in the gene pool for bacterial populations using an integrative approach.
Tang, Jing; Hanage, William P; Fraser, Christophe; Corander, Jukka
2009-08-01
The evolution of bacterial populations has recently become considerably better understood due to large-scale sequencing of population samples. It has become clear that DNA sequences from a multitude of genes, as well as a broad sample coverage of a target population, are needed to obtain a relatively unbiased view of its genetic structure and the patterns of ancestry connected to the strains. However, the traditional statistical methods for evolutionary inference, such as phylogenetic analysis, are associated with several difficulties under such an extensive sampling scenario, in particular when a considerable amount of recombination is anticipated to have taken place. To meet the needs of large-scale analyses of population structure for bacteria, we introduce here several statistical tools for the detection and representation of recombination between populations. Also, we introduce a model-based description of the shape of a population in sequence space, in terms of its molecular variability and affinity towards other populations. Extensive real data from the genus Neisseria are utilized to demonstrate the potential of an approach where these population genetic tools are combined with a phylogenetic analysis. The statistical tools introduced here are freely available in BAPS 5.2 software, which can be downloaded from http://web.abo.fi/fak/mnf/mate/jc/software/baps.html.
NASA Astrophysics Data System (ADS)
Adams, W. K.; Perkins, K. K.; Podolefsky, N. S.; Dubson, M.; Finkelstein, N. D.; Wieman, C. E.
2006-06-01
The Colorado Learning Attitudes about Science Survey (CLASS) is a new instrument designed to measure student beliefs about physics and about learning physics. This instrument extends previous work by probing additional aspects of student beliefs and by using wording suitable for students in a wide variety of physics courses. The CLASS has been validated using interviews, reliability studies, and extensive statistical analyses of responses from over 5000 students. In addition, a new methodology for determining useful and statistically robust categories of student beliefs has been developed. This paper serves as the foundation for an extensive study of how student beliefs impact and are impacted by their educational experiences. For example, this survey measures the following: that most teaching practices cause substantial drops in student scores; that a student’s likelihood of becoming a physics major correlates with their “Personal Interest” score; and that, for a majority of student populations, women’s scores in some categories, including “Personal Interest” and “Real World Connections,” are significantly different from men’s scores.
Smith, David; Španěl, Patrik
2016-06-01
This article reflects our observations of recent accomplishments made using selected ion flow tube MS (SIFT-MS). Only brief descriptions are given of SIFT-MS as an analytical method and of the recent extensions to the underpinning analytical ion chemistry required to realize more robust analyses. The challenge of breath analysis is given special attention because, when achieved, it renders analysis of other air media relatively straightforward. Brief overviews are given of recent SIFT-MS breath analyses by leading research groups, noting the desirability of detection and quantification of single volatile biomarkers rather than reliance on statistical analyses, if breath analysis is to be accepted into clinical practice. A 'strengths, weaknesses, opportunities and threats' analysis of SIFT-MS is made, which should help to increase its utility for trace gas analysis.
Identification of Chinese plague foci from long-term epidemiological data
Ben-Ari, Tamara; Neerinckx, Simon; Agier, Lydiane; Cazelles, Bernard; Xu, Lei; Zhang, Zhibin; Fang, Xiye; Wang, Shuchun; Liu, Qiyong; Stenseth, Nils C.
2012-01-01
Carrying out statistical analysis over an extensive dataset of human plague reports in Chinese villages from 1772 to 1964, we identified plague endemic territories in China (i.e., plague foci). Analyses rely on (i) a clustering method that groups time series based on their time-frequency resemblances and (ii) an ecological niche model that helps identify plague suitable territories characterized by value ranges for a set of predefined environmental variables. Results from both statistical tools indicate the existence of two disconnected plague territories corresponding to Northern and Southern China. Altogether, at least four well defined independent foci are identified. Their contours compare favorably with field observations. Potential and limitations of inferring plague foci and dynamics using epidemiological data is discussed. PMID:22570501
An ANOVA approach for statistical comparisons of brain networks.
Fraiman, Daniel; Fraiman, Ricardo
2018-03-16
The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We tested network differences between groups with an analysis of variance (ANOVA) test we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify different subnetworks. As an example, we show the application of this tool in resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep the days before the scan is a relevant variable that must be controlled. Finally, we discuss the potential bias in neuroimaging findings that is generated by some behavioural and brain structure variables. Our method can also be applied to other kind of networks such as protein interaction networks, gene networks or social networks.
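A drastically simplified analogue of this idea, comparing groups of networks through a summary statistic under a label-permutation null, can be sketched as follows. The summary used here (mean edge density) is far cruder than the paper's ANOVA-type statistic, and the adjacency matrices are toys, not Human Connectome Project data:

```python
# Permutation test for a difference between two groups of networks,
# using mean edge density as a (deliberately simple) summary statistic.
from itertools import combinations
import random

def density(adj):
    """Fraction of possible undirected edges present in a 0/1 adjacency matrix."""
    n = len(adj)
    edges = sum(adj[i][j] for i, j in combinations(range(n), 2))
    return edges / (n * (n - 1) / 2)

def perm_test(group_a, group_b, n_perm=2000, seed=0):
    """Two-sided permutation p-value for the between-group density difference."""
    rng = random.Random(seed)
    mean_d = lambda g: sum(map(density, g)) / len(g)
    obs = abs(mean_d(group_a) - mean_d(group_b))
    pooled = group_a + group_b
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                    # permute group labels
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        hits += abs(mean_d(a) - mean_d(b)) >= obs
    return hits / n_perm

dense  = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]     # fully connected toy network
sparse = [[0, 1, 0], [1, 0, 0], [0, 0, 0]]     # one-edge toy network
print(perm_test([dense] * 4, [sparse] * 4))
```

Replacing `density` with a network-aware statistic (and the shuffle with a stratified scheme that controls variables such as pre-scan sleep, as the paper recommends) turns this skeleton into a usable analysis.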
Wynant, Willy; Abrahamowicz, Michal
2016-11-01
Standard optimization algorithms for maximizing likelihood may not be applicable to the estimation of those flexible multivariable models that are nonlinear in their parameters. For applications where the model's structure permits separating estimation of mutually exclusive subsets of parameters into distinct steps, we propose the alternating conditional estimation (ACE) algorithm. We validate the algorithm, in simulations, for estimation of two flexible extensions of Cox's proportional hazards model where the standard maximum partial likelihood estimation does not apply, with simultaneous modeling of (1) nonlinear and time-dependent effects of continuous covariates on the hazard, and (2) nonlinear interaction and main effects of the same variable. We also apply the algorithm in real-life analyses to estimate nonlinear and time-dependent effects of prognostic factors for mortality in colon cancer. Analyses of both simulated and real-life data illustrate good statistical properties of the ACE algorithm and its ability to yield new potentially useful insights about the data structure. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
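The ACE idea can be shown in miniature on a model that is nonlinear in one parameter only. In y = a·exp(b·x), the optimal a has a closed form once b is fixed, so estimation alternates between the two conditional steps. The model and data below are illustrative stand-ins; the paper's actual application is flexible extensions of the Cox model:

```python
# Alternating conditional estimation (ACE) in miniature for y = a*exp(b*x):
# step 1 solves for a in closed form given b; step 2 minimizes over b
# (here by a simple shrinking-step 1-D search) given a.  Toy data only.
from math import exp

def fit_ace(x, y, iters=120):
    b = 0.0
    a = 0.0
    for _ in range(iters):
        # Conditional step for a: least squares given b is closed-form.
        e = [exp(b * xi) for xi in x]
        a = sum(yi * ei for yi, ei in zip(y, e)) / sum(ei * ei for ei in e)
        # Conditional step for b: 1-D search with a shrinking step.
        def sse(bb):
            return sum((yi - a * exp(bb * xi))**2 for xi, yi in zip(x, y))
        step = 0.1
        while step > 1e-9:
            if sse(b + step) < sse(b):
                b += step
            elif sse(b - step) < sse(b):
                b -= step
            else:
                step /= 2
    return a, b

x = [0.0, 0.5, 1.0, 1.5, 2.0]
y = [2.0 * exp(0.8 * xi) for xi in x]   # noise-free, so (a, b) should be recovered
a, b = fit_ace(x, y)
print(round(a, 2), round(b, 2))
```

As in the paper's validation, the alternation converges because each conditional step cannot increase the objective; the rate depends on how strongly the parameter blocks are coupled.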
The evolutionary history of bears is characterized by gene flow across species
Kumar, Vikas; Lammers, Fritjof; Bidon, Tobias; Pfenninger, Markus; Kolter, Lydia; Nilsson, Maria A.; Janke, Axel
2017-01-01
Bears are iconic mammals with a complex evolutionary history. Natural bear hybrids and studies of few nuclear genes indicate that gene flow among bears may be more common than expected and not limited to polar and brown bears. Here we present a genome analysis of the bear family with representatives of all living species. Phylogenomic analyses of 869 mega base pairs divided into 18,621 genome fragments yielded a well-resolved coalescent species tree despite signals for extensive gene flow across species. However, genome analyses using different statistical methods show that gene flow is not limited to closely related species pairs. Strong ancestral gene flow between the Asiatic black bear and the ancestor to polar, brown and American black bear explains uncertainties in reconstructing the bear phylogeny. Gene flow across the bear clade may be mediated by intermediate species such as the geographically wide-spread brown bears leading to large amounts of phylogenetic conflict. Genome-scale analyses lead to a more complete understanding of complex evolutionary processes. Evidence for extensive inter-specific gene flow, found also in other animal species, necessitates shifting the attention from speciation processes achieving genome-wide reproductive isolation to the selective processes that maintain species divergence in the face of gene flow. PMID:28422140
The evolutionary history of bears is characterized by gene flow across species.
Kumar, Vikas; Lammers, Fritjof; Bidon, Tobias; Pfenninger, Markus; Kolter, Lydia; Nilsson, Maria A; Janke, Axel
2017-04-19
Bears are iconic mammals with a complex evolutionary history. Natural bear hybrids and studies of few nuclear genes indicate that gene flow among bears may be more common than expected and not limited to polar and brown bears. Here we present a genome analysis of the bear family with representatives of all living species. Phylogenomic analyses of 869 mega base pairs divided into 18,621 genome fragments yielded a well-resolved coalescent species tree despite signals for extensive gene flow across species. However, genome analyses using different statistical methods show that gene flow is not limited to closely related species pairs. Strong ancestral gene flow between the Asiatic black bear and the ancestor to polar, brown and American black bear explains uncertainties in reconstructing the bear phylogeny. Gene flow across the bear clade may be mediated by intermediate species such as the geographically wide-spread brown bears leading to large amounts of phylogenetic conflict. Genome-scale analyses lead to a more complete understanding of complex evolutionary processes. Evidence for extensive inter-specific gene flow, found also in other animal species, necessitates shifting the attention from speciation processes achieving genome-wide reproductive isolation to the selective processes that maintain species divergence in the face of gene flow.
BIG BANG NUCLEOSYNTHESIS WITH A NON-MAXWELLIAN DISTRIBUTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertulani, C. A.; Fuqua, J.; Hussein, M. S.
The abundances of light elements based on the big bang nucleosynthesis model are calculated using the Tsallis non-extensive statistics. The impact of the variation of the non-extensive parameter q from the unity value is compared to observations and to the abundance yields from the standard big bang model. We find large differences between the reaction rates and the abundances of light elements calculated with the extensive and the non-extensive statistics. We find that the observations are consistent with a non-extensive parameter q = 1 (+0.05, −0.12), indicating that a large deviation from the Boltzmann-Gibbs statistics (q = 1) is highly unlikely.
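The non-extensive ingredient in such a calculation is the Tsallis q-exponential, which replaces the Boltzmann factor exp(−E/kT) in the reaction-rate integrands and recovers it exactly as q → 1. A minimal sketch with illustrative values:

```python
# Tsallis q-exponential: exp_q(x) = [1 + (1-q)x]^(1/(1-q)), cut off at 0
# where the bracket goes negative; reduces to exp(x) in the limit q -> 1.
from math import exp

def q_exp(x, q):
    if abs(q - 1.0) < 1e-12:
        return exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

E_over_kT = 2.0
boltzmann = q_exp(-E_over_kT, 1.0)    # ordinary Boltzmann factor exp(-2)
tsallis   = q_exp(-E_over_kT, 0.95)   # q slightly below 1, as the fit allows
print(round(boltzmann, 4), round(tsallis, 4))
```

For q < 1 the distribution has a sharp cutoff (the bracket reaches zero at finite energy), which is precisely why the high-energy tails, and hence the thermonuclear reaction rates, change so strongly with q.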
Analysis of Flexural Fatigue Strength of Self Compacting Fibre Reinforced Concrete Beams
NASA Astrophysics Data System (ADS)
Murali, G.; Sudar Celestina, J. P. Arul; Subhashini, N.; Vigneshwari, M.
2017-07-01
This study presents an extensive statistical investigation of variations in the flexural fatigue life of self-compacting fibrous concrete (FC) beams. For this purpose, experimental data from earlier researchers were examined using the two-parameter Weibull distribution. Two methods, graphical and method of moments, were used to analyse the variations in the experimental data, and the results are presented in the form of probability of survival. The Weibull parameter values obtained from the graphical method and the method of moments are precise. At the 0.7 stress level, the fatigue life is 59,861 cycles for a reliability of 90%.
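The graphical method referred to here linearizes the Weibull survival function S(N) = exp(−(N/η)^β) via ln(−ln(1−F)) = β·ln N − β·ln η and fits a straight line through median-rank plotting positions. A sketch on made-up fatigue lives (not the paper's data):

```python
# Two-parameter Weibull by median-rank regression (the "graphical" method),
# plus the fatigue life at a given reliability.  Lives are hypothetical.
from math import log, exp

def weibull_graphical(lives):
    """Fit ln(-ln(1-F)) = beta*ln(N) - beta*ln(eta) by least squares."""
    n = len(lives)
    xs, ys = [], []
    for i, N in enumerate(sorted(lives), start=1):
        F = (i - 0.3) / (n + 0.4)          # Bernard's median-rank approximation
        xs.append(log(N))
        ys.append(log(-log(1.0 - F)))
    mx, my = sum(xs)/n, sum(ys)/n
    beta = (sum((x - mx)*(y - my) for x, y in zip(xs, ys))
            / sum((x - mx)**2 for x in xs))
    eta = exp(mx - my/beta)                # intercept gives -beta*ln(eta)
    return beta, eta

def life_at_reliability(beta, eta, R):
    """Invert S(N) = exp(-(N/eta)^beta) = R for the design life N."""
    return eta * (-log(R)) ** (1.0 / beta)

lives = [24000, 41000, 55000, 68000, 92000]   # hypothetical cycles to failure
beta, eta = weibull_graphical(lives)
print(round(beta, 2), round(eta), round(life_at_reliability(beta, eta, 0.90)))
```

The life at 90% reliability is necessarily well below the characteristic life η (where survival is only e⁻¹ ≈ 37%), which is the practical point of reporting probability-of-survival curves for fatigue design.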
Testosterone replacement therapy and the heart: friend, foe or bystander?
Canfield, Steven; Wang, Run
2016-01-01
The role of testosterone therapy (TTh) in cardiovascular disease (CVD) outcomes is still controversial, and it seems it will remain inconclusive for the moment. An extensive body of literature has investigated the association of endogenous testosterone and use of TTh with CVD events, including several meta-analyses. In some instances, a number of studies reported beneficial effects of TTh on CVD events, and in other instances the body of literature reported detrimental effects or no effects at all. Yet, no review article has scrutinized this body of literature using the magnitude of associations and statistical significance reported for this relationship. We critically reviewed the previous and emerging body of literature that investigated the association of endogenous testosterone and use of TTh with CVD events (only fatal and nonfatal). These studies were divided into three groups, “beneficial (friendly use)”, “detrimental (foe)” and “no effects at all (bystander)”, based on their magnitude of associations and statistical significance from original research studies and meta-analyses of epidemiological studies and of randomized controlled trials (RCT’s). In this review article, the studies reporting a significant association of high levels of testosterone with a reduced risk of CVD events in original prospective studies and meta-analyses of cross-sectional and prospective studies seem to be the most consistent. However, the number of meta-analyses of RCT’s does not provide a clear picture after we divided them into the beneficial, detrimental or no-effects-at-all groups using their magnitudes of association and statistical significance. From this review, we suggest that we need a study or number of studies with the adequate power, epidemiological, and clinical data to provide a definitive conclusion on whether the effect of TTh on the natural history of CVD is real or not. PMID:28078222
Schmidt, Kerstin; Schmidtke, Jörg; Mast, Yvonne; Waldvogel, Eva; Wohlleben, Wolfgang; Klemke, Friederike; Lockau, Wolfgang; Hausmann, Tina; Hühns, Maja; Broer, Inge
2017-08-01
Potatoes are a promising system for industrial production of the biopolymer cyanophycin as a second compound in addition to starch. To assess the efficiency of the system in the field, we analysed its stability, specifically its sensitivity to environmental factors. Field and greenhouse trials with transgenic potatoes (two independent events) were carried out over three years. The influence of environmental factors was measured, and target compounds in the transgenic plants (cyanophycin, amino acids) were analysed for differences from control plants. Furthermore, non-target parameters (starch content; number, weight and size of tubers) were analysed for equivalence with control plants. The large amount of data was handled using modern statistical approaches to model the correlation between influencing environmental factors (year of cultivation, nitrogen fertilization, origin of plants, greenhouse or field cultivation) and key components (starch, amino acids, cyanophycin) and agronomic characteristics. General linear models were used for modelling, and standard effect sizes were applied to compare conventional and genetically modified plants. Altogether, the field trials prove that significant cyanophycin production is possible without reduction of starch content. Non-target compound composition appears to be equivalent under varying environmental conditions. Additionally, a quick test to measure cyanophycin content gives results similar to those of the extensive enzymatic test. This work facilitates the commercial cultivation of cyanophycin potatoes.
Sul, Jae Hoon; Bilow, Michael; Yang, Wen-Yun; Kostem, Emrah; Furlotte, Nick; He, Dan; Eskin, Eleazar
2016-03-01
Although genome-wide association studies (GWASs) have discovered numerous novel genetic variants associated with many complex traits and diseases, those genetic variants typically explain only a small fraction of phenotypic variance. Factors that account for phenotypic variance include environmental factors and gene-by-environment interactions (GEIs). Recently, several studies have conducted genome-wide gene-by-environment association analyses and demonstrated important roles of GEIs in complex traits. One of the main challenges in these association studies is to control for effects of population structure that may cause spurious associations. Many studies have analyzed how population structure influences the statistics of genetic variants and have developed several statistical approaches to correct for population structure. However, the impact of population structure on GEI statistics in GWASs has not been extensively studied, nor have there been methods designed to correct GEI statistics for population structure. In this paper, we show both analytically and empirically that population structure may cause spurious GEIs, and we use both simulation and two GWAS datasets to support our finding. We propose a statistical approach based on mixed models to account for population structure in GEI statistics. We find that our approach effectively controls for population structure in statistics for GEIs as well as for genetic variants.
Information filtering via biased heat conduction.
Liu, Jian-Guo; Zhou, Tao; Guo, Qiang
2011-09-01
The process of heat conduction has recently found application in personalized recommendation [Zhou et al., Proc. Natl. Acad. Sci. USA 107, 4511 (2010)], which achieves high diversity but low accuracy. By decreasing the temperatures of small-degree objects, we present an improved algorithm, called biased heat conduction, which can simultaneously enhance accuracy and diversity. Extensive experimental analyses demonstrate that, compared with the standard heat conduction algorithm, accuracy on the MovieLens, Netflix, and Delicious datasets is improved by 43.5%, 55.4% and 19.2%, respectively, while diversity is increased or approximately unchanged. Further statistical analyses suggest that the present algorithm can simultaneously identify users' mainstream and special tastes, resulting in better performance than the standard heat conduction algorithm. This work provides a creditable way for highly efficient information filtering.
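The biased heat-conduction idea can be sketched in a few lines. The following is an illustrative toy implementation on a small user-object network, not the authors' code; the function name and the bias exponent `gamma` are our own (the paper tunes an analogous exponent that lowers the temperature of small-degree objects; `gamma = 1` recovers plain heat conduction).

```python
import numpy as np

def biased_heat_conduction(A, user, gamma=0.8):
    """Score unseen objects for one user on a user-object bipartite network.

    A: binary user-object adjacency matrix (users x objects).
    gamma: bias exponent (our name; gamma = 1 gives standard heat conduction,
    gamma < 1 raises the scores of small-degree objects less aggressively).
    """
    k_user = A.sum(axis=1)           # user degrees
    k_obj = A.sum(axis=0)            # object degrees
    f = A[user].astype(float)        # initial "temperature": 1 on collected objects
    # Step 1: objects -> users (each user averages over the objects it collected).
    h = (A @ f) / np.maximum(k_user, 1)
    # Step 2: users -> objects, dividing by k_obj**gamma instead of k_obj.
    scores = (A.T @ h) / np.maximum(k_obj, 1) ** gamma
    scores[A[user] == 1] = -np.inf   # never re-recommend collected items
    return scores
```

Ranking the finite entries of `scores` then yields the recommendation list for that user.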
Experimental design matters for statistical analysis: how to handle blocking.
Jensen, Signe M; Schaarschmidt, Frank; Onofri, Andrea; Ritz, Christian
2018-03-01
Nowadays, evaluation of the effects of pesticides often relies on experimental designs that involve multiple concentrations of the pesticide of interest or multiple pesticides at specific comparable concentrations and, possibly, secondary factors of interest. Unfortunately, the experimental design is often more or less neglected when analysing data. Two data examples were analysed using different modelling strategies. First, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared. Second, translocation of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. It was shown that results from suboptimal approaches (two-sample t-tests and ordinary ANOVA assuming independent observations) may be both quantitatively and qualitatively different from the results obtained using an appropriate linear mixed model. The simulations demonstrated that the different approaches may lead to differences in coverage percentages of confidence intervals and type I error rates, confirming that misleading conclusions can easily happen when an inappropriate statistical approach is chosen. To ensure that experimental data are summarized appropriately, avoiding misleading conclusions, the experimental design should be duly reflected in the choice of statistical approaches and models. We recommend that author guidelines explicitly point out that authors need to indicate how the statistical analysis reflects the experimental design. © 2017 Society of Chemical Industry.
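The blocking point can be illustrated numerically. A minimal sketch with made-up numbers, not the paper's maize data: with one observation per treatment per block, the paired t-test plays the role of the mixed model with a random block intercept, while the two-sample t-test wrongly treats all observations as independent and lets block variation inflate the error term.

```python
import numpy as np
from scipy import stats

# Toy randomized complete block design: 12 blocks with large
# block-to-block variation and a true treatment effect of 1.0.
# (Illustrative numbers only.)
block = np.arange(12, dtype=float) * 2.0
noise_c = np.array([.3, -.2, .1, -.4, .2, .0, -.1, .3, -.3, .1, .2, -.2])
noise_t = np.array([-.1, .2, -.3, .1, .0, .2, -.2, .1, .3, -.1, -.2, .2])
control = block + noise_c
treated = block + 1.0 + noise_t

# Suboptimal: two-sample t-test treats all 24 observations as independent,
# so the block spread ends up in the error term.
t_wrong, p_wrong = stats.ttest_ind(treated, control)

# Design-respecting: paired t-test; the block effect cancels in the
# within-block differences (the simplest random-intercept analysis).
t_right, p_right = stats.ttest_rel(treated, control)
```

With these numbers the naive test misses a clearly present effect that the blocked analysis detects, which is exactly the qualitative discrepancy the abstract warns about.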
NASA Astrophysics Data System (ADS)
Vallianatos, Filippos
2015-04-01
Despite the extreme complexity that characterizes the earthquake generation process, simple phenomenology seems to apply to the collective properties of seismicity. The best known example is the Gutenberg-Richter relation. Short- and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity, providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how it can be derived from first principles, one may wonder: how can the collective properties of the set formed by all earthquakes in a given region be derived, and how does the structure of seismicity depend on its elementary constituents, the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, making the use of statistical physics necessary to understand the collective properties of earthquakes. A natural question then arises: what type of statistical physics is appropriate to describe effects ranging from the microscale and crack-opening level to the level of large earthquakes? An answer could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi)fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how they are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon, introducing the new topic of non-extensive statistical seismology. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project.
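The central quantity of the Tsallis framework can be written down compactly. A minimal sketch, with our own function name and not tied to any particular seismicity dataset:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1) of a discrete
    distribution p; in the limit q -> 1 it recovers the Shannon entropy,
    i.e. ordinary (extensive) statistical mechanics."""
    p = np.asarray(p, float)
    p = p[p > 0]                      # ignore zero-probability states
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))
```

For independent subsystems the entropy composes as S_q(A,B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B); the extra cross term is precisely the non-extensivity the abstract refers to, and it vanishes as q approaches 1.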
GenomeGraphs: integrated genomic data visualization with R.
Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine
2009-01-06
Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.
Statistical Analysis of NAS Parallel Benchmarks and LINPACK Results
NASA Technical Reports Server (NTRS)
Meuer, Hans-Werner; Simon, Horst D.; Strohmaier, Erich; Lasinski, T. A. (Technical Monitor)
1994-01-01
In the last three years extensive performance data have been reported for parallel machines, based both on the NAS Parallel Benchmarks and on LINPACK. In this study we have used the reported benchmark results and performed a number of statistical experiments using factor, cluster, and regression analyses. In addition to the performance results of LINPACK and the eight NAS parallel benchmarks, we have also included the peak performance of each machine and the LINPACK n and n_1/2 values. Some of the results and observations can be summarized as follows: 1) All benchmarks are strongly correlated with peak performance. 2) LINPACK and EP each have a unique signature. 3) The remaining NPB can be grouped into three groups: (CG and IS), (LU and SP), and (MG, FT, and BT). Hence three (or four with EP) benchmarks are sufficient to characterize overall NPB performance. Our poster presentation will follow a standard poster format and will present the data of our statistical analysis in detail.
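The kind of correlation and cluster analysis described above can be sketched as follows. The numbers are fabricated for illustration only (they are not the NPB or LINPACK measurements); the point is the workflow: correlate benchmark columns across machines, then cluster benchmarks on one minus their correlation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic per-machine scores (rows: machines, columns: benchmarks).
# Illustrative numbers only -- each benchmark tracks peak performance.
rng = np.random.default_rng(0)
peak = np.array([10.0, 20.0, 40.0, 80.0, 160.0])   # notional peak Gflop/s
bench = {
    "LINPACK": 0.80 * peak,
    "CG":      0.30 * peak,
    "IS":      0.25 * peak,
    "MG":      0.50 * peak,
}
X = np.column_stack(list(bench.values()))
X = X + rng.normal(0, 0.5, X.shape)                # measurement noise

# Pearson correlations between benchmark columns.
R = np.corrcoef(X, rowvar=False)

# Hierarchical clustering of benchmarks on the distance 1 - correlation
# (condensed upper-triangle vector, as scipy's linkage expects).
Z = linkage(1 - R[np.triu_indices(4, 1)], method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
```

On data like this every benchmark is strongly correlated with peak performance, mirroring observation 1) of the abstract; the cluster labels then expose groups of benchmarks with nearly identical signatures.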
Quantifying variation in speciation and extinction rates with clade data.
Paradis, Emmanuel; Tedesco, Pablo A; Hugueny, Bernard
2013-12-01
High-level phylogenies are very common in evolutionary analyses, although they are often treated as incomplete data. Here, we provide statistical tools to analyze what we name "clade data," which are the ages of clades together with their numbers of species. We develop a general approach for the statistical modeling of variation in speciation and extinction rates, including temporal variation, unknown variation, and linear and nonlinear modeling. We show how this approach can be generalized to a wide range of situations, including testing the effects of life-history traits and environmental variables on diversification rates. We report the results of an extensive simulation study to assess the performance of some statistical tests presented here, as well as of the estimators of speciation and extinction rates. These latter results suggest that the extinction rate can be estimated correctly even in the absence of fossils. An example with data on fish is presented. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
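As a toy version of likelihood modeling on clade data, consider the pure-birth (Yule) special case, where the number of species descended from one lineage after time t is geometric with parameter e^(-λt). This is a simplification of the paper's birth-death framework, with our own function names; for a single clade of age t and size n the maximum-likelihood speciation rate is ln(n)/t, which the numerical fit should recover.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def yule_loglik(lam, ages, sizes):
    """Log-likelihood of clade data (stem age t, species count n) under a
    pure-birth (Yule) process: N(t) is geometric, P(n | t) = p * (1-p)**(n-1)
    with p = exp(-lam * t)."""
    ages, sizes = np.asarray(ages, float), np.asarray(sizes, float)
    p = np.exp(-lam * ages)
    return float(np.sum(np.log(p) + (sizes - 1) * np.log1p(-p)))

def yule_mle(ages, sizes):
    """Speciation-rate MLE by one-dimensional numerical maximization."""
    res = minimize_scalar(lambda lam: -yule_loglik(lam, ages, sizes),
                          bounds=(1e-6, 10.0), method="bounded")
    return res.x
```

Extinction, temporal variation, and covariate effects, as in the paper, replace this one-parameter likelihood with richer birth-death likelihoods, but the fitting principle is the same.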
Hernández-Socorro, Carmen Rosa; Saavedra, Pedro; Ramírez Felipe, José; Bohn Sarmiento, Uriel; Ruiz-Santana, Sergio
2017-04-21
The risk factors associated with long-term survival were assessed in patients with liver metastases of colorectal carcinoma undergoing ablative therapies. This single-centre cohort study retrospectively analysed prospectively collected consecutive patients with unresectable metastatic liver disease of colorectal carcinoma treated with ablative therapies between 1996 and 2013. Factors associated with survival time were identified using Cox's proportional hazard model with time-dependent covariates. A forward variable selection based on the Akaike information criterion was performed. Relative risks and 95% confidence intervals for each factor were calculated. Statistical significance was set at P<.05. Seventy-five patients with liver metastases of colorectal cancer, with a mean age of 65.6 (10.3) years, underwent 106 treatments. The variables selected were good quality of life (RR 0.308, 95% CI 0.150-0.632) and tumour extension (RR 3.070, 95% CI 1.776-5.308). The median overall survival was 18.5 months (95% CI 17.4-24.4). The median survival was 13.5 vs. 23.4 months for patients with and without tumour extension, and 23.0 vs. 12.8 months for patients with good versus fair or poor quality of life, respectively. Good quality of life and tumour extension were the only statistically significant predictors of long-term survival in patients with colorectal carcinoma and liver metastatic disease undergoing ablative treatment with ultrasound. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
Hoehenwarter, Wolfgang; Larhlimi, Abdelhalim; Hummel, Jan; Egelhofer, Volker; Selbig, Joachim; van Dongen, Joost T; Wienkoop, Stefanie; Weckwerth, Wolfram
2011-07-01
Mass Accuracy Precursor Alignment is a fast and flexible method for comparative proteome analysis that allows the comparison of unprecedented numbers of shotgun proteomics analyses on a personal computer in a matter of hours. We compared 183 LC-MS analyses and more than 2 million MS/MS spectra and could define and separate the proteomic phenotypes of field grown tubers of 12 tetraploid cultivars of the crop plant Solanum tuberosum. Protein isoforms of patatin as well as other major gene families such as lipoxygenase and cysteine protease inhibitor that regulate tuber development were found to be the primary source of variability between the cultivars. This suggests that differentially expressed protein isoforms modulate genotype specific tuber development and the plant phenotype. We properly assigned the measured abundance of tryptic peptides to different protein isoforms that share extensive stretches of primary structure and thus inferred their abundance. Peptides unique to different protein isoforms were used to classify the remaining peptides assigned to the entire subset of isoforms based on a common abundance profile using multivariate statistical procedures. We identified nearly 4000 proteins which we used for quantitative functional annotation making this the most extensive study of the tuber proteome to date.
Analysis of the 5 iron golf swing when hitting for maximum distance.
Healy, Aoife; Moran, Kieran A; Dickson, Jane; Hurley, Cillian; Smeaton, Alan F; O'Connor, Noel E; Kelly, Philip; Haahr, Mads; Chockalingam, Nachiappan
2011-07-01
Most previous research on golf swing mechanics has focused on the driver club. The aim of this study was to identify the kinematic factors that contribute to greater hitting distance when using the 5 iron club. Three-dimensional marker coordinate data were collected (250 Hz) to calculate joint kinematics at eight key swing events, while a swing analyser measured club swing and ball launch characteristics. Thirty male participants were assigned to one of two groups based on their ball launch speed (high: 52.9 ± 2.1 m · s(-1); low: 39.9 ± 5.2 m · s(-1)). Statistical analyses were used to identify variables that differed significantly between the two groups. Significant differences were evident between the two groups for club face impact point and for a number of joint angles and angular velocities: greater shoulder flexion and less left shoulder internal rotation in the backswing, greater extension angular velocity in both shoulders at early downswing, greater left shoulder adduction angular velocity at ball contact, greater hip joint movement and X Factor angle during the downswing, and greater left elbow extension early in the downswing appeared to contribute to greater hitting distance with the 5 iron club.
Homeopathy: meta-analyses of pooled clinical data.
Hahn, Robert G
2013-01-01
In the first decade of the evidence-based era, which began in the mid-1990s, meta-analyses were used to scrutinize homeopathy for evidence of beneficial effects in medical conditions. In this review, meta-analyses including pooled data from placebo-controlled clinical trials of homeopathy and the aftermath in the form of debate articles were analyzed. In 1997 Klaus Linde and co-workers identified 89 clinical trials that showed an overall odds ratio of 2.45 in favor of homeopathy over placebo. There was a trend toward smaller benefit from studies of the highest quality, but the 10 trials with the highest Jadad score still showed homeopathy had a statistically significant effect. These results challenged academics to perform alternative analyses that, to demonstrate the lack of effect, relied on extensive exclusion of studies, often to the degree that conclusions were based on only 5-10% of the material, or on virtual data. The ultimate argument against homeopathy is the 'funnel plot' published by Aijing Shang's research group in 2005. However, the funnel plot is flawed when applied to a mixture of diseases, because studies with expected strong treatment effects are, for ethical reasons, powered lower than studies with expected weak or unclear treatment effects. To conclude that homeopathy lacks clinical effect, more than 90% of the available clinical trials had to be disregarded. Alternatively, flawed statistical methods had to be applied. Future meta-analyses should focus on the use of homeopathy in specific diseases or groups of diseases instead of pooling data from all clinical trials. © 2013 S. Karger GmbH, Freiburg.
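The pooling step behind such meta-analyses can be sketched with a fixed-effect, inverse-variance combination of 2x2 tables on the log odds-ratio scale. This is a generic textbook formula (with the Woolf variance of the log odds ratio), not the specific method of any study discussed above; the function name is our own.

```python
import math

def pooled_odds_ratio(tables):
    """Fixed-effect (inverse-variance) pooled odds ratio from 2x2 tables
    (a, b, c, d) = (treatment events, treatment non-events,
                    control events, control non-events)."""
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of log OR
        w = 1.0 / var                          # inverse-variance weight
        num += w * log_or
        den += w
    return math.exp(num / den)
```

Because every excluded trial drops a weight from the sums, the pooled estimate can be moved substantially by the study-selection decisions the abstract criticizes.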
Gürün, O O; Fatouros, P P; Kuhn, G M; de Paredes, E S
2001-04-01
We report on some extensions and further developments of a well-known microcalcification detection algorithm based on adaptive noise equalization. Tissue-equivalent phantom images with and without labeled microcalcifications were subjected to this algorithm, and analyses of the results revealed some shortcomings in the approach. In particular, it was observed that the method of estimating the width of distributions in the feature space was based on assumptions which resulted in the loss of similarity-preservation characteristics. A modification involving a change of estimator statistic was made, and the modified approach was tested on the same phantom images. Other modifications for improving detectability, such as downsampling and the use of alternate local contrast filters, were also tested. The results indicate that these modifications yield improvements in detectability while extending the generality of the approach. Extensions to real mammograms and further directions of research are discussed.
Amplitude analysis of four-body decays using a massively-parallel fitting framework
NASA Astrophysics Data System (ADS)
Hasse, C.; Albrecht, J.; Alves, A. A., Jr.; d'Argent, P.; Evans, T. D.; Rademacker, J.; Sokoloff, M. D.
2017-10-01
The GooFit Framework is designed to perform maximum-likelihood fits for arbitrary functions on various parallel back ends, for example a GPU. We present an extension to GooFit which adds the functionality to perform time-dependent amplitude analyses of pseudoscalar mesons decaying into four pseudoscalar final states. Benchmarks of this functionality show a significant performance increase when utilizing a GPU compared to a CPU. Furthermore, this extension is employed to study the sensitivity to the D0-D̄0 mixing parameters x and y in a time-dependent amplitude analysis of the decay D0 → K+π-π+π-. Studying a sample of 50 000 events and setting the central values to the world average of x = (0.49 ± 0.15)% and y = (0.61 ± 0.08)%, the statistical sensitivities of x and y are determined to be σ(x) = 0.019% and σ(y) = 0.019%.
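The role of the mixing parameters x and y can be seen in the standard time evolution of a neutral D meson. Below is a minimal sketch of the (unnormalized) probability that a D0 produced at t = 0 is observed as a D̄0 at decay time t, assuming no CP violation; this is the generic two-state mixing formalism, not GooFit code, and the four-body amplitude analysis in the paper is considerably more involved.

```python
import math

def mixing_prob(t_over_tau, x, y):
    """Unnormalized probability that a D0 has oscillated into a D0bar by
    decay time t (in lifetimes), assuming |q/p| = 1 (no CP violation):
        P(t) ~ (1/2) * exp(-t/tau) * [cosh(y*t/tau) - cos(x*t/tau)]
    For small x and y this grows as (x**2 + y**2) * t**2 / 4 before the
    exponential decay takes over.
    """
    t = t_over_tau
    return 0.5 * math.exp(-t) * (math.cosh(y * t) - math.cos(x * t))
```

Because the oscillation term is quadratic in x and y at small times, very large and cleanly reconstructed samples are needed for the per-mille sensitivities quoted above, which is what motivates the GPU-accelerated fits.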
Improta, Roberto; Vitagliano, Luigi; Esposito, Luciana
2015-11-01
The elucidation of the mutual influence between peptide bond geometry and local conformation has important implications for protein structure refinement, validation, and prediction. To gain insights into the structural determinants and the energetic contributions associated with protein/peptide backbone plasticity, we here report an extensive analysis of the variability of the peptide bond angles by combining statistical analyses of protein structures and quantum mechanics calculations on small model peptide systems. Our analyses demonstrate that all the backbone bond angles strongly depend on the peptide conformation and unveil the existence of regular trends as a function of ψ and/or φ. The excellent agreement of the quantum mechanics calculations with the statistical surveys of protein structures validates the computational scheme here employed and demonstrates that the valence geometry of the protein/peptide backbone is primarily dictated by local interactions. Notably, for the first time we show that the position of the H(α) hydrogen atom, which is an important parameter in NMR structural studies, is also dependent on the local conformation. Most of the trends observed may be satisfactorily explained by invoking steric repulsive interactions; in some specific cases the valence bond variability is also influenced by hydrogen-bond-like interactions. Moreover, we can provide a reliable estimate of the energies involved in the interplay between geometry and conformations. © 2015 Wiley Periodicals, Inc.
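The valence angles surveyed here are simple functions of atomic coordinates. A minimal sketch of the measurement itself (the coordinates in the test are hypothetical, not taken from any structure):

```python
import numpy as np

def bond_angle(a, b, c):
    """Valence angle (degrees) at atom b formed by atoms a-b-c,
    from their Cartesian coordinates."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
```

Sweeping such angles (e.g. N-Cα-C) over many structures, binned by the local φ/ψ values, is the statistical-survey half of the analysis described above.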
Wen, Dingqiao; Yu, Yun; Hahn, Matthew W.; Nakhleh, Luay
2016-01-01
The role of hybridization and subsequent introgression has been demonstrated in an increasing number of species. Recently, Fontaine et al. (Science, 347, 2015, 1258524) conducted a phylogenomic analysis of six members of the Anopheles gambiae species complex. Their analysis revealed a reticulate evolutionary history and pointed to extensive introgression on all four autosomal arms. The study further highlighted the complex evolutionary signals that the co-occurrence of incomplete lineage sorting (ILS) and introgression can give rise to in phylogenomic analyses. While tree-based methodologies were used in the study, phylogenetic networks provide a more natural model to capture reticulate evolutionary histories. In this work, we reanalyse the Anopheles data using a recently devised framework that combines the multispecies coalescent with phylogenetic networks. This framework allows us to capture ILS and introgression simultaneously, and forms the basis for statistical methods for inferring reticulate evolutionary histories. The new analysis reveals a phylogenetic network with multiple hybridization events, some of which differ from those reported in the original study. To elucidate the extent and patterns of introgression across the genome, we devise a new method that quantifies the use of reticulation branches in the phylogenetic network by each genomic region. Applying the method to the mosquito data set reveals the evolutionary history of all the chromosomes. This study highlights the utility of ‘network thinking’ and the new insights it can uncover, in particular in phylogenomic analyses of large data sets with extensive gene tree incongruence. PMID:26808290
Trends in study design and the statistical methods employed in a leading general medicine journal.
Gosho, M; Sato, Y; Nagashima, K; Takahashi, S
2018-02-01
Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. Study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Owing to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g. the Kaplan-Meier estimator and the Cox regression model) were most frequently applied, the Gray test and the Fine-Gray proportional hazards model accounting for competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. Single imputation methods are not recommended for a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. In the light of the information found in some publications, model-based approaches for handling missing data should replace single imputation methods for primary analysis. Use of adaptive designs with interim analyses is increasing after the presentation of the FDA guidance for adaptive design. © 2017 John Wiley & Sons Ltd.
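As a concrete example of the standard survival methods mentioned above, the Kaplan-Meier estimator can be written in a few lines. A minimal sketch with our own function name: at each distinct event time, the survival estimate is multiplied by (1 - d/n), where d is the number of events among the n subjects still at risk.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate S(t) at each distinct event time.

    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns the sorted event times and the survival estimates after each.
    """
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    out_t, out_s, s = [], [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = int(np.sum(times >= t))                 # n at risk at t
        d = int(np.sum((times == t) & (events == 1)))     # events at t
        s *= 1.0 - d / at_risk
        out_t.append(float(t))
        out_s.append(s)
    return out_t, out_s
```

Censored subjects leave the risk set without forcing a step down, which is exactly what distinguishes this estimator from a naive empirical survival curve.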
Effects of foot strike on low back posture, shock attenuation, and comfort in running.
Delgado, Traci L; Kubera-Shelton, Emilia; Robb, Robert R; Hickman, Robbin; Wallmann, Harvey W; Dufek, Janet S
2013-03-01
Barefoot running (BF) is gaining popularity in the running community. Biomechanical changes occur with BF, especially when initial contact changes from rearfoot strike (RFS) to forefoot strike (FFS). Changes in lumbar spine range of motion (ROM), particularly involving lumbar lordosis, have been associated with increased low back pain. However, it is not known if changing from RFS to FFS affects lumbar lordosis or low back pain. The purpose of this study was to determine whether a change from RFS to FFS would change lumbar lordosis, influence shock attenuation, or change comfort levels in healthy recreational/experienced runners. Forty-three subjects performed a warm-up on the treadmill, during which a self-selected foot strike pattern was determined. Instructions on running RFS/FFS were given, and two conditions were examined. Each condition consisted of 90 s of BF with RFS or FFS, with the order randomly assigned. A comfort questionnaire was completed after both conditions. Fifteen consecutive strides from each condition were extracted for analyses. Statistically significant differences between FFS and RFS were found for shock attenuation (P < 0.001), peak leg acceleration (P < 0.001), and overall lumbar ROM (P = 0.045). There were no statistically significant differences between FFS and RFS in lumbar extension or lumbar flexion. There was a statistically significant difference between FFS and RFS for the comfort/discomfort item of the comfort questionnaire (P = 0.007). There were no statistically significant differences for the other questions or the average of all questions. The change in foot strike from RFS to FFS decreased overall ROM in the lumbar spine but did not change the flexion or extension position of the lumbar spine. Shock attenuation was greater in RFS. RFS was perceived as the more comfortable running pattern.
Franz, D; Franz, K; Roeder, N; Hörmann, K; Fischer, R-J; Alberty, Jürgen
2007-07-01
When the German DRG system was implemented there was some doubt about whether patients undergoing extensive head and neck surgery would be properly accounted for. Significant efforts have therefore been invested in the analysis and case allocation of this group. The object of this study was to investigate whether changes within the German DRG system have led to improved case allocation. Cost data received from 25 ENT departments on 518 prospectively documented cases of extensive head and neck surgery were compared with data from the German institute responsible for hospital remuneration (InEK). Statistical measures used by InEK were applied to analyse the quality of the overall system and the homogeneity of the individual case groups. The reduction of variance of inlier costs improved by about 107.3% from the 2004 version to the 2007 version of the German DRG system. The average coefficient of cost homogeneity rose by about 9.7% in the same period. Case mix index and DRG revenues were redistributed from less extensive to more complex operations. Hospitals with large numbers of extensive operations and university hospitals will benefit most from this development. Appropriate case allocation of extensive operations on the head and neck has been improved by the continued development of the German DRG system, culminating in the 2007 version. Further adjustments will be needed in the future.
Tsallis non-extensive statistics and solar wind plasma complexity
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.
2015-03-01
This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26 September 2011. Solar wind plasma is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields (B⃗, E⃗) and matter fields (particle and current densities or bulk plasma distributions). This study clearly shows the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inability of classical magnetohydrodynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of solar wind dynamics, since these theories assume smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian, non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
NASA Technical Reports Server (NTRS)
Myers, R. H.
1976-01-01
The depletion of ozone in the stratosphere is examined, and causes for the depletion are cited. Ground station and satellite measurements of ozone, which are taken on a worldwide basis, are discussed. Instruments used in ozone measurement are described, such as the Dobson spectrophotometer, which is credited with providing the longest and most extensive series of ground-based observations of stratospheric ozone. Other ground-based instruments used to measure ozone are also discussed. The statistical differences among ground-based ozone measurements from these different instruments are compared with each other and with satellite measurements. Mathematical methods of analyzing the variability of ozone concentration with respect to time and latitude (e.g., trend analysis and linear regression analysis) are described. Various time series models that can be employed to account for variability in ozone concentration are examined.
Why Flash Type Matters: A Statistical Analysis
NASA Astrophysics Data System (ADS)
Mecikalski, Retha M.; Bitzer, Phillip M.; Carey, Lawrence D.
2017-09-01
While the majority of research only differentiates between intracloud (IC) and cloud-to-ground (CG) flashes, there exists a third flash type, known as hybrid flashes. These flashes have extensive IC components as well as return strokes to ground but are misclassified as CG flashes in current flash type analyses due to the presence of a return stroke. In an effort to show that IC, CG, and hybrid flashes should be separately classified, the two-sample Kolmogorov-Smirnov (KS) test was applied to the flash sizes, flash initiation, and flash propagation altitudes for each of the three flash types. The KS test statistically showed that IC, CG, and hybrid flashes do not have the same parent distributions and thus should be separately classified. Separate classification of hybrid flashes will lead to improved lightning-related research, because unambiguously classified hybrid flashes occur on the same order of magnitude as CG flashes for multicellular storms.
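The two-sample Kolmogorov-Smirnov comparison used in the flash-type study can be sketched as follows. This is a minimal illustration with synthetic, log-normal stand-in samples, not the study's lightning data; the distribution shapes and parameters are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical flash-size samples for two flash types (synthetic stand-ins).
ic_sizes = rng.lognormal(mean=2.0, sigma=0.5, size=500)
cg_sizes = rng.lognormal(mean=2.3, sigma=0.5, size=500)

# H0: both samples are drawn from the same parent distribution.
res = stats.ks_2samp(ic_sizes, cg_sizes)
print(f"D={res.statistic:.3f}, p={res.pvalue:.3g}")
```

A small p-value rejects the hypothesis of a common parent distribution, which is the logic the study applies per pair of flash types.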
Rhodes, Kirsty M; Turner, Rebecca M; White, Ian R; Jackson, Dan; Spiegelhalter, David J; Higgins, Julian P T
2016-12-20
Many meta-analyses combine results from only a small number of studies, a situation in which the between-study variance is imprecisely estimated when standard methods are applied. Bayesian meta-analysis allows incorporation of external evidence on heterogeneity, providing the potential for more robust inference on the effect size of interest. We present a method for performing Bayesian meta-analysis using data augmentation, in which we represent an informative conjugate prior for between-study variance by pseudo data and use meta-regression for estimation. To assist in this, we derive predictive inverse-gamma distributions for the between-study variance expected in future meta-analyses. These may serve as priors for heterogeneity in new meta-analyses. In a simulation study, we compare approximate Bayesian methods using meta-regression and pseudo data against fully Bayesian approaches based on importance sampling techniques and Markov chain Monte Carlo (MCMC). We compare the frequentist properties of these Bayesian methods with those of the commonly used frequentist DerSimonian and Laird procedure. The method is implemented in standard statistical software and provides a less complex alternative to standard MCMC approaches. An importance sampling approach produces almost identical results to standard MCMC approaches, and results obtained through meta-regression and pseudo data are very similar. On average, data augmentation provides closer results to MCMC, if implemented using restricted maximum likelihood estimation rather than DerSimonian and Laird or maximum likelihood estimation. The methods are applied to real datasets, and an extension to network meta-analysis is described. The proposed method facilitates Bayesian meta-analysis in a way that is accessible to applied researchers. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
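The DerSimonian and Laird procedure, the frequentist comparator in the abstract above, is a simple method-of-moments estimator and can be sketched directly. The effect sizes and within-study variances below are invented for illustration.

```python
import numpy as np

# Hypothetical study-level effect estimates y_i and within-study variances v_i.
y = np.array([0.30, 0.10, 0.45, 0.20, 0.05])
v = np.array([0.04, 0.02, 0.06, 0.03, 0.05])

w = 1.0 / v                                # fixed-effect weights
y_fe = np.sum(w * y) / np.sum(w)           # fixed-effect pooled estimate
Q = np.sum(w * (y - y_fe) ** 2)            # Cochran's Q statistic
df = len(y) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)              # DL between-study variance, truncated at 0

w_re = 1.0 / (v + tau2)                    # random-effects weights
mu = np.sum(w_re * y) / np.sum(w_re)       # random-effects pooled effect
se = np.sqrt(1.0 / np.sum(w_re))
print(f"tau^2={tau2:.4f}, pooled effect={mu:.3f} (SE {se:.3f})")
```

With few studies, tau² is imprecise (and often truncated at zero), which is exactly the weakness motivating the informative-prior Bayesian approach the paper proposes.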
Granato, Gregory E.
2014-01-01
The U.S. Geological Survey (USGS) developed the Stochastic Empirical Loading and Dilution Model (SELDM) in cooperation with the Federal Highway Administration (FHWA) to indicate the risk for stormwater concentrations, flows, and loads to be above user-selected water-quality goals and the potential effectiveness of mitigation measures to reduce such risks. SELDM models the potential effect of mitigation measures by using Monte Carlo methods with statistics that approximate the net effects of structural and nonstructural best management practices (BMPs). In this report, structural BMPs are defined as the components of the drainage pathway between the source of runoff and a stormwater discharge location that affect the volume, timing, or quality of runoff. SELDM uses a simple stochastic statistical model of BMP performance to develop planning-level estimates of runoff-event characteristics. This statistical approach can be used to represent a single BMP or an assemblage of BMPs. The SELDM BMP-treatment module has provisions for stochastic modeling of three stormwater treatments: volume reduction, hydrograph extension, and water-quality treatment. In SELDM, these three treatment variables are modeled by using the trapezoidal distribution and the rank correlation with the associated highway-runoff variables. This report describes methods for calculating the trapezoidal-distribution statistics and rank correlation coefficients for stochastic modeling of volume reduction, hydrograph extension, and water-quality treatment by structural stormwater BMPs and provides the calculated values for these variables. This report also provides robust methods for estimating the minimum irreducible concentration (MIC), which is the lowest expected effluent concentration from a particular BMP site or a class of BMPs. These statistics are different from the statistics commonly used to characterize or compare BMPs. 
They are designed to provide a stochastic transfer function to approximate the quantity, duration, and quality of BMP effluent given the associated inflow values for a population of storm events. A database application and several spreadsheet tools are included in the digital media accompanying this report for further documentation of methods and for future use. In this study, analyses were done with data extracted from a modified copy of the January 2012 version of International Stormwater Best Management Practices Database, designated herein as the January 2012a version. Statistics for volume reduction, hydrograph extension, and water-quality treatment were developed with selected data. Sufficient data were available to estimate statistics for 5 to 10 BMP categories by using data from 40 to more than 165 monitoring sites. Water-quality treatment statistics were developed for 13 runoff-quality constituents commonly measured in highway and urban runoff studies including turbidity, sediment and solids; nutrients; total metals; organic carbon; and fecal coliforms. The medians of the best-fit statistics for each category were selected to construct generalized cumulative distribution functions for the three treatment variables. For volume reduction and hydrograph extension, interpretation of available data indicates that selection of a Spearman’s rho value that is the average of the median and maximum values for the BMP category may help generate realistic simulation results in SELDM. The median rho value may be selected to help generate realistic simulation results for water-quality treatment variables. MIC statistics were developed for 12 runoff-quality constituents commonly measured in highway and urban runoff studies by using data from 11 BMP categories and more than 167 monitoring sites. Four statistical techniques were applied for estimating MIC values with monitoring data from each site. These techniques produce a range of lower-bound estimates for each site. 
Four MIC estimators are proposed as alternatives for selecting a value from among the estimates from multiple sites. Correlation analysis indicates that the MIC estimates from multiple sites were weakly correlated with the geometric mean of inflow values, which indicates that there may be a qualitative or semiquantitative link between the inflow quality and the MIC. Correlations probably are weak because the MIC is influenced by the inflow water quality and the capability of each individual BMP site to reduce inflow concentrations.
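Stochastic sampling from a trapezoidal distribution, as SELDM's BMP-treatment module does for volume reduction, can be sketched as below. The plateau bounds and support are hypothetical placeholders, not the report's calculated statistics, and the rank-correlation step is omitted.

```python
import numpy as np
from scipy import stats

# Trapezoid on [loc, loc + scale] with a flat top between fractions c and d.
lower, upper = 0.0, 1.0      # fraction of runoff volume retained (assumed support)
c, d = 0.2, 0.6              # hypothetical plateau bounds as fractions of the range
trap = stats.trapezoid(c, d, loc=lower, scale=upper - lower)

rng = np.random.default_rng(1)
# One draw per simulated storm event.
volume_reduction = trap.rvs(size=10_000, random_state=rng)
print(f"mean reduction fraction: {volume_reduction.mean():.3f}")
```

In SELDM these draws would additionally be rank-correlated with the associated highway-runoff variables; here they are independent for simplicity.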
Fairchild, Amanda J.; McQuillin, Samuel D.
2017-01-01
Third variable effects elucidate the relation between two other variables, and can describe why they are related or under what conditions they are related. This article demonstrates methods to analyze two third-variable effects: moderation and mediation. The utility of examining moderation and mediation effects in school psychology is described and current use of the analyses in applied school psychology research is reviewed and evaluated. Proper statistical methods to test the effects are presented, and different effect size measures for the models are provided. Extensions of the basic moderator and mediator models are also described. PMID:20006988
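The mediation analysis described in the abstract above is commonly tested with the product-of-coefficients approach, sketched here on synthetic data. The path coefficients (0.5, 0.4, 0.1) are arbitrary values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000
x = rng.normal(size=n)                       # predictor X
m = 0.5 * x + rng.normal(size=n)             # path a: X -> mediator M
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # path b: M -> Y, plus direct effect c'

# Path a: regress M on X (slope of the simple regression).
a = np.polyfit(x, m, 1)[0]
# Paths b and c': regress Y on M and X jointly.
X2 = np.column_stack([np.ones(n), m, x])
coef, *_ = np.linalg.lstsq(X2, y, rcond=None)
b, c_prime = coef[1], coef[2]

indirect = a * b                             # mediated (indirect) effect
print(f"a={a:.2f}, b={b:.2f}, indirect effect a*b={indirect:.2f}")
```

In practice the indirect effect's uncertainty is usually assessed with bootstrap confidence intervals rather than a normal-theory test; that step is omitted here.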
Coastal and Marine Bird Data Base
Anderson, S.H.; Geissler, P.H.; Dawson, D.K.
1980-01-01
Summary: This report discusses the development of a coastal and marine bird data base at the Migratory Bird and Habitat Research Laboratory. The system is compared with other data bases, and suggestions for future development, such as possible adaptations for other taxonomic groups, are included. The data base is built on the Statistical Analysis System but includes extensions programmed in PL/I. The Appendix shows how the system evolved. Output examples are given for heron data and pelagic bird data, illustrating the types of analyses that can be conducted and the figures that can be produced. The Appendixes include a retrieval language user's guide, a description of the retrieval process, and a listing of the translator program.
Vedula, S Swaroop; Li, Tianjing; Dickersin, Kay
2013-01-01
Details about the type of analysis (e.g., intent to treat [ITT]) and definitions (i.e., criteria for including participants in the analysis) are necessary for interpreting a clinical trial's findings. Our objective was to compare the description of types of analyses and criteria for including participants in the publication (i.e., what was reported) with descriptions in the corresponding internal company documents (i.e., what was planned and what was done). Trials were for off-label uses of gabapentin sponsored by Pfizer and Parke-Davis, and documents were obtained through litigation. For each trial, we compared internal company documents (protocols, statistical analysis plans, and research reports, all unpublished) with publications. One author extracted data and another verified the extraction, with a third person verifying discordant items and a sample of the rest. Extracted data included the number of participants randomized and analyzed for efficacy, and the types of analyses for efficacy and safety and their definitions (i.e., criteria for including participants in each type of analysis). We identified 21 trials, 11 of which were published randomized controlled trials that provided the documents needed for the planned comparisons. For three trials, there was disagreement on the number of randomized participants between the research report and the publication. Seven types of efficacy analyses were described in the protocols, statistical analysis plans, and publications, including ITT and six others. The protocol or publication described ITT using six different definitions, resulting in frequent disagreements between the two documents (i.e., different numbers of participants were included in the analyses). Descriptions of the analyses conducted did not agree between internal company documents and what was publicly reported. Internal company documents provide extensive documentation of the methods planned and used, and of trial findings, and should be publicly accessible.
Reporting standards for randomized controlled trials should recommend transparent descriptions and definitions of analyses performed and which study participants are excluded.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-07
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection AGENCY: National Agricultural... Reduction Act of 1995 this notice announces the intention of the National Agricultural Statistics Service...
Elastin: a representative ideal protein elastomer.
Urry, D W; Hugel, T; Seitz, M; Gaub, H E; Sheiba, L; Dea, J; Xu, J; Parker, T
2002-01-01
During the last half century, identification of an ideal (predominantly entropic) protein elastomer was generally thought to require that the ideal protein elastomer be a random chain network. Here, we report two new sets of data and review previous data. The first set of new data utilizes atomic force microscopy to report single-chain force-extension curves for (GVGVP)(251) and (GVGIP)(260), and provides evidence for single-chain ideal elasticity. The second class of new data provides a direct contrast between low-frequency sound absorption (0.1-10 kHz) exhibited by random-chain network elastomers and by elastin protein-based polymers. Earlier composition, dielectric relaxation (1-1000 MHz), thermoelasticity, molecular mechanics and dynamics calculations and thermodynamic and statistical mechanical analyses are presented, that combine with the new data to contrast with random-chain network rubbers and to detail the presence of regular non-random structural elements of the elastin-based systems that lose entropic elastomeric force upon thermal denaturation. The data and analyses affirm an earlier contrary argument that components of elastin, the elastic protein of the mammalian elastic fibre, and purified elastin fibre itself contain dynamic, non-random, regularly repeating structures that exhibit dominantly entropic elasticity by means of a damping of internal chain dynamics on extension. PMID:11911774
Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu
2017-02-01
The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association study (GWAS) analyses of imaging phenotypes in most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore this sampling scheme by directly correlating imaging phenotypes (the secondary traits) with genotype. Although it is well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic analysis of the ADNI data to evaluate the effects of the case-control sampling scheme on GWAS results based on standard statistical methods, such as linear regression, comparing them with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.
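The secondary-trait bias described above can be demonstrated with a toy simulation: when disease status depends on both genotype and a quantitative trait, and cases are oversampled, a naive genotype-trait regression in the case-control sample is biased even though the trait is unrelated to genotype in the population. All parameter values here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000
g = rng.binomial(2, 0.3, n)                    # genotype (0/1/2 minor-allele count)
trait = rng.normal(size=n)                     # secondary trait, truly unrelated to g

# Disease risk depends on BOTH genotype and trait (hypothetical logistic model).
p = 1.0 / (1.0 + np.exp(-(-3.0 + 0.8 * g + 0.8 * trait)))
disease = rng.binomial(1, p)

# Retrospective case-control sample: all cases plus an equal number of controls.
cases = np.flatnonzero(disease == 1)
controls = rng.choice(np.flatnonzero(disease == 0), size=cases.size, replace=False)
idx = np.concatenate([cases, controls])

naive_slope = np.polyfit(g[idx], trait[idx], 1)[0]  # biased by the sampling scheme
pop_slope = np.polyfit(g, trait, 1)[0]              # ~0 in the full population
print(f"case-control slope={naive_slope:.3f}, population slope={pop_slope:.3f}")
```

The nonzero case-control slope is a pure artifact of selection, which is why the paper compares methods that explicitly adjust for the sampling scheme.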
Authenticated DNA from Ancient Wood Remains
LIEPELT, SASCHA; SPERISEN, CHRISTOPH; DEGUILLOUX, MARIE-FRANCE; PETIT, REMY J.; KISSLING, ROY; SPENCER, MATTHEW; DE BEAULIEU, JACQUES-LOUIS; TABERLET, PIERRE; GIELLY, LUDOVIC; ZIEGENHAGEN, BIRGIT
2006-01-01
• Background The reconstruction of biological processes and human activities during the last glacial cycle relies mainly on data from biological remains. Highly abundant tissues, such as wood, are candidates for a genetic analysis of past populations. While well-authenticated DNA has now been recovered from various fossil remains, the final ‘proof’ is still missing for wood, despite some promising studies. • Scope The goal of this study was to determine if ancient wood can be analysed routinely in studies of archaeology and palaeogenetics. An experiment was designed which included blind testing, independent replicates, extensive contamination controls and rigorous statistical tests. Ten samples of ancient wood from major European forest tree genera were analysed with plastid DNA markers. • Conclusions Authentic DNA was retrieved from wood samples up to 1000 years of age. A new tool for real-time vegetation history and archaeology is ready to use. PMID:16987920
Spatio-temporal analysis of aftershock sequences in terms of Non Extensive Statistical Physics.
NASA Astrophysics Data System (ADS)
Chochlaki, Kalliopi; Vallianatos, Filippos
2017-04-01
Earth's seismicity is considered an extremely complicated process in which long-range interactions and fracturing exist (Vallianatos et al., 2016). For this reason, we analyze it using an innovative methodological approach introduced by Tsallis (Tsallis, 1988; 2009), named Non-Extensive Statistical Physics. This approach introduces a generalization of Boltzmann-Gibbs statistical mechanics and is based on the definition of the Tsallis entropy Sq, whose maximization leads to the so-called q-exponential function, the probability distribution function that maximizes Sq. In the present work, we utilize the concepts of Non-Extensive Statistical Physics to analyze the spatiotemporal properties of several aftershock sequences. Marekova (Marekova, 2014) suggested that the probability densities of the inter-event distances between successive aftershocks follow a beta distribution. Using the same data set, we analyze the inter-event distance distribution of several aftershock sequences in different geographic regions by calculating the non-extensive parameters that determine the behavior of the system and by fitting the q-exponential function, which expresses the degree of non-extensivity of the investigated system. Furthermore, the inter-event time distribution of the aftershocks, as well as the frequency-magnitude distribution, has been analyzed. The results support the applicability of Non-Extensive Statistical Physics to aftershock sequences, where strong correlations exist along with memory effects. References: C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487, doi:10.1007/BF01016429. C. Tsallis, Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World, 2009, doi:10.1007/978-0-387-85359-8. E. Marekova, Analysis of the spatial distribution between successive earthquakes in aftershock series, Annals of Geophysics, 57, 5, doi:10.4401/ag-6556, 2014. F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016.
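The q-exponential function central to the abstract above has a one-line definition and reduces to the ordinary exponential as q → 1. A minimal sketch (scalar inputs only; the cutoff convention shown is the standard Tsallis one):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential [1 + (1 - q) x]^(1/(1 - q)); ordinary exp at q = 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # Tsallis cutoff convention (relevant for q < 1)
    return base ** (1.0 / (1.0 - q))

# Decaying tails p(x) ~ exp_q(-x): for q > 1 the tail is heavier (power-law-like)
# than the ordinary exponential, which is why fitted q > 1 signals non-extensivity.
print(q_exp(-2.0, 1.0))   # e^-2, about 0.135
print(q_exp(-2.0, 1.5))   # (1 + 0.5*2)^-2 = 0.25, a heavier tail
```

Fitting q to inter-event distance or time distributions, as the study does, amounts to estimating how far the empirical tail departs from this q = 1 limit.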
MOLSIM: A modular molecular simulation software
Jurij, Reščič
2015-01-01
The modular software MOLSIM for all‐atom molecular and coarse‐grained simulations is presented with a focus on the underlying concepts used. The software possesses four unique features: (1) it is an integrated software for molecular dynamics, Monte Carlo, and Brownian dynamics simulations; (2) simulated objects are constructed in a hierarchical fashion representing atoms, rigid molecules and colloids, flexible chains, hierarchical polymers, and cross‐linked networks; (3) long‐range interactions involving charges, dipoles and/or anisotropic dipole polarizabilities are handled either with the standard Ewald sum, the smooth particle mesh Ewald sum, or the reaction‐field technique; (4) statistical uncertainties are provided for all calculated observables. In addition, MOLSIM supports various statistical ensembles, and several types of simulation cells and boundary conditions are available. Intermolecular interactions comprise tabulated pairwise potentials for speed and uniformity, and many‐body interactions involve anisotropic polarizabilities. Intramolecular interactions include bond, angle, and crosslink potentials. A very large set of analyses of static and dynamic properties is provided. The capability of MOLSIM can be extended by user‐provided routines controlling, for example, start conditions, intermolecular potentials, and analyses. An extensive set of case studies in the field of soft matter is presented covering colloids, polymers, and crosslinked networks. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:25994597
Is there a genetic cause for cancer cachexia? – a clinical validation study in 1797 patients
Solheim, T S; Fayers, P M; Fladvad, T; Tan, B; Skorpen, F; Fearon, K; Baracos, V E; Klepstad, P; Strasser, F; Kaasa, S
2011-01-01
Background: Cachexia has major impact on cancer patients' morbidity and mortality. Future development of cachexia treatment needs methods for early identification of patients at risk. The aim of the study was to validate nine single-nucleotide polymorphisms (SNPs) previously associated with cachexia, and to explore 182 other candidate SNPs with the potential to be involved in the pathophysiology. Method: A total of 1797 cancer patients, classified as either having severe cachexia, mild cachexia or no cachexia, were genotyped. Results: After allowing for multiple testing, there was no statistically significant association between any of the SNPs analysed and the cachexia groups. However, consistent with prior reports, two SNPs from the acylpeptide hydrolase (APEH) gene showed suggestive statistical significance (P=0.02; OR, 0.78). Conclusion: This study failed to detect any significant association between any of the SNPs analysed and cachexia; although two SNPs from the APEH gene had a trend towards significance. The APEH gene encodes the enzyme APEH, postulated to be important in the endpoint of the ubiquitin system and thus the breakdown of proteins into free amino acids. In cachexia, there is an extensive breakdown of muscle proteins and an increase in the production of acute phase proteins in the liver. PMID:21934689
Silva, Anderson Clayton da; Santos, Priscila Dayane de Freitas; Palazzi, Nicole Campezato; Leimann, Fernanda Vitória; Fuchs, Renata Hernandez Barros; Bracht, Lívia; Gonçalves, Odinei Hess
2017-05-24
Nontoxic preservative agents are in demand by the food industry due to consumers' concern about synthetic preservatives, especially in minimally processed food. The antimicrobial activity of curcumin, a natural phenolic compound, has been extensively investigated, but hydrophobicity is an issue when applying curcumin to foodstuffs. The objective of this work was to evaluate curcumin microcrystals as an antimicrobial agent in minimally processed carrots. The antimicrobial activity of curcumin microcrystals was evaluated in vitro against Gram-positive (Bacillus cereus and Staphylococcus aureus) and Gram-negative (Escherichia coli and Pseudomonas aeruginosa) microorganisms, showing a statistically significant (p < 0.05) decrease in the minimum inhibitory concentration compared to in natura, pristine curcumin. Curcumin microcrystals were effective in inhibiting psychrotrophic and mesophilic microorganisms in minimally processed carrots. Sensory analyses were carried out, showing no significant difference (at the p < 0.05 level) between curcumin microcrystal-treated and non-treated carrots in triangular and tetrahedral discriminative tests. Sensory tests also showed that curcumin microcrystals could be added as a natural preservative in minimally processed carrots without causing noticeable differences detectable by the consumer. One may conclude that the analyses of the minimally processed carrots demonstrated that curcumin microcrystals are a suitable natural compound to inhibit the natural microbiota of carrots from a statistical point of view.
Automated brain volumetrics in multiple sclerosis: a step closer to clinical application
Beadnall, H N; Hatton, S N; Bader, G; Tomic, D; Silva, D G
2016-01-01
Background Whole brain volume (WBV) estimates in patients with multiple sclerosis (MS) correlate more robustly with clinical disability than traditional, lesion-based metrics. Numerous algorithms to measure WBV have been developed over the past two decades. We compare Structural Image Evaluation using Normalisation of Atrophy-Cross-sectional (SIENAX) to NeuroQuant and MSmetrix, for assessment of cross-sectional WBV in patients with MS. Methods MRIs from 61 patients with relapsing-remitting MS and 2 patients with clinically isolated syndrome were analysed. WBV measurements were calculated using SIENAX, NeuroQuant and MSmetrix. Statistical agreement between the methods was evaluated using linear regression and Bland-Altman plots. Precision and accuracy of WBV measurement were calculated for (1) NeuroQuant versus SIENAX and (2) MSmetrix versus SIENAX. Results Precision (Pearson's r) of WBV estimation for NeuroQuant and MSmetrix versus SIENAX was 0.983 and 0.992, respectively. Accuracy (Cb) was 0.871 and 0.994, respectively. NeuroQuant and MSmetrix showed a 5.5% and 1.0% volume difference compared with SIENAX, respectively, that was consistent across low and high values. Conclusions In the analysed population, NeuroQuant and MSmetrix both quantified cross-sectional WBV with comparable statistical agreement to SIENAX, a well-validated cross-sectional tool that has been used extensively in MS clinical studies. PMID:27071647
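The precision/accuracy split in the abstract above follows Lin's concordance decomposition: the concordance correlation coefficient factors as CCC = r × Cb, where r (Pearson) measures precision and Cb measures accuracy (closeness to the identity line). A sketch on synthetic stand-in volumes, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical whole-brain volumes (cm^3) from a reference method and a comparator.
sienax_like = rng.normal(1500.0, 80.0, size=60)
other = 0.98 * sienax_like + 20.0 + rng.normal(0.0, 10.0, size=60)

r = np.corrcoef(sienax_like, other)[0, 1]          # precision (Pearson's r)
mx, my = sienax_like.mean(), other.mean()
sx, sy = sienax_like.std(), other.std()
# CCC = 2*cov / (var_x + var_y + (mean difference)^2)
ccc = 2 * np.cov(sienax_like, other, bias=True)[0, 1] / (sx**2 + sy**2 + (mx - my)**2)
cb = ccc / r                                       # accuracy factor, 0 < Cb <= 1
print(f"precision r={r:.3f}, accuracy Cb={cb:.3f}, CCC={ccc:.3f}")
```

Cb is 1 only when the two methods agree in both mean and scale, so it penalises the systematic offsets that Bland-Altman plots visualise.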
Deriving Vegetation Dynamics of Natural Terrestrial Ecosystems from MODIS NDVI/EVI Data over Turkey.
Evrendilek, Fatih; Gulbeyaz, Onder
2008-09-01
The 16-day composite MODIS vegetation indices (VIs) at 500-m resolution for the period 2000 to 2007 were seasonally averaged on the basis of the estimated distribution of 16 potential natural terrestrial ecosystems (NTEs) across Turkey. Graphical and statistical analyses of the time-series VIs for the NTEs, spatially disaggregated in terms of biogeoclimate zones and land cover types, included descriptive statistics, correlations, discrete Fourier transform (DFT), time-series decomposition, and simple linear regression (SLR) models. Our spatio-temporal analyses revealed that both MODIS VIs, on average, depicted similar seasonal variations for the NTEs, with the NDVI values having higher mean and SD values. The seasonal VIs were most correlated in decreasing order for: barren/sparsely vegetated land > grassland > shrubland/woodland > forest; (sub)nival > warm temperate > alpine > cool temperate > boreal = Mediterranean; and summer > spring > autumn > winter. The most pronounced differences between the MODIS VI responses over Turkey occurred in boreal and Mediterranean climate zones and forests, and in winter (the senescence phase of the growing season). Our results showed the potential of the time-series MODIS VI datasets for estimating and monitoring seasonal and interannual ecosystem dynamics over Turkey, which needs to be further improved and refined through systematic and extensive field measurements and validations across various biomes.
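The DFT step mentioned above, extracting the dominant seasonal cycle from a VI time series, can be sketched as follows. The NDVI series is synthetic (an assumed annual cycle of 23 sixteen-day composites plus noise), not MODIS data.

```python
import numpy as np

rng = np.random.default_rng(4)
n_composites = 23 * 8                 # eight years of 16-day composites (23/year)
t = np.arange(n_composites)
# Synthetic NDVI: baseline + annual sinusoid + observation noise.
ndvi = 0.4 + 0.25 * np.sin(2 * np.pi * t / 23) + rng.normal(0.0, 0.03, n_composites)

# Magnitude spectrum of the mean-removed series; the largest peak gives the
# dominant cycle.
spectrum = np.abs(np.fft.rfft(ndvi - ndvi.mean()))
dominant = np.argmax(spectrum[1:]) + 1  # skip the zero-frequency bin
period = n_composites / dominant        # composites per cycle
print(f"dominant period ~ {period:.1f} composites (about one year)")
```

For the synthetic series the dominant bin falls at 8 cycles over 8 years, i.e. a 23-composite (annual) period, mirroring how a DFT isolates seasonality in real VI records.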
Whole genome sequencing data and de novo draft assemblies for 66 teleost species
Malmstrøm, Martin; Matschiner, Michael; Tørresen, Ole K.; Jakobsen, Kjetill S.; Jentoft, Sissel
2017-01-01
Teleost fishes comprise more than half of all vertebrate species, yet genomic data are only available for 0.2% of their diversity. Here, we present whole genome sequencing data for 66 new species of teleosts, vastly expanding the availability of genomic data for this important vertebrate group. We report on de novo assemblies based on low-coverage (9–39×) sequencing and present detailed methodology for all analyses. To facilitate further utilization of this data set, we present statistical analyses of the gene space completeness and verify the expected phylogenetic position of the sequenced genomes in a large mitogenomic context. We further present a nuclear marker set used for phylogenetic inference and evaluate each gene tree in relation to the species tree to test for homogeneity in the phylogenetic signal. Collectively, these analyses illustrate the robustness of this highly diverse data set and enable extensive reuse of the selected phylogenetic markers and the genomic data in general. This data set covers all major teleost lineages and provides unprecedented opportunities for comparative studies of teleosts. PMID:28094797
The Heuristics of Statistical Argumentation: Scaffolding at the Postsecondary Level
ERIC Educational Resources Information Center
Pardue, Teneal Messer
2017-01-01
Language plays a key role in statistics and, by extension, in statistics education. Enculturating students into the practice of statistics requires preparing them to communicate results of data analysis. Statistical argumentation is one way of providing structure to facilitate discourse in the statistics classroom. In this study, a teaching…
Permanent tooth mineralization in bonobos (Pan paniscus) and chimpanzees (P. troglodytes).
Boughner, Julia C; Dean, M Christopher; Wilgenbusch, Chelsea S
2012-12-01
The timing of tooth mineralization in bonobos (Pan paniscus) is virtually uncharacterized. Analysis of these developmental features in bonobos and the possible differences with its sister species, the chimpanzee (P. troglodytes), is important to properly quantify the normal ranges of dental growth variation in closely related primate species. Understanding this variation among bonobo, chimpanzee and modern human dental development is necessary to better contextualize the life histories of extinct hominins. This study tests whether bonobos and chimpanzees are distinguished from each other by covariance among the relative timing and sequences of tooth crown initiation, mineralization, root extension, and completion. Using multivariate statistical analyses, we compared the relative timing of permanent tooth crypt formation, crown mineralization, and root extension between 34 P. paniscus and 80 P. troglodytes mandibles radiographed in lateral and occlusal views. Covariance among our 12 assigned dental scores failed to statistically distinguish between bonobos and chimpanzees. Rather than clustering by species, individuals clustered by age group (infant, younger or older juvenile, and adult). Dental scores covaried similarly between the incisors, as well as between both premolars. Conversely, covariance among dental scores distinguished the canine and each of the three molars not only from each other, but also from the rest of the anterior teeth. Our study showed no significant differences in the relative timing of permanent tooth crown and root formation between bonobos and chimpanzees. Copyright © 2012 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Reynders, Edwin P. B.; Langley, Robin S.
2018-08-01
The hybrid deterministic-statistical energy analysis method has proven to be a versatile framework for modeling built-up vibro-acoustic systems. The stiff system components are modeled deterministically, e.g., using the finite element method, while the wave fields in the flexible components are modeled as diffuse. In the present paper, the hybrid method is extended such that not only the ensemble mean and variance of the harmonic system response can be computed, but also of the band-averaged system response. This variance represents the uncertainty that is due to the assumption of a diffuse field in the flexible components of the hybrid system. The developments start with a cross-frequency generalization of the reciprocity relationship between the total energy in a diffuse field and the cross spectrum of the blocked reverberant loading at the boundaries of that field. By making extensive use of this generalization in a first-order perturbation analysis, explicit expressions are derived for the cross-frequency and band-averaged variance of the vibrational energies in the diffuse components and for the cross-frequency and band-averaged variance of the cross spectrum of the vibro-acoustic field response of the deterministic components. These expressions are extensively validated against detailed Monte Carlo analyses of coupled plate systems in which diffuse fields are simulated by randomly distributing small point masses across the flexible components, and good agreement is found.
Does patella position influence ligament balancing in total knee arthroplasty?
Yoon, Jung-Ro; Oh, Kwang-Jun; Wang, Joon Ho; Yang, Jae-Hyuk
2015-07-01
In vivo comparative gap measurements were performed in three different patella positions (reduced, subluxated and everted) using an offset-type force-controlled spreader system. Prospectively, 50 knees were operated on by total knee arthroplasty using a navigation-assisted gap-balancing technique. The offset-type force-controlled spreader system was used for gap measurements; this commercially available instrument allows controllable tension in the patella-reduced position. The mediolateral gaps at knee extension (0°) and flexion (90°) were recorded in three different patella positions: reduced, subluxated and everted. Any gap difference of more than 3 mm was considered meaningful. Correlation of the differences with demographic data, preoperative radiologic alignment and intraoperative data was analysed. For statistical analysis, ANOVA and Pearson's correlation test were used. The gaps in patella eversion were smaller in both knee extension and flexion compared to the gaps in the patella-reduced position, and the decrease was more definite in knee flexion. A statistically significant difference was observed for the lateral gap in patella eversion compared to the gap in patella reduction at knee flexion (p < 0.05). There was notable variability in knee flexion: a significant portion of knees, 12 (24%) with patella subluxation and 33 (66%) with patella eversion, demonstrated either increased or decreased flexion gaps compared to the gaps in the patella-reduced position. Therefore, intraoperative patellar positioning influences the measurement of the joint gap, and keeping the patella in the reduced position is important during gap balancing. Level of evidence: I.
Sander, Edward A; Lynch, Kaari A; Boyce, Steven T
2014-05-01
Engineered skin substitutes (ESSs) have been reported to close full-thickness burn wounds but are subject to loss from mechanical shear due to their deficiencies in tensile strength and elasticity. Hypothetically, if the mechanical properties of ESS matched those of native skin, losses due to shear or fracture could be reduced. To consider modifications of the composition of ESS to improve homology with native skin, biomechanical analyses of the current composition of ESS were performed. ESSs consist of a degradable biopolymer scaffold of type I collagen and chondroitin-sulfate (CGS) that is populated sequentially with cultured human dermal fibroblasts (hF) and epidermal keratinocytes (hK). In the current study, the hydrated biopolymer scaffold (CGS), the scaffold populated with hF dermal skin substitute (DSS), or the complete ESS were evaluated mechanically for linear stiffness (N/mm), ultimate tensile load at failure (N), maximum extension at failure (mm), and energy absorbed up to the point of failure (N-mm). These biomechanical end points were also used to evaluate ESS at six weeks after grafting to full-thickness skin wounds in athymic mice and compared to murine autograft or excised murine skin. The data showed statistically significant differences (p <0.05) between ESS in vitro and after grafting for all four structural properties. Grafted ESS differed statistically from murine autograft with respect to maximum extension at failure, and from intact murine skin with respect to linear stiffness and maximum extension. These results demonstrate rapid changes in mechanical properties of ESS after grafting that are comparable to murine autograft. These values provide instruction for improvement of the biomechanical properties of ESS in vitro that may reduce clinical morbidity from graft loss.
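The four structural end points reported above (linear stiffness, ultimate load, maximum extension, energy absorbed) can be computed from a load-extension curve in a few lines. The curve below is synthetic and the least-squares choice of linear region is an illustrative assumption, not the study's protocol:

```python
import numpy as np

# Sketch (not the study's code) of deriving the four reported end points from
# a load-extension curve; the curve is synthetic: linear loading then failure.
extension = np.linspace(0.0, 10.0, 101)                 # mm
load = np.where(extension < 8.0, 1.5 * extension, 0.0)  # N; fails near 8 mm

fail = int(np.argmax(load))                         # index of ultimate load
ultimate_load = load[fail]                          # N
max_extension = extension[fail]                     # mm
# linear stiffness: slope of a least-squares line over the rising region (N/mm)
stiffness, _ = np.polyfit(extension[:fail + 1], load[:fail + 1], 1)
# energy absorbed to failure: trapezoidal area under the curve (N-mm)
energy = np.sum(0.5 * (load[:fail] + load[1:fail + 1]) * np.diff(extension[:fail + 1]))
```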
NASA Astrophysics Data System (ADS)
Toth-Tascau, Mirela; Balanean, Flavia; Krepelka, Mircea
2013-10-01
Musculoskeletal impairment of the upper limb can cause difficulties in performing basic daily activities. Three dimensional motion analyses can provide valuable data of arm movement in order to precisely determine arm movement and inter-joint coordination. The purpose of this study was to develop a method to evaluate the degree of impairment based on the influence of shoulder movements in the amplitude of elbow flexion and extension based on the assumption that a lack of motion of the elbow joint will be compensated by an increased shoulder activity. In order to develop and validate a statistical model, one healthy young volunteer has been involved in the study. The activity of choice simulated blowing the nose, starting from a slight flexion of the elbow and raising the hand until the middle finger touches the tip of the nose and return to the start position. Inter-joint coordination between the elbow and shoulder movements showed significant correlation. Statistical regression was used to fit an equation model describing the influence of shoulder movements on the elbow mobility. The study provides a brief description of the kinematic analysis protocol and statistical models that may be useful in describing the relation between inter-joint movements of daily activities.
NASA Technical Reports Server (NTRS)
Dragonette, Richard A.; Suter, Joseph J.
1992-01-01
An extensive statistical analysis has been undertaken to determine whether a correlation exists between changes in an NR atomic hydrogen maser's frequency offset and changes in environmental conditions. Correlation analyses have been performed comparing barometric pressure, humidity, and temperature with maser frequency offset as a function of time, for periods ranging from 5.5 to 17 days. Semipartial correlation coefficients as large as -0.9 have been found between barometric pressure and maser frequency offset. The correlation between maser frequency offset and humidity was small compared to that with barometric pressure, and unpredictable. Analysis of temperature data indicates that, in the most current design, temperature does not significantly affect maser frequency offset.
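A semipartial correlation of the kind reported can be sketched by residualizing one predictor on the others before correlating. The data below are synthetic and the variable names and coefficients are illustrative assumptions:

```python
import numpy as np

# Synthetic example of a semipartial correlation: correlate maser frequency
# offset with barometric pressure after removing humidity from pressure.
rng = np.random.default_rng(1)
n = 200
humidity = rng.normal(size=n)
pressure = 0.5 * humidity + rng.normal(size=n)       # pressure partly tracks humidity
offset = -0.9 * pressure + 0.1 * rng.normal(size=n)  # offset driven by pressure

# residualize pressure on humidity (ordinary least squares), then correlate
beta = np.polyfit(humidity, pressure, 1)
pressure_resid = pressure - np.polyval(beta, humidity)
semipartial = np.corrcoef(offset, pressure_resid)[0, 1]   # strongly negative
```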
Evidence of the non-extensive character of Earth's ambient noise.
NASA Astrophysics Data System (ADS)
Koutalonis, Ioannis; Vallianatos, Filippos
2017-04-01
Investigation of the dynamical features of ambient seismic noise is an important scientific and practical research challenge. At the same time, there is growing interest in an approach to studying Earth physics based on the science of complex systems and non-extensive statistical mechanics, a generalization of Boltzmann-Gibbs statistical physics (Vallianatos et al., 2016). This seems to be a promising framework for studying complex systems exhibiting phenomena such as long-range interactions and memory effects. In this work we use non-extensive statistical mechanics and signal analysis methods to explore the nature of ambient noise as measured at the stations of the HSNC in the South Aegean (Chatzopoulos et al., 2016). We analysed the de-trended increment time series of ambient seismic noise X(t), in time windows of 20 minutes to 10 seconds, within "calm time zones" where human-induced noise is at a minimum. Following the non-extensive statistical physics approach, the probability distribution function of the increments of ambient noise is investigated. Analysis of the probability density function (PDF) p(X), normalized to zero mean and unit variance, shows that the fluctuations of Earth's ambient noise follow a q-Gaussian distribution as defined in the framework of non-extensive statistical mechanics, indicating the possible existence of memory effects in Earth's ambient noise. References: F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016. G. Chatzopoulos, I. Papadopoulos, F. Vallianatos, The Hellenic Seismological Network of Crete (HSNC): Validation and results of the 2013 aftershock, Advances in Geosciences, 41, 65-72, 2016.
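A minimal sketch of the q-Gaussian comparison (assumed parameter values, not the paper's fit) shows the heavier-than-Gaussian tails such an analysis looks for:

```python
import numpy as np

# q-Gaussian density of non-extensive statistical mechanics for 1 < q < 3,
# p(x) proportional to [1 + (q-1) * beta * x^2]^(1/(1-q)), normalized on a grid.
# q and beta here are illustrative assumptions.
x = np.linspace(-20.0, 20.0, 20001)
dx = x[1] - x[0]
q, beta = 1.5, 1.0

p = (1.0 + (q - 1.0) * beta * x**2) ** (1.0 / (1.0 - q))
p /= p.sum() * dx                      # normalize to a unit-area density

gauss = np.exp(-beta * x**2)
gauss /= gauss.sum() * dx              # the q -> 1 (Gaussian) limit

# q > 1 gives heavier tails than the Gaussian
tail_q = p[x > 3].sum() * dx
tail_g = gauss[x > 3].sum() * dx
```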
Mechanics and statistics of the worm-like chain
NASA Astrophysics Data System (ADS)
Marantan, Andrew; Mahadevan, L.
2018-02-01
The worm-like chain model is a simple continuum model for the statistical mechanics of a flexible polymer subject to an external force. We offer a tutorial introduction to it using three approaches. First, we use a mesoscopic view, treating a long polymer (in two dimensions) as though it were made of many groups of correlated links or "clinks," allowing us to calculate its average extension as a function of the external force via scaling arguments. We then provide a standard statistical mechanics approach, obtaining the average extension by two different means: the equipartition theorem and the partition function. Finally, we work in a probabilistic framework, taking advantage of the Gaussian properties of the chain in the large-force limit to improve upon the previous calculations of the average extension.
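The force-extension relation discussed above is often summarized by the standard Marko-Siggia interpolation formula; the sketch below (an assumption: the tutorial's own expressions may differ) inverts it numerically and checks the large-force limit:

```python
import numpy as np

# Marko-Siggia interpolation for the worm-like chain (dimensionless form):
#   f = F * Lp / kT = 1/(4 (1 - x)^2) - 1/4 + x,  x = extension / contour length
def ms_force(x):
    return 0.25 / (1.0 - x) ** 2 - 0.25 + x

def ms_extension(f, lo=0.0, hi=1.0 - 1e-9, tol=1e-12):
    # ms_force is monotone increasing on (0, 1), so invert by bisection
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ms_force(mid) < f:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

f = 100.0                               # large dimensionless force F * Lp / kT
x = ms_extension(f)                     # fractional extension, close to 1
x_asymptotic = 1.0 - 0.5 / np.sqrt(f)   # large-force (Gaussian-limit) behavior
```

At large force the numerical inversion agrees closely with the asymptotic 1 - (1/2)(kT / F Lp)^(1/2) behavior obtained in the Gaussian regime.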
Hutton, Brian; Salanti, Georgia; Caldwell, Deborah M; Chaimani, Anna; Schmid, Christopher H; Cameron, Chris; Ioannidis, John P A; Straus, Sharon; Thorlund, Kristian; Jansen, Jeroen P; Mulrow, Cynthia; Catalá-López, Ferrán; Gøtzsche, Peter C; Dickersin, Kay; Boutron, Isabelle; Altman, Douglas G; Moher, David
2015-06-02
The PRISMA statement is a reporting guideline designed to improve the completeness of reporting of systematic reviews and meta-analyses. Authors have used this guideline worldwide to prepare their reviews for publication. In the past, these reports typically compared 2 treatment alternatives. With the evolution of systematic reviews that compare multiple treatments, some of them only indirectly, authors face novel challenges for conducting and reporting their reviews. This extension of the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) statement was developed specifically to improve the reporting of systematic reviews incorporating network meta-analyses. A group of experts participated in a systematic review, Delphi survey, and face-to-face discussion and consensus meeting to establish new checklist items for this extension statement. Current PRISMA items were also clarified. A modified, 32-item PRISMA extension checklist was developed to address what the group considered to be immediately relevant to the reporting of network meta-analyses. This document presents the extension and provides examples of good reporting, as well as elaborations regarding the rationale for new checklist items and the modification of previously existing items from the PRISMA statement. It also highlights educational information related to key considerations in the practice of network meta-analysis. The target audience includes authors and readers of network meta-analyses, as well as journal editors and peer reviewers.
Tsatsarelis, Thomas; Antonopoulos, Ioannis; Karagiannidis, Avraam; Perkoulidis, George
2007-10-01
This study presents an assessment of the current status of open dumps in Laconia prefecture of Peloponnese in southern Greece, where all open dumps are targeted for closure by 2008. An extensive field survey was conducted in 2005 to register existing sites in the prefecture. The data collected included the site area and age, waste depth, type of disposed waste, distance from nearest populated area, local geographical features and observed practices of open burning and soil coverage. On the basis of the collected data, a GIS database was developed, and the above parameters were statistically analysed. Subsequently, a decision tool for the restoration of open dumps was implemented, which led to the prioritization of site restorations and specific decisions about appropriate restoration steps for each site. The sites requiring restoration were then further classified using Principal Component Analysis, in order to categorize them into groups suitable for similar restoration work, thus facilitating fund allocation and subsequent restoration project management.
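The Principal Component Analysis grouping step can be sketched with a small synthetic site-attribute table (area, age, waste depth, distance; values invented, not the survey's data):

```python
import numpy as np

# Illustrative PCA via SVD on standardized site attributes, to group sites
# suitable for similar restoration work. Two synthetic clusters of sites.
rng = np.random.default_rng(2)
sites = np.vstack([
    rng.normal([1.0, 5.0, 2.0, 0.5], 0.1, size=(5, 4)),   # small, shallow sites
    rng.normal([10.0, 20.0, 8.0, 3.0], 0.5, size=(5, 4)), # large, deep sites
])
z = (sites - sites.mean(axis=0)) / sites.std(axis=0)      # standardize columns
u, s, vt = np.linalg.svd(z, full_matrices=False)
scores = z @ vt[:2].T                                     # first two components
explained = s**2 / np.sum(s**2)                           # variance fractions
```

With two well-separated groups, the first component carries most of the variance and the component scores separate the clusters, which is the basis for assigning similar restoration work.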
The Matching Relation and Situation-Specific Bias Modulation in Professional Football Play Selection
Stilling, Stephanie T; Critchfield, Thomas S
2010-01-01
The utility of a quantitative model depends on the extent to which its fitted parameters vary systematically with environmental events of interest. Professional football statistics were analyzed to determine whether play selection (passing versus rushing plays) could be accounted for with the generalized matching equation, and in particular whether variations in play selection across game situations would manifest as changes in the equation's fitted parameters. Statistically significant changes in bias were found for each of five types of game situations; no systematic changes in sensitivity were observed. Further analyses suggested relationships between play selection bias and both turnover probability (which can be described in terms of punishment) and yards-gained variance (which can be described in terms of variable-magnitude reinforcement schedules). The present investigation provides a useful demonstration of association between face-valid, situation-specific effects in a domain of everyday interest, and a theoretically important term of a quantitative model of behavior. Such associations, we argue, are an essential focus in translational extensions of quantitative models. PMID:21119855
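The generalized matching equation fit underlying this analysis can be sketched on synthetic play-selection data (parameter values are assumptions, not the study's estimates):

```python
import numpy as np

# Generalized matching equation, fitted as a line in log-log coordinates:
#   log(B_pass / B_rush) = a * log(R_pass / R_rush) + log(b)
# a is sensitivity, b is bias. Data below are synthetic.
rng = np.random.default_rng(3)
log_reinf_ratio = rng.uniform(-1, 1, size=50)        # log reinforcement ratios
a_true, log_b_true = 0.6, 0.2                        # assumed true parameters
log_behav_ratio = a_true * log_reinf_ratio + log_b_true + 0.05 * rng.normal(size=50)

a_hat, log_b_hat = np.polyfit(log_reinf_ratio, log_behav_ratio, 1)
bias = 10 ** log_b_hat                               # bias > 1: preference shift
```

Situation-specific bias modulation of the kind reported would show up as systematic changes in the fitted `bias` across game situations, with `a_hat` roughly constant.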
Gardner, S L; Rausch, R L; Camacho, O C
1988-06-01
Among approximately 2,000 mammals examined for helminths in various regions of Bolivia during 1983-1987, cysts of Echinococcus vogeli Rausch and Bernstein, 1972, were found in a single paca, Cuniculus paca L., collected at La Laguna, Departamento de Santa Cruz (lat. 16 degrees 36'S; long. 62 degrees 42'W). This record, the first from Bolivia, represents a considerable extension of the known geographic range of this species in South America. Upon analysis of the morphologic characteristics of the protoscoleces derived from the cysts, the sizes of rostellar hooks from the material from the paca were found to be well within the ranges reported in previous studies. Statistical analysis of frequency distributions of hook characteristics revealed some deviations from normality. These results indicate that parametric statistics should be applied with caution in analyses of inter- and intraspecific variation of morphologic characteristics of hooks of metacestodes of the genus Echinococcus.
Borghammer, Per; Chakravarty, Mallar; Jonsdottir, Kristjana Yr; Sato, Noriko; Matsuda, Hiroshi; Ito, Kengo; Arahata, Yutaka; Kato, Takashi; Gjedde, Albert
2010-05-01
Recent cerebral blood flow (CBF) and glucose consumption (CMRglc) studies of Parkinson's disease (PD) revealed conflicting results. Using simulated data, we previously demonstrated that the often-reported subcortical hypermetabolism in PD could be explained as an artifact of biased global mean (GM) normalization, and that low-magnitude, extensive cortical hypometabolism is best detected by alternative data-driven normalization methods. Thus, we hypothesized that PD is characterized by extensive cortical hypometabolism but no concurrent widespread subcortical hypermetabolism and tested it on three independent samples of PD patients. We compared SPECT CBF images of 32 early-stage and 33 late-stage PD patients with that of 60 matched controls. We also compared PET FDG images from 23 late-stage PD patients with that of 13 controls. Three different normalization methods were compared: (1) GM normalization, (2) cerebellum normalization, (3) reference cluster normalization (Yakushev et al.). We employed standard voxel-based statistics (fMRIstat) and principal component analysis (SSM). Additionally, we performed a meta-analysis of all quantitative CBF and CMRglc studies in the literature to investigate whether the global mean (GM) values in PD are decreased. Voxel-based analysis with GM normalization and the SSM method performed similarly, i.e., both detected decreases in small cortical clusters and concomitant increases in extensive subcortical regions. Cerebellum normalization revealed more widespread cortical decreases but no subcortical increase. In all comparisons, the Yakushev method detected nearly identical patterns of very extensive cortical hypometabolism. Lastly, the meta-analyses demonstrated that global CBF and CMRglc values are decreased in PD. Based on the results, we conclude that PD most likely has widespread cortical hypometabolism, even at early disease stages. In contrast, extensive subcortical hypermetabolism is probably not a feature of PD.
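The normalization artifact described above is easy to reproduce with toy numbers (all values invented): a widespread cortical decrease, after global-mean scaling, masquerades as subcortical hypermetabolism.

```python
import numpy as np

# Toy illustration of the global-mean (GM) normalization bias: an isolated
# 10% decrease over 80% of voxels ("cortex") drags the global mean down, so
# the unchanged remaining 20% ("subcortex") appears increased after scaling.
control = np.full(100, 100.0)                 # 100 voxels, uniform uptake
patient = control.copy()
patient[:80] *= 0.9                           # cortical decrease only

patient_gm = patient / patient.mean() * control.mean()   # global-mean scaling

apparent_cortical = patient_gm[:80].mean() - control[:80].mean()    # shrunken deficit
apparent_subcortical = patient_gm[80:].mean() - control[80:].mean() # spurious increase
```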
Statistics for NAEG: past efforts, new results, and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.
A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.
Wangia, Victoria; Shireman, Theresa I
2013-01-01
While understanding geography's role in healthcare has been an area of research for over 40 years, the application of geography-based analyses to prescription medication use is limited. The body of literature was reviewed to assess the current state of such studies and to demonstrate the scale and scope of projects, in order to highlight potential research opportunities. The objective was to review systematically how researchers have applied geography-based analyses to medication use data. Empirical, English-language research articles were identified through PubMed and bibliographies. Original research articles were independently reviewed as to the medications or classes studied, data sources, measures of medication exposure, geographic units of analysis, geospatial measures, and statistical approaches. From 145 publications matching key search terms, forty publications met the inclusion criteria. Cardiovascular and psychotropic classes accounted for the largest proportion of studies. Prescription drug claims were the primary source, and medication exposure was frequently captured as period prevalence. Medication exposure was documented across a variety of geopolitical units such as countries, provinces, regions, states, and postal codes. Most results were descriptive, and formal statistical modeling capitalizing on geospatial techniques was rare. Despite the extensive research on small area variation analysis in healthcare, a limited number of studies have examined geographic variation in medication use. Clearly, there is opportunity to collaborate with geographers and GIS professionals to harness the power of GIS technologies and to strengthen future medication studies by applying more robust geospatial statistical methods. Copyright © 2013 Elsevier Inc. All rights reserved.
Association analysis of multiple traits by an approach of combining P values.
Chen, Lili; Wang, Yong; Zhou, Yajing
2018-03-01
Increasing evidence shows that one variant can affect multiple traits, a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most are suitable mainly for detecting common variants associated with multiple traits. Because of the low minor allele frequencies of rare variants, however, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA) for a single trait to test association between multiple traits and rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. We then take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to differing directions of effect among causal variants.
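The combination of per-variant P values with a permutation null can be sketched as below. Note this simplification uses a fixed-threshold truncated-product statistic; the actual ADA procedure chooses the truncation adaptively and weights variants (e.g., by minor allele frequency), and all numbers here are illustrative assumptions:

```python
import numpy as np

# Fixed-threshold truncated-product combination of per-variant P values:
# sum -log(p) over variants with p below tau, assessed by permutation.
def truncated_stat(pvals, tau=0.05):
    p = np.asarray(pvals, dtype=float)
    kept = p[p < tau]
    return float(np.sum(-np.log(kept))) if kept.size else 0.0

rng = np.random.default_rng(4)
# hypothetical region: 2 strongly associated variants among 18 neutral ones
region_p = np.concatenate([[1e-5, 1e-4], rng.uniform(size=18)])
obs = truncated_stat(region_p)

# permutation null: under no association, per-variant P values are uniform
null = np.array([truncated_stat(rng.uniform(size=20)) for _ in range(999)])
p_region = (1 + np.sum(null >= obs)) / (999 + 1)
```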
Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan
2015-01-01
Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is in justifying a biowaiver for post-approval changes, which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to, and in many cases superior to, the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
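For reference, the f2 statistic that the proposed F2 parameter extends is the standard regulatory similarity factor; the profile values below are invented for illustration:

```python
import numpy as np

# Standard f2 similarity factor for two dissolution profiles:
#   f2 = 50 * log10( 100 / sqrt(1 + mean squared difference) )
# f2 = 100 for identical profiles; f2 >= 50 is the usual similarity criterion.
def f2(reference, test):
    r = np.asarray(reference, dtype=float)
    t = np.asarray(test, dtype=float)
    msd = np.mean((r - t) ** 2)        # mean squared difference in % dissolved
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

ref = [20, 40, 60, 80, 90]             # % dissolved at successive time points
near = [19, 42, 58, 81, 91]            # similar profile: f2 > 50
far = [10, 25, 40, 55, 70]             # dissimilar profile: f2 < 50
```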
Study of pre-seismic kHz EM emissions by means of complex systems
NASA Astrophysics Data System (ADS)
Balasis, Georgios; Papadimitriou, Constantinos; Eftaxias, Konstantinos
2010-05-01
The field of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. A corollary is that transferring ideas and results between investigators in hitherto disparate areas will cross-fertilize and lead to important new results. It is well known that Boltzmann-Gibbs statistical mechanics works best in dealing with systems composed of subsystems that are either independent or interacting via short-range forces, and whose subsystems can access all of the available phase space. For systems exhibiting long-range correlations, memory, or fractal properties, non-extensive Tsallis statistical mechanics becomes the most appropriate mathematical framework. A central property of the magnetic storm, solar flare, and earthquake preparation process is the possible occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents. Consequently, non-extensive statistical mechanics is an appropriate regime in which to investigate universality, if any, in magnetic storm, solar flare, earthquake and pre-failure EM emission occurrence. A model for earthquake dynamics derived from a non-extensive Tsallis formulation, starting from first principles, has recently been introduced. This approach leads to a Gutenberg-Richter-type law for the magnitude distribution of earthquakes which provides an excellent fit to seismicities generated in various large geographic areas usually identified as "seismic regions". We examine whether the Gutenberg-Richter law corresponding to non-extensive Tsallis statistics is able to describe the distribution of amplitudes of earthquakes, pre-seismic kHz EM emissions (electromagnetic earthquakes), solar flares, and magnetic storms. The analysis shows that the introduced non-extensive model provides an excellent fit to the experimental data, incorporating the characteristics of universality by means of non-extensive statistics into the extreme events under study.
Reif, David M.; Israel, Mark A.; Moore, Jason H.
2007-01-01
The biological interpretation of gene expression microarray results is a daunting challenge. For complex diseases such as cancer, wherein the body of published research is extensive, the incorporation of expert knowledge provides a useful analytical framework. We have previously developed the Exploratory Visual Analysis (EVA) software for exploring data analysis results in the context of annotation information about each gene, as well as biologically relevant groups of genes. We present EVA as a flexible combination of statistics and biological annotation that provides a straightforward visual interface for the interpretation of microarray analyses of gene expression in the most commonly occurring class of brain tumors, glioma. We demonstrate the utility of EVA for the biological interpretation of statistical results by analyzing publicly available gene expression profiles of two important glial tumors. The results of a statistical comparison between 21 malignant, high-grade glioblastoma multiforme (GBM) tumors and 19 indolent, low-grade pilocytic astrocytomas were analyzed using EVA. By using EVA to examine the results of a relatively simple statistical analysis, we were able to identify tumor class-specific gene expression patterns having both statistical and biological significance. Our interactive analysis highlighted the potential importance of genes involved in cell cycle progression, proliferation, signaling, adhesion, migration, motility, and structure, as well as candidate gene loci on a region of Chromosome 7 that has been implicated in glioma. Because EVA does not require statistical or computational expertise and has the flexibility to accommodate any type of statistical analysis, we anticipate EVA will prove a useful addition to the repertoire of computational methods used for microarray data analysis. EVA is available at no charge to academic users and can be found at http://www.epistasis.org. PMID:19390666
Vedula, S. Swaroop; Li, Tianjing; Dickersin, Kay
2013-01-01
Background: Details about the type of analysis (e.g., intent to treat [ITT]) and definitions (i.e., criteria for including participants in the analysis) are necessary for interpreting a clinical trial's findings. Our objective was to compare the description of types of analyses and criteria for including participants in the publication (i.e., what was reported) with descriptions in the corresponding internal company documents (i.e., what was planned and what was done). Trials were for off-label uses of gabapentin sponsored by Pfizer and Parke-Davis, and documents were obtained through litigation. Methods and Findings: For each trial, we compared internal company documents (protocols, statistical analysis plans, and research reports, all unpublished) with publications. One author extracted data and another verified, with a third person verifying discordant items and a sample of the rest. Extracted data included the number of participants randomized and analyzed for efficacy, and types of analyses for efficacy and safety and their definitions (i.e., criteria for including participants in each type of analysis). We identified 21 trials, 11 of which were published randomized controlled trials that provided the documents needed for planned comparisons. For three trials, there was disagreement on the number of randomized participants between the research report and the publication. Seven types of efficacy analyses were described in the protocols, statistical analysis plans, and publications, including ITT and six others. The protocol or publication described ITT using six different definitions, resulting in frequent disagreements between the two documents (i.e., different numbers of participants were included in the analyses). Conclusions: Descriptions of analyses conducted did not agree between internal company documents and what was publicly reported.
Internal company documents provide extensive documentation of the methods planned and used, and of trial findings, and should be publicly accessible. Reporting standards for randomized controlled trials should recommend transparent descriptions and definitions of the analyses performed, including which study participants were excluded. Please see later in the article for the Editors' Summary PMID:23382656
MGAS: a powerful tool for multivariate gene-based genome-wide association analysis.
Van der Sluis, Sophie; Dolan, Conor V; Li, Jiang; Song, Youqiang; Sham, Pak; Posthuma, Danielle; Li, Miao-Xin
2015-04-01
Standard genome-wide association studies, testing the association between one phenotype and a large number of single nucleotide polymorphisms (SNPs), are limited in two ways: (i) traits are often multivariate, and analysis of composite scores entails loss in statistical power and (ii) gene-based analyses may be preferred, e.g. to decrease the multiple testing problem. Here we present a new method, multivariate gene-based association test by extended Simes procedure (MGAS), that allows gene-based testing of multivariate phenotypes in unrelated individuals. Through extensive simulation, we show that under most trait-generating genotype-phenotype models MGAS has superior statistical power to detect associated genes compared with gene-based analyses of univariate phenotypic composite scores (i.e. GATES, multiple regression), and multivariate analysis of variance (MANOVA). Re-analysis of metabolic data revealed 32 False Discovery Rate controlled genome-wide significant genes, and 12 regions harboring multiple genes; of these 44 regions, 30 were not reported in the original analysis. MGAS allows researchers to conduct their multivariate gene-based analyses efficiently, and without the loss of power that is often associated with an incorrectly specified genotype-phenotype model. MGAS is freely available in KGG v3.0 (http://statgenpro.psychiatry.hku.hk/limx/kgg/download.php). Access to the metabolic dataset can be requested at dbGaP (https://dbgap.ncbi.nlm.nih.gov/). The R-simulation code is available from http://ctglab.nl/people/sophie_van_der_sluis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
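The Simes machinery at the heart of MGAS can be illustrated with the classic Simes combination of per-SNP p-values into one gene-level p-value. This is a simplified sketch, not the MGAS implementation: the extended Simes test used by MGAS and GATES replaces the raw SNP count with effective numbers of independent tests derived from the SNP correlation matrix, which is omitted here.

```python
def simes(pvalues):
    """Combine per-SNP p-values into a single gene-level p-value
    using the classic Simes procedure."""
    m = len(pvalues)
    # Minimum over 1-based ranks j of m * p_(j) / j for the sorted p-values
    return min(m * p / j for j, p in enumerate(sorted(pvalues), start=1))

print(simes([0.01, 0.20, 0.50, 0.90]))  # 0.04
```

Unlike taking the minimum p-value with a Bonferroni penalty, the Simes combination also rewards genes where several moderately small p-values cluster together.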
Chemical freezeout parameters within generic nonextensive statistics
NASA Astrophysics Data System (ADS)
Tawfik, Abdel; Yassin, Hayam; Abo Elyazeed, Eman R.
2018-06-01
Particles produced in relativistic heavy-ion collisions seem to be created in a dynamically disordered system which can be best described by an extended exponential entropy. To distinguish between the applicability of this and of Boltzmann-Gibbs (BG) statistics in generating various particle ratios, generic (non)extensive statistics is introduced into the hadron resonance gas model. Accordingly, the degree of (non)extensivity is determined by the possible modifications in the phase space. Both BG extensivity and Tsallis nonextensivity are included as very special cases defined by specific values of the equivalence classes (c, d). We found that the particle ratios at energies ranging between 3.8 and 2760 GeV are best reproduced by nonextensive statistics, where c and d range between ˜ 0.9 and ˜ 1 . The present work aims at illustrating that the proposed approach is well capable of manifesting the statistical nature of the system of interest. We do not aim at highlighting deeper physical insights. In other words, while the resulting nonextensivity is neither BG nor Tsallis, the freezeout parameters are found to be very compatible with BG and accordingly with the well-known freezeout phase-diagram, which is in an excellent agreement with recent lattice calculations. We conclude that the particle production is nonextensive but should not necessarily be accompanied by a radical change in the intensive or extensive thermodynamic quantities, such as internal energy and temperature. Only the two critical exponents defining the equivalence classes (c, d) are the physical parameters characterizing the (non)extensivity.
Tracing the drift of MH370 debris throughout the Indian Ocean
NASA Astrophysics Data System (ADS)
Biastoch, Arne; Durgadoo, Jonathan V.; Rühs, Siren
2017-04-01
On 8 March 2014, Malaysia Airlines flight MH370, a Boeing 777, disappeared from radar screens. Since then, extensive search efforts have aimed to find the missing plane in the southeastern Indian Ocean. Starting with a flaperon washed up at La Réunion in July 2015, several pieces of debris were found on the shores of islands and African coasts in the southwestern Indian Ocean. Ocean currents were examined to understand the drift paths of debris throughout the Indian Ocean and, in consequence, to identify the location of MH370. Here we present a series of Lagrangian analyses in which we follow particles representing virtual pieces of debris advected in an operational high-resolution ocean model. Of particular importance is the large-scale influence of surface waves through Stokes drift. Large numbers of particles are analysed in statistical approaches to provide the most likely starting locations. Different pieces of debris are combined to refine probability maps of their joint start positions. Forward vs. backward advection approaches are compared.
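The Lagrangian particle-tracking idea described above can be sketched in a few lines. This is a deliberately minimal illustration under stated assumptions: `current` and `stokes` are hypothetical callables returning (u, v) velocity components, and simple forward-Euler stepping is used; real drift analyses of this kind rely on high-resolution ocean model output and higher-order time integration.

```python
def advect(lon, lat, current, stokes, dt, n_steps):
    """Forward-Euler advection of one virtual debris particle.
    Stokes drift from surface waves is added to the Eulerian current."""
    for _ in range(n_steps):
        u_c, v_c = current(lon, lat)
        u_s, v_s = stokes(lon, lat)   # wave-induced Stokes drift contribution
        lon += (u_c + u_s) * dt
        lat += (v_c + v_s) * dt
    return lon, lat

# Toy example: uniform westward current plus a smaller Stokes drift component
final = advect(0.0, -30.0,
               lambda x, y: (-1.0, 0.0),
               lambda x, y: (-0.5, 0.25),
               dt=1.0, n_steps=10)
print(final)  # (-15.0, -27.5)
```

Running such a step backward in time (negative `dt`) from a debris landing site is the backward-advection approach the abstract contrasts with forward advection.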
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 4 2014-07-01 2014-07-01 false Extensions. 52.1572 Section 52.1572... PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) New Jersey § 52.1572 Extensions. Pursuant to section 186(a)(4... Consolidated Metropolitan Statistical Carbon Monoxide nonattainment area. [61 FR 56900, Nov. 5, 1996] ...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 4 2014-07-01 2014-07-01 false Extensions. 52.1672 Section 52.1672... PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) New York § 52.1672 Extensions. Pursuant to section 186(a)(4... Consolidated Metropolitan Statistical Carbon Monoxide nonattainment area. [61 FR 56900, Nov. 5, 1996] ...
75 FR 15709 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... statistics at the national level, referred to as the U.S. National Vital Statistics System (NVSS), depends on.... Proposed Project Vital Statistics Training Application (OMB No. 0920-0217 exp. 7/31/2010)--Extension--National Center for Health Statistics (NCHS), Centers for Disease Control and Prevention (CDC). Background...
Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.
Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao
2015-08-01
Meta-analysis of genetic data must account for differences among studies, including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., there is heterogeneity. Thus, meta-analysis combining data from multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.
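The likelihood-ratio testing framework mentioned above follows Wilks' theorem: twice the log-likelihood gain of the full model over the null is compared against a chi-square distribution. The sketch below is a generic illustration, not the paper's functional-linear-model statistic; the two-degrees-of-freedom case is chosen so the chi-square tail probability can be written in closed form without external libraries.

```python
import math

def lrt_pvalue_df2(loglik_null, loglik_full):
    """Wilks likelihood-ratio test with 2 degrees of freedom.
    For df = 2 the chi-square survival function is exactly exp(-x/2)."""
    stat = 2.0 * (loglik_full - loglik_null)
    return math.exp(-stat / 2.0)

# A log-likelihood gain of 3 on 2 df gives p = exp(-3) ~ 0.0498
print(lrt_pvalue_df2(-100.0, -97.0))
```

For general degrees of freedom one would use a chi-square survival function from a statistics library rather than this closed form.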
How Extension Can Help Communities Conduct Impact Analyses.
ERIC Educational Resources Information Center
Wisconsin Univ., Madison. Dept. of Agricultural Journalism.
Intended to provide guidance to Extension specialists and agents faced with requests for impact analyses from communities experiencing economic development, this report also summarizes issues that need to be considered. The first section, on private sector impacts, addresses questions on predicting changes in production, employment, and housing…
ERIC Educational Resources Information Center
Zare, Mohsen Nazarzadeh; Dorrani, Kamal; Lavasani, Masoud Gholamali
2012-01-01
Background and purpose: This study examines the views of farmers and extension agents participating in extension education courses in Dezful, Iran, with regard to problems with these courses. It relies upon a descriptive methodology, using a survey as its instrument. Sample: The statistical population consisted of 5060 farmers and 50 extension…
Stochastic Geometric Network Models for Groups of Functional and Structural Connectomes
Friedman, Eric J.; Landsberg, Adam S.; Owen, Julia P.; Li, Yi-Ou; Mukherjee, Pratik
2014-01-01
Structural and functional connectomes are emerging as important instruments in the study of normal brain function and in the development of new biomarkers for a variety of brain disorders. In contrast to single-network studies that presently dominate the (non-connectome) network literature, connectome analyses typically examine groups of empirical networks and then compare these against standard (stochastic) network models. Current practice in connectome studies is to employ stochastic network models derived from social science and engineering contexts as the basis for the comparison. However, these are not necessarily best suited for the analysis of connectomes, which often contain groups of very closely related networks, such as occurs with a set of controls or a set of patients with a specific disorder. This paper studies important extensions of standard stochastic models that make them better adapted for analysis of connectomes, and develops new statistical fitting methodologies that account for inter-subject variations. The extensions explicitly incorporate geometric information about a network based on distances and inter/intra hemispherical asymmetries (to supplement ordinary degree-distribution information), and utilize a stochastic choice of networks' density levels (for fixed threshold networks) to better capture the variance in average connectivity among subjects. The new statistical tools introduced here allow one to compare groups of networks by matching both their average characteristics and the variations among them. A notable finding is that connectomes have high “smallworldness” beyond that arising from geometric and degree considerations alone. PMID:25067815
Long-term efficacy and safety of safinamide as add-on therapy in early Parkinson's disease.
Schapira, A H V; Stocchi, F; Borgohain, R; Onofrj, M; Bhatt, M; Lorenzana, P; Lucini, V; Giuliani, R; Anand, R
2013-02-01
Safinamide is an α-aminoamide with both dopaminergic and non-dopaminergic mechanisms of action in Phase III clinical development as a once-daily add-on to dopamine agonist (DA) therapy for early Parkinson's disease (PD). Study 017 was a 12-month, randomized, double-blind, placebo-controlled pre-planned extension study to the previously reported Study 015. Patients received safinamide 100 or 200 mg/day or placebo added to a single DA in early PD. The primary efficacy endpoint was the time from baseline (Study 015 randomization) to 'intervention', defined as increase in DA dose; addition of another DA, levodopa or other PD treatment; or discontinuation due to lack of efficacy. Safinamide groups were pooled for the primary efficacy endpoint analysis; post hoc analyses were performed on each separate dose group. Of the 269 patients randomized in Study 015, 227 (84%) enrolled in Study 017 and 187/227 (82%) patients completed the extension study. Median time to intervention was 559 and 466 days in the pooled safinamide and placebo groups, respectively (log-rank test; P = 0.3342). In post hoc analyses, patients receiving safinamide 100 mg/day experienced a significantly lower rate of intervention compared with placebo (25% vs. 51%, respectively) and a delay in median time to intervention of 9 days (P < 0.05; 240- to 540-day analysis). The pooled data from the safinamide groups failed to reach statistical significance for the primary endpoint of median time from baseline to additional drug intervention. Post hoc analyses indicate that safinamide 100 mg/day may be effective as add-on treatment to DA in PD. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.
Edmonds, Lisa A; Donovan, Neila J
2014-06-01
Virtually no valid materials are available to evaluate confrontation naming in Spanish-English bilingual adults in the U.S. In a recent study, a large group of young Spanish-English bilingual adults were evaluated on An Object and Action Naming Battery (Edmonds & Donovan in Journal of Speech, Language, and Hearing Research 55:359-381, 2012). Rasch analyses of the responses resulted in evidence for the content and construct validity of the retained items. However, the scope of that study did not allow for extensive examination of individual item characteristics, group analyses of participants, or the provision of testing and scoring materials or raw data, thereby limiting the ability of researchers to administer the test to Spanish-English bilinguals and to score the items with confidence. In this study, we present the in-depth information described above on the basis of further analyses, including (1) online searchable spreadsheets with extensive empirical (e.g., accuracy and name agreeability) and psycholinguistic item statistics; (2) answer sheets and instructions for scoring and interpreting the responses to the Rasch items; (3) tables of alternative correct responses for English and Spanish; (4) ability strata determined for all naming conditions (English and Spanish nouns and verbs); and (5) comparisons of accuracy across proficiency groups (i.e., Spanish dominant, English dominant, and balanced). These data indicate that the Rasch items from An Object and Action Naming Battery are valid and sensitive for the evaluation of naming in young Spanish-English bilingual adults. Additional information based on participant responses for all of the items on the battery can provide researchers with valuable information to aid in stimulus development and response interpretation for experimental studies in this population.
Effect of equilibration on primitive path analyses of entangled polymers.
Hoy, Robert S; Robbins, Mark O
2005-12-01
We use recently developed primitive path analysis (PPA) methods to study the effect of equilibration on entanglement density in model polymeric systems. Values of Ne for two commonly used equilibration methods differ by a factor of 2-4 even though the methods produce similar large-scale chain statistics. We find that local chain stretching in poorly equilibrated samples increases entanglement density. The evolution of Ne with time shows that many entanglements are lost through fast processes such as chain retraction as the local stretching relaxes. Quenching a melt state into a glass has little effect on Ne. Equilibration-dependent differences in short-scale structure affect the craze extension ratio much less than expected from the differences in PPA values of Ne.
Generalized Models for Rock Joint Surface Shapes
Du, Shigui; Hu, Yunjin; Hu, Xiaofei
2014-01-01
Generalized models of joint surface shapes are the foundation for mechanism studies on the mechanical effects of rock joint surface shapes. Based on extensive field investigations of rock joint surface shapes, generalized models for three level shapes named macroscopic outline, surface undulating shape, and microcosmic roughness were established through statistical analyses of 20,078 rock joint surface profiles. The relative amplitude of profile curves was used as a borderline for the division of different level shapes. The study results show that the macroscopic outline has three basic forms: planar, arc-shaped, and stepped; the surface undulating shape has three basic forms: planar, undulating, and stepped; and the microcosmic roughness has two basic forms: smooth and rough. PMID:25152901
Vector wind profile gust model
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1979-01-01
Work towards establishing a vector wind profile gust model for Space Transportation System flight operations and trade studies is reported. To date, all the statistical and computational techniques required have been established and partially implemented. An analysis of wind profile gust at Cape Kennedy within the theoretical framework is presented. The variability of theoretical and observed gust magnitude with filter type, altitude, and season is described. Various examples are presented which illustrate agreement between theoretical and observed gust percentiles. The preliminary analysis of the gust data indicates a strong variability with altitude, season, and wavelength regime. An extension of the analyses to include conditional distributions of gust magnitude given gust length, distributions of gust modulus, and phase differences between gust components has begun.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bracken, M.B.
This joint EPRI/National Institutes of Health study is the largest epidemiological study ever undertaken to examine the relationship between exposure to electric and magnetic fields (EMF) during pregnancy and reproductive outcomes. Overall, the study concludes that EMF exposure during pregnancy is unrelated to pregnancy outcome. Specifically, the study reveals no association between electromagnetic field exposure from electrically heated beds and intrauterine growth retardation or spontaneous abortion. Among the many strengths of this study are clearly specified hypotheses; prospective design; randomized assignment to exposure monitoring; very large sample size; detailed assessment of potential confounding by known risk factors for adverse pregnancy outcomes; and comprehensive statistical analyses. The study also featured extensive exposure assessment, including measurements of EMF from a variety of sources, personal monitoring, and wire coding information.
Groundwater monitoring in the Savannah River Plant Low Level Waste Burial Ground
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlton, W.H.
1983-12-31
This document describes chemical mechanisms that may affect trace-level radionuclide migration through acidic sandy clay soils in a humid environment, and summarizes the extensive chemical and radiochemical analyses of the groundwater directly below the SRP Low-Level Waste (LLW) Burial Ground (643-G). Anomalies were identified in the chemistry of individual wells which appear to be related to small amounts of fission product activity that have reached the water table. The chemical properties which were statistically related to trace-level transport of Cs-137 and Sr-90 were iron, potassium, sodium and calcium. Concentrations on the order of 100 ppm appear sufficient to affect nuclide migration. Several complexation mechanisms for plutonium migration were investigated.
Design solutions for the solar cell interconnect fatigue fracture problem
NASA Technical Reports Server (NTRS)
Mon, G. R.; Ross, R. G., Jr.
1982-01-01
Mechanical fatigue of solar cell interconnects is a major failure mechanism in photovoltaic arrays. A comprehensive approach to the reliability design of interconnects, together with extensive design data for the fatigue properties of copper interconnects, has been published. This paper extends the previous work, developing failure prediction (fatigue) data for additional interconnect material choices, including aluminum and a variety of copper-Invar and copper-steel claddings. An improved global fatigue function is used to model the probability-of-failure statistics of each material as a function of level and number of cycles of applied strain. Life-cycle economic analyses are used to evaluate the relative merits of each material choice. The copper-Invar clad composites demonstrate superior performance over pure copper. Aluminum results are disappointing.
Kyparissiadis, Antonios; van Heuven, Walter J. B.; Pitchford, Nicola J.; Ledgeway, Timothy
2017-01-01
Databases containing lexical properties on any given orthography are crucial for psycholinguistic research. In the last ten years, a number of lexical databases have been developed for Greek. However, these lack important part-of-speech information. Furthermore, the need for alternative procedures for calculating syllabic measurements and stress information, as well as combination of several metrics to investigate linguistic properties of the Greek language are highlighted. To address these issues, we present a new extensive lexical database of Modern Greek (GreekLex 2) with part-of-speech information for each word and accurate syllabification and orthographic information predictive of stress, as well as several measurements of word similarity and phonetic information. The addition of detailed statistical information about Greek part-of-speech, syllabification, and stress neighbourhood allowed novel analyses of stress distribution within different grammatical categories and syllabic lengths to be carried out. Results showed that the statistical preponderance of stress position on the pre-final syllable that is reported for Greek language is dependent upon grammatical category. Additionally, analyses showed that a proportion higher than 90% of the tokens in the database would be stressed correctly solely by relying on stress neighbourhood information. The database and the scripts for orthographic and phonological syllabification as well as phonetic transcription are available at http://www.psychology.nottingham.ac.uk/greeklex/. PMID:28231303
Generalized ensemble theory with non-extensive statistics
NASA Astrophysics Data System (ADS)
Shen, Ke-Ming; Zhang, Ben-Wei; Wang, En-Ke
2017-12-01
The non-extensive canonical ensemble theory is reconsidered with the method of Lagrange multipliers by maximizing Tsallis entropy, with the constraint that the normalized term of Tsallis' q-average of physical quantities, the sum Σ_j p_j^q, is independent of the probability p_i for Tsallis parameter q. The self-referential problem in the deduced probability and thermal quantities in non-extensive statistics is thus avoided, and thermodynamical relationships are obtained in a consistent and natural way. We also extend the study to the non-extensive grand canonical ensemble theory and obtain the q-deformed Bose-Einstein distribution as well as the q-deformed Fermi-Dirac distribution. The theory is further applied to the generalized Planck law to demonstrate the distinct behaviors of the various generalized q-distribution functions discussed in literature.
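The q-deformed distributions mentioned above are built from the Tsallis q-exponential, which reduces to the ordinary Boltzmann-Gibbs exponential as q → 1. The sketch below illustrates this limit numerically; the function names and the particular occupation-number form are illustrative, not taken from the paper.

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]^(1/(1-q)), cut off at zero
    where the bracket goes negative; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    bracket = 1.0 + (1.0 - q) * x
    return bracket ** (1.0 / (1.0 - q)) if bracket > 0.0 else 0.0

def q_bose_einstein(energy, beta, mu, q):
    """q-deformed Bose-Einstein occupation number (illustrative form:
    the ordinary exponential is replaced by the q-exponential)."""
    return 1.0 / (q_exp(beta * (energy - mu), q) - 1.0)

print(q_exp(1.0, 1.0))  # e, the Boltzmann-Gibbs limit
```

For q ≠ 1 the q-exponential has power-law rather than exponential tails, which is what lets non-extensive statistics fit spectra that BG statistics cannot.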
Umar, Sulaiman; Man, Norsida; Nawi, Nolila Mohd; Latif, Ismail Abd; Samah, Bahaman Abu
2017-06-01
The study described the perceived importance of, and proficiency in, core agricultural extension competencies among extension workers in Peninsular Malaysia, and evaluated the resultant deficits in those competencies. Borich's Needs Assessment Model was used to achieve the objectives of the study. A sample of 298 respondents was randomly selected and interviewed using a pre-tested structured questionnaire. Thirty-three core competency items were assessed. Instrument validity and reliability were ensured. The cross-sectional data obtained were analysed using SPSS for descriptive statistics, including the mean weighted discrepancy score (MWDS). Results of the study showed that, on a scale of 5, the most important core extension competency items according to respondents' perception were: "Making good use of information and communication technologies/access and use of web-based resources" (M=4.86, SD=0.23); "Conducting needs assessments" (M=4.84, SD=0.16); "Organizing extension campaigns" (M=4.82, SD=0.47); and "Managing groups and teamwork" (M=4.81, SD=0.76). In terms of proficiency, the highest competency identified by the respondents was "Conducting farm and home visits" (M=3.62, SD=0.82), followed by "Conducting meetings effectively" (M=3.19, SD=0.72); "Conducting focus group discussions" (M=3.16, SD=0.32); and "Conducting community forums" (M=3.13, SD=0.64). The discrepancies implying competency deficits were widest in "Acquiring and allocating resources" (MWDS=12.67); use of information and communication technologies (ICTs) and web-based resources in agricultural extension (MWDS=12.59); and report writing and sharing the results and impacts (MWDS=11.92). It is recommended that any intervention aimed at developing the capacity of extension workers in Peninsular Malaysia should prioritize these core competency items in accordance with the deficits established in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
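The MWDS values reported above come from Borich's discrepancy scoring. The sketch below shows one common formulation of that calculation; the exact weighting convention varies between studies, so treat this as an assumption-laden illustration rather than the authors' exact procedure.

```python
def mwds(importance, proficiency):
    """Mean weighted discrepancy score for one competency item, in one
    common formulation of Borich's model: each respondent's discrepancy
    (importance - proficiency) is weighted by the item's mean importance
    rating, and the weighted discrepancies are averaged."""
    n = len(importance)
    mean_importance = sum(importance) / n
    weighted = [(i - p) * mean_importance for i, p in zip(importance, proficiency)]
    return sum(weighted) / n

# Three respondents rate an item's importance and their own proficiency (1-5)
print(round(mwds([5, 5, 4], [2, 3, 2]), 2))  # 10.89
```

Items are then ranked by MWDS, so training priority goes to competencies that are both highly rated for importance and weakly rated for proficiency.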
Cox, Tony; Popken, Douglas; Ricci, Paolo F
2013-01-01
Exposures to fine particulate matter (PM2.5) in air (C) have been suspected of contributing causally to increased acute (e.g., same-day or next-day) human mortality rates (R). We tested this causal hypothesis in 100 United States cities using the publicly available NMMAPS database. Although a significant, approximately linear, statistical C-R association exists in simple statistical models, closer analysis suggests that it is not causal. Surprisingly, conditioning on other variables that have been extensively considered in previous analyses (usually using splines or other smoothers to approximate their effects), such as month of the year and mean daily temperature, suggests that they create strong, nonlinear confounding that explains the statistical association between PM2.5 and mortality rates in this data set. As this finding disagrees with conventional wisdom, we apply several different techniques to examine it. Conditional independence tests for potential causation, non-parametric classification tree analysis, Bayesian Model Averaging (BMA), and Granger-Sims causality testing, show no evidence that PM2.5 concentrations have any causal impact on increasing mortality rates. This apparent absence of a causal C-R relation, despite their statistical association, has potentially important implications for managing and communicating the uncertain health risks associated with, but not necessarily caused by, PM2.5 exposures. PMID:23983662
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
Power considerations for λ inflation factor in meta-analyses of genome-wide association studies.
Georgiopoulos, Georgios; Evangelou, Evangelos
2016-05-19
The genomic control (GC) approach is extensively used to effectively control false positive signals due to population stratification in genome-wide association studies (GWAS). However, GC affects the statistical power of GWAS. The loss of power depends on the magnitude of the inflation factor (λ) that is used for GC. We simulated meta-analyses of different GWAS. Minor allele frequency (MAF) ranged from 0·001 to 0·5, and λ was sampled under two scenarios: (i) a random scenario (an empirically derived distribution of real λ values) and (ii) a selected scenario obtained by modifying the simulation parameters. Adjustment for λ was considered under single correction (within-study corrected standard errors) and double correction (an additional λ correction of the summary estimate). MAF was a pivotal determinant of observed power. In the random λ scenario, double correction induced a symmetric power reduction relative to single correction. For MAF 1·2 and MAF >5%. Our results provide a quick but detailed index for power considerations of future meta-analyses of GWAS that enables a more flexible design from early steps, based on the number of studies accumulated in different groups and the λ values observed in the single studies.
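The single and double corrections described above both amount to inflating a standard error by √λ, which deflates the test statistic z = β/SE by the same factor. A minimal sketch (function names and the numeric example are invented, not from the paper):

```python
from math import sqrt

def gc_correct(beta, se, lam):
    """Genomic-control correction of a single GWAS estimate:
    the standard error is inflated by sqrt(lambda)."""
    lam = max(lam, 1.0)          # lambda below 1 is conventionally not corrected
    return beta, se * sqrt(lam)

def double_correct(beta, se, lam_study, lam_meta):
    """'Double' correction: within-study lambda first, then the
    lambda observed for the meta-analytic summary estimate."""
    beta, se = gc_correct(beta, se, lam_study)
    return gc_correct(beta, se, lam_meta)

b, s = 0.30, 0.10                # z = 3.0 uncorrected
_, s1 = gc_correct(b, s, 1.44)   # se * 1.2 -> z drops to 2.5
```

The power loss the study quantifies follows directly: each extra correction shrinks z, so double correction always costs at least as much power as single correction.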
76 FR 71076 - Proposed Collection, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting comments on the proposed extension of the ``BLS Occupational Safety and Health Statistics (OSHS...
Free Fermions and the Classical Compact Groups
NASA Astrophysics Data System (ADS)
Cunden, Fabio Deelan; Mezzadri, Francesco; O'Connell, Neil
2018-06-01
There is a close connection between the ground state of non-interacting fermions in a box with classical (absorbing, reflecting, and periodic) boundary conditions and the eigenvalue statistics of the classical compact groups. The associated determinantal point processes can be extended in two natural directions: (i) we consider the full family of admissible quantum boundary conditions (i.e., self-adjoint extensions) for the Laplacian on a bounded interval, and the corresponding projection correlation kernels; (ii) we construct the grand canonical extensions at finite temperature of the projection kernels, interpolating from Poisson to random matrix eigenvalue statistics. The scaling limits in the bulk and at the edges are studied in a unified framework, and the question of universality is addressed. Whether the finite temperature determinantal processes correspond to the eigenvalue statistics of some matrix models is, a priori, not obvious. We complete the picture by constructing a finite temperature extension of the Haar measure on the classical compact groups. The eigenvalue statistics of the resulting grand canonical matrix models (of random size) corresponds exactly to the grand canonical measure of free fermions with classical boundary conditions.
Test-retest reliability of 3D ultrasound measurements of the thoracic spine.
Fölsch, Christian; Schlögel, Stefanie; Lakemeier, Stefan; Wolf, Udo; Timmesfeld, Nina; Skwara, Adrian
2012-05-01
To explore the reliability of the Zebris CMS 20 ultrasound analysis system with pointer application for measuring end-range flexion, end-range extension, and neutral kyphosis angle of the thoracic spine. The study was performed within the School of Physiotherapy in cooperation with the Orthopedic Department at a University Hospital. The thoracic spines of 28 healthy subjects were measured. Measurements for neutral kyphosis angle, end-range flexion, and end-range extension were taken once at each time point. The bone landmarks were palpated by one examiner and marked with a pointer containing 2 transmitters using a frequency of 40 kHz. A third transmitter was fixed to the pelvis, and 3 microphones were used as receivers. The real angle was calculated by the software. Bland-Altman plots with 95% limits of agreement, intraclass correlations (ICC), standard deviations of mean measurements, and standard error of measurements were used for statistical analyses. The test-retest reliability in this study was measured within a 24-hour interval. Statistical parameters were used to judge reliability. The mean kyphosis angle was 44.8° with a standard deviation of 17.3° at the first measurement and a mean of 45.8° with a standard deviation of 16.2° the following day. The ICC was high at 0.95 for the neutral kyphosis angle, and the Bland-Altman 95% limits of agreement were within clinically acceptable margins. The ICC was 0.71 for end-range flexion and 0.34 for end-range extension, whereas the Bland-Altman 95% limits of agreement were wider than with the static measurement of kyphosis. Compared with static measurements, the analysis of motion with 3-dimensional ultrasound showed an increased standard deviation for test-retest measurements. The test-retest reliability of ultrasound measurement of the neutral kyphosis angle of the thoracic spine was demonstrated within 24 hours. 
Bland-Altman 95% limits of agreement and the standard deviation of differences did not appear to be clinically acceptable for measuring flexion and extension. Copyright © 2012 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
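The Bland-Altman 95% limits of agreement used above to judge reliability are simply the mean paired difference ± 1.96 times the SD of the differences. A minimal sketch (the measurement pairs are invented for illustration):

```python
from statistics import mean, stdev

def limits_of_agreement(a, b):
    """Bland-Altman 95% limits of agreement: mean difference
    +/- 1.96 * sample SD of the paired differences."""
    d = [x - y for x, y in zip(a, b)]
    bias, sd = mean(d), stdev(d)   # stdev uses ddof=1 (sample SD)
    return bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical test-retest angle measurements (degrees), day 1 vs day 2
day1 = [10.0, 12.0, 14.0, 16.0]
day2 = [11.0, 11.0, 15.0, 15.0]
lo, hi = limits_of_agreement(day1, day2)
```

Wide limits, as reported above for flexion and extension, mean a retest could plausibly differ from the first measurement by a clinically important amount even when the bias is near zero.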
76 FR 60930 - Proposed Collection, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting comments concerning the proposed extension of the ``Mass Layoff Statistics Program.'' A copy of the...
The health of the American slave examined by means of Union Army medical statistics.
Freemon, F R
1985-01-01
The health status of the American slave in the 19th century remains unclear despite extensive historical research. Better knowledge of slave health would provide a clearer picture of the life of the slave, a better understanding of the 19th-century medicine, and possibly even clues to the health problems of modern blacks. This article hopes to contribute to the literature by examining another source of data. Slaves entering the Union Army joined an organization with standardized medical care that generated extensive statistical information. Review of these statistics answers questions about the health of young male blacks at the time American slavery ended.
HydroApps: An R package for statistical simulation to use in regional analysis
NASA Astrophysics Data System (ADS)
Ganora, D.
2013-12-01
The HydroApps package is a new R extension initially developed to support a recent model for flood frequency estimation intended for applications in Northwestern Italy; it also contains general tools for regional analyses and can easily be extended to include other statistical models. The package is currently at an experimental stage of development. HydroApps grew out of the SSEM project for regional flood frequency analysis, although it was developed independently to support various kinds of regional analyses. Its aim is to provide a bridge between statistical simulation and practical operational use. In particular, the main module of the package builds confidence bands for flood frequency curves expressed by means of their L-moments. Other functions include pre-processing and visualization of hydrologic time series and analysis of the optimal design flood under uncertainty, as well as tools useful in water resources management for estimating flow duration curves and their sensitivity to water withdrawals. Particular attention is devoted to code granularity, i.e., the level of detail and aggregation of the code: greater detail means more low-level functions, which brings flexibility but reduces ease of use in practice. A balance between detail and simplicity is necessary and can be achieved with appropriate wrapper functions and specific help pages for each working block. From a more general viewpoint, the package does not yet have a truly user-friendly interface, but it runs on multiple operating systems and, like many other open-source projects, is easy to update. The HydroApps functions and their features are reported here in order to share ideas and materials that improve the 'technological' and informational transfer between the scientific community and end users such as policy makers.
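A flow duration curve of the kind HydroApps estimates can be sketched in a few lines. This is a generic empirical construction, not HydroApps code (which is in R); the Weibull plotting position i/(n+1) is one common choice of exceedance probability:

```python
def flow_duration_curve(flows):
    """Empirical flow duration curve: observed flows sorted in
    descending order, paired with Weibull plotting-position
    exceedance probabilities i / (n + 1)."""
    q = sorted(flows, reverse=True)
    n = len(q)
    p = [i / (n + 1) for i in range(1, n + 1)]
    return q, p

# Hypothetical daily flows (m^3/s)
q, p = flow_duration_curve([3.0, 1.0, 4.0, 1.5, 5.0])
```

The resulting (p, q) pairs give the fraction of time each discharge is equaled or exceeded, which is the quantity whose sensitivity to withdrawals the package analyses.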
Application of Statistics in Engineering Technology Programs
ERIC Educational Resources Information Center
Zhan, Wei; Fink, Rainer; Fang, Alex
2010-01-01
Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
Patterns of exchange of forensic DNA data in the European Union through the Prüm system.
Santos, Filipe; Machado, Helena
2017-07-01
This paper presents a study of the 5-year operation (2011-2015) of the transnational exchange of forensic DNA data between Member States of the European Union (EU) for the purpose of combating cross-border crime and terrorism within the so-called Prüm system. This first systematisation of the full official statistical dataset provides an overall assessment of the match figures and patterns of operation of the Prüm system for DNA exchange. These figures and patterns are analysed in terms of the differentiated contributions by participating EU Member States. The data suggest a trend for West and Central European countries to concentrate the majority of Prüm matches, while DNA databases of Eastern European countries tend to contribute with profiles of people that match stains in other countries. In view of the necessary transparency and accountability of the Prüm system, more extensive and informative statistics would be an important contribution to the assessment of its functioning and societal benefits. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Space, time, and the third dimension (model error)
Moss, Marshall E.
1979-01-01
The space-time tradeoff of hydrologic data collection (the ability to substitute spatial coverage for temporal extension of records or vice versa) is controlled jointly by the statistical properties of the phenomena that are being measured and by the model that is used to meld the information sources. The control exerted on the space-time tradeoff by the model and its accompanying errors has seldom been studied explicitly. The technique, known as Network Analyses for Regional Information (NARI), permits such a study of the regional regression model that is used to relate streamflow parameters to the physical and climatic characteristics of the drainage basin. The NARI technique shows that model improvement is a viable and sometimes necessary means of improving regional data collection systems. Model improvement provides an immediate increase in the accuracy of regional parameter estimation and also increases the information potential of future data collection. Model improvement, which can only be measured in a statistical sense, cannot be quantitatively estimated prior to its achievement; thus an attempt to upgrade a particular model entails a certain degree of risk on the part of the hydrologist.
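Regional regression models of the kind NARI studies typically relate streamflow parameters to basin characteristics as power laws fit in log space. A minimal sketch of that fitting step (illustrative only, not the NARI method itself; the drainage areas, flows, and coefficients are invented):

```python
from math import log, exp

def fit_power_law(area, flow):
    """Least-squares fit of flow = c * area**m in log space,
    the usual form of regional streamflow regressions."""
    xs = [log(a) for a in area]
    ys = [log(q) for q in flow]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Closed-form simple linear regression on (log area, log flow)
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    c = exp(my - m * mx)
    return c, m

# Synthetic basins generated exactly from flow = 3 * area**0.8
c, m = fit_power_law([1.0, 10.0, 100.0],
                     [3.0, 3.0 * 10 ** 0.8, 3.0 * 100 ** 0.8])
```

The model error that the paper emphasizes is precisely the scatter of real basins around such a fitted curve, which this noise-free example omits.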
Analysis of recent climatic changes in the Arabian Peninsula region
NASA Astrophysics Data System (ADS)
Nasrallah, H. A.; Balling, R. C.
1996-12-01
Interest in the potential climatic consequences of the continued buildup of anthropo-generated greenhouse gases has led many scientists to conduct extensive climate change studies at the global, hemispheric, and regional scales. In this investigation, analyses are conducted on long-term historical climate records from the Arabian Peninsula region. Over the last 100 years, temperatures in the region increased linearly by 0.63 °C. However, virtually all of this warming occurred from 1911 to 1935, and over the most recent 50 years, the Arabian Peninsula region has cooled slightly. In addition, the satellite-based measurements of lower-tropospheric temperatures for the region do not show any statistically significant warming over the period 1979 to 1991. While many other areas of the world are showing a decrease in the diurnal temperature range, the Arabian Peninsula region reveals no evidence of a long-term change in this parameter. Precipitation records for the region show a slight, statistically insignificant decrease over the past 40 years. The results from this study should complement the mass of information that has resulted from similar regional climate studies conducted in the United States, Europe, and Australia.
Fluctuating observation time ensembles in the thermodynamics of trajectories
NASA Astrophysics Data System (ADS)
Budini, Adrián A.; Turner, Robert M.; Garrahan, Juan P.
2014-03-01
The dynamics of stochastic systems, both classical and quantum, can be studied by analysing the statistical properties of dynamical trajectories. The properties of ensembles of such trajectories for long, but fixed, times are described by large-deviation (LD) rate functions. These LD functions play the role of dynamical free energies: they are cumulant generating functions for time-integrated observables, and their analytic structure encodes dynamical phase behaviour. This ‘thermodynamics of trajectories’ approach is to trajectories and dynamics what the equilibrium ensemble method of statistical mechanics is to configurations and statics. Here we show that, just like in the static case, there are a variety of alternative ensembles of trajectories, each defined by their global constraints, with that of trajectories of fixed total time being just one of these. We show how the LD functions that describe an ensemble of trajectories where some time-extensive quantity is constant (and large) but where total observation time fluctuates can be mapped to those of the fixed-time ensemble. We discuss how the correspondence between generalized ensembles can be exploited in path sampling schemes for generating rare dynamical trajectories.
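The LD functions described above act as cumulant generating functions for time-integrated observables. For a finite Markov chain, the scaled cumulant generating function (SCGF) of a time-additive observable is the logarithm of the largest eigenvalue of a "tilted" transition matrix, which a toy two-state example makes concrete (the chain P and observable b here are invented for illustration):

```python
from math import exp, sqrt, log

def scgf(s, P, b):
    """SCGF theta(s) of the time-additive observable
    A_t = sum_k b(x_k) for a two-state Markov chain:
    log of the Perron eigenvalue of the tilted matrix
    P~(i,j) = P[i][j] * exp(s * b[j])."""
    t = [[P[i][j] * exp(s * b[j]) for j in range(2)] for i in range(2)]
    tr = t[0][0] + t[1][1]
    det = t[0][0] * t[1][1] - t[0][1] * t[1][0]
    lam_max = (tr + sqrt(tr * tr - 4.0 * det)) / 2.0   # 2x2 closed form
    return log(lam_max)

P = [[0.9, 0.1], [0.2, 0.8]]   # row-stochastic transition matrix
b = [0.0, 1.0]                  # observable: time spent in state 1
```

By construction theta(0) = 0 (the Perron eigenvalue of a stochastic matrix is 1), and theta is convex in s; the fixed-time versus fluctuating-time ensembles discussed in the paper correspond to different ways of constraining such functionals.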
Language experience changes subsequent learning
Onnis, Luca; Thiessen, Erik
2013-01-01
What are the effects of experience on subsequent learning? We explored the effects of language-specific word order knowledge on the acquisition of sequential conditional information. Korean and English adults were engaged in a sequence learning task involving three different sets of stimuli: auditory linguistic (nonsense syllables), visual non-linguistic (nonsense shapes), and auditory non-linguistic (pure tones). The forward and backward probabilities between adjacent elements generated two equally probable and orthogonal perceptual parses of the elements, such that any significant preference at test must be due to either general cognitive biases, or prior language-induced biases. We found that language modulated parsing preferences with the linguistic stimuli only. Intriguingly, these preferences are congruent with the dominant word order patterns of each language, as corroborated by corpus analyses, and are driven by probabilistic preferences. Furthermore, although the Korean individuals had received extensive formal explicit training in English and lived in an English-speaking environment, they exhibited statistical learning biases congruent with their native language. Our findings suggest that mechanisms of statistical sequential learning are implicated in language across the lifespan, and experience with language may affect cognitive processes and later learning. PMID:23200510
Statistical quality control through overall vibration analysis
NASA Astrophysics Data System (ADS)
Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos
2010-05-01
The present study introduces the concept of statistical quality control in automotive wheel-bearing manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. 
This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust the sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
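The one-way ANOVA at the core of the analysis above can be sketched from first principles (a generic textbook computation, not the paper's data; the small groups are invented):

```python
def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA: ratio of between-group
    to within-group mean squares."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (group means vs grand mean)
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (observations vs their group mean)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical vibration readings under three process setups
F = one_way_anova_f([[1, 2, 3], [2, 3, 4], [3, 4, 5]])  # F = 3.0
```

A large F relative to the F(k-1, n-k) critical value flags a process variable as influential, which is the selection step the abstract describes; the normality and homoscedasticity checks it lists (Shapiro-Wilk, Bartlett, etc.) validate this computation's assumptions.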
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
... Sentenced Population Movement--National Prisoner Statistics, Extension and Revision of Existing Collection...) Title of the Form/Collection: Summary of Sentenced Population Movement--National Prisoner Statistics (3...
Modified RS2101 rocket engine study program
NASA Technical Reports Server (NTRS)
1971-01-01
The purpose of the program is to perform design studies and analyses to determine the effects of incorporating a 60:1 expansion area ratio nozzle extension, extended firing time, and modified operating conditions and environments on the MM'71 rocket engine assembly. An injector-to-thrust chamber seal study was conducted to define potential solutions for leakage past this joint. The results and recommendations evolving from the engine thermal analyses, the injector-to-thrust chamber seal studies, and the nozzle extension joint stress analyses are presented.
Finch, S J; Chen, C H; Gordon, D; Mendell, N R
2001-12-01
This study compared the performance of the maximum lod (MLOD), maximum heterogeneity lod (MHLOD), maximum non-parametric linkage score (MNPL), maximum Kong and Cox linear extension (MKC(lin)) of NPL, and maximum Kong and Cox exponential extension (MKC(exp)) of NPL as calculated in Genehunter 1.2 and Genehunter-Plus. Our performance measure was the distance between the marker with maximum value for each linkage statistic and the trait locus. We performed a simulation study considering: 1) four modes of transmission, 2) 100 replicates for each model, 3) 58 pedigrees (with 592 subjects) per replicate, 4) three linked marker loci each having three equally frequent alleles, and 5) either 0% unlinked families (linkage homogeneity) or 50% unlinked families (linkage heterogeneity). For each replicate, we obtained the Haldane map position of the location at which each of the five statistics is maximized. The MLOD and MHLOD were obtained by maximizing over penetrances, phenocopy rate, and risk-allele frequencies. For the models simulated, MHLOD appeared to be the best statistic both in terms of identifying a marker locus having the smallest mean distance from the trait locus and in terms of the strongest negative correlation between maximum linkage statistic and distance of the identified position and the trait locus. The marker loci with maximum value of the Kong and Cox extensions of the NPL statistic also were closer to the trait locus than the marker locus with maximum value of the NPL statistic. Copyright 2001 Wiley-Liss, Inc.
GAMBIT: the global and modular beyond-the-standard-model inference tool
NASA Astrophysics Data System (ADS)
Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian
2017-11-01
We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.
An analysis of empirical estimates of sexual aggression victimization and perpetration.
Spitzberg, B H
1999-01-01
Estimates of prevalence for several categories of sexual coercion, including rape and attempted rape, were statistically aggregated across 120 studies, involving over 100,000 subjects. According to the data, almost 13% of women and over 3% of men have been raped, and almost 5% of men claim to have perpetrated rape. In contrast, about 25% of women and men claim to have been sexually coerced and to have perpetrated sexual coercion. In general, the mediating variables examined--population type, decade, date of publication, and type of operationalization--were not consistently related to rates of victimization or perpetration. Nevertheless, the extensive variation among study estimates strongly suggests the possibility of systematic sources of variation that have yet to be identified. Further analyses are called for to disentangle such sources.
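Statistically aggregating prevalence estimates across studies, as done above, is commonly a weighted average. A minimal fixed-effect inverse-variance sketch (the figures are invented, and the original analysis may well have used a different pooling model):

```python
def pooled_prevalence(estimates):
    """Fixed-effect inverse-variance pooling of prevalence
    estimates; each entry is (proportion, sample size)."""
    w_sum = num = 0.0
    for p, n in estimates:
        w = n / (p * (1 - p))   # 1 / Var(p), with Var(p) = p(1-p)/n
        w_sum += w
        num += w * p
    return num / w_sum

# Two hypothetical studies: 10% of 100 subjects, 20% of 100 subjects
pooled = pooled_prevalence([(0.10, 100), (0.20, 100)])
```

The "extensive variation among study estimates" the abstract notes would show up here as heterogeneity, which a fixed-effect model like this one deliberately ignores.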
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-21
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-25
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...
Tsallis p⊥ distribution from statistical clusters
NASA Astrophysics Data System (ADS)
Bialas, A.
2015-07-01
It is shown that the transverse momentum distributions of particles emerging from the decay of statistical clusters, distributed according to a power law in their transverse energy, closely resemble those following from the Tsallis non-extensive statistical model. The experimental data are well reproduced with the cluster temperature T ≈ 160 MeV.
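The Tsallis transverse-momentum shape referred to above is a q-exponential that reduces to the Boltzmann exponential as q approaches 1. A minimal sketch (the parameter values are illustrative; T ≈ 160 MeV follows the abstract, while q is an assumed non-extensivity parameter):

```python
from math import exp

def tsallis(pt, T, q):
    """Tsallis (q-exponential) transverse-momentum shape,
    (1 + (q - 1) * pt / T) ** (-1 / (q - 1));
    reduces to exp(-pt / T) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return exp(-pt / T)
    return (1.0 + (q - 1.0) * pt / T) ** (-1.0 / (q - 1.0))

# pt in GeV, T = 0.160 GeV as in the abstract, q = 1.1 assumed
w = tsallis(0.5, 0.160, 1.1)
```

For q > 1 the tail falls off as a power law rather than exponentially, which is how the cluster transverse-energy power law in the abstract reproduces the data.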
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-19
... Transportation Statistics [Docket: RITA 2008-0002 BTS Paperwork Reduction Notice] Agency Information Collection... of Transportation Statistics (BTS), DOT. ACTION: Notice. SUMMARY: In compliance with the Paperwork Reduction Act of 1995, Public Law 104-13, the Bureau of Transportation Statistics invites the general public...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-14
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995 this notice announces the intention of the National Agricultural Statistics Service...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-14
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent to Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-18
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent to Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-06
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-30
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-14
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request... Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention the National Agricultural Statistics Service...
Dartora, Nereu Roque; de Conto Ferreira, Michele Bertoluzi; Moris, Izabela Cristina Maurício; Brazão, Elisabeth Helena; Spazin, Aloísio Oro; Sousa-Neto, Manoel Damião; Silva-Sousa, Yara Terezinha; Gomes, Erica Alves
2018-07-01
Endodontically treated teeth have an increased risk of biomechanical failure because of significant loss of tooth structure. The biomechanical behavior of endodontically treated teeth restored with endocrowns of different extensions inside the pulp chamber was evaluated by in vitro testing and 3-dimensional finite element analysis (FEA). Thirty mandibular human molars were endodontically treated. Standardized endocrown preparations were performed, and the teeth were randomly divided into 3 groups (n = 10) according to different endocrown extensions inside the pulp chamber: G-5 mm, a 5-mm extension; G-3 mm, a 3-mm extension; and G-1 mm, a 1-mm extension. After adhesive cementation, all specimens were subjected to thermocycling and dynamic loading. The surviving specimens were subjected to fracture resistance testing at a crosshead speed of 1 mm/min in a universal testing machine. All fractured specimens were subjected to fractography. Data were analyzed by 1-way analysis of variance and the Tukey post hoc test (P < .05). Stress distribution patterns in each group were analyzed using FEA. Qualitative analyses were performed according to the von Mises criterion. After dynamic loading, a survival rate of 100% was observed in all groups. For static loading, statistically significant differences among the groups were observed (P < .05) (G-5 mm = 2008.61 N, G-3 mm = 1795.41 N, and G-1 mm = 1268.12 N). Fractography showed a higher frequency of compression curls for G-5 mm and G-3 mm than for G-1 mm. FEA explained the results of fracture strength testing and fractography. Greater extension of endocrowns inside the pulp chamber provided better mechanical performance. Copyright © 2018 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Li, Pengxiang; Doshi, Jalpa A.
2016-01-01
Objective Since 2007, the Centers for Medicare and Medicaid Services have published 5-star quality rating measures to aid consumers in choosing Medicare Advantage Prescription Drug Plans (MAPDs). We examined the impact of these star ratings on Medicare Advantage Prescription Drug (MAPD) enrollment before and after 2012, when star ratings became tied to bonus payments for MAPDs that could be used to improve plan benefits and/or reduce premiums in the subsequent year. Methods A longitudinal design and multivariable hybrid models were used to assess whether star ratings had a direct impact on concurrent year MAPD contract enrollment (by influencing beneficiary choice) and/or an indirect impact on subsequent year MAPD contract enrollment (because ratings were linked to bonus payments). The main analysis was based on contract-year level data from 2009–2015. We compared effects of star ratings in the pre-bonus payment period (2009–2011) and post-bonus payment period (2012–2015). Extensive sensitivity analyses varied the analytic techniques, unit of analysis, and sample inclusion criteria. Similar analyses were conducted separately using stand-alone PDP contract-year data; since PDPs were not eligible for bonus payments, they served as an external comparison group. Result The main analysis included 3,866 MAPD contract-years. A change of star rating had no statistically significant effect on concurrent year enrollment in any of the pre-, post-, or pre-post combined periods. On the other hand, star rating increase was associated with a statistically significant increase in the subsequent year enrollment (a 1-star increase associated with +11,337 enrollees, p<0.001) in the post-bonus payment period but had a very small and statistically non-significant effect on subsequent year enrollment in the pre-bonus payment period. Further, the difference in effects on subsequent year enrollment was statistically significant between the pre- and post-periods (p = 0.011). 
Sensitivity analyses indicated that the findings were robust. No statistically significant effect of star ratings was found on concurrent or subsequent year enrollment in the pre- or post-period in the external comparison group of stand-alone PDP contracts. Conclusion Star ratings had no direct impact on concurrent year MAPD enrollment before or after the introduction of bonus payments tied to star ratings. However, after the introduction of these bonus payments, MAPD star ratings had a significant indirect impact of increasing subsequent year enrollment, likely via the reinvestment of bonuses to provide lower premiums and/or additional member benefits in the following year. PMID:27149092
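The "hybrid" (within-between) modelling strategy described above separates a contract's average star rating from its year-to-year changes, so direct and indirect enrollment effects can be estimated on different components. A minimal sketch of that decomposition, with entirely invented contract IDs, ratings, and enrollment counts:

```python
# Toy sketch of the within-between ("hybrid") decomposition: each contract's
# star rating is split into a between-contract mean and a within-contract
# deviation. All data values below are invented for illustration.
from statistics import mean

# contract -> list of (year, star_rating, enrollment)
panel = {
    "H001": [(2012, 3.5, 10000), (2013, 4.0, 14000), (2014, 4.5, 19000)],
    "H002": [(2012, 3.0, 8000), (2013, 3.0, 8100), (2014, 3.5, 9500)],
}

rows = []
for contract, obs in panel.items():
    between = mean(star for _, star, _ in obs)   # contract-level mean rating
    for year, star, enroll in obs:
        within = star - between                  # deviation from own mean
        rows.append({"contract": contract, "year": year,
                     "between": between, "within": within,
                     "enrollment": enroll})

# The "within" component carries the change-over-time signal used to link a
# star-rating increase to subsequent-year enrollment.
print(rows[0])
```

In a full analysis these components would enter a mixed-effects regression; the sketch stops at the decomposition itself.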
NASA Astrophysics Data System (ADS)
Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.
2018-02-01
In this paper, we perform statistical analysis of time series derived from Earth's climate. The time series concern Geopotential Height (GH) and correspond to temporal and spatial components of the global distribution of monthly average values during the period 1948-2012. The analysis is based on Tsallis non-extensive statistical mechanics, in particular on the estimation of Tsallis' q-triplet, namely {qstat, qsens, qrel}, on the reconstructed phase space with estimation of the correlation dimension, and on the Hurst exponent from rescaled range (R/S) analysis. The deviation of the Tsallis q-triplet from unity indicates a non-Gaussian (Tsallis q-Gaussian) non-extensive character, with heavy-tailed probability density functions (PDFs), multifractal behavior, and long-range dependences for all time series considered. Noticeable differences in the q-triplet estimates were also found between time series from distinct spatial or temporal regions. Moreover, the reconstructed phase space revealed a lower-dimensional fractal set in the GH dynamics (strong self-organization), and the estimated Hurst exponents indicated multifractality, non-Gaussianity and persistence. The analysis provides significant information for identifying and characterizing the dynamics of Earth's climate.
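The rescaled range (R/S) method used above to estimate the Hurst exponent can be sketched in a few lines. This is a simplified version on synthetic white noise (for which the estimate should land roughly in the 0.5-0.6 range at these small window sizes); it is not the authors' implementation:

```python
# Minimal rescaled-range (R/S) sketch: average R/S over windows of several
# sizes, then take the log-log slope as the Hurst exponent estimate.
import math
import random

def rescaled_range(x):
    n = len(x)
    m = sum(x) / n
    dev = cum = mn = mx = 0.0
    for v in x:
        cum += v - m                   # cumulative deviation from the mean
        mn, mx = min(mn, cum), max(mx, cum)
        dev += (v - m) ** 2
    s = math.sqrt(dev / n)             # standard deviation of the window
    return (mx - mn) / s               # range of cum. deviations, rescaled

def hurst(series, window_sizes=(16, 32, 64, 128, 256)):
    logs_n, logs_rs = [], []
    for w in window_sizes:
        chunks = [series[i:i + w] for i in range(0, len(series) - w + 1, w)]
        rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        logs_n.append(math.log(w))
        logs_rs.append(math.log(rs))
    # least-squares slope of log(R/S) against log(n) estimates H
    k = len(logs_n)
    mx_, my_ = sum(logs_n) / k, sum(logs_rs) / k
    num = sum((a - mx_) * (b - my_) for a, b in zip(logs_n, logs_rs))
    den = sum((a - mx_) ** 2 for a in logs_n)
    return num / den

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(4096)]
h = hurst(noise)
print(round(h, 2))
```

Persistent (long-range dependent) series would push the slope above this white-noise baseline, which is the signature the abstract reports for the GH series.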
Transportation Statistical Data and Information
DOT National Transportation Integrated Search
1976-12-01
The document contains an extensive review of internal and external sources of transportation data and statistics especially created for data administrators. Organized around the transportation industry and around the elements of the U.S. Department o...
Brittberg, Mats; Recker, David; Ilgenfritz, John; Saris, Daniel B F
2018-05-01
Matrix-based cell therapy improves surgical handling, increases patient comfort, and allows for expanded indications with better reliability within the knee joint. Five-year efficacy and safety of autologous cultured chondrocytes on porcine collagen membrane (MACI) versus microfracture for treating cartilage defects have not yet been reported from any randomized controlled clinical trial. To examine the clinical efficacy and safety results at 5 years after treatment with MACI and compare these with the efficacy and safety of microfracture treatment for symptomatic cartilage defects of the knee. Randomized controlled trial; Level of evidence, 1. This article describes the 5-year follow-up of the SUMMIT (Superiority of MACI Implant Versus Microfracture Treatment) clinical trial conducted at 14 study sites in Europe. All 144 patients who participated in SUMMIT were eligible to enroll; analyses of the 5-year data were performed with data from patients who signed informed consent and continued in the Extension study. Of the 144 patients randomized in the SUMMIT trial, 128 signed informed consent and continued observation in the Extension study: 65 MACI (90.3%) and 63 microfracture (87.5%). The improvements in Knee injury and Osteoarthritis Outcome Score (KOOS) Pain and Function domains previously described were maintained over the 5-year follow-up. Five years after treatment, the improvement in MACI over microfracture in the co-primary endpoint of KOOS pain and function was maintained and was clinically and statistically significant ( P = .022). Improvements in activities of daily living remained statistically significantly better ( P = .007) in MACI patients, with quality of life and other symptoms remaining numerically higher in MACI patients but losing statistical significance relative to the results of the SUMMIT 2-year analysis. Magnetic resonance imaging (MRI) evaluation of structural repair was performed in 120 patients at year 5. 
As in the 2-year SUMMIT (MACI00206) results, the MRI evaluation showed improvement in defect filling for both treatments; however, no statistically significant differences were noted between treatment groups. Symptomatic cartilage knee defects 3 cm² or larger treated with MACI were clinically and statistically significantly improved at 5 years compared with microfracture treatment. No remarkable adverse events or safety issues were noted in this heterogeneous patient population.
2011-01-01
Background While there is extensive literature evaluating the impact of phytoestrogen consumption on breast cancer risk, its role on ovarian cancer has received little attention. Methods We conducted a population-based case-control study to evaluate phytoestrogen intake from foods and supplements and epithelial ovarian cancer risk. Cases were identified in six counties in New Jersey through the New Jersey State Cancer Registry. Controls were identified by random digit dialing, CMS (Centers for Medicare and Medicaid Service) lists, and area sampling. A total of 205 cases and 390 controls were included in analyses. Unconditional logistic regression analyses were conducted to examine associations with total phytoestrogens, as well as isoflavones (daidzein, genistein, formononetin, and glycitein), lignans (matairesinol, lariciresinol, pinoresinol, secoisolariciresinol), and coumestrol. Results No statistically significant associations were found with any of the phytoestrogens under evaluation. However, there was a suggestion of an inverse association with total phytoestrogen consumption (from foods and supplements), with an odds ratio (OR) of 0.62 (95% CI: 0.38-1.00; p for trend: 0.04) for the highest vs. lowest tertile of consumption, after adjusting for reproductive covariates, age, race, education, BMI, and total energy. Further adjustment for smoking and physical activity attenuated risk estimates (OR: 0.66; 95% CI: 0.41-1.08). There was little evidence of an inverse association for isoflavones, lignans, or coumestrol. Conclusions This study provided some suggestion that phytoestrogen consumption may decrease ovarian cancer risk, although results did not reach statistical significance. PMID:21943063
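The tertile odds ratios reported above (e.g., OR 0.62 for highest vs. lowest consumption) come from logistic regression; the unadjusted version reduces to a simple odds calculation. A toy illustration with invented case/control counts (not the study's data):

```python
# Unadjusted odds ratio for highest vs. lowest tertile, with a Wald 95% CI
# computed on the log-odds-ratio scale. Counts are invented.
import math

# (cases, controls) in each tertile -- hypothetical numbers
highest = (55, 140)
lowest = (80, 125)

odds_hi = highest[0] / highest[1]
odds_lo = lowest[0] / lowest[1]
or_ = odds_hi / odds_lo

# Wald standard error of log(OR) from the four cell counts
se = math.sqrt(sum(1.0 / c for c in highest + lowest))
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The study's estimates additionally adjust for reproductive covariates, age, race, education, BMI, and total energy, which a plain 2x2 calculation cannot capture.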
Break and trend analysis of EUMETSAT Climate Data Records
NASA Astrophysics Data System (ADS)
Doutriaux-Boucher, Marie; Zeder, Joel; Lattanzio, Alessio; Khlystova, Iryna; Graw, Kathrin
2016-04-01
EUMETSAT reprocessed imagery acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board Meteosat 8-9, covering the period from 2004 to 2012. Climate Data Records (CDRs) of atmospheric parameters such as Atmospheric Motion Vectors (AMV) as well as Clear and All Sky Radiances (CSR and ASR) have been generated. Such CDRs are mainly ingested by ECMWF to produce reanalysis data. In addition, EUMETSAT produced a long CDR (1982-2004) of land surface albedo exploiting imagery acquired by the Meteosat Visible and Infrared Imager (MVIRI) on board Meteosat 2-7. Such a CDR is key input for climate analysis and climate models. Extensive validation has been performed for the surface albedo record, and a first validation of the winds and clear sky radiances has been done. All validation results indicated that the time series of all parameters appear homogeneous at first sight. Statistical science offers a variety of analysis methods that have been applied to examine the homogeneity of the CDRs further. Many breakpoint analysis techniques depend on the comparison of two time series, which raises the issue that both may contain breakpoints. This paper presents a quantitative statistical analysis of possible breakpoints in the MVIRI and SEVIRI CDRs, including the attribution of breakpoints to instrument changes and other events in the data series compared. The value of the different methods applied is discussed, with suggestions on how to further develop this type of analysis for the quality evaluation of CDRs.
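One simple form of the breakpoint analysis mentioned above scans a series for the split point that best divides it into two constant-mean segments (minimum total squared error). A minimal sketch on a synthetic series with an artificial step change standing in for an instrument change (this is an illustration, not the paper's method):

```python
# Single mean-shift breakpoint detector: exhaustive scan for the split
# minimizing the summed within-segment squared error.

def best_breakpoint(series, min_seg=5):
    def sse(xs):
        m = sum(xs) / len(xs)
        return sum((v - m) ** 2 for v in xs)

    best_i, best_cost = None, float("inf")
    for i in range(min_seg, len(series) - min_seg):
        cost = sse(series[:i]) + sse(series[i:])
        if cost < best_cost:
            best_i, best_cost = i, cost
    return best_i

# synthetic albedo-like values with a step change at index 60
series = [0.30] * 60 + [0.36] * 40
print(best_breakpoint(series))  # -> 60
```

Real CDR homogeneity work extends this idea to noisy data, multiple breakpoints, and comparison series that may themselves contain breaks.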
Automated brain volumetrics in multiple sclerosis: a step closer to clinical application.
Wang, C; Beadnall, H N; Hatton, S N; Bader, G; Tomic, D; Silva, D G; Barnett, M H
2016-07-01
Whole brain volume (WBV) estimates in patients with multiple sclerosis (MS) correlate more robustly with clinical disability than traditional, lesion-based metrics. Numerous algorithms to measure WBV have been developed over the past two decades. We compare Structural Image Evaluation using Normalisation of Atrophy-Cross-sectional (SIENAX) to NeuroQuant and MSmetrix, for assessment of cross-sectional WBV in patients with MS. MRIs from 61 patients with relapsing-remitting MS and 2 patients with clinically isolated syndrome were analysed. WBV measurements were calculated using SIENAX, NeuroQuant and MSmetrix. Statistical agreement between the methods was evaluated using linear regression and Bland-Altman plots. Precision and accuracy of WBV measurement was calculated for (1) NeuroQuant versus SIENAX and (2) MSmetrix versus SIENAX. Precision (Pearson's r) of WBV estimation for NeuroQuant and MSmetrix versus SIENAX was 0.983 and 0.992, respectively. Accuracy (Cb) was 0.871 and 0.994, respectively. NeuroQuant and MSmetrix showed a 5.5% and 1.0% volume difference compared with SIENAX, respectively, that was consistent across low and high values. In the analysed population, NeuroQuant and MSmetrix both quantified cross-sectional WBV with comparable statistical agreement to SIENAX, a well-validated cross-sectional tool that has been used extensively in MS clinical studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
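The "precision" (Pearson's r) and "accuracy" (Cb) figures quoted above are the two factors of Lin's concordance correlation coefficient, CCC = r x Cb. A minimal sketch of that decomposition on invented paired volume measurements (not the study's data):

```python
# Lin's concordance correlation coefficient decomposed into precision
# (Pearson's r) and accuracy (Cb = CCC / r). Values are invented.
import math

a = [1.48, 1.52, 1.41, 1.60, 1.55, 1.45]   # method 1 volumes (litres)
b = [v * 0.99 - 0.005 for v in a]          # method 2: small systematic bias

n = len(a)
ma, mb = sum(a) / n, sum(b) / n
va = sum((x - ma) ** 2 for x in a) / n
vb = sum((y - mb) ** 2 for y in b) / n
cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n

r = cov / math.sqrt(va * vb)                # precision
ccc = 2 * cov / (va + vb + (ma - mb) ** 2)  # Lin's CCC
cb = ccc / r                                # accuracy
print(round(r, 3), round(cb, 3))
```

Because method 2 here is an exact linear transform of method 1, precision is perfect (r = 1) while the bias shows up entirely in Cb, mirroring how the abstract separates scatter from systematic volume offsets.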
NASA Astrophysics Data System (ADS)
Pavlos, George; Malandraki, Olga; Pavlos, Evgenios; Iliopoulos, Aggelos; Karakatsanis, Leonidas
2017-04-01
As solar plasma lives far from equilibrium, it is an excellent laboratory for testing non-equilibrium statistical mechanics. In this study, we present the highlights of Tsallis non-extensive statistical mechanics as concerns its application to solar plasma dynamics, especially solar wind phenomena and the magnetosphere. We present some new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as their effect on Earth's magnetosphere. The results concern Tsallis non-extensive statistics, in particular the estimation of the Tsallis q-triplet (qstat, qsen, qrel) of SEP time series observed in interplanetary space and of magnetic field time series of the ICME observed at Earth, resulting from the solar eruptive activity of March 7, 2012 at the Sun. For the magnetic field, we used a multi-spacecraft approach based on data from the ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis, different time periods were considered, classified as "quiet", "shock" and "aftershock", while different space domains, such as interplanetary space (near Earth at L1 and upstream of the Earth's bow shock) and the Earth's magnetosheath and magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the SEP profile in time and of the magnetic field dynamics in both the time and space domains during the shock event, in terms of the rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. So far, Tsallis non-extensive statistical theory and Tsallis' extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) have revealed a strongly universal character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014, 2015, 2016; Karakatsanis et al. 2013). 
The Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics include: non-Gaussian statistics and anomalous diffusion processes; strange and fractional dynamics; multifractal, percolating and intermittent turbulence structures; multiscale and long spatio-temporal correlations; fractional acceleration and Non-Equilibrium Stationary States (NESS) or non-equilibrium self-organization processes; and non-equilibrium phase transition and topological phase transition processes, according to Zelenyi and Milovanov (2004). In this direction, our results clearly reveal strong self-organization and the development of macroscopic ordering of the plasma system, related to the strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasma region during the CME event. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
NASA Astrophysics Data System (ADS)
Nazarzadeh Zare, Mohsen; Dorrani, Kamal; Gholamali Lavasani, Masoud
2012-11-01
Background and purpose: This study examines the views of farmers and extension agents participating in extension education courses in Dezful, Iran, with regard to problems with these courses. It relies upon a descriptive methodology, using a survey as its instrument. Sample: The statistical population consisted of 5060 farmers and 50 extension agents; all extension agents were studied owing to their small population, and a sample of 466 farmers was selected based on the stratified ratio sampling method. For the data analysis, statistical procedures including the t-test and factor analysis were used. Results: The results of factor analysis on the views of farmers indicated that these courses have problems such as inadequate use of instructional materials by extension agents, insufficient employment of knowledgeable and experienced extension agents, bad and inconvenient timing of courses for farmers, lack of logical connection between one curriculum and prior ones, negligence in considering the opinions of farmers in arranging the courses, and lack of information about the time of courses. The findings of factor analysis on the views of extension agents indicated that these courses suffer from problems such as use of consistent methods of instruction for teaching curricula, and lack of continuity between courses and their levels and content. Conclusions: Recommendations include: listening to the views of farmers when planning extension courses; providing audiovisual aids, pamphlets and CDs; arranging courses based on convenient timing for farmers; using incentives to encourage participation; and employing extension agents with knowledge of the latest agricultural issues.
COGNATE: comparative gene annotation characterizer.
Wilbrandt, Jeanne; Misof, Bernhard; Niehuis, Oliver
2017-07-17
The comparison of gene and genome structures across species has the potential to reveal major trends of genome evolution. However, such a comparative approach is currently hampered by a lack of standardization (e.g., Elliott TA, Gregory TR, Philos Trans Royal Soc B: Biol Sci 370:20140331, 2015). For example, testing the hypothesis that the total amount of coding sequences is a reliable measure of potential proteome diversity (Wang M, Kurland CG, Caetano-Anollés G, PNAS 108:11954, 2011) requires the application of standardized definitions of coding sequence and genes to create both comparable and comprehensive data sets and corresponding summary statistics. However, such standard definitions either do not exist or are not consistently applied. These circumstances call for a standard at the descriptive level using a minimum of parameters as well as an undeviating use of standardized terms, and for software that infers the required data under these strict definitions. The acquisition of a comprehensive, descriptive, and standardized set of parameters and summary statistics for genome publications and further analyses can thus greatly benefit from the availability of an easy to use standard tool. We developed a new open-source command-line tool, COGNATE (Comparative Gene Annotation Characterizer), which uses a given genome assembly and its annotation of protein-coding genes for a detailed description of the respective gene and genome structure parameters. Additionally, we revised the standard definitions of gene and genome structures and provide the definitions used by COGNATE as a working draft suggestion for further reference. Complete parameter lists and summary statistics are inferred using this set of definitions to allow down-stream analyses and to provide an overview of the genome and gene repertoire characteristics. 
COGNATE is written in Perl and freely available at the ZFMK homepage ( https://www.zfmk.de/en/COGNATE ) and on github ( https://github.com/ZFMK/COGNATE ). The tool COGNATE allows comparing genome assemblies and structural elements on multiples levels (e.g., scaffold or contig sequence, gene). It clearly enhances comparability between analyses. Thus, COGNATE can provide the important standardization of both genome and gene structure parameter disclosure as well as data acquisition for future comparative analyses. With the establishment of comprehensive descriptive standards and the extensive availability of genomes, an encompassing database will become possible.
Martínez, Amparo; Manunza, Arianna; Delgado, Juan Vicente; Landi, Vincenzo; Adebambo, Ayotunde; Ismaila, Muritala; Capote, Juan; El Ouni, Mabrouk; Elbeltagy, Ahmed; Abushady, Asmaa M; Galal, Salah; Ferrando, Ainhoa; Gómez, Mariano; Pons, Agueda; Badaoui, Bouabid; Jordana, Jordi; Vidal, Oriol; Amills, Marcel
2016-12-14
Human-driven migrations are one of the main processes shaping the genetic diversity and population structure of domestic species. However, their magnitude and direction have been rarely analysed in a statistical framework. We aimed to estimate the impact of migration on the population structure of Spanish and African goats. To achieve this goal, we analysed a dataset of 1,472 individuals typed with 23 microsatellites. Population structure of African and Spanish goats was moderate (mean FST = 0.07), with the exception of the Canarian and South African breeds that displayed a significant differentiation when compared to goats from North Africa and Nigeria. Measurement of gene flow with Migrate-n and IMa coalescent genealogy samplers supported the existence of a bidirectional gene flow between African and Spanish goats. Moreover, IMa estimates of the effective number of migrants were remarkably lower than those calculated with Migrate-n and classical approaches. Such discrepancies suggest that recent divergence, rather than extensive gene flow, is the main cause of the weak population structure observed in caprine breeds.
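The "mean FST = 0.07" above is a fixation-index summary of population differentiation. A minimal sketch of Wright's FST for one biallelic locus in two equal-sized populations, with invented allele frequencies (microsatellite data as in the study would use a multi-allelic generalization):

```python
# Wright's FST = (HT - HS) / HT for two populations at one biallelic locus.
# p1, p2 are allele frequencies of the same allele in each population.

def fst_two_pops(p1, p2):
    # mean within-population expected heterozygosity
    hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2
    p_bar = (p1 + p2) / 2
    ht = 2 * p_bar * (1 - p_bar)       # total expected heterozygosity
    return (ht - hs) / ht

print(round(fst_two_pops(0.60, 0.40), 3))  # modest differentiation
```

Values near 0 indicate little structure and values near 1 complete fixation of alternative alleles, which is why 0.07 reads as "moderate" in the abstract.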
Keppner, Eva M; Prang, Madlen; Engel, Katharina C; Ayasse, Manfred; Stökl, Johannes; Steiger, Sandra
2017-01-01
Burying beetles have fascinated scientists for centuries due to their elaborate form of biparental care that includes the burial and defense of a vertebrate carcass, as well as the subsequent feeding of the larvae. However, besides extensive research on burying beetles, one fundamental question has yet to be answered: what cues do males use to discriminate between the sexes? Here, we show in the burying beetle Nicrophorus vespilloides that cuticular lipids trigger male mating behavior. Previous chemical analyses have revealed sex differences in cuticular hydrocarbon (CHC) composition; however, in the current study, fractionated-guided bioassay showed that cuticular lipids, other than CHCs, elicit copulation. Chemical analyses of the behaviorally active fraction revealed 17 compounds, mainly aldehydes and fatty acid esters, with small quantitative but no qualitative differences between the sexes. Supplementation of males with hexadecanal, the compound contributing most to the statistical separation of the chemical profiles of males and females, did not trigger copulation attempts by males. Therefore, a possible explanation is that the whole profile of polar lipids mediates sex recognition in N. vespilloides.
Experimental measurement of flexion-extension movement in normal and corpse prosthetic elbow joint.
TarniŢă, Daniela; TarniŢă, DănuŢ Nicolae
2016-01-01
This paper presents a comparative experimental study of flexion-extension movement in the healthy elbow and in a prosthetic elbow joint fixed on an original experimental bench. Measurements were carried out in order to validate the functional morphology of a new ball-head type elbow prosthesis. The three-dimensional (3D) model and the physical prototype of our experimental bench, used to test the elbow endoprosthesis in flexion-extension and pronation-supination movements, are presented. The measurements were carried out on a group of nine healthy subjects and on the prosthetic corpse elbow, with experimental data obtained for flexion-extension movement cycles. Experimental data for the two different flexion-extension tests, for the nine subjects and for the corpse prosthetic elbow, were acquired using the SimiMotion video system and processed statistically. The corresponding graphs were obtained for all subjects in the experimental group and for the corpse prosthetic elbow for both flexion-extension tests. The statistical analysis showed that the flexion angles of healthy elbows were significantly close to the values measured at the prosthetic elbow fixed on the experimental bench. The studied elbow prosthesis thus manages to re-establish mobility of the elbow joint close to that of the normal one.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
Giannantoni, N.M.; Minisci, M.; Brunetti, V.; Scarano, E.; Testani, E.; Vollono, C.; De Corso, E.; Bastanza, G.; D'Alatri, L.
2016-01-01
SUMMARY Oro-pharyngeal dysphagia is frequently present during the acute phase of stroke. The aim of the present study was to evaluate whether the recording of surface EMG using a nasopharyngeal (NP) electrode could be applied to the evaluation of pharyngeal muscle activity in acute stroke patients, and whether this neurophysiological measure is related to clinical assessment of swallowing. Patients were examined and clinical severity was assessed with the National Institutes of Health Stroke Scale (NIHSS) score; dysphagia was evaluated through a bedside screening test using the Gugging Swallowing Scale (GUSS). Extension of the ischaemic lesion was measured by a quantitative score based on CT scan [Alberta Stroke Programme Early CT Score (ASPECTS)]. We analysed 70 patients; 50 were classified as dysphagic (Dys+), and 20 as non-dysphagic (Dys–). Each participant underwent a surface NP EMG recording performed with a NP electrode, made of a Teflon-insulated steel catheter with a length of 16 cm and a tip diameter of 1.5 mm. The electrode was inserted through the nasal cavity, rotated and positioned approximately 3 mm anteroinferior to the salpingo-palatine fold. At least four consecutive swallowing-induced EMG bursts were recorded and analysed for each participant. Swallowing always induced a repetitive, polyphasic burst of activation on the EMG, lasting around 0.25 to 1 sec, with an amplitude of around 100-600 mV. Two parameters of the EMG potentials recorded with the NP electrode were analysed: duration and amplitude. The duration of the EMG burst was increased in Dys+ patients, with a statistically significant difference compared to Dys– patients (p < 0.001). The amplitude was slightly reduced in the Dys+ group, but statistically significant differences were not observed (p = 0.775). 
Nevertheless, the burst amplitude showed a significant inverse correlation with the NIHSS [r(48) = –0.31; p < 0.05] and ASPECTS scores [r(48) = –0.27; p < 0.05], meaning that the burst amplitude progressively decreased with increasing clinical severity (NIHSS) and topographic extension of brain lesions on CT (ASPECTS). These results suggest that NP recordings can give a semi-quantitative measure of swallowing difficulties originating from pharyngeal dysfunction; indeed, the electromyographic findings suggest reduced pharyngeal motility. PMID:27734982
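The two burst parameters analysed above, duration and amplitude, can be extracted from a rectified EMG trace with a simple threshold rule. A toy sketch with an invented signal and an assumed 1 kHz sampling rate (real burst detection is considerably more careful about noise and onset criteria):

```python
# Toy EMG-burst parameter extraction: threshold a rectified trace, then
# report burst duration and peak amplitude. Signal, sampling rate, and
# threshold are all invented for illustration.

FS = 1000          # samples per second (assumed)
THRESH = 50.0      # rectified-amplitude threshold (assumed units)

# invented rectified EMG trace: baseline, a 300 ms burst, baseline
signal = [5.0] * 200 + [120.0] * 300 + [5.0] * 200

above = [i for i, v in enumerate(signal) if v > THRESH]
duration_s = (above[-1] - above[0] + 1) / FS   # burst duration in seconds
peak = max(signal)                             # burst peak amplitude
print(duration_s, peak)  # -> 0.3 120.0
```

In the study, longer durations distinguished dysphagic patients, while lower amplitudes tracked stroke severity scores.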
Responding to Nonwords in the Lexical Decision Task: Insights from the English Lexicon Project
Yap, Melvin J.; Sibley, Daragh E.; Balota, David A.; Ratcliff, Roger; Rueckl, Jay
2014-01-01
Researchers have extensively documented how various statistical properties of words (e.g., word-frequency) influence lexical processing. However, the impact of lexical variables on nonword decision-making performance is less clear. This gap is surprising, since a better specification of the mechanisms driving nonword responses may provide valuable insights into early lexical processes. In the present study, item-level and participant-level analyses were conducted on the trial-level lexical decision data for almost 37,000 nonwords in the English Lexicon Project in order to identify the influence of different psycholinguistic variables on nonword lexical decision performance, and to explore individual differences in how participants respond to nonwords. Item-level regression analyses reveal that nonword response time was positively correlated with number of letters, number of orthographic neighbors, number of affixes, and baseword number of syllables, and negatively correlated with Levenshtein orthographic distance and baseword frequency. Participant-level analyses also point to within- and between-session stability in nonword responses across distinct sets of items, and intriguingly reveal that higher vocabulary knowledge is associated with less sensitivity to some dimensions (e.g., number of letters) but more sensitivity to others (e.g., baseword frequency). The present findings provide well-specified and interesting new constraints for informing models of word recognition and lexical decision. PMID:25329078
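The "Levenshtein orthographic distance" predictor above is built on the plain Levenshtein edit distance between a nonword and real words (the OLD20 variant used in this literature averages the distance to the 20 closest words). A minimal implementation with a hypothetical nonword and mini-lexicon:

```python
# Levenshtein edit distance via the standard two-row dynamic program.

def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# hypothetical nonword and mini-lexicon for illustration
lexicon = ["flirt", "first", "fled", "blip", "chair"]
nonword = "flirp"
dists = sorted(levenshtein(nonword, w) for w in lexicon)
print(dists)  # small minimum distance -> a word-like, slow-to-reject nonword
```

Nonwords with small mean distances to real words are more word-like, which is why this measure predicts slower rejection times in the item-level regressions.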
Sausedo, R A; Schoenwolf, G C
1994-05-01
Formation and extension of the notochord (i.e., notogenesis) is one of the earliest and most obvious events of axis development in vertebrate embryos. In birds and mammals, prospective notochord cells arise from Hensen's node and come to lie beneath the midline of the neural plate. Throughout the period of neurulation, the notochord retains its close spatial relationship with the developing neural tube and undergoes rapid extension in concert with the overlying neuroepithelium. In the present study, we examined notochord development quantitatively in mouse embryos. C57BL/6 mouse embryos were collected at 8, 8.5, 9, 9.5, and 10 days of gestation. They were then embedded in paraffin and sectioned transversely. Serial sections from 21 embryos were stained with Schiff's reagent according to the Feulgen-Rossenbeck procedure and used for quantitative analyses of notochord extension. Quantitative analyses revealed that extension of the notochord involves cell division within the notochord proper and cell rearrangement within the notochordal plate (the immediate precursor of the notochord). In addition, extension of the notochord involves cell accretion, that is, the addition of cells to the notochord's caudal end, a process that involves considerable cell rearrangement at the notochordal plate-node interface. Extension of the mouse notochord occurs similarly to that described previously for birds (Sausedo and Schoenwolf, 1993 Anat. Rec. 237:58-70). That is, in both birds (i.e., quail and chick) and mouse embryos, notochord extension involves cell division, cell rearrangement, and cell accretion. Thus higher vertebrates utilize similar morphogenetic movements to effect notogenesis.
DAnTE: a statistical tool for quantitative analysis of -omics data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep
2008-05-03
DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
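As a minimal illustration of two of the operations DAnTE automates (missing-value imputation and peptide-to-protein rollup), the sketch below uses median imputation and a median rollup; DAnTE's own algorithms and options differ, so the function names, strategy, and data here are hypothetical stand-ins.

```python
from statistics import median

def impute_missing(values, strategy=median):
    # Replace None entries with a summary (here: median) of the observed values.
    observed = [v for v in values if v is not None]
    fill = strategy(observed)
    return [fill if v is None else v for v in values]

def rollup_to_protein(peptide_rows):
    # Median "rollup": per-sample median across a protein's peptides.
    n_samples = len(peptide_rows[0])
    return [median(row[i] for row in peptide_rows) for i in range(n_samples)]

# Hypothetical log-abundances: 3 peptides x 3 samples, with missing values.
peptides = [
    [10.1, None, 9.8],
    [10.5, 10.2, 10.0],
    [9.9, 10.0, None],
]
complete = [impute_missing(row) for row in peptides]
protein = rollup_to_protein(complete)
```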
North American transportation : statistics on Canadian, Mexican, and United States transportation
DOT National Transportation Integrated Search
1994-05-01
North American Transportation: Statistics on Canadian, Mexican, and United States transportation contains extensive data on the size and scope, use, employment, fuel consumption, and economic role of each country's transportation system. It was publi...
Design and Construction of a Thermal Contact Resistance and Thermal Conductivity Measurement System
2015-09-01
plate interface resistance control. Numerical heat transfer and uncertainty analyses with applied engineering judgement were extensively used to come up with an optimized design and construction… heat transfer issues facing the Department of Defense. Subject terms: thermal contact resistance, thermal conductivity, measurement system.
Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence
2016-06-01
observable to military law enforcement personnel. Warning signs were grouped into four categories: (1) … indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or…
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named the result "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows point-and-click access to statistical functions that are frequently used in clinical studies, such as survival analyses (including competing risk analyses and the use of time-dependent covariates). In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process is overseen by a supervisor.
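For flavor, the Kaplan-Meier product-limit estimator, the most basic of the survival analyses that EZR exposes through its GUI, can be written in a few lines. This is a generic textbook sketch in Python, not EZR's implementation; the follow-up data are invented.

```python
def kaplan_meier(times, events):
    # Product-limit estimate S(t): at each event time t, multiply by (1 - d/n),
    # where d = events observed at t and n = subjects still at risk at t.
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)
        if d:
            s *= 1.0 - d / n
            curve.append((t, s))
    return curve

# Hypothetical follow-up times (event flag: 1 = event observed, 0 = censored).
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```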
Ahlborn, W; Tuz, H J; Uberla, K
1990-03-01
In cohort studies the Mantel-Haenszel estimator OR_MH is computed from sample data and used as a point estimator of relative risk. Test-based confidence intervals are estimated with the help of the asymptotically chi-squared distributed MH statistic χ²_MHS. The Mantel extension chi-squared is used as a test statistic for a dose-response relationship. Both test statistics, the Mantel-Haenszel chi as well as the Mantel extension chi, assume homogeneity of risk across strata, which is rarely present. An extended nonparametric statistic proposed by Terpstra, which is based on the Mann-Whitney statistic, likewise assumes homogeneity of risk across strata. We have earlier defined four risk measures RR_kj (k = 1, 2, ..., 4) in the population and considered their estimates and the corresponding asymptotic distributions. To overcome the homogeneity assumption we use the delta method to obtain "test-based" confidence intervals. Because the four risk measures RR_kj are presented as functions of four weights g_ik, we consequently give the asymptotic variances of these risk estimators, also as functions of the weights g_ik, in closed form. Approximations to these variances are given. For testing a dose-response relationship we propose a new class of χ²(1)-distributed global measures G_k and the corresponding global χ²-test. In contrast to the Mantel extension chi, homogeneity of risk across strata need not be assumed. These global test statistics are of the Wald type for composite hypotheses. (ABSTRACT TRUNCATED AT 250 WORDS)
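The Mantel-Haenszel point estimator itself is simple to compute from stratified 2x2 tables: OR_MH = Σ_i(a_i d_i / n_i) / Σ_i(b_i c_i / n_i). A short sketch with made-up counts:

```python
def or_mantel_haenszel(strata):
    # OR_MH = sum_i(a_i * d_i / n_i) / sum_i(b_i * c_i / n_i),
    # with each stratum a 2x2 table (a, b, c, d) and n_i its total.
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Two hypothetical strata: (exposed cases, exposed controls,
# unexposed cases, unexposed controls).
or_mh = or_mantel_haenszel([(10, 20, 5, 40), (4, 16, 2, 28)])
```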
ERIC Educational Resources Information Center
Savalei, Victoria
2010-01-01
Incomplete nonnormal data are common occurrences in applied research. Although these two problems are often dealt with separately by methodologists, they frequently co-occur. Very little has been written about statistics appropriate for evaluating models with such data. This article extends several existing statistics for complete nonnormal data to…
NASA Astrophysics Data System (ADS)
Vallianatos, F.; Tzanis, A.; Michas, G.; Papadakis, G.
2012-04-01
Since the middle of summer 2011, an increase in the seismicity rates of the volcanic complex system of Santorini Island, Greece, has been observed. In the present work, the temporal distribution of seismicity, as well as the magnitude distribution of earthquakes, has been studied using the concept of Non-Extensive Statistical Physics (NESP; Tsallis, 2009) along with the evolution of the Shannon entropy H (also called information entropy). The analysis is based on the earthquake catalogue of the Geodynamic Institute of the National Observatory of Athens for the period July 2011-January 2012 (http://www.gein.noa.gr/). Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, is a suitable framework for studying complex systems. The observed distributions of seismicity rates at Santorini can be described (fitted) exceptionally well with NESP models. This implies the inherent complexity of the Santorini volcanic seismicity, the applicability of NESP concepts to volcanic earthquake activity, and the usefulness of NESP in investigating phenomena exhibiting multifractality and long-range coupling effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).
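NESP fits of the kind described here are typically built on the Tsallis q-exponential, exp_q(x) = [1 + (1-q)x]^(1/(1-q)), which recovers the ordinary exponential as q → 1. A minimal sketch of the function itself (the fitting procedure is not shown):

```python
import math

def q_exp(x, q):
    # Tsallis q-exponential: [1 + (1 - q) x]^(1 / (1 - q)) for q != 1,
    # defined as 0 where the bracket becomes non-positive; -> exp(x) as q -> 1.
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
```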
Zhang, Harrison G; Ying, Gui-Shuang
2018-02-09
The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed the overall summary of ocular finding per individual and three (3%) studies used the paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
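One textbook way to see the cost of ignoring intereye correlation is the design-effect correction: with two eyes per person and intraclass correlation ICC, the effective sample size shrinks to n/(1 + ICC). The sketch below is this generic adjustment for illustration, not the analysis the paper recommends:

```python
def effective_sample_size(n_eyes, icc, eyes_per_person=2):
    # Design effect for m correlated units per cluster: DEFF = 1 + (m - 1) * ICC;
    # the effective number of independent observations is n / DEFF.
    deff = 1.0 + (eyes_per_person - 1) * icc
    return n_eyes / deff
```

For example, 200 eyes with an ICC of 0.6 carry roughly the information of 125 independent eyes, so treating them as 200 independent observations overstates precision.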
7 CFR 2.21 - Under Secretary for Research, Education, and Economics.
Code of Federal Regulations, 2010 CFR
2010-01-01
... aspects, including, but not limited to, production, marketing (other than statistical and economic...). (xxxvi) Administer a cooperative extension program under the Farmer-to-Consumer Direct Marketing Act of... for research and extension to facilitate or expand production and marketing of aquacultural food...
Madeya, S; Börsch, G; Greiner, L; Hahn, H J
1993-03-01
With the aim of analysing the frequency of gastrointestinal (GI) metastases identified by endoscopic procedures, a survey was conducted by questionnaire, which was completed by 34 of 127 medical departments. Peritoneal carcinosis and direct tumor extension were disregarded. One GI metastasis (duodenum) was verified among 3477 upper GI tract endoscopies; the primary site was a cutaneous melanoma. In another case (esophagus), a metastatic origin is discussed. Considering the average frequency of 102 upper GI tract endoscopies performed by the collaborating centers, one case of gastroduodenal metastasis could be expected every 17 (34) months in these institutions. 1634 examinations of the colon and rectum did not reveal any metastatic tumor growth. A long-term study is planned to provide further statistically reliable prevalence data.
Understanding older adults' usage of community green spaces in Taipei, Taiwan.
Pleson, Eryn; Nieuwendyk, Laura M; Lee, Karen K; Chaddah, Anuradha; Nykiforuk, Candace I J; Schopflocher, Donald
2014-01-27
As the world's population ages, there is an increasing need for community environments to support physical activity and social connections for older adults. This exploratory study sought to better understand older adults' usage and perceptions of community green spaces in Taipei, Taiwan, through direct observations of seven green spaces and nineteen structured interviews. Descriptive statistics from observations using the System for Observing Play and Recreation in Communities (SOPARC) confirm that older adults use Taipei's parks extensively. Our analyses of interviews support the following recommendations for age-friendly active living initiatives for older adults: make green spaces accessible to older adults; organize a variety of structured activities that appeal to older adults particularly in the morning; equip green spaces for age-appropriate physical activity; and, promote the health advantages of green spaces to older adults.
Application of Taguchi methods to infrared window design
NASA Astrophysics Data System (ADS)
Osmer, Kurt A.; Pruszynski, Charles J.
1990-10-01
Dr. Genichi Taguchi, a prominent quality consultant, reduced a branch of statistics known as "Design of Experiments" to a cookbook methodology that can be employed by any competent engineer. This technique has been extensively employed by Japanese manufacturers, and is widely credited with helping them attain their current level of success in low-cost, high-quality product design and fabrication. Although originally put forth as a tool to streamline the determination of improved production processes, the technique can also be applied to a wide range of engineering problems. As part of an internal research project, this method of experimental design has been adapted to window trade studies and materials research. Two of these analyses are presented herein, chosen to illustrate the breadth of applications to which the Taguchi method can be applied.
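The core of the Taguchi/Design-of-Experiments recipe is running an orthogonal array of trials and comparing mean responses per factor level. A minimal sketch with an L4(2³) array and invented response data (the factors and measurements are hypothetical, not from the window studies described above):

```python
# L4(2^3) orthogonal array: three two-level factors covered in four runs.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
responses = [20.0, 24.0, 30.0, 26.0]  # hypothetical measurements, one per run

def main_effects(array, y):
    # Main effect of each factor: mean response at level 1 minus level 0.
    effects = []
    for f in range(len(array[0])):
        lo = [yi for row, yi in zip(array, y) if row[f] == 0]
        hi = [yi for row, yi in zip(array, y) if row[f] == 1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

effects = main_effects(L4, responses)
```

Because the array is orthogonal, each factor's effect is estimated from a balanced split of the same four runs instead of a full 2³ factorial.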
EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.
NASA Astrophysics Data System (ADS)
Bucher, François-Xavier; Cao, Frédéric; Viard, Clément; Guichard, Frédéric
2014-03-01
We present in this paper a novel capacitive device that stimulates the touchscreen interface of a smartphone (or of any imaging device equipped with a capacitive touchscreen) and synchronizes triggering with the DxO LED Universal Timer to measure shooting time lag and shutter lag according to ISO 15781:2013. The device and protocol extend the time lag measurement beyond the standard by including negative shutter lag, a phenomenon that is increasingly common in smartphones. The device is computer-controlled, and this feature, combined with measurement algorithms, makes it possible to automate a large series of captures so as to provide more refined statistical analyses when, for example, the shutter lag of "zero shutter lag" devices is limited by the frame time, as our measurements confirm.
The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics
NASA Astrophysics Data System (ADS)
Pavlos, George
2015-04-01
As the solar plasma lives far from equilibrium it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non extensive statistical mechanics as concerns their applications at solar plasma dynamics, especially at sunspot, solar flare and solar wind phenomena. Generally, when a physical system is driven far from equilibrium states some novel characteristics can be observed related to the nonlinear character of dynamics. In particular, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004) the complex character of the space plasma system includes the existence of non-equilibrium (quasi)-stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann - Gibbs (BG) entropy, to describe far from equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general, was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015).
Our study includes the analysis of solar plasma time series at three cases: sunspot index, solar flare and solar wind data. The non-linear analysis of the sunspot index is embedded in the non-extensive statistical theory of Tsallis (1988; 2004; 2009). The q-triplet of Tsallis, as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the SVD components of the sunspot index timeseries. Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000, 2001). Our analysis showed clearly the following: (a) a phase transition process in the solar dynamics from high dimensional non-Gaussian SOC state to a low dimensional non-Gaussian chaotic state, (b) strong intermittent solar turbulence and anomalous (multifractal) diffusion solar process, which is strengthened as the solar dynamics makes a phase transition to low dimensional chaos in accordance to Ruzmaikin, Zelenyi and Milovanov's studies (Zelenyi and Milovanov, 1991; Milovanov and Zelenyi, 1993; Ruzmakin et al., 1996), (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of: (i) non-Gaussian probability distribution function P(x), (ii) multifractal scaling exponent spectrum f(a) and generalized Renyi dimension spectrum Dq, (iii) exponent spectrum J(p) of the structure functions estimated for the sunspot index and its underlying non equilibrium solar dynamics. Also, the q-triplet of Tsallis as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the singular value decomposition (SVD) components of the solar flares timeseries. 
Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000). Our analysis showed clearly the following: (a) a phase transition process in the solar flare dynamics from a high dimensional non-Gaussian self-organized critical (SOC) state to a low dimensional also non-Gaussian chaotic state, (b) strong intermittent solar corona turbulence and an anomalous (multifractal) diffusion solar corona process, which is strengthened as the solar corona dynamics makes a phase transition to low dimensional chaos, (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of the functions: (i) non-Gaussian probability distribution function P(x), (ii) f(a) and D(q), and (iii) J(p) for the solar flares timeseries and its underlying non-equilibrium solar dynamics, and (d) the solar flare dynamical profile is revealed similar to the dynamical profile of the solar corona zone as far as the phase transition process from self-organized criticality (SOC) to chaos state. However the solar low corona (solar flare) dynamical characteristics can be clearly discriminated from the dynamical characteristics of the solar convection zone. Finally, we present novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event. The solar wind plasma, as well as the entire solar plasma system, is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields and matter fields (particle and current densities or bulk plasma distributions).
This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy tails probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992). References 1. T. Arimitsu, N. Arimitsu, Tsallis statistics and fully developed turbulence, J. Phys. A: Math. Gen. 33 (2000) L235. 2. T. Arimitsu, N. Arimitsu, Analysis of turbulence by statistics based on generalized entropies, Physica A 295 (2001) 177-194. 3. T. Chang, Low-dimensional behavior and symmetry breaking of stochastic systems near criticality: can these effects be observed in space and in the laboratory? IEEE 20 (6) (1992) 691-694. 4. U. Frisch, Turbulence, Cambridge University Press, Cambridge, UK, 1996, p. 310. 5. L.P. Karakatsanis, G.P. Pavlos, M.N. Xenakis, Tsallis non-extensive statistics, intermittent turbulence, SOC and chaos in the solar plasma.
Part two: Solar flares dynamics, Physica A 392 (2013) 3920-3944. 6. A.V. Milovanov, Topological proof for the Alexander-Orbach conjecture, Phys. Rev. E 56 (3) (1997) 2437-2446. 7. A.V. Milovanov, L.M. Zelenyi, Fracton excitations as a driving mechanism for the self-organized dynamical structuring in the solar wind, Astrophys. Space Sci. 264 (1-4) (1999) 317-345. 8. A.V. Milovanov, Stochastic dynamics from the fractional Fokker-Planck-Kolmogorov equation: large-scale behavior of the turbulent transport coefficient, Phys. Rev. E 63 (2001) 047301. 9. G.P. Pavlos, et al., Universality of non-extensive Tsallis statistics and time series analysis: Theory and applications, Physica A 395 (2014) 58-95. 10. G.P. Pavlos, et al., Tsallis non-extensive statistics and solar wind plasma complexity, Physica A 422 (2015) 113-135. 11. A.A. Ruzmaikin, et al., Spectral properties of solar convection and diffusion, ApJ 471 (1996) 1022. 12. V.E. Tarasov, Review of some promising fractional physical models, Internat. J. Modern Phys. B 27 (9) (2013) 1330005. 13. C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1-2) (1988) 479-487. 14. C. Tsallis, Nonextensive statistical mechanics: construction and physical interpretation, in: G.M. Murray, C. Tsallis (Eds.), Nonextensive Entropy-Interdisciplinary Applications, Oxford Univ. Press, 2004, pp. 1-53. 15. C. Tsallis, Introduction to Non-Extensive Statistical Mechanics, Springer, 2009. 16. G.M. Zaslavsky, Chaos, fractional kinetics, and anomalous transport, Physics Reports 371 (2002) 461-580. 17. L.M. Zelenyi, A.V. Milovanov, Fractal properties of sunspots, Sov. Astron. Lett. 17 (6) (1991) 425. 18. L.M. Zelenyi, A.V. Milovanov, Fractal topology and strange kinetics: from percolation theory to problems in cosmic electrodynamics, Phys.-Usp. 47 (8) (2004) 749-788.
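The generalized entropy underlying all of the above is Tsallis' S_q = k(1 - Σ_i p_i^q)/(q - 1), which reduces to the Boltzmann-Gibbs/Shannon entropy as q → 1 and is non-additive (non-extensive) for q ≠ 1. A minimal numerical sketch:

```python
import math

def tsallis_entropy(p, q, k=1.0):
    # S_q = k * (1 - sum_i p_i^q) / (q - 1); as q -> 1 this reduces to the
    # Boltzmann-Gibbs/Shannon entropy -k * sum_i p_i * ln(p_i).
    if abs(q - 1.0) < 1e-12:
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

For two independent subsystems A and B the entropy composes as S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B) (with k = 1), which is the non-extensivity the abstract refers to.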
Efficace, Fabio; Fayers, Peter; Pusic, Andrea; Cemal, Yeliz; Yanagawa, Jane; Jacobs, Marc; la Sala, Andrea; Cafaro, Valentina; Whale, Katie; Rees, Jonathan; Blazeby, Jane
2015-09-15
The main objectives of this study were to identify the number of randomized controlled trials (RCTs) including a patient-reported outcome (PRO) endpoint across a wide range of cancer specialties and to evaluate the completeness of PRO reporting according to the Consolidated Standards of Reporting Trials (CONSORT) PRO extension. RCTs with a PRO endpoint that had been performed across several cancer specialties and published between 2004 and 2013 were considered. Studies were evaluated on the basis of previously defined criteria, including the CONSORT PRO extension and the Cochrane Collaboration's tool for assessing the risk of bias of RCTs. Analyses were also conducted by the type of PRO endpoint (primary vs secondary) and by the cancer disease site. A total of 56,696 potentially eligible records were scrutinized, and 557 RCTs with a PRO evaluation, enrolling 254,677 patients overall, were identified. PROs were most frequently used in RCTs of breast (n = 123), lung (n = 85), and colorectal cancer (n = 66). Overall, PROs were secondary endpoints in 421 RCTs (76%). Four of 6 evaluated CONSORT PRO items were documented in less than 50% of the RCTs. The level of reporting was higher in RCTs with a PRO as a primary endpoint. The presence of a supplementary report was the only statistically significant factor associated with greater completeness of reporting for both RCTs with PROs as primary endpoints (β = .19, P = .001) and RCTs with PROs as secondary endpoints (β = .30, P < .001). Implementation of the CONSORT PRO extension is equally important across all cancer specialties. Its use can also contribute to revealing the robust PRO design of some studies, which might be obscured by poor outcome reporting. © 2015 American Cancer Society.
Evaluating Cellular Polyfunctionality with a Novel Polyfunctionality Index
Larsen, Martin; Sauce, Delphine; Arnaud, Laurent; Fastenackels, Solène; Appay, Victor; Gorochov, Guy
2012-01-01
Functional evaluation of naturally occurring or vaccination-induced T cell responses in mice, men and monkeys has in recent years advanced from single-parameter (e.g. IFN-γ secretion) to much more complex multidimensional measurements. Co-secretion of multiple functional molecules (such as cytokines and chemokines) at the single-cell level is now measurable, due primarily to major advances in multiparametric flow cytometry. The very extensive and complex datasets generated by this technology raise the demand for proper analytical tools that enable the analysis of combinatorial functional properties of T cells, hence polyfunctionality. Presently, multidimensional functional measures are analysed either by evaluating all combinations of parameters individually or by summing frequencies of combinations that include the same number of simultaneous functions. Often these evaluations are visualized as pie charts. Whereas pie charts effectively represent and compare average polyfunctionality profiles of particular T cell subsets or patient groups, they do not document the degree or variation of polyfunctionality within a group, nor do they allow more sophisticated statistical analysis. Here we propose a novel polyfunctionality index that numerically evaluates the degree and variation of polyfunctionality, and enables comparative and correlative parametric and non-parametric statistical tests. Moreover, it allows the usage of more advanced statistical approaches, such as cluster analysis. We believe that the polyfunctionality index will render polyfunctionality an appropriate end-point measure in future studies of T cell responsiveness. PMID:22860124
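The two conventional summaries the authors contrast, per-combination frequencies and frequencies grouped by number of simultaneous functions, are easy to tabulate from per-cell Boolean data. The sketch below implements that conventional tabulation on invented data; it is not the proposed polyfunctionality index, whose formula is not given in the abstract.

```python
from collections import Counter
from itertools import product

def combination_frequencies(cells):
    # Frequency of every Boolean combination of functions across cells;
    # each cell is a tuple of 0/1 flags, e.g. (IFN-g, IL-2, TNF-a).
    counts = Counter(cells)
    total = len(cells)
    return {combo: counts.get(combo, 0) / total
            for combo in product((0, 1), repeat=len(cells[0]))}

def degree_distribution(cells):
    # Fraction of cells expressing exactly k functions simultaneously.
    by_degree = Counter(sum(c) for c in cells)
    return {k: by_degree[k] / len(cells) for k in sorted(by_degree)}

cells = [(1, 1, 1), (1, 1, 0), (1, 0, 0), (0, 0, 0)]  # hypothetical cells
freq = combination_frequencies(cells)
deg = degree_distribution(cells)
```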
MRI textures as outcome predictor for Gamma Knife radiosurgery on vestibular schwannoma
NASA Astrophysics Data System (ADS)
Langenhuizen, P. P. J. H.; Legters, M. J. W.; Zinger, S.; Verheul, H. B.; Leenstra, S.; de With, P. H. N.
2018-02-01
Vestibular schwannomas (VS) are benign brain tumors that can be treated with high-precision focused radiation with the Gamma Knife in order to stop tumor growth. Outcome prediction of Gamma Knife radiosurgery (GKRS) treatment can help in determining whether GKRS will be effective on an individual patient basis. However, at present, prognostic factors of tumor control after GKRS for VS are largely unknown, and only clinical factors, such as size of the tumor at treatment and pre-treatment growth rate of the tumor, have been considered thus far. This research aims at outcome prediction of GKRS by means of quantitative texture feature analysis on conventional MRI scans. We compute first-order statistics and features based on gray-level co-occurrence (GLCM) and run-length matrices (RLM), and employ support vector machines and decision trees for classification. In a clinical dataset, consisting of 20 tumors showing treatment failure and 20 tumors exhibiting treatment success, we have discovered that the second-order statistical metrics distilled from GLCM and RLM are suitable for describing texture, but are slightly outperformed by simple first-order statistics, like mean, standard deviation and median. The obtained prediction accuracy is about 85%, but a final choice of the best feature can only be made after performing more extensive analyses on larger datasets. In any case, this work provides suitable texture measures for successful prediction of GKRS treatment outcome for VS.
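As a concrete illustration of the features compared here, the sketch below computes first-order statistics (mean, standard deviation, median) and a GLCM for one pixel offset on a toy 4-level image; the offset and number of gray levels are arbitrary stand-ins, and the paper's full feature set (including RLM features) is not reproduced.

```python
from statistics import mean, median, pstdev

def glcm(image, dx=1, dy=0, levels=4):
    # Gray-level co-occurrence matrix for one offset: m[i][j] counts how
    # often gray level i has gray level j at displacement (dx, dy).
    m = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[image[y][x]][image[ny][nx]] += 1
    return m

img = [  # toy 4x4 image quantized to 4 gray levels
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 3, 3],
]
pixels = [v for row in img for v in row]
first_order = (mean(pixels), pstdev(pixels), median(pixels))
co = glcm(img)
```

Second-order texture metrics such as energy or contrast would then be computed from the normalized entries of `co`.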
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation, but his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, with prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual Strategy and Social Structure: An Evolutionary Theory of Institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al., Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012; Barmpalias et al., 55th annual IEEE symposium on foundations of computer science, Philadelphia, 2014; Barmpalias et al., J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (STOC 2012).
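For readers unfamiliar with the model class, a one-round sketch of an unperturbed Schelling dynamic on a 1-D ring is below; the paper's 2- and 3-dimensional models differ in neighbourhood structure and update rule, so this is purely illustrative and every parameter choice here is an assumption.

```python
import random

def schelling_step(grid, threshold, rng):
    # One synchronous round on a 1-D ring: every discontent agent (fewer
    # than `threshold` like neighbours among its two neighbours) is paired
    # with a discontent agent of the other type and the two swap places.
    n = len(grid)
    def content(i):
        return sum(grid[(i + d) % n] == grid[i] for d in (-1, 1)) >= threshold
    unhappy = [i for i in range(n) if not content(i)]
    rng.shuffle(unhappy)
    zeros = [i for i in unhappy if grid[i] == 0]
    ones = [i for i in unhappy if grid[i] == 1]
    for i, j in zip(zeros, ones):
        grid[i], grid[j] = grid[j], grid[i]
    return grid

rng = random.Random(42)
after = schelling_step([0, 1] * 8, threshold=1, rng=rng)
```

Note that swapping preserves the overall composition of the population; only the spatial arrangement evolves, which is what makes the long-run segregation pattern the interesting quantity.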
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
Hass, Chris J; Collins, Mitchell A; Juncos, Jorge L
2007-01-01
Persons with Parkinson disease (PD) exhibit decreased muscular fitness including decreased muscle mass, muscle strength, bioenergetic capabilities and increased fatigability. The purpose of this investigation was to evaluate the therapeutic effects of resistance training with and without creatine supplementation in patients with mild to moderate PD. Twenty patients with idiopathic PD were randomized to receive creatine monohydrate supplementation plus resistance training (CRE) or placebo (lactose monohydrate) plus resistance training (PLA), using a double-blind procedure. Creatine and placebo supplementation consisted of 20 g/d for the first 5 days and 5 g/d thereafter. Both groups participated in progressive resistance training (24 sessions, 2 times per week, 1 set of 8-12 repetitions, 9 exercises). Participants performed 1-repetition maximum (1-RM) for chest press, leg extension, and biceps curl. Muscular endurance was evaluated for chest press and leg extension as the number of repetitions to failure using 60% of baseline 1-RM. Functional performance was evaluated as the time to perform 3 consecutive chair rises. Statistical analyses (ANOVA) revealed significant Group x Time interactions for chest press strength and biceps curl strength, and post hoc testing revealed that the improvement was significantly greater for CRE. Chair rise performance significantly improved only for CRE (12%, P=.03). Both PLA and CRE significantly improved 1-RM for leg extension (PLA: 16%; CRE: 18%). Muscular endurance improved significantly for both groups. These findings demonstrate that creatine supplementation can enhance the benefits of resistance training in patients with PD.
Sargeant, J M; O'Connor, A M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P
2016-12-01
Reporting of observational studies in veterinary research presents challenges that often are not addressed in published reporting guidelines. Our objective was to develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. We conducted a consensus meeting with 17 experts in Mississauga, Canada. Experts completed a premeeting survey about whether items in the STROBE statement should be modified or added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. During the meeting, each STROBE item was discussed to determine whether or not rewording was recommended, and whether additions were warranted. Anonymous voting was used to determine consensus. Six items required no modifications or additions. Modifications or additions were made to the STROBE items 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources and measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). The methods and processes used were similar to those used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food-producing and companion animals, products of animal origin, aquaculture, and wildlife.
Content-based VLE designs improve learning efficiency in constructivist statistics education.
Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward
2011-01-01
We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. 
The findings demonstrate that a content-based design outperforms the traditional VLE-based design.
2007-01-01
Background The US Food and Drug Administration approved the Charité artificial disc on October 26, 2004. This approval was based on an extensive analysis and review process; 20 years of disc usage worldwide; and the results of a prospective, randomized, controlled clinical trial that compared lumbar artificial disc replacement to fusion. The results of the investigational device exemption (IDE) study led to a conclusion that clinical outcomes following lumbar arthroplasty were at least as good as outcomes from fusion. Methods The author performed a new analysis of the Visual Analog Scale pain scores and the Oswestry Disability Index scores from the Charité artificial disc IDE study and used a nonparametric statistical test, because observed data distributions were not normal. The analysis included all of the enrolled subjects in both the nonrandomized and randomized phases of the study. Results Subjects from both the treatment and control groups improved from the baseline situation (P < .001) at all follow-up times (6 weeks to 24 months). Additionally, these pain and disability levels with artificial disc replacement were superior (P < .05) to the fusion treatment at all follow-up times including 2 years. Conclusions The a priori statistical plan for an IDE study may not adequately address the final distribution of the data. Therefore, statistical analyses more appropriate to the distribution may be necessary to develop meaningful statistical conclusions from the study. A nonparametric statistical analysis of the Charité artificial disc IDE outcomes scores demonstrates superiority for lumbar arthroplasty versus fusion at all follow-up time points to 24 months. PMID:25802574
Wu, Junsong; Du, Junhua; Jiang, Xiangyun; Wang, Quan; Li, Xigong; Du, Jingyu; Lin, Xiangjin
2014-06-17
To explore the changes of range of motion (ROM) in patients with degenerative lumbar disease treated with the WavefleX dynamic stabilization system, and to examine postoperative lumbar ROM patterns and trends. Nine patients with degenerative lumbar disease treated with the WavefleX dynamic stabilization system were followed up with respect to ROM at 5 timepoints within 12 months. ROM was recorded for instrumented segments, adjacent segments and the total lumbar spine. Compared with preoperation, ROMs in non-fusional segments with the WavefleX dynamic stabilization system decreased with statistical significance (P < 0.05 or P < 0.01) at different timepoints; ROMs in adjacent segments increased at some levels without reaching statistical significance, the exception being L3/4 alone at Month 12 (P < 0.05). Versus the control group at the L3/4, L4/5 and L5/S1 levels, ROMs decreased at Months 6 and 12 with statistical significance (P < 0.05 or P < 0.01). Total lumbar ROM decreased with statistical significance (P < 0.01) in both the non-fusion group and the hybrid (non-fusion plus fusion) group. Trends of continuous increase were observed during follow-up, and statistically significant increases relative to the control group were found at 4 timepoints (P < 0.01). Treatment of degenerative lumbar diseases with the WavefleX dynamic stabilization system may limit excessive extension/flexion and preserve some motor function. Moreover, it can sustain physiological lordosis and decrease and transfer disc load in adjacent segments to prevent early degeneration of the adjacent segment. Trends of increasing total lumbar motion need to be confirmed in future long-term follow-up.
Thermal imaging for cold air flow visualisation and analysis
NASA Astrophysics Data System (ADS)
Grudzielanek, M.; Pflitsch, A.; Cermak, J.
2012-04-01
In this work we present the first applications of a thermal imaging system for animated visualization and analysis of cold air flow in field studies. The development of mobile thermal imaging systems has advanced rapidly in recent decades. The surface temperature of objects, detected via long-wave infrared radiation, permits inferences in a range of research problems. Modern thermal imaging systems allow infrared picture sequences and subsequent data analysis; they are no longer purely imaging devices, as in the past. Thus, the monitoring and analysis of dynamic processes has become possible. We measured the cold air flow on a sloping grassland area with standard methods (sonic anemometers and temperature loggers) plus a thermal imaging system measuring in the range from 7.5 to 14 µm. To analyse the cold air with the thermal measurements, we collected the surface infrared temperatures at a projection screen, which was located in the cold air flow direction, opposite the infrared (IR) camera. The intentions of using a thermal imaging system in our work were: 1. to get a general idea of its practicability for our problem, 2. to assess the value of the extensive and more detailed data sets, and 3. to optimise visualisation. The results were very promising. Through the possibility of generating time-lapse movies of the image sequences in time scaling, processes of cold air flow, like flow waves, turbulence and general flow speed, can be directly identified. Vertical temperature gradients and near-ground inversions can be visualised very well. Time-lapse movies will be presented. The extensive data collection permits a higher spatial resolution than standard methods, so that cold air flow attributes can be explored in much more detail. Time series are extracted from the IR data series, analysed statistically, and compared to data obtained using traditional systems.
Finally, we assess the usefulness of the additional measurement of cold air flow with thermal imaging systems.
Cutler, C; Giri, S; Jeyapalan, S; Paniagua, D; Viswanathan, A; Antin, J H
2001-08-15
Controversy exists as to whether the incidence of graft-versus-host disease (GVHD) is increased after peripheral-blood stem-cell transplantation (PBSCT) when compared with bone marrow transplantation (BMT). We performed a meta-analysis of all trials comparing the incidence of acute and chronic GVHD after PBSCT and BMT reported as of June 2000. Secondary analyses examined relapse rates after the two procedures. An extensive MEDLINE search of the literature was undertaken. Primary authors were contacted for clarification and completion of missing information. A review of cited references was also undertaken. Sixteen studies (five randomized controlled trials and 11 cohort studies) were included in this analysis. Data were extracted by two pairs of reviewers and analyzed for the outcomes of interest. Meta-analyses, regression analyses, and assessments of publication bias were performed. Using a random effects model, the pooled relative risk (RR) for acute GVHD after PBSCT was 1.16 (95% confidence interval [CI], 1.04 to 1.28; P=.006) when compared with traditional BMT. The pooled RR for chronic GVHD after PBSCT was 1.53 (95% CI, 1.25 to 1.88; P <.001) when compared with BMT. The RR of developing clinically extensive chronic GVHD was 1.66 (95% CI, 1.35 to 2.05; P <.001). The excess risk of chronic GVHD was explained by differences in the T-cell dose delivered with the graft in a meta-regression model that did not reach statistical significance. There was a trend towards a decrease in the rate of relapse after PBSCT (RR = 0.81; 95% CI, 0.62 to 1.05). Both acute and chronic GVHD are more common after PBSCT than BMT, and this may be associated with lower rates of malignant relapse. The magnitude of the transfused T-cell load may explain the differences in chronic GVHD risk.
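The random-effects pooling used in such meta-analyses typically follows the DerSimonian-Laird method. The sketch below is illustrative only (the study RRs and standard errors are made up, not the paper's data): it pools log relative risks, builds a 95% CI, and reports an I^2 heterogeneity estimate.

```python
import math

def pooled_rr_dl(rrs, ses):
    """DerSimonian-Laird random-effects pooling of relative risks.
    rrs: per-study relative risks; ses: standard errors of log(RR)."""
    y = [math.log(r) for r in rrs]
    w = [1.0 / s ** 2 for s in ses]
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)             # between-study variance
    wr = [1.0 / (s ** 2 + tau2) for s in ses]  # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    ci = (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se))
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0   # I^2 heterogeneity
    return math.exp(mu), ci, i2

# Hypothetical studies (illustrative values, not data from this paper).
rr, ci, i2 = pooled_rr_dl([1.20, 1.10, 1.35], [0.10, 0.15, 0.12])
```

With tau2 forced to zero this reduces to the fixed-effect (inverse-variance) estimate, which is the usual sanity check for an implementation.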
Applications of quantum entropy to statistics
NASA Astrophysics Data System (ADS)
Silver, R. N.; Martz, H. F.
This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods.
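A classic finite instance of the ME principle, where the Lagrange multiplier enforcing a constraint is found numerically, is recovering a die's face distribution from its mean alone. This is the standard textbook example, not the quantum-entropy machinery of the paper: the maximum-entropy solution is p_i proportional to exp(-lam * i), with lam chosen by bisection so the constraint holds.

```python
import math

def maxent_die(target_mean, faces=6, tol=1e-12):
    """Maximum-entropy distribution over faces 1..faces with a fixed mean:
    p_i proportional to exp(-lam * i); lam is the Lagrange multiplier."""
    def mean_for(lam):
        ws = [math.exp(-lam * i) for i in range(1, faces + 1)]
        z = sum(ws)
        return sum(i * w for i, w in zip(range(1, faces + 1), ws)) / z
    lo, hi = -50.0, 50.0          # mean_for is decreasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid              # mean too high: need larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    ws = [math.exp(-lam * i) for i in range(1, faces + 1)]
    z = sum(ws)
    return [w / z for w in ws]

p35 = maxent_die(3.5)   # unconstrained mean: uniform distribution
p45 = maxent_die(4.5)   # mean above 3.5: weight tilts to high faces
```

The same pattern, multipliers tuned until constraints are met, is what the abstract's "hyperparameters as Lagrange multipliers" refers to, in a far more general setting.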
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
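The core problem is easy to demonstrate numerically: when repeated measures from the same host are pooled and the host effect is ignored, the residuals are strongly autocorrelated, violating the independence assumption of simple models. A hedged sketch with made-up host baselines (the numbers are illustrative, not parasitological data):

```python
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a series around its mean."""
    m = sum(x) / len(x)
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(len(x) - 1))
    den = sum((xi - m) ** 2 for xi in x)
    return num / den

# Five hosts, 20 repeated measures each; the fixed host baselines play
# the role of random effects that a naive pooled analysis ignores.
rng = random.Random(42)
baselines = [-4.0, -2.0, 0.0, 2.0, 4.0]
series = []
for b in baselines:
    series += [b + rng.gauss(0, 0.5) for _ in range(20)]
grand_mean = sum(series) / len(series)
residuals = [v - grand_mean for v in series]
rho = lag1_autocorr(residuals)   # far from 0: independence is violated
```

A mixed effects model removes this correlation by fitting a per-host intercept, which is precisely the remedy the article advocates.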
Raymond L. Czaplewski
2015-01-01
Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of...
Combescure, Christophe; Foucher, Yohann; Jackson, Daniel
2014-07-10
In epidemiologic studies and clinical trials with time-dependent outcome (for instance death or disease progression), survival curves are used to describe the risk of the event over time. In meta-analyses of studies reporting a survival curve, the most informative finding is a summary survival curve. In this paper, we propose a method to obtain a distribution-free summary survival curve by expanding the product-limit estimator of survival for aggregated survival data. The extension of DerSimonian and Laird's methodology for multiple outcomes is applied to account for the between-study heterogeneity. The I^2 and H^2 statistics are used to quantify the impact of the heterogeneity in the published survival curves. A statistical test for between-strata comparison is proposed, with the aim to explore study-level factors potentially associated with survival. The performance of the proposed approach is evaluated in a simulation study. Our approach is also applied to synthesize the survival of untreated patients with hepatocellular carcinoma from aggregate data of 27 studies and synthesize the graft survival of kidney transplant recipients from individual data from six hospitals. Copyright © 2014 John Wiley & Sons, Ltd.
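The product-limit estimator that the method expands is itself simple to state. A minimal sketch for individual (not aggregated) data follows; the times and censoring indicators are illustrative, and tie handling is deliberately naive.

```python
def product_limit(times, events):
    """Product-limit (Kaplan-Meier) survival estimate for individual data.
    times: observation times; events: 1 = event occurred, 0 = censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s, curve = 1.0, []
    for i in order:
        if events[i]:                  # event: survival drops
            s *= 1.0 - 1.0 / at_risk
            curve.append((times[i], s))
        at_risk -= 1                   # event or censoring leaves risk set
    return curve

# Illustrative data: four subjects, one censored at t = 3.
curve = product_limit([1, 2, 3, 4], [1, 1, 0, 1])
```

The paper's contribution is to rebuild this estimator from aggregated (published-curve) data and then pool the resulting curves across studies.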
A model-based approach to wildland fire reconstruction using sediment charcoal records
Itter, Malcolm S.; Finley, Andrew O.; Hooten, Mevin B.; Higuera, Philip E.; Marlon, Jennifer R.; Kelly, Ryan; McLachlan, Jason S.
2017-01-01
Lake sediment charcoal records are used in paleoecological analyses to reconstruct fire history, including the identification of past wildland fires. One challenge of applying sediment charcoal records to infer fire history is the separation of charcoal associated with local fire occurrence and charcoal originating from regional fire activity. Despite a variety of methods to identify local fires from sediment charcoal records, an integrated statistical framework for fire reconstruction is lacking. We develop a Bayesian point process model to estimate the probability of fire associated with charcoal counts from individual-lake sediments and estimate mean fire return intervals. A multivariate extension of the model combines records from multiple lakes to reduce uncertainty in local fire identification and estimate a regional mean fire return interval. The univariate and multivariate models are applied to 13 lakes in the Yukon Flats region of Alaska. Both models resulted in similar mean fire return intervals (100–350 years) with reduced uncertainty under the multivariate model due to improved estimation of regional charcoal deposition. The point process model offers an integrated statistical framework for paleofire reconstruction and extends existing methods to infer regional fire history from multiple lake records with uncertainty following directly from posterior distributions.
Language experience changes subsequent learning.
Onnis, Luca; Thiessen, Erik
2013-02-01
What are the effects of experience on subsequent learning? We explored the effects of language-specific word order knowledge on the acquisition of sequential conditional information. Korean and English adults were engaged in a sequence learning task involving three different sets of stimuli: auditory linguistic (nonsense syllables), visual non-linguistic (nonsense shapes), and auditory non-linguistic (pure tones). The forward and backward probabilities between adjacent elements generated two equally probable and orthogonal perceptual parses of the elements, such that any significant preference at test must be due to either general cognitive biases, or prior language-induced biases. We found that language modulated parsing preferences with the linguistic stimuli only. Intriguingly, these preferences are congruent with the dominant word order patterns of each language, as corroborated by corpus analyses, and are driven by probabilistic preferences. Furthermore, although the Korean individuals had received extensive formal explicit training in English and lived in an English-speaking environment, they exhibited statistical learning biases congruent with their native language. Our findings suggest that mechanisms of statistical sequential learning are implicated in language across the lifespan, and experience with language may affect cognitive processes and later learning. Copyright © 2012 Elsevier B.V. All rights reserved.
Ronza, A; Vílchez, J A; Casal, J
2007-07-19
Risk assessment of hazardous material spill scenarios, and quantitative risk assessment in particular, make use of event trees to account for the possible outcomes of hazardous releases. Using event trees entails the definition of probabilities of occurrence for events such as spill ignition and blast formation. This study comprises an extensive analysis of ignition and explosion probability data proposed in previous work. Subsequently, the results of the survey of two vast US federal spill databases (HMIRS, by the Department of Transportation, and MINMOD, by the US Coast Guard) are reported and commented on. Some tens of thousands of records of hydrocarbon spills were analysed. The general pattern of statistical ignition and explosion probabilities as a function of the amount and the substance spilled is discussed. Equations are proposed based on statistical data that predict the ignition probability of hydrocarbon spills as a function of the amount and the substance spilled. Explosion probabilities are put forth as well. Two sets of probability data are proposed: it is suggested that figures deduced from HMIRS be used in land transportation risk assessment, and MINMOD results with maritime scenarios assessment. Results are discussed and compared with previous technical literature.
Fujita, Masaru; Diab, Mohammad; Xu, Zheng; Puttlitz, Christian M
2006-09-01
An in vitro biomechanical calf thoracic spine study. To evaluate the biomechanical stability of sublaminar and subtransverse process fixation using stainless steel wires and ultra-high molecular weight polyethylene (UHMWPE) cables. It is commonly held that transverse process fixation provides less stability than sublaminar fixation. To our knowledge, this is the first biomechanical study to compare the stability afforded by sublaminar fixation and subtransverse process fixation using metal wire and UHMWPE cable before and after cyclic loading. There were 6 fresh-frozen calf thoracic spines (T4-T9) used to determine the sublaminar fixation stiffness and subtransverse process fixation stiffness in each group. Double strands of 18-gauge stainless steel wire, 3 and 5 mm-width UHMWPE cable (Nesplon; Alfresa, Inc., Osaka, Japan) were applied to each spine. Cyclic pure flexion-extension moment loading (2 Nm, 0.5 Hz, 5000 cycles) was applied after the initial stability was analyzed by measuring the range of motion. Statistical analyses were used to delineate differences between the various experimental groups. Subtransverse process wiring was more stable than sublaminar wiring after cyclic loading in flexion-extension (P < 0.05). There were no significant differences between each group in lateral bending and axial rotation after cyclic loading. Sublaminar stainless steel wiring was more stable than sublaminar 3 and 5-mm cable before and after cyclic loading in axial rotation (P < 0.01). Acute subtransverse process fixation using 3-mm cable was less stable after cyclic loading in axial rotation (P < 0.05). All other groups did not produce statistically significant differences. Subtransverse process fixation provides at least as much stability as sublaminar fixation. A 5-mm UHMWPE cable and stainless steel wire result in equivalent sublaminar and subtransverse process stability.
A Uniformly Selected Sample of Low-mass Black Holes in Seyfert 1 Galaxies. II. The SDSS DR7 Sample
NASA Astrophysics Data System (ADS)
Liu, He-Yang; Yuan, Weimin; Dong, Xiao-Bo; Zhou, Hongyan; Liu, Wen-Juan
2018-04-01
A new sample of 204 low-mass black holes (LMBHs) in active galactic nuclei (AGNs) is presented with black hole masses in the range of (1–20) × 10^5 M_⊙. The AGNs are selected through a systematic search among galaxies in the Seventh Data Release (DR7) of the Sloan Digital Sky Survey (SDSS), and careful analyses of their optical spectra and precise measurement of spectral parameters. Combining them with our previous sample selected from SDSS DR4 makes it the largest LMBH sample so far, totaling over 500 objects. Some of the statistical properties of the combined LMBH AGN sample are briefly discussed in the context of exploring the low-mass end of the AGN population. Their X-ray luminosities follow the extension of the previously known correlation with the [O III] luminosity. The effective optical-to-X-ray spectral indices α_OX, albeit with a large scatter, are broadly consistent with the extension of the relation with the near-UV luminosity L_2500 Å. Interestingly, a correlation of α_OX with black hole mass is also found, with α_OX being statistically flatter (stronger X-ray relative to optical) for lower black hole masses. Only 26 objects, mostly radio loud, were detected in radio at 20 cm in the FIRST survey, giving a radio-loud fraction of 4%. The host galaxies of LMBHs have stellar masses in the range of 10^8.8–10^12.4 M_⊙ and optical colors typical of Sbc spirals. They are dominated by young stellar populations that seem to have undergone a continuous star formation history.
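The optical-to-X-ray index follows the conventional two-point definition. A small sketch (the luminosity values below are placeholders, not measurements; the familiar constant 0.3838 is 1/log10 of the frequency ratio between 2 keV and 2500 Å):

```python
import math

def alpha_ox(l_2kev, l_2500):
    """Two-point optical-to-X-ray spectral index:
    alpha_OX = 0.3838 * log10(L_2keV / L_2500A),
    for monochromatic luminosities per unit frequency."""
    return 0.3838 * math.log10(l_2kev / l_2500)

# Placeholder monochromatic luminosities (erg/s/Hz), not measured values.
a = alpha_ox(3.0e24, 1.0e27)
```

Because L_2keV is usually far below L_2500 in these units, alpha_OX is typically negative; "flatter" in the abstract means less negative (relatively stronger X-ray emission).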
Seeling, Patrick; Reisslein, Martin
2014-01-01
Video encoding for multimedia services over communication networks has significantly advanced in recent years with the development of the highly efficient and flexible H.264/AVC video coding standard and its SVC extension. The emerging H.265/HEVC video coding standard as well as 3D video coding further advance video coding for multimedia communications. This paper first gives an overview of these new video coding standards and then examines their implications for multimedia communications by studying the traffic characteristics of long videos encoded with the new coding standards. We review video coding advances from MPEG-2 and MPEG-4 Part 2 to H.264/AVC and its SVC and MVC extensions as well as H.265/HEVC. For single-layer (nonscalable) video, we compare H.265/HEVC and H.264/AVC in terms of video traffic and statistical multiplexing characteristics. Our study is the first to examine the H.265/HEVC traffic variability for long videos. We also illustrate the video traffic characteristics and statistical multiplexing of scalable video encoded with the SVC extension of H.264/AVC as well as 3D video encoded with the MVC extension of H.264/AVC.
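The statistical-multiplexing effect such traffic studies examine can be illustrated with synthetic traces: aggregating independent variable-bit-rate streams sharply reduces the coefficient of variation of the total traffic. A sketch with exponential frame sizes as stand-ins for encoded video (the distribution and parameters are assumptions, not measured H.265/HEVC traces):

```python
import random
from statistics import mean, pstdev

def cov(trace):
    """Coefficient of variation (std / mean) of a frame-size trace."""
    return pstdev(trace) / mean(trace)

# Synthetic, highly variable frame sizes standing in for encoded video;
# 16 independent streams are multiplexed frame-slot by frame-slot.
rng = random.Random(1)
streams = [[rng.expovariate(1 / 5000.0) for _ in range(2000)]
           for _ in range(16)]
aggregate = [sum(slot) for slot in zip(*streams)]
single = cov(streams[0])     # variability of one stream (about 1 here)
muxed = cov(aggregate)       # smoothed by aggregation
```

For independent streams the CoV of the aggregate shrinks roughly as 1/sqrt(N), which is why highly variable encodings can still multiplex efficiently.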
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-22
... (NASS) to request revision and extension of a currently approved information collection, the Cotton... from David Hancock, NASS Clearance Officer, at (202) 690-2388. SUPPLEMENTARY INFORMATION: Title: Cotton... conduct the Census of Agriculture. The Cotton Ginning surveys provide cotton ginning statistics from...
Artificial Neural Networks in Policy Research: A Current Assessment.
ERIC Educational Resources Information Center
Woelfel, Joseph
1993-01-01
Suggests that artificial neural networks (ANNs) exhibit properties that promise usefulness for policy researchers. Notes that ANNs have found extensive use in areas once reserved for multivariate statistical programs such as regression and multiple classification analysis and are developing an extensive community of advocates for processing text…
1992-10-01
Table listings (fragment): Summary statistics (N=8) and results of 44 statistical analyses for impact tests performed on the forefoot of unworn footwear (Table A-2); summary statistics and results of statistical analyses for impact tests performed on the forefoot of worn footwear; summary statistics (N=4) and results of 76 statistical analyses for impact tests (Table B-2). … used tests to assess heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978). Later, the number of tests was…
Effect Size Analyses of Souvenaid in Patients with Alzheimer's Disease.
Cummings, Jeffrey; Scheltens, Philip; McKeith, Ian; Blesa, Rafael; Harrison, John E; Bertolucci, Paulo H F; Rockwood, Kenneth; Wilkinson, David; Wijker, Wouter; Bennett, David A; Shah, Raj C
2017-01-01
Souvenaid® (uridine monophosphate, docosahexaenoic acid, eicosapentaenoic acid, choline, phospholipids, folic acid, vitamins B12, B6, C, and E, and selenium), was developed to support the formation and function of neuronal membranes. To determine effect sizes observed in clinical trials of Souvenaid and to calculate the number needed to treat to show benefit or harm. Data from all three reported randomized controlled trials of Souvenaid in Alzheimer's disease (AD) dementia (Souvenir I, Souvenir II, and S-Connect) and an open-label extension study were included in analyses of effect size for cognitive, functional, and behavioral outcomes. Effect size was determined by calculating Cohen's d statistic (or Cramér's V method for nominal data), number needed to treat and number needed to harm. Statistical calculations were performed for the intent-to-treat populations. In patients with mild AD, effect sizes were 0.21 (95% confidence intervals: -0.06, 0.49) for the primary outcome in Souvenir II (neuropsychological test battery memory z-score) and 0.20 (0.10, 0.34) for the co-primary outcome of Souvenir I (Wechsler memory scale delayed recall). No effect was shown on cognition in patients with mild-to-moderate AD (S-Connect). The number needed to treat (6 and 21 for Souvenir I and II, respectively) and high number needed to harm values indicate a favorable harm:benefit ratio for Souvenaid versus control in patients with mild AD. The favorable safety profile and impact on outcome measures converge to corroborate the putative mode of action and demonstrate that Souvenaid can achieve clinically detectable effects in patients with early AD.
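The effect-size machinery used here is standard. A minimal sketch of Cohen's d with a pooled standard deviation, and of the number needed to treat, follows; the example numbers in the test are hypothetical, not trial data.

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d for two groups using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def nnt(risk_treated, risk_control):
    """Number needed to treat: reciprocal of the absolute risk difference."""
    return 1.0 / abs(risk_treated - risk_control)
```

An effect size of 0.2 is conventionally "small", which is why the abstract pairs it with NNT values to convey clinical relevance.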
Brorsson, C.; Hansen, N. T.; Lage, K.; Bergholdt, R.; Brunak, S.; Pociot, F.
2009-01-01
Aim: To develop novel methods for identifying new genes that contribute to the risk of developing type 1 diabetes within the Major Histocompatibility Complex (MHC) region on chromosome 6, independently of the known linkage disequilibrium (LD) between the human leucocyte antigen (HLA)-DRB1, -DQA1 and -DQB1 genes. Methods: We have developed a novel method that combines single nucleotide polymorphism (SNP) genotyping data with protein–protein interaction (ppi) networks to identify disease-associated network modules enriched for proteins encoded from the MHC region. Approximately 2500 SNPs located in the 4 Mb MHC region were analysed in 1000 affected offspring trios generated by the Type 1 Diabetes Genetics Consortium (T1DGC). The most associated SNP in each gene was chosen and genes were mapped to ppi networks for identification of interaction partners. The association testing and resulting interacting protein modules were statistically evaluated using permutation. Results: A total of 151 genes could be mapped to nodes within the protein interaction network and their interaction partners were identified. Five protein interaction modules reached statistical significance using this approach. The identified proteins are well known in the pathogenesis of T1D, but the modules also contain additional candidates that have been implicated in β-cell development and diabetic complications. Conclusions: The extensive LD within the MHC region makes it important to develop new methods for analysing genotyping data for identification of additional risk genes for T1D. Combining genetic data with knowledge about functional pathways provides new insight into mechanisms underlying T1D. PMID:19143816
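The permutation evaluation used above has a compact generic form. A minimal sketch for a difference-in-means statistic with label shuffling (the one-sided direction and the add-one correction are common conventions, not taken from this paper):

```python
import random

def permutation_p_value(group_a, group_b, n_perm=2000, seed=0):
    """One-sided permutation p-value for mean(a) - mean(b): pool the samples,
    reshuffle the group labels, and count how often the permuted difference
    is at least as large as the observed one."""
    rng = random.Random(seed)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    observed = sum(group_a) / n_a - sum(group_b) / len(group_b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a)
        if diff >= observed:
            hits += 1
    # Add-one correction keeps the estimated p-value strictly positive.
    return (hits + 1) / (n_perm + 1)
```

Clearly separated groups yield a small p-value; identical groups yield one near 0.5 for this one-sided statistic.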
Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E
2014-02-01
Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality in recent years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes during the years in reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (31.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality is considered to range from low to medium. Although the number of MAs of medium and high level seems lately to rise, several other aspects need improvement to increase their overall quality.
2011-01-01
Background: Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I² statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed that are based on a 'generalised' Q statistic. Methods: We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results: Differing results were obtained when the standard Q and I² statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions: Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
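The standard Q and I² statistics discussed above have simple closed forms. A hedged sketch assuming inverse-variance weights and the usual I² = max(0, (Q − df)/Q) definition; the numbers in the comments are invented:

```python
def cochran_q(effects, variances):
    """Cochran's Q: inverse-variance weighted squared deviations of the study
    effects about the fixed-effect (pooled) mean."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    return sum(w * (y - pooled) ** 2 for w, y in zip(weights, effects))

def i_squared(q, k):
    """I^2 = max(0, (Q - df) / Q) with df = k - 1 for k studies."""
    df = k - 1
    return max(0.0, (q - df) / q) if q > 0 else 0.0

# Three invented studies at 0.1, 0.3, 0.5 with equal variance 0.25:
# Q = 0.32 on 2 df, so I^2 truncates to 0 (no excess heterogeneity).
```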
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. 
The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies.
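The a priori power analyses the review calls for are straightforward for a two-group comparison. A sketch using the normal approximation (the exact t-based answer is slightly larger; `n_per_group` is an illustrative name, not from the paper):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample t test:
    n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2, rounded up."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # 0.84 for 80% power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Detecting a medium effect (d = 0.5) at 80% power needs roughly 63 per group
# under this approximation; a small effect (d = 0.2) needs roughly 393.
```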
Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard
2013-09-06
Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, the standard t test, the moderated t test (also known as limma), and rank products, for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets.
This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
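A hedged sketch of the rank product idea adapted to missing values, assuming `None` marks a missing measurement, larger values mean stronger up-regulation, and the geometric mean is taken only over the replicates in which a feature was observed. This mirrors the adaptation described above, not the authors' R implementation:

```python
import math

def rank_products(data):
    """Rank product per feature across replicates, skipping missing values:
    RP_g = (product of per-replicate ranks)^(1/k_g), where k_g counts the
    replicates in which feature g was observed. Rank 1 = largest value, so a
    small RP flags a consistently top-ranked (up-regulated) feature. Assumes
    each feature is observed in at least one replicate."""
    n_features = len(data)
    n_reps = len(data[0])
    ranks = [[None] * n_reps for _ in range(n_features)]
    for j in range(n_reps):
        # Rank each replicate column separately, ignoring missing entries.
        observed = [(data[i][j], i) for i in range(n_features) if data[i][j] is not None]
        observed.sort(reverse=True)
        for r, (_, i) in enumerate(observed, start=1):
            ranks[i][j] = r
    result = []
    for i in range(n_features):
        rs = [r for r in ranks[i] if r is not None]
        result.append(math.prod(rs) ** (1.0 / len(rs)))
    return result
```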
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Malandraki, O.; Khabarova, O.; Livadiotis, G.; Pavlos, E.; Karakatsanis, L. P.; Iliopoulos, A. C.; Parisis, K.
2017-12-01
In this work we study the non-extensivity of solar wind space plasma using electric and magnetic field data obtained by in situ spacecraft observations at different dynamical states of the solar wind system, especially in interplanetary coronal mass ejections (ICMEs), interplanetary shocks, magnetic islands, and near the Earth's bow shock. In particular, we study the non-extensive fractional acceleration mechanism of energetic particles, which produces kappa distributions, as well as the intermittent turbulence mechanism producing multifractal structures related to the Tsallis q-entropy principle. We present new and significant results concerning the dynamics of ICMEs observed in the near-Earth solar wind environment at L1, their effects on Earth's magnetosphere, and magnetic islands. In situ measurements of energetic particles at L1 are analyzed in response to major solar eruptive events at the Sun (intense flares, fast CMEs). The statistical characteristics are obtained and compared for the solar energetic particles (SEPs) originating at the Sun, the energetic particle enhancements associated with local acceleration during the CME-driven shock passage over the spacecraft (energetic storm particle events, ESPs), and the energetic particle signatures observed during the passage of the ICME. The results are interpreted in terms of Tsallis non-extensive statistics, in particular the estimation of the Tsallis q-triplet (qstat, qsen, qrel) of electric and magnetic field time series and the kappa distributions of solar energetic particles during the ICME and magnetic islands resulting from solar eruptive activity or internal solar wind dynamics. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the magnetic field dynamics in both the time and space domains during the shock event, in terms of the rate of entropy production, relaxation dynamics, and non-equilibrium metastable stationary states.
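For reference, the q-entropy underlying the Tsallis statistics used here has a compact standard form. A minimal sketch of the textbook definition, not tied to the authors' data pipeline:

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis q-entropy S_q = (1 - sum_i p_i^q) / (q - 1), with k_B = 1.
    As q -> 1 it recovers the Boltzmann-Gibbs-Shannon entropy."""
    if q == 1.0:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

# For a uniform two-state distribution: S_2 = 1 - (0.25 + 0.25) = 0.5,
# while S_1 is the Shannon value ln 2.
```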
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
2016-03-17
Reserve Component Members: Statistical Methodology Report. Defense Research, Surveys, and Statistics Center (RSSC), Defense Manpower Data Center… The Defense Manpower Data Center (DMDC) is indebted to numerous people for their assistance with the 2015 Workplace and Gender Relations Survey of Reserve… Outcomes were modeled as a function of an extensive set of administrative variables available for both respondents and nonrespondents, resulting in six…
Aero-Optics Measurement System for the AEDC Aero-Optics Test Facility
1991-02-01
Pulse Energy Statistics, 150 Pulses… (AEDC-TR-90-20, Appendix A: Optical Performance of Heated Windows)… hypersonic wind tunnel, where the requisite extensive statistical database can be developed in a cost- and time-effective manner. Ground testing… At the present time at AEDC, measured AO parameter statistics are derived from sets of image-spot recordings, with a set containing as many as 150…
2015-04-27
…corporation). These tools were fully installed and operational. We have synthesized carbon materials from waste biomass using these two high-temperature reactors, and we have extensively used a Raman spectrometer to analyse the as-synthesized carbon materials.
Statistical Optimality in Multipartite Ranking and Ordinal Regression.
Uematsu, Kazuki; Lee, Yoonkyung
2015-05-01
Statistical optimality in multipartite ranking is investigated as an extension of bipartite ranking. We consider the optimality of ranking algorithms through minimization of the theoretical risk, which combines pairwise ranking errors of ordinal categories with differential ranking costs. The extension shows that for a certain class of convex loss functions, including exponential loss, the optimal ranking function can be represented as a ratio of the weighted conditional probability of upper categories to lower categories, where the weights are given by the misranking costs. This result also bridges traditional ranking methods, such as the proportional odds model in statistics, with various ranking algorithms in machine learning. Further, the analysis of multipartite ranking with different costs provides a new perspective on non-smooth list-wise ranking measures such as the discounted cumulative gain, and on preference learning. We illustrate our findings with a simulation study and real data analysis.
MANCOVA for one way classification with homogeneity of regression coefficient vectors
NASA Astrophysics Data System (ADS)
Mokesh Rayalu, G.; Ravisankar, J.; Mythili, G. Y.
2017-11-01
The MANOVA and MANCOVA are the extensions of the univariate ANOVA and ANCOVA techniques to multidimensional or vector-valued observations. The assumption of a Gaussian distribution is replaced with the multivariate Gaussian distribution for the data vectors and residual terms in the statistical models of these techniques. The objective of MANCOVA is to determine whether there are statistically reliable mean differences between groups after adjusting for the covariates. When randomized assignment of samples or subjects to groups is not possible, multivariate analysis of covariance (MANCOVA) provides statistical matching of groups by adjusting dependent variables as if all subjects scored the same on the covariates. In this research article, the MANCOVA technique is extended to a larger number of covariates, and the homogeneity of the regression coefficient vectors is also tested.
The prior statistics of object colors.
Koenderink, Jan J
2010-02-01
The prior statistics of object colors is of much interest because extensive statistical investigations of reflectance spectra reveal highly non-uniform structure in color space common to several very different databases. This common structure is due to the visual system rather than to the statistics of environmental structure. Analysis involves an investigation of the proper sample space of spectral reflectance factors and of the statistical consequences of the projection of spectral reflectances on the color solid. Even in the case of reflectance statistics that are translationally invariant with respect to the wavelength dimension, the statistics of object colors is highly non-uniform. The qualitative nature of this non-uniformity is due to trichromacy.
40 CFR 91.512 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...
Kulmala, Jenni; Sipilä, Sarianna; Tiainen, Kristina; Pärssinen, Olavi; Koskenvuo, Markku; Kaprio, Jaakko; Rantanen, Taina
2012-10-01
Vision problems are common experiences within the older population. This study aimed to examine the association between vision and lower extremity impairment. 434 women aged 63-75 participated in visual acuity (VA) measurements at baseline and 313 persons at three-year follow-up. Measurements of lower extremity function included maximal isometric knee extension strength, leg extension power, maximal walking speed and standing balance. At baseline, knee extension strength was lower among participants with visual impairment (VI) (273.2±6.4 N) compared to those with good vision (306.5±5.9 N, p<0.001) as well as leg extension power (95.2±2.7 W vs 104.2±2.6 W, p=0.009) and maximal walking speed (1.6±0.02 m/s vs 1.8±0.03 m/s, p<0.001). Higher velocity moment among persons with VI (53.5±2.7 mm²/s vs 42.7±1.4 mm²/s, p<0.001) indicated that persons with VI had poorer balance compared to persons with good vision. Decreased isometric knee extension strength (OR 1.26, 95% CI 1.09-1.45), poorer standing balance (OR 1.16, 95% CI 1.00-1.35) as well as lower maximal walking speed (OR 1.34, 95% CI 1.13-1.59) were associated with VI in the logistic regression models. Additionally, the association between poorer leg extension power and VI (OR 1.14, 95% CI 0.99-1.31) was of borderline statistical significance. In longitudinal analyses, VI did not predict decline in lower extremity function. Lower extremity impairment was associated with VI among relatively healthy older women. However, change in lower extremity function was quite similar between the vision groups. It is possible that decreased VA may be a marker of underlying systemic factors or the aging process, which lead to poorer functional capacity, or there may be shared background factors, which lead to decreased vision and lower extremity impairment.
Lin, Tzu-Kang; Tsai, Hong-Chieh; Hsieh, Tsung-Che
2012-07-01
To clarify the clinical role of traumatic subarachnoid hemorrhage (tSAH), stratified analysis with grouping of tSAH was performed. Their blood flow changes and correlations with outcome were assayed. One hundred seventeen tSAH patients were classified into several groups according to their initial computerized tomography scans. Group I included patients with tSAH only in the posterior interhemispheric fissure, whereas Group II contained patients with tSAH located elsewhere. Group II was further subdivided into IIa, little SAH; IIb, extensive SAH; IIc, little SAH with intraventricular hemorrhage (IVH); and IId, extensive SAH with IVH. The cerebral blood flow velocity was monitored using transcranial Doppler sonography (TCD). Both age and initial coma scale were independent predictors of poor outcome. The poor outcome rates in various subgroups of tSAH increased stepwise from group I to group IId (I, 7.4%; IIa, 18.4%; IIb, 33.3%; IIc, 62.5%; and IId, 90.9%) (p = 0.0010). Stratified analyses revealed that patients with extensive tSAH (group IIb + IId) were more likely to have unfavorable outcomes (47.7%) than patients with little tSAH (group IIa + IIc) (26.1%) (p = 0.0185); patients with IVH (group IIc + IId) also displayed a higher incidence (78.9%) of poor outcomes than patients without IVH (group IIa + IIb) (25.4%) (p = 0.0030). TCD study demonstrated that patients with extensive tSAH (group IIb + IId) were more likely to have the vasospasm based on TCD criteria than did patients in group I and group IIa + IIc (37.5% vs. 5.9% and 7.7%, p = 0.0105). Notably, there was a tendency of worse outcome in patients with vasospasm on the basis of TCD-derived criteria than those without, with the unfavorable outcome rates being 47.4% and 24.7% (p = 0.0799). Age, initial coma scale, extensive tSAH, and IVH are independent predictors of poor outcome in the cohort of tSAH patients. Statistically, patients with extensive tSAH are significantly more likely to have vasospasm.
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-05-25
High-quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China for the first decade of the new millennium. Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion of study design also decreased (χ² = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, randomized clinical trial designs remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309), and interpretation (χ² = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted.
Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.
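The year-to-year comparisons of defect proportions above are Pearson chi-squared tests on 2×2 tables. A minimal sketch with invented counts (not the paper's data):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-squared statistic for the 2x2 table [[a, b], [c, d]],
    e.g. (defective, non-defective) article counts in two publication years."""
    n = a + b + c + d
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

# Invented counts: 10/30 defective articles in year one vs 20/30 in year two.
stat = chi_square_2x2(10, 20, 20, 10)
```

The statistic is then compared against the chi-squared distribution with 1 degree of freedom to obtain the p-value.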
ERIC Educational Resources Information Center
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
Weighing Evidence "Steampunk" Style via the Meta-Analyser.
Bowden, Jack; Jackson, Chris
2016-10-01
The funnel plot is a graphical visualization of summary data estimates from a meta-analysis, and is a useful tool for detecting departures from the standard modeling assumptions. Although perhaps not widely appreciated, a simple extension of the funnel plot can help to facilitate an intuitive interpretation of the mathematics underlying a meta-analysis at a more fundamental level, by equating it to determining the center of mass of a physical system. We used this analogy to explain the concepts of weighing evidence and of biased evidence to a young audience at the Cambridge Science Festival, without recourse to precise definitions or statistical formulas and with a little help from Sherlock Holmes! Following on from the science fair, we have developed an interactive web-application (named the Meta-Analyser) to bring these ideas to a wider audience. We envisage that our application will be a useful tool for researchers when interpreting their data. First, to facilitate a simple understanding of fixed and random effects modeling approaches; second, to assess the importance of outliers; and third, to show the impact of adjusting for small study bias. This final aim is realized by introducing a novel graphical interpretation of the well-known method of Egger regression.
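The center-of-mass analogy described above corresponds exactly to the fixed-effect estimate: each study is a point mass sitting at its effect estimate, with mass equal to its inverse variance. A minimal sketch with invented numbers:

```python
def fixed_effect(effects, variances):
    """Inverse-variance weighted mean: the 'center of mass' of the studies,
    where study i sits at position effects[i] with mass 1/variances[i]."""
    weights = [1.0 / v for v in variances]
    return sum(w * y for w, y in zip(weights, effects)) / sum(weights)

# A precise study at 0.0 (variance 0.25, mass 4) outweighs an imprecise one
# at 1.0 (variance 1.0, mass 1): the balance point lands at 0.2.
estimate = fixed_effect([0.0, 1.0], [0.25, 1.0])
```

A random-effects analysis adds a between-study variance component to each study's variance, which evens out the masses and pulls the balance point toward the unweighted mean.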
Managing complex research datasets using electronic tools: A meta-analysis exemplar
Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.
2013-01-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256
Managing complex research datasets using electronic tools: a meta-analysis exemplar.
Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L
2013-06-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.
Crichton, Georgina E; Elias, Merrill F; Alkerwi, Ala'a
2016-05-01
Chocolate and cocoa flavanols have been associated with improvements in a range of health complaints dating from ancient times, and have established cardiovascular benefits. Less is known about the effects of chocolate on neurocognition and behaviour. The aim of this study was to investigate whether chocolate intake was associated with cognitive function, with adjustment for cardiovascular, lifestyle and dietary factors. Cross-sectional analyses were undertaken on 968 community-dwelling participants, aged 23-98 years, from the Maine-Syracuse Longitudinal Study (MSLS). Habitual chocolate intake was related to cognitive performance, measured with an extensive battery of neuropsychological tests. More frequent chocolate consumption was significantly associated with better performance on the Global Composite score, Visual-Spatial Memory and Organization, Working Memory, Scanning and Tracking, Abstract Reasoning, and the Mini-Mental State Examination. With the exception of Working Memory, these relations were not attenuated with statistical control for cardiovascular, lifestyle and dietary factors. Prospective analyses revealed no association between cognitive function and chocolate intake measured up to 18 years later. Further intervention trials and longitudinal studies are needed to explore relations between chocolate, cocoa flavanols and cognition, and the underlying causal mechanisms.
Experimental Investigation and Optimization of Response Variables in WEDM of Inconel - 718
NASA Astrophysics Data System (ADS)
Karidkar, S. S.; Dabade, U. A.
2016-02-01
Effective utilisation of Wire Electrical Discharge Machining (WEDM) technology is a challenge for modern manufacturing industries. Day by day, new materials with high strength and capability are being developed to fulfil customers' needs. Inconel-718 is one such material, extensively used in aerospace applications such as gas turbines, rocket motors, and spacecraft, as well as in nuclear reactors and pumps. This paper deals with the experimental investigation of optimal machining parameters in WEDM for surface roughness, kerf width and dimensional deviation using DoE, namely the Taguchi methodology with an L9 orthogonal array. By keeping peak current constant at 70 A, the effects of the other process parameters on the above response variables were analysed. The obtained experimental results were statistically analysed using Minitab-16 software. Analysis of variance (ANOVA) shows pulse-on time as the most influential parameter, followed by wire tension, whereas spark gap set voltage is observed to be non-influencing. The multi-objective optimization technique Grey Relational Analysis (GRA) yields optimal machining parameters of pulse-on time 108 machine units, spark gap set voltage 50 V and wire tension 12 g for the response variables considered in the experimental analysis.
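The Grey Relational Analysis step mentioned above can be sketched compactly. This is a generic GRA implementation under common conventions (min-max normalisation toward an ideal of 1.0, distinguishing coefficient ζ = 0.5), not the authors' exact procedure:

```python
def grey_relational_grades(matrix, larger_better, zeta=0.5):
    """Grey Relational Analysis sketch. Each row is an experiment, each column
    a response. Responses are min-max normalised per column (larger- or
    smaller-the-better; assumes every column has spread), then the grey
    relational coefficient (d_min + zeta*d_max) / (d_ij + zeta*d_max) is
    averaged across columns to give each experiment's grade. After
    normalisation d_min = 0 and d_max = 1, so the numerator is just zeta."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    norm = [[0.0] * n_cols for _ in range(n_rows)]
    for j in range(n_cols):
        col = [matrix[i][j] for i in range(n_rows)]
        lo, hi = min(col), max(col)
        for i in range(n_rows):
            x = (matrix[i][j] - lo) if larger_better[j] else (hi - matrix[i][j])
            norm[i][j] = x / (hi - lo)
    grades = []
    for i in range(n_rows):
        coeffs = [zeta / ((1.0 - norm[i][j]) + zeta) for j in range(n_cols)]
        grades.append(sum(coeffs) / n_cols)
    return grades
```

The experiment with the highest grade is taken as the best compromise across all responses.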
2012-01-01
We applied ITS2 primary and secondary structure analyses and Compensatory Base Change (CBC) analyses to new French and Spanish Dunaliella salina strains to investigate their phylogenetic position and taxonomic status within the genus Dunaliella. Our analyses show great diversity within D. salina (with only some clades not statistically supported) and reveal considerable genetic diversity and structure within Dunaliella, although the CBC analysis did not bolster the existence of different biological groups within this taxon. The ITS2 sequences of the new Spanish and French D. salina strains were very similar except for two of them: ITC5105 "Janubio" from Spain and ITC5119 from France. Although the Spanish strain had a unique ITS2 sequence profile and the phylogenetic tree indicates that it may represent a new species, this hypothesis was not confirmed by CBCs, and clarification of its taxonomic status requires further investigation with new data. Overall, the use of CBCs to define species boundaries within Dunaliella was not conclusive in some cases, and the ITS2 region does not contain a geographical signal overall. PMID:22520929
A comparative evaluation of semen parameters in pre- and post-Hurricane Katrina human population.
Baran, Caner; Hellstrom, Wayne J; Sikka, Suresh C
2015-01-01
A natural disaster leading to accumulation of environmental contaminants may have substantial effects on the male reproductive system. Our aim was to compare and assess semen parameters in a normospermic population residing in the southern Louisiana, USA, area pre- and post-Hurricane Katrina. We retrospectively evaluated semen analysis data (n = 3452) of 1855 patients who attended the Tulane University Andrology/Fertility Clinic between 1999 and 2013. The study inclusion criteria were men whose semen analyses showed ≥1.5 ml volume; ≥15 million ml⁻¹ sperm concentration; ≥39 million total sperm count; ≥40% motility; >30% morphology, with an abstinence interval of 2-7 days. After the inclusion criteria were applied to the population, 367 normospermic patients were included in the study. Descriptive statistics and group-based analyses were performed to interpret the differences between the pre-Katrina (Group 1, 1999-2005) and the post-Katrina (Group 2, 2006-2013) populations. There were significant differences in motility, morphology, white blood cell count, immature germ cell count, pH and presence of sperm agglutination, but surprisingly no significant differences in sperm count between the two populations. This long-term comparative analysis further documents that a major natural disaster with its accompanying environmental issues can influence certain semen parameters (e.g., motility and morphology) and, by extension, the fertility potential of the population of such areas.
Responding to nonwords in the lexical decision task: Insights from the English Lexicon Project.
Yap, Melvin J; Sibley, Daragh E; Balota, David A; Ratcliff, Roger; Rueckl, Jay
2015-05-01
Researchers have extensively documented how various statistical properties of words (e.g., word frequency) influence lexical processing. However, the impact of lexical variables on nonword decision-making performance is less clear. This gap is surprising, because a better specification of the mechanisms driving nonword responses may provide valuable insights into early lexical processes. In the present study, item-level and participant-level analyses were conducted on the trial-level lexical decision data for almost 37,000 nonwords in the English Lexicon Project in order to identify the influence of different psycholinguistic variables on nonword lexical decision performance and to explore individual differences in how participants respond to nonwords. Item-level regression analyses reveal that nonword response time was positively correlated with number of letters, number of orthographic neighbors, number of affixes, and base-word number of syllables, and negatively correlated with Levenshtein orthographic distance and base-word frequency. Participant-level analyses also point to within- and between-session stability in nonword responses across distinct sets of items, and intriguingly reveal that higher vocabulary knowledge is associated with less sensitivity to some dimensions (e.g., number of letters) but more sensitivity to others (e.g., base-word frequency). The present findings provide well-specified and interesting new constraints for informing models of word recognition and lexical decision. (c) 2015 APA, all rights reserved.
Extension of the statistical theory of resonating valence bonds to hyperelectronic metals
Kamb, Barclay; Pauling, Linus
1985-01-01
The statistical treatment of resonating covalent bonds in metals, previously applied to hypoelectronic metals, is extended to hyperelectronic metals and to metals with two kinds of bonds. The theory leads to half-integral values of the valence for hyperelectronic metallic elements. PMID:16593632
77 FR 15365 - Agency Information Collection Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-15
... you anticipate that you will be submitting comments, but find it difficult to do so within the period.... The mailing address is Office of Survey Development and Statistical Integration, (EI-21), Forrestal... Statistical Integration, (EI-21), Forrestal Building, U.S. Department of Energy, 1000 Independence Ave. SW...
Algorithm for Identifying Erroneous Rain-Gauge Readings
NASA Technical Reports Server (NTRS)
Rickman, Doug
2005-01-01
An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.
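The abstract does not give the algorithm's exact statistics; the following minimal sketch only illustrates the general idea of nonparametric outlier screening of gauge readings, using a leave-one-out median/MAD rule. The rule, the threshold, and the absence of spatial weighting are all illustrative assumptions, not the NASA algorithm.

```python
import numpy as np

def flag_outlier_gauges(readings, threshold=3.5):
    """Flag rain-gauge readings that are nonparametric outliers relative
    to the other gauges reporting the same event (illustrative sketch)."""
    x = np.asarray(readings, dtype=float)
    flags = np.zeros(x.size, dtype=bool)
    for i in range(x.size):
        others = np.delete(x, i)              # leave-one-out "neighbourhood"
        med = np.median(others)
        mad = np.median(np.abs(others - med)) # robust measure of spread
        if mad == 0:
            continue                          # all neighbours agree; skip
        robust_z = 0.6745 * (x[i] - med) / mad  # MAD-based robust z-score
        flags[i] = abs(robust_z) > threshold
    return flags

gauges = [12.0, 11.5, 13.1, 12.4, 55.0, 11.9]  # one suspicious reading
flags = flag_outlier_gauges(gauges)
```

A production version would restrict "others" to gauges within some radius and weight them by distance, reflecting the spatial structure the abstract emphasizes.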
NASA Astrophysics Data System (ADS)
Duman, T. Y.; Can, T.; Gokceoglu, C.; Nefeslioglu, H. A.; Sonmez, H.
2006-11-01
As a result of industrialization, cities throughout the world have been growing rapidly for the last century. One typical example of these growing cities is Istanbul, whose population is over 10 million. Due to rapid urbanization, new areas suitable for settlement and engineering structures are necessary. The Cekmece area, located west of the Istanbul metropolitan area, is studied because landslide activity is extensive there. The purpose of this study is to develop a model that can be used to characterize landslide susceptibility in map form using logistic regression analysis of an extensive landslide database. A database of landslide activity was constructed using both aerial photography and field studies. About 19.2% of the selected study area is covered by deep-seated landslides. The landslides that occur in the area are primarily located in sandstones with interbedded permeable and impermeable layers such as claystone, siltstone and mudstone. About 31.95% of the total landslide area is located in this unit. To apply logistic regression analyses, a data matrix including 37 variables was constructed. The variables used in the forward stepwise analyses are different measures of slope, aspect, elevation, stream power index (SPI), plan curvature, profile curvature, geology, geomorphology and relative permeability of lithological units. A total of 25 variables were identified as exerting strong influence on landslide occurrence and were included in the logistic regression equation. Wald statistic values indicate that lithology, SPI and slope are more important than the other parameters in the equation. The beta coefficients of the 25 variables included in the logistic regression equation provide a model for landslide susceptibility in the Cekmece area. This model is used to generate a landslide susceptibility map that correctly classified 83.8% of the landslide-prone areas.
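A susceptibility model of this kind can be sketched as follows. This is a minimal sketch: the predictors are simulated stand-ins for the real rasters, the "true" coefficients are hypothetical, and only three of the paper's 25 variables are represented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the strongest predictors named in the abstract:
# slope, stream power index (SPI), and a lithology indicator.
n = 2000
slope = rng.uniform(0, 40, n)            # degrees (hypothetical range)
spi = rng.uniform(0, 10, n)
weak_lithology = rng.integers(0, 2, n)   # 1 = landslide-prone unit

# Hypothetical "true" model used only to simulate presence/absence labels
logit = -6.0 + 0.12 * slope + 0.3 * spi + 1.5 * weak_lithology
landslide = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([np.ones(n), slope, spi, weak_lithology])
y = landslide.astype(float)

# Maximum-likelihood logistic regression fitted by Newton-Raphson
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)                      # weights from the logistic variance
    grad = X.T @ (y - p)
    hess = (X * W[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

# In practice the fitted probabilities would be mapped cell by cell
susceptibility = 1 / (1 + np.exp(-X @ beta))
```

The fitted beta coefficients play the role of the 25 coefficients the paper maps into a susceptibility surface.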
Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?
Li, Tianjing; Dickersin, Kay
2013-06-01
Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Gross, D. H. E.
2001-11-01
Phase transitions in nuclei, small atomic clusters and self-gravitating systems demand the extension of thermo-statistics to "Small" systems. The main obstacle is the thermodynamic limit. It is shown how the original definition of the entropy by Boltzmann as the volume of the energy manifold of the N-body phase space allows a geometrical definition of the entropy as a function of the conserved quantities. Without invoking the thermodynamic limit, the whole "zoo" of phase transitions and critical points/lines can be unambiguously defined. The relation to the Yang-Lee singularities of the grand-canonical partition sum is pointed out. It is shown that phase transitions in non-extensive systems give precisely the complete set of characteristic parameters of the transition, including the surface tension. Nuclear heavy-ion collisions are an experimental playground for exploring this extension of thermo-statistics.
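The Boltzmann definition referred to above can be written explicitly. This is a sketch of the standard form; the normalization constant is left open, since the abstract does not specify it.

```latex
% Entropy as the volume of the energy manifold of the N-body phase space,
% defined without any thermodynamic limit:
S(E, N) = k_B \ln W(E, N), \qquad
W(E, N) \propto \int d^{3N}\!p \, d^{3N}\!q \;
\delta\bigl(E - H_N(p, q)\bigr)
```

Because $S(E,N)$ is defined geometrically on the conserved quantities, its curvature properties (rather than a limit $N \to \infty$) classify the phase transitions.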
ERIC Educational Resources Information Center
Chipman, Kristi; Litchfield, Ruth
2012-01-01
The Affordable Care Act provides impetus for Extension efforts in worksite wellness. The study reported here examined the influence of two worksite wellness interventions, newsletters and individual counseling. Surveys examined dietary and physical activity behaviors of participants pre- and post-intervention (N = 157). Descriptive statistics,…
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be lower in surgical journals than in medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. 
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
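The per-article scoring described in the protocol ("scored as a proportion based on fulfilling criteria in relevant analyses") can be sketched as follows; the item names are illustrative, not the actual SAMPL-derived extraction form.

```python
def sampl_score(items):
    """Score an article as the proportion of fulfilled criteria among the
    items relevant to the analyses it actually used.

    items: dict mapping criterion name -> True (fulfilled),
    False (not fulfilled), or None (not applicable to this article).
    """
    relevant = {k: v for k, v in items.items() if v is not None}
    if not relevant:
        return None           # no relevant criteria; article unscoreable
    return sum(relevant.values()) / len(relevant)

# Hypothetical extraction for one article (names are illustrative)
article = {
    "intent_of_analysis_stated": True,
    "descriptive_statistics_complete": True,
    "multiple_comparisons_addressed": False,
    "cox_model_assumptions_checked": None,  # no survival analysis used
}
score = sampl_score(article)  # 2 fulfilled of 3 relevant criteria
```

Scores computed this way would then serve as the outcome in the logistic regression model the protocol describes.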
An operational GLS model for hydrologic regression
Tasker, Gary D.; Stedinger, J.R.
1989-01-01
Recent Monte Carlo studies have documented the value of generalized least squares (GLS) procedures to estimate empirical relationships between streamflow statistics and physiographic basin characteristics. This paper presents a number of extensions of the GLS method that deal with realities and complexities of regional hydrologic data sets that were not addressed in the simulation studies. These extensions include: (1) a more realistic model of the underlying model errors; (2) smoothed estimates of cross correlation of flows; (3) procedures for including historical flow data; (4) diagnostic statistics describing leverage and influence for GLS regression; and (5) the formulation of a mathematical program for evaluating future gaging activities. © 1989.
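The core GLS estimator that these extensions build on can be sketched directly; the toy data below are hypothetical, and the error covariance is only illustrative of the sampling-error and cross-correlation structure regional hydrologic regression must model.

```python
import numpy as np

def gls(X, y, omega):
    """Generalized least squares: beta = (X' W X)^-1 X' W y with W = omega^-1.

    omega is the error covariance matrix; in regional hydrologic regression
    it encodes sampling error and cross-correlation between sites.
    """
    W = np.linalg.inv(omega)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    cov_beta = np.linalg.inv(X.T @ W @ X)  # sampling covariance of estimates
    return beta, cov_beta

# Tiny illustration: a log-linear relation between a flow statistic and
# basin area, with heteroscedastic, cross-correlated errors (hypothetical)
rng = np.random.default_rng(1)
n = 12
log_area = rng.uniform(1, 4, n)
X = np.column_stack([np.ones(n), log_area])
omega = np.diag(rng.uniform(0.01, 0.05, n)) + 0.005  # shared error component
y = 0.5 + 0.8 * log_area + rng.multivariate_normal(np.zeros(n), omega)

beta, cov_beta = gls(X, y, omega)
```

Unlike ordinary least squares, the estimator down-weights sites whose flow statistics are noisier and accounts for correlation between nearby gages.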
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Malandraki, O. E.; Pavlos, E. G.; Iliopoulos, A. C.; Karakatsanis, L. P.
2016-12-01
In this study we present new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as their effect on Earth's magnetosphere. The results concern Tsallis non-extensive statistics, in particular the estimation of the Tsallis q-triplet (q_stat, q_sen, q_rel) of magnetic field time series of the ICME observed at Earth resulting from the solar eruptive activity of March 7, 2012 at the Sun. For this, we used a multi-spacecraft approach based on data from the ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis, different time periods were considered, sorted as "quiet", "shock" and "aftershock", while different space domains, such as interplanetary space (near Earth at L1 and upstream of the Earth's bow shock), the Earth's magnetosheath and the magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the magnetic field dynamics both in time and space domains during the shock event, in terms of rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. Tsallis non-extensive statistical theory and Tsallis's extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal a strong universality character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014a,b; Karakatsanis et al. 2013). The Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. 
Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long spatio-temporal correlations, fractional acceleration and Non-Equilibrium Stationary States (NESS) or non-equilibrium self-organization processes, and non-equilibrium phase transition and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results clearly reveal strong self-organization and the development of macroscopic ordering of the plasma system, related to the strengthening of non-extensivity, multifractality and intermittency throughout the space plasma regions during the CME event.
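The q-entropy underlying the q-triplet analysis can be sketched numerically. This is a generic illustration of Tsallis entropy (with k_B = 1), not the authors' estimation procedure for q_stat, q_sen and q_rel.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis q-entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k_B = 1.

    Recovers the Boltzmann-Gibbs/Shannon entropy in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # 0 * log(0) contributes nothing
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))     # the q -> 1 limiting form
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.25, 0.25])           # a toy probability distribution
s_shannon = tsallis_entropy(p, 1.0)
s_near1 = tsallis_entropy(p, 1.0001)      # approaches the Shannon value
s_sub = tsallis_entropy(p, 0.5)           # q < 1 weights rare events up
```

For q ≠ 1 the entropy is non-additive over independent subsystems, which is the sense in which the statistics above are "non-extensive".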
Is Ki67 prognostic for aggressive prostate cancer? A multicenter real-world study.
Fantony, Joseph J; Howard, Lauren E; Csizmadi, Ilona; Armstrong, Andrew J; Lark, Amy L; Galet, Colette; Aronson, William J; Freedland, Stephen J
2018-06-15
To test if Ki67 expression is prognostic for biochemical recurrence (BCR) after radical prostatectomy (RP). Ki67 immunohistochemistry was performed on tissue microarrays constructed from specimens obtained from 464 men undergoing RP at the Durham and West LA Veterans Affairs Hospitals. Hazard ratios (HR) for Ki67 expression and time to BCR were estimated using Cox regression. Ki67 was associated with more recent surgery year (p < 0.001), positive margins (p = 0.001) and extracapsular extension (p < 0.001). In center-stratified analyses, the adjusted HR for Ki67 expression and BCR approached statistical significance for West LA (HR: 1.54; p = 0.06), but not Durham (HR: 1.10; p = 0.74). This multi-institutional 'real-world' study provides limited evidence for the prognostic role of Ki67 in predicting outcome after RP.
Receptor-based 3D-QSAR in Drug Design: Methods and Applications in Kinase Studies.
Fang, Cheng; Xiao, Zhiyan
2016-01-01
Receptor-based 3D-QSAR strategy represents a superior integration of structure-based drug design (SBDD) and three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis. It combines the accurate prediction of ligand poses by the SBDD approach with the good predictability and interpretability of statistical models derived from the 3D-QSAR approach. Extensive efforts have been devoted to the development of receptor-based 3D-QSAR methods, and two alternative approaches have been exploited. One involves computing the binding interactions between a receptor and a ligand to generate structure-based descriptors for QSAR analyses. The other concerns the application of various docking protocols to generate optimal ligand poses so as to provide reliable molecular alignments for the conventional 3D-QSAR operations. This review highlights new concepts and methodologies recently developed in the field of receptor-based 3D-QSAR and, in particular, covers its application in kinase studies.
NASA Technical Reports Server (NTRS)
Borg, J.; Horz, F.; Bridges, J. C.; Burchell, M. J.; Djouadi, Z.; Floss, C.; Graham, G. A.; Green, S. F.; Heck, P. R.; Hoppe, P.;
2007-01-01
Aluminium foils were used on Stardust to stabilize the aerogel specimens in the modular collector tray. Parts of these foils were fully exposed to the flux of cometary grains emanating from Wild 2. Because the exposed parts of these foils had to be harvested before extraction of the aerogel, numerous foil strips some 1.7 mm wide and 13 or 33 mm long were generated during Stardust's Preliminary Examination (PE). These strips are readily accommodated in their entirety in the sample chambers of modern SEMs, thus providing the opportunity to characterize in situ, employing EDS methods, the size distribution and residue composition of statistically more significant numbers of cometary dust particles than is possible with aerogel, the latter mandating extensive sample preparation. We describe here the analysis of nearly 300 impact craters and their implications for Wild 2 dust.
Tan, Onder; Atik, Bekir; Dogan, Ali; Uslu, Mustafa; Alpaslan, Suleyman
2007-01-01
Skin grafting is widely used for the treatment of postburn contractures. Its main disadvantage, a tendency to contract again, can be reduced and better outcomes achieved by postoperative splinting. In this study we compared the outcomes of dynamic and static splinting postoperatively. Of the 57 patients managed by split grafts, 36 (44 hands) had Kirschner (K) wires applied with static splints, whereas 21 (26 hands) had dynamic splinting. The mean ages were 11 (range 2-37) and 15 (range 2-50) years in the two groups. Before and after the operation, basic hand functions were evaluated clinically, and the results analysed statistically. The mean follow-up times were 18 and 14 months respectively, and recurrence rates were 22% and 14%. We think that postoperative dynamic splinting is superior to fixation with K-wires with or without static splints.
Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. A previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
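The Monte Carlo branch of the probabilistic analysis can be sketched generically. The modal statistics and the response quantity below are hypothetical stand-ins, not the paper's three-substructure system: measured modal parameters are sampled as random variables and propagated into a response statistic.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical measured modal statistics for one dominant mode: natural
# frequency and damping ratio, with scatter from test-to-test variation.
n_samples = 20000
f_n = rng.normal(120.0, 4.0, n_samples)    # natural frequency, Hz
zeta = rng.normal(0.02, 0.004, n_samples)  # damping ratio
zeta = np.clip(zeta, 1e-4, None)           # damping must remain positive

# Response quantity: peak dynamic amplification of a lightly damped
# single-DOF mode driven at resonance, Q = 1 / (2 * zeta)
Q = 1.0 / (2.0 * zeta)

# Statistics of the response, as a probabilistic analysis would report
q_mean = Q.mean()
q_99 = np.quantile(Q, 0.99)                # a design-relevant percentile
```

In the PDS method the sampled quantities are the measured modal parameters of each substructure, and the propagation runs through the synthesized system model rather than a single-mode formula.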
The swan-song phenomenon: last-works effects for 172 classical composers.
Simonton, D K
1989-03-01
Creative individuals approaching their final years of life may undergo a transformation in outlook that is reflected in their last works. This hypothesized effect was quantitatively assessed for an extensive sample of 1,919 works by 172 classical composers. The works were independently gauged on seven aesthetic attributes (melodic originality, melodic variation, repertoire popularity, aesthetic significance, listener accessibility, performance duration, and thematic size), and potential last-works effects were operationally defined two separate ways (linearly and exponentially). Statistical controls were introduced for both longitudinal changes (linear, quadratic, and cubic age functions) and individual differences (eminence and lifetime productivity). Hierarchical regression analyses indicated that composers' swan songs tend to score lower in melodic originality and performance duration but higher in repertoire popularity and aesthetic significance. These last-works effects survive control for total compositional output, eminence, and most significantly, the composer's age when the last works were created.
EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590
Application of the Probabilistic Dynamic Synthesis Method to Realistic Structures
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. In previous work, the feasibility of the PDS method applied to a simple seven degree-of-freedom spring-mass system was verified. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
eSACP - a new Nordic initiative towards developing statistical climate services
NASA Astrophysics Data System (ADS)
Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine
2015-04-01
The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields, and between practitioners, in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies that properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and will include functionality to utilize the extensive and dynamically growing repositories of data, using state-of-the-art statistical techniques to quantify the uncertainty and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case more clearly to policy makers and the general public on the consequences of our changing climate. 
The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of focus areas in the project and show some examples of the expected analysis tools.
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
40 CFR 90.712 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...
Gunbarrel mafic magmatic event: A key 780 Ma time marker for Rodinia plate reconstructions
Harlan, S.S.; Heaman, L.; LeCheminant, A.N.; Premo, W.R.
2003-01-01
Precise U-Pb baddeleyite dating of mafic igneous rocks provides evidence for a widespread and synchronous magmatic event that extended for >2400 km along the western margin of the Neoproterozoic Laurentian craton. U-Pb baddeleyite analyses for eight intrusions from seven localities ranging from the northern Canadian Shield to northwestern Wyoming-southwestern Montana are statistically indistinguishable and yield a composite U-Pb concordia age for this event of 780.3 ± 1.4 Ma (95% confidence level). This 780 Ma event is herein termed the Gunbarrel magmatic event. The mafic magmatism of the Gunbarrel event represents the largest mafic dike swarm yet identified along the Neoproterozoic margin of Laurentia. The origin of the mafic magmatism is not clear, but may be related to mantle-plume activity or upwelling asthenosphere leading to crustal extension accompanying initial breakup of the supercontinent Rodinia and development of the proto-Pacific Ocean. The mafic magmatism of the Gunbarrel magmatic event at 780 Ma predates the voluminous magmatism of the 723 Ma Franklin igneous event of the northwestern Canadian Shield by ~60 m.y. The precise dating of the extensive Neoproterozoic Gunbarrel and Franklin magmatic events provides unique time markers that can ultimately be used for robust testing of Neoproterozoic continental reconstructions.
Assessing natural direct and indirect effects through multiple pathways.
Lange, Theis; Rasmussen, Mette; Thygesen, Lau Caspar
2014-02-15
Within the fields of epidemiology, interventions research and the social sciences, researchers are often faced with the challenge of decomposing the effect of an exposure into different causal pathways working through defined mediator variables. The goal of such analyses is often to understand the mechanisms of the system or to suggest possible interventions. The case of a single mediator, which implies only two causal pathways (direct and indirect) from exposure to outcome, has been extensively studied. By using the framework of counterfactual variables, researchers have established theoretical properties and developed powerful tools. However, in practical problems, it is not uncommon to have several distinct causal pathways from exposure to outcome operating through different mediators. In this article, we suggest a widely applicable approach to quantifying and ranking different causal pathways. The approach is an extension of the natural effect models proposed by Lange et al. (Am J Epidemiol. 2012;176(3):190-195). By allowing the analysis of distinct multiple pathways, the suggested approach adds to the capabilities of modern mediation techniques. Furthermore, the approach can be implemented using standard software, and we have included with this article implementation examples using R (R Foundation for Statistical Computing, Vienna, Austria) and Stata software (StataCorp LP, College Station, Texas).
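For the single-mediator case described as extensively studied, the linear-model decomposition can be sketched as follows. The coefficients are hypothetical; the paper's natural effect models generalize this to multiple pathways and are implemented in R and Stata, not in the Python sketched here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated linear system with one mediator: exposure -> mediator -> outcome
# plus a direct exposure -> outcome path (all coefficients hypothetical).
n = 50000
exposure = rng.integers(0, 2, n).astype(float)
mediator = 1.0 + 0.8 * exposure + rng.normal(0, 1, n)   # a-path = 0.8
outcome = (2.0 + 0.5 * exposure                          # direct effect = 0.5
           + 0.6 * mediator + rng.normal(0, 1, n))       # b-path = 0.6

def ols(X, y):
    """Ordinary least squares coefficients via lstsq."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(np.column_stack([np.ones(n), exposure]), mediator)[1]
bd = ols(np.column_stack([np.ones(n), exposure, mediator]), outcome)
direct, b = bd[1], bd[2]

# Product-of-coefficients decomposition (linear, no exposure-mediator
# interaction): natural indirect effect = a * b, total = direct + indirect
indirect = a * b
total = direct + indirect
```

With several mediators, one such indirect term arises per pathway, which is exactly the quantification-and-ranking problem the article addresses.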
REddyProc: Enabling researchers to process Eddy-Covariance data
NASA Astrophysics Data System (ADS)
Wutzler, Thomas; Moffat, Antje; Migliavacca, Mirco; Knauer, Jürgen; Menzer, Olaf; Sickel, Kerstin; Reichstein, Markus
2017-04-01
Analysing Eddy-Covariance measurements involves extensive processing, which places a substantial technical burden on researchers. There is a need to overcome difficulties in data processing associated with deploying, adapting and using existing software and online tools. We addressed that need by developing the REddyProc package in the open-source, cross-platform language R. The package provides standard processing routines for reading half-hourly files in different formats, including from the recently released FLUXNET 2015 dataset; uStar threshold estimation and its associated uncertainty; gap-filling; flux partitioning (both night-time and daytime based); and visualization of results. Although it differs in some features, the package mimics the online tool that has been used extensively by many users and site Principal Investigators (PIs) in recent years and is available on the website of the Max Planck Institute for Biogeochemistry. In general, REddyProc results are statistically equivalent to results from state-of-the-art tools. The provided routines can be easily installed, configured, used, and integrated with further analysis. Hence, the eddy-covariance community will benefit from the package, which allows easier integration of standard processing with extended analysis. This complements activities by AmeriFlux, ICOS, NEON, and other regional networks to develop code for standardized data processing of multiple sites in FLUXNET.
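As one illustration of what a gap-filling step involves, the sketch below fills artificial gaps in a synthetic half-hourly flux series using the mean-diurnal-variation approach (each gap replaced by the mean of the same half-hour slot on the other days). This is a deliberately minimal stand-in, not REddyProc's marginal-distribution-sampling algorithm, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
days, slots = 10, 48                      # 10 days of half-hourly data

# Synthetic NEE-like flux with a diurnal cycle plus noise.
t = np.arange(slots) / slots
diurnal = -8.0 * np.sin(np.pi * t) ** 2   # daytime uptake (negative flux)
flux = np.tile(diurnal, days) + rng.normal(0, 0.5, days * slots)

# Knock out 15% of the values to emulate instrument gaps.
gaps = rng.random(flux.size) < 0.15
flux[gaps] = np.nan

# Mean-diurnal-variation gap-filling: replace each missing half-hour
# with the mean of the same half-hour slot on the other days.
filled = flux.copy()
by_slot = flux.reshape(days, slots)
slot_means = np.nanmean(by_slot, axis=0)
idx = np.arange(flux.size) % slots
filled[gaps] = slot_means[idx[gaps]]

rmse = np.sqrt(np.mean((filled[gaps] - np.tile(diurnal, days)[gaps]) ** 2))
print(f"filled {gaps.sum()} gaps, RMSE vs true diurnal cycle: {rmse:.2f}")
```

Production tools additionally condition on meteorological similarity and propagate the fill uncertainty; the slot-mean fill here is only the simplest member of that family.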
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-07
... DEPARTMENT OF JUSTICE [OMB Number 1121-0094] Agency Information Collection Activities: Existing...: 60-day notice. The Department of Justice (DOJ), Bureau of Justice Statistics, will be submitting the... information, please contact Todd D. Minton, Bureau of Justice Statistics, 810 Seventh Street NW., Washington...
Repeated Random Sampling in Year 5
ERIC Educational Resources Information Center
Watson, Jane M.; English, Lyn D.
2016-01-01
As an extension to an activity introducing Year 5 students to the practice of statistics, the software "TinkerPlots" made it possible to collect repeated random samples from a finite population to informally explore students' capacity to begin reasoning with a distribution of sample statistics. This article provides background for the…
J-adaptive estimation with estimated noise statistics
NASA Technical Reports Server (NTRS)
Jazwinski, A. H.; Hipkins, C.
1973-01-01
The J-adaptive sequential estimator is extended to include simultaneous estimation of the noise statistics in a model for system dynamics. This extension completely automates the estimator, eliminating the requirement of an analyst in the loop. Simulations in satellite orbit determination demonstrate the efficacy of the sequential estimation algorithm.
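The idea of augmenting a sequential estimator with on-line estimation of noise statistics can be illustrated with a scalar Kalman filter that adapts its measurement-noise variance R from the innovation sequence. This is a generic textbook-style sketch, not the J-adaptive algorithm itself; the dynamics, noise levels, and smoothing constant are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
T, q_true, r_true = 2000, 0.01, 4.0

# Scalar random-walk state observed in noise; R is unknown to the filter.
x = np.cumsum(rng.normal(0, np.sqrt(q_true), T))
z = x + rng.normal(0, np.sqrt(r_true), T)

xhat, P, Q, R = 0.0, 1.0, q_true, 1.0     # deliberately wrong initial R
alpha = 0.01                              # smoothing for the noise estimate
for k in range(T):
    P += Q                                # predict (state transition F = 1)
    nu = z[k] - xhat                      # innovation
    S = P + R
    K = P / S
    xhat += K * nu                        # update
    P *= (1 - K)
    # Adapt R from the innovation sequence: for a consistent filter
    # E[nu^2] = R + (predicted P), so nu^2 minus the posterior P is a
    # noisy one-sample estimate of R (up to the small process noise Q).
    R = (1 - alpha) * R + alpha * max(nu * nu - P, 1e-6)

print(f"estimated measurement-noise variance R = {R:.2f} (true {r_true})")
```

Starting from a badly misspecified R = 1, the exponentially smoothed innovation statistic drives the estimate toward the true value of 4, removing the need for an analyst to tune the filter by hand.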
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... commodities to keep records that substantiate ``cents off,'' ``introductory offer,'' and/or ``economy size... United States, 2010, U.S. Department of Labor, U.S. Bureau of Labor Statistics (May 2011) (``BLS National... information such as costs, sales statistics, inventories, formulas, patterns devices, manufacturing processes...
Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment
2013-06-01
architecture, there are four tiers: Client (Web Application Clients), Presentation (Web-Server), Processing (Application-Server), Data (Database...organization in each period. This data will be collected for analysis. i) Analyses and Validation: We will perform statistical tests on these data, Pareto ...notes, outstanding deliveries, and inventory. i) Analyses and Validation: We will perform statistical tests on these data, Pareto analyses and confirmation
NASA Technical Reports Server (NTRS)
Meston, R. D.; Schall, M. R., Jr.; Brockman, C. L.; Bender, W. H.
1972-01-01
All analyses and tradeoffs conducted to establish the MSS operations and crew activities are discussed. The missions and subsystem integrated analyses that were completed to assure compatibility of program elements and consistency with program objectives are presented.
Cost-effectiveness of prucalopride in the treatment of chronic constipation in the Netherlands
Nuijten, Mark J. C.; Dubois, Dominique J.; Joseph, Alain; Annemans, Lieven
2015-01-01
Objective: To assess the cost-effectiveness of prucalopride vs. continued laxative treatment for chronic constipation in patients in the Netherlands in whom laxatives have failed to provide adequate relief. Methods: A Markov model was developed to estimate the cost-effectiveness of prucalopride in patients with chronic constipation receiving standard laxative treatment from the perspective of Dutch payers in 2011. Data sources included published prucalopride clinical trials, published Dutch price/tariff lists, and national population statistics. The model simulated the clinical and economic outcomes associated with prucalopride vs. standard treatment and had a cycle length of 1 month and a follow-up time of 1 year. Response to treatment was defined as the proportion of patients who achieved “normal bowel function”. One-way and probabilistic sensitivity analyses were conducted to test the robustness of the base case. Results: In the base case analysis, the incremental cost-effectiveness ratio of prucalopride relative to continued laxative treatment was € 9015 per quality-adjusted life-year (QALY) gained. Extensive sensitivity analyses and scenario analyses confirmed that the base case cost-effectiveness estimate was robust. One-way sensitivity analyses showed that the model was most sensitive to the rate of response to prucalopride; incremental cost-effectiveness ratios ranged from € 6475 to € 15,380 per QALY. Probabilistic sensitivity analyses indicated that there is a greater than 80% probability that prucalopride would be cost-effective compared with continued standard treatment, assuming a willingness-to-pay threshold of € 20,000 per QALY from a Dutch societal perspective. A scenario analysis was performed for women only, which resulted in a cost-effectiveness ratio of € 7773 per QALY. Conclusion: Prucalopride was cost-effective in a Dutch patient population, as well as in a women-only subgroup, who had chronic constipation and obtained inadequate relief from laxatives. PMID:25926794
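The mechanics of a simple Markov cohort cost-effectiveness model can be sketched as follows. This toy two-state model (responder / non-responder) with monthly cycles computes an incremental cost per QALY; every input value is illustrative and deliberately not taken from the published prucalopride model.

```python
import numpy as np

# Illustrative inputs only -- NOT the published model's parameters.
cycles = 12                              # monthly cycles over one year
p_resp_new, p_resp_std = 0.25, 0.10      # monthly response probabilities
u_resp, u_noresp = 0.85 / 12, 0.70 / 12  # per-cycle utilities (QALY fractions)
c_new, c_std = 90.0, 20.0                # per-cycle treatment costs (EUR)

def run(p_resp, c_drug):
    """Two-state Markov cohort: non-responders may respond each cycle."""
    state = np.array([0.0, 1.0])         # [responder, non-responder]
    M = np.array([[1.0, 0.0],            # responders stay responders
                  [p_resp, 1.0 - p_resp]])
    cost = qaly = 0.0
    for _ in range(cycles):
        state = state @ M                # advance the cohort one cycle
        cost += c_drug
        qaly += state[0] * u_resp + state[1] * u_noresp
    return cost, qaly

c1, q1 = run(p_resp_new, c_new)
c0, q0 = run(p_resp_std, c_std)
icer = (c1 - c0) / (q1 - q0)             # incremental cost per QALY gained
print(f"ICER = {icer:.0f} EUR/QALY")
```

One-way sensitivity analysis then amounts to re-running `run` while varying a single input (e.g. the response probability) over its plausible range and recording how the ICER moves.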
Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew
2017-09-01
Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of the randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs studying the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
Sunspot activity and influenza pandemics: a statistical assessment of the purported association.
Towers, S
2017-10-01
Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses and attempts to recreate the three most recent statistical analyses, by Ertel (1994), Tapping et al. (2001), and Yeung (2006), all of which purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. The present analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. While the focus of this particular analysis was the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls: inattention to analysis reproducibility and robustness assessment is a common problem in the sciences that is, unfortunately, not noted often enough in review.
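The un-binned permutation approach described here can be illustrated in a few lines: compare the mean sunspot number in a set of event years against a permutation null obtained by repeatedly drawing random year sets. Both the sunspot series and the "pandemic" years below are synthetic (the event years are drawn at random, so no true association exists to be found).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic ~11-year sunspot cycle over 1700-2000 (illustrative, not real data).
years = np.arange(1700, 2001)
sunspots = 80 * (0.5 + 0.5 * np.sin(2 * np.pi * years / 11.0)) \
           + rng.normal(0, 10, years.size)

# Hypothetical event years drawn at random, i.e. no true association.
pandemics = rng.choice(years, size=8, replace=False)
obs = sunspots[np.isin(years, pandemics)].mean()

# Permutation null: mean sunspot number over 8 randomly chosen years.
null = np.array([
    sunspots[rng.choice(years.size, size=8, replace=False)].mean()
    for _ in range(10_000)
])
p = np.mean(null >= obs)            # one-sided permutation p-value
print(f"observed mean = {obs:.1f}, permutation p = {p:.3f}")
```

Because the null is built directly from the un-binned yearly series, the test involves no arbitrary choice of bin edges, which is exactly the kind of analyst freedom the paper identifies as a pitfall.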
Permutation testing of orthogonal factorial effects in a language-processing experiment using fMRI.
Suckling, John; Davis, Matthew H; Ooi, Cinly; Wink, Alle Meije; Fadili, Jalal; Salvador, Raymond; Welchew, David; Sendur, Levent; Maxim, Vochita; Bullmore, Edward T
2006-05-01
The block-paradigm of the Functional Image Analysis Contest (FIAC) dataset was analysed with the Brain Activation and Morphological Mapping software. Permutation methods in the wavelet domain were used for inference on cluster-based test statistics of orthogonal contrasts relevant to the factorial design of the study, namely: the average response across all active blocks, the main effect of speaker, the main effect of sentence, and the interaction between sentence and speaker. Extensive activation was seen with all these contrasts. In particular, different vs. same-speaker blocks produced elevated activation in bilateral regions of the superior temporal lobe and repetition suppression for linguistic materials (same vs. different-sentence blocks) in left inferior frontal regions. These are regions previously reported in the literature. Additional regions were detected in this study, perhaps due to the enhanced sensitivity of the methodology. Within-block sentence suppression was tested post-hoc by regression of an exponential decay model onto the extracted time series from the left inferior frontal gyrus, but no strong evidence of such an effect was found. The significance levels set for the activation maps are P-values at which we expect <1 false-positive cluster per image. Nominal type I error control was verified by empirical testing of a test statistic corresponding to a randomly ordered design matrix. The small size of the BOLD effect necessitates sensitive methods of detection of brain activation. Permutation methods permit the necessary flexibility to develop novel test statistics to meet this challenge.
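The permutation logic behind familywise-error-controlled inference can be sketched with a sign-flipping max-statistic test on synthetic paired differences. This is a much simpler stand-in for the wavelet-domain cluster statistics used in the study; the effect sizes and dimensions are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 30, 50                      # subjects, regions ("voxels")

# Paired condition differences per subject; a real effect in regions 0-4 only.
diff = rng.normal(0, 1, (n, m))
diff[:, :5] += 1.0

t_obs = diff.mean(0) / (diff.std(0, ddof=1) / np.sqrt(n))

# Sign-flipping permutation: under H0 the paired differences are symmetric
# about zero, so random sign flips generate the null distribution.
# Taking the MAX statistic across regions controls familywise error.
null_max = np.empty(5000)
for b in range(5000):
    flip = rng.choice([-1.0, 1.0], size=n)[:, None]
    d = diff * flip
    null_max[b] = np.max(d.mean(0) / (d.std(0, ddof=1) / np.sqrt(n)))

thresh = np.quantile(null_max, 0.95)   # FWE-corrected 5% threshold
detected = np.flatnonzero(t_obs > thresh)
print(f"threshold = {thresh:.2f}, detected regions: {detected}")
```

The same machinery verifies nominal type I error control: running it with a randomly reordered design (here, with no effect added) should detect a region in only about 5% of datasets.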
Czaplewski, Raymond L.
2015-01-01
Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of study variables and auxiliary sensor variables. A National Forest Inventory (NFI) illustrates application within an official statistics program. Practical recommendations regarding remote sensing and statistical issues are offered. This algorithm has the potential to increase the value of synoptic sensor data for statistical monitoring of large geographic areas. PMID:26393588
The Center for In-Service Education. Final Evaluation Report. Volume II. Part 1.
ERIC Educational Resources Information Center
Tennessee State Dept. of Education, Nashville.
This is an overview of the extensive in-service education inventory conducted as an integral portion of the planning contract for Models for In-Service Education supported by the Tennessee State Department of Education under Title III, Elementary and Secondary Education Act. The narrative descriptions are free of extensive statistical references…
Hsu, Justine; Zinsou, Cyprien; Parkhurst, Justin; N'Dour, Marguerite; Foyet, Léger; Mueller, Dirk H
2013-01-01
Behavioural interventions have been widely integrated in HIV/AIDS social marketing prevention strategies and are considered valuable in settings with high levels of risk behaviours and low levels of HIV/AIDS awareness. Despite their widespread application, there is a lack of economic evaluations comparing different behaviour change communication methods. This paper analyses the costs to increase awareness and the cost-effectiveness to influence behaviour change for five interventions in Benin. Cost and cost-effectiveness analyses used economic costs and primary effectiveness data drawn from surveys. Costs were collected for provider inputs required to implement the interventions in 2009 and analysed by 'person reached'. Cost-effectiveness was analysed by 'person reporting systematic condom use'. Sensitivity analyses were performed on all uncertain variables and major assumptions. Cost-per-person reached varies by method, with public outreach events the least costly (US$2.29) and billboards the most costly (US$25.07). Influence on reported behaviour was limited: only three of the five interventions were found to have a significant statistical correlation with reported condom use (i.e. magazines, radio broadcasts, public outreach events). Cost-effectiveness ratios per person reporting systematic condom use resulted in the following ranking: magazines, radio and public outreach events. Sensitivity analyses indicate rankings are insensitive to variation of key parameters although ratios must be interpreted with caution. This analysis suggests that while individual interventions are an attractive use of resources to raise awareness, this may not translate into a cost-effective impact on behaviour change. The study found that the extensive reach of public outreach events did not seem to influence behaviour change as cost-effectively when compared with magazines or radio broadcasts. 
Behavioural interventions are context-specific and their effectiveness influenced by a multitude of factors. Further analyses using a quasi-experimental design would be useful to programme implementers and policy makers as they face decisions regarding which HIV prevention activities to prioritize.
Schreier, Amy L; Grove, Matt
2014-05-01
The benefits of spatial memory for foraging animals can be assessed on two distinct spatial scales: small-scale space (travel within patches) and large-scale space (travel between patches). While the patches themselves may be distributed at low density, within patches resources are likely densely distributed. We propose, therefore, that spatial memory for recalling the particular locations of previously visited feeding sites will be more advantageous during between-patch movement, where it may reduce the distances traveled by animals that possess this ability compared to those that must rely on random search. We address this hypothesis by employing descriptive statistics and spectral analyses to characterize the daily foraging routes of a band of wild hamadryas baboons in Filoha, Ethiopia. The baboons slept on two main cliffs--the Filoha cliff and the Wasaro cliff--and daily travel began and ended on a cliff; thus four daily travel routes exist: Filoha-Filoha, Filoha-Wasaro, Wasaro-Wasaro, Wasaro-Filoha. We use newly developed partial sum methods and distribution-fitting analyses to distinguish periods of area-restricted search from more extensive movements. The results indicate a single peak in travel activity in the Filoha-Filoha and Wasaro-Filoha routes, three peaks of travel activity in the Filoha-Wasaro routes, and two peaks in the Wasaro-Wasaro routes; and are consistent with on-the-ground observations of foraging and ranging behavior of the baboons. In each of the four daily travel routes the "tipping points" identified by the partial sum analyses indicate transitions between travel in small- versus large-scale space. The correspondence between the quantitative analyses and the field observations suggest great utility for using these types of analyses to examine primate travel patterns and especially in distinguishing between movement in small versus large-scale space. 
Only the distribution-fitting analyses are inconsistent with the field observations, which may be due to the scale at which these analyses were conducted. © 2013 Wiley Periodicals, Inc.
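The partial-sum idea for locating a transition between small- and large-scale movement can be sketched as follows: cumulate the step lengths and find the largest deviation from the straight line joining the endpoints. This is a minimal illustration on synthetic step lengths, not the authors' full method; the two movement regimes and their scales are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic step lengths: area-restricted search (short steps) followed
# by extensive between-patch travel (long steps).
steps = np.concatenate([rng.exponential(10, 150), rng.exponential(60, 100)])

# Partial-sum method: compare cumulative distance with a straight line
# through the endpoints; the largest gap marks the "tipping point".
S = np.cumsum(steps)
line = S[-1] * np.arange(1, steps.size + 1) / steps.size
tip = int(np.argmax(np.abs(S - line)))
print(f"tipping point near step {tip} (true regime change at step 150)")
```

Because the cumulative sum bends where the mean step length changes, the maximal deviation from the endpoint line recovers the transition without fitting an explicit movement model.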
Rosenberg, David M; Horn, Charles C
2016-08-01
Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus-a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software-an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
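As a small example of the kind of analysis such a notebook might contain, the sketch below performs amplitude-threshold spike detection on a synthetic extracellular trace using a MAD-based robust threshold, a common first step before spike sorting. The trace, spike shape, and threshold multiplier are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)
fs = 20_000                              # sampling rate (Hz)

# Synthetic one-second extracellular trace: noise plus 40 embedded spikes.
trace = rng.normal(0, 1, fs)
spike_times = rng.choice(fs - 20, size=40, replace=False)
for s in spike_times:
    trace[s:s + 20] += -8 * np.exp(-np.arange(20) / 4.0)  # negative-going spike

# Amplitude-threshold detection at 4 robust SDs. The median absolute
# deviation (MAD) estimates the noise SD without being inflated by spikes.
mad = np.median(np.abs(trace - np.median(trace))) / 0.6745
thresh = -4 * mad
crossings = np.flatnonzero((trace[1:] < thresh) & (trace[:-1] >= thresh))
print(f"detected {crossings.size} threshold crossings (40 spikes embedded)")
```

In a notebook workflow, the detected crossings would feed into waveform extraction, clustering, and the statistical validation steps described above, with every parameter choice documented alongside the code.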
Statistics of Stokes variables for correlated Gaussian fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eliyahu, D.
1994-09-01
The joint and marginal probability distribution functions of the Stokes variables are derived for correlated Gaussian fields [an extension of D. Eliyahu, Phys. Rev. E 47, 2881 (1993)]. The statistics depend only on the first-moment (averaged) Stokes variables and have a universal form for S₁, S₂, and S₃. The statistics of the variables describing the Cartesian coordinates of the Poincaré sphere are also given.
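The setting can be checked by Monte Carlo: draw correlated circular complex Gaussian field components, form the Stokes variables of each realization, and compare the sample means with the first-moment values that the derived distributions depend on. This is an illustrative simulation consistent with the model, not the paper's derivation; the correlation value is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 200_000

# Circular complex Gaussian field components with correlation rho.
rho = 0.6
ex = (rng.normal(0, 1, N) + 1j * rng.normal(0, 1, N)) / np.sqrt(2)
noise = (rng.normal(0, 1, N) + 1j * rng.normal(0, 1, N)) / np.sqrt(2)
ey = rho * ex + np.sqrt(1 - rho**2) * noise

# Stokes variables of each realization.
s0 = np.abs(ex) ** 2 + np.abs(ey) ** 2
s1 = np.abs(ex) ** 2 - np.abs(ey) ** 2
s2 = 2 * np.real(ex * np.conj(ey))
s3 = 2 * np.imag(ex * np.conj(ey))

# For unit-intensity components with real correlation rho, the first
# moments are <S0> = 2, <S1> = <S3> = 0, and <S2> = 2*rho.
print([f"{v.mean():.3f}" for v in (s0, s1, s2, s3)])
```

Histograms of `s1`, `s2`, and `s3` from this simulation would trace out the universal marginal form the paper derives, parameterized entirely by these first moments.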
de Savigny, Don; Riley, Ian; Chandramohan, Daniel; Odhiambo, Frank; Nichols, Erin; Notzon, Sam; AbouZahr, Carla; Mitra, Raj; Cobos Muñoz, Daniel; Firth, Sonja; Maire, Nicolas; Sankoh, Osman; Bronson, Gay; Setel, Philip; Byass, Peter; Jakob, Robert; Boerma, Ties; Lopez, Alan D.
2017-01-01
ABSTRACT Background: Reliable and representative cause of death (COD) statistics are essential to inform public health policy, respond to emerging health needs, and document progress towards Sustainable Development Goals. However, less than one-third of deaths worldwide are assigned a cause. Civil registration and vital statistics (CRVS) systems in low- and lower-middle-income countries are failing to provide timely, complete and accurate vital statistics, and it will still be some time before they can provide physician-certified COD for every death. Proposals: Verbal autopsy (VA) is a method to ascertain the probable COD and, although imperfect, it is the best alternative in the absence of medical certification. There is extensive experience with VA in research settings but only a few examples of its use on a large scale. Data collection using electronic questionnaires on mobile devices and computer algorithms to analyse responses and estimate probable COD have increased the potential for VA to be routinely applied in CRVS systems. However, a number of CRVS and health system integration issues should be considered in planning, piloting and implementing a system-wide intervention such as VA. These include addressing the multiplicity of stakeholders and sub-systems involved, integration with existing CRVS work processes and information flows, linking VA results to civil registration records, information technology requirements and data quality assurance. Conclusions: Integrating VA within CRVS systems is not simply a technical undertaking. It will have profound system-wide effects that should be carefully considered when planning for an effective implementation. This paper identifies and discusses the major system-level issues and emerging practices, provides a planning checklist of system-level considerations and proposes an overview for how VA can be integrated into routine CRVS systems. PMID:28137194
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-01-01
Background: High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Methodology/Principal Findings: Ten (10) leading Chinese medical journals were selected, and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008, the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p < 0.001), from 59.8% (545/1,335) in 1998 to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ² = 21.22, p < 0.001), from 50.9% (680/1,335) to 42.4% (669/1,578). In 2008, the share of randomized clinical trials remained in the low single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature: 49.3% (658/1,335) in 1998 and 48.2% (761/1,578) in 2008. Decreases in defect proportions were also observed in results presentation (χ² = 93.26, p < 0.001), from 92.7% (945/1,019) to 78.2% (1023/1,309), and interpretation (χ² = 27.26, p < 0.001), from 9.7% (99/1,019) to 4.3% (56/1,309), although some serious defects persisted.
Conclusions/Significance: Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement in study design. Retrospective designs are the most common, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
Use of Statistical Analyses in the Ophthalmic Literature
Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.
2014-01-01
Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally, we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in the retina and glaucoma subspecialties showed a tendency to use more complex analyses when compared with cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. 
The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-09
... Statistical Data for Refugee/Asylee Adjusting Status; OMB Control No. 1615-0070. The Department of Homeland.../Collection: Health and Human Services Statistical Data for Refugee/Asylee Adjusting Status. (3) Agency form... asked or required to respond, as well as a brief abstract: Primary: Individuals or Households. Refugees...
Performance of the S-χ² Statistic for Full-Information Bifactor Models
ERIC Educational Resources Information Center
Li, Ying; Rupp, Andre A.
2011-01-01
This study investigated the Type I error rate and power of the multivariate extension of the S-χ² statistic using unidimensional and multidimensional item response theory (UIRT and MIRT, respectively) models as well as full-information bifactor (FI-bifactor) models through simulation. Manipulated factors included test length, sample…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-03
... information could include a significant amount of statistical information that would be difficult to file... required static pool information. Given the large amount of statistical information involved, commentators....; and 18 U.S.C. 1350. * * * * * 2. Amend Sec. 232.312 paragraph (a) introductory text by removing...
Telecommunication market research processing
NASA Astrophysics Data System (ADS)
Dupont, J. F.
1983-06-01
The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.
Global atmospheric circulation statistics, 1000-1 mb
NASA Technical Reports Server (NTRS)
Randel, William J.
1992-01-01
The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.
NASA Astrophysics Data System (ADS)
Farag, A. Z. A.; Sultan, M.; Elkadiri, R.; Abdelhalim, A.
2014-12-01
An integrated approach combining remote sensing, landscape analysis and statistical methods was used to assess the role of groundwater sapping in shaping the Saharan landscape. A GIS-based logistic regression model was constructed to automatically delineate the spatial distribution of the sapping features over areas occupied by the Nubian Sandstone Aquifer System (NSAS): (1) an inventory was compiled of known locations of sapping features identified either in the field or from satellite datasets (e.g. Orbview-3 and Google Earth Digital Globe imagery); (2) spatial analyses were conducted in a GIS environment and seven geomorphological and geological predisposing factors (i.e. slope, stream density, cross-sectional and profile curvature, minimum and maximum curvature, and lithology) were identified; (3) a binary logistic regression model was constructed, optimized and validated to describe the relationship between the sapping locations and the set of controlling factors; and (4) the generated model (prediction accuracy: 90.1%) was used to produce a regional sapping map over the NSAS. Model outputs indicate: (1) groundwater discharge and structural control played an important role in excavating the Saharan natural depressions, as evidenced by the wide distribution of sapping features (areal extent: 1180 km2) along the fault-controlled escarpments of the Libyan Plateau; (2) the proximity of mapped sapping features to reported paleolake and tufa deposits suggests a causal effect.
Our preliminary observations (from satellite imagery) and statistical analyses together with previous studies in the North Western Sahara Aquifer System (North Africa), Sinai Peninsula, Negev Desert, and The Plateau of Najd (Saudi Arabia) indicate extensive occurrence of sapping features along the escarpments bordering the northern margins of the Saharan-Arabian Desert; these areas share similar hydrologic settings with the NSAS domains and they too witnessed wet climatic periods in the Mid-Late Quaternary.
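The binary logistic regression of steps (1)-(4) can be sketched in miniature. This is a generic gradient-descent fit on two hypothetical predictors (stand-ins for slope and stream density), not the authors' calibrated GIS model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit binary logistic regression by batch gradient descent.
    X: list of feature vectors, y: list of 0/1 labels."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gwj / n for wj, gwj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Toy data: two predictors (e.g. slope, stream density), 1 = sapping present.
X = [[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.7], [0.7, 0.8], [0.15, 0.25]]
y = [0, 0, 1, 1, 1, 0]
w, b = fit_logistic(X, y)
preds = [1 if sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) > 0.5 else 0
         for x in X]
```

On this separable toy set the fitted model reproduces the labels; a real application would validate on held-out locations, as the authors did.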
Kobayashi, Toshiki; Orendurff, Michael S.; Singer, Madeline L.; Gao, Fan; Daly, Wayne K.; Foreman, K. Bo
2016-01-01
Background Genu recurvatum (knee hyperextension) is a common issue for individuals post stroke. Ankle-foot orthoses are used to improve genu recurvatum, but evidence is limited concerning their effectiveness. Therefore, the aim of this study was to investigate the effect of changing the plantarflexion resistance of an articulated ankle-foot orthosis on genu recurvatum in patients post stroke. Methods Gait analysis was performed on 6 individuals post stroke with genu recurvatum using an articulated ankle-foot orthosis whose plantarflexion resistance was adjustable at four levels. Gait data were collected using a Bertec split-belt instrumented treadmill in a 3-dimensional motion analysis laboratory. Gait parameters were extracted and plotted for each subject under the four plantarflexion resistance conditions of the ankle-foot orthosis. Gait parameters included: a) peak ankle plantarflexion angle, b) peak ankle dorsiflexion moment, c) peak knee extension angle and d) peak knee flexion moment. A non-parametric Friedman test was performed followed by a post-hoc Wilcoxon Signed-Rank test for statistical analyses. Findings All the gait parameters demonstrated statistically significant differences among the four resistance conditions of the AFO. Increasing the amount of plantarflexion resistance of the ankle-foot orthosis generally reduced genu recurvatum in all subjects. However, individual analyses showed that the responses to the changes in the plantarflexion resistance of the AFO were not necessarily linear, and appear unique to each subject. Interpretations The plantarflexion resistance of an articulated AFO should be adjusted to improve genu recurvatum in patients post stroke. Future studies should investigate what clinical factors would influence the individual differences. PMID:27136122
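The omnibus Friedman test used above ranks each subject's measurements across the four resistance conditions. A pure-Python sketch of the statistic follows; the toy gait values are hypothetical, and the post-hoc Wilcoxon step is omitted:

```python
def friedman_statistic(data):
    """Friedman chi-square statistic for repeated measures.
    data: one list per subject with k condition measurements each,
    assuming no ties within a subject."""
    n, k = len(data), len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        order = sorted(range(k), key=lambda j: row[j])  # ascending values
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return (12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums)
            - 3.0 * n * (k + 1))

# Toy repeated measures: 6 subjects, 4 AFO resistance conditions, with the
# measured parameter decreasing consistently as resistance increases.
data = [[10.0, 8.0, 6.0, 4.0],
        [9.0, 7.0, 5.0, 3.0],
        [11.0, 9.0, 7.0, 5.0],
        [10.5, 8.5, 6.5, 4.5],
        [12.0, 10.0, 8.0, 6.0],
        [9.5, 7.5, 5.5, 3.5]]
stat = friedman_statistic(data)  # maximal, n*(k-1) = 18, for identical rankings
```

The statistic is referred to a chi-square distribution with k - 1 = 3 degrees of freedom; perfectly consistent rankings, as in this toy set, give the maximum value n(k - 1).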
Bénichou, Bernard; Goyal, Sunita; Sung, Crystal; Norfleet, Andrea M; O'Brien, Fanny
2009-01-01
Fabry disease results from a genetic deficiency of alpha-galactosidase A (alpha GAL) and the impaired catabolism of globotriaosylceramide (GL-3) and other glycosphingolipid substrates, which then accumulate pathogenically within most cells. Enzyme replacement therapy (ERT) with agalsidase beta (Fabrazyme), one of two available forms of recombinant human alpha GAL, involves regular intravenous infusions of the therapeutic protein. Immunoglobulin G (IgG) antibodies to recombinant alpha GAL develop in the majority of patients upon repeated infusion. To explore whether anti-alpha GAL IgG interferes with therapeutic efficacy, retrospective analyses were conducted using data obtained from a total of 134 adult male and female patients with Fabry disease who were treated with agalsidase beta at 1 mg/kg every 2 weeks for up to 5 years during placebo-controlled trials and the corresponding open-label extension studies. The analyses did not reveal a correlation between anti-alpha GAL IgG titers and the onset of clinical events or the rate of change in estimated GFR during treatment, and no statistically significant association was found between anti-alpha GAL IgG titers and abnormal elevations in plasma GL-3 during treatment. However, a statistically significant association was found between anti-alpha GAL IgG titers and observation of some GL-3 deposition in the dermal capillary endothelial cells of skin during treatment, suggesting that GL-3 clearance may be partially impaired in some patients with high antibody titers. Determination of the long-term impact of circulating anti-alpha GAL IgG antibodies on clinical outcomes will require continued monitoring, and serology testing is recommended as part of the routine care of Fabry disease patients during ERT.
NASA Astrophysics Data System (ADS)
Duroure, Christophe; Sy, Abdoulaye; Baray, Jean luc; Van baelen, Joel; Diop, Bouya
2017-04-01
Precipitation plays a key role in the management of sustainable water resources and flood risk analyses. Changes in rainfall will be a critical factor determining the overall impact of climate change. We propose to analyse long series (10 years) of daily precipitation in different regions. We present the Fourier energy density spectra and morphological spectra (i.e. probability distribution functions of the duration and the horizontal scale) of large precipitating systems. Satellite data from the Global Precipitation Climatology Project (GPCP) and long time series from local pluviometers in Senegal and France are used and compared in this work. For mid-latitude and Sahelian regions (north of 12°N), the morphological spectra are close to an exponentially decreasing distribution. This allows us to define two characteristic scales (duration and spatial extension) for the precipitating regions embedded in the large meso-scale convective systems (MCS). For tropical and equatorial regions (south of 12°N), the morphological spectra are close to a Levy-stable distribution (power-law decrease), which does not allow a characteristic scale to be defined (scaling range). When the time and space characteristic scales are defined, a "statistical velocity" of precipitating MCS can be defined and compared to the observed zonal advection. Maps of the characteristic scales and Levy-stable exponent over West Africa and southern Europe are presented. The 12° latitude transition between exponential and Levy-stable behaviors of precipitating MCS is compared with the results of the ECMWF ERA-Interim reanalysis for the same period. This sharp morphological transition could be used to test different parameterizations of deep convection in forecast models.
Effect Size Analyses of Souvenaid in Patients with Alzheimer’s Disease
Cummings, Jeffrey; Scheltens, Philip; McKeith, Ian; Blesa, Rafael; Harrison, John E.; Bertolucci, Paulo H.F.; Rockwood, Kenneth; Wilkinson, David; Wijker, Wouter; Bennett, David A.; Shah, Raj C.
2016-01-01
Background: Souvenaid® (uridine monophosphate, docosahexaenoic acid, eicosapentaenoic acid, choline, phospholipids, folic acid, vitamins B12, B6, C, and E, and selenium), was developed to support the formation and function of neuronal membranes. Objective: To determine effect sizes observed in clinical trials of Souvenaid and to calculate the number needed to treat to show benefit or harm. Methods: Data from all three reported randomized controlled trials of Souvenaid in Alzheimer’s disease (AD) dementia (Souvenir I, Souvenir II, and S-Connect) and an open-label extension study were included in analyses of effect size for cognitive, functional, and behavioral outcomes. Effect size was determined by calculating Cohen’s d statistic (or Cramér’s V method for nominal data), number needed to treat and number needed to harm. Statistical calculations were performed for the intent-to-treat populations. Results: In patients with mild AD, effect sizes were 0.21 (95% confidence intervals: –0.06, 0.49) for the primary outcome in Souvenir II (neuropsychological test battery memory z-score) and 0.20 (0.10, 0.34) for the co-primary outcome of Souvenir I (Wechsler memory scale delayed recall). No effect was shown on cognition in patients with mild-to-moderate AD (S-Connect). The number needed to treat (6 and 21 for Souvenir I and II, respectively) and high number needed to harm values indicate a favorable harm:benefit ratio for Souvenaid versus control in patients with mild AD. Conclusions: The favorable safety profile and impact on outcome measures converge to corroborate the putative mode of action and demonstrate that Souvenaid can achieve clinically detectable effects in patients with early AD. PMID:27767993
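The effect-size machinery used above (Cohen's d with a pooled standard deviation, and number needed to treat as the reciprocal of the absolute risk difference) can be computed as follows; the group scores and responder proportions here are illustrative, not trial data:

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def number_needed_to_treat(p_treat, p_control):
    """NNT = 1 / absolute difference in responder proportions."""
    return 1.0 / (p_treat - p_control)

# Hypothetical memory z-scores and responder proportions.
treatment = [1.0, 2.0, 3.0, 4.0, 5.0]
control = [0.0, 1.0, 2.0, 3.0, 4.0]
d = cohens_d(treatment, control)          # ~0.63, a medium effect
nnt = number_needed_to_treat(0.5, 0.3)    # 1 / 0.2 = 5
```

Number needed to harm is computed the same way from adverse-event proportions; large values, as reported in the trials, indicate infrequent excess harm.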
NASA Astrophysics Data System (ADS)
Vespignani, A.
2004-09-01
Networks have been recently recognized as playing a central role in understanding a wide range of systems spanning diverse scientific domains such as physics and biology, economics, computer science and information technology. Specific examples run from the structure of the Internet and the World Wide Web to the interconnections of finance agents and ecological food webs. These networked systems are generally made by many components whose microscopic interactions give rise to global structures characterized by emergent collective behaviour and complex topological properties. In this context the statistical physics approach finds a natural application since it attempts to explain the various large-scale statistical properties of networks in terms of local interactions governing the dynamical evolution of the constituent elements of the system. It is not by chance then that many of the seminal papers in the field have been published in the physics literature, and have nevertheless made a considerable impact on other disciplines. Indeed, a truly interdisciplinary approach is required in order to understand each specific system of interest, leading to a very interesting cross-fertilization between different scientific areas defining the emergence of a new research field sometimes called network science. The book of Dorogovtsev and Mendes is the first comprehensive monograph on this new scientific field. It provides a thorough presentation of the forefront research activities in the area of complex networks, with an extensive sampling of the disciplines involved and the kinds of problems that form the subject of inquiry. The book starts with a short introduction to graphs and network theory that introduces the tools and mathematical background needed for the rest of the book. The following part is devoted to an extensive presentation of the empirical analysis of real-world networks. 
While for obvious reasons of space the authors cannot analyse in every detail all the various examples, they provide the reader with a general vista that makes clear the relevance of network science to a wide range of natural and man-made systems. Two chapters are then devoted to the detailed exposition of the statistical physics approach to equilibrium and non-equilibrium networks. The authors are two leading players in the area of network theory and offer a very careful and complete presentation of the statistical physics theory of evolving networks. Finally, in the last two chapters, the authors focus on various consequences of network topology for dynamical and physical phenomena occurring in these kinds of structures. The book is completed by a very extensive bibliography and some useful appendices containing some technical points arising in the mathematical discussion and data analysis. The book's mathematical level is fairly advanced and provides a coherent and unified framework for the study of networked structures. The book is targeted at mathematicians, physicists and social scientists alike. It will be appreciated by everybody working in the network area, and especially by any researcher or student entering the field who would like to have a reference text on the latest developments in network science.
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497
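The one-parameter Rasch model used in these analyses assigns each item a difficulty b and each student an ability θ, with response probability P = exp(θ - b) / (1 + exp(θ - b)). A small sketch; the ability and difficulty values are illustrative, not SRBCI calibration results:

```python
import math

def rasch_probability(theta, b):
    """1PL (Rasch) model: P(correct | ability theta, item difficulty b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item, p*(1-p); maximal where theta == b."""
    p = rasch_probability(theta, b)
    return p * (1.0 - p)

# A student whose ability matches an item's difficulty has a 50% chance,
# and that item is maximally informative about that student.
p_matched = rasch_probability(0.0, 0.0)   # 0.5
```

Items spread across a wide difficulty range, as reported for SRBCI, keep information high across the ability spectrum.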
Mutual interference between statistical summary perception and statistical learning.
Zhao, Jiaying; Ngo, Nhi; McKendrick, Ryan; Turk-Browne, Nicholas B
2011-09-01
The visual system is an efficient statistician, extracting statistical summaries over sets of objects (statistical summary perception) and statistical regularities among individual objects (statistical learning). Although these two kinds of statistical processing have been studied extensively in isolation, their relationship is not yet understood. We first examined how statistical summary perception influences statistical learning by manipulating the task that participants performed over sets of objects containing statistical regularities (Experiment 1). Participants who performed a summary task showed no statistical learning of the regularities, whereas those who performed control tasks showed robust learning. We then examined how statistical learning influences statistical summary perception by manipulating whether the sets being summarized contained regularities (Experiment 2) and whether such regularities had already been learned (Experiment 3). The accuracy of summary judgments improved when regularities were removed and when learning had occurred in advance. In sum, calculating summary statistics impeded statistical learning, and extracting statistical regularities impeded statistical summary perception. This mutual interference suggests that statistical summary perception and statistical learning are fundamentally related.
Human Spaceflight Architecture Model (HSFAM) Data Dictionary
NASA Technical Reports Server (NTRS)
Shishko, Robert
2016-01-01
HSFAM is a data model based on the DoDAF 2.02 data model with some purpose-built extensions. These extensions are designed to permit quantitative analyses regarding stakeholder concerns about technical feasibility, configuration and interface issues, and budgetary and/or economic viability.
Secondary Analysis of National Longitudinal Transition Study 2 Data
ERIC Educational Resources Information Center
Hicks, Tyler A.; Knollman, Greg A.
2015-01-01
This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus upon statistical objectives, paradigms, inferences, and methods. Its primary purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…
A Nonparametric Geostatistical Method For Estimating Species Importance
Andrew J. Lister; Rachel Riemann; Michael Hoppus
2001-01-01
Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non-normal distributions violate the assumptions of analyses in which test statistics are...
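The variography mentioned above centers on the empirical semivariogram, an averaging procedure over squared differences of paired observations, which is exactly why extreme values distort it. A minimal sketch on a toy 1-D transect with hypothetical values:

```python
def empirical_semivariogram(points, values, lag, tol):
    """Empirical semivariogram gamma(h) = (1/2N) * sum (z_i - z_j)^2 over
    the N point pairs whose separation distance falls within lag +/- tol."""
    sq_diffs = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = ((points[i][0] - points[j][0]) ** 2 +
                 (points[i][1] - points[j][1]) ** 2) ** 0.5
            if abs(d - lag) <= tol:
                sq_diffs.append((values[i] - values[j]) ** 2)
    return sum(sq_diffs) / (2.0 * len(sq_diffs))

# Toy transect: three collinear plots with a smoothly varying attribute.
points = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
values = [0.0, 1.0, 2.0]
gamma_1 = empirical_semivariogram(points, values, lag=1.0, tol=0.1)
```

Because each bin value is a mean of squared differences, a single extreme observation inflates gamma at every lag it participates in, motivating the nonparametric alternatives the authors pursue.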
ERIC Educational Resources Information Center
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
Square2 - A Web Application for Data Monitoring in Epidemiological and Clinical Studies
Schmidt, Carsten Oliver; Krabbe, Christine; Schössow, Janka; Albers, Martin; Radke, Dörte; Henke, Jörg
2017-01-01
Valid scientific inferences from epidemiological and clinical studies require high data quality. Data generating departments therefore aim to detect data irregularities as early as possible in order to guide quality management processes. In addition, after the completion of data collections the obtained data quality must be evaluated. This can be challenging in complex studies due to a wide scope of examinations, numerous study variables, multiple examiners, devices, and examination centers. This paper describes a Java EE web application used to monitor and evaluate data quality in institutions with complex and multiple studies, named Square2. It uses the Java libraries Apache MyFaces 2, extended by BootsFaces for layout and style. RServe and REngine manage calls to R server processes. All study data and metadata are stored in PostgreSQL. R is the statistics backend and LaTeX is used for the generation of print-ready PDF reports. A GUI manages the entire workflow. Square2 covers all steps in the data monitoring workflow, including the setup of studies and their structure, the handling of metadata for data monitoring purposes, selection of variables, upload of data, statistical analyses, and the generation as well as inspection of quality reports. To take into account data protection issues, Square2 comprises an extensive user rights and roles concept.
Lovrenski, Jovan
2012-03-01
To evaluate the diagnostic possibilities of lung ultrasonography (LUS) in detecting pulmonary complications in preterm infants with respiratory distress syndrome (RDS). A prospective study included 120 preterm infants with clinical and radiographic signs of RDS. LUS was performed using both a transthoracic and a transabdominal approach within the first 24 h of life, and, after that, follow-up LUS examinations were performed. In 47 detected pulmonary complications of RDS (hemorrhage, pneumothorax, pneumonia, atelectasis, bronchopulmonary dysplasia), comparisons between LUS and chest X-ray (CXR) were made. Also, 90 subpleural consolidations registered during LUS examinations were analysed. Statistical analysis included MANOVA and discriminant analysis, t-test, confidence interval, and positive predictive value. In 45 of 47 instances the same diagnosis of complication was detected with LUS as with CXR, indicating a high reliability of the method in premature infants with RDS. The only two false negative findings concerned partial pneumothorax. The positive predictive value of LUS was 100%. A statistically significant difference of LUS findings between the anterior and posterior lung areas was observed in both right and left hemithoraces. LUS enables the detection of pulmonary complications in preterm infants with RDS and has the potential to reduce the number of CXRs. The specific guidelines for its use should be provided in a more extensive study.
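The reported diagnostic measures follow directly from the study's counts (45 true positives, 2 false negatives, and no false positives among the 47 complications); a minimal sketch:

```python
def positive_predictive_value(tp, fp):
    """PPV = TP / (TP + FP): the fraction of positive calls that are correct."""
    return tp / (tp + fp)

def sensitivity(tp, fn):
    """Sensitivity = TP / (TP + FN): the fraction of true cases detected."""
    return tp / (tp + fn)

# Counts from the abstract: 45 complications confirmed by LUS, 2 false
# negatives (both partial pneumothorax), no false positives reported.
ppv = positive_predictive_value(45, 0)   # 1.0, i.e. 100%
sens = sensitivity(45, 2)                # 45/47, roughly 96%
```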
Reuter, Martin; Wolter, Franz-Erich; Shenton, Martha; Niethammer, Marc
2009-01-01
This paper proposes the use of the surface based Laplace-Beltrami and the volumetric Laplace eigenvalues and -functions as shape descriptors for the comparison and analysis of shapes. These spectral measures are isometry invariant and therefore allow for shape comparisons with minimal shape pre-processing. In particular, no registration, mapping, or remeshing is necessary. The discriminatory power of the 2D surface and 3D solid methods is demonstrated on a population of female caudate nuclei (a subcortical gray matter structure of the brain, involved in memory function, emotion processing, and learning) of normal control subjects and of subjects with schizotypal personality disorder. The behavior and properties of the Laplace-Beltrami eigenvalues and -functions are discussed extensively for both the Dirichlet and Neumann boundary condition showing advantages of the Neumann vs. the Dirichlet spectra in 3D. Furthermore, topological analyses employing the Morse-Smale complex (on the surfaces) and the Reeb graph (in the solids) are performed on selected eigenfunctions, yielding shape descriptors, that are capable of localizing geometric properties and detecting shape differences by indirectly registering topological features such as critical points, level sets and integral lines of the gradient field across subjects. The use of these topological features of the Laplace-Beltrami eigenfunctions in 2D and 3D for statistical shape analysis is novel. PMID:20161035
Schlägel, Ulrike E; Lewis, Mark A
2016-12-01
Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, the resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when the resolution of the data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
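Why temporal resolution matters can be illustrated with a toy diffusive walk: displacement variance grows with the sampling interval, so a step-scale parameter estimated at one resolution does not transfer unchanged to another. This is only an illustrative sketch, not the authors' robustness framework:

```python
import random

random.seed(42)  # deterministic for reproducibility

# Simulate a 1-D diffusive walk with unit-variance Gaussian steps.
n = 20000
positions = [0.0]
for _ in range(n):
    positions.append(positions[-1] + random.gauss(0.0, 1.0))

def step_variance(path, lag):
    """Variance of displacements sampled every `lag` time steps
    (non-overlapping, so displacements are independent)."""
    steps = [path[i + lag] - path[i] for i in range(0, len(path) - lag, lag)]
    m = sum(steps) / len(steps)
    return sum((s - m) ** 2 for s in steps) / (len(steps) - 1)

# For pure diffusion, variance scales linearly with the sampling interval,
# so the lag-4 / lag-1 ratio is close to 4.
ratio = step_variance(positions, 4) / step_variance(positions, 1)
```

A model whose parameters transform predictably under such coarsening is robust in the authors' sense; one whose fitted behaviour changes qualitatively is not.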
Robustly detecting differential expression in RNA sequencing data using observation weights
Zhou, Xiaobei; Lindsay, Helen; Robinson, Mark D.
2014-01-01
A popular approach for comparing gene expression levels between (replicated) conditions of RNA sequencing data relies on counting reads that map to features of interest. Within such count-based methods, many flexible and advanced statistical approaches now exist and offer the ability to adjust for covariates (e.g. batch effects). Often, these methods include some sort of ‘sharing of information’ across features to improve inferences in small samples. It is important to achieve an appropriate tradeoff between statistical power and protection against outliers. Here, we study the robustness of existing approaches for count-based differential expression analysis and propose a new strategy based on observation weights that can be used within existing frameworks. The results suggest that outliers can have a global effect on differential analyses. We demonstrate the effectiveness of our new approach with real data and simulated data that reflects properties of real datasets (e.g. dispersion-mean trend) and develop an extensible framework for comprehensive testing of current and future methods. In addition, we explore the origin of such outliers, in some cases highlighting additional biological or technical factors within the experiment. Further details can be downloaded from the project website: http://imlspenticton.uzh.ch/robinson_lab/edgeR_robust/. PMID:24753412
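The observation-weighting idea can be illustrated with a generic Huber-weighted location estimate. This is a sketch of the weighting principle only, not the edgeR-robust implementation (which embeds weights in a negative binomial GLM):

```python
def huber_mean(xs, k=1.345, iters=50):
    """Robust location estimate via iteratively reweighted least squares
    with Huber weights: w_i = 1 if |r_i| <= k*s, else k*s/|r_i|."""
    mu = sorted(xs)[len(xs) // 2]  # start from the (upper) median
    w = [1.0] * len(xs)
    for _ in range(iters):
        resid = [x - mu for x in xs]
        mad = sorted(abs(r) for r in resid)[len(xs) // 2]
        s = mad / 0.6745 if mad > 0 else 1.0  # robust scale estimate
        w = [1.0 if abs(r) <= k * s else k * s / abs(r) for r in resid]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu, w

# One extreme observation dominates the plain mean but is downweighted here.
values = [1.0, 1.1, 0.9, 1.2, 0.8, 10.0]
mu, weights = huber_mean(values)
```

The plain mean of `values` is about 2.5; the weighted estimate stays near 1 because the outlier's weight shrinks toward zero, which is the global-versus-local effect on differential analyses that the paper targets.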
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
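The principal-component summarization of calibration samples can be sketched as follows. This pure-Python power iteration extracts the leading component of a toy ensemble; the data and dimensionality are illustrative, not Chandra effective-area curves:

```python
def principal_component(samples):
    """Mean and first principal component of a set of sample vectors,
    via power iteration on the sample covariance matrix."""
    n, d = len(samples), len(samples[0])
    mean = [sum(s[j] for s in samples) / n for j in range(d)]
    X = [[s[j] - mean[j] for j in range(d)] for s in samples]
    cov = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(200):  # power iteration converges to the top eigenvector
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v

# Toy "calibration ensemble": 2-D samples varying along the (1, 1) direction.
samples = [[float(t), float(t)] for t in range(5)]
mean_vec, pc1 = principal_component(samples)
```

In the paper's setting each sample vector would be a plausible effective-area curve; keeping the mean plus a few leading components compresses the ensemble so calibration uncertainty can be propagated cheaply through the fit.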
1993-08-01
subtitled "Simulation Data," consists of detailed information on the design parameter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure of merit data...merit, such as time to capture or maximum pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used
Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas
2015-12-01
The study evaluated whether the renal function decline rate per year with age in adults varies depending on two primary statistical analyses: cross-sectional (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16628 records (3946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected for up to 2364 days (mean: 793 days). A simple linear regression and random coefficient models were selected for the CS and LT analyses, respectively. The renal function decline rates per year were 1.33 and 0.95 ml/min/year for the CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that the rates differ depending on the statistical analysis, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rate per year with age in adults. In conclusion, our findings indicate that one should be cautious in interpreting reported renal function decline rates with age, because the estimates depend strongly on the statistical analysis used. From our analyses, a population longitudinal analysis (e.g. random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during a chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
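The CS/LT contrast can be reproduced in miniature: a cross-sectional fit over one observation per subject confounds cohort differences with aging, while averaging per-subject slopes (a crude stand-in for the random coefficient model, which would properly pool information across subjects) recovers the within-subject rate. All numbers below are hypothetical:

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Hypothetical subjects: within-subject decline of 0.5 ml/min/year, but the
# older cohort starts from much lower values (a cohort effect).
subjects = [
    ([30.0, 32.0, 34.0], [100.0, 99.0, 98.0]),
    ([60.0, 62.0, 64.0], [70.0, 69.0, 68.0]),
]

# Cross-sectional: one (baseline) observation per subject.
cs_rate = ols_slope([ages[0] for ages, _ in subjects],
                    [vals[0] for _, vals in subjects])

# Longitudinal stand-in: average the per-subject slopes from repeated measures.
lt_rate = sum(ols_slope(ages, vals) for ages, vals in subjects) / len(subjects)
```

Here the cross-sectional rate (-1.0/year) overstates the true within-subject rate (-0.5/year), mirroring the direction of the study's 1.33 vs 0.95 ml/min/year finding.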
NASA Astrophysics Data System (ADS)
Karakatsanis, L. P.; Pavlos, G. P.; Iliopoulos, A. C.; Pavlos, E. G.; Clark, P. M.; Duke, J. L.; Monos, D. S.
2018-09-01
This study combines two independent domains of science, the high throughput DNA sequencing capabilities of Genomics and complexity theory from Physics, to assess the information encoded by the different genomic segments of exonic, intronic and intergenic regions of the Major Histocompatibility Complex (MHC) and identify possible interactive relationships. The dynamic and non-extensive statistical characteristics of two well characterized MHC sequences from the homozygous cell lines, PGF and COX, in addition to two other genomic regions of comparable size, used as controls, have been studied using the reconstructed phase space theorem and the non-extensive statistical theory of Tsallis. The results reveal similar non-linear dynamical behavior in terms of complexity and self-organization features. In particular, the low-dimensional deterministic nonlinear chaotic and non-extensive statistical character of the DNA sequences was verified, with strong multifractal characteristics and long-range correlations. The nonlinear indices repeatedly verified that MHC sequences, whether exonic, intronic or intergenic, include varying levels of information and reveal an interaction of the genes with intergenic regions, whereby the lower the number of genes in a region, the less the complexity and information content of the intergenic region. Finally, we showed the significance of the intergenic region in the production of the DNA dynamics. The findings reveal interesting information content in all three genomic elements and interactive relationships of the genes with the intergenic regions. The results most likely are relevant to the whole genome and not only to the MHC. These findings are consistent with the ENCODE project, which has now established that the non-coding regions of the genome remain relevant, as they are functionally important and play a significant role in the regulation of expression of genes and coordination of the many biological processes of the cell.
Jacobsen, Julie S; Nielsen, Dennis B; Sørensen, Henrik; Søballe, Kjeld; Mechlenburg, Inger
2014-01-01
Background and purpose — Hip dysplasia can be treated with periacetabular osteotomy (PAO). We compared joint angles and joint moments during walking and running in young adults with hip dysplasia prior to and 6 and 12 months after PAO with those in healthy controls. Patients and methods — Joint kinematics and kinetics were recorded using a 3-D motion capture system. The pre- and postoperative gait characteristics quantified as the peak hip extension angle and the peak joint moment of hip flexion were compared in 23 patients with hip dysplasia (18–53 years old). Similarly, the gait patterns of the patients were compared with those of 32 controls (18–54 years old). Results — During walking, the peak hip extension angle and the peak hip flexion moment were significantly smaller at baseline in the patients than in the healthy controls. The peak hip flexion moment increased 6 and 12 months after PAO relative to baseline during walking, and 6 months after PAO relative to baseline during running. For running, the improvement did not reach statistical significance at 12 months. In addition, the peak hip extension angle during walking increased 12 months after PAO, though not statistically significantly. There were no statistically significant differences in peak hip extension angle and peak hip flexion moment between the patients and the healthy controls after 12 months. Interpretation — Walking and running characteristics improved after PAO in patients with symptomatic hip dysplasia, although gait modifications were still present 12 months postoperatively. PMID:25191933
Mellin, G
1988-06-01
Mobility of the hips and lumbar spine was measured in 301 men and 175 women who were in employment but suffered from chronic or recurrent low-back pain. The degree of low-back pain (LBP) was assessed with a questionnaire. Hip flexion, extension, internal rotation, and hamstring flexibility in the men, and hip flexion and extension in the women, had statistically significant negative correlations with LBP. Among the correlations between hip and lumbar spinal mobility, those of hip flexion and extension with lumbar rotation were the strongest.
Fundamental properties of fracture and seismicity in a non extensive statistical physics framework.
NASA Astrophysics Data System (ADS)
Vallianatos, Filippos
2010-05-01
A fundamental challenge in many scientific disciplines concerns upscaling, that is, determining the regularities and laws of evolution at some large scale from those known at a lower scale. Earthquake physics is no exception, with the challenge of understanding the transition from the laboratory scale to the scale of fault networks and large earthquakes. In this context, statistical physics has a remarkably successful track record in addressing the upscaling problem. It is natural then to consider that the physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, and in this sense we can consider the use of statistical physics not only appropriate but necessary to understand the collective properties of earthquakes [see Corral 2004, 2005a,b,c]. A significant attempt is given in a series of works [Main 1996; Rundle et al., 1997; Main et al., 2000; Main and Al-Kindy, 2002; Rundle et al., 2003; Vallianatos and Triantis, 2008a] that use classical statistical physics to describe seismicity. Then a natural question arises: what type of statistical physics is appropriate to describe effects from the fracture level up to the seismicity scale? The application of non-extensive statistical physics offers a consistent theoretical framework, based on a generalization of entropy, to analyze the behavior of natural systems with a fractal or multi-fractal distribution of their elements. Such natural systems, where long-range interactions or intermittency are important, lead to power-law behavior. This contrasts with the classical thermodynamic treatment of natural systems that rapidly attain equilibrium, which leads to exponential-law behavior. In the frame of the non-extensive statistical physics approach, the probability function p(X) is calculated using the maximum entropy formulation of Tsallis entropy, which involves the introduction of at least two constraints (Tsallis et al., 1998).
The first one is the classical normalization of p(X). The second one is based on the definition of the expectation value, which has to be generalized to the "q-expectation value" in accordance with the generalization of the entropy [Abe and Suzuki, 2003]. In order to calculate p(X) we apply the technique of Lagrange multipliers, maximizing an appropriate functional and leading to maximization of the Tsallis entropy under the constraints on the normalization and the q-expectation value. It is well known that the Gutenberg-Richter (G-R) power-law distribution has to be modified for large seismic moments because of energy conservation and geometrical reasons. Several models have been proposed, either in terms of a second power law with a larger b value beyond a crossover magnitude, or based on a magnitude cut-off using an exponential taper. In the present work we point out that the non-extensivity viewpoint is applicable to seismic processes. In the frame of a non-extensive approach based on Tsallis entropy we construct a generalized expression of the Gutenberg-Richter (GGR) law [Vallianatos, 2008]. The existence of a lower and/or upper bound to magnitude is discussed, and the conditions under which the GGR law leads to the classical G-R law are analysed. For the lowest earthquake sizes (i.e., energy levels) the correlations between the different elements involved in the evolution of an earthquake are short-ranged, and the G-R law can be deduced on the basis of the maximum entropy principle using Boltzmann-Gibbs (BG) statistics. As the size (i.e., energy) increases, long-range correlations become much more important, implying the necessity of using Tsallis entropy as an appropriate generalization of BG entropy. The power-law behaviour is derived as a special case, leading to b-values that are functions of the non-extensivity parameter q.
Furthermore, a theoretical analysis of the similarities between stress-stimulated electric and acoustic emissions and earthquakes is discussed, not only in the frame of the GGR law but also taking into account a universality in the description of the interevent-time distribution. Its particular form can be well expressed in the frame of a non-extensive approach. This formulation is very different from the exponential distribution expected for simple random Poisson processes and indicates the existence of a nontrivial universal mechanism in the generation process. All the aforementioned similarities between stress-stimulated electrical and acoustic emissions and seismicity suggest a connection with fracture phenomena at much larger scales, implying that a basic general mechanism is "actively hidden" behind all these phenomena [Vallianatos and Triantis, 2008b]. Examples from S. Aegean seismicity are given. Acknowledgements: This work is partially supported by the "NEXT EARTH" project FP7-PEOPLE, 2009-2011. References: Abe S. and Suzuki N., J. Geophys. Res. 108 (B2), 2113, 2003. Corral A., Phys. Rev. Lett. 92, 108501, 2004. Corral A., Nonlinear Proc. Geophys. 12, 89, 2005a. Corral A., Phys. Rev. E 71, 017101, 2005b. Corral A., Phys. Rev. Lett. 95, 028501, 2005c. Main I. G., Rev. Geophys., 34, 433, 1996. Main I. G., O'Brien G. and Henderson R., J. Geophys. Res., 105, 6105, 2000. Main I. G. and Al-Kindy F. H., Geophys. Res. Lett., 29, 7, 2002. Rundle J. B., Gross S., Klein W., Ferguson C. and Turcotte D., Tectonophysics, 277, 147-164, 1997. Rundle J. B., Turcotte D. L., Shcherbakov R., Klein W. and Sammis C., Rev. Geophys. 41, 1019, 2003. Tsallis C., J. Stat. Phys. 52, 479, 1988; see also http://tsallis.cat.cbpf.br/biblio.htm for an updated bibliography. Vallianatos F., 2nd IASME/WSEAS International Conference on Geology and Seismology (GES08), Cambridge, U.K., 2008. Vallianatos F. and Triantis D., Physica A, 387, 4940-4946, 2008a.
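The deformed exponential at the heart of this framework is easy to state. A minimal sketch (the exceedance form below is a schematic stand-in, not necessarily the exact GGR expression derived in Vallianatos (2008); `beta` is an illustrative scale parameter):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1 - q) x]^(1 / (1 - q)), with a cutoff
    to zero where the bracket turns negative; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def ggr_exceedance(m, q, beta):
    """Schematic generalized G-R law: fraction of events exceeding magnitude m
    modeled as a q-exponential decay. For q > 1 the tail is fatter than the
    exponential (classical G-R) case, reflecting long-range correlations."""
    return q_exp(-beta * m, q)
```

For q -> 1 the exceedance decays exponentially in magnitude, recovering classical Gutenberg-Richter behavior; q > 1 yields the power-law tail associated with long-range correlations.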
Teaching statistics to nursing students: an expert panel consensus.
Hayat, Matthew J; Eckardt, Patricia; Higgins, Melinda; Kim, MyoungJin; Schmiege, Sarah J
2013-06-01
Statistics education is a necessary element of nursing education, and its inclusion is recommended in the American Association of Colleges of Nursing guidelines for nurse training at all levels. This article presents a cohesive summary of an expert panel discussion, "Teaching Statistics to Nursing Students," held at the 2012 Joint Statistical Meetings. All panelists were statistics experts, had extensive teaching and consulting experience, and held faculty appointments in a U.S.-based nursing college or school. The panel discussed degree-specific curriculum requirements, course content, how to ensure nursing students understand the relevance of statistics, approaches to integrating statistics consulting knowledge, experience with classroom instruction, use of knowledge from the statistics education research field to make improvements in statistics education for nursing students, and classroom pedagogy and instruction on the use of statistical software. Panelists also discussed the need for evidence to make data-informed decisions about statistics education and training for nurses. Copyright 2013, SLACK Incorporated.
Transportation elements assessment : Town of Milton, September 15, 2009.
DOT National Transportation Integrated Search
2009-09-15
During the summer of 2009, the Delaware T2 Center collected extensive data and completed analyses related to transportation infrastructure in the Town of Milton, Delaware. This report presents those data, the analyses, and resulting recommendations.
Transportation elements assessment : Town of Milton, November 2, 2009.
DOT National Transportation Integrated Search
2010-11-02
During the summer of 2009, the Delaware T2 Center collected extensive data and completed analyses related to transportation infrastructure in the Town of Milton, Delaware. This report presents those data, the analyses, and resulting recommendations.
Impacts of extension access and cooperative membership on technology adoption and household welfare.
Wossen, Tesfamicheal; Abdoulaye, Tahirou; Alene, Arega; Haile, Mekbib G; Feleke, Shiferaw; Olanrewaju, Adetunji; Manyong, Victor
2017-08-01
This paper examines the impacts of access to extension services and cooperative membership on technology adoption, asset ownership and poverty using household-level data from rural Nigeria. Using different matching techniques and endogenous switching regression approach, we find that both extension access and cooperative membership have a positive and statistically significant effect on technology adoption and household welfare. Moreover, we find that both extension access and cooperative membership have heterogeneous impacts. In particular, we find evidence of a positive selection as the average treatment effects of extension access and cooperative membership are higher for farmers with the highest propensity to access extension and cooperative services. The impact of extension services on poverty reduction and of cooperatives on technology adoption is significantly stronger for smallholders with access to formal credit than for those without access. This implies that expanding rural financial markets can maximize the potential positive impacts of extension and cooperative services on farmers' productivity and welfare.
ERIC Educational Resources Information Center
Adair, Desmond; Jaeger, Martin; Price, Owen M.
2018-01-01
The use of a portfolio curriculum approach, when teaching a university introductory statistics and probability course to engineering students, is developed and evaluated. The portfolio curriculum approach, so called, as the students need to keep extensive records both as hard copies and digitally of reading materials, interactions with faculty,…
USDA-ARS?s Scientific Manuscript database
The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...
ERIC Educational Resources Information Center
Parrott, Roxanne; Silk, Kami; Dorgan, Kelly; Condit, Celeste; Harris, Tina
2005-01-01
Too little theory and research has considered the effects of communicating statistics in various forms on comprehension, perceptions of evidence quality, or evaluations of message persuasiveness. In a considered extension of Subjective Message Construct Theory (Morley, 1987), we advance a rationale relating evidence form to the formation of…
Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education
Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward
2011-01-01
Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific–purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. 
Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content-based design outperforms the traditional VLE-based design. PMID:21998652
NASA Astrophysics Data System (ADS)
Williams, Randolph; Goodwin, Laurel; Sharp, Warren; Mozley, Peter
2017-04-01
U-Th dates on calcite precipitated in coseismic extension fractures in the Loma Blanca normal fault zone, Rio Grande rift, NM, USA, constrain earthquake recurrence intervals from 150-565 ka. This is the longest direct record of seismicity documented for a fault in any tectonic environment. Combined U-Th and stable isotope analyses of these calcite veins define 13 distinct earthquake events. These data show that for more than 400 ka the Loma Blanca fault produced earthquakes with a mean recurrence interval of 40 ± 7 ka. The coefficient of variation for these events is 0.40, indicating strongly periodic seismicity consistent with a time-dependent model of earthquake recurrence. Stochastic statistical analyses further validate the inference that earthquake behavior on the Loma Blanca was time-dependent. The time-dependent nature of these earthquakes suggests that the seismic cycle was fundamentally controlled by a stress renewal process. However, this periodic cycle was punctuated by an episode of clustered seismicity at 430 ka. Recurrence intervals within the earthquake cluster were as low as 5-11 ka. Breccia veins formed during this episode exhibit carbon isotope signatures consistent with having formed through pronounced degassing of a CO2 charged brine during post-failure, fault-localized fluid migration. The 40 ka periodicity of the long-term earthquake record of the Loma Blanca fault is similar in magnitude to recurrence intervals documented through paleoseismic studies of other normal faults in the Rio Grande rift and Basin and Range Province. We propose that it represents a background rate of failure in intraplate extension. The short-term, clustered seismicity that occurred on the fault records an interruption of the stress renewal process, likely by elevated fluid pressure in deeper structural levels of the fault, consistent with fault-valve behavior. 
The relationship between recurrence interval and inferred fluid degassing suggests that pore fluid pressure along the fault may have been driven by variations in CO2 content, thereby fundamentally affecting earthquake frequency. Thus, the Loma Blanca fault provides a record of "naturally induced" seismicity, with lessons for better understanding anthropogenic induced seismicity.
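The periodicity claim above rests on the coefficient of variation of the recurrence intervals. A minimal sketch of that statistic (illustrative only; the study's stochastic analyses are more involved):

```python
import statistics

def coefficient_of_variation(intervals):
    """CV = standard deviation / mean of recurrence intervals.

    CV << 1 indicates quasi-periodic recurrence (time-dependent,
    stress-renewal behavior); CV ~ 1 is what a memoryless Poisson
    process would give; CV > 1 indicates temporal clustering.
    """
    return statistics.stdev(intervals) / statistics.mean(intervals)
```

A perfectly periodic record gives CV = 0; the value of 0.40 reported for the Loma Blanca events sits well below the Poisson benchmark of 1.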
Inferential Statistics in "Language Teaching Research": A Review and Ways Forward
ERIC Educational Resources Information Center
Lindstromberg, Seth
2016-01-01
This article reviews all (quasi)experimental studies appearing in the first 19 volumes (1997-2015) of "Language Teaching Research" (LTR). Specifically, it provides an overview of how statistical analyses were conducted in these studies and of how the analyses were reported. The overall conclusion is that there has been a tight adherence…
Vogelbacher, Christoph; Möbius, Thomas W D; Sommer, Jens; Schuster, Verena; Dannlowski, Udo; Kircher, Tilo; Dempfle, Astrid; Jansen, Andreas; Bopp, Miriam H A
2018-05-15
Large, longitudinal, multi-center MR neuroimaging studies require comprehensive quality assurance (QA) protocols for assessing the general quality of the compiled data, indicating potential malfunctions in the scanning equipment, and evaluating inter-site differences that need to be accounted for in subsequent analyses. We describe the implementation of a QA protocol for functional magnetic resonance imaging (fMRI) data based on the regular measurement of an MRI phantom and an extensive variety of currently published QA statistics. The protocol is implemented in the MACS (Marburg-Münster Affective Disorders Cohort Study, http://for2107.de/), a two-center research consortium studying the neurobiological foundations of affective disorders. Between February 2015 and October 2016, 1214 phantom measurements were acquired using a standard fMRI protocol. Using 444 healthy control subjects measured between 2014 and 2016 in the cohort, we investigate the extent of between-site differences in contrast to the dependence on subject-specific covariates (age and sex) for structural MRI, fMRI, and diffusion tensor imaging (DTI) data. We show that most of the presented QA statistics differ severely not only between the two scanners used for the cohort but also between experimental settings (e.g. hardware and software changes), demonstrate that some of these statistics depend on external variables (e.g. time of day, temperature), highlight their strong dependence on proper handling of the MRI phantom, and show how the use of a phantom holder may balance this dependence. Site effects, however, do not only exist for the phantom data, but also for human MRI data. Using T1-weighted structural images, we show that total intracranial (TIV), grey matter (GMV), and white matter (WMV) volumes differ significantly between the MR scanners, with large effect sizes.
Voxel-based morphometry (VBM) analyses show that these structural differences observed between scanners are most pronounced in the bilateral basal ganglia, thalamus, and posterior regions. Using DTI data, we also show that fractional anisotropy (FA) differs between sites in almost all regions assessed. When pooling data from multiple centers, our data show that it is a necessity to account not only for inter-site differences but also for hardware and software changes of the scanning equipment. Also, the strong dependence of the QA statistics on the reliable placement of the MRI phantom shows that the use of a phantom holder is recommended to reduce the variance of the QA statistics and thus to increase the probability of detecting potential scanner malfunctions. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Jones, A. L.
1972-01-01
Requirements and concepts and the tradeoff analysis leading to the preferred concept are presented. Integrated analyses are given for subsystems and thermal control. Specific tradeoffs and analyses are also given for water management, atmosphere control, energy storage, radiators, navigation, control moment gyros, and system maintenance. The analyses of manipulator concepts and requirements, and supplemental analyses of information management issues are summarized. Subsystem reliability analyses include a detailed discussion of the critical failure analysis.
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons: the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added, these resources are constantly improved. 
The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models.
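SOCR implements these tests in Java; as an illustration of what the non-parametric category computes, here is a minimal Python sketch of the Wilcoxon rank-sum statistic with midrank handling of ties (function names are ours):

```python
def midranks(values):
    """Ranks of `values` (1-based), with tied values sharing their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0        # average of the 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def wilcoxon_rank_sum(a, b):
    """Rank-sum statistic W: sum of the ranks of sample `a` in the pooled data."""
    pooled = list(a) + list(b)
    r = midranks(pooled)
    return sum(r[:len(a)])
```

Under the null hypothesis of identical distributions, W is compared against its permutation distribution (or a normal approximation) to obtain a p-value.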
Extense historical droughts in Spain derived from documentary sources
NASA Astrophysics Data System (ADS)
Dominguez-Castro, F.; García-Herrera, R.; Barriendos, M.
2009-09-01
Documentary records, especially those from rogation ceremonies, have been extensively used to build proxy series of droughts and floods in Spain. Most previous work has focused on the abstraction of the documents and the building of individual series, but less attention has been paid to the joint analysis of this type of record. This is problematic because, due to the diversity of Spanish climates, the climatological meaning of the rogation ceremonies changes depending on the region considered. This paper aims to analyse the spatial extension of drought events from the rogation records of Barcelona, Bilbao, Gerona, Murcia, Seville, Tarragona, Toledo, Tortosa and Zamora, which cover the 16th to 19th centuries. The representativeness of each of them is analysed taking into account the local climate and the variability of the series. Then the spatial scale of the recorded droughts is examined at the seasonal scale. The results show high multidecadal variability, with the driest periods at the national scale recorded during the 1680s, 1730s and 1780s. Finally, the dry years of 1680, 1683 and 1817 are analysed in detail.
q-triplet for Brazos River discharge: The edge of chaos?
NASA Astrophysics Data System (ADS)
Stosic, Tatijana; Stosic, Borko; Singh, Vijay P.
2018-04-01
We study the daily discharge data of Brazos River in Texas, USA, from 1900 to 2017, in terms of concepts drawn from the non-extensive statistics recently introduced by Tsallis. We find that the Brazos River discharge indeed follows non-extensive statistics regarding equilibrium, relaxation and sensitivity. Besides being the first such finding of a full-fledged q-triplet in hydrological data with possible future impact on water resources management, the fact that all three Tsallis q-triplet values are remarkably close to those of the logistic map at the onset of chaos opens up new questions towards a deeper understanding of the Brazos River dynamics, that may prove relevant for hydrological research in a more general sense.
Non-extensive quantum statistics with particle-hole symmetry
NASA Astrophysics Data System (ADS)
Biró, T. S.; Shen, K. M.; Zhang, B. W.
2015-06-01
Based on Tsallis entropy (1988) and the corresponding deformed exponential function, generalized distribution functions for bosons and fermions have been in use for some time (Teweldeberhan et al. 2003; Silva et al. 2010). However, aiming at a non-extensive quantum statistics, further requirements arise from the symmetric handling of particles and holes (excitations above and below the Fermi level). Naive replacements of the exponential function or "cut and paste" solutions fail to satisfy this symmetry and to be smooth at the Fermi level at the same time. We solve this problem by a general ansatz dividing the deformed exponential into odd and even terms, and demonstrate how earlier suggestions, like the κ- and q-exponential, behave in this respect.
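The symmetry requirement is easy to check numerically. A minimal sketch (symmetrizing via the odd part of the logarithm is an illustrative construction in the spirit of the abstract, not necessarily the paper's exact ansatz; valid while the q-exponential's bracket stays positive):

```python
import math

def q_exp(x, q):
    """Tsallis deformed exponential; exp(x) in the q -> 1 limit."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

def naive_q_fermi(x, q):
    """Naive replacement exp -> q_exp in the Fermi distribution."""
    return 1.0 / (q_exp(x, q) + 1.0)

def symmetric_q_fermi(x, q):
    """Occupation built from the odd part of ln q_exp, so that
    n(x) + n(-x) = 1 holds exactly (particle-hole symmetry)."""
    odd = 0.5 * (math.log(q_exp(x, q)) - math.log(q_exp(-x, q)))
    return 1.0 / (math.exp(odd) + 1.0)
```

The naive form visibly violates n(x) + n(-x) = 1 for q != 1, while the symmetrized form satisfies it by construction and remains smooth at the Fermi level (x = 0).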
Transportation elements assessment : Town of Milton, Delaware, September 29, 2009.
DOT National Transportation Integrated Search
2009-09-29
During the summer of 2009, the Delaware T2 Center collected extensive data and completed analyses related to transportation infrastructure in the Town of Milton, Delaware. This report presents those data, the analyses, and resulting recommendations.
Body shape analyses of large persons in South Korea.
Park, Woojin; Park, Sungjoon
2013-01-01
Despite the prevalence of obesity and overweight, anthropometric characteristics of large individuals have not been extensively studied. This study investigated body shapes of large persons (Broca index ≥ 20, BMI ≥ 25 or WHR>1.0) using stature-normalised body dimensions data from the latest South Korean anthropometric survey. For each sex, a factor analysis was performed on the anthropometric data set to identify the key factors that explain the shape variability; and then, a cluster analysis was conducted on the factor scores data to determine a set of representative body types. The body types were labelled in terms of their distinct shape characteristics and their relative frequencies were computed for each of the four age groups considered: the 10s, 20s-30s, 40s-50s and 60s. The study findings may facilitate creating artefacts that anthropometrically accommodate large individuals, developing digital human models of large persons and designing future ergonomics studies on largeness. This study investigated body shapes of large persons using anthropometric data from South Korea. For each sex, multivariate statistical analyses were conducted to identify the key factors of the body shape variability and determine the representative body types. The study findings may facilitate designing artefacts that anthropometrically accommodate large persons.
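The two-stage pipeline described (dimension reduction on the anthropometric matrix, then clustering of the scores) can be sketched compactly. A minimal illustration using PCA via SVD as a stand-in for factor analysis, plus a deterministic Lloyd's k-means (the study used proper factor analysis; all names here are ours):

```python
import numpy as np

def principal_scores(X, k):
    """Scores of each subject on the top-k principal axes of the
    column-centred data matrix X (rows = subjects, cols = measurements)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def kmeans(X, k, iters=50):
    """Plain Lloyd's algorithm with deterministic initialization from the
    first k rows; returns (labels, centroids)."""
    centroids = X[:k].astype(float).copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels, centroids
```

Each cluster centroid then plays the role of a representative body type.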
Detection of greenhouse-gas-induced climatic change. Progress report, July 1, 1994--July 31, 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, P.D.; Wigley, T.M.L.
1995-07-21
The objective of this research is to assemble and analyze instrumental climate data and to develop and apply climate models as a basis for detecting greenhouse-gas-induced climatic change, and for the validation of General Circulation Models. In addition to changes due to variations in anthropogenic forcing, including greenhouse gas and aerosol concentration changes, the global climate system exhibits a high degree of internally-generated and externally-forced natural variability. To detect the anthropogenic effect, its signal must be isolated from the "noise" of this natural climatic variability. A high-quality, spatially extensive data base is required to define the noise and its spatial characteristics. To facilitate this, available land and marine data bases will be updated and expanded. The data will be analyzed to determine the potential effects on climate of greenhouse gas and aerosol concentration changes and other factors. Analyses will be guided by a variety of models, from simple energy balance climate models to coupled atmosphere-ocean General Circulation Models. These analyses are oriented towards obtaining early evidence of anthropogenic climatic change that would lead either to confirmation, rejection or modification of model projections, and towards the statistical validation of General Circulation Model control runs and perturbation experiments.
Potency factors for risk assessment at Libby, Montana.
Moolgavkar, Suresh H; Turim, Jay; Alexander, Dominik D; Lau, Edmund C; Cushing, Colleen A
2010-08-01
We reanalyzed the Libby vermiculite miners' cohort assembled by Sullivan to estimate potency factors for lung cancer, mesothelioma, nonmalignant respiratory disease (NMRD), and all-cause mortality associated with exposure to Libby fibers. Our principal statistical tool for analyses of lung cancer, NMRD, and total mortality in the cohort was the time-dependent proportional hazards model. For mesothelioma, we used an extension of the Peto formula. For a cumulative exposure to Libby fiber of 100 f/mL-yr, our estimates of relative risk (RR) are as follows: lung cancer, RR = 1.12, 95% confidence interval (CI) = [1.06, 1.17]; NMRD, RR = 1.14, 95% CI = [1.09, 1.18]; total mortality, RR = 1.06, 95% CI = [1.04, 1.08]. These estimates were virtually identical when analyses were restricted to the subcohort of workers who were employed for at least one year. For mesothelioma, our estimate of potency is K(M) = 0.5 x 10^-8, 95% CI = [0.3 x 10^-8, 0.8 x 10^-8]. Finally, we estimated the mortality ratios standardized against the U.S. population for lung cancer, NMRD, and total mortality and obtained estimates that were in good agreement with those reported by Sullivan. The estimated potency factors form the basis for a quantitative risk assessment at Libby.
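Under a proportional hazards model, a relative risk quoted at one cumulative exposure determines the risk at any other, assuming (as a simplifying illustration, not necessarily the paper's exact specification) a log-linear exposure-response:

```python
import math

def rr_at(cum_exposure, beta):
    """Relative risk under a log-linear exposure-response: RR = exp(beta * E),
    where E is cumulative exposure (here in f/mL-yr)."""
    return math.exp(beta * cum_exposure)

# Slope implied by the reported lung-cancer RR of 1.12 at 100 f/mL-yr
beta_lung = math.log(1.12) / 100.0
```

With this slope, doubling the cumulative exposure to 200 f/mL-yr would imply RR = 1.12^2, roughly 1.25.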
Weighing Evidence “Steampunk” Style via the Meta-Analyser
Bowden, Jack; Jackson, Chris
2016-01-01
The funnel plot is a graphical visualization of summary data estimates from a meta-analysis, and is a useful tool for detecting departures from the standard modeling assumptions. Although perhaps not widely appreciated, a simple extension of the funnel plot can help to facilitate an intuitive interpretation of the mathematics underlying a meta-analysis at a more fundamental level, by equating it to determining the center of mass of a physical system. We used this analogy to explain the concepts of weighing evidence and of biased evidence to a young audience at the Cambridge Science Festival, without recourse to precise definitions or statistical formulas and with a little help from Sherlock Holmes! Following on from the science fair, we have developed an interactive web-application (named the Meta-Analyser) to bring these ideas to a wider audience. We envisage that our application will be a useful tool for researchers when interpreting their data. First, to facilitate a simple understanding of fixed and random effects modeling approaches; second, to assess the importance of outliers; and third, to show the impact of adjusting for small study bias. This final aim is realized by introducing a novel graphical interpretation of the well-known method of Egger regression. PMID:28003684
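The center-of-mass analogy maps directly onto the fixed-effect (inverse-variance) estimator. A minimal sketch (function names are ours):

```python
def fixed_effect_meta(estimates, std_errors):
    """Inverse-variance weighted pooled estimate and its standard error.

    In the physical analogy, each study is a mass w_i = 1 / se_i^2 hung at
    position estimate_i on a rule; the pooled estimate is the centre of mass.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    total = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, estimates)) / total
    pooled_se = (1.0 / total) ** 0.5
    return pooled, pooled_se
```

Random-effects modeling adds a between-study variance component to each weight; Egger regression, by contrast, regresses the standardized estimates on their precisions to diagnose small-study bias.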
Bordes, Frédéric; Morand, Serge; Pilosof, Shai; Claude, Julien; Krasnov, Boris R; Cosson, Jean-François; Chaval, Yannick; Ribas, Alexis; Chaisiri, Kittipong; Blasdell, Kim; Herbreteau, Vincent; Dupuy, Stéphane; Tran, Annelise
2015-09-01
1. While the effects of deforestation and habitat fragmentation on parasite prevalence or richness are well investigated, host-parasite networks are still understudied despite their importance in understanding the mechanisms of these major disturbances. Because fragmentation may negatively impact species occupancy, abundance and co-occurrence, we predict a link between spatiotemporal changes in habitat and the architecture of host-parasite networks. 2. For this, we used an extensive data set on 16 rodent species and 29 helminth species from seven localities of South-East Asia. We analysed the effects of rapid deforestation on connectance and modularity of helminth-parasite networks. We estimated both the degree of fragmentation and the rate of deforestation through the development of land uses and their changes through the last 20 to 30 years in order to take into account the dynamics of habitat fragmentation in our statistical analyses. 3. We found that rapid fragmentation does not affect helminth species richness per se but impacts host-parasite interactions as the rodent-helminth network becomes less connected and more modular. 4. Our results suggest that parasite sharing among host species may become more difficult to maintain with the increase of habitat disturbance. © 2015 The Authors. Journal of Animal Ecology © 2015 British Ecological Society.
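Connectance, one of the two network metrics analysed above, is simply the fraction of realized links. A minimal sketch for a bipartite rodent-helminth network (all names are ours, for illustration):

```python
def connectance(links, n_hosts, n_parasites):
    """Bipartite connectance: realized host-parasite links divided by the
    number of possible links (n_hosts * n_parasites)."""
    return len(set(links)) / (n_hosts * n_parasites)

# Toy network: 2 rodent species, 4 helminth species, 4 observed interactions
toy_links = [("rodent_1", "helminth_a"), ("rodent_1", "helminth_b"),
             ("rodent_2", "helminth_c"), ("rodent_2", "helminth_d")]
```

For the toy network, connectance is 4/8 = 0.5; the study's finding is that this quantity drops (and modularity rises) as habitat fragmentation increases. Modularity itself requires a community-detection step over the bipartite graph and is omitted here.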
1987-08-01
HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are...800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These...resonances may be obtained by using a finer frequency increment. Statistical Energy Analysis The basic assumption used in SEA analysis is that within each band
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
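The between-groups d that this single-case statistic is designed to match can be sketched directly from two phases of hypothetical data; the corrections that make the single-case version equivalent (e.g. adjustments for autocorrelation and small samples) live in the authors' SPSS macros and are not reproduced here:

```python
import math

def cohens_d(group1, group2):
    """Standardised mean difference with a pooled SD (between-groups d).

    This is the classic between-groups statistic; the single-case
    version described in the article adds corrections that are
    deliberately omitted from this sketch.
    """
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# hypothetical treatment and baseline phases of one case
treatment = [8.0, 9.0, 10.0, 9.0, 9.0]
baseline = [5.0, 6.0, 5.0, 6.0, 5.5]
d = cohens_d(treatment, baseline)   # positive: treatment > baseline
```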
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.
Fitchett, Elizabeth J A; Seale, Anna C; Vergnano, Stefania; Sharland, Michael; Heath, Paul T; Saha, Samir K; Agarwal, Ramesh; Ayede, Adejumoke I; Bhutta, Zulfiqar A; Black, Robert; Bojang, Kalifa; Campbell, Harry; Cousens, Simon; Darmstadt, Gary L; Madhi, Shabir A; Meulen, Ajoke Sobanjo-Ter; Modi, Neena; Patterson, Janna; Qazi, Shamim; Schrag, Stephanie J; Stoll, Barbara J; Wall, Stephen N; Wammanda, Robinson D; Lawn, Joy E
2016-10-01
Neonatal infections are estimated to account for a quarter of the 2·8 million annual neonatal deaths, as well as approximately 3% of all disability-adjusted life-years. Despite this burden, few data are available on incidence, aetiology, and outcomes, particularly regarding impairment. We aimed to develop guidelines for improved scientific reporting of observational neonatal infection studies, to increase comparability and to strengthen research in this area. This checklist, Strengthening the Reporting of Observational Studies in Epidemiology for Newborn Infection (STROBE-NI), is an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement. STROBE-NI was developed following systematic reviews of published literature (1996-2015), compilation of more than 130 potential reporting recommendations, and circulation of a survey to relevant professionals worldwide, eliciting responses from 147 professionals from 37 countries. An international consensus meeting of 18 participants (with expertise in infectious diseases, neonatology, microbiology, epidemiology, and statistics) identified priority recommendations for reporting, additional to the STROBE statement. Implementation of these STROBE-NI recommendations, and linked checklist, aims to improve scientific reporting of neonatal infection studies, increasing data utility and allowing meta-analyses and pathogen-specific burden estimates to inform global policy and new interventions, including maternal vaccines. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Yijiao; Huang, Peng; Xin, Zheng; Zeng, Lang; Liu, Xiaoyan; Du, Gang; Kang, Jinfeng
2014-01-01
In this work, three-dimensional technology computer-aided design (TCAD) simulations are performed to investigate the impact of random discrete dopant (RDD) fluctuation, including that induced in the extension regions, in 14 nm silicon-on-insulator (SOI) gate-source/drain (G-S/D) underlap fin field-effect transistors (FinFETs). To fully understand the RDD impact in the extension, the RDD effect is evaluated in the channel and extension separately and together. The statistical variability of FinFET performance parameters including threshold voltage (Vth), subthreshold slope (SS), drain-induced barrier lowering (DIBL), drive current (Ion), and leakage current (Ioff) is analyzed. The results indicate that RDD in the extension can lead to substantial variability, especially for SS, DIBL, and Ion, and should be taken into account together with that in the channel to obtain an accurate estimate of random dopant fluctuation (RDF) effects. Meanwhile, a higher doping concentration in the extension region is suggested from the perspective of overall variability control.
Sargeant, J M; O'Connor, A M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P
2016-11-01
Reporting of observational studies in veterinary research presents challenges that often are not addressed in published reporting guidelines. To develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. Consensus meeting of experts. Mississauga, Canada. Seventeen experts from North America, Europe, and Australia. Experts completed a pre-meeting survey about whether items in the STROBE statement should be modified or added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. During the meeting, each STROBE item was discussed to determine whether or not rewording was recommended and whether additions were warranted. Anonymous voting was used to determine consensus. Six items required no modifications or additions. Modifications or additions were made to the STROBE items 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources/measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). The methods and processes used were similar to those used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food-producing and companion animals, products of animal origin, aquaculture, and wildlife. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
Walker, V A; Tranquille, C A; Newton, J R; Dyson, S J; Brandham, J; Northrop, A J; Murray, R C
2017-09-01
Dressage horses are often asked to work in lengthened paces during training and competition, but to date there is limited information about the biomechanics of dressage-specific paces. Preliminary work has shown increased fetlock extension in extended compared with collected paces, but further investigation of the kinematic differences between collected, medium and extended trot in dressage horses is warranted. Investigation of the effect of collected vs. medium/extended trot on limb kinematics of dressage horses. Prospective kinematic evaluation. Twenty clinically sound horses in active dressage training were used. Group 1: Ten young horses (≤6 years) were assessed at collected and medium trot. Group 2: Ten mature horses (≥9 years) were assessed at collected and extended trot. All horses were evaluated on two different surfaces. High-speed motion capture (240 Hz) was used to determine kinematic variables. Fore- and hindlimb angles were measured at mid-stance. Descriptive statistics and mixed effect multilevel regression analyses were performed. Speed and stride length were reduced and stride duration increased at collected compared with medium/extended trot. Lengthened trot (medium/extended trot) was associated with increased fetlock extension in both the fore- and hindlimbs in both groups of horses. Changes were greater in mature horses compared with young horses. Shoulder and carpus angles were associated with forelimb fetlock angle. Hock angle was not significantly influenced by pace. Surface had no effect on fetlock or hock angles. Only 2D motion analysis was carried out. Results may have differed in horses with more extreme gait characteristics. Medium/extended trot increases extension of the fore- and hindlimb fetlock joints compared with collected trot in both young and mature dressage horses. © 2017 EVJ Ltd.
Biomechanical measures of knee joint mobilization.
Silvernail, Jason L; Gill, Norman W; Teyhen, Deydre S; Allison, Stephen C
2011-08-01
The purpose of this study was to quantify the biomechanical properties of specific manual therapy techniques in patients with symptomatic knee osteoarthritis. Twenty subjects (7 female/13 male, age 54±8 years, ht 1·7±0·1 m, wt 94·2±21·8 kg) participated in this study. One physical therapist delivered joint mobilizations (tibiofemoral extension and flexion; patellofemoral medial-lateral and inferior glide) at two grades (Maitland's grade III and grade IV). A capacitance-based pressure mat was used to capture biomechanical characteristics of force and frequency during 2 trials of 15 second mobilizations. Statistical analysis included intraclass correlation coefficient (ICC(3,1)) for intrarater reliability and 2×4 repeated measures analyses of variance and post-hoc comparison tests. Force (Newtons) measurements (mean, max.) for grade III were: extension 45, 74; flexion 39, 61; medial-lateral glide 20, 34; inferior glide 16, 27. Force (Newtons) measurements (mean, max.) for grade IV were: extension 57, 76; flexion 47, 68; medial-lateral glide 23, 36; inferior glide 18, 35. Frequency (Hz) measurements were between 0·9 and 1·2 for grade III, and between 2·1 and 2·4 for grade IV. ICCs were above 0·90 for almost all measures. Maximum force measures were between the ranges reported for cervical and lumbar mobilization at similar grades. Mean force measures were greater at grade IV than III. Oscillation frequency and peak-to-peak amplitude measures were consistent with the grade performed (i.e. greater frequency at grade IV, greater peak-to-peak amplitude at grade III). Intrarater reliability for force, peak-to-peak amplitude and oscillation frequency for knee joint mobilizations was excellent.
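The ICC(3,1) reported for intrarater reliability is the two-way mixed-effects, consistency, single-measure coefficient. A minimal sketch, using hypothetical paired-trial force values rather than the study's data:

```python
import numpy as np

def icc_3_1(ratings):
    """ICC(3,1): two-way mixed effects, consistency, single measure.

    ratings: (n_subjects, k_trials) array.
    ICC = (MS_rows - MS_error) / (MS_rows + (k - 1) * MS_error)
    """
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)   # per-subject means
    col_means = Y.mean(axis=0)   # per-trial means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_total = ((Y - grand) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

# hypothetical mean mobilization forces (N): 5 subjects x 2 trials
trials = [[45.0, 46.0], [39.0, 40.0], [20.0, 21.0],
          [16.0, 15.0], [57.0, 58.0]]
icc = icc_3_1(trials)   # near-identical trials -> ICC close to 1
```

Because the two trials per subject agree closely while subjects differ widely, the coefficient lands well above the 0·90 threshold reported in the abstract.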
Testing Extension Services through AKAP Models
ERIC Educational Resources Information Center
De Rosa, Marcello; Bartoli, Luca; La Rocca, Giuseppe
2014-01-01
Purpose: The aim of the paper is to analyse the attitude of Italian farms in gaining access to agricultural extension services (AES). Design/methodology/approach: The ways Italian farms use AES are described through the AKAP (Awareness, Knowledge, Adoption, Product) sequence. This article investigated the AKAP sequence by submitting a…
ERIC Educational Resources Information Center
Thompson, Bruce; Melancon, Janet G.
Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…
Comments on `A Cautionary Note on the Interpretation of EOFs'.
NASA Astrophysics Data System (ADS)
Behera, Swadhin K.; Rao, Suryachandra A.; Saji, Hameed N.; Yamagata, Toshio
2003-04-01
We demonstrate the misleading aspects of the statistical analyses used by Dommenget and Latif, which raise concerns about some of the reported climate modes. Adopting simple statistical techniques, we show the physical existence of the Indian Ocean dipole mode and then discuss the limitations of varimax and regression analyses in capturing this climate mode.
Ramanathan, Arvind; Savol, Andrej J.; Agarwal, Pratul K.; Chennubhotla, Chakra S.
2012-01-01
Biomolecular simulations at millisecond and longer timescales can provide vital insights into functional mechanisms. Since post-simulation analyses of such large trajectory data sets can be a limiting factor in obtaining biological insights, there is an emerging need to identify key dynamical events and relate these events to the biological function online, that is, as simulations are progressing. Recently, we introduced a novel computational technique, quasi-anharmonic analysis (QAA) (PLoS One 6(1): e15827), for partitioning the conformational landscape into a hierarchy of functionally relevant sub-states. The unique capabilities of QAA are enabled by exploiting anharmonicity in the form of fourth-order statistics for characterizing atomic fluctuations. In this paper, we extend QAA for analyzing long time-scale simulations online. In particular, we present HOST4MD, a higher-order statistical toolbox for molecular dynamics simulations, which (1) identifies key dynamical events as simulations are in progress, (2) explores potential sub-states and (3) identifies conformational transitions that enable the protein to access those sub-states. We demonstrate HOST4MD on microsecond time-scale simulations of the enzyme adenylate kinase in its apo state. HOST4MD identifies several conformational events in these simulations, revealing how the intrinsic coupling between the three sub-domains (LID, CORE and NMP) changes during the simulations. Further, it also identifies an inherent asymmetry in the opening/closing of the two binding sites. We anticipate HOST4MD will provide a powerful and extensible framework for detecting biophysically relevant conformational coordinates from long time-scale simulations. PMID:22733562
Karuza, Elisabeth A; Li, Ping; Weiss, Daniel J; Bulgarelli, Federica; Zinszer, Benjamin D; Aslin, Richard N
2016-10-01
Successful knowledge acquisition requires a cognitive system that is both sensitive to statistical information and able to distinguish among multiple structures (i.e., to detect pattern shifts and form distinct representations). Extensive behavioral evidence has highlighted the importance of cues to structural change, demonstrating how, without them, learners fail to detect pattern shifts and are biased in favor of early experience. Here, we seek a neural account of the mechanism underpinning this primacy effect in learning. During fMRI scanning, adult participants were presented with two artificial languages: a familiar language (L1) on which they had been pretrained followed by a novel language (L2). The languages were composed of the same syllable inventory organized according to unique statistical structures. In the absence of cues to the transition between languages, posttest familiarity judgments revealed that learners on average more accurately segmented words from the familiar language compared with the novel one. Univariate activation and functional connectivity analyses showed that participants with the strongest learning of L1 had decreased recruitment of fronto-subcortical and posterior parietal regions, in addition to a dissociation between downstream regions and early auditory cortex. Participants with a strong new language learning capacity (i.e., higher L2 scores) showed the opposite trend. Thus, we suggest that a bias toward neural efficiency, particularly as manifested by decreased sampling from the environment, accounts for the primacy effect in learning. Potential implications of this hypothesis are discussed, including the possibility that "inefficient" learning systems may be more sensitive to structural changes in a dynamic environment.
Early-type galaxies in the Antlia cluster: catalogue and isophotal analysis
NASA Astrophysics Data System (ADS)
Calderón, Juan P.; Bassino, Lilia P.; Cellone, Sergio A.; Gómez, Matías
2018-06-01
We present a statistical isophotal analysis of 138 early-type galaxies in the Antlia cluster, located at a distance of ~35 Mpc. The observational material consists of CCD images of four 36 × 36 arcmin² fields obtained with the MOSAIC II camera at the Blanco 4-m telescope at Cerro Tololo Interamerican Observatory. Our present work supersedes previous Antlia studies in the sense that the covered area is four times larger, the limiting magnitude is MB ~ -9.6 mag, and the surface photometry parameters of each galaxy are derived from Sérsic model fits extrapolated to infinity. In a previous companion study we focused on the scaling relations obtained by means of surface photometry; here we present the data on which that paper is based, the parameters of the isophotal fits, and an isophotal analysis. For each galaxy, we derive isophotal shape parameters along the semimajor axis and search for correlations within different radial bins. Through extensive statistical tests, we also analyse the behaviour of these values against photometric and global parameters of the galaxies themselves. While some galaxies do display radial gradients in their ellipticity (ɛ) and/or their Fourier coefficients, differences in mean values between adjacent regions are not statistically significant. Regarding Fourier coefficients, dwarf galaxies usually display gradients between all adjacent regions, while non-dwarfs tend to show this behaviour just between the two outermost regions. Globally, there is no obvious correlation between Fourier coefficients and luminosity for the whole magnitude range (-12 ≳ MV ≳ -22); however, dwarfs display much higher dispersions at all radii.
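Sérsic model fitting of the kind used for the surface photometry can be sketched as a nonlinear least-squares fit. The profile below uses the common b_n ≈ 2n − 1/3 approximation, and the synthetic data are illustrative, not the paper's pipeline:

```python
import numpy as np
from scipy.optimize import curve_fit

def sersic(r, I_e, r_e, n):
    """Sérsic surface-brightness profile I(r).

    I_e: intensity at the effective radius r_e; n: Sérsic index.
    b_n is approximated as 2n - 1/3, adequate for n >~ 0.5.
    """
    b_n = 2.0 * n - 1.0 / 3.0
    return I_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

# synthetic de Vaucouleurs-like profile (n = 4), arbitrary units
r = np.linspace(0.5, 30.0, 60)
true_params = (100.0, 5.0, 4.0)           # I_e, r_e, n
I_obs = sersic(r, *true_params)           # noiseless for clarity
popt, pcov = curve_fit(sersic, r, I_obs, p0=(80.0, 4.0, 3.0))
```

On noiseless data with a reasonable starting guess the fit recovers the generating parameters; on real isophotal photometry the starting values and noise handling matter considerably more.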
Barbie, Dana L.; Wehmeyer, Loren L.
2012-01-01
Trends in selected streamflow statistics during 1922-2009 were evaluated at 19 long-term streamflow-gaging stations considered indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico. The U.S. Geological Survey, in cooperation with the Texas Water Development Board, evaluated streamflow data from streamflow-gaging stations with more than 50 years of record that were active as of 2009. The outflows into Arkansas and Louisiana were represented by 3 streamflow-gaging stations, and outflows into the Gulf of Mexico, including Galveston Bay, were represented by 16 streamflow-gaging stations. Monotonic trend analyses were done using the following three streamflow statistics generated from daily mean values of streamflow: (1) annual mean daily discharge, (2) annual maximum daily discharge, and (3) annual minimum daily discharge. The trend analyses were based on the nonparametric Kendall's Tau test, which is useful for the detection of monotonic upward or downward trends with time. A total of 69 trend analyses by Kendall's Tau were computed: 19 periods of streamflow multiplied by the 3 streamflow statistics, plus 12 additional trend analyses because the periods of record for 2 streamflow-gaging stations were divided into periods representing pre- and post-reservoir impoundment. Unless otherwise described, each trend analysis used the entire period of record for each streamflow-gaging station. The monotonic trend analysis detected 11 statistically significant downward trends, 37 instances of no trend, and 21 statistically significant upward trends. One general region studied, which seemingly has relatively more upward trends for many of the streamflow statistics analyzed, includes the rivers and associated creeks and bayous draining to Galveston Bay in the Houston metropolitan area. Lastly, the westernmost river basins considered (the Nueces and Rio Grande) had statistically significant downward trends for many of the streamflow statistics analyzed.
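The Kendall's Tau trend test at the heart of these analyses is straightforward to reproduce; the station record below is hypothetical:

```python
from scipy.stats import kendalltau

# hypothetical annual mean daily discharge (cfs) for one station
years = list(range(1990, 2000))
discharge = [820.0, 795.0, 760.0, 770.0, 730.0,
             700.0, 690.0, 660.0, 655.0, 640.0]

tau, p_value = kendalltau(years, discharge)
# tau < 0 indicates a downward monotonic trend; a small p-value
# (e.g. < 0.05) flags the trend as statistically significant
```

Here the series declines in nearly every year, so tau is strongly negative and the trend is significant; a mixed record would give tau near zero and a large p-value.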
Kyle J. Haynes; Andrew M. Liebhold; Ottar N. Bjørnstad; Andrew J. Allstadt; Randall S. Morin
2018-01-01
Evaluating the causes of spatial synchrony in population dynamics in nature is notoriously difficult due to a lack of data and appropriate statistical methods. Here, we use a recently developed method, a multivariate extension of the local indicators of spatial autocorrelation statistic, to map geographic variation in the synchrony of gypsy moth outbreaks. Regression...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-20
... Statistics on September 30, 1998 (63 FR 52192). Pursuant to this authority, the BTS, now part of the Research... transferred the responsibility for the F&OS program from BTS, to FMCSA (69 FR 51009). On August 10, 2006 (71... financial and statistical reporting regulations of BTS that were formerly located at chapter XI of title 49...
An introduction to Bayesian statistics in health psychology.
Depaoli, Sarah; Rus, Holly M; Clifton, James P; van de Schoot, Rens; Tiemensma, Jitske
2017-09-01
The aim of the current article is to provide a brief introduction to Bayesian statistics within the field of health psychology. Bayesian methods are increasing in prevalence in applied fields, and they have been shown in simulation research to improve the estimation accuracy of structural equation models, latent growth curve (and mixture) models, and hierarchical linear models. Likewise, Bayesian methods can be used with small sample sizes since they do not rely on large sample theory. In this article, we discuss several important components of Bayesian statistics as they relate to health-based inquiries. We discuss the incorporation and impact of prior knowledge into the estimation process and the different components of the analysis that should be reported in an article. We present an example implementing Bayesian estimation in the context of blood pressure changes after participants experienced an acute stressor. We conclude with final thoughts on the implementation of Bayesian statistics in health psychology, including suggestions for reviewing Bayesian manuscripts and grant proposals. We have also included an extensive amount of online supplementary material to complement the content presented here, including Bayesian examples using many different software programmes and an extensive sensitivity analysis examining the impact of priors.
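The core Bayesian machinery the article introduces (prior plus likelihood yielding a posterior) can be illustrated with the simplest conjugate case: a normal prior on a mean with known data variance. The blood-pressure-change numbers below are invented, and real analyses of the kind described would typically use dedicated software (e.g. MCMC samplers) rather than this closed form:

```python
def normal_posterior(prior_mean, prior_var, data, data_var):
    """Conjugate normal-normal update with known data variance.

    Posterior precision = prior precision + n * data precision;
    posterior mean is the precision-weighted average of the prior
    mean and the sample mean.
    """
    n = len(data)
    xbar = sum(data) / n
    prec = 1.0 / prior_var + n / data_var
    post_var = 1.0 / prec
    post_mean = post_var * (prior_mean / prior_var + n * xbar / data_var)
    return post_mean, post_var

# hypothetical change in systolic blood pressure (mmHg) after a stressor
changes = [6.0, 9.0, 7.0, 8.0, 10.0]   # sample mean = 8.0
# informative prior centred at 5 vs. a nearly flat prior
m_inf, v_inf = normal_posterior(5.0, 4.0, changes, 9.0)
m_dif, v_dif = normal_posterior(0.0, 1000.0, changes, 9.0)
```

The comparison makes the article's point about prior impact concrete: the diffuse prior leaves the posterior mean essentially at the sample mean, while the informative prior pulls it toward 5 and tightens the posterior variance.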
Jouet, Agathe; McMullan, Mark; van Oosterhout, Cock
2015-06-01
Plant immune genes, or resistance genes, are involved in a co-evolutionary arms race with a diverse range of pathogens. In agronomically important grasses, such R genes have been extensively studied because of their role in pathogen resistance and in the breeding of resistant cultivars. In this study, we evaluate the importance of recombination, mutation and selection on the evolution of the R gene complex Rp1 of Sorghum, Triticum, Brachypodium, Oryza and Zea. Analyses show that recombination is widespread, and we detected 73 independent instances of sequence exchange, involving on average 1567 of 4692 nucleotides analysed (33.4%). We were able to date 24 interspecific recombination events and found that four occurred postspeciation, which suggests that genetic introgression took place between different grass species. Other interspecific events seemed to have been maintained over long evolutionary time, suggesting the presence of balancing selection. Significant positive selection (i.e. a relative excess of nonsynonymous substitutions (dN/dS > 1)) was detected in 17-95 codons (0.42-2.02%). Recombination was significantly associated with areas with high levels of polymorphism but not with an elevated dN/dS ratio. Finally, phylogenetic analyses show that recombination results in a general overestimation of the divergence time (mean = 14.3%) and an alteration of the gene tree topology if the tree is not calibrated. Given that the statistical power to detect recombination is determined by the level of polymorphism of the amplicon as well as the number of sequences analysed, it is likely that many studies have underestimated the importance of recombination relative to the mutation rate. © 2015 John Wiley & Sons Ltd.
ILDgenDB: integrated genetic knowledge resource for interstitial lung diseases (ILDs).
Mishra, Smriti; Shah, Mohammad I; Sarkar, Malay; Asati, Nimisha; Rout, Chittaranjan
2018-01-01
Interstitial lung diseases (ILDs) are a diverse group of ∼200 acute and chronic pulmonary disorders that are characterized by variable amounts of inflammation, fibrosis and architectural distortion with substantial morbidity and mortality. Inaccurate and delayed diagnoses increase the risk, especially in developing countries. Studies have indicated the significant roles of genetic elements in ILDs pathogenesis. Therefore, the first genetic knowledge resource, ILDgenDB, has been developed with an objective to provide ILDs genetic data and their integrated analyses for the better understanding of disease pathogenesis and identification of diagnostics-based biomarkers. This resource contains literature-curated disease candidate genes (DCGs) enriched with various regulatory elements that have been generated using an integrated bioinformatics workflow of databases searches, literature-mining and DCGs-microRNA (miRNAs)-single nucleotide polymorphisms (SNPs) association analyses. To provide statistical significance to disease-gene association, ILD-specificity index and hypergeomatric test scores were also incorporated. Association analyses of miRNAs, SNPs and pathways responsible for the pathogenesis of different sub-classes of ILDs were also incorporated. Manually verified 299 DCGs and their significant associations with 1932 SNPs, 2966 miRNAs and 9170 miR-polymorphisms were also provided. Furthermore, 216 literature-mined and proposed biomarkers were identified. The ILDgenDB resource provides user-friendly browsing and extensive query-based information retrieval systems. Additionally, this resource also facilitates graphical view of predicted DCGs-SNPs/miRNAs and literature associated DCGs-ILDs interactions for each ILD to facilitate efficient data interpretation. Outcomes of analyses suggested the significant involvement of immune system and defense mechanisms in ILDs pathogenesis. 
This resource may potentially facilitate genetic-based disease monitoring and diagnosis. Database URL: http://14.139.240.55/ildgendb/index.php.
Super-delta: a new differential gene expression analysis procedure with robust data normalization.
Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing
2017-12-21
Normalization is an important data preparation step in gene expression analyses, designed to remove various systematic noise. Sample variance is greatly reduced after normalization, hence the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which will inevitably introduce some bias. This bias typically inflates type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of the global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived based on asymptotic theory for hypothesis testing that suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods: global, median-IQR, quantile, and cyclic loess normalization in simulation studies. Super-delta was shown to have better statistical power with tighter control of the type I error rate than its competitors. In many cases, the performance of super-delta is close to that of an oracle test in which datasets without technical noise were used. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is a substantial overlap of the DEGs identified by all of them, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigations on the relatively small differences showed that pathways identified by super-delta have better connections to breast cancer than other methods.
As a new pipeline, super-delta provides new insights into the area of differential gene expression analysis. A solid theoretical foundation supports its asymptotic unbiasedness and technical-noise-free properties. Application to real and simulated datasets demonstrates its decent performance compared with state-of-the-art procedures. It also has the potential to be extended to other data types and/or more general between-group comparison problems.
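The core of the pipeline above is a global normalization step made robust against DEGs and outliers. A minimal illustrative sketch in Python (not the authors' implementation; the trimming fraction and the use of a per-sample trimmed mean are assumptions chosen for illustration):

```python
import numpy as np

def robust_global_normalize(X, trim=0.2):
    """Re-center each sample of a log-expression matrix X (genes x samples)
    by a per-sample trimmed mean, so that differentially expressed genes
    and outliers contribute less to the location estimate (an illustrative
    stand-in for the paper's robust normalization step)."""
    lo = int(X.shape[0] * trim)
    hi = X.shape[0] - lo
    shifts = np.sort(X, axis=0)[lo:hi, :].mean(axis=0)  # trimmed means
    return X - shifts

# Toy example: two samples differing only by a technical shift
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
X[:, 1] += 3.0                      # systematic shift in sample 2
Xn = robust_global_normalize(X)
print(Xn.mean(axis=0))              # both samples now centered near zero
```

In the paper, a modified t-statistic is then computed per gene; the sketch covers only the normalization step.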
NASA Astrophysics Data System (ADS)
Schliep, E. M.; Gelfand, A. E.; Holland, D. M.
2015-12-01
There is considerable demand for accurate air quality information in human health analyses. The sparsity of ground monitoring stations across the United States motivates the need for advanced statistical models to predict air quality metrics, such as PM2.5, at unobserved sites. Remote sensing technologies have the potential to expand our knowledge of PM2.5 spatial patterns beyond what we can predict from current PM2.5 monitoring networks. Data from satellites have an additional advantage in not requiring extensive emission inventories necessary for most atmospheric models that have been used in earlier data fusion models for air pollution. Statistical models combining monitoring station data with satellite-obtained aerosol optical thickness (AOT), also referred to as aerosol optical depth (AOD), have been proposed in the literature with varying levels of success in predicting PM2.5. The benefit of using AOT is that satellites provide complete gridded spatial coverage. However, the challenges involved with using it in fusion models are (1) the correlation between the two data sources varies both in time and in space, (2) the data sources are temporally and spatially misaligned, and (3) there is extensive missingness in the monitoring data and also in the satellite data due to cloud cover. We propose a hierarchical autoregressive spatially varying coefficients model to jointly model the two data sources, which addresses the foregoing challenges. Additionally, we offer formal model comparison for competing models in terms of model fit and out of sample prediction of PM2.5. The models are applied to daily observations of PM2.5 and AOT in the summer months of 2013 across the conterminous United States. Most notably, during this time period, we find small in-sample improvement incorporating AOT into our autoregressive model but little out-of-sample predictive improvement.
Bianchi, Simonetta; Bendinelli, Benedetta; Castellano, Isabella; Piubello, Quirino; Renne, Giuseppe; Cattani, Maria Grazia; Di Stefano, Domenica; Carrillo, Giovanna; Laurino, Licia; Bersiga, Alessandra; Giardina, Carmela; Dante, Stefania; Di Loreto, Carla; Quero, Carmela; Antonacci, Concetta Maria; Palli, Domenico
2012-10-01
Flat epithelial atypia (FEA) may represent the earliest precursor of low-grade breast cancer and often coexists with more advanced atypical proliferative breast lesions such as atypical ductal hyperplasia (ADH) and lobular intraepithelial neoplasia (LIN). The present study aims to investigate the association between morphological parameters of FEA and presence of malignancy at surgical excision (SE) and the clinical significance of the association of FEA with ADH and/or LIN. This study included 589 cases of stereotactic 11-gauge vacuum-assisted needle core biopsy (VANCB), reporting a diagnosis of FEA, ADH or LIN with subsequent SE from 14 pathology departments in Italy. Available slides were reviewed, with 114 (19.4 %) showing a malignant outcome at SE. Among the 190 cases of pure FEA, no statistically significant association emerged between clinical-pathological parameters of FEA and risk of malignancy. Logistic regression analyses showed an increased risk of malignancy according to the extension of ADH among the 275 cases of FEA associated with ADH (p = 0.004) and among the 34 cases of FEA associated with ADH and LIN (p = 0.02). In the whole series, a statistically significant increased malignancy risk emerged according to mammographic R1-R3/R4-R5 categories (OR = 1.56; p = 0.04), extension (OR = 1.24; p = 0.04) and grade (OR = 1.94; p = 0.004) of cytological atypia of FEA. The presence of ADH was associated with an increased malignancy risk (OR = 2.85; p < 0.0001). Our data confirm the frequent association of FEA with ADH and/or LIN. A diagnosis of pure FEA on VANCB carries a 9.5 % risk of concurrent malignancy and thus warrants follow-up excision because none of the clinical-pathological parameters predicts which cases will present carcinoma on SE.
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the proportion of significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to other study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this difference did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
Graph embedding and extensions: a general framework for dimensionality reduction.
Yan, Shuicheng; Xu, Dong; Zhang, Benyu; Zhang, Hong-Jiang; Yang, Qiang; Lin, Stephen
2007-01-01
Over the past few decades, a large family of algorithms - supervised or unsupervised, stemming from statistics or from geometry theory - has been designed to provide different solutions to the problem of dimensionality reduction. Despite the different motivations of these algorithms, we present in this paper a general formulation known as graph embedding that unifies them within a common framework. In graph embedding, each algorithm can be considered as the direct graph embedding, or a linear/kernel/tensor extension of it, of a specific intrinsic graph that describes certain desired statistical or geometric properties of a data set, with constraints from scale normalization or from a penalty graph that characterizes a statistical or geometric property to be avoided. Furthermore, the graph embedding framework can be used as a general platform for developing new dimensionality reduction algorithms. Utilizing this framework as a tool, we propose a new supervised dimensionality reduction algorithm called Marginal Fisher Analysis (MFA), in which the intrinsic graph characterizes intraclass compactness and connects each data point with its neighboring points of the same class, while the penalty graph connects the marginal points and characterizes interclass separability. We show that MFA effectively overcomes the limitations of the traditional Linear Discriminant Analysis (LDA) algorithm due to its data distribution assumptions and available projection directions. Real face recognition experiments show the superiority of the proposed MFA in comparison to LDA, also for their corresponding kernel and tensor extensions.
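The unifying formulation above reduces, in the linear case, to a generalized eigenproblem between the Laplacians of the intrinsic and penalty graphs. A generic sketch under stated assumptions (a standard linearization of the framework, not the authors' code; the ridge term is added for numerical stability):

```python
import numpy as np
from scipy.linalg import eigh

def graph_embedding(X, W_intrinsic, W_penalty, dim=2):
    """Linear graph embedding: find projections v minimizing
    v' X L X' v subject to v' X B X' v = 1, where L and B are the graph
    Laplacians of the intrinsic and penalty graphs, solved as a
    generalized eigenproblem.
    X: features x samples; W_*: symmetric nonnegative adjacency matrices."""
    def laplacian(W):
        return np.diag(W.sum(axis=1)) - W
    L = X @ laplacian(W_intrinsic) @ X.T
    B = X @ laplacian(W_penalty) @ X.T
    vals, vecs = eigh(L, B + 1e-8 * np.eye(B.shape[0]))
    return vecs[:, :dim]            # smallest eigenvalues -> embedding

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 10))        # 5 features, 10 samples
W_i = rng.random((10, 10)); W_i = (W_i + W_i.T) / 2; np.fill_diagonal(W_i, 0)
W_p = rng.random((10, 10)); W_p = (W_p + W_p.T) / 2; np.fill_diagonal(W_p, 0)
V = graph_embedding(X, W_i, W_p)
print(V.shape)                      # (5, 2)
```

MFA corresponds to one particular choice of the two graphs: within-class neighbors in the intrinsic graph and marginal between-class pairs in the penalty graph.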
Non-extensivity and complexity in the earthquake activity at the West Corinth rift (Greece)
NASA Astrophysics Data System (ADS)
Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter
2013-04-01
Earthquakes exhibit complex phenomenology that is revealed by their fractal structure in space, time and magnitude. For that reason, tools other than simple Poissonian statistics seem more appropriate to describe the statistical properties of the phenomenon. Here we use Non-Extensive Statistical Physics (NESP) to investigate the inter-event time distribution of the earthquake activity at the west Corinth rift (central Greece). This area is one of the most seismotectonically active areas in Europe, with an important continental N-S extension and high seismicity rates. The NESP concept is based on the non-additive Tsallis entropy Sq, which includes the Boltzmann-Gibbs entropy as a particular case. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including earthquakes, where fractality and long-range interactions are important. The analysis indicates that the cumulative inter-event time distribution can be successfully described with NESP, implying the complexity that characterizes the temporal occurrence of earthquakes. Furthermore, we use the Tsallis entropy (Sq) and the Fisher Information Measure (FIM) to investigate the complexity that characterizes the inter-event time distribution through different time windows along the evolution of the seismic activity at the west Corinth rift. The results of this analysis reveal different levels of organization and clustering of the seismic activity in time. Acknowledgments: GM wishes to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).
Kinesio Taping effects on knee extension force among soccer players
Serra, Maysa V. G. B.; Vieira, Edgar R.; Brunt, Denis; Goethel, Márcio F.; Gonçalves, Mauro; Quemelo, Paulo R. V.
2015-01-01
Background: Kinesio Taping (KT) is widely used; however, the effects of KT on muscle activation and force are contradictory. Objective: To evaluate the effects of KT on knee extension force in soccer players. Method: This is a clinical trial study design. Thirty-four subjects performed two maximal isometric voluntary contractions of the lower limbs pre, immediately post, and 24 hours after tape application on the lower limbs. Both lower limbs were taped, with K-Tape and 3M Micropore tape randomly assigned to the right and left thighs of the participants. Isometric knee extension force was measured for the dominant side using a strain gauge. The following variables were assessed: peak force, time to peak force, rate of force development until peak force, time to peak rate of force development, and 200 ms pulse. Results: There were no statistically significant differences in the assessed variables between KT and Micropore conditions (F=0.645, p=0.666) or among testing sessions (pre, post, and 24 h after) (F=0.528, p=0.868), and there was no statistically significant interaction between tape condition and testing session (F=0.271, p=0.986). Conclusion: KT did not affect the force-related measures assessed immediately and 24 hours after KT application compared with Micropore application during maximal isometric voluntary knee extension. PMID:25789557
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.
Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V
2018-04-01
A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
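Of the statistics compared above, the improvement rate difference (IRD) is the easiest to sketch. A simplified version under stated assumptions (for an increasing target behavior; the published IRD instead removes a minimal set of overlapping points, so this shortcut matches it only in simple cases):

```python
def improvement_rate_difference(baseline, treatment):
    """Simplified IRD sketch: treatment points above the baseline maximum
    count as improved; baseline points at or above the treatment minimum
    count as improved. IRD is the difference of the two improvement rates
    (a common shortcut, not the full minimal-removal definition)."""
    ir_treatment = sum(y > max(baseline) for y in treatment) / len(treatment)
    ir_baseline = sum(y >= min(treatment) for y in baseline) / len(baseline)
    return ir_treatment - ir_baseline

# Fully separated phases yield the maximum IRD of 1.0
print(improvement_rate_difference([2, 3, 4], [7, 8, 9]))  # 1.0
```

Like the other overlap statistics in the comparison, IRD summarizes a single baseline-treatment contrast; multiple-baseline designs aggregate it across cases.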
An interactive data management and analysis system for clinical investigators.
Groner, G F; Hopwood, M D; Palley, N A; Sibley, W L; Baker, W R; Christopher, T G; Thompson, H K
1978-09-01
An interactive minicomputer-based system has been developed that enables the clinical research investigator to personally explore and analyze his research data and, as a consequence of these explorations, to acquire more information. This system, which does not require extensive training or computer programming, enables the investigator to describe his data interactively in his own terms, enter data values while having them checked for validity, store time-oriented patient data in a carefully controlled on-line data base, retrieve data by patient, variable, and time, create subsets of patients with common characteristics, perform statistical analyses, and produce tables and graphs. It also permits data to be transferred to and from other computers. The system is well accepted and is being used by a variety of medical specialists at the three clinical research centers where it is operational. Reported benefits include less elapsed and nonproductive time, more thorough analysis of more data, greater and earlier insight into the meaning of research data, and increased publishable results.
NASA Astrophysics Data System (ADS)
Baeza, Andrés; Estrada-Barón, Alejandra; Serrano-Candela, Fidel; Bojórquez, Luis A.; Eakin, Hallie; Escalante, Ana E.
2018-06-01
Due to unplanned growth, large extension and limited resources, most megacities in the developing world are vulnerable to hydrological hazards and infectious diseases caused by waterborne pathogens. Here we aim to elucidate the extent of the relation between the spatial heterogeneity of physical and socio-economic factors associated with hydrological hazards (flooding and scarcity) and the spatial distribution of gastrointestinal disease in Mexico City, a megacity with more than 8 million people. We applied spatial statistics and multivariate regression analyses to high resolution records of gastrointestinal diseases during two time frames (2007–2009 and 2010–2014). Results show a pattern of significant association between water flooding events and disease incidence in the city center (lowlands). We also found that in the periphery (highlands), higher incidence is generally associated with household infrastructure deficiency. Our findings suggest the need for integrated and spatially tailored interventions by public works and public health agencies, aimed to manage socio-hydrological vulnerability in Mexico City.
The physics of the knee in the cosmic ray spectrum
NASA Astrophysics Data System (ADS)
Kampert, K.-H.; Antoni, T.; Apel, W. D.; Badea, F.; Bekk, K.; Bercuci, A.; Blümer, H.; Bollmann, E.; Bozdog, H.
Recent results from the KASCADE extensive air shower experiment are presented. After briefly reviewing the status of the experiment, we report on tests of hadronic interaction models and emphasize the progress being made in understanding the properties and origin of the knee at E_knee ≈ 4 · 10^15 eV. Analysing the muon and hadron trigger rates in the KASCADE calorimeter as well as the global properties of high energy hadrons in the shower core leads us to conclude that QGSJET still provides the best overall description of EAS data, being superior to DPMJET II-5 and NEXUS 2, for example. Performing high statistics CORSIKA simulations and applying sophisticated unfolding techniques to the electron and muon shower size distributions, we are able to successfully deconvolute the all-particle energy spectrum into energy spectra of 4 individual primary mass groups (p, He, C, Fe). Each of these preliminary energy distributions exhibits a knee-like structure, with a change of their knee positions suggesting a constant rigidity of R ≈ 2-3 PV.
MULTISCALE ADAPTIVE SMOOTHING MODELS FOR THE HEMODYNAMIC RESPONSE FUNCTION IN FMRI*
Wang, Jiaping; Zhu, Hongtu; Fan, Jianqing; Giovanello, Kelly; Lin, Weili
2012-01-01
In event-related functional magnetic resonance imaging (fMRI) data analysis, there is extensive interest in accurately and robustly estimating the hemodynamic response function (HRF) and its associated statistics (e.g., the magnitude and duration of the activation). Most methods to date have been developed in the time domain and utilize almost exclusively the temporal information of fMRI data, without accounting for spatial information. The aim of this paper is to develop a multiscale adaptive smoothing model (MASM) in the frequency domain by integrating the spatial and temporal information to adaptively and accurately estimate HRFs pertaining to each stimulus sequence across all voxels in a three-dimensional (3D) volume. We use two sets of simulation studies and a real data set to examine the finite sample performance of MASM in estimating HRFs. Our real and simulated data analyses confirm that MASM outperforms several other state-of-the-art methods, such as the smooth finite impulse response (sFIR) model. PMID:24533041
NASA Astrophysics Data System (ADS)
Fosas de Pando, Miguel; Schmid, Peter J.; Sipp, Denis
2016-11-01
Nonlinear model reduction for large-scale flows is an essential component in many fluid applications such as flow control, optimization, parameter space exploration and statistical analysis. In this article, we generalize the POD-DEIM method, introduced by Chaturantabut & Sorensen [1], to address nonlocal nonlinearities in the equations without loss of performance or efficiency. The nonlinear terms are represented by nested DEIM approximations using multiple expansion bases based on the Proper Orthogonal Decomposition. These extensions are imperative, for example, for applications of the POD-DEIM method to large-scale compressible flows. The efficient implementation of the presented model-reduction technique follows our earlier work [2] on linearized and adjoint analyses and takes advantage of the modular structure of our compressible flow solver. The efficacy of the nonlinear model-reduction technique is demonstrated on the flow around an airfoil and its acoustic footprint. We obtain an accurate and robust low-dimensional model that captures the main features of the full flow.
Revising the Lubben Social Network Scale for use in residential long-term care settings.
Munn, Jean; Radey, Melissa; Brown, Kristin; Kim, Hyejin
2018-04-19
We revised the Lubben Social Network Scale (LSNS) to develop a measure of social support specific to residential long-term care (LTC) settings, the LSNS-LTC with five domains (i.e., family, friends, residents, volunteers, and staff). The authors modified the LSNS-18 to capture sources of social support specific to LTC, specifically relationships with residents, volunteers, and staff. We piloted the resultant 28-item measure with 64 LTC residents. Fifty-four respondents provided adequate information for analyses that included descriptive statistics and reliability coefficients. Twenty of the items performed well (had correlations >0.3, overall α = 0.85) and were retained. Three items required modification. The five items related to volunteers were eliminated due to extensive (>15%) missing data resulting in a proposed 23-item measure. We identified, and to some degree quantified, supportive relationships within the LTC environment, while developing a self-report tool to measure social support in these settings.
Formal Models of the Network Co-occurrence Underlying Mental Operations.
Bzdok, Danilo; Varoquaux, Gaël; Grisel, Olivier; Eickenberg, Michael; Poupon, Cyril; Thirion, Bertrand
2016-06-01
Systems neuroscience has identified a set of canonical large-scale networks in humans. These have predominantly been characterized by resting-state analyses of the task-unconstrained, mind-wandering brain. Their explicit relationship to defined task performance is largely unknown and remains challenging. The present work contributes a multivariate statistical learning approach that can extract the major brain networks and quantify their configuration during various psychological tasks. The method is validated in two extensive datasets (n = 500 and n = 81) by model-based generation of synthetic activity maps from recombination of shared network topographies. To study a use case, we formally revisited the poorly understood difference between neural activity underlying idling versus goal-directed behavior. We demonstrate that task-specific neural activity patterns can be explained by plausible combinations of resting-state networks. The possibility of decomposing a mental task into the relative contributions of major brain networks, the "network co-occurrence architecture" of a given task, opens an alternative access to the neural substrates of human cognition.
Theoretical Foundations of Study of Cartography
NASA Astrophysics Data System (ADS)
Talhofer, Václav; Hošková-Mayerová, Šárka
2018-05-01
Cartography and geoinformatics are technically based fields that deal with modelling and visualization of the landscape in the form of a map. During the study of cartography and geoinformatics, it is necessary to acquire a theoretical foundation based mainly on mathematics. For these subjects, mathematics is necessary for understanding many procedures connected to modelling the Earth as a celestial body, to the ways of projecting it into a plane, to methods and procedures of modelling the landscape and phenomena in society, and to visualization of these models in the form of electronic as well as classic paper maps. Not only general mathematics but also its extensions, such as the differential geometry of curves and surfaces, methods of approximating lines and functional surfaces, mathematical statistics, and multi-criteria analyses, are suitable and necessary. Underestimating the significance of mathematical education in cartography and geoinformatics is inappropriate and lowers the competence of cartographers and professionals in geographic information science and technology to solve problems.
Acute effects of The Stick on strength, power, and flexibility.
Mikesky, Alan E; Bahamonde, Rafael E; Stanton, Katie; Alvey, Thurman; Fitton, Tom
2002-08-01
The Stick is a muscle massage device used by athletes, particularly track athletes, to improve performance. The purpose of this project was to assess the acute effects of The Stick on muscle strength, power, and flexibility. Thirty collegiate athletes consented to participate in a 4-week, double-blind study, which consisted of 4 testing sessions (1 familiarization and 3 data collection) scheduled 1 week apart. During each testing session subjects performed 4 measures in the following sequence: hamstring flexibility, vertical jump, flying-start 20-yard dash, and isokinetic knee extension at 90°·s⁻¹. Two minutes of a randomly assigned intervention treatment (visualization [control], mock insensible electrical stimulation [placebo], or massage using The Stick [experimental]) was performed immediately prior to each performance measure. Statistical analyses involved single-factor repeated-measures analysis of variance (ANOVA) with Fisher's Least Significant Difference post hoc test. None of the variables measured showed an acute improvement (p ≤ 0.05) immediately following treatment with The Stick.
Bagging Voronoi classifiers for clustering spatial functional data
NASA Astrophysics Data System (ADS)
Secchi, Piercesare; Vantini, Simone; Vitelli, Valeria
2013-06-01
We propose a bagging strategy based on random Voronoi tessellations for the exploration of geo-referenced functional data, suitable for different purposes (e.g., classification, regression, dimensional reduction, …). Urged by an application to environmental data contained in the Surface Solar Energy database, we focus in particular on the problem of clustering functional data indexed by the sites of a spatial finite lattice. We thus illustrate our strategy by implementing a specific algorithm whose rationale is to (i) replace the original data set with a reduced one, composed by local representatives of neighborhoods covering the entire investigated area; (ii) analyze the local representatives; (iii) repeat the previous analysis many times for different reduced data sets associated to randomly generated different sets of neighborhoods, thus obtaining many different weak formulations of the analysis; (iv) finally, bag together the weak analyses to obtain a conclusive strong analysis. Through an extensive simulation study, we show that this new procedure - which does not require an explicit model for spatial dependence - is statistically and computationally efficient.
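Steps (i)-(iv) above can be sketched in a few lines. A hedged illustration (k-means stands in for the generic weak analysis; the parameter choices, the co-assignment-matrix bagging, and all names are assumptions for illustration, not the authors' implementation):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def bagging_voronoi(coords, data, k, n_seeds=20, n_rep=50, seed=0):
    """Sketch of the bagging strategy: (i) draw random seed sites and
    replace the data with Voronoi-neighborhood means, (ii) run a weak
    analysis (k-means here) on the local representatives, (iii) repeat
    over many random tessellations, (iv) bag the weak clusterings into a
    co-assignment frequency matrix.
    coords: (n, d) site locations; data: (n, p) functional evaluations."""
    rng = np.random.default_rng(seed)
    n = coords.shape[0]
    coassign = np.zeros((n, n))
    for _ in range(n_rep):
        seeds = rng.choice(n, size=n_seeds, replace=False)
        # (i) nearest-seed assignment = random Voronoi tessellation
        dist = np.linalg.norm(coords[:, None] - coords[seeds][None], axis=2)
        cell = dist.argmin(axis=1)
        reps = np.stack([data[cell == c].mean(axis=0) for c in range(n_seeds)])
        # (ii) weak analysis on the neighborhood representatives
        _, rep_labels = kmeans2(reps, k, minit='++',
                                seed=int(rng.integers(2**31 - 1)))
        labels = rep_labels[cell]
        # (iv) accumulate pairwise co-assignments across replicates
        coassign += labels[:, None] == labels[None, :]
    return coassign / n_rep

rng = np.random.default_rng(42)
coords = rng.random((30, 2))
data = rng.normal(size=(30, 3))
M = bagging_voronoi(coords, data, k=2)
print(M.shape)   # (30, 30); M[i, j] = co-clustering frequency of sites i, j
```

Averaging many weak tessellation-based analyses is what lets the procedure capture spatial dependence without an explicit spatial model.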
Silva, Fabio; Vander Linden, Marc
2017-09-20
Large radiocarbon datasets have been analysed statistically to identify, on the one hand, the dynamics and tempo of dispersal processes and, on the other, demographic change. This is particularly true for the spread of farming practices in Neolithic Europe. Here we combine the two approaches and apply them to a new, extensive dataset of 14,535 radiocarbon dates for the Mesolithic and Neolithic periods across the Near East and Europe. The results indicate three distinct demographic regimes: one observed in or around the centre of farming innovation and involving a boost in carrying capacity; a second appearing in regions where Mesolithic populations were well established; and a third corresponding to large-scale migrations into previously essentially unoccupied territories, where the travelling front is readily identified. This spatio-temporal patterning linking demographic change with dispersal dynamics, as displayed in the amplitude of the travelling front, correlates and predicts levels of genetic admixture among European early farmers.
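Demographic analyses of large radiocarbon datasets of this kind commonly rest on summed probability distributions (SPDs). A minimal sketch, with calibration omitted for simplicity (an illustrative proxy, not the authors' actual pipeline; the dates and grid are made up):

```python
import numpy as np

def summed_probability(dates, errors, grid):
    """Summed probability distribution (SPD), a standard demographic proxy
    for radiocarbon datasets: each date contributes a normal density on
    the calendar grid (calibration omitted for simplicity), and the
    per-date densities are summed."""
    dates = np.asarray(dates, float)[:, None]
    errors = np.asarray(errors, float)[:, None]
    dens = np.exp(-0.5 * ((grid[None, :] - dates) / errors) ** 2)
    dens /= dens.sum(axis=1, keepdims=True)   # each date integrates to 1
    return dens.sum(axis=0)

grid = np.arange(4000, 8001)                  # hypothetical calBP grid
spd = summed_probability([5000, 5050, 7000], [30, 40, 25], grid)
print(np.isclose(spd.sum(), 3.0))             # total mass = number of dates
```

Peaks in the SPD are read as periods of higher dated activity, which is how radiocarbon density is linked to carrying capacity and population change in analyses like the one above.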
Methylcyclopentadienyl manganese tricarbonyl: health risk uncertainties and research directions.
Davis, J M
1998-01-01
With the way cleared for increased use of the fuel additive methylcyclopentadienyl manganese tricarbonyl (MMT) in the United States, the issue of possible public health impacts associated with this additive has gained greater attention. In assessing potential health risks of particulate Mn emitted from the combustion of MMT in gasoline, the U.S. Environmental Protection Agency not only considered the qualitative types of toxic effects associated with inhaled Mn, but conducted extensive exposure-response analyses using various statistical approaches and also estimated population exposure distributions of particulate Mn based on data from an exposure study conducted in California when MMT was used in leaded gasoline. Because of limitations in available data and the need to make several assumptions and extrapolations, the resulting risk characterization had inherent uncertainties that made it impossible to estimate health risks in a definitive or quantitative manner. To support an improved health risk characterization, further investigation is needed in the areas of health effects, emission characterization, and exposure analysis. PMID:9539013
NASA Astrophysics Data System (ADS)
Reis, D. S.; Stedinger, J. R.; Martins, E. S.
2005-10-01
This paper develops a Bayesian approach to analysis of a generalized least squares (GLS) regression model for regional analyses of hydrologic data. The new approach allows computation of the posterior distributions of the parameters and the model error variance using a quasi-analytic approach. Two regional skew estimation studies illustrate the value of the Bayesian GLS approach for regional statistical analysis of a shape parameter and demonstrate that regional skew models can be relatively precise with effective record lengths in excess of 60 years. With Bayesian GLS the marginal posterior distribution of the model error variance and the corresponding mean and variance of the parameters can be computed directly, thereby providing a simple but important extension of the regional GLS regression procedures popularized by Tasker and Stedinger (1989), which is sensitive to the likely values of the model error variance when it is small relative to the sampling error in the at-site estimator.
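The GLS estimator underlying the regional regression has the Tasker-Stedinger form. A sketch with the model error variance held fixed (the paper's Bayesian extension instead averages over its posterior; function and variable names are illustrative):

```python
import numpy as np

def gls_estimate(X, y, Sigma_sampling, model_error_var):
    """GLS regional regression in the Tasker-Stedinger form:
    Lambda = sigma_delta^2 I + Sigma_sampling,
    beta = (X' Lambda^-1 X)^-1 X' Lambda^-1 y,
    where Sigma_sampling is the sampling-error covariance of the at-site
    estimates and sigma_delta^2 the model error variance (fixed here)."""
    Lam = model_error_var * np.eye(len(y)) + Sigma_sampling
    Li = np.linalg.inv(Lam)
    XtLi = X.T @ Li
    beta = np.linalg.solve(XtLi @ X, XtLi @ y)
    cov_beta = np.linalg.inv(XtLi @ X)  # parameter covariance
    return beta, cov_beta

# Exact linear data: GLS recovers the coefficients regardless of weights
x = np.arange(10.0)
X = np.column_stack([np.ones(10), x])
y = 2.0 + 3.0 * x
beta, cov = gls_estimate(X, y, np.zeros((10, 10)), 0.5)
print(np.round(beta, 6))   # [2. 3.]
```

The quasi-analytic Bayesian step in the paper amounts to computing this estimate across the posterior of `model_error_var` and averaging, which matters most when the model error variance is small relative to sampling error.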
PIVOT: platform for interactive analysis and visualization of transcriptomics data.
Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong
2018-01-05
Many R packages have been developed for transcriptome analysis, but their use often requires familiarity with R, and integrating the results of different packages requires scripts to wrangle the data types. Furthermore, exploratory data analyses often generate multiple derived datasets, such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open-source transcriptome analysis packages with a uniform user interface and graphical data management, allowing non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open-source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulation. A graph-based visual interface represents the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that all analyses can be saved, shared, and reproduced. PIVOT will allow researchers with broad backgrounds to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.
Modular space station phase B extension preliminary system design. Volume 5: configuration analyses
NASA Technical Reports Server (NTRS)
Stefan, A. J.; Goble, G. J.
1972-01-01
The initial and growth modular space station configurations are described, and the evolutionary steps arriving at the final configuration are outlined. Supporting tradeoff studies and analyses such as stress, radiation dosage, and micrometeoroid and thermal protection are included.
USDA-ARS?s Scientific Manuscript database
Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...
The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth
ERIC Educational Resources Information Center
Steyvers, Mark; Tenenbaum, Joshua B.
2005-01-01
We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... scientific and technical analyses, OSHA requests that you disclose: (1) The nature of any financial... such as social security numbers and birthdates. If you submit scientific or technical studies or other... data and technical information submitted to the record. This request is consistent with Executive Order...
Factors of Role Conflict among Livestock Extension Professionals in Andhra Pradesh, India
ERIC Educational Resources Information Center
Sasidhar, P. V. K.; Rao, B. Sudhakar; Sreeramulu, Piedy
2008-01-01
This study examined the factors of role conflict among livestock extension professionals in Andhra Pradesh, India. It followed an ex-post facto research design, with data collected from 180 respondents through survey questionnaires. The data were subjected to multiple regression and path analyses to identify the factors of role conflict.…
ERIC Educational Resources Information Center
Moumouni, Ismail M.; Vodouhe, Simplice D.; Streiffeler, Friedhelm
2009-01-01
This paper analyses the organizational, financial and technological incentives that service organizations used to motivate farmers to finance agricultural research and extension in Benin. Understanding the foundations and implications of these motivation systems is important for improving farmer financial participation in agricultural research and…
Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.
Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W
2018-05-18
Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.
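The assessment loop described above (fit a model, simulate replicate data under it, compare a test statistic against the simulated null distribution) can be sketched with a toy multinomial model of base frequencies standing in for a full substitution model. The counts below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def gof_stat(counts, probs):
    """Chi-square-style discrepancy between observed counts and model expectations."""
    expected = counts.sum() * probs
    return np.sum((counts - expected) ** 2 / expected)

# Observed base counts (A, C, G, T) at a hypothetical locus.
observed = np.array([320, 180, 260, 240])
n = observed.sum()

# Candidate model: equal base frequencies.
model_probs = np.full(4, 0.25)

# Parametric bootstrap: build the null distribution of the statistic by
# simulating replicate data sets under the candidate model.
obs_stat = gof_stat(observed, model_probs)
null_stats = np.array([gof_stat(rng.multinomial(n, model_probs), model_probs)
                       for _ in range(2000)])
p_value = np.mean(null_stats >= obs_stat)
print(obs_stat, p_value)   # a large statistic with a tiny p value rejects the model
```

The paper's point is that the *threshold* applied to such statistics matters: a conventional cutoff may fail to reject even when the resulting phylogenetic estimates are unreliable, especially for loci with few variable sites.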
Perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases.
Mohammadzadeh, Hosein; Adli, Fereshteh; Nouri, Sahereh
2016-12-01
We investigate the perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases. We show that the intrinsic statistical interaction of a nonextensive Bose (Fermi) gas is attractive (repulsive), as in the extensive case, but the value of the thermodynamic curvature is changed by the nonextensive parameter. In contrast to the extensive ideal classical gas, the nonextensive one may be divided into two different regimes. Depending on the parameter measuring the system's deviation from the extensive case, one can find a special value of the fugacity, z^{*}, at which the sign of the thermodynamic curvature changes. Therefore, we argue that the nonextensive parameter induces an attractive (repulsive) statistical interaction depending on whether the fugacity lies below or above z^{*}.
Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth
2015-10-01
Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations. 
These papers used 128 statistical terms and context-defined concepts, including some from data analysis (56), epidemiology-biostatistics (31), modeling (24), data collection (12), and meta-analysis (5). Ten different software programs were used in these articles. Based on usual undergraduate and graduate statistics curricula, 64.3% of the concepts and methods used in these papers required at least a master's degree-level statistics education. The interpretation of the current medical literature can require an extensive background in statistical methods at an education level exceeding the material and resources provided to most medical students and residents. Given the complexity and time pressure of medical education, these deficiencies will be hard to correct, but this project can serve as a basis for developing a curriculum in study design and statistical methods needed by physicians-in-training.
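The two most common group-comparison methods tallied in the survey can be illustrated in a few lines with SciPy; all values below are synthetic, invented purely for demonstration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Two-sample t test: compare a continuous outcome between trial arms
# (hypothetical systolic blood pressure readings).
control = rng.normal(120.0, 15.0, 200)
treated = rng.normal(112.0, 15.0, 200)
t, p_t = stats.ttest_ind(treated, control)

# Chi-square test: compare event counts between arms via a 2x2 table
# (rows: arm; columns: events / non-events).
table = np.array([[30, 170],
                  [18, 182]])
chi2, p_chi2, dof, _ = stats.chi2_contingency(table)

print(f"t = {t:.2f}, p = {p_t:.4f}")
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_chi2:.4f}")
```

These two tests alone accounted for 35 of the group comparisons in the reviewed articles; the point of the paper is that much of the remaining methodology (multivariate and logistic regression, survival models) sits well beyond this introductory level.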
ERIC Educational Resources Information Center
Rupp, Andre A.
2007-01-01
One of the most revolutionary advances in psychometric research during the last decades has been the systematic development of statistical models that allow for cognitive psychometric research (CPR) to be conducted. Many of the models currently available for such purposes are extensions of basic latent variable models in item response theory…
Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle
NASA Technical Reports Server (NTRS)
Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu
2013-01-01
This report describes the statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain-weave graphite fibers in a highly toughened resin system. The report summarizes the distribution types and statistical details of the tests and the conditions under which the experimental data were generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, they will be used in future probabilistic analyses of spacecraft structures. The specific properties characterized, using a commercially available statistical package, are the ultimate strength, modulus, and Poisson's ratio. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
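The continuous maximum-likelihood fit that packages like powerlaw automate can be written directly. The sketch below is a simplified illustration assuming a known lower cutoff xmin (estimating xmin itself is a large part of what such packages add), not the package's own API:

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_power_law(data, xmin):
    """Continuous maximum-likelihood estimate of the power-law exponent
    alpha for tail data x >= xmin (the Hill estimator)."""
    tail = data[data >= xmin]
    n = tail.size
    alpha = 1.0 + n / np.sum(np.log(tail / xmin))
    stderr = (alpha - 1.0) / np.sqrt(n)   # asymptotic standard error
    return alpha, stderr, n

# Synthetic sample from a power law with exponent 2.5 (inverse-CDF sampling).
xmin = 1.0
u = rng.uniform(size=10_000)
data = xmin * (1.0 - u) ** (-1.0 / (2.5 - 1.0))

alpha, stderr, n = fit_power_law(data, xmin)
print(f"alpha = {alpha:.3f} +/- {stderr:.3f} (n = {n})")
```

With 10,000 tail observations the estimate recovers the true exponent to within a few hundredths, which is why MLE-based fitting is preferred over the older least-squares-on-log-log-plot practice.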
NASA Astrophysics Data System (ADS)
Salmahaminati; Husnaqilati, Atina; Yahya, Amri
2017-01-01
Waste management is one form of community participation in maintaining good hygiene, whether locally or nationally. Waste is the residue of everyday consumption that must be disposed of, and processing it can be beneficial and improve hygiene. One approach is to sort plastics so they can be processed into appropriate goods. In this study, we identify the factors that affect residents' willingness to process waste. These factors concern the identity and circumstances of each resident; once they are known, education about waste management can be provided, and the results of the extension can be compared using preliminary data collected before the extension and final data collected afterward. The analysis uses multiple logistic regression to identify the factors that influence people's willingness to process waste, while the before/after results are compared using a t analysis. Data were collected with a statistical instrument in the form of a questionnaire.
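The two analyses this abstract pairs, a multiple logistic regression for the willingness factors and a t comparison of before/after data, can be sketched on invented survey data (none of the variables or effect sizes below come from the study):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic survey: willingness to process waste (1/0) modelled from
# two standardised resident factors (hypothetical, e.g. age, income).
n = 400
f1 = rng.normal(size=n)
f2 = rng.normal(size=n)
true_logit = -0.5 + 0.8 * f1 + 0.6 * f2
willing = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Multiple logistic regression fitted by gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), f1, f2])
beta = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.005 * (X.T @ (willing - p))

# Paired t test comparing a knowledge score before and after the extension
# (an average gain of 5 points is assumed for the synthetic data).
before = rng.normal(50.0, 10.0, n)
after = before + rng.normal(5.0, 8.0, n)
t, p_val = stats.ttest_rel(after, before)
print(beta, t, p_val)
```

In practice the regression would be fitted with a statistics package rather than hand-rolled gradient ascent; the loop is shown only to make the log-likelihood machinery explicit.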
Radiographic versus clinical extension of Class II carious lesions using an F-speed film.
Kooistra, Scott; Dennison, Joseph B; Yaman, Peter; Burt, Brian A; Taylor, George W
2005-01-01
This study investigated the difference in the apparent radiographic and true clinical extension of Class II carious lesions. Sixty-two lesions in both maxillary and mandibular premolars and molars were radiographed using Insight bitewing film. Class II lesions were scored independently by two masked examiners using an 8-point lesion severity scale. During the restoration process the lesions were dissected in a stepwise fashion from the occlusal aspect. Intraoperative photographs (2x) of the lesions were made, utilizing a novel measurement device in the field as a point of reference. Subsequently, the lesions were all given clinical scores using the same 8-point scale. Statistical analysis showed a significant difference between the true clinical extension of the lesions compared to the radiographic score. "Aggressive" and "Conservative" radiographic diagnoses underestimated the true clinical extent by 0.66 mm and 0.91 mm, respectively. No statistical difference was found between premolars and molars or maxillary and mandibular arches. The results of this study help to define the parameters for making restorative treatment decisions involving Class II carious lesions.
Post Hoc Analyses of ApoE Genotype-Defined Subgroups in Clinical Trials.
Kennedy, Richard E; Cutter, Gary R; Wang, Guoqiao; Schneider, Lon S
2016-01-01
Many post hoc analyses of clinical trials in Alzheimer's disease (AD) and mild cognitive impairment (MCI) are in small Phase 2 trials. Subject heterogeneity may lead to statistically significant post hoc results that cannot be replicated in larger follow-up studies. We investigated the extent of this problem using simulation studies mimicking current trial methods with post hoc analyses based on ApoE4 carrier status. We used a meta-database of 24 studies, including 3,574 subjects with mild AD and 1,171 subjects with MCI/prodromal AD, to simulate clinical trial scenarios. Post hoc analyses examined if rates of progression on the Alzheimer's Disease Assessment Scale-cognitive (ADAS-cog) differed between ApoE4 carriers and non-carriers. Across studies, ApoE4 carriers were younger and had lower baseline scores, greater rates of progression, and greater variability on the ADAS-cog. Up to 18% of post hoc analyses for 18-month trials in AD showed greater rates of progression for ApoE4 non-carriers that were statistically significant but unlikely to be confirmed in follow-up studies. The frequency of erroneous conclusions dropped below 3% with trials of 100 subjects per arm. In MCI, rates of statistically significant differences with greater progression in ApoE4 non-carriers remained below 3% unless sample sizes were below 25 subjects per arm. Statistically significant differences for ApoE4 in post hoc analyses often reflect heterogeneity among small samples rather than true differential effect among ApoE4 subtypes. Such analyses must be viewed cautiously. ApoE genotype should be incorporated into the design stage to minimize erroneous conclusions.
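The core of the simulation logic can be sketched as follows: even when there is no true carrier effect at all, post hoc splits of a small treatment arm produce "significant" differences at the nominal rate, and small subgroups make those chance findings easy to over-interpret. All parameters below are invented, not taken from the meta-database:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def simulate_trial(n_per_arm=40, carrier_rate=0.6):
    """One simulated trial arm with NO true carrier/non-carrier difference
    in cognitive decline. Returns the p value of a post hoc comparison
    by carrier status."""
    decline = rng.normal(6.0, 8.0, n_per_arm)   # outcome change, arbitrary units
    carrier = rng.uniform(size=n_per_arm) < carrier_rate
    if carrier.sum() < 2 or (~carrier).sum() < 2:
        return 1.0                              # subgroup too small to test
    _, p = stats.ttest_ind(decline[carrier], decline[~carrier])
    return p

p_values = np.array([simulate_trial() for _ in range(2000)])
false_positive_rate = np.mean(p_values < 0.05)
print(f"post hoc false-positive rate: {false_positive_rate:.3f}")
```

The paper's simulations additionally build in the real heterogeneity between carriers and non-carriers (age, baseline severity, variability), which pushes the rate of spurious subgroup findings well above this chance baseline for small trials.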
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
Takahashi, Fumihiro; Takei, Koji; Tsuda, Kikumi; Palumbo, Joseph
2017-10-01
In the 24-week double-blind study of edaravone in ALS (MCI186-16), edaravone did not show a statistically significant difference versus placebo for the primary efficacy endpoint. For post-hoc analyses, two subpopulations were identified in which edaravone might be expected to show efficacy: the efficacy-expected subpopulation (EESP), defined by scores of ≥2 points on all 12 items of the ALS Functional Rating Scale-Revised (ALSFRS-R) and a percent predicted forced vital capacity (%FVC) ≥80% at baseline; and the definite/probable EESP 2 years (dpEESP2y) subpopulation which, in addition to EESP criteria, had definite or probable ALS diagnosed by El Escorial revised criteria, and disease duration of ≤2 years. In the 36-week extension study of MCI186-16, a 24-week double-blind comparison followed by 12 weeks of open-label edaravone (MCI186-17; NCT00424463), analyses of ALSFRS-R scores of the edaravone-edaravone group and edaravone-placebo group for the full analysis set (FAS) and EESP, as prospectively defined, were reported in a previous article. Here we additionally report results in patients who met dpEESP2y criteria at the baseline of MCI186-16. In the dpEESP2y, the difference in ALSFRS-R changes from 24 to 48 weeks between the edaravone-edaravone and edaravone-placebo groups was 2.79 (p = 0.0719), which was greater than the differences previously reported for the EESP and the FAS. The pattern of adverse events in the dpEESP2y did not show any additional safety findings to those from the earlier prospective study. In conclusion, this post-hoc analysis suggests a potential effect of edaravone between 24 and 48 weeks in patients meeting dpEESP2y criteria at baseline.
Interdependency of the maximum range of flexion-extension of hand metacarpophalangeal joints.
Gracia-Ibáñez, V; Vergara, M; Sancho-Bru, J-L
2016-12-01
Mobility of the finger metacarpophalangeal (MCP) joints depends on the posture of the adjacent joints. Current biomechanical hand models consider fixed ranges of movement at the joints regardless of posture, thus allowing non-realistic postures and generating erroneous results in reach studies and forward dynamic analyses. This study provides data for more realistic hand models. The maximum voluntary extension (MVE) and flexion (MVF) of different combinations of MCP joints were measured, covering their range of motion. The dependency of MVF and MVE on the posture of the adjacent MCP joints was confirmed, and mathematical models were obtained through regression analyses (RMSE 7.7°).
2014-01-01
Objective: To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Method: Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Results: Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. Conclusions: This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses. PMID:23965298
Youngstrom, Eric A
2014-03-01
To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses.
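Diagnostic likelihood ratios of the kind reported here are derived from sensitivity and specificity, and are applied to a patient via Bayes' rule in odds form. The cutoff performance values below are illustrative placeholders, not the study's data:

```python
def diagnostic_likelihood_ratios(sensitivity, specificity):
    """Positive and negative diagnostic likelihood ratios (DLR+ / DLR-)."""
    dlr_pos = sensitivity / (1.0 - specificity)
    dlr_neg = (1.0 - sensitivity) / specificity
    return dlr_pos, dlr_neg

def post_test_probability(pre_test_prob, dlr):
    """Update a pre-test probability with a likelihood ratio (odds form of Bayes' rule)."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * dlr
    return post_odds / (1.0 + post_odds)

# Hypothetical cutoff with 80% sensitivity and 90% specificity for mood disorder.
dlr_pos, dlr_neg = diagnostic_likelihood_ratios(0.80, 0.90)
print(dlr_pos, dlr_neg)                        # about 8.0 and 0.22
print(post_test_probability(0.30, dlr_pos))    # probability after a positive score
```

This is why the abstract's DLRs are clinically interpretable at a glance: a DLR above about 5-10 substantially raises the post-test probability, while one below about 0.3 substantially lowers it.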
Winer, E Samuel; Cervone, Daniel; Bryant, Jessica; McKinney, Cliff; Liu, Richard T; Nadorff, Michael R
2016-09-01
A popular way to attempt to discern causality in clinical psychology is through mediation analysis. However, mediation analysis is sometimes applied to research questions in clinical psychology when inferring causality is impossible. This practice may soon increase with new, readily available, and easy-to-use statistical advances. Thus, we here provide a heuristic to remind clinical psychological scientists of the assumptions of mediation analyses. We describe recent statistical advances and unpack assumptions of causality in mediation, underscoring the importance of time in understanding mediational hypotheses and analyses in clinical psychology. Example analyses demonstrate that statistical mediation can occur despite theoretical mediation being improbable. We propose a delineation of mediational effects derived from cross-sectional designs into the terms temporal and atemporal associations to emphasize time in conceptualizing process models in clinical psychology. The general implications for mediational hypotheses and the temporal frameworks from within which they may be drawn are discussed. © 2016 Wiley Periodicals, Inc.
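The product-of-coefficients logic behind statistical mediation can be sketched on synthetic data where, unlike in a cross-sectional study, the causal ordering X -> M -> Y is true by construction; the paper's caution is precisely that the arithmetic below succeeds even when that ordering is only assumed:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data in which the chain X -> M -> Y holds BY CONSTRUCTION.
n = 2000
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)                # a path = 0.5
y = 0.4 * m + 0.1 * x + rng.normal(size=n)      # b path = 0.4, direct effect = 0.1

def ols(predictors, response):
    """Least-squares coefficients, intercept first."""
    X = np.column_stack([np.ones(len(response))] + predictors)
    coef, *_ = np.linalg.lstsq(X, response, rcond=None)
    return coef

a = ols([x], m)[1]          # X -> M
b = ols([x, m], y)[2]       # M -> Y, adjusting for X
indirect = a * b            # product-of-coefficients mediation estimate
print(f"a = {a:.3f}, b = {b:.3f}, indirect effect = {indirect:.3f}")
```

Running the same code with x, m, and y relabelled in the wrong causal order can still yield a nonzero "indirect effect," which is the atemporal-association trap the authors describe.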
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
Gamma, Alex; Lehmann, Dietrich; Frei, Edi; Iwata, Kazuki; Pascual-Marqui, Roberto D; Vollenweider, Franz X
2004-06-01
The complementary strengths and weaknesses of established functional brain imaging methods (high spatial, low temporal resolution) and EEG-based techniques (low spatial, high temporal resolution) make their combined use a promising avenue for studying brain processes at a more fine-grained level. However, this strategy requires a better understanding of the relationship between hemodynamic/metabolic and neuroelectric measures of brain activity. We investigated possible correspondences between cerebral blood flow (CBF) as measured by [15O]H2O-PET and intracerebral electric activity computed by Low Resolution Brain Electromagnetic Tomography (LORETA) from scalp-recorded multichannel EEG in healthy human subjects during cognitive and pharmacological stimulation. The two imaging modalities were compared by descriptive, correlational, and variance analyses, the latter carried out using statistical parametric mapping (SPM99). Descriptive visual comparison showed a partial overlap between the sets of active brain regions detected by the two modalities. A number of exclusively positive correlations of neuroelectric activity with regional CBF were found across the whole EEG frequency range, including slow wave activity, the latter finding being in contrast to most previous studies conducted in patients. Analysis of variance revealed an extensive lack of statistically significant correspondences between brain activity changes as measured by PET vs. EEG-LORETA. In general, correspondences, to the extent they were found, were dependent on experimental condition, brain region, and EEG frequency. Copyright 2004 Wiley-Liss, Inc.
Pachón-García, F T; Paniagua-Sánchez, J M; Rufo-Pérez, M; Jiménez-Barco, A
2014-12-01
This article analyses the electric field levels around medium-wave transmitters, delimiting the temporal variability of the levels received at a pre-established reception point. One extensively used dosimetric criterion is to consider historical levels of the field recorded over a certain period of time so as to provide an overall perspective of radio-frequency electric field exposure in a particular environment. This aspect is the focus of the present study, in which the measurements will be synthesised in the form of exposure coefficients. Two measurement campaigns were conducted: one short term (10 days) and the other long term (1 y). The short-term data were used to study which probability density functions best approximate the measured levels. The long-term data were used to compute the principal statistics that characterise the field values over a year. The data that form the focus of the study are the peak traces, since these are the most representative from the standpoint of exposure. The deviations found were around 6 % for short periods and 12 % for long periods. The information from the two campaigns was used to develop and implement a computer application based on the Monte Carlo method to simulate values of the field, allowing one to carry out robust statistics. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
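A Monte Carlo scheme of the general kind described (fit a probability density to campaign measurements, then simulate field values and summarise exposure with robust statistics) might look as follows; the lognormal family and all field levels are assumptions made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical short-term campaign: peak electric-field levels (V/m)
# recorded at one reception point.
measured = rng.lognormal(mean=np.log(0.8), sigma=0.25, size=240)

# Fit a lognormal by the moments of the log-values.
mu = np.log(measured).mean()
sigma = np.log(measured).std()

# Monte Carlo: simulate a long series of field values from the fitted
# model and summarise exposure with robust statistics.
simulated = rng.lognormal(mean=mu, sigma=sigma, size=100_000)
median = np.median(simulated)
p95 = np.percentile(simulated, 95)
print(f"median = {median:.3f} V/m, 95th percentile = {p95:.3f} V/m")
```

The study's own application is richer (it selects among candidate densities for the short-term data and anchors the simulation with a year of measurements), but the structure is the same: measured data constrain a model, and simulation from the model yields stable exposure coefficients.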
Zakopoulou, Victoria; Pashou, Theodora; Tzavelas, Panagiotis; Christodoulides, Pavlos; Anna, Milona; Iliana, Kolotoura
2013-11-01
The development of learning difficulties is associated with problems in executive function and externalising behaviour, co-occurring with psycho-emotional problems from pre-school, school age, and adolescence up to adulthood. In the current survey, we aim to emphasise the early role of learning difficulties during the school age and adolescence of prisoners and their effects on the onset of offending behaviours, such as criminal behaviour, in adulthood. Altogether, we studied 117 Greek adult prisoners from 18 to 70 years old who were accused of different types or degrees of offences. Through statistical analyses, the following factors were observed, with high statistical significance, as early indicators of criminal behaviour in the adult lives of the prisoners: (i) learning difficulties, (ii) family problems, (iii) behaviour disorders, (iv) developmental disorders, and (v) psycho-emotional disorders. As a result, learning difficulties were assumed to be the most decisive factor in the developmental progression of the prisoners: they manifested early in the prisoners' lives, undermined their capacity to be competitive and resilient, fostered a poor self-image and low self-esteem, and, within a weak or negative family and educational environment, accompanied antisocial behaviour and psycho-emotional disorders from adolescence into adulthood. Copyright © 2013 Elsevier Ltd. All rights reserved.
Curcio, Giuseppe
2018-01-01
In the past 20 years of research on the effects of mobile phone-derived electromagnetic fields (EMFs) on human cognition, attention has been one of the first and most extensively investigated functions. The domains investigated cover selective, sustained, and divided attention. Here, the most relevant studies on this topic are reviewed and discussed. A total of 43 studies are reported and summarized: of these, 31 indicated a total absence of statistically significant differences between real and sham signals, 9 showed a partial improvement of attentional performance (mainly increased speed and/or improved accuracy) as a function of real exposure, while the remaining 3 showed inconsistent results (i.e., increased speed in some tasks and slowing in others) or even a worsening of performance (reduced speed and/or deteriorated accuracy). These results are independent of the specific attentional domain investigated. This scenario allows one to conclude that there is a substantial lack of evidence for a negative influence of non-ionizing radiation on attention. Nonetheless, the published literature is very heterogeneous from the point of view of methodology (type of signal, exposure time, blinding), dosimetry (accurate evaluation of the specific absorption rate, SAR, or emitted power), and statistical analyses, making a conclusive generalization to everyday life difficult. Some remarks and suggestions regarding future research are proposed.
Optimal spectral tracking--adapting to dynamic regime change.
Brittain, John-Stuart; Halliday, David M
2011-01-30
Real-world data do not always obey the statistical constraints imposed upon them by sophisticated analysis techniques. In spectral analysis, for instance, an ergodic process--the interchangeability of temporal for spatial averaging--is assumed for a repeat-trial design. Many evolutionary scenarios, such as learning and motor consolidation, do not conform to such behaviour and should be approached from a more flexible perspective. To this end we previously introduced the method of optimal spectral tracking (OST) in the study of trial-varying parameters. In this extension to our work, we modify the OST routines to provide an adaptive implementation capable of reacting to dynamic transitions in the underlying system state. In so doing, we generalise our approach to characterise both slow-varying and rapid fluctuations in time series, while simultaneously providing a metric of system stability. The approach is first applied to a surrogate dataset and compared to both our original non-adaptive solution and spectrogram approaches. The adaptive OST displays fast convergence and desirable statistical properties. All three approaches are then applied to a neurophysiological recording obtained during a study on anaesthetic monitoring: local field potentials acquired from the posterior hypothalamic region of a deep brain stimulation patient undergoing anaesthesia were analysed. The characterisation of features such as response delay, time-to-peak, and modulation brevity is considered. Copyright © 2010 Elsevier B.V. All rights reserved.
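The fixed-window baseline that adaptive trackers such as OST improve upon can be illustrated with an ordinary spectrogram applied to a synthetic series containing a regime change (OST itself is not implemented here; this only shows the kind of dynamic transition the adaptive method is designed to follow):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(8)

# Synthetic non-stationary series: a 10 Hz rhythm whose amplitude jumps
# mid-recording, a simple stand-in for a dynamic regime change.
fs = 250.0
t = np.arange(0.0, 60.0, 1.0 / fs)
amp = np.where(t < 30.0, 1.0, 3.0)
x = amp * np.sin(2.0 * np.pi * 10.0 * t) + rng.normal(0.0, 1.0, t.size)

# Fixed-window short-time spectral estimate.
f, tt, Sxx = signal.spectrogram(x, fs=fs, nperseg=512, noverlap=256)
band = (f >= 8.0) & (f <= 12.0)
alpha_power = Sxx[band].mean(axis=0)          # 8-12 Hz power over time

early = alpha_power[tt < 30.0].mean()
late = alpha_power[tt >= 30.0].mean()
print(f"8-12 Hz power before/after change: {early:.2f} -> {late:.2f}")
```

A fixed window trades temporal precision for variance reduction uniformly across the recording; the adaptive approach described in the abstract instead reacts to the transition itself, tightening its effective window where the state changes rapidly.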
NASA Astrophysics Data System (ADS)
Traper, Sandra; Pöppl, Ronald; Rascher, Eric; Sass, Oliver
2016-04-01
In recent times, natural disasters such as debris flow events have attracted increasing attention worldwide, since they can cause great damage to infrastructure, and loss of life is not unusual in such events. Research on debris flows is especially important in mountainous areas like Austria: because of the increasing settlement of previously uninhabited regions, Alpine areas have proved particularly prone to the often harmful consequences of such events. Owing to these frequently damaging effects, research on this kind of natural disaster often focuses on mitigation and recovery measures after an event and on how to restore the initial situation. However, examining the situation of an area where severe, well-documented debris flows recently occurred, as it was before the actual event, can help uncover important preparatory factors that contribute to initiating debris flows and to hillslope-channel connectivity in the first place. Valuable insights into the functioning and preconditions of debris flows and their potential connectivity to the main channel can thus be gained. The study focuses on two geologically different areas in the Austrian Alps, both prone to debris flows and both having experienced rather severe events recently. Based on data from debris flow events in two regions in Styria (Austria), the Kleinsölk and the Johnsbach valleys, the aim of the study is to identify factors that influence the development of debris flows and the potential of such debris flows to reach the main channel, potentially clogging up the river (hillslope-channel connectivity). The degree of hillslope-channel coupling was verified in extensive TLS and ALS surveys, resulting in DEMs of different resolution and spatial extent. These factors are obtained, analyzed and evaluated with DEM-based GIS and statistical analyses.
These include factors attributed to catchment topography, such as slope angle, curvature, size and shape, as well as topographic channel parameters. Together with land cover/use and lithology, these features provide the independent variables for further statistical analyses. With the help of several logistic regressions, the likelihood that particular topographical, lithological and land cover/use factors lead to debris flow events, and the likelihood that the resulting debris flows reach the main channel (hillslope-channel connectivity), are computed. First results will be presented at the EGU General Assembly 2016.
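The logistic-regression step described above can be sketched as follows. The predictors, the synthetic connectivity rule and all coefficient values are invented for illustration; they are not the study's data or results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical catchment predictors: slope angle (deg), catchment
# size (km^2) and forest-cover fraction; the binary response is
# whether a debris flow reached the main channel.
rng = np.random.default_rng(42)
n = 300
slope = rng.uniform(15, 45, n)
area = rng.uniform(0.1, 5.0, n)
forest = rng.uniform(0.0, 1.0, n)
# Synthetic rule: steeper, less forested catchments connect more often
logit = 0.15 * (slope - 30) - 2.0 * (forest - 0.5)
connected = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([slope, area, forest])
model = LogisticRegression().fit(X, connected)
coef = dict(zip(["slope", "area", "forest"], model.coef_[0]))
```

The fitted coefficients recover the direction of the generating rule: a positive weight on slope angle and a negative weight on forest cover, exactly the kind of factor-wise likelihood statement the study derives from its DEM-based variables.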
How distributed processing produces false negatives in voxel-based lesion-deficit analyses.
Gajardo-Vidal, Andrea; Lorca-Puls, Diego L; Crinion, Jennifer T; White, Jitrachote; Seghier, Mohamed L; Leff, Alex P; Hope, Thomas M H; Ludersdorfer, Philipp; Green, David W; Bowman, Howard; Price, Cathy J
2018-07-01
In this study, we hypothesized that if the same deficit can be caused by damage to one or another part of a distributed neural system, then voxel-based analyses might miss critical lesion sites because preservation of each site will not be consistently associated with preserved function. The first part of our investigation used voxel-based multiple regression analyses of data from 359 right-handed stroke survivors to identify brain regions where lesion load is associated with picture naming abilities after factoring out variance related to object recognition, semantics and speech articulation so as to focus on deficits arising at the word retrieval level. A highly significant lesion-deficit relationship was identified in left temporal and frontal/premotor regions. Post-hoc analyses showed that damage to either of these sites caused the deficit of interest in less than half the affected patients (76/162 = 47%). After excluding all patients with damage to one or both of the identified regions, our second analysis revealed a new region, in the anterior part of the left putamen, which had not been previously detected because many patients had the deficit of interest after temporal or frontal damage that preserved the left putamen. The results illustrate how (i) false negative results arise when the same deficit can be caused by different lesion sites; (ii) some of the missed effects can be unveiled by adopting an iterative approach that systematically excludes patients with lesions to the areas identified in previous analyses; (iii) statistically significant voxel-based lesion-deficit mappings can be driven by a subset of patients; (iv) focal lesions to the identified regions are needed to determine whether the deficit of interest is the consequence of focal damage or much more extensive damage that includes the identified region; and, finally, (v) univariate voxel-based lesion-deficit mappings cannot, in isolation, be used to predict outcome in other patients.
Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons: the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables and Fisher's exact test. The code is open source (http://socr.googlecode.com/), with the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model, as well as general utilities that can be applied in various statistical computing tasks; for example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994
ERIC Educational Resources Information Center
Carriere, Ronald A.; And Others
This report focuses on a set of supplemental analyses that were performed on portions of the Emergency School Aid Act (ESAA) evaluation data. The goal of these analyses was to explore additional relationships in the data that might help to inform program policy, to confirm and/or further explicate some of the findings reported earlier, and to put…
Statistical innovations in diagnostic device evaluation.
Yu, Tinghui; Li, Qin; Gray, Gerry; Yue, Lilly Q
2016-01-01
Due to rapid technological development, innovations in diagnostic devices are proceeding at an extremely fast pace. Accordingly, the need to adopt innovative statistical methods has emerged in the evaluation of diagnostic devices. Statisticians in the Center for Devices and Radiological Health at the Food and Drug Administration have provided leadership in implementing statistical innovations. The innovations discussed in this article include: the adoption of bootstrap and jackknife methods, the implementation of appropriate multiple-reader multiple-case study designs, the application of robustness analyses for missing data, and the development of study designs and data analyses for companion diagnostics.
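The bootstrap methods mentioned above can be sketched generically; the reader calls and sample size below are hypothetical, and this is a plain percentile bootstrap, not any specific regulatory procedure.

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([stat(data[rng.integers(0, n, n)])
                     for _ in range(n_boot)])
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

# Hypothetical reader calls on 80 diseased cases: 1 = correctly
# detected, 0 = missed; the statistic of interest is sensitivity.
calls = np.array([1] * 68 + [0] * 12)
lo, hi = bootstrap_ci(calls, np.mean)
```

Because the statistic is passed in as a function, the same routine serves sensitivity, specificity, AUC or any other diagnostic performance measure.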
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T.; Pereira, Carol; Rosenkranz, Susan L.; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu (Jeanne); Wang, Rui; Lok, Judith
2017-01-01
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. PMID:28350899
Xu, H; Li, C; Zeng, Q; Agrawal, I; Zhu, X; Gong, Z
2016-06-01
In this study, to systematically identify the most stably expressed genes for internal reference in zebrafish Danio rerio investigations, 37 D. rerio transcriptomic datasets (both RNA sequencing and microarray data) were collected from gene expression omnibus (GEO) database and unpublished data, and gene expression variations were analysed under three experimental conditions: tissue types, developmental stages and chemical treatments. Forty-four putative candidate genes were identified with the c.v. <0·2 from all datasets. Following clustering into different functional groups, 21 genes, in addition to four conventional housekeeping genes (eef1a1l1, b2m, hrpt1l and actb1), were selected from different functional groups for further quantitative real-time (qrt-)PCR validation using 25 RNA samples from different adult tissues, developmental stages and chemical treatments. The qrt-PCR data were then analysed using the statistical algorithm refFinder for gene expression stability. Several new candidate genes showed better expression stability than the conventional housekeeping genes in all three categories. It was found that sep15 and metap1 were the top two stable genes for tissue types, ube2a and tmem50a the top two for different developmental stages, and rpl13a and rp1p0 the top two for chemical treatments. Thus, based on the extensive transcriptomic analyses and qrt-PCR validation, these new reference genes are recommended for normalization of D. rerio qrt-PCR data respectively for the three different experimental conditions. © 2016 The Fisheries Society of the British Isles.
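The c.v. < 0.2 screening rule used to shortlist candidate reference genes can be sketched as follows; the expression matrix and gene names are toy values, not the study's datasets.

```python
import numpy as np

def stable_genes(expr, names, cv_max=0.2):
    """Return genes whose coefficient of variation (sample SD / mean)
    across samples falls below `cv_max`, the screening rule used to
    shortlist candidate reference genes."""
    cv = expr.std(axis=1, ddof=1) / expr.mean(axis=1)
    return [g for g, c in zip(names, cv) if c < cv_max]

# Toy expression matrix (genes x samples), hypothetical values
expr = np.array([
    [100.0, 102.0, 98.0, 101.0],   # stable across samples
    [50.0, 90.0, 20.0, 140.0],     # highly variable
    [10.0, 10.5, 9.8, 10.2],       # stable across samples
])
names = ["geneA", "geneB", "geneC"]
picked = stable_genes(expr, names)
```

In the study this filter was only the first pass; shortlisted genes were then grouped by function and validated by qrt-PCR with refFinder before being recommended.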
Genomic evidence of gene flow during reinforcement in Texas Phlox.
Roda, Federico; Mendes, Fábio K; Hahn, Matthew W; Hopkins, Robin
2017-04-01
Gene flow can impede the evolution of reproductive isolating barriers between species. Reinforcement is the process by which prezygotic reproductive isolation evolves in sympatry due to selection to decrease costly hybridization. It is known that reinforcement can be prevented by too much gene flow, but we still do not know how often prezygotic barriers have evolved in the presence of gene flow, or how much gene flow can occur during reinforcement. Flower colour divergence in the native Texas wildflower, Phlox drummondii, is one of the best-studied cases of reinforcement. Here we use genomic analyses to infer gene flow between P. drummondii and a closely related sympatric species, Phlox cuspidata. We de novo assemble transcriptomes of four Phlox species to determine the phylogenetic relationships between these species and find extensive discordance among gene tree topologies across genes. We find evidence of introgression between sympatric P. drummondii and P. cuspidata using the D-statistic, and use phylogenetic analyses to infer the predominant direction of introgression. We investigate geographic variation in gene flow by comparing the relative divergence of genes displaying discordant gene trees between an allopatric and sympatric sample. These analyses support the hypothesis that sympatric P. drummondii has experienced gene flow with P. cuspidata. We find that gene flow between these species is asymmetrical, which could explain why reinforcement caused divergence in only one of the sympatric species. Given the previous research in this system, we suggest strong selection can explain how reinforcement successfully evolved in this system despite gene flow in sympatry. © 2017 John Wiley & Sons Ltd.
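The D-statistic (ABBA-BABA test) used to detect introgression reduces to a simple ratio of discordant site-pattern counts. The counts below are hypothetical, chosen only to illustrate the computation, and do not come from the Phlox data.

```python
def d_statistic(abba, baba):
    """Patterson's D from counts of ABBA and BABA site patterns in a
    (((P1, P2), P3), Outgroup) tree. Under incomplete lineage sorting
    alone the two discordant patterns are equally likely, so D = 0;
    a significant excess of one pattern suggests introgression
    between P3 and either P1 or P2."""
    return (abba - baba) / (abba + baba)

# Hypothetical site-pattern counts with P. cuspidata as P3 and the
# sympatric P. drummondii sample as P2
D = d_statistic(abba=420, baba=300)
```

In practice significance is assessed by block-jackknifing D across the genome; here only the point estimate is shown.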
Rue-Albrecht, Kévin; McGettigan, Paul A; Hernández, Belinda; Nalpas, Nicolas C; Magee, David A; Parnell, Andrew C; Gordon, Stephen V; MacHugh, David E
2016-03-11
Identification of gene expression profiles that differentiate experimental groups is critical for discovery and analysis of key molecular pathways and also for selection of robust diagnostic or prognostic biomarkers. While integration of differential expression statistics has been used to refine gene set enrichment analyses, such approaches are typically limited to single gene lists resulting from simple two-group comparisons or time-series analyses. In contrast, functional class scoring and machine learning approaches provide powerful alternative methods to leverage molecular measurements for pathway analyses, and to compare continuous and multi-level categorical factors. We introduce GOexpress, a software package for scoring and summarising the capacity of gene ontology features to simultaneously classify samples from multiple experimental groups. GOexpress integrates normalised gene expression data (e.g., from microarray and RNA-seq experiments) and phenotypic information of individual samples with gene ontology annotations to derive a ranking of genes and gene ontology terms using a supervised learning approach. The default random forest algorithm allows interactions between all experimental factors, and competitive scoring of expressed genes to evaluate their relative importance in classifying predefined groups of samples. GOexpress enables rapid identification and visualisation of ontology-related gene panels that robustly classify groups of samples and supports both categorical (e.g., infection status, treatment) and continuous (e.g., time-series, drug concentrations) experimental factors. The use of standard Bioconductor extension packages and publicly available gene ontology annotations facilitates straightforward integration of GOexpress within existing computational biology pipelines.
Majdan, Marek; Mauritz, Walter; Wilbacher, Ingrid; Janciak, Ivan; Brazinova, Alexandra; Rusnak, Martin; Leitgeb, Johannes
2013-08-01
Road traffic accidents (RTAs) have been identified by public health organizations as being of major global concern. Traumatic brain injuries (TBIs) are among the most severe injuries and are in a large part caused by RTA. The objective of this article is to analyse the severity and outcome of TBI caused by RTA in different types of road users in five European countries. The demographic, severity and outcome measures of 683 individuals with RTA-related TBI from Austria, Slovakia, Bosnia, Croatia and Macedonia were analysed. Five types of road users (car drivers, car passengers, motorcyclists, bicyclists and pedestrians) were compared using univariate and multivariate statistical methods. Short-term outcome [intensive care unit (ICU) survival] and last available long-term outcome of patients were analysed. In our data set, 44% of TBI were traffic related. The median age of patients was 32.5 years, being the lowest (25 years) in car passengers. The most severe and extensive injuries were reported in pedestrians. Pedestrians had the lowest rate of ICU survival (60%) and favourable long-term outcome (46%). Drivers had the highest ICU survival (73%) and car passengers had the best long-term outcome (59% favourable). No differences in the outcome were found between countries with different economy levels. TBI are significantly associated with RTA and thus, tackling them together could be more effective. The population at highest risk of RTA-related TBI are young males (in our sample median age: 32.5 years). Pedestrians have the most severe TBI with the worst outcome. Both groups should be a priority for public health action.
National Center for Mathematics and Science - K-12 education research
motion, calculus, statistics, genetics, evolution, astronomy, and other topics; teacher professional development. Extensive materials developed for instruction in evolutionary biology and astronomy using the model-based…
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, together with advances in computing, geographical information systems and statistical methodology, has opened the way for serious exploration of small-area health statistics based on routine data. Such analyses may be...
Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.
Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi
2017-05-01
Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
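One of the simplest sensitivity analyses the authors advocate is to re-estimate the pooled effect with each study removed in turn, which exposes studies whose (possibly nonindependent) effect sizes drive the result. The effect sizes and variances below are hypothetical, and a fixed-effect pool is used purely to keep the sketch short.

```python
import numpy as np

def fixed_effect(yi, vi):
    """Inverse-variance weighted (fixed-effect) pooled estimate."""
    w = 1.0 / np.asarray(vi)
    return np.sum(w * np.asarray(yi)) / np.sum(w)

def leave_one_out(yi, vi):
    """Pooled estimate with each study removed in turn: a basic
    sensitivity analysis for influential studies."""
    return np.array([fixed_effect(np.delete(yi, i), np.delete(vi, i))
                     for i in range(len(yi))])

# Hypothetical effect sizes (e.g. log response ratios) and variances;
# the fourth study is deliberately discrepant
yi = np.array([0.30, 0.25, 0.35, 1.20, 0.28])
vi = np.array([0.02, 0.03, 0.02, 0.02, 0.04])
pooled = fixed_effect(yi, vi)
loo = leave_one_out(yi, vi)
```

The drop in the pooled estimate when the discrepant study is excluded is exactly the kind of result a sensitivity analysis should report alongside the headline estimate.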
RipleyGUI: software for analyzing spatial patterns in 3D cell distributions
Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik
2013-01-01
The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
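RipleyGUI itself is MATLAB software; for consistency with the other examples here, the core of Ripley's K-function for a 3D point pattern is sketched in Python. This naive estimator omits the edge corrections a full implementation (including RipleyGUI) would apply, so it is only a conceptual sketch, checked against the complete-spatial-randomness benchmark K(r) = (4/3)πr³.

```python
import numpy as np

def ripley_k_3d(points, radii, volume):
    """Naive Ripley's K estimator for a 3D point pattern (no edge
    correction): for each r, the mean number of neighbours within
    distance r, divided by the point intensity n / volume."""
    pts = np.asarray(points)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # exclude self-pairs
    counts = np.array([(d < r).sum() for r in radii])
    lam = n / volume                      # intensity
    return counts / (n * lam)

# Complete spatial randomness in a unit cube: K(r) should be close
# to the theoretical value (4/3) * pi * r^3
rng = np.random.default_rng(1)
pts = rng.random((500, 3))
radii = np.array([0.05, 0.1])
K = ripley_k_3d(pts, radii, volume=1.0)
theory = 4.0 / 3.0 * np.pi * radii ** 3
```

Clustered patterns push K above the CSR curve and regular (inhibited) patterns push it below, which is how the statistic distinguishes organizational principles such as cortical layering.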
Applying the multivariate time-rescaling theorem to neural population models
Gerhard, Felipe; Haslinger, Robert; Pipa, Gordon
2011-01-01
Statistical models of neural activity are integral to modern neuroscience. Recently, interest has grown in modeling the spiking activity of populations of simultaneously recorded neurons to study the effects of correlations and functional connectivity on neural information processing. However, any statistical model must be validated by an appropriate goodness-of-fit test. Kolmogorov-Smirnov tests based upon the time-rescaling theorem have proven to be useful for evaluating point-process-based statistical models of single-neuron spike trains. Here we discuss the extension of the time-rescaling theorem to the multivariate (neural population) case. We show that even in the presence of strong correlations between spike trains, models which neglect couplings between neurons can be erroneously passed by the univariate time-rescaling test. We present the multivariate version of the time-rescaling theorem, and provide a practical step-by-step procedure for applying it towards testing the sufficiency of neural population models. Using several simple analytically tractable models and also more complex simulated and real data sets, we demonstrate that important features of the population activity can only be detected using the multivariate extension of the test. PMID:21395436
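The univariate time-rescaling test that the paper extends works as follows: integrating the model's intensity between spikes turns inter-spike intervals into Exp(1) variables if the model is correct, and a KS test checks this. The sketch below uses a homogeneous Poisson train and constant-rate models for simplicity; the multivariate extension discussed in the paper is not shown.

```python
import numpy as np
from scipy import stats

def time_rescaling_pvalue(spike_times, rate_fn, t_end, n_grid=100000):
    """KS goodness-of-fit p-value for a univariate point-process model
    via the time-rescaling theorem: rescaled inter-spike intervals are
    Exp(1) if `rate_fn` is the true conditional intensity."""
    grid = np.linspace(0.0, t_end, n_grid)
    cum = np.concatenate([[0.0],
                          np.cumsum(np.diff(grid) * rate_fn(grid[:-1]))])
    Lambda = np.interp(spike_times, grid, cum)   # integrated rate at spikes
    taus = np.diff(Lambda)                       # rescaled intervals
    u = 1.0 - np.exp(-taus)                      # Exp(1) -> Uniform(0, 1)
    return stats.kstest(u, "uniform").pvalue

# Simulate a homogeneous Poisson train at 20 Hz
rng = np.random.default_rng(7)
spikes = np.cumsum(rng.exponential(1 / 20.0, 2000))
# The true model should not be rejected ...
p_true = time_rescaling_pvalue(spikes, lambda t: 20.0 * np.ones_like(t),
                               spikes[-1] + 1.0)
# ... while a mis-specified (too slow) model should be
p_wrong = time_rescaling_pvalue(spikes, lambda t: 5.0 * np.ones_like(t),
                                spikes[-1] + 1.0)
```

The paper's point is that passing this univariate test for every neuron separately is not sufficient when neurons are coupled, which motivates the multivariate version.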
A Weibull characterization for tensile fracture of multicomponent brittle fibers
NASA Technical Reports Server (NTRS)
Barrows, R. G.
1977-01-01
A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) both as separate entities and as combined in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.
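The single-component building block of such a characterization is the standard Weibull fit of fracture strengths. A common sketch, assuming the usual linearised-CDF (Weibull plot) procedure with median-rank plotting positions rather than the paper's specific multicomponent method, is:

```python
import numpy as np

def weibull_fit(strengths):
    """Estimate the Weibull modulus m and scale s0 from fracture
    strengths via the linearised CDF:
    ln(-ln(1 - F)) = m*ln(s) - m*ln(s0), with median-rank F."""
    s = np.sort(np.asarray(strengths))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median ranks
    y = np.log(-np.log(1.0 - F))
    m, c = np.polyfit(np.log(s), y, 1)
    return m, np.exp(-c / m)

# Synthetic strengths drawn from a Weibull with modulus 5, scale 2.0
rng = np.random.default_rng(3)
strengths = 2.0 * rng.weibull(5.0, 400)
m_hat, s0_hat = weibull_fit(strengths)
```

The multicomponent extension of the paper effectively combines such per-component distributions via the weakest-link assumption, with the fiber failing when any component fails.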
Universal Recurrence Time Statistics of Characteristic Earthquakes
NASA Astrophysics Data System (ADS)
Goltz, C.; Turcotte, D. L.; Abaimov, S.; Nadeau, R. M.
2006-12-01
Characteristic earthquakes are defined to occur quasi-periodically on major faults. Do recurrence time statistics of such earthquakes follow a particular statistical distribution? If so, which one? The answer is fundamental and has important implications for hazard assessment. The problem cannot be solved by comparing the goodness of statistical fits as the available sequences are too short. The Parkfield sequence of M ≈ 6 earthquakes, one of the most extensive reliable data sets available, has grown to merely seven events with the last earthquake in 2004, for example. Recently, however, advances in seismological monitoring and improved processing methods have unveiled so-called micro-repeaters, micro-earthquakes which recur exactly in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Micro-repeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. Due to their recent discovery, however, available sequences contain fewer than 20 events at present. In this paper we present results for the analysis of recurrence times for several micro-repeater sequences from Parkfield and adjacent regions. To improve the statistical significance of our findings, we combine several sequences into one by rescaling the individual sets by their respective mean recurrence intervals and Weibull exponents. This novel approach of rescaled combination yields the most extensive data set possible. We find that the resulting statistics can be fitted well by an exponential distribution, confirming the universal applicability of the Weibull distribution to characteristic earthquakes. A similar result is, however, obtained from rescaled combination with regard to the lognormal distribution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Using a Five-Step Procedure for Inferential Statistical Analyses
ERIC Educational Resources Information Center
Kamin, Lawrence F.
2010-01-01
Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
NASA Technical Reports Server (NTRS)
Taylor, G. R.
1972-01-01
Extensive microbiological analyses that were performed on the Apollo 14 prime and backup crewmembers and ancillary personnel are discussed. The crewmembers were subjected to four separate and quite different environments during the 137-day monitoring period. The relation between each of these environments and observed changes in the microflora of each astronaut are presented.
Harris, Michael; Radtke, Arthur S.
1976-01-01
Linear regression and discriminant analysis techniques were applied to gold, mercury, arsenic, antimony, barium, copper, molybdenum, lead, zinc, boron, tellurium, selenium, and tungsten analyses from drill holes into unoxidized gold ore at the Carlin gold mine near Carlin, Nev. The statistical treatments employed were used to judge proposed hypotheses on the origin and geochemical paragenesis of this disseminated gold deposit.
ERIC Educational Resources Information Center
Neumann, David L.; Hood, Michelle
2009-01-01
A wiki was used as part of a blended learning approach to promote collaborative learning among students in a first year university statistics class. One group of students analysed a data set and communicated the results by jointly writing a practice report using a wiki. A second group analysed the same data but communicated the results in a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Xin, E-mail: xinshih86029@gmail.com; Zhao, Xiangmo, E-mail: xinshih86029@gmail.com; Hui, Fei, E-mail: xinshih86029@gmail.com
Clock synchronization in wireless sensor networks (WSNs) has been studied extensively in recent years, and many protocols have been put forward from the standpoint of statistical signal processing, which is an effective way to optimize accuracy. However, the accuracy derived from statistical data can be improved mainly through sufficient packet exchange, which greatly consumes the limited power resources. In this paper, a reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization is proposed to optimize sync accuracy without expending additional sync packets. As a contribution, a linear weighted fusion scheme for multiple clock deviations is constructed with the collaborative sensing of clock timestamps, and the fusion weight is defined by the covariance of sync errors for the different clock deviations. Extensive simulation results show that the proposed approach can achieve better performance in terms of sync overhead and sync accuracy.
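The core of a linear weighted fusion scheme can be sketched with the classical inverse-variance weighting, which minimises the variance of the fused estimate; the offset estimates and error variances below are hypothetical, and the paper's covariance-based weight definition is reduced here to independent per-estimate variances.

```python
import numpy as np

def fuse_offsets(estimates, variances):
    """Linear weighted fusion of several clock-offset estimates.
    Weights are proportional to 1/variance, which minimises the
    variance of the fused estimate when errors are independent."""
    v = np.asarray(variances, dtype=float)
    w = (1.0 / v) / np.sum(1.0 / v)            # normalised weights
    fused = np.sum(w * np.asarray(estimates))
    fused_var = 1.0 / np.sum(1.0 / v)          # variance of the fusion
    return fused, fused_var

# Three hypothetical offset estimates (microseconds) with their
# error variances from repeated timestamp exchanges
fused, fused_var = fuse_offsets([12.1, 11.8, 12.6], [0.04, 0.09, 0.25])
```

The fused variance is strictly smaller than the best individual variance, which is why fusion improves accuracy without any additional packet exchange.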
Extreme between-study homogeneity in meta-analyses could offer useful insights.
Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias
2006-10-01
Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
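The asymptotic version of the test can be sketched directly: compute Cochran's Q and evaluate its left tail under the chi-square reference distribution. The effect sizes and variances below are hypothetical, constructed so that the studies agree implausibly well; the paper's Monte Carlo refinement of this asymptotic test is not shown.

```python
import numpy as np
from scipy import stats

def q_homogeneity(yi, vi):
    """Cochran's Q for k studies and its left-sided p-value under the
    asymptotic chi-square(k-1) reference distribution; a very small
    left-sided p flags extreme between-study homogeneity."""
    yi, vi = np.asarray(yi), np.asarray(vi)
    w = 1.0 / vi
    pooled = np.sum(w * yi) / np.sum(w)
    Q = np.sum(w * (yi - pooled) ** 2)
    p_left = stats.chi2.cdf(Q, df=len(yi) - 1)
    return Q, p_left

# Hypothetical log risk ratios that agree almost exactly despite
# sizeable within-study variances: Q is implausibly small
yi = [0.200, 0.201, 0.199, 0.200, 0.201, 0.199]
vi = [0.05, 0.04, 0.06, 0.05, 0.04, 0.06]
Q, p_left = q_homogeneity(yi, vi)
```

Results this homogeneous, given the within-study variances, warrant the kinds of checks the paper lists: correlated data, an ill-suited effect metric, inappropriate inference, or fraud.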
Cavalcante, Y L; Hauser-Davis, R A; Saraiva, A C F; Brandão, I L S; Oliveira, T F; Silveira, A M
2013-01-01
This paper compared and evaluated seasonal variations in physico-chemical parameters and metals at a hydroelectric power station reservoir by applying Multivariate Analyses and Artificial Neural Networks (ANN) statistical techniques. A Factor Analysis was used to reduce the number of variables: the first factor was composed of elements Ca, K, Mg and Na, and the second by Chemical Oxygen Demand. The ANN showed 100% correct classifications in training and validation samples. Physico-chemical analyses showed that water pH values were not statistically different between the dry and rainy seasons, while temperature, conductivity, alkalinity, ammonia and DO were higher in the dry period. TSS, hardness and COD, on the other hand, were higher during the rainy season. The statistical analyses showed that Ca, K, Mg and Na are directly connected to the Chemical Oxygen Demand, which indicates a possibility of their input into the reservoir system by domestic sewage and agricultural run-offs. These statistical applications, thus, are also relevant in cases of environmental management and policy decision-making processes, to identify which factors should be further studied and/or modified to recover degraded or contaminated water bodies. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ogunsua, B. O.; Laoye, J. A.
2018-05-01
In this paper, the Tsallis non-extensive q-statistics of ionospheric dynamics were investigated using the total electron content (TEC) obtained from two Global Positioning System (GPS) receiver stations, considering both geomagnetically quiet and storm periods. The micro-density variation of the ionospheric total electron content was extracted from the TEC data by detrending. The detrended total electron content, which represents the variation in the internal dynamics of the system, was further analyzed within the framework of non-extensive statistical mechanics using q-Gaussian methods. Our results reveal that all the analyzed data sets yield Tsallis Gaussian probability distributions (q-Gaussians) with values of q > 1. No distinct difference in pattern was observed between the q values for quiet and storm periods; however, the values of q vary with geophysical conditions and possibly with local dynamics at the two stations. Also observed were the asymmetric pattern of the q-Gaussian and a highly significant correlation between the q-index values obtained at the two GPS receiver stations for the storm periods compared to the quiet periods. This variation can mostly be attributed to the varying mechanisms responsible for the self-reorganization of the system dynamics during storm periods. The results show the existence of long-range correlation for both quiet and storm periods at the two stations.
Bell, Melanie L; Horton, Nicholas J; Dhillon, Haryana M; Bray, Victoria J; Vardy, Janette
2018-05-26
Patient reported outcomes (PROs) are important in oncology research; however, missing data can pose a threat to the validity of results. Psycho-oncology researchers should be aware of the statistical options for handling missing data robustly. One rarely used set of methods, which includes extensions for handling missing data, is generalized estimating equations (GEEs). Our objective was to demonstrate use of GEEs to analyze PROs with missing data in randomized trials with assessments at fixed time points. We introduce GEEs and show, with a worked example, how to use GEEs that account for missing data: inverse probability weighted GEEs and multiple imputation with GEE. We use data from an RCT evaluating a web-based brain-training intervention for cancer survivors reporting cognitive symptoms after chemotherapy treatment. The primary outcome for this demonstration is the binary outcome of cognitive impairment. Several methods are used, and results are compared. We demonstrate that estimates can vary depending on the choice of analytical approach, with odds ratios for no cognitive impairment ranging from 2.04 to 5.74. While most of these estimates were statistically significant (P < 0.05), a few were not. Researchers using PROs should use statistical methods that handle missing data in a way that results in unbiased estimates. GEE extensions are analytic options for handling dropouts in longitudinal RCTs, particularly if the outcome is not continuous. Copyright © 2018 John Wiley & Sons, Ltd.
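As a rough illustration of the inverse probability weighting idea (not the paper's analysis or data), the sketch below fits a weighted logistic regression by Newton-Raphson. With an independence working correlation and a single post-treatment assessment, IPW-GEE reduces to exactly this weighted score equation; all variable names, sample sizes, and effect sizes are invented.

```python
import numpy as np

def weighted_logistic(X, y, w, n_iter=25):
    """Weighted logistic regression via Newton-Raphson; with an independence
    working correlation this is the score equation solved by IPW-GEE."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (w * (y - p))
        hess = (X * (w * p * (1 - p))[:, None]).T @ X
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(1)
n = 2000
arm = rng.integers(0, 2, n)                  # randomized arm
base = rng.normal(size=n)                    # baseline covariate
logit = -0.5 + 1.0 * arm + 0.8 * base
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Dropout depends on the observed baseline covariate (missing at random)
p_obs = 1.0 / (1.0 + np.exp(-(1.5 + 1.0 * base)))
observed = rng.binomial(1, p_obs).astype(bool)

# Step 1: model the probability of being observed from covariates
Xobs = np.column_stack([np.ones(n), base])
gamma = weighted_logistic(Xobs, observed.astype(float), np.ones(n))
phat = 1.0 / (1.0 + np.exp(-(Xobs @ gamma)))

# Step 2: complete-case analysis weighted by 1 / P(observed)
Xa = np.column_stack([np.ones(n), arm, base])[observed]
beta = weighted_logistic(Xa, y[observed], 1.0 / phat[observed])
print(beta[1])   # treatment log-odds ratio; should land near the true 1.0
```

In practice one would use an existing GEE implementation (e.g., statsmodels' GEE in Python or geepack in R) rather than hand-rolling the solver; the point here is only the weighting step.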
Thieler, E. Robert; Himmelstoss, Emily A.; Zichichi, Jessica L.; Ergul, Ayhan
2009-01-01
The Digital Shoreline Analysis System (DSAS) version 4.0 is a software extension to ESRI ArcGIS v.9.2 and above that enables a user to calculate shoreline rate-of-change statistics from multiple historic shoreline positions. A user-friendly interface of simple buttons and menus guides the user through the major steps of shoreline change analysis. Components of the extension and user guide include (1) instruction on the proper way to define a reference baseline for measurements, (2) automated and manual generation of measurement transects and metadata based on user-specified parameters, and (3) output of calculated rates of shoreline change and other statistical information. DSAS computes shoreline rates of change using four different methods: (1) endpoint rate, (2) simple linear regression, (3) weighted linear regression, and (4) least median of squares. The standard error, correlation coefficient, and confidence interval are also computed for the simple and weighted linear-regression methods. The results of all rate calculations are output to a table that can be linked to the transect file by a common attribute field. DSAS is intended to facilitate the shoreline change-calculation process and to provide rate-of-change information and the statistical data necessary to establish the reliability of the calculated results. The software is also suitable for any generic application that calculates positional change over time, such as assessing rates of change of glacier limits in sequential aerial photos, river edge boundaries, land-cover changes, and so on.
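Three of the DSAS rate methods can be illustrated in a few lines; the transect data below are hypothetical, and numpy stands in for the ArcGIS tooling.

```python
import numpy as np

# Hypothetical transect: survey years and shoreline positions (metres,
# measured seaward from a reference baseline), with position uncertainties.
years     = np.array([1930.0, 1955.0, 1978.0, 1999.0, 2006.0])
positions = np.array([ 152.0,  143.5,  131.0,  120.5,  118.0])
errors    = np.array([  10.0,    8.0,    5.0,    3.0,    1.0])

# (1) End point rate: net movement divided by the time elapsed between the
# oldest and most recent shorelines.
epr = (positions[-1] - positions[0]) / (years[-1] - years[0])

# (2) Simple linear regression rate: slope of position regressed on time.
lrr = np.polyfit(years, positions, 1)[0]

# (3) Weighted linear regression rate: weights of 1/variance let the more
# accurate (typically more recent) shorelines pull the fit harder.
wlr = np.polyfit(years, positions, 1, w=1.0 / errors)[0]

print(epr, lrr, wlr)  # all negative here: the shoreline is retreating
```

Negative rates indicate landward movement (erosion); positive rates, accretion. `np.polyfit` applies its `w` argument to the residuals directly, so 1/sigma (not 1/sigma squared) gives inverse-variance weighting.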
From sexless to sexy: Why it is time for human genetics to consider and report analyses of sex.
Powers, Matthew S; Smith, Phillip H; McKee, Sherry A; Ehringer, Marissa A
2017-01-01
Science has come a long way with regard to the consideration of sex differences in clinical and preclinical research, but one field remains behind the curve: human statistical genetics. The goal of this commentary is to raise awareness and discussion about how to best consider and evaluate possible sex effects in the context of large-scale human genetic studies. Over the course of this commentary, we reinforce the importance of interpreting genetic results in the context of biological sex, establish evidence that sex differences are not being considered in human statistical genetics, and discuss how best to conduct and report such analyses. Our recommendation is to run stratified analyses by sex no matter the sample size or the result and report the findings. Summary statistics from stratified analyses are helpful for meta-analyses, and patterns of sex-dependent associations may be hidden in a combined dataset. In the age of declining sequencing costs, large consortia efforts, and a number of useful control samples, it is now time for the field of human genetics to appropriately include sex in the design, analysis, and reporting of results.
Comment on the asymptotics of a distribution-free goodness of fit test statistic.
Browne, Michael W; Shapiro, Alexander
2015-03-01
In a recent article Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed that a proof by Browne (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) of the asymptotic distribution of a goodness of fit test statistic is incomplete because it fails to prove that the orthogonal component function employed is continuous. Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed how Browne's proof can be completed satisfactorily but this required the development of an extensive and mathematically sophisticated framework for continuous orthogonal component functions. This short note provides a simple proof of the asymptotic distribution of Browne's (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) test statistic by using an equivalent form of the statistic that does not involve orthogonal component functions and consequently avoids all complicating issues associated with them.
Modular space station phase B extension, preliminary system design. Volume 4: Subsystems analyses
NASA Technical Reports Server (NTRS)
Antell, R. W.
1972-01-01
The subsystems tradeoffs, analyses, and preliminary design results are summarized. Analyses were made of the structural and mechanical, environmental control and life support, electrical power, guidance and control, reaction control, information, and crew habitability subsystems. For each subsystem a summary description is presented including subsystem requirements, subsystem description, and subsystem characteristics definition (physical, performance, and interface). The major preliminary design data and tradeoffs or analyses are described in detail at each of the assembly levels.
Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A
2015-02-22
The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. ISRCTN70923932.
Hankó-Bauer, Orsolya; Georgescu, Rares; Coros, Marius F; Boros, Monica; Barsan, Iulia; Stolnicu, Simona
We aimed to evaluate whether obese women experience more advanced invasive breast carcinoma (IBC), with a higher number of involved lymph nodes, a higher axillary lymph node ratio (LNR), and presence and larger size of extracapsular extension (ECE), as these may have an impact on prognosis and management. 245 patients diagnosed with IBC were divided into normal weight (NW), overweight (OW) and obese (OB) groups. Patients were divided into high range of LNR (LNR over or equal to 0.2) and low LNR (LNR less than 0.2). The extracapsular extension dimensions were measured on the original slides of each case and grouped into ≤ 1 mm and > 1 mm. 84 patients (33.07%) were OW, 72 (29.38%) OB and 91 (37.14%) NW. 45.7% of cases had macrometastasis in the axillary lymph nodes. NW patients had significantly fewer metastatic lymph nodes (p = 0.05) than the OW/OB groups. There was no statistically significant difference between BMI groups according to the LNR (p = 0.66). Out of 111 cases with macrometastasis, 58 cases (52.25%) had ECE (11.7% NW, 24.32% OW and 16.22% OB). Significantly more OW patients presented extranodal invasion (p = 0.04). We found no statistically significant relationship between the extracapsular extension diameter and BMI groups (p = 0.1).
Bennett, Derrick A; Landry, Denise; Little, Julian; Minelli, Cosetta
2017-09-19
Several statistical approaches have been proposed to assess and correct for exposure measurement error. We aimed to provide a critical overview of the most common approaches used in nutritional epidemiology. MEDLINE, EMBASE, BIOSIS and CINAHL were searched for reports published in English up to May 2016 in order to ascertain studies that described methods aimed to quantify and/or correct for measurement error for a continuous exposure in nutritional epidemiology using a calibration study. We identified 126 studies, 43 of which described statistical methods and 83 that applied any of these methods to a real dataset. The statistical approaches in the eligible studies were grouped into: a) approaches to quantify the relationship between different dietary assessment instruments and "true intake", which were mostly based on correlation analysis and the method of triads; b) approaches to adjust point and interval estimates of diet-disease associations for measurement error, mostly based on regression calibration analysis and its extensions. Two approaches (multiple imputation and moment reconstruction) were identified that can deal with differential measurement error. For regression calibration, the most common approach to correct for measurement error used in nutritional epidemiology, it is crucial to ensure that its assumptions and requirements are fully met. Analyses that investigate the impact of departures from the classical measurement error model on regression calibration estimates can be helpful to researchers in interpreting their findings. With regard to the possible use of alternative methods when regression calibration is not appropriate, the choice of method should depend on the measurement error model assumed, the availability of suitable calibration study data and the potential for bias due to violation of the classical measurement error model assumptions. 
On the basis of this review, we provide some practical advice for the use of methods to assess and adjust for measurement error in nutritional epidemiology.
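A minimal simulation of the regression calibration correction discussed above, assuming the classical measurement error model; the instruments, sample sizes, and effect size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5000, 500            # main study and calibration substudy sizes
beta_true = 0.5

x = rng.normal(size=n)                   # true long-term intake
w1 = x + rng.normal(scale=1.0, size=n)   # error-prone instrument (e.g., FFQ)
y = beta_true * x + rng.normal(size=n)   # continuous health outcome

# Naive analysis: regress the outcome on the error-prone measure. Under the
# classical model the slope is attenuated by lambda = var(x) / var(w).
beta_naive = np.polyfit(w1, y, 1)[0]

# Calibration substudy: an unbiased reference measure on m participants
# lets us estimate E[x | w], and hence the attenuation factor, by
# regressing the reference measure on the error-prone one.
x_ref = x[:m] + rng.normal(scale=0.3, size=m)   # reference instrument
lam = np.polyfit(w1[:m], x_ref, 1)[0]           # calibration slope

beta_corrected = beta_naive / lam
print(beta_naive, beta_corrected)  # attenuated (~0.25) vs corrected (~0.5)
```

The correction is only valid when the calibration study's reference measure has errors independent of the main instrument's; violations of that assumption are exactly the departures from the classical model the review warns about.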
NASA Astrophysics Data System (ADS)
Vogelmann, A. M.; Gustafson, W. I., Jr.; Toto, T.; Endo, S.; Cheng, X.; Li, Z.; Xiao, H.
2015-12-01
The Department of Energy's Atmospheric Radiation Measurement (ARM) Climate Research Facility's Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) workflow is currently being designed to provide output from routine LES to complement its extensive observations. The modeling portion of the LASSO workflow is presented by Gustafson et al., and will initially focus on shallow convection over the ARM megasite in Oklahoma, USA. This presentation describes how the LES output will be combined with observations to construct multi-dimensional and dynamically consistent "data cubes", aimed at providing the best description of the atmospheric state for use in analyses by the community. The megasite observations are used to constrain large-eddy simulations that provide complete spatial and temporal coverage of observables; further, the simulations also provide information on processes that cannot be observed. Statistical comparisons of model output with their observables are used to assess the quality of a given simulated realization and its associated uncertainties. A data cube is a model-observation package that provides: (1) metrics of model-observation statistical summaries to assess the simulations and the ensemble spread; (2) statistical summaries of additional model property output that cannot be or are very difficult to observe; and (3) snapshots of the 4-D simulated fields from the integration period. Searchable metrics are provided that characterize the general atmospheric state to assist users in finding cases of interest, such as categorization of daily weather conditions and their specific attributes. The data cubes will be accompanied by tools designed for easy access to cube contents from within the ARM archive and externally, the ability to compare multiple data streams within an event as well as across events, and the ability to use common grids and time sampling, where appropriate.
Lowe, Gary R; Griffin, Yolanda; Hart, Michael D
2014-08-01
Modern electronic health record systems (EHRS) reportedly offer advantages including improved quality, error prevention, cost reduction, and increased efficiency. This project reviewed the impact on specimen turnaround times (TAT) and percent compliance for specimens processed in a STAT laboratory after implementation of an upgraded EHRS. Before EHRS implementation, laboratory personnel received instruction and training for specimen processing. One laboratory member per shift received additional training. TAT and percent compliance data sampling occurred 4 times monthly for 13 months post-conversion and were compared with the mean of data collected for 3 months pre-conversion. Percent compliance was gauged using a benchmark of reporting 95% of all specimens within 7 min from receipt. Control charts were constructed for TAT and percent compliance with control limits set at 2 SD and applied continuously through the data collection period. TAT recovered to pre-conversion levels by the 6th month post-conversion. Percent compliance consistently returned to pre-conversion levels by the 10th month post-conversion. Statistical analyses revealed the TAT were significantly longer for 3 months post-conversion (P < .001) compared with pre-conversion levels. Statistical significance was not observed for subsequent groups. Percent compliance results were significantly lower for 6 months post-conversion (P < .001). Statistical significance was not observed for subsequent groups. Extensive efforts were made to train and prepare personnel for challenges expected after the EHRS upgrade. Specific causes identified with the upgraded EHRS included multiple issues involving personnel and the EHRS. These data suggest that system and user issues contributed to delays in returning to pre-conversion TAT and percent compliance levels following the upgrade in the EHRS.
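The control chart construction described above amounts to a few lines: a centre line and limits at 2 SD from pre-conversion data, applied continuously to post-conversion samples. The turnaround times below are invented for illustration.

```python
import numpy as np

def control_limits(baseline):
    """Centre line and 2-SD control limits from pre-conversion data."""
    mu, sd = np.mean(baseline), np.std(baseline, ddof=1)
    return mu, mu - 2 * sd, mu + 2 * sd

# Hypothetical pre-conversion specimen turnaround times (minutes) ...
pre_tat = np.array([5.1, 4.8, 5.5, 5.0, 4.9, 5.3, 5.2, 4.7])
centre, lcl, ucl = control_limits(pre_tat)

# ... applied continuously to post-conversion monthly samples: any sample
# mean outside the limits is flagged as a special-cause signal.
post_tat = np.array([6.9, 6.4, 5.9, 5.4, 5.2, 5.0])
signals = [t for t in post_tat if not (lcl <= t <= ucl)]
print(signals)  # the early post-conversion months exceed the upper limit
```

The gradual disappearance of signals over successive samples mirrors the project's finding that turnaround times recovered to pre-conversion levels over several months.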
Ramanathan, Arvind; Savol, Andrej J; Agarwal, Pratul K; Chennubhotla, Chakra S
2012-11-01
Biomolecular simulations at millisecond and longer time-scales can provide vital insights into functional mechanisms. Because post-simulation analyses of such large trajectory datasets can be a limiting factor in obtaining biological insights, there is an emerging need to identify key dynamical events and relate these events to biological function online, that is, as simulations are progressing. Recently, we have introduced a novel computational technique, quasi-anharmonic analysis (QAA) (Ramanathan et al., PLoS One 2011;6:e15827), for partitioning the conformational landscape into a hierarchy of functionally relevant sub-states. The unique capabilities of QAA are enabled by exploiting anharmonicity in the form of fourth-order statistics for characterizing atomic fluctuations. In this article, we extend QAA for analyzing long time-scale simulations online. In particular, we present HOST4MD--a higher-order statistical toolbox for molecular dynamics simulations, which (1) identifies key dynamical events as simulations are in progress, (2) explores potential sub-states, and (3) identifies conformational transitions that enable the protein to access those sub-states. We demonstrate HOST4MD on microsecond timescale simulations of the enzyme adenylate kinase in its apo state. HOST4MD identifies several conformational events in these simulations, revealing how the intrinsic coupling between the three subdomains (LID, CORE, and NMP) changes during the simulations. Further, it also identifies an inherent asymmetry in the opening/closing of the two binding sites. We anticipate that HOST4MD will provide a powerful and extensible framework for detecting biophysically relevant conformational coordinates from long time-scale simulations. Copyright © 2012 Wiley Periodicals, Inc.
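The fourth-order statistic that QAA-style analyses exploit is, in its simplest form, the excess kurtosis of atomic fluctuations. The sketch below (illustrative, not HOST4MD code) shows how a coordinate hopping between two sub-states stands out from a harmonic one.

```python
import numpy as np

def excess_kurtosis(fluct):
    """Per-coordinate excess kurtosis of fluctuations; values far from 0
    flag anharmonic (non-Gaussian) degrees of freedom."""
    centered = fluct - fluct.mean(axis=0)
    m2 = (centered ** 2).mean(axis=0)
    m4 = (centered ** 4).mean(axis=0)
    return m4 / m2 ** 2 - 3.0

rng = np.random.default_rng(0)
n_frames = 20000
harmonic = rng.normal(size=n_frames)                   # single-well, Gaussian
two_state = np.where(rng.random(n_frames) < 0.5,       # hopping between
                     rng.normal(-2.0, 0.5, n_frames),  # two sub-states
                     rng.normal(+2.0, 0.5, n_frames))
traj = np.column_stack([harmonic, two_state])

k = excess_kurtosis(traj)
print(k)  # near 0 for the harmonic coordinate, strongly negative (bimodal)
          # for the two-state one
```

A bimodal coordinate has negative excess kurtosis while heavy-tailed rattling gives positive values; either deviation from zero marks a direction worth inspecting for sub-state transitions.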
Generalized functional linear models for gene-based case-control association studies.
Fan, Ruzong; Wang, Yifan; Mills, James L; Carter, Tonia C; Lobach, Iryna; Wilson, Alexander F; Bailey-Wilson, Joan E; Weeks, Daniel E; Xiong, Momiao
2014-11-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene region are disease related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease datasets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. © 2014 WILEY PERIODICALS, INC.
Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.
2014-01-01
Cervical spinal injuries are a significant concern in all trauma. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
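The statistical shape modeling step can be sketched as a PCA of corresponding landmark coordinates: new geometries are generated as the mean shape plus weighted modes of variation. The geometries below are synthetic stand-ins for the five cadaver spines, and the decomposition is illustrative rather than the authors' pipeline.

```python
import numpy as np

# Synthetic training set: 5 geometries, each described by the same 10
# corresponding 3-D landmarks, flattened to rows of length 30.
rng = np.random.default_rng(0)
mean_shape = rng.normal(size=30)
modes_true = rng.normal(size=(2, 30))      # two underlying modes of variation
scores = rng.normal(size=(5, 2))
shapes = mean_shape + scores @ modes_true  # 5 training geometries

# Statistical shape model: PCA of the landmark matrix via SVD.
mu = shapes.mean(axis=0)
u, s, vt = np.linalg.svd(shapes - mu, full_matrices=False)
eigvals = s ** 2 / (shapes.shape[0] - 1)   # variance captured per mode

def sample_shape(b):
    """Model instance from standardized mode weights b:
    mean + sum_k b_k * sqrt(eigval_k) * mode_k."""
    k = len(b)
    return mu + (b * np.sqrt(eigvals[:k])) @ vt[:k]

new_geom = sample_shape(np.array([1.0, -0.5]))  # a plausible new geometry
```

Sampling the weights from standard normals yields a population of plausible geometries, which is what feeds the probabilistic finite element analyses described in the abstract.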
Asteroid shape and spin statistics from convex models
NASA Astrophysics Data System (ADS)
Torppa, J.; Hentunen, V.-P.; Pääkkönen, P.; Kehusmaa, P.; Muinonen, K.
2008-11-01
We introduce techniques for characterizing convex shape models of asteroids with a small number of parameters, and apply these techniques to a set of 87 models from convex inversion. We present three different approaches for determining the overall dimensions of an asteroid. With the first technique, we measured the dimensions of the shapes in the direction of the rotation axis and in the equatorial plane, and with the two other techniques we derived the best-fit ellipsoid. We also computed the inertia matrix of the model shape to test how well it represents the target asteroid, i.e., to find indications of possible non-convex features or albedo variegation, which the convex shape model cannot reproduce. We used shape models for 87 asteroids to perform statistical analyses and to study dependencies between shape and rotation period, size, and taxonomic type. We detected correlations, but more data are required, especially on small and large objects, as well as slow and fast rotators, to reach a more thorough understanding of the dependencies. Results show, e.g., that convex models of asteroids are not that far from ellipsoids in the root-mean-square sense, even though clearly irregular features are present. We also present new spin and shape solutions for Asteroids (31) Euphrosyne, (54) Alexandra, (79) Eurynome, (93) Minerva, (130) Elektra, (376) Geometria, (471) Papagena, and (776) Berbericia. We used a so-called semi-statistical approach to obtain a set of possible spin state solutions. The number of solutions depends on the abundance of the data, which for Eurynome, Elektra, and Geometria was extensive enough for determining an unambiguous spin and shape solution. Data of Euphrosyne, on the other hand, provided a wide distribution of possible spin solutions, whereas the rest of the targets have two or three possible solutions.
NASA Astrophysics Data System (ADS)
Burns, R. G.; Meyer, R. W.; Cornwell, K.
2003-12-01
In-basin statistical relations allow for development of regional flood frequency and magnitude equations in the Cosumnes River and Mokelumne River drainage basins. Current equations were derived from data collected through 1975, and do not reflect newer data with some significant flooding. Physical basin characteristics (area, mean basin elevation, slope of longest reach, and mean annual precipitation) were correlated against predicted flood discharges for each of the 5, 10, 25, 50, 100, 200, and 500-year recurrence intervals in a multivariate analysis. Predicted maximum instantaneous flood discharges were determined using the PEAKFQ program with default settings, for 24 stream gages within the study area presumed not affected by flow management practices. For numerical comparisons, GIS-based methods using Spatial Analyst and the Arc Hydro Tools extension were applied to derive physical basin characteristics as predictor variables from a 30m digital elevation model (DEM) and a mean annual precipitation raster (PRISM). In a bivariate analysis, examination of Pearson correlation coefficients, F-statistics, and t- and p-value thresholds shows good correlation between area and flood discharges. Similar analyses show poor correlation of mean basin elevation, slope, and precipitation with flood discharge. Bivariate analysis suggests slope may not be an appropriate predictor term for use in the multivariate analysis. Precipitation and elevation correlate very well, demonstrating possible orographic effects. From the multivariate analysis, less than 6% of the variability in the correlation is left unexplained for flood recurrences up to 25 years. Longer-term predictions up to 500 years accrue greater uncertainty, with as much as 15% of the variability in the correlation left unexplained.
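Regional flood frequency equations of this kind are conventionally power laws fit in log-log space. The gage data below are hypothetical, and numpy replaces the GIS workflow, to show the shape of the regression.

```python
import numpy as np

# Hypothetical gage data: drainage area (sq mi) and the 100-year peak
# discharge (cfs) estimated at each gage by a PEAKFQ-style analysis.
area = np.array([12.0, 35.0, 60.0, 110.0, 240.0, 480.0])
q100 = np.array([900.0, 2100.0, 3200.0, 5200.0, 9800.0, 17000.0])

# Regional equations take the form Q_T = a * A^b, which is a straight
# line after a log10 transform of both sides.
b, log_a = np.polyfit(np.log10(area), np.log10(q100), 1)
a = 10 ** log_a

predicted = a * area ** b
resid = np.log10(q100) - np.log10(predicted)
r2 = 1 - np.sum(resid ** 2) / np.sum(
    (np.log10(q100) - np.log10(q100).mean()) ** 2)
print(f"Q100 ~ {a:.1f} * A^{b:.2f}, R^2 = {r2:.3f}")
```

Additional predictors (mean basin elevation, precipitation) would enter as further multiplicative terms, i.e., extra columns in the log-space design matrix; the abstract's bivariate screening is how candidate terms are vetted before that step.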
Nocturnal oxygen saturation profiles of healthy term infants
Terrill, Philip Ian; Dakin, Carolyn; Hughes, Ian; Yuill, Maggie; Parsley, Chloe
2015-01-01
Objective: Pulse oximetry is used extensively in hospital and home settings to measure arterial oxygen saturation (SpO2). Interpretation of the trend and range of SpO2 values observed in infants is currently limited by a lack of reference ranges using current devices, and may be augmented by development of cumulative frequency (CF) reference-curves. This study aims to provide reference oxygen saturation values from a prospective longitudinal cohort of healthy infants. Design: Prospective longitudinal cohort study. Setting: Sleep laboratory. Patients: 34 healthy term infants were enrolled, and studied at 2 weeks, 3, 6, 12 and 24 months of age (N=30, 25, 27, 26, 20, respectively). Interventions: Full overnight polysomnography, including 2 s averaging pulse oximetry (Masimo Radical). Main outcome measurements: Summary SpO2 statistics (mean, median, 5th and 10th percentiles) and SpO2 CF plots were calculated for each recording. CF reference-curves were then generated for each study age. Analyses were repeated with sleep-state stratifications and inclusion of manual artefact removal. Results: Median nocturnal SpO2 values ranged between 98% and 99% over the first 2 years of life and the CF reference-curves shift right by 1% between 2 weeks and 3 months. CF reference-curves did not change with manual artefact removal during sleep and did not vary between rapid eye movement (REM) and non-REM sleep. Manual artefact removal did significantly change summary statistics and CF reference-curves during wake. Conclusions: SpO2 CF curves provide an intuitive visual tool for evaluating whether an individual's nocturnal SpO2 distribution falls within the range of healthy age-matched infants, thereby complementing summary statistics in the interpretation of extended oximetry recordings in infants. PMID:25063836
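A cumulative frequency curve and the reported summary statistics can be computed directly from the SpO2 samples. The overnight recording below is simulated, not study data.

```python
import numpy as np

def cumulative_frequency(spo2):
    """Empirical cumulative frequency of SpO2 samples: the fraction of the
    recording spent at or below each saturation value."""
    values = np.sort(np.asarray(spo2))
    frac = np.arange(1, len(values) + 1) / len(values)
    return values, frac

# Simulated overnight recording (2-s samples), mostly in the 97-100% range
rng = np.random.default_rng(0)
night = np.clip(np.round(rng.normal(98.5, 1.0, size=10000)), 85, 100)

values, frac = cumulative_frequency(night)
median = np.percentile(night, 50)
p5, p10 = np.percentile(night, [5, 10])
print(median, p5, p10)  # summary statistics reported alongside the CF curve
```

Plotting `frac` against `values` for an individual infant on top of the age-matched reference curves is the visual comparison the study proposes.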
The Toxicological Evaluation of Realistic Emissions of Source Aerosols Study: Statistical Methods
Coull, Brent A.; Wellenius, Gregory A.; Gonzalez-Flecha, Beatriz; Diaz, Edgar; Koutrakis, Petros; Godleski, John J.
2013-01-01
The Toxicological Evaluation of Realistic Emissions of Source Aerosols (TERESA) study involved withdrawal, aging, and atmospheric transformation of emissions of three coal-fired power plants. Toxicological evaluations were carried out in rats exposed to different emission scenarios with extensive exposure characterization. Data generated had multiple levels of resolution: exposure, scenario and constituent chemical composition. Here, we outline a multilayered approach to analyze the associations between exposure and health effects beginning with standard ANOVA models that treat exposure as a categorical variable. The model assessed differences in exposure effects across scenarios (by plant). To assess unadjusted associations between pollutant concentrations and health, univariate analyses were conducted using the difference between the response means under exposed and control conditions and a single constituent concentration as the predictor. Then, a novel multivariate analysis of exposure composition and health was used based on random forests, a recent extension of classification and regression trees that were applied to the outcome differences. For each exposure constituent, this approach yielded a nonparametric measure of the importance of that constituent in predicting differences in response on a given day, controlling for the other measured constituent concentrations in the model. Finally, an R2 analysis compared the relative importance of exposure scenario, plant, and constituent concentrations on each outcome. Peak expiratory flow is used to demonstrate how the multiple levels of the analysis complement each other to assess constituents most strongly associated with health effects. PMID:21913820
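The univariate screening layer of the analysis above can be sketched as follows; the random-forest importance layer is omitted since it would require a tree library (e.g., scikit-learn's RandomForestRegressor). Constituents, concentrations, and effect sizes are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 60
# Hypothetical constituent concentrations measured on each exposure day
conc = {"SO4": rng.lognormal(1.0, 0.4, n_days),
        "EC":  rng.lognormal(0.0, 0.4, n_days),
        "OC":  rng.lognormal(0.5, 0.4, n_days)}
# Outcome difference: exposed-minus-control mean peak expiratory flow,
# driven here (by construction) mainly by sulfate.
diff = -2.0 * conc["SO4"] - 0.5 * conc["EC"] + rng.normal(0, 1.0, n_days)

# Univariate analysis: regress the exposed-control difference on a single
# constituent concentration at a time, recording slope and R^2.
results = {}
for name, x in conc.items():
    slope, intercept = np.polyfit(x, diff, 1)
    resid = diff - (slope * x + intercept)
    results[name] = 1 - resid.var() / diff.var()
    print(f"{name}: slope={slope:+.2f}, R^2={results[name]:.2f}")
```

The multivariate random-forest step would instead rank constituents by how much permuting each one degrades out-of-bag prediction of `diff`, controlling for the other measured constituents; the univariate screen here is the unadjusted first pass the abstract describes.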
Martínez, Cristina; Méndez, Carlos; Sánchez, María; Martínez-Sánchez, José María
To assess attitudes towards the extension of outdoor smoke-free areas on university campuses. Cross-sectional study (n=384) conducted using a questionnaire administered to medical and nursing students in Barcelona in 2014. Information was obtained pertaining to support for indoor and outdoor smoking bans on university campuses, and the importance of acting as role models. Logistic regression analyses were performed to examine agreement. Most of the students agreed on the importance of health professionals and students acting as role models (74.9% and 64.1%, respectively), although there were statistically significant differences by smoking status and age. Ninety per cent of students reported exposure to smoke on campus. Students expressed strong support for indoor smoke-free policies (97.9%). However, only 39.3% of participants supported regulation of outdoor smoking on university campuses. Non-smokers (OR=12.315; 95% CI: 5.377-28.204) and students ≥22 years old (OR=3.001; 95% CI: 1.439-6.257) were the strongest supporters. The students supported indoor smoke-free policies for universities. However, support for extending smoke-free regulations to outdoor areas of university campuses was limited. It is necessary to educate students about tobacco control and to emphasise their importance as role models before extending outdoor smoke-free legislation to university campuses. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
The extension of smoke-free areas and acute myocardial infarction mortality: before and after study.
Villalbí, Joan R; Sánchez, Emília; Benet, Josep; Cabezas, Carmen; Castillo, Antonia; Guarga, Alex; Saltó, Esteve; Tresserras, Ricard
2011-05-18
Recent studies suggest that comprehensive smoking regulations to decrease exposure to second-hand smoke reduce the rates of acute myocardial infarction (AMI). The objective of this paper is to analyse whether deaths due to AMI in Spain declined after smoking prevention legislation came into force in January 2006. Information was collected on deaths registered by the Instituto Nacional de Estadística for 2004-2007. Age- and sex-specific annual AMI mortality rates with 95% CIs were estimated, as well as age-adjusted annual AMI mortality rates by sex. Annual relative risks of death from AMI were estimated with an age-standardised Poisson regression model. Adjusted AMI mortality rates in 2004 and 2005 are similar, but in 2006 they show a 9% decline for men and an 8.7% decline for women, especially among those over 64 years of age. In 2007 there is a slower rate of decline, which reaches statistical significance for men (-4.8%) but not for women (-4%). The annual relative risk of AMI death decreased in both sexes (p < 0.001) from 1 to 0.90 in 2006, and to 0.86 in 2007. The extension of smoke-free regulations in Spain was associated with a reduction in AMI mortality, especially among the elderly. Although other factors may have played a role, this pattern suggests a likely influence of the reduction in population exposure to second-hand smoke on AMI deaths.
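Age-adjusted rates like those reported above are conventionally obtained by direct standardization: each age-specific rate is weighted by a fixed standard population. A toy sketch with invented figures (not the Spanish mortality data; the age bands and weights are illustrative assumptions):

```python
# Hypothetical age-specific AMI deaths and person-years for two calendar years
deaths_2005 = {"<65": 500, "65+": 4500}
deaths_2006 = {"<65": 470, "65+": 4050}
person_years = {"<65": 30_000_000, "65+": 6_000_000}

# Fixed standard-population weights (must sum to 1)
std_weights = {"<65": 0.8, "65+": 0.2}

def adjusted_rate(deaths, py, weights, per=100_000):
    """Directly age-standardized rate per `per` person-years."""
    return sum(weights[a] * deaths[a] / py[a] * per for a in deaths)

rate_2005 = adjusted_rate(deaths_2005, person_years, std_weights)
rate_2006 = adjusted_rate(deaths_2006, person_years, std_weights)
relative_change = (rate_2006 - rate_2005) / rate_2005
```

With these invented inputs the adjusted rate falls by roughly 10% between the two years, mirroring the magnitude of decline the abstract reports.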
[Characteristics of Dupuytren's disease in women. A study of 67 cases].
Ferry, N; Lasserre, G; Pauchot, J; Lepage, D; Tropet, Y
2013-12-01
The aim of this study was to identify clinical differences in Dupuytren's disease between the sexes. Testosterone increases the proliferation of Dupuytren's fibroblasts via androgen receptors, and testosterone levels rise during pregnancy and menopause. We therefore sought a link between these factors and the clinical presentation of Dupuytren's disease in the women of our study. This retrospective, comparative study included all women and a randomly selected sample of men who underwent surgery for Dupuytren's disease between 1980 and 2010. We analysed all epidemiologic and clinical data, the surgical procedures and the complications. Pre- and postoperative measurements of the extension deficit of all joints were performed with a manual goniometer. The Disabilities of the Arm, Shoulder and Hand (DASH) questionnaire was used to evaluate patient function. Data specific to women were also collected. Sixty-seven women and 69 men were compared. Complex regional pain syndrome was significantly more common in women, and correction of the proximal interphalangeal joint was significantly lower in women. The recurrence rate and mean follow-up were not statistically different. The mean DASH score was higher in women. We found no association between menopause or pregnancy and the average age at presentation of the disease, the recurrence rate or the extension deficit. The prognosis of Dupuytren's disease is worse in women than in men. Further studies are needed to establish the link between testosterone and the clinical course of the disease in women. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
Atlantic Bluefin Tuna (Thunnus thynnus) Biometrics and Condition.
Rodriguez-Marin, Enrique; Ortiz, Mauricio; Ortiz de Urbina, José María; Quelle, Pablo; Walter, John; Abid, Noureddine; Addis, Piero; Alot, Enrique; Andrushchenko, Irene; Deguara, Simeon; Di Natale, Antonio; Gatt, Mark; Golet, Walter; Karakulak, Saadet; Kimoto, Ai; Macias, David; Saber, Samar; Santos, Miguel Neves; Zarrad, Rafik
2015-01-01
The compiled data for this study represents the first Atlantic and Mediterranean-wide effort to pool all available biometric data for Atlantic bluefin tuna (Thunnus thynnus) with the collaboration of many countries and scientific groups. Biometric relationships were based on an extensive sampling (over 140,000 fish sampled), covering most of the fishing areas for this species in the North Atlantic Ocean and Mediterranean Sea. Sensitivity analyses were carried out to evaluate the representativeness of sampling and explore the most adequate procedure to fit the weight-length relationship (WLR). The selected model for the WLRs by stock included standardized data series (common measurement types) weighted by the inverse variability. There was little difference between annual stock-specific round weight-straight fork length relationships, with an overall difference of 6% in weight. The predicted weight by month was estimated as an additional component in the exponent of the weight-length function. The analyses of monthly variations of fish condition by stock, maturity state and geographic area reflect annual cycles of spawning and feeding behavior. We update and improve upon the biometric relationships for bluefin currently used by the International Commission for the Conservation of Atlantic Tunas, by incorporating substantially larger datasets than ever previously compiled, providing complete documentation of sources and employing robust statistical fitting. WLRs and other conversion factors estimated in this study differ from the ones used in previous bluefin stock assessments.
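The weight-length relationship (WLR) referred to above is conventionally W = a·L^b, fitted by least squares on log-transformed data. A minimal sketch with synthetic fish (the parameter values, noise level, and sample size are illustrative assumptions, not the ICCAT estimates):

```python
import math
import random

random.seed(2)

# Synthetic sample generated from W = a * L^b with multiplicative noise
a_true, b_true = 2e-5, 3.0            # W in kg, L (straight fork length) in cm
lengths = [random.uniform(50, 300) for _ in range(1000)]
weights = [a_true * L**b_true * math.exp(random.gauss(0, 0.05)) for L in lengths]

# Fit log W = log a + b log L by simple least squares
xs = [math.log(L) for L in lengths]
ys = [math.log(W) for W in weights]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
b_hat = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
a_hat = math.exp(my - b_hat * mx)

predicted_200cm = a_hat * 200**b_hat   # predicted round weight at 200 cm, kg
```

An exponent b near 3 corresponds to isometric growth; the study's weighted fits refine exactly this kind of relationship across stocks and measurement types.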
A null model for microbial diversification
Straub, Timothy J.
2017-01-01
Whether prokaryotes (Bacteria and Archaea) are naturally organized into phenotypically and genetically cohesive units comparable to animal or plant species remains contested, frustrating attempts to estimate how many such units there might be, or to identify the ecological roles they play. Analyses of gene sequences in various closely related prokaryotic groups reveal that sequence diversity is typically organized into distinct clusters, and processes such as periodic selection and extensive recombination are understood to be drivers of cluster formation (“speciation”). However, observed patterns are rarely compared with those obtainable with simple null models of diversification under stochastic lineage birth and death and random genetic drift. Via a combination of simulations and analyses of core and phylogenetic marker genes, we show that patterns of diversity for the genera Escherichia, Neisseria, and Borrelia are generally indistinguishable from patterns arising under a null model. We suggest that caution should thus be taken in interpreting observed clustering as a result of selective evolutionary forces. Unknown forces do, however, appear to play a role in Helicobacter pylori, and some individual genes in all groups fail to conform to the null model. Taken together, we recommend the presented birth−death model as a null hypothesis in prokaryotic speciation studies. It is only when the real data are statistically different from the expectations under the null model that some speciation process should be invoked. PMID:28630293
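The flavor of such a null model can be conveyed with a toy stochastic birth-death walk on lineage counts — far simpler than the coalescent-style model the authors use, and purely illustrative (the step count and birth probability are arbitrary assumptions):

```python
import random

random.seed(42)

def simulate_birth_death(n_steps, birth_prob=0.6):
    """Track lineage count under a simple stochastic birth-death walk.

    Each step one event occurs: a birth (a lineage splits) with
    probability `birth_prob`, otherwise a death (a lineage goes
    extinct). The count is floored at 1 so the clade survives.
    """
    lineages = 1
    for _ in range(n_steps):
        if random.random() < birth_prob:
            lineages += 1
        elif lineages > 1:
            lineages -= 1
    return lineages

# Distribution of final lineage counts across replicate clades
counts = [simulate_birth_death(200) for _ in range(500)]
mean_lineages = sum(counts) / len(counts)
```

Even this crude null generates wide variation in clade sizes from pure chance, which is the paper's point: observed clustering must be compared against such baselines before invoking selection.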
Qiu, Ya; Liu, Hua; Qing, Yufeng; Yang, Min; Tan, Xiaoyao; Zhao, Mingcai; Lin, Monica; Zhou, Jingguo
2014-09-01
Individual genetic association studies examining the relationship between the ABCG2 gene polymorphisms and gout have yielded inconsistent results. This study aims to evaluate the association between the ABCG2 gene variants and gout using meta-analysis. Relevant studies were identified by searching databases extensively. The odds ratio (OR) was calculated using a random-effect or fixed-effect model. A Q statistic was used to evaluate homogeneity, and Egger's test and funnel plot were used to assess publication bias. Subgroup analyses on ethnicities and sex were also performed. A total of 7 studies, including 2185 gout patients and 8028 controls from 5 countries or regions, were included and identified for the current meta-analysis. It was found that the A allele or AA genotype of the ABCG2 Q141K polymorphism (rs2231142) was associated with an increased risk of gout in the general population (A allele, p < 0.00001 and AA genotype, p < 0.00001, respectively). Conversely, the CC genotype was protective against gout (p < 0.00001). Similar results were found in subgroup analyses. However, there was significant heterogeneity among studies. Existing evidence indicates that the Q141K polymorphism (rs2231142, the A allele and AA genotype) is associated with an increased risk of gout.
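The pooling step in such a meta-analysis is typically inverse-variance weighting of log odds ratios, with Cochran's Q assessing heterogeneity. A minimal sketch with invented study results (not the gout data; the ORs and CIs below are fabricated for illustration):

```python
import math

# Hypothetical per-study odds ratios with 95% confidence limits
studies = [
    (2.1, 1.5, 2.9),
    (1.8, 1.2, 2.7),
    (2.5, 1.6, 3.9),
]

# Inverse-variance fixed-effect pooling on the log-OR scale
log_ors, weights = [], []
for or_, lo, hi in studies:
    log_or = math.log(or_)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # back out SE from the CI
    log_ors.append(log_or)
    weights.append(1 / se**2)

pooled_log_or = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
pooled_or = math.exp(pooled_log_or)

# Cochran's Q statistic: weighted squared deviations from the pooled effect
q_stat = sum(w * (y - pooled_log_or) ** 2 for w, y in zip(weights, log_ors))
```

A large Q relative to its chi-square reference (degrees of freedom = number of studies minus one) would motivate the switch to a random-effects model, as the abstract's heterogeneity finding did.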
The potential pitfalls of studying adult sex ratios at aggregate levels in humans.
Pollet, Thomas V; Stoevenbelt, Andrea H; Kuppens, Toon
2017-09-19
Human adult sex ratios have been studied extensively across the biological and social sciences. While several studies have examined adult sex ratio effects in a multilevel perspective, many studies have focused on effects at an aggregated level only. In this paper, we review some key issues relating to such analyses. We address not only nation-level analyses, but also aggregation at lower levels, to investigate whether these issues extend to lower levels of aggregation. We illustrate these issues with novel databases covering a broad range of variables. Specifically, we discuss distributional issues with aggregated measures of adult sex ratio, significance testing, and statistical non-independence when using aggregate data. Firstly, we show that there are severe distributional issues with national adult sex ratio, such as extreme cases. Secondly, we demonstrate that many 'meaningless' variables are significantly correlated with adult sex ratio (e.g., maximum elevation correlates with adult sex ratio at the US state level). Finally, we re-examine associations between adult sex ratios and teenage fertility and find no robust evidence for an association at the aggregate level. Our review highlights the potential issues of using aggregate data on adult sex ratios to test hypotheses from an evolutionary perspective in humans. This article is part of the themed issue 'Adult sex ratios and reproductive decisions: a critical re-examination of sex differences in human and animal societies'. © 2017 The Author(s).
2013-01-01
Background Species are the fundamental units in evolutionary biology. However, defining them as evolutionarily independent lineages requires integration of several independent sources of information in order to develop robust hypotheses for taxonomic classification. Here, we propose an integrative framework for species delimitation, exemplified in the "brown lemur complex" (BLC) of Madagascar, which consists of seven allopatric populations of the genus Eulemur (Primates: Lemuridae). These populations were sampled extensively across northern, eastern and western Madagascar to collect fecal samples for DNA extraction as well as recordings of vocalizations. Our database was extended by including museum specimens with reliable identification and locality information for skull shape and pelage color analysis. Results Between-group analyses of principal components revealed significant heterogeneity in skull shape, pelage color variation and loud calls across all seven populations. Furthermore, post-hoc statistical tests between pairs of populations revealed considerable discordance among different data sets for different dyads. Despite a high degree of incomplete lineage sorting among nuclear loci, significant exclusive ancestry was found for all populations, except for E. cinereiceps, based on one mitochondrial and three nuclear genetic loci. Conclusions Using several independent lines of evidence, our results confirm the species status of the members of the BLC under the general lineage concept of species. More generally, the present analyses demonstrate the importance and value of integrating different kinds of data in delimiting recently evolved radiations. PMID:24159931
Parental Opinions and Attitudes about Children's Vaccination Safety in Silesian Voivodeship, Poland.
Braczkowska, Bogumiła; Kowalska, Małgorzata; Barański, Kamil; Gajda, Maksymilian; Kurowski, Tomasz; Zejda, Jan E
2018-04-15
Despite mandatory vaccinations in Poland, the final decision on vaccination in children is taken by their parents or legal guardians. Understanding parents' attitudes and opinions regarding vaccinations is essential for planning and undertaking extensive and properly targeted educational actions aimed at preventing their hesitancy. In 2016, a cross-sectional study was conducted in the Silesian Voivodeship (Poland) in 11 randomly selected educational institutions. The authors' self-administered questionnaire contained 24 mixed-type questions. It was distributed among 3000 parents or legal guardians of children aged 6-13 years; prior consent of the relevant bioethics committee had been obtained. The response rate was 41.3% (N = 1239). Data were analysed using descriptive and analytical statistics, and focused on parental opinions regarding the safety of vaccines. Results of simple and multivariable analyses showed that perceived risk of adverse vaccine reaction (AVR), contraindications and perception of the qualification procedure for vaccination as substandard were significant factors associated with the rating of children's vaccination as unsafe (p < 0.001). Respondents with a lower level of education, compared with those with higher, more often declared vaccinations to be safe (p = 0.03); however, results of multivariable analysis did not confirm that effect. AVR occurrence, finding of contraindication to vaccinations and perception of the qualification procedure for vaccination were found to be the most important factors responsible for influencing general public opinions in the field of vaccination safety.
Rezende, Thiago J R; Silva, Cynthia B; Yassuda, Clarissa L; Campos, Brunno M; D'Abreu, Anelyssa; Cendes, Fernando; Lopes-Cendes, Iscia; França, Marcondes C
2016-01-01
Spinal cord and peripheral nerves are classically known to be damaged in Friedreich's ataxia, but the extent of cerebral involvement in the disease and its progression over time are not yet characterized. The aim of this study was to evaluate longitudinally cerebral damage in Friedreich's ataxia. We enrolled 31 patients and 40 controls, which were evaluated at baseline and after 1 and 2 years. To assess gray matter, we employed voxel-based morphometry and cortical thickness measurements. White matter was evaluated using diffusion tensor imaging. Statistical analyses were both cross-sectional and longitudinal (corrected for multiple comparisons). Group comparison between patients and controls revealed widespread macrostructural differences at baseline: gray matter atrophy in the dentate nuclei, brainstem, and precentral gyri; and white matter atrophy in the cerebellum and superior cerebellar peduncles, brainstem, and periventricular areas. We did not identify any longitudinal volumetric change over time. There were extensive microstructural alterations, including superior cerebellar peduncles, corpus callosum, and pyramidal tracts. Longitudinal analyses identified progressive microstructural abnormalities at the corpus callosum, pyramidal tracts, and superior cerebellar peduncles after 1 year of follow-up. Patients with Friedreich's ataxia present more widespread gray and white matter damage than previously reported, including not only infratentorial areas, but also supratentorial structures. Furthermore, patients with Friedreich's ataxia have progressive microstructural abnormalities amenable to detection in a short-term follow-up. © 2015 International Parkinson and Movement Disorder Society.
Tyllianakis, Emmanouil; Skuras, Dimitris
2016-11-01
The income elasticity of Willingness-To-Pay (WTP) is ambiguous and results from meta-analyses are disparate. This may be because the environmental good or service to be valued is very broadly defined or because the income measured in individual studies suffers from extensive non-reporting or misreporting. The present study carries out a meta-analysis of WTP to restore Good Ecological Status (GES) under the Water Framework Directive (WFD). This environmental service is narrowly defined and its aims and objectives are commonly understood among the members of the scientific community. Besides income reported by the individual studies, wealth and income indicators collected by Eurostat for the geographic entities covered by the individual studies are used. Meta-regression analyses show that income is statistically significant, explains a substantial proportion of WTP variability and its elasticity is considerable in magnitude, ranging from 0.6 to almost 1.7. Results are robust to variations in the sample of the individual studies participating in the meta-analysis, the econometric approach and the functional form of the meta-regression. What matters is not so much the choice of wealth or income measure as whether that measure is Purchasing Power Parity (PPP) adjusted across the individual studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
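In a double-log meta-regression, the coefficient on log income is itself the income elasticity of WTP. A toy sketch with simulated study-level data (the true elasticity of 1.0, the noise level, and the sample size are assumptions for illustration only):

```python
import random

random.seed(3)

# Hypothetical meta-analysis rows: log mean income and log mean WTP per
# study, generated with a true income elasticity of 1.0
n_studies = 60
log_income = [random.gauss(10.0, 0.5) for _ in range(n_studies)]
log_wtp = [-6.0 + 1.0 * x + random.gauss(0.0, 0.3) for x in log_income]

# Simple least squares of log WTP on log income; the slope is the
# estimated income elasticity
mx = sum(log_income) / n_studies
my = sum(log_wtp) / n_studies
elasticity = (sum((x - mx) * (y - my) for x, y in zip(log_income, log_wtp))
              / sum((x - mx) ** 2 for x in log_income))
```

An elasticity below 1 would mark GES restoration as a normal good, above 1 as a luxury good, which is why the 0.6-1.7 range reported above matters for transferring WTP estimates across income levels.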
The Genesis Mission Solar Wind Collection: Solar-Wind Statistics over the Period of Collection
NASA Technical Reports Server (NTRS)
Barraclough, B. L.; Wiens, R. C.; Steinberg, J. E.; Reisenfeld, D. B.; Neugebauer, M.; Burnett, D. S.; Gosling, J.; Bremmer, R. R.
2004-01-01
The NASA Genesis spacecraft was launched on August 8, 2001 on a mission to collect samples of solar wind for 2 years and return them to Earth on September 8, 2004. Detailed analyses of the solar wind ions implanted into high-purity collection substrates will be carried out using various mass spectrometry techniques. These analyses are expected to determine key isotopic ratios and elemental abundances in the solar wind, and by extension, in the solar photosphere. Further, the photospheric composition is thought to be representative of the solar nebula with a few exceptions, so the Genesis mission will provide a baseline for the average solar nebula composition with which to compare present-day compositions of planets, meteorites, and asteroids. The collection of solar wind samples is almost complete. Collection began for most substrates in early December 2001 and is scheduled to be complete on April 2 of this year. It is critical to understand the solar-wind conditions during the collection phase of the mission. For this reason, plasma ion and electron spectrometers are continuously monitoring the solar wind proton density, velocity, temperature, the alpha/proton ratio, and the angular distribution of suprathermal electrons. Here we report on the solar-wind conditions as observed by these in-situ instruments during the first half of the collection phase of the mission, from December 2001 to the present.
[STandardized Reporting Of Secondary data Analyses (STROSA)—a recommendation].
Swart, Enno; Schmitt, Jochen
2014-01-01
Secondary data analyses will play an increasingly important role in health services research. To date, however, there is no guideline for the systematic, transparent and complete reporting of secondary data. We investigated whether the STROBE statement, i.e., the recommendations for reporting observational studies, satisfies the specific characteristics of secondary data analyses and whether any specifications, modifications or extensions are necessary. For the majority of the 22 STROBE criteria, specifications and extensions are needed to meet the requirements of systematic, transparent and complete reporting of secondary data analysis. Seven aspects of secondary data analysis not covered by STROBE (legal aspects, data flow, protocol, unit of analysis, internal validations/definitions, advantages of secondary data utilisation, role of data owners) should be considered as a specific complement to STROBE. The so-called STROSA (STandardized Reporting Of Secondary data Analyses) checklist therefore includes 29 items that relate to the title/abstract, introduction, methods, results and discussion sections of articles. The STROSA checklist is intended to support authors and readers in the critical appraisal of secondary data analyses. This proposal will now be subject to continued scientific discussions. Copyright © 2014. Published by Elsevier GmbH.
ERIC Educational Resources Information Center
Karamidehkordi, Esmail
2013-01-01
Purpose: This article aims to show the linkage of Iranian agricultural research centres with extension and farmers, using three case studies in 1999, 2005 and 2010. Design/methodology/approach: The data were collected through document analyses, structured and semi-structured interviews and observations. Findings: The 1999 and 2005 cases were…
The Influence of Translation on Reading Amount, Proficiency, and Speed in Extensive Reading
ERIC Educational Resources Information Center
Sakurai, Nobuko
2015-01-01
This study attempted to examine the influence of a decrease in translation on the number of words read, reading comprehension, and reading rate in an extensive reading (ER) program. The participants were 70 first-year university students who experienced ER both in and outside the classroom for 15 weeks. The results of regression analyses confirmed…
ERIC Educational Resources Information Center
Science Software Quarterly, 1984
1984-01-01
Provides extensive reviews of computer software, examining documentation, ease of use, performance, error handling, special features, and system requirements. Includes statistics, problem-solving (TK Solver), label printing, database management, experimental psychology, Encyclopedia Britannica biology, and DNA-sequencing programs. A program for…
14 CFR 217.6 - Extension of filing time.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) ECONOMIC REGULATIONS REPORTING TRAFFIC STATISTICS BY FOREIGN AIR CARRIERS IN CIVILIAN SCHEDULED, CHARTER... in § 217.10 at least 3 days in advance of the due date, and must set forth reasons to justify...
NASA Technical Reports Server (NTRS)
Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.
1984-01-01
An evaluation of Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics is addressed in this research. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.
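The chi-square test mentioned above can be computed by hand from a contingency table of rock type against vegetation class. A minimal sketch with invented counts (the table below is hypothetical, not the study's data):

```python
# Hypothetical 2x2 contingency table: rows = rock type (ultramafic vs
# non-ultramafic), columns = a binary vegetation-cover class
observed = [[30, 10],
            [15, 45]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Pearson chi-square: sum of (O - E)^2 / E over all cells, where the
# expected count E assumes independence of rows and columns
chi_square = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi_square += (obs - expected) ** 2 / expected
```

For a 2x2 table the statistic has 1 degree of freedom, so any value above 3.84 rejects independence at the 5% level; the invented counts here yield a statistic of about 24, a clear association.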
[Clinical research=design*measurements*statistical analyses].
Furukawa, Toshiaki
2012-06-01
A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question. Formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study, one must have knowledge of study design, measurements and statistical analyses: the first is taught by epidemiology, the second by psychometrics and the third by biostatistics.